
netflix

October 8th, 2006

I signed up for Netflix today. It fits in well with my current philosophies; the reason why I’d been holding off was that Liesl and I don’t watch movies very often, and we have a fair number of unwatched movies around. So we wanted to go through the backlog first.

But it’s becoming clear that much of that should simply be treated as sunk cost: maybe we’ll watch them, maybe we won’t, but we shouldn’t let it stand in the way of watching movies that we’re more interested in. So I bought a copy of the first volume of Haibane Renmei last week; after buying that, I realized that I wasn’t sure that I’d want to watch it over and over again, so why did I buy it instead of renting it? Oops. Having said that, we did enjoy it, and I could imagine buying the later volumes at some point, but for now signing up for Netflix and renting the rest of the volumes sounds like a better idea to me – for the cost of the remaining volumes, I could get seven months of a 1-DVD-at-a-time Netflix subscription instead. (There are also a few other anime series that I’d like to dip into without spending hundreds of dollars.)

We’ll see how it goes, but I’m optimistic; and if that causes us to shift more of our free time to watching movies, that wouldn’t be a bad thing. I’m curious how their series management works; it looks like you can sign up for an entire series at once, and I hope that, if I do that, it will be smart enough to make sure to always send me volumes in order. So if, say, I’ve finished volume 1, and volume 2 is unavailable but 3 is available, it will wait instead of sending me volume 3.

throughput and latency

October 8th, 2006

I’ve been kind of obsessed with the theory of constraints recently, which has gotten me wondering about bottlenecks. One of their points is that, in general, a system has a single bottleneck; you should do everything you can to make that bottleneck as productive as possible.

For a simple bottleneck example, say you have a linear, five-step production process. So there are steps A through E, with A’s output being B’s input, and E’s output being the final product. Assume they can produce at the following rate (assuming the previous step has completed): A can produce 10 per day, B can produce 5 per day, C can produce 10 per day, D can produce 3 per day, and E can produce 10 per day.

If you look at where work is piling up, both steps B and D might look like bottlenecks: A can produce tons of stuff, so step A might get annoyed that B can’t consume its output fast enough. Looked at from the whole system point of view, though, only step D is a bottleneck: no matter how fast the other steps get, the whole system can’t produce more than 3 per day.
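That whole-system view is easy to check with a toy model (a minimal sketch, nothing more; the step names and rates are just the ones from the example):

```python
# Toy model of the five-step line: each step's rate is units per day,
# assuming its input is available. The system as a whole can never
# produce faster than its slowest step.
rates = {"A": 10, "B": 5, "C": 10, "D": 3, "E": 10}

bottleneck = min(rates, key=rates.get)
print(bottleneck, rates[bottleneck])  # D is the constraint, at 3 per day

# Speeding up a non-bottleneck step doesn't help the system at all:
rates["B"] = 100
print(min(rates.values()))  # still 3 per day
```

Work piling up in front of B looks alarming locally, but only D’s rate ever shows up in the system-wide number.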

This example is very simple, but actually a lot of its simplicity is irrelevant for this analysis: even if the paths branch and join in interesting ways, ultimately the rate of production of the whole system is limited by its least productive step. (For more complex graphs, though, the actions you take get more interesting – for example, typically there’s some day-to-day variation in the productivity of any given step, and you have to pay more attention to that on the steps feeding the bottleneck than elsewhere. But that is a discussion for another day.)

This, however, doesn’t mean that you can’t get a lot of good from improving the non-bottleneck steps. The first level of analysis is simply to look to make sure that the non-bottlenecks aren’t stealing resources from the bottlenecks. But, aside from that, the above discussion is about throughput; in many situations, latency can be just as important, or even more important.

So, in the above example, consider step C. It can produce at a rate of 10 per day. Maybe that means that it can carry out its step for 10 items in parallel every day; maybe that means that it can carry out its step for 1 item every .1 days. (Or maybe it can carry out its step for 100 items every 10 days.) From a throughput point of view, these are all the same; from a latency point of view, though, doing it for 1 item every .1 days is the best: compared to the 10 items in parallel version, that reduces the time from when raw materials enter the door to when a finished product leaves the door by .9 days. Which is great.

In fact, from the point of view of the whole solution, even a version of C where it could carry out its step for 1 item every .2 days would be better than a version that could carry out its step for 10 items every day. The latency of that step would go from 1 day to .2 days, which is good. The throughput of that step would go from 10 per day to 5 per day; that looks bad, except the throughput of the whole system is 3 per day, so that reduction in throughput is irrelevant.
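The throughput/latency distinction in the last two paragraphs can be tabulated directly (a hypothetical sketch; the three variants of step C are the ones above):

```python
# The three variants of step C discussed above. Each variant is
# (items finished per cycle, days per cycle): throughput is items/days,
# while an item's latency through the step is the cycle time.
variants = {
    "10 items in parallel per day": (10, 1.0),
    "1 item every 0.1 days": (1, 0.1),
    "1 item every 0.2 days": (1, 0.2),
}

for name, (items, days) in variants.items():
    throughput = items / days  # items per day
    latency = days             # days from entering C to leaving C
    print(f"{name}: throughput {throughput:g}/day, latency {latency:g} days")

# Since step D caps the whole system at 3/day, even the 5/day variant
# sacrifices nothing in practice, while cutting C's latency by 0.8 days.
```

The simplification here is that latency through the step is just the cycle time; any queue in front of the step adds to it, which is where those piles of work in front of B and D come back into the picture.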

Which doesn’t mean that you should go around reducing throughput willy-nilly: you’d better make quite sure you know you’re not the bottleneck before doing that! Fortunately, one lesson of lean and agile is that, if you work at it, you can decrease latency without decreasing throughput. (Single Minute Exchange of Dies, as opposed to taking two hours to change dies.) Look for waste, find ways to reduce or eliminate it, and you’ll be happy.

It’s not at all clear to me how to apply this at work: I don’t know where the bottleneck is, and our process isn’t even well-enough defined for me to be confident that there isn’t something big that this analysis is missing. One tool to give clarity on this matter is a value stream map; I’d like to try producing one of those some time over the next week or two. If done right, that will help identify potential bottlenecks, and will probably help with ideas for reducing latency. The other tool is waste reduction; the value stream map will help there, too, and I should go through the traditional lean categories of waste to see what examples I can come up with in my organization.

new hard drive

October 7th, 2006

I was given a second hard drive for my computer recently. Some random thoughts:

  • Annoying to have to buy a bracket to mount it in the computer, and even more annoying that Sun doesn’t want to sell me one directly; fortunately, it’s easy enough to get a used one for cheap.
  • Having said that, once I got the bracket, it mounted very easily: no need to fiddle with cables at all. And the box is very easy to open up.
  • You would think that Linux would be well-enough evolved to either pop up some message saying “you have a new disk, what do you want to do with it?” or, at the very least, have an appropriate item in a menu to deal with that. Not so, at least on Fedora Core 5.
  • What they do have is a menu item for logical volume management. This does everything I want, once I type in the magic command # pvcreate /dev/sdb. (Your device name may vary.)
  • LVM is pretty cool; further evidence that there’s nothing that an extra level of indirection won’t solve. I’m still not sure what the best configuration is for my situation; for now, I’ve got a separate volume group on each drive (so that I know what I lose if a drive fails). The original drive has one group with two volumes: one for swap and one for the standard directories (/usr, /home, …) on it; the new drive, for now, has one group with one volume, /backup. It’s nice to know that I can easily, say, increase the swap size if I should want to do so, or divide up the new disk once I think of useful ways to use up all that space. For now, I’m moving more backup stuff there and increasing the amount of stuff I back up; another idea might be to create a second volume there and RAID 1 that volume with my main volume on the first disk. (Right now, I’m rsyncing my home directory nightly, which gets me most of the benefits of the RAID 1 solution.)

benefits of slack

October 7th, 2006

After some discussions on the leandevelopment list, it would seem that slack has more benefits than I realized. My current list:

  1. If you’re not the bottleneck in your process, adding slack won’t decrease throughput and may well increase it (since it makes it easier for you to avoid stealing resources from the bottleneck).
  2. Leaving a little slack makes your schedule more predictable: you can work harder to make a date if that should prove to be necessary.
  3. It gives people time to recharge.
  4. It gives people time to experiment, to try to find improvements that they might not have found if their nose was constantly at the grindstone.

One nice thing about lists like this is that you can turn them around to find places where you should be wary of applying them to your situation. In particular:

The first point suggests that, if you are the bottleneck, slack isn’t a particularly good idea: the bottleneck controls throughput, so the less the bottleneck works, the lower the throughput. And, even if you aren’t the bottleneck, you shouldn’t stick in so much slack as to become the bottleneck. (And you should consider spending some of your slack time figuring out how to help the bottleneck.) Exercise for the reader (actually, an exercise for the writer): what’s the bottleneck in your organization?

The second point suggests that you should have enough control over your schedule to know when you should eat into your slack. Mary Poppendieck mentioned a company that had been working on an 8-week cycle, with 6 weeks development and 2 weeks testing. They got the 2 weeks of testing down to one week; they turned the second week into slack instead of adding it to development. That way, if problems show up after release, people can jump on them immediately. (And people are motivated to make sure problems don’t show up after release!)

The third point is a reminder that, if necessary, people will create their own slack whether you like it or not; if you try to avoid that, your reward will be burned-out employees. The flip side is that you don’t need vast amounts of slack to get this benefit.

And the fourth point is the most interesting one, and the one whose benefits are potentially largest and hardest to predict. Which means that I have nothing coherent to say about it.

how to improve?

October 5th, 2006

I am currently awash in confusion about how we (my team, but also everybody working on the same product) should improve. Tough stuff; I hope I’ll have something more coherent to say here soon. Fortunately, the good folks on the leandevelopment mailing list are helping me sort through my difficulties.

The fact that I’m so confused suggests that a top priority should be getting more visibility into the situation. If I’m taking a lean point of view, we need to eliminate waste, which means that we should make that more visible. (Red cards? Probably start by just listing forms and manifestations of waste.) If I’m taking a theory of constraints point of view, we need to focus on bottlenecks, which means we should make that more visible. (Maybe create a value stream map?)

The latter is where I started: how do I figure out if my team is the bottleneck? (Is there really only just one?) I have no idea; I can see cards piling up before my team, but I don’t really know what happens after a card exits my team. So I can’t tell where work is piling up.

I’ll try to find time over the next few days at work to create some wiki pages on waste and bottlenecks. One fun thing about work is that enough people are subscribed to the wiki notifications that, if you’re thinking about something, you can just create a wiki page on the subject and you’ll attract some random, insightful questions or comments.

Now does seem like a good time to think about this. On the one hand, there is reason to believe that executing efficiently now could be particularly useful. And on the other hand, my boss has too many direct reports, so some sort of reorganization there may happen over the next month or two. So, if I could actually think of something useful to inform the reorganization, with solid reasoning behind it, I might be able to have an effect.

One of the times when I’m happiest is when I’m wandering around in a confused daze. (Don’t get me wrong, figuring things out and getting things done has its pleasures as well.) But I do need to get some concrete outputs from my wandering.

august 2, 1961

October 3rd, 2006

The two titles I was considering for this post are both military analogies. Sigh. So I will go with the title from the section in the book.

From How Children Learn, pp. 36–37:

The other day we went to Carlsbad Caverns, a strange and beautiful place. To get there, we rode many hours in the car. On the way, we played games. The radio was on, and with Lisa [2 years old – DBC] watching, I began to clap my hands in time to the music. She did the same. Then I began to clap one palm against the other fist. She watched a while, then made both her hands into fists, clapped together a bit, looked again, saw this wasn’t right, and soon did what I was doing. From this grew a whole series of games. I clapped hand against head; so did she. I clapped hand against stomach; so did she. I made my games more complicated. I clapped head with one hand and stomach with another; or clapped head with one hand while holding that elbow with the other, and so on. It was most interesting to see how she copied what I was doing. Each time she began by doing something fairly quickly. As she did it, she checked what she was doing against what I was doing. Then she made a change in what she was doing, checked again, and so went on until she was satisfied that what we were both doing was the same. Watching her do this, I was struck by two things. First, she did not feel that she had to get everything right before she started to do anything. She was willing—no, more than willing, eager—to begin by doing something, and then think about fixing it up. Secondly, she was not satisfied with incorrect imitations, but kept on looking and comparing until she was satisfied that she was correct—which she almost always was.

feeling quiet

September 30th, 2006

I would seem to be in a quiet mood these days. Not feeling much like blogging, not feeling much like programming at home. Maybe because I’ve been programming a fair amount at work; I was worried that, with the new larger group, I’d have almost no programming time, but now that things have settled down (pleasantly!), that is fortunately not the case. (Incidentally, H.264 is charmingly eccentric. Or something.)

Part of the reason, too, is that Okami is tiptop stunning excellent. So I spent all of last weekend playing it, several evenings playing it, and am doing pretty well this weekend so far. I don’t think I’ll quite finish it this weekend, but next weekend certainly. So I guess the game isn’t going to fill the gap until the Wii launch after all; what next? Lego Star Wars II?

Another possibility: I could just not play video games for a month and a half. I would seem to be in a bookish mood these days; or I could spend more time programming. Or spend more time thinking about stuff and writing about stuff. (Combined with the bookish bit above.)

The latter is increasingly attractive. My thoughts on some of the matters that I’ve been obsessing on over the last few years are starting to settle down. And I’m being reminded (e.g. by helping out with the PACT Parent Ed classes) that there’s stuff that I used to spend a lot of time thinking about that I haven’t recently revisited. So maybe it’s time to, say, go through the complete works of Alfie Kohn (who has a new book out, I should read it) and John Holt and see what, if any, points of contact they have with what I’ve been thinking about recently. Or maybe I should try to actually put some thinking tools into action. Or maybe I should spend every waking hour reading about lean. Or maybe I should spend time thinking about whether my actions are congruent with my stated beliefs and, if not, why not.

Or maybe I should play video games. That would certainly be easier…

backup woes

September 27th, 2006

By the end of the weekend, I’d copied enough of my CDs that my laptop’s hard drive was beginning to fill up, so I thought it was about time to start backing them up. So I grabbed one of the USB drives that I bought for the purpose. (Half a year ago, when I bought the computer – this was before I got Just in Time religion.)

The computer didn’t recognize the drive. Oops.

I brought it upstairs to the Linux machine; it didn’t like the drive either. After playing around with it for half an hour, the Linux machine seemed happy with it, but the Mac didn’t like it. The other drive I bought at the time still worked, so I didn’t think it was a usage problem: the drive was bad.

So the drive went into the trash. But, actually, this is a blessing in disguise, because it pointed out a whopping big hole in my backup strategy. One that I was vaguely aware of, but had previously been able to ignore by sticking my fingers in my ears and saying “la la la I can’t hear me”. Namely that a single backup isn’t any good: you want lots of the suckers. Or at least two.

I’d originally chosen USB hard drives over optical media because I was doing this for the purpose of backing up degrading optical media, and I didn’t have any confidence that the backup optical media would last any better than the original. But obviously the hard drives had problems; and if one of them were to really fail, it could wipe out a big chunk of my collection. (Hmm: buy a bunch of USB drives, RAID em, and stick them in the safety deposit box? Nah.)

So I think what I’ll do now is back them up to both DVD-Rs and hard drives. Hopefully they won’t both fail simultaneously, and the worst thing that can realistically happen is that my house burns down, the hard drives fail completely and a handful of DVDs fail. If that happens, I’ll still have 95% of my CD collection to comfort me while sitting in the ashes of my house, and the missing 5% will be the least of my worries. Hopefully this won’t be too labor intensive; presumably the Mac comes with some sort of dead-easy DVD burning software. And I’ll keep MD5s of everything around, so I can tell which backup is correct if I get bit rot on one of them.
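For the MD5 bookkeeping, something like this Python sketch would do the job (the helper names and the manifest filename are made up; hashlib and pathlib are in the standard library):

```python
import hashlib
from pathlib import Path

def md5_of(path, chunk_size=1 << 20):
    """Stream a file through MD5 in 1MB chunks, so big rips don't fill memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(root, manifest="MD5SUMS"):
    """Record a checksum for every file under root, in md5sum-compatible format."""
    root = Path(root)
    lines = [f"{md5_of(p)}  {p.relative_to(root)}"
             for p in sorted(root.rglob("*"))
             if p.is_file() and p.name != manifest]
    (root / manifest).write_text("\n".join(lines) + "\n")
```

Run it once when creating the backups; later, if the DVD-R copy and the hard drive copy of a file disagree, whichever one still matches the original manifest is the one to trust.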

And I’ll revisit the issue in another decade or so, at which point it will cost next to nothing to back up everything on, say, flash USB storage and a couple of remote hosts providing reliable backup service. We’re only talking 300GB or so, after all. (At which point I’ll be worrying about backing up my DVD collection, I suppose.)

Silly me, though: I really should know better than to make a mistake like that.

no more yosha

September 23rd, 2006

As regular readers are aware, Yosha hadn’t been at his best for a while. A couple of weeks ago, though, he started getting even more lethargic, and pretty much stopped eating. We tried giving him food that actually tastes good; that helped for a while. But then it stopped helping, and he was having a hard time standing up. And was spending even more time asleep. And Zippy was really worried, and was grooming him a lot.

By last Monday or Tuesday, it was clear that the end was near, and while he didn’t seem to be in too much pain, things could get a lot worse any time. So we made an appointment to have him put down in the middle of the day on Friday. (Friday had the advantage that we could stay home with Zippy for three days in a row, to help him make a transition.)

As it turned out, he didn’t quite make it until then: he died Friday morning. In bed, quickly, with Liesl, Zippy, and me all right there.

It will be hard on Zippy – he has basically never ever been apart from Yosha. (Together at the vet, together getting haircuts, together at the kennel, …) Having said that, he seems to be handling it quite well: no despondent howling, no wandering around wondering what’s going on. He knew Yosha was sick, he was there when he died. (And groomed him for a few minutes afterwards, either as one last try to help or as a way to say goodbye.) He basically hasn’t left my side since it happened, but I think he’ll do okay.

It’s hard on the rest of us, too. I’m honestly not sure if I’ve spent more time with Liesl or with Yosha – Liesl and I spend time together out of the house without Yosha, of course, but Yosha and I spent time together in the house without Liesl, especially when I was in grad school.

Having said that, I suspect that most of the grieving went on in the two weeks before he died. It wasn’t a shock that something like this would happen eventually; we’d hoped it wouldn’t come quite so soon, but we had enough warning to spend a little more time with him saying goodbye, cuddling with him even more than normal. And it was very clear by the end that his time had come.

No, we aren’t planning to get another dog soon. For one thing, it has been my position for years that I’m going to need some dog-free time at some point. (Not that we won’t get dogs eventually, I’m just going to want a break for a few years there.) For another thing, I don’t think we’d be doing Zippy any favors by that: he’ll miss Yosha, but he’s 12 years old, and I don’t think he has enough energy to really enjoy playing with a puppy.

mii

September 17th, 2006

As a Nintendo fanboy, I must of course comment on the Wii launch news. Packing in Wii Sports makes sense for their market-broadening strategy; wouldn’t it have helped if they could have included a second wiimote, though? The controller prices are exorbitant. I wish I didn’t have to wait until November 19th.

The news that surprised me the most, though: the Mii Channel. When you turn on the system, you can do various things other than play games; one of those is create an avatar of yourself, called a “mii”. Here are some movies; the avatar editor looks pleasantly accessible, enough so that, when I buy my Wii, the first thing I’ll do will probably be create a mii instead of, say, playing a game.

And this avatar can be used in games. (Not all, of course, just those where it makes sense.) So all of a sudden, the low-quality graphics in Wii Sports make sense. As Scott McCloud has taught us, sometimes you can identify better with less detailed representations of people than with high-fidelity ones; a tennis game sounds more fun with four cartoony representations of the people playing. I’m sure the Wii version of Animal Crossing will use this as well, and this will encourage the development of other friendly titles in new genres targeted at non-traditional gamers.

civ 4

September 17th, 2006

Having added extra memory to my Mac (1.5GB, instead of .5GB), and having finished the other games I was in the middle of, I went out and bought a copy of Civilization 4 on the Saturday of Labor Day weekend. I opened the box, gazed with pleasure upon the technology tree diagram, popped it in, bumped up the graphics settings, and began the tutorial.

Two turns later, the game crashed. For lack of a better idea, I bumped down the graphics settings somewhat; this time, it lasted a good twenty or so turns before crashing. So I bumped it down still more; it finally let me finish the tutorial.

This is one of the reasons why I almost exclusively play games on consoles. Computer game manufacturers are apparently perfectly happy to release crap that wouldn’t make it through even the most basic of playtesting: that’s what patches are for, right? Well, no: I buy a game because I want to play it right now, not because I want to play it in a few months when most of the bugs are ironed out. To be sure, the developer’s job is made more difficult by the various possible system configurations that people might try to run the game on.

Despite that unpleasantness, I gave the game a go. It seemed stable enough, and was quite playable at the lowest graphics settings, so I left it alone – I don’t know for sure if the graphics settings were the trigger, but I really wasn’t up for determining the stability boundaries. And the game proved to be as excellent and as addictive as its predecessors. I stayed up far too late that night with my first game, and woke up early the next morning to finish the game; I then stayed up far too late Sunday night with my second game, and spent Monday morning finishing it.

As is common in its genre, it is addictive out of proportion to its quality. The gameplay is turn-based, and there’s always something little going on to pull you to play just one more turn. Also, there are never any large events – ends of levels, reaching a rare save point, etc. – to give you a push to stop playing. Fortunately, unlike some games in the genre, the gameplay really is excellent. And pushes some of my buttons – I really do like the whole city- / world-building thing, and exploring unknown territory.

So if you’ve never played any games in the series, do yourself a favor and go out and buy a copy: you’re in for a treat. If you have played other games in the series, you know what to expect; this one has more of the same, with a few tweaks (for the better).

Having said that, after spending a weekend playing it, I had a dilemma. On the one hand, I quite enjoyed the game, and would continue doing so for a few more plays. On the other hand, I’ve seen the ideas before, and there are a few ways in which it’s not my complete fave, so staying up until midnight playing it isn’t the best idea.

If I could have continued playing it in moderation, I would have done so. But it’s not clear that I’m capable of doing so. So, for better or for worse, I didn’t play it last weekend, I didn’t play it this weekend, and Okami is about to come out, so I’ve decided to go cold turkey on the game. The upshot: it’s the best game that I’ve stopped playing after three days.

One way in which it isn’t the best fit for me: I don’t really enjoy games with its sort of combat model. I’m happy to play, say, turn-based strategy games, as long as I have complete information. But once concepts like fog of war get thrown into the mix, I don’t enjoy the combat nearly as much. In Civilization, you fairly early on reach situations where, in order to expand, you need to take over at least some of your neighbors’ territory. And you don’t really know what you’re up against; also, in general, you have to worry about your neighbors invading you, where you also don’t really know what you’re up against. I can deal with this sort of thing, it’s just not my fave.

Actually, what I really like about this game has nothing at all to do with combat: I like building cities, building an empire, developing land, researching technology, and the like. So maybe the lesson here is that I should spend more time with games that focus on that sort of thing: I should give the latest Sim City another look, or recent Harvest Moon games, or something. Hmm. We’ll see how Spore works for me, when it eventually comes out.

backing up cd collection

September 17th, 2006

I’ve finally started backing up my CD collection, only half a year after I planned to do this. The goals are:

  1. To have lossless backups, including offsite ones, of the entire collection.
  2. To do whatever error correction is possible when creating the backups.

(The vast majority of the CDs are between 10 and 20 years old, and are showing their age.)

If it weren’t for the second point, I would back up the CDs by doing dd if=/dev/cdrom of=.... But I don’t trust that to produce as clear a copy as possible.

Digging around, there seem to be various tools that specialize in making good copies: cdparanoia for Linux, Max for Mac OS X, and Exact Audio Copy for Windows. Since my CD collection is located in the room where my Mac is, Max seems like the way to go.

Unfortunately, this is where my naivete about just what is on an audio CD begins to show. My naive method produces one big file for the entire disk. Max, however, wants to generate a collection of files, one per track. My understanding is that, if I tell Max to generate WAV files, then that’s more or less the same data as the naive method (on a disk where reads are error-free). But is it exactly the same data (modulo trivial packaging), or is there some sort of extra data that I’m missing? I know CDs contain almost no metadata, but I’m worried about stuff like gaps between tracks – iTunes has a habit of screwing up that sort of thing when ripping CDs (though they claim to have improved that with the latest release), and I have several CDs (e.g. operas) where that is quite annoying, so I’m a bit worried that I’m missing something there.

Does anybody know of a good FAQ on these matters? I’m having a surprisingly hard time piecing together the information that I’m looking for: what exactly is on an audio CD, and what the relationship is between that and a collection of WAV files.
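For what it’s worth, my current understanding (which is exactly what I’d want a FAQ to confirm) is: the audio on a CD is plain 16-bit, 44.1kHz stereo PCM, and a WAV file is that same PCM wrapped in a small RIFF header. A quick sketch with Python’s standard wave module shows the payload surviving the packaging unchanged:

```python
import os
import tempfile
import wave

# Stand-in PCM payload; real CD audio would be 16-bit little-endian
# stereo samples. Length is a multiple of the 4-byte frame size.
pcm = bytes(range(256)) * 4
path = os.path.join(tempfile.mkdtemp(), "track.wav")

with wave.open(path, "wb") as w:
    w.setnchannels(2)      # stereo, as on an audio CD
    w.setsampwidth(2)      # 16-bit samples
    w.setframerate(44100)  # CD sampling rate
    w.writeframes(pcm)

with wave.open(path, "rb") as w:
    assert w.readframes(w.getnframes()) == pcm  # identical bytes back out
```

What a pile of per-track WAVs doesn’t capture, though, is precisely the stuff I’m worried about: track boundaries, index marks, and gap lengths live in the disc’s table of contents and subcode rather than in the audio samples themselves, so a per-track rip can lose or mangle them.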

Anyways, I’m glad I’ve gotten started. It may take months for me to actually finish the process, but it should be a relatively mindless one from now on. (Assuming that a collection of WAV files proves sufficient.)

a/v formats

September 16th, 2006

What are good formats to use for purchasing and storing music and movies? I remember a time in the past where it was possible to pretend that Ogg Vorbis was a reasonable choice for many of your audio needs; that is, unfortunately, no longer the case.

Desirable qualities for a format, in no particular order:

  • Quality should be as high as possible.
  • File size should be as small as possible.
  • The standard should be open.
  • The standard should be unencumbered.
  • There should be a wide range of software tools (including free ones and best-of-breed ones) to play the format, on all platforms I use.
  • There should be a wide range of hardware devices (including best-of-breed ones) to play the format, in all locations where I’d like to use it (home, car, bus, jogging, …).
  • There should be a wide range of software tools (including free ones and best-of-breed ones) to convert to/from the format.
  • It should be possible to easily purchase content in this format.
  • It should be possible to easily find free content in this format.
  • I should be able to easily copy, excerpt, etc. the content for the foreseeable future.
  • The format should support adding metadata (titles, composer, performer, album art, …).
  • The format should support aggregation (e.g. a podcast with multiple pieces of music should be a single file with indexing).

These are, of course, incompatible goals, but never mind that. Given the above, what are the conclusions?

One is that DRM is to be avoided: it fails on the copy/excerpt criteria, and probably also fails on the open standard and free tools criteria. Having said that, it’s almost impossible to purchase video without some sort of basic DRM being involved. So, in some circumstances, I guess I can live with DRM, as long as tools for cracking it are widely available. Which is, fortunately, the case for DVDs, though I’m somewhat worried that hardware might start getting in the way there. (For example, my understanding is that many current DVD drives force you to respect region encoding.)

What are suitable audio formats? The aforementioned Ogg Vorbis unfortunately doesn’t look so hot to me. The big reason is that best-of-breed hardware devices don’t play it: I can’t use it on my iPod. There’s very little content available in the format. And it’s lossy, so I can’t store content in it and convert it to other formats as necessary.

Also, its main advantage over MP3s is that it’s unencumbered by patents. The problem is, in this crazy day and age, I don’t know if I can even be sure of that. There are so many overbroad software patents being granted that I can’t be confident that any decent a/v compression format couldn’t be attacked by patent trolls.

So, basically, it only helps me if I want to burn a CD and play it on a computer using particularly purist tools. And that’s a situation where I never find myself. The only Ogg Vorbis files I had on my computer were from Lambda Expressway, but now I see it’s available as MP3s, so that won’t be necessary any more.

So, in practice, there seem to be three obvious candidates for music: Audio CDs, MP3s, and AAC files. Audio CDs are widely available for purchase, playable in lots of ways, as high quality as is easily available, easily convertible to other formats. The downside is that file sizes are somewhat large, and they’re lacking in the metadata department.

MP3s are quite widely available, too. File sizes are smaller; quality is generally acceptable for my listening purposes. They’re lossy when compared to CDs, though, which makes them less suitable for archival purposes. Better than CDs for metadata, but there’s room for improvement there. Bad for aggregation. Also, it’s quite difficult to purchase MP3s; I hope that will change in coming years (decades?), but maybe I’m over-optimistic. Tons of players.

AAC is one of my current favorites: it has most of the advantages of MP3s, but does better in metadata and, especially, aggregation. So it’s a great format for podcasts. Slightly fewer players, but enough of them for my purposes. The patent situation seems somewhat better than that for MP3s, but not perfect.

MP3s and AAC suit my needs for free stuff. For purchasing, CDs are good, but it would be nice to have a format that I liked that would enable me to purchase music digitally. Unfortunately, all the choices suck: Apple’s and Microsoft’s DRM solutions are both loathsome. So, for now, I’m buying music on CDs: I’d rather do that, wait a couple of days to have it shipped to me, and rip it myself, than have music delivered quickly and painlessly in either of those formats. I have to think that this suggests that the recording industry could find a way to make more money off of me if they were willing to give up on DRM; the recording industry is not, alas, well-known for its forward-thinking business acumen.

I am a little worried that, at some point, music that I care about won’t be available on CD. Recently, for example, I wanted to get music of some of the artists I liked from Next Big Hit. They’re all independent, so it wasn’t too surprising that I couldn’t find all of them at Amazon; that’s what CD Baby is for. But I couldn’t find any of the music from one of them there, either; looking at her website, I did find her music for sale, but not on a CD, and some of it was apparently only available from iTunes.

So it sucks that new artists are getting caught up in DRM protection that’s really designed to serve (or “serve”, perhaps) the interests of large labels. Fortunately, in this case I could just e-mail the artist directly, and it turned out that she did have a few copies of her CDs left over. But a taste of a world that I’d just as soon avoid.

For video, the story is less pleasant. There’s no long-standing open format like CDs. Videos that are on the web are generally either in Quicktime (which encompasses many different formats, but these days I can usually play it under Linux) or Windows Media (which I can’t play under Linux). Actually, that’s not even true – videos are frequently hidden behind Flash front-ends to the extent that I don’t know what the underlying format is, and can’t get at the bytes short of doing a tcpdump or something. MP3s and AAC both have analogues, namely MPEG-2 and AVC (a.k.a. H.264), which are as acceptable as their audio compatriots. And there are new physical formats coming out that do look noticeably better than their predecessor, and that will probably be harder to copy and play on free players. (And there’s no reason to believe that this is the end of video formats.)

The upshot is that, for video, I just stick my head in the sand. Fortunately, most stuff on the web I click on once to watch but have no desire to save. And almost nothing I really want to watch is in Windows Media format, so I don’t have to install that viewer on my Mac. I like watching DVDs, but will happily avoid upgrading to newer formats for the indefinite future. (And I’m cautiously optimistic that the general public will be slow to adopt either of the new formats.) I haven’t yet grappled with the whole backup issue: my DVDs aren’t showing signs of age the way my CDs are, and disk space isn’t quite cheap enough for me to want to back them up wholesale anyways.

I suppose I should look on the bright side: no matter what, we live in a much better world now than when we had to deal with LPs, cassette tapes, videotapes. New media are much more robust, much easier to copy, much higher quality, much more broadly available. And the current RIAA leadership will retire eventually.

bad itunes

September 16th, 2006

I just upgraded to iTunes 7; I wish I hadn’t. Downloading fancy album covers is nice (though I wish I hadn’t had to give a credit card number to do so); the reflected images of the album covers are a bit much, but whatever. We’ll see how gapless playback works.

Unfortunately, it turns out that a certain piece of core functionality is broken: as far as I can tell, there is no way to tell it to automatically sync most podcasts with my iPod but to let me manually manage some of them. The previous mechanism that I’d been using for this (only sync checked episodes) has gone away. (The checkboxes themselves are still there; maybe the option will be restored in a future update?) The user interface lets me drag and drop episodes, which should also work just fine, but the episodes don’t actually show up on the iPod.

So: buggy software. If it were free software, I’d probably be able to easily find some place I could get a definitive answer and/or file a bug; with Apple, there seems to be no way to do that. There do seem to be customer forums where I can whine; not clear that doing so will help, but who knows. I do hope there will be an update fixing this soon; if not, I guess I’ll look into free software for managing the beast.

go small companies

September 13th, 2006

Two e-mails I received today:

An e-mail from CD Baby with the name associated to the from address given as “CD Baby loves David”, saying (among other things) the following:

Your CDs have been gently taken from our CD Baby shelves with sterilized contamination-free gloves and placed onto a satin pillow.

A team of 50 employees inspected your CDs and polished them to make sure they were in the best possible condition before mailing.

Our packing specialist from Japan lit a candle and a hush fell over the crowd as he put your CDs into the finest gold-lined box that money can buy.

We all had a wonderful celebration afterwards and the whole party marched down the street to the post office where the entire town of Portland waved ‘Bon Voyage!’ to your package, on its way to you, in our private CD Baby jet on this day, Wednesday, September 13th.

I hope you had a wonderful time shopping at CD Baby. We sure did. Your picture is on our wall as “Customer of the Year”. We’re all exhausted but can’t wait for you to come back to CDBABY.COM!!

I am amused.

And Payseur & Schmidt e-mailed to double-check that I received a book I’d ordered a month ago. Apparently they’d been having problems with their ordering system, information had been lost, and they wanted to make sure I’d gotten my order. (And, if not, ship me a replacement ASAP.)

Nice to see people who care.

random links: september 10, 2006

September 10th, 2006

new super mario bros.

September 10th, 2006

I would not be the person I am today were it not for Mario. At some point in grad school, a friend of mine gave me an NES (during the mid 1990’s; I was a bit behind the times), and the original Super Mario Bros. and the third in the series completely blew me away, starting me on my path to console gaming addiction that I have yet to emerge from.

I skipped straight from the NES to the Nintendo 64, however, so I missed all the SNES classics. So I was happy to see Nintendo republish some of them on the GBA. But when I played the GBA version of Super Mario World, I didn’t think so much of it. I’m sure it was impressive enough at the time, but I’d gotten used to relatively open-ended 3D worlds instead of sidescrolling 2D worlds, and it just wasn’t the same.

But then they came out with a new 2D Mario game for the DS, it got good reviews, and I needed games to play while on vacation. So I picked up New Super Mario Bros., and it turns out to be rather charming. A little more polished than the earlier games, in an understated way. They stripped out many of the items that were already a bit overwhelming in Super Mario Bros. 3; they added a couple of new items, but you can safely ignore both of them unless you’re in a completist mood. (In which situation they’re probably quite welcome.) The game play is still solid, and at a good difficulty level for this modern era, where we don’t have to use excessively difficult gameplay to cover up for the shallowness of other aspects of the game design of two decades ago.

So: score one for well-done nostalgia.  I’m not planning to devote more time in the near future to 2D platformers – I’d be happy to wait another decade before giving one a try – but I’m glad I finished this one. Not, to be sure, finished in the sense that I used to try to finish video games – I beat the final boss (“spoiler”: after you finish the castle in the last world, a few more levels will appear, and you’ll probably want to have at least 5 coins to create a save spot in the middle), but I didn’t look in every nook and cranny. Or even most nooks and crannies – there were two whole worlds that I didn’t play at all, and that’s fine with me.

Another surprise: it comes with fifteen or twenty minigames (that’s the single-player count, I didn’t look at the multiplayer ones), and those were surprisingly enjoyable. So if you’ve taken the game on a ride or something, have just hit a save point, and aren’t sure you’ll have enough time to make it to another save point, then give the minigames a look.

The minigames also gave me some serious Super Mario 64 nostalgia – several minigames used snippets of music from that game, and it all came flooding back. Sigh. I still don’t understand quite how Nintendo fell from platformer dominance, when it ushered in the 3D platformer era in such a spectacular fashion: no more followups on that platform, they didn’t even manage a followup for the Gamecube’s launch, and when we finally got one, it was rather underwhelming. It speaks to Nintendo’s strengths that, for their core franchises (Mario, Zelda), they don’t churn out slightly modified versions every year or even two years, but they went too far with that one. At least Super Mario Galaxy is getting a bit of a buzz, but even there I can’t say I’m optimistic – it looks like too much small-scale, maybe even gimmicky, gameplay to me. We’ll find out some time over the next year or so, I suppose.

nintendo 1, sony -1

September 7th, 2006

(Yes, I know that none of my readers care about this. Sorry.)

Two recent news items:

Weird. I’m used to thinking that Sony knows what they’re doing while Nintendo is kind of incompetent; guess not. Who would have thought that the strategy of graphically underpowered consoles with weird control schemes would turn out so well, while the strategy of turning your console into a vehicle for your movie branch’s preferred format would get off on so bad a foot?

Not that this necessarily means anything in the long term – Sony still has solid third party support, and presumably they’ll learn how to make lasers eventually. Plus, Nintendo is still more than capable of screwing up big time. But one more piece of bad news – bad manufacturing quality, defection of a major RPG, or something – and inertia really could shift in Japan.

it’s not luck

September 4th, 2006

Today’s book: It’s Not Luck, the second of Eliyahu Goldratt’s business novels. Which I actually read after the third; cleared up a few issues, but the reading order didn’t matter too much. (I would recommend starting with The Goal, though.)

This book presents some thinking tools for analyzing situations that confuse you, where you’re stuck in a bind and don’t know how to get out of it. On the surface of it, this doesn’t have much to do with other aspects of the Theory of Constraints; there may be some sort of deeper pattern going on here, though. After all, agile methods are usually linked with retrospectives, and root cause analysis / five whys is part of lean (as are other thinking tools, for that matter), so it would seem that methodologies that I’m currently interested in each recommend some sort of disciplined introspection.

Certainly something that I’m interested in these days: one of my problems is that I sometimes leap to canned answers of the “right” way to handle a given situation, which has obvious (and less obvious) flaws. Don’t get me wrong – the opposing attitude of “it doesn’t really matter how you do things” is also a real loser, but I/we could use some help looking closely enough at my/our concrete situation to figure out how to improve. I’m glad we started doing retrospectives before the teams changed; I don’t think I want to revive that quite yet, but hopefully we’ll be able to have one by the end of the month. And the ToC thinking tools might well be a good match for resolving issues that I have personally; if nothing else, they seem admirably concrete. (I imagine there’s also a book explaining lean thinking tools in concrete terms, I just don’t happen to have read it.)

I also liked the comment from this book (on one of the more elaborate tools) that it’s not that hard, or even that time-consuming, to do; you just have to force yourself to do it. I certainly have experience with watching myself shy away from doing things that I know are important, for all sorts of unsatisfactory reasons. (And perhaps occasionally for satisfactory reasons, but never mind that.)

Some of the examples that they worked through led to interesting places, too. The “local optimization is the root of all evil” example was a bit too canned for my preference (though the methods that they used to come to that conclusion were interesting). But I did like the idea that businesses can benefit much more from segmenting markets, including segmenting previously undifferentiated markets, than they currently do. One point of view that lean books rightly attack is the cost view that the appropriate price for a product is its cost plus a reasonable profit margin: the market is under no compulsion to agree with you that a price determined in that manner is worth paying, so if you can’t convince them that a product is worth what you charge for it, any amount of whining about a reasonable profit margin is useless. Instead, find ways to tweak your product offerings so as to maximize their perceived value; if you do it right, and if other people can’t easily copy you, then your profit margin can be quite a bit larger than what might a priori seem “reasonable”.

In this light, Sun’s recent strategy of selling hardware while giving away software, or giving away hardware while selling support for software, or providing subscription plans for hardware, or probably other variants that I’m forgetting, starts to make rather more sense. Find ways for customers to choose the package with highest value for them, do so in ways that almost no other companies can easily copy, and do so in ways that put excess capacity to use, and you will start making money. Sounds good in theory; we’ll see if recent positive news is a sign that this strategy is starting to gain traction, or if it’s just another mirage.

two weeks with new team

September 1st, 2006

As I mentioned at the time, one of my fellow managers gave notice two and a half weeks ago, with the result that his team got combined with mine.

Interesting couple of weeks. I had some ideas about how I might handle the transition, but they mostly got blown out of the water, for two reasons:

  • One of the members of the other team also gave notice (for apparently unrelated reasons).
  • We had some unusual high-priority interrupts. Of a positive nature, fortunately, but it did mean that normal planning went out the window.

Which, in its own way, turned out to be a good thing, because it admirably focused all of our minds. Rather than worrying about how the cultures would fit together, or having philosophical arguments, it was clear to all of us that we had to focus on doing two things as quickly and efficiently as possible: gathering what knowledge we could from the two departing programmers, and servicing the interrupts. Don’t get me wrong, I very much wish that both the departing programmers were staying with us, but at least their departure gave us a clear goal; the high-priority interrupts were all for the good, because they let us work together towards an important concrete end.

And those efforts were, as far as I can tell, quite successful. We all worked more than normal, and had to overcome problems (in both teams’ former domains); none of us had to work so hard as to burn out, and the code in both teams’ former domains was very much up to the task at hand. There were several opportunities for cross-team collaboration, too. So we know each other a little better (not that we didn’t know each other before – we’ve all been working in the same group of cubicles for the last few years, talking and eating lunch together all the time), and we’ve shown that we can get stuff done by working together.

Having said that, there’s a lot of stuff that didn’t get done. There were several things that I really would have liked to start two weeks ago that I only got around to starting today. (One-on-ones, for example.) We only just got started on September planning; honestly, I’m not completely sure we’ll get a good monthly plan in place until October.

What planning meetings we’ve had were instructive. In the first meeting, we tried to do full card estimation in the same manner that my old team had. And that took forever, for a few reasons: half the group was unfamiliar with what was involved in any given task, deciding between a half-point and a whole point took too long on many cards, and we went off on tangents. So in our second such meeting, we gave up on points, instead identifying candidate cards as either “too big” or “not too big”; we hardly went off on any tangents; and things went much more smoothly. But we still have a lot of task breakdown ahead of us, and very little estimation backlog built up, with the result that we might have to just play it week by week for the time being. (Another contributing factor is that more high-priority interrupts of an unpredictable nature are coming.)

Everybody is happy with daily standups, as far as I can tell. A reasonable amount of pairing going on. A reasonable amount of people working on tasks that had, in the past, belonged to the other team, though demarcations are very clearly present. I’m still managing to get my hands dirty occasionally.

Don’t get me wrong, we still have a lot of challenges ahead of us. But now that I’ve finally been able to take a bit of a breather, I’m optimistic.