
backing up cd collection

September 17th, 2006

I’ve finally started backing up my CD collection, only half a year after I planned to do this. The goals are:

  1. To have lossless backups, including offsite ones, of the entire collection.
  2. To do whatever error correction is possible when creating the backups.

(The vast majority of the CDs are between 10 and 20 years old, and are showing their age.)

If it weren’t for the second point, I would back up the CDs by doing dd if=/dev/cdrom of=.... But I don’t trust that to produce as clean a copy as possible.

Digging around, there seem to be various tools that specialize in making good copies: cdparanoia for Linux, Max for Mac OS X, and Exact Audio Copy for Windows. Since my CD collection is located in the room where my Mac is, Max seems like the way to go.
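
For reference, the Linux route would look something like the sketch below: a thin Ruby wrapper around cdparanoia, whose -B flag writes one WAV file per track while doing its error-correcting reads. The directory layout is just something I made up for illustration, not anything Max or Exact Audio Copy does.

    #!/usr/bin/env ruby
    # Rough sketch: rip one disc into its own directory, one WAV per track,
    # using cdparanoia's retry/reconstruction logic instead of a raw dd read.
    require 'fileutils'

    album = ARGV[0] or abort "usage: rip.rb <album-name>"
    dest  = File.join(ENV['HOME'], "cd-backups", album)   # layout is my invention
    FileUtils.mkdir_p(dest)

    Dir.chdir(dest) do
      # -B: batch mode, writes track01.cdda.wav, track02.cdda.wav, ...
      system("cdparanoia", "-B") or abort "cdparanoia reported a problem"
    end

    puts "ripped #{Dir.glob(File.join(dest, '*.wav')).size} tracks into #{dest}"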

Unfortunately, this is where my naivete about just what is on an audio CD begins to show. My naive method produces one big file for the entire disc. Max, however, wants to generate a collection of files, one per track. My understanding is that, if I tell Max to generate WAV files, then that’s more or less the same data as the naive method (on a disc where reads are error-free). But is it exactly the same data (modulo trivial packaging), or is there some sort of extra data that I’m missing? I know CDs contain almost no metadata, but I’m worried about stuff like gaps between tracks – iTunes has a habit of screwing up that sort of thing when ripping CDs (though they claim to have improved that with the latest release), and I have several CDs (e.g. operas) where that is quite annoying. So I’m a bit worried that I’m missing something there.

Does anybody know of a good FAQ on these matters? I’m having a surprisingly hard time piecing together the information that I’m looking for: what exactly is on an audio CD, and what the relationship is between that and a collection of WAV files.

Anyways, I’m glad I’ve gotten started. It may take months for me to actually finish the process, but it should be a relatively mindless one from now on. (Assuming that a collection of WAV files proves sufficient.)

a/v formats

September 16th, 2006

What are good formats to use for purchasing and storing music and movies? I remember a time when it was possible to pretend that Ogg Vorbis was a reasonable choice for many of your audio needs; that is, unfortunately, no longer the case.

Desirable qualities for a format, in no particular order:

  • Quality should be as high as possible.
  • File size should be as small as possible.
  • The standard should be open.
  • The standard should be unencumbered.
  • There should be a wide range of software tools (including free ones and best-of-breed ones) to play the format, on all platforms I use.
  • There should be a wide range of hardware devices (including best-of-breed ones) to play the format, in all locations where I’d like to use it (home, car, bus, jogging, …).
  • There should be a wide range of software tools (including free ones and best-of-breed ones) to convert to/from the format.
  • It should be possible to easily purchase content in this format.
  • It should be possible to easily find free content in this format.
  • I should be able to easily copy, excerpt, etc. the content for the foreseeable future.
  • The format should support adding metadata (titles, composer, performer, album art, …).
  • The format should support aggregation (e.g. a podcast with multiple pieces of music should be a single file with indexing).

These are, of course, incompatible goals, but never mind that. Given the above, what are the conclusions?

One is that DRM is to be avoided: it fails on the copy/excerpt criteria, and probably also fails on the open standard and free tools criteria. Having said that, it’s almost impossible to purchase video without some sort of basic DRM being involved. So, in some circumstances, I guess I can live with DRM, as long as tools for cracking it are widely available. Which is, fortunately, the case for DVDs, though I’m somewhat worried that hardware might start getting in the way there. (For example, my understanding is that many current DVD drives force you to respect region encoding.)

What are suitable audio formats? The aforementioned Ogg Vorbis unfortunately doesn’t look so hot to me. The big reason is that best-of-breed hardware devices don’t play it: I can’t use it on my iPod. There’s very little content available in the format. And it’s lossy, so I can’t store content in it and convert it to other formats as necessary.

Also, its main advantage over MP3s is that it’s unencumbered by patents. The problem is, in this crazy day and age, I don’t know if I can even be sure of that. There are so many overbroad software patents being granted that I can’t be confident that any decent a/v compression format couldn’t be attacked by patent trolls.

So, basically, it only helps me if I want to burn a CD and play it on a computer using particularly purist tools. And that’s a situation where I never find myself. The only Ogg Vorbis files I had on my computer were from Lambda Expressway, but now I see it’s available as MP3s, so that won’t be necessary any more.

So, in practice, there seem to be three obvious candidates for music: Audio CDs, MP3s, and AAC files. Audio CDs are widely available for purchase, playable in lots of ways, as high quality as is easily available, easily convertible to other formats. The downside is that file sizes are somewhat large, and they’re lacking in the metadata department.

MP3s are quite widely available, too. File sizes are smaller; quality is generally acceptable for my listening purposes. They’re lossy when compared to CDs, though, which makes them less suitable for archival purposes. Better than CDs for metadata, but there’s room for improvement there. Bad for aggregation. Also, it’s quite difficult to purchase MP3s; I hope that will change in coming years (decades?), but maybe I’m over-optimistic. Tons of players.

AAC is one of my current favorites: it has most of the advantages of MP3s, but does better in metadata and, especially, aggregation. So it’s a great format for podcasts. Slightly fewer players, but enough of them for my purposes. The patent situation seems somewhat better than that for MP3s, but not perfect.

MP3s and AAC suit my needs for free stuff. For purchasing, CDs are good, but it would be nice to have a format that I liked that would enable me to purchase music digitally. Unfortunately, all the choices suck: Apple’s and Microsoft’s DRM solutions are both loathsome. So, for now, I’m buying music on CDs: I’d rather do that, wait a couple of days to have it shipped to me, and rip it myself, than have music delivered quickly and painlessly in either of those formats. I have to think that this suggests that the recording industry could find a way to make more money off of me if they were willing to give up on DRM; the recording industry is not, alas, well-known for its forward-thinking business acumen.

I am a little worried that, at some point, music that I care about won’t be available on CD. Recently, for example, I wanted to get music of some of the artists I liked from Next Big Hit. They’re all independent, so it wasn’t too surprising that I couldn’t find all of them at Amazon; that’s what CD Baby is for. But I couldn’t find any of the music from one of them there, either; looking at her website, I did find her music for sale, but not on a CD, and some of it was apparently only available from iTunes.

So it sucks that new artists are getting caught up in DRM protection that’s really designed to serve (or “serve”, perhaps) the interests of large labels. Fortunately, in this case I could just e-mail the artist directly, and it turned out that she did have a few copies of her CDs left over. But a taste of a world that I’d just as soon avoid.

For video, the story is less pleasant. There’s no long-standing open format like CDs. Videos that are on the web are generally either in Quicktime (which encompasses many different formats, but these days I can usually play it under Linux) or Windows Media (which I can’t play under Linux). Actually, that’s not even true – videos are frequently hidden behind Flash front-ends to the extent that I don’t know what the underlying format is, and can’t get at the bytes short of doing a tcpdump or something. MP3s and AAC both have analogues, namely MPEG-2 and AVC (a.k.a. H.264), which are as acceptable as their audio compatriots. And there are new physical formats coming out that do look noticeably better than their predecessor, and that will probably be harder to copy and play on free players. (And there’s no reason to believe that this is the end of video formats.)

The upshot is that, for video, I just stick my head in the sand. Fortunately, most stuff on the web I click on once to watch but have no desire to save. And almost nothing I really want to watch is in Windows Media format, so I don’t have to install that viewer on my Mac. I like watching DVDs, but will happily avoid upgrading to newer formats for the indefinite future. (And I’m cautiously optimistic that the general public will be slow to adopt either of the new formats.) I haven’t yet grappled with the whole backup issue: my DVDs aren’t showing signs of age the way my CDs are, and disk space isn’t quite cheap enough for me to want to back them up wholesale anyways.

I suppose I should look on the bright side: no matter what, we live in a much better world now than when we had to deal with LPs, cassette tapes, videotapes. New media are much more robust, much easier to copy, much higher quality, much more broadly available. And the current RIAA leadership will retire eventually.

bad itunes

September 16th, 2006

I just upgraded to iTunes 7; I wish I hadn’t. Downloading fancy album covers is nice (though I wish I hadn’t had to give a credit card number to do so); the reflected images of the album covers are a bit much, but whatever. We’ll see how gapless playback works.

Unfortunately, it turns out that a certain piece of core functionality is broken: as far as I can tell, there is no way to tell it to automatically sync most podcasts with my iPod but to let me manually manage some of them. The previous mechanism that I’d been using for this (only sync checked episodes) has gone away. (The checkboxes themselves are still there; maybe the option will be restored in a future update?) The user interface lets me drag and drop episodes onto the iPod, which ought to work just as well, but the episodes don’t actually show up there.

So: buggy software. If it were free software, I’d probably be able to easily find some place I could get a definitive answer and/or file a bug; with Apple, there seems to be no way to do that. There do seem to be customer forums where I can whine; not clear that doing so will help, but who knows. I do hope there will be an update fixing this soon; if not, I guess I’ll look into free software for managing the beast.

go small companies

September 13th, 2006

Two e-mails I received today:

An e-mail from CD Baby with the name associated with the From address given as “CD Baby loves David”, saying (among other things) the following:

Your CDs have been gently taken from our CD Baby shelves with sterilized contamination-free gloves and placed onto a satin pillow.

A team of 50 employees inspected your CDs and polished them to make sure they were in the best possible condition before mailing.

Our packing specialist from Japan lit a candle and a hush fell over the crowd as he put your CDs into the finest gold-lined box that money can buy.

We all had a wonderful celebration afterwards and the whole party marched down the street to the post office where the entire town of Portland waved ‘Bon Voyage!’ to your package, on its way to you, in our private CD Baby jet on this day, Wednesday, September 13th.

I hope you had a wonderful time shopping at CD Baby. We sure did. Your picture is on our wall as “Customer of the Year”. We’re all exhausted but can’t wait for you to come back to CDBABY.COM!!

I am amused.

And Payseur & Schmidt e-mailed to double-check that I received a book I’d ordered a month ago. Apparently they’d been having problems with their ordering system, information had been lost, and they wanted to make sure I’d gotten my order. (And, if not, ship me a replacement ASAP.)

Nice to see people who care.

random links: september 10, 2006

September 10th, 2006

new super mario bros.

September 10th, 2006

I would not be the person I am today were it not for Mario. At some point in grad school, a friend of mine gave me an NES (during the mid-1990s; I was a bit behind the times), and the original Super Mario Bros. and the third in the series completely blew me away, starting me down the path to a console gaming addiction that I have yet to emerge from.

I skipped straight from the NES to the Nintendo 64, however, so I missed all the SNES classics. So I was happy to see Nintendo republish some of them on the GBA. But when I played the GBA version of Super Mario World, I didn’t think so much of it. I’m sure it was impressive enough at the time, but I’d gotten used to relatively open-ended 3D worlds instead of sidescrolling 2D worlds, and it just wasn’t the same.

But then they came out with a new 2D Mario game for the DS, it got good reviews, and I needed games to play while on vacation. So I picked up New Super Mario Bros., and it turns out to be rather charming.  A little more polished than the earlier games, in an understated way.  They stripped out many of the items that were already a bit overwhelming in Super Mario Bros. 3; they added a couple of new items, but you can safely ignore both of them unless you’re in a completist mood.  (In which situation they’re probably quite welcome.)  The gameplay is still solid, and at a good difficulty level for this modern era, where we don’t have to use excessively difficult gameplay to cover up for the shallowness of other aspects of the game design of two decades ago.

So: score one for well-done nostalgia.  I’m not planning to devote more time in the near future to 2D platformers – I’d be happy to wait another decade before giving one a try – but I’m glad I finished this one. Not, to be sure, finished in the sense that I used to try to finish video games – I beat the final boss (“spoiler”: after you finish the castle in the last world, a few more levels will appear, and you’ll probably want to have at least 5 coins to create a save spot in the middle), but I didn’t look in every nook and cranny. Or even most nooks and crannies – there were two whole worlds that I didn’t play at all, and that’s fine with me.

Another surprise: it comes with fifteen or twenty minigames (that’s the single-player count, I didn’t look at the multiplayer ones), and those were surprisingly enjoyable. So if you’ve taken the game on a ride or something, have just hit a save point, and aren’t sure you’ll have enough time to make it to another save point, then give the minigames a look.

The minigames also gave me some serious Super Mario 64 nostalgia – several minigames used snippets of music from that game, and it all came flooding back. Sigh. I still don’t understand quite how Nintendo fell from platformer dominance, when it ushered in the 3D platformer era in such a spectacular fashion: no more followups on that platform, no followup even for the Gamecube’s launch, and when we finally got one, it was rather underwhelming. It speaks to Nintendo’s strengths that, for their core franchises (Mario, Zelda), they don’t churn out slightly modified versions every year or even every two years, but they went too far with that one. At least Super Mario Galaxy is getting a bit of a buzz, but even there I can’t say I’m optimistic – it looks like too much small-scale, maybe even gimmicky, gameplay to me. We’ll find out some time over the next year or so, I suppose.

nintendo 1, sony -1

September 7th, 2006

(Yes, I know that none of my readers care about this. Sorry.)

Two recent news items:

Weird. I’m used to thinking that Sony knows what they’re doing while Nintendo is kind of incompetent; guess not. Who would have thought that the strategy of graphically underpowered consoles with weird control schemes would turn out so well, while the strategy of turning your console into a vehicle for your movie branch’s preferred format would get off on such a bad foot?

Not that this necessarily means anything in the long term – Sony still has solid third party support, and presumably they’ll learn how to make lasers eventually. Plus, Nintendo is still more than capable of screwing up big time. But one more piece of bad news – bad manufacturing quality, defection of a major RPG, or something – and momentum really could shift in Japan.

it’s not luck

September 4th, 2006

Today’s book: It’s Not Luck, the second of Eliyahu Goldratt’s business novels. Which I actually read after the third; cleared up a few issues, but the reading order didn’t matter too much. (I would recommend starting with The Goal, though.)

This book presents some thinking tools for analyzing situations that confuse you, where you’re stuck in a bind and don’t know how to get out of it. On the surface of it, this doesn’t have much to do with other aspects of the Theory of Constraints; there may be some sort of deeper pattern going on here, though. After all, agile methods are usually linked with retrospectives, and root cause analysis / five whys is part of lean (as are other thinking tools, for that matter), so it would seem that methodologies that I’m currently interested in each recommend some sort of disciplined introspection.

Certainly something that I’m interested in these days: one of my problems is that I sometimes leap to canned answers of the “right” way to handle a given situation, which has obvious (and less obvious) flaws. Don’t get me wrong – the opposing attitude of “it doesn’t really matter how you do things” is also a real loser, but I/we could use some help looking closely enough at my/our concrete situation to figure out how to improve. I’m glad we started doing retrospectives before the teams changed; I don’t think I want to revive that quite yet, but hopefully we’ll be able to have one by the end of the month. And the ToC thinking tools might well be a good match for resolving issues that I have personally; if nothing else, they seem admirably concrete. (I imagine there’s also a book explaining lean thinking tools in concrete terms, I just don’t happen to have read it.)

I also liked the comment from this book (on one of the more elaborate tools) that it’s not that hard, or even that time-consuming, to do; you just have to force yourself to do it. I certainly have experience with watching myself shy away from doing things that I know are important, for all sorts of unsatisfactory reasons. (And perhaps occasionally for satisfactory reasons, but never mind that.)

Some of the examples that they worked through led to interesting places, too. The “local optimization is the root of all evil” example was a bit too canned for my preference (though the methods that they used to come to that conclusion were interesting). But I did like the idea that businesses can benefit much more from segmenting markets, including segmenting previously undifferentiated markets, than they currently do. One point of view that lean books rightly attack is the cost view that the appropriate price for a product is its cost plus a reasonable profit margin: the market is under no compulsion to agree with you that a price determined in that manner is worth paying, so if you can’t convince them that a product is worth what you charge for it, any amount of whining about a reasonable profit margin is useless. Instead, find ways to tweak your product offerings so as to maximize their perceived value; if you do it right, and if other people can’t easily copy you, then your profit margin can be quite a bit larger than what might a priori seem “reasonable”.

In this light, Sun’s recent strategy of selling hardware while giving away software, or giving away hardware while selling support for software, or providing subscription plans for hardware, or probably other variants that I’m forgetting, starts to make rather more sense. Find ways for customers to choose the package with the highest value for them, do so in ways that almost no other companies can easily copy, and do so in ways that put excess capacity to use, and you will start making money. Sounds good in theory; we’ll see if recent positive news is a sign that this strategy is starting to gain traction, or if it’s just another mirage.

two weeks with new team

September 1st, 2006

As I mentioned at the time, one of my fellow managers gave notice two and a half weeks ago, with the result that his team got combined with mine.

Interesting couple of weeks. I had some ideas about how I might handle the transition, but they mostly got blown out of the water, for two reasons:

  • One of the members of the other team also gave notice (for apparently unrelated reasons).
  • We had some unusual high-priority interrupts. Of a positive nature, fortunately, but it did mean that normal planning went out the window.

Which, in its own way, turned out to be a good thing, because it admirably focused all of our minds. Rather than worrying about how the cultures would fit together, or having philosophical arguments, it was clear to all of us that we had to focus on doing two things as quickly and efficiently as possible: gathering what knowledge we could from the two departing programmers, and servicing the interrupts. Don’t get me wrong, I very much wish that both the departing programmers were staying with us, but at least their departure gave us a clear goal; and the high-priority interrupts were all for the good, because they let us work together towards an important concrete end.

And those efforts were, as far as I can tell, quite successful. We all worked more than normal, and had to overcome problems in both teams’ former domains; but none of us had to work so hard as to burn out, and the code in both domains was very much up to the task at hand. There were several opportunities for cross-team collaboration, too. So we know each other a little better (not that we didn’t know each other before – we’ve all been working in the same group of cubicles for the last few years, talking and eating lunch together all the time), and we’ve shown that we can get stuff done by working together.

Having said that, there’s a lot of stuff that didn’t get done. There were several things that I really would have liked to start two weeks ago that I only got around to starting today. (One-on-ones, for example.) We only just got started on September planning; honestly, I’m not completely sure we’ll get a good monthly plan in place until October.

What planning meetings we’ve had were instructive. In the first meeting, we tried to do full card estimation in the same manner that my old team had. And that took forever, for a few reasons: half the group was unfamiliar with what was involved in any given task, deciding between a half-point and a whole point took too long on many cards, and we went off on tangents. So in our second such meeting, we gave up on points, instead identifying candidate cards as either “too big” or “not too big”; we hardly went off on any tangents; and things went much more smoothly. But we still have a lot of task breakdown ahead of us, and very little estimation backlog built up, with the result that we might have to just play it week by week for the time being. (Another contributing factor is that more high-priority interrupts of an unpredictable nature are coming.)

Everybody is happy with daily standups, as far as I can tell. A reasonable amount of pairing going on. A reasonable amount of people working on tasks that had, in the past, belonged to the other team, though demarcations are very clearly present. I’m still managing to get my hands dirty occasionally.

Don’t get me wrong, we still have a lot of challenges ahead of us. But now that I’ve finally been able to take a bit of a breather, I’m optimistic.

toc vs. jit

August 28th, 2006

I just finished another one of Eliyahu Goldratt’s business novels on the Theory of Constraints. I didn’t lose sleep over it the way I did with The Goal, but it was quite good. And useful to see ToC applied to product development situations, instead of just manufacturing situations.

One thing that caught my eye: not only does it speak somewhat ill of Just in Time, it lumps the latter together with assembly lines. Only somewhat ill, to be sure: assembly lines and JIT are both successful in limiting the amount of inventory that can accumulate between stages, limiting the waste of overproduction. TOC agrees that that’s a good idea in general; but it’s really focused on alleviating the ill effects of bottlenecks, which means that while it’s happy to have production stages that aren’t involved in bottlenecks be idle when not needed (as JIT supports, but perhaps assembly lines have a harder time with?), it wants to make sure that there’s material ready to allow bottleneck production stages to be active whenever possible. Mind you, TOC doesn’t want pre-bottleneck stages to produce for the sake of production, but they definitely shouldn’t run the risk of letting the bottlenecks be idle.

So: who’s right? TOC or lean (of which JIT is a component)? I’m pretty sure I’ve run across favorable mentions of The Goal in lean discussions, so I doubt they’re too strongly opposed. (It would, of course, be inconceivable that both are wrong: I couldn’t possibly be obsessed with two areas of thought that are both mistaken.)

Of course, lean isn’t just JIT: there are other tools in there. And doubtless part of the answer is that, because of the increased predictability that lean’s high quality gives you (via its other tools), the need for TOC-style production buffers is lessened. And kanban, the technique by which JIT is maintained, could probably be used to get TOC-style buffers. Kanban is the pull-style mechanism where each step requests new material from its predecessor by handing it containers to fill. The simplest example is one-bucket kanban: if you have a sequence of steps where each step requires one item from its predecessor, then this is the situation where each step keeps one piece of inventory on hand. That way, if a request comes in from its successor, it will be able to quickly build what is required. (While, at the same time, putting in a similar request to its predecessor, to make sure its one feeding bucket remains full.)

This works great if all steps take the same time, but sometimes different steps take different times. In those situations, you can add buckets as necessary. Or, I suppose, remove buckets – for example, Amazon normally ships books to me a little faster than I want to buy them, so while I normally use one bucket, I don’t always worry about keeping the bucket full.

If we wanted to adapt this to the TOC concerns, then, I suppose they might recommend that your bottleneck should have more buckets preceding it than would otherwise be strictly necessary to fill that step’s demands: you’re accepting excess inventory before the bottleneck in exchange for reducing the risk of your bottleneck not working at full capacity if something goes wrong. Given TOC’s focus on bottlenecks, that seems like a reasonable tradeoff.
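
To make that tradeoff a bit more concrete, here’s a toy simulation of the sort of thing I have in mind: a fast upstream step that occasionally hiccups, feeding a bottleneck that takes twice as long per item, with a varying number of buckets in between. All the numbers (step times, hiccup odds, bucket counts) are made up purely for illustration, not anything out of the book.

    # Toy model: how often does the bottleneck sit idle, as a function of
    # how many kanban buckets sit in front of it?
    def simulate(buckets, ticks: 10_000, seed: 1)
      rng = Random.new(seed)
      buffer = 0                    # filled buckets waiting at the bottleneck
      upstream_free_at = 0
      bottleneck_free_at = 0
      idle = 0

      (1..ticks).each do |t|
        # Upstream: only works when an empty bucket is waiting (pull), drops
        # an item in, then takes 1 tick to be ready again -- or 5 on a hiccup.
        if t >= upstream_free_at && buffer < buckets
          buffer += 1
          upstream_free_at = t + (rng.rand < 0.1 ? 5 : 1)
        end

        # Bottleneck: 2 ticks per item; idles whenever its buckets are empty.
        if t >= bottleneck_free_at
          if buffer > 0
            buffer -= 1
            bottleneck_free_at = t + 2
          else
            idle += 1
          end
        end
      end

      idle
    end

    [1, 2, 3, 5].each do |n|
      puts "#{n} bucket(s): bottleneck idle for #{simulate(n)} of 10000 ticks"
    end

Running that shows the expected pattern: with a single bucket, every upstream hiccup starves the bottleneck for a few ticks; with a few extra buckets, the buffer absorbs almost all of them.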

I’m not sure that lean would agree, though – they would probably see this as a failure, and focus instead on improving the predictability of the bottleneck’s predecessors. Given the apparent effectiveness of lean’s tools for improving the predictability of manufacturing, that sounds like a reasonable response to me. Non-lean shops might want to focus more on the bottleneck’s buffer, though. (Don’t get me wrong, buffers in TOC situations are much better than buffers that other methods find acceptable.) Also, lean covers more than just manufacturing; maybe it takes more of a TOC-style approach in, say, product development? Honestly, I have no idea.

So: what does this mean about my situation at work? Again, I have no idea: I’m having a hard time finding the sequence of steps that will enable me to even ask the question of what the bottleneck is. For now, probably the best thing is for me to watch and figure out where something inventory-like is piling up, because it will naturally appear before bottlenecks. If the bottleneck is under my control, though, then probably the right thing to do isn’t to focus on the chains leading up to it: the right thing to do is instead to eliminate the bottleneck by, say, increasing the number of people who can work on that area of code. (If it is an area of code.) So lean over TOC for me.

Kanban calculations have started to appear. For example, one of the questions that my team, in its previous incarnation, had recently been asking was how large a task backlog we needed from my boss. We’d been doing a monthly planning cycle with my boss, combined with internal weekly planning cycles; it’s not clear that’s a great mix, though. If we really only had a month of cards then, at the end of the month, we’d be mostly done with them, and might be unnecessarily idle because we wouldn’t quite have enough to work on. So, at any given time, it’s useful to have, say, an extra week of backup cards built up. There’s also a problem at the start of a month: we have a whole month of backlog then, and that can actually be a bit much – sometimes, for example, we had a hard time accurately predicting how long it would take us to do a card a few weeks off, because we needed to get our hands dirty with the preparatory work first.

So the lesson seemed to be that two or three weeks was about the right size for our kanban bucket: that’s best for production leveling, keeping us working at our maximum productive pace. And, actually, I suspect that that size is a bit too large – we need a bit of a buffer in case we’re more productive than expected over the course of a week, but lean suggests that you don’t want buffer whose purpose is to make sure that people still have something to work on when, for whatever reason, they’re not suited to the tasks that would otherwise be higher priority. You should instead see the latter as a bug, figure out its root cause, and try to eliminate it.
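
The arithmetic behind that sizing barely deserves code, but here it is anyway, with placeholder numbers rather than our real velocity, just to make the bucket-size idea concrete:

    # Hypothetical numbers, purely to illustrate the sizing above.
    cards_per_week     = 8      # rough team velocity
    weeks_until_replan = 1      # we refill the backlog at weekly planning
    buffer_weeks       = 1.5    # slack in case we finish faster than expected

    target_backlog = cards_per_week * (weeks_until_replan + buffer_weeks)
    puts "keep roughly #{target_backlog.round} estimated cards ready"  # => 20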

Lots and lots of stuff for me to learn. Lean concepts are starting to fit together for me, but I’m also becoming increasingly aware of just how little I really understand lean. And how little I really understand several other things…

i do not like first-person shooters

August 27th, 2006

I used to like first-person shooters – in the distant past, I seem to recall having enjoyed Doom, Marathon, System Shock, Dark Forces, and GoldenEye, for example. The last few times I’ve played FPS’s, though, they really haven’t done much for me. I’m not sure why the big change – part is probably that I have other ways to get my graphics fix, and part is probably that I have access to a much wider range of gameplay styles these days, so have a better idea what my tastes are. (Most of those I played before I became a console gamer.) Or maybe those games were just better, or at least had ideas that were new at the time, which isn’t the case for more recent FPS’s that I’ve played – Doom and System Shock certainly qualify on that score, and Marathon and GoldenEye probably do as well.

At least that’s how I feel about the single-player mode. The last time I tried out an FPS in multiplayer mode, I really enjoyed it, and I have no reason to believe that wouldn’t still be the case. My fave was the scripted multiplayer scenarios for Perfect Dark; have more recent FPS’s adopted that as well? As you might guess from that last sentence, however, I unfortunately almost never have a chance to play multiplayer video games these days – I just don’t have that many videogame-playing friends, and for various reasons I don’t play video games online.

Anyways, during the summer game lull I was looking for something to play. And everybody’s been talking about the Halo series for the last five years, so I felt somewhat uncultured at not having played either of them. So I decided to give the first game a try.

I started off on the Normal difficulty setting. (As opposed to Easy, Hard, or Legendary, if I’m remembering correctly.) Which seemed like a reasonable choice for the first couple of levels – I had to take a little care, but really it wasn’t very hard getting past any of the sections.

When I got to the third level, though, my brain gave me its first warning sign. You start off the level with a sniper rifle; rather than thinking “how nice of them to be varying the game play like this”, my reaction was “crap, that means that I have to spend time moving slowly through the level and sneaking around”. Still, I more or less enjoyed the first part of that level.

The later part of the level had some large areas with a fair number of waves of enemies for you to kill, dodging in and out behind pillars and boxes and such. By the time I got to the largest such room, I was a little low on health, and didn’t manage to make it through the room before I had to go to bed. I hadn’t saved, fortunately, so I wasn’t in a big hole, but it did mean that, the next time I played, I had to replay a fair amount of the level, and be rather more careful.

Which I didn’t want to do; the level was well enough designed, but I just wasn’t enjoying it. I thought about stopping right then, but I remembered the easy setting, so I decided to give that a try.

They made me restart the level from the beginning (why?), which was a bit of a bummer, but my disappointment was quickly erased by the fact that they really weren’t joking when they called that setting “easy”. There were still a few places where you had to take a bit of care, but, most of the time, you can just blithely gun your way through without taking a scratch. Which turned the game into pleasant enough light entertainment, and let me see all the plot points. Honestly, I sometimes wondered if it made the game too easy – maybe there was too much of a gap between Easy and Normal – but later levels brought the difficulty back up to a better level.

Speaking of plot points: people talk about how great the plot is, but I just don’t see it. The plot does exist – we’re not talking about Doom here – but any decent RPG would have ten times as many twists and turns, and FPS’s I played a decade ago had at least as strong a plot as Halo. To me, the single-player game feels like it’s largely an add-on to the multiplayer game – there are rooms all over the place that are filled with platforms and levels and hiding spots that make no plot sense and don’t make much sense in the single-player game, but would probably be a lot of fun in multiplayer.

The game mechanics are solid. The vehicles are a nice addition. Restricting you to two weapons is a surprisingly good idea. Your suit has a regenerating energy field which means that you can recover from small amounts of damage, so you don’t have to be too much of a perfectionist when dealing with small numbers of enemies at a time. Not many different kinds of weapons, but they’re well-chosen, so that’s a plus instead of a minus.

All in all, I’m happy I finished the game, but I’m certainly not rushing out to play Halo 2. Maybe I’ll reconsider when Halo 3 comes out. Now I’m down to just being in the middle of one game, and I’m almost done with that. Which is good timing – Okami will come out in just over a week, the Wii will launch in approximately two months with at least one must-play game (the next Zelda), and if that proves not to be enough, Civ 4 is now out for the Mac.

random links: august 26, 2006

August 26th, 2006

indigo animal

August 26th, 2006

Nevertheless, the beauty of lawn statuary comforts this perhaps overly-serious animal.

andy bechtolsheim on thumper

August 25th, 2006

I seem to be in a quiet mood these days. But I did like this video from my CEO’s blog. Andy is an excellent geek.

I suppose I could even try joining the modern world and embedding it.

Gee, that was easy. Go web 2.0, or something.

expanding team

August 16th, 2006

One of my fellow managers has resigned, and rather than hiring another manager, we’re merging his team with mine. So my team has more than doubled in size. It’ll be interesting to see how things play out, and to see if I’ve learned anything about managing over the last couple of years; I’m looking forward to it. Too bad it will reduce the amount of time I spend programming, though. (It shouldn’t eliminate it entirely, fortunately.)

new aim digs

August 15th, 2006

My, the design for the new digs for the American Institute of Mathematics looks posh – quite a change from the Palo Alto Fry’s building. Here are renderings from one side, another side, and a rendered flyover video. I will definitely have to wander over the first time Jordan goes to an event there after it’s been built.

This page, though, confuses me – the writeup mentions a “large building” and “octagonal building” with pictures of buildings that seem, while nice enough, much less dramatic. And, in the pictures, those buildings seem to actually exist, but I thought AIM was still running workshops at Fry’s? And the writeup doesn’t talk explicitly about the castle-style building, even though it shows pictures of it. (I suppose it’s possible that the large building could even be remodeled into the castle; it looks unlikely but not inconceivable.) So I don’t understand how many buildings are envisioned, when they’ll be built, and whether all of them really will be used by the institute. (As opposed to the other projects that Fry has at the site – maybe the castle will turn out to be a really big clubhouse for the golf course…)

what to do next?

August 13th, 2006

I’ve finished the last important code cleanups in my dbcdb code: I removed some proxy objects that had been used for lazy loading. I was really surprised to see how much that cleaned up certain aspects of the code: my Entity objects’ constructors got a lot cleaner, useless attribute setters/getters were removed, and in general responsibilities were greatly clarified – the Entities’ only job is to convert from SQL to HTML.

Which brings me to a pause point. I’ve met some of my objectives: gotten a little more practice with Java and HTML, learned a little about SQL and CSS, and provided an alternate linking structure to use in the blog. Nothing earthshattering, but it’s been of some modest use to myself. And there aren’t any obvious gaping holes to be filled.

So it’s time to take stock and figure out what to do next. For a while, actually, I was considering taking the time I’d been spending on this and using it to learn Japanese instead. (Which would actually take rather more time, but never mind that.) With some regrets, though, I’ve decided that isn’t the best course of action right now. My best guess is that I’ll be looking for another job about two years from now. (With a huge margin of uncertainty, of course.) And, while I’m not sure what I’ll target in my search, I would like to have as many options as possible; to that end, spending more time broadening my skills could be of some use. Exactly how much use isn’t clear – having been on the other end of the resumes, I realize how easy it is to reject candidates whose professional experience isn’t exactly what you’re looking for – but it’s worth a shot. So I’ll want to keep this up for another year or so. (After which, I hope to have enough time to take a break and learn Japanese. But who knows what the future will bring.)

So, given that I’m not going to stop now, what next? Rewrite it in Ruby, for one. I’m starting to chafe at Java more and more: just today I ran into a few places where I could have used a lambda, and a few places where static typing was being mildly annoying. So I’ll start by rewriting the CLI tool in Ruby, then rewrite the HTML conversion part in Ruby. After that, I’ll generate the web pages on the fly instead of statically, using mod_ruby. (I don’t plan to learn Rails for now: I don’t have any good applications for that in mind.) After which, who knows; maybe I’ll stop there, maybe I’ll convert the editing tool from a CLI application to a web application. Maybe I’ll play around with web services, scraping book information from Amazon. Hard to say.

The immediate next step isn’t entirely clear. I’ve read/skimmed the Ruby book, but it hasn’t all sunk in; clearly I need to get my hands dirty. And I need to learn how to use Ruby to interface with a database. (Maybe the book talked about that; I skimmed the library section.) It’ll probably take a few months to have anything to show there; I also have a bit of unit-test library cleanup that I’ve been putting off. So don’t be surprised if I go quiet on the programming front for a little while.
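
As a first, tiny step in that direction, here’s the kind of thing I have in mind for the database end: the sqlite3 gem talking to a database and spitting out an HTML fragment. The choice of SQLite, and the table and column names below, are invented for the sake of the sketch; they’re not dbcdb’s actual schema.

    # Sketch only: assumes SQLite and the sqlite3 gem; the books table and
    # its columns are made up, not dbcdb's real schema.
    require 'sqlite3'

    db = SQLite3::Database.new("dbcdb.db")

    # One row in, one HTML fragment out -- which is all an Entity should do now.
    row = db.get_first_row("SELECT title, author FROM books WHERE id = ?", 1)
    abort "no such book" if row.nil?

    title, author = row
    puts <<~HTML
      <div class="book">
        <h2>#{title}</h2>
        <p>by #{author}</p>
      </div>
    HTML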

dbcdb: improved compound author links

August 12th, 2006

I’ve deprecated the old compound author pages – they’re still there, but now nobody links to them. Instead, pages for books written by multiple authors link directly to the individual authors’ pages.

A matter of changing a couple of lines of code. (Though all of my acceptance tests passed unchanged after that – oops. That has now been fixed.) I’ll probably eventually improve the database design as a result of this, but doing so is hardly urgent.

lean employer-employee relations

August 10th, 2006

One thing that I never got around to blogging about when I first became lean-obsessed: Toyota never fires anybody. Or something like that; at any rate, one thing that lean bloggers claim is that, for lean manufacturers, employees are a fixed cost instead of a variable cost.

Which has interesting ramifications. In general, it’s a good deal more humane, which I certainly approve of. And it fits in well with constantly asking your employees how to improve matters: if both parties are committed to each other, then it’s natural to see your employees as resources whose insights are to be valued. (Similar to the way they treat their suppliers.) All to the good.

Having said that, there are a few things that I wonder about. In the first place, what if your business isn’t constantly growing like gangbusters? I know, I know, that’s a sign that you can’t possibly be doing lean right, but it still strikes me as conceivable that even the leanest of businesses might have efficiency improvements that aren’t linked to sales increases, or might be hit by an economic downturn that it can’t ride out solely by lowering prices enough to maintain market share. So what do you do? In the former case, you could simply give your employees the excess profits by paying them the same for working fewer hours. The latter case is harder.

At the very least, this suggests that there’s more to this production leveling idea than is obvious at first glance. Even if you have a surprise hit on your hands, a lean company may decide not to provide enough of their product to meet demand, because that could lead to a rate of expansion that would lead to overstaffing in less successful times.

That’s one problem; the other is rather darker. I don’t know what matters are like now, but if Womack and Jones are to be believed, not only will Toyota not kick you out, but you won’t be able to leave if you want to: salaries are set based on service time with Toyota, not overall skill or experience. So if you change companies, you start again at the bottom of the salary scale, which few people are willing to do. And that sucks.

So I’ll go with the Semco model in this regard. You get the same respect for employees, the same benefits of treating them as resources, but without the gilded cage aspects. And they get advantages from an entrepreneurial spirit, where employees not infrequently leave to form companies that turn out to be valued suppliers for Semco; victory all around. They don’t seem to manage their ups and downs quite as smoothly as Toyota does; that’s okay with me (makes them feel more human, at least), and they’re certainly leery about excessive expansion as well. And, I suspect, the highs and lows in Brazil were more extreme than they were in Japan, once the worst of the WWII effects wore off there.

grr

August 10th, 2006

If one is so eccentric as to use a Dvorak keyboard layout, it is easy to accidentally hit clover-Q with one’s thumb. Good thing Emacs has auto-save files; I should get in the habit of saving long e-mails while in the middle of drafting them, though…