phallos

April 17th, 2005

I just posted the following to delany-list (and the last paragraph is a reference to another post there), but I figured it could do double-duty here as well. For those who don’t know, I’m a huge Delany fan.

I finished _Phallos_ a couple of days ago. On the one hand, I quite enjoyed it. On the other hand, I’m rather disappointed by it, as well.

I first read part of it in a chapbook that came with, I think, 1984. From that, I got the idea that the book was about somebody searching for a copy of a book called Phallos. There was some summarizing and excerpting of the fictional Phallos (I’ll write _Phallos_ when referring to Delany’s book, and plain Phallos for the book referred to within _Phallos_), but I assumed that the chapbook largely consisted of that just because that was the easiest way to get a coherent extract of the full novel.

Unfortunately (as is fairly clear from the page count of the final novel), that summarizing and excerpting is pretty much all you get. It’s actually quite a lot of fun; I like the plot he’s made up for Phallos, I like the characters, and there’s typical Delany playfulness (intellectual and otherwise). But my first reaction upon finishing the book was “that was all well and good, but I wish he’d tried to write Phallos instead of a summary of it”. That would have been a book well worth reading. (Though I’m not sure he could have brought it off in the alleged 500-ish pages that Phallos is supposed to have taken up: just making the dirty bits explicit would probably have bumped up the page count most of the way to that level.)

Another possibility, suggested by my reading of the chapbook, would have been to add more of the outer story of the person searching for Phallos, as I’d initially assumed would have been the case. These days, my reading tastes don’t lean to quite that much layering, but I’m sure Delany would have done a great job with it.

Incidentally, as hinted at above, the explicit dirty bits have been almost entirely removed; there’s description of the dirty bits, but I wouldn’t call _Phallos_ pornography at all. (Certainly not in comparison to Hogg or The Mad Man or Equinox.)

I probably would have had a different reaction if this had come out a year after The Mad Man. But we’ve been waiting a decade for another piece of Delany fiction; I expected something a little more substantial than this.

I’m pretty curious where he’s planning to go next, fiction-wise. And I’m glad that On Writing is almost done. Though, these days, I find the idea of anything by Delany being in its “last edits” almost risible…

dessert presentation

April 13th, 2005

Miranda’s cooking skills are continuing to advance: she’s working on presentation now. Last night’s dessert started with a meringue cookie, surrounded by 5 gummi bears (in a regular pentagon around it, one gummi bear of each color that we have). This represented a flower; on top of the flower, she put marshmallows, which she said was snow covering the flower. And then she put sprinkles on the marshmallows, which represented sunshine glittering on the snow.

She is fabulous.

jak 3

April 11th, 2005

I just finished Jak 3, for the PS2. I didn’t play either of the first two games in the series, despite their getting generally quite good reviews: the first one was a platformer, whose only apparent distinguishing feature (as far as I could tell from reviews) was that the main characters had more “attitude” than, say, Mario. Which I have nothing against, but I was (and am) pretty burned out on platformers, and I’m looking for game design advances that go beyond simply updating a genre to appeal to 14-year-olds instead of 8-year-olds.

The reviews for the second episode didn’t make much of an impression on me, either. But by the third one, the series had apparently changed pretty seriously: the gameplay was much more GTA-style than platformer. At the time this game came out, I hadn’t played any of the GTA games, and felt uncultured because of that; Jak 3 seemed like a way to get exposed to the genre while still being allowed to play with Miranda watching.

And Jak 3 was a lot of fun! It’s one mission after another, constantly switching between genres, so you never get bored. The segmentation between missions is very subdued, so you’re constantly in the middle of action. (Impressively little in the way of load times, too.) And the action is very well balanced indeed: I frequently found myself dying, but my reaction to dying was almost never to get frustrated or to reach for an online guide, but to try that segment over and over again until I got it right. Which is pretty rare for me these days: I’ve lost my patience with games that have gameplay that I find gratuitously difficult, but gameplay where I feel like I’m getting better and am about to get past the trouble at hand is quite another matter.

So: you have my blessing to go out and buy the game; the fact that I’ve only written one paragraph about its virtues and am about to start kvetching about it doesn’t mean that it’s not quite good, it just means that its (relatively slight) flaws trigger more thoughts than its positive features do.

The cities in it are rather too linear for my tastes. In GTA, the cities have a big mesh of interconnecting roads, just like cities in the real world do. In Jak’s cities, though, there’s basically only one path from any place to any other place. The game does use this design to its advantage – the more urban city (it’s not a good sign that I can’t remember the names of the cities, is it?) is segmented into parts that you open up after certain missions, for example, allowing the scope of your travels to gradually expand. And it keeps you busy enough with missions that you don’t have the linearity of the cities rubbed in your face. Nonetheless, it is a flaw: we’ve gotten trained to expect and enjoy wide-open maps.

As I said above, the game keeps you moving from mission to mission, but sometimes you’d like to spend time just exploring. And there is something for you to do at times like that: there are these kiosks all over the maps (for no apparent reason) that give you mini-challenges. If you complete them, you can upgrade your abilities a bit; it’s quite possible to do just fine in the game without completing them, but it’s a nice bonus if you do.

The flip side is that it’s a good thing that it’s quite possible to do well without completing the kiosks, because some of them (in particular, the ones that I happened to try first) are really tedious. They show you a spot on the map, and give you a 15-second timer to get there. Which might be fun if locations on the maps had a bit more visual differentiation; as it is, though, all I knew was that I had to travel in some direction for 15 seconds, and while the maps are linear, they aren’t quite linear enough to narrow down the search space, so those challenges were nothing but tedious. Fortunately, not all the kiosk challenges were like that: some of them were fun platforming or racing challenges. But I got off on a bad foot with those challenges, and it took me a while before I gave them a second chance.

You have lots and lots of moves in the game: 12 different guns, a couple of different hand-to-hand attacks, 4 dark powers, 4 light powers. I can imagine a game that used that variety in order to give you a set of interesting tradeoffs at all times (though it would be hard to do); unfortunately, many of the moves I either almost never used or used only in the one place (right after acquiring them) where they were essential. I would have preferred fewer choices, with more situations where you’d use each move, and where a higher percentage of your moves would be interestingly applicable in any situation.

Some of the gameplay elements were a bit much. I’m trained to accept wandering monsters as a staple of video games, but even so, a desert chock full of cars which swarm to attack you yet are so weak that you can defeat dozens of them without breaking a sweat strained even my credulity. If the monsters had been a bit less realistic, I wouldn’t have minded (and, indeed, I didn’t mind the non-car wandering monsters elsewhere in the game that die in similar profusion), but something about those desert cars just bugged me.

Decent plot; certainly not as intricate as, say, GTA: San Andreas, or a good RPG, but not bad for a series with platformer roots.

I’m not sure what I think about all of the race-like segments where you have to go from one ring to the next in a fixed number of seconds. It’s not a bad gameplay idea, but I wouldn’t have minded some more traditional races as well.

A good length: I’m quite happy to have played through the whole game, but I’m also happy that it didn’t drag on by overusing its gameplay elements.

Anyways: good game, I’m glad I played it, and Miranda enjoyed watching me play it.

archetypes in video games

April 7th, 2005

Series work differently in video games than they do in, say, books. In books, a sequel continues the story of the previous book in the series. And, to be sure, there are a lot of video game series that follow that same pattern: I just finished playing Jak 3, and it did continue the story from the previous two games in the series. At least I assume so: there’s a lot of what looks like backstory, though I haven’t actually played the first two games in the series. (Which, come to think of it, is another difference between series in books and series in video games: I almost never read a later book in a series without reading the earlier books first, whereas I feel fewer qualms about ignoring earlier video games in a series; more on that below.)

There are also many video game series that follow another pattern that is familiar from books, though in books it shows up mainly in nonfiction. Every year, EA Sports releases another football game, just like, every year, tax guide companies print a new edition of their tax guide.

But not all video game series follow these patterns. (Do most? I’m not sure.) Take Zelda, for example. The events of one game don’t follow the events of another game: you have a new Link saving the kingdom from a new Ganondorf, with Princess Zelda implicated somehow. Sometimes they make a pretense of tying the events in one game to those in another, but it’s clearly a sham. So instead of having events in one game follow events in another game, you have later games repeating archetypes from earlier games: there’s an archetypal Link, an archetypal Zelda, and so forth.

This repetition shows up in all sorts of ways: the species repeat themselves as well (Zoras, Gorons, etc.), settings frequently recur (e.g. the Lost Woods), the key items recur, etc. And sometimes this can be quite emotionally powerful. For me, the most surprising instance of the latter is the effect of repeating musical themes from one game to the next: I turned on the recent GBA Zelda game, heard a familiar theme, and immediately felt happy. (On a completely different note, I wonder if video games could take a lesson from Wagner and consciously use leitmotivs in their music? They could even have the music change in response to your actions: if you use some item, say, then the leitmotiv for that item could work its way into the background music. Hmm…)

To be sure, the Zelda games aren’t as similar as I’ve made them seem here: Link is in all of them, but Zelda and Ganon aren’t. Sometimes they’ll alternate more canonical games with less canonical games, for example. But in every game, you’ll see many concepts familiar from earlier games in the series.

At first, I wondered if maybe Zelda was unusual in this regard, but I think it’s a fairly general phenomenon. Take the Final Fantasy games, for example: there’s no pretense of continuing a plot from one game to the next. In fact, the differences between games are so extreme that it can be hard to say why the games deserve to all have the same name at all. To be honest, I’m not sure what they all do have in common: I’m aware of some repetitions (chocobos, the airship guy), but if you plunked me in front of a random Japanese RPG, I’m not sure if I could reliably tell whether or not it was a Final Fantasy game. I suspect that’s more my ignorance showing, though, and that real RPG fans would have no difficulty explaining what makes an FF game different from other Square RPGs or Japanese RPGs from other companies.

So why are video games different this way? One reason probably has a lot to do with the pace of technological change. The differences between the capabilities of different generations of consoles are big enough that I suspect that, if you continued the story of a game from one console onto another console 5 or 10 years later, the contrast would be quite jarring. Modern consoles, for example, allow much richer stories than earlier consoles, with much higher levels of detail in every aspect of the game. So that probably encourages reinforcing concepts at a certain level of abstraction, instead of continuing the details of a story.

We might also look into reasons why people jump into video game series in the middle, because that encourages a focus on archetypes rather than continuing stories. If you’re only now curious about, say, Harry Potter, it’s easy enough to buy the earlier volumes, or just borrow them from the library. If you’re curious about the latest Final Fantasy game, though, it may be next to impossible to get your hands on, say, Final Fantasy III. (At least legally.) Even games from earlier in this generation of consoles may be out of print; games from earlier generations are likely to be both long out of print and for hardware you don’t even own. (Fortunately, console makers have started to make their hardware backward-compatible, an idea I whole-heartedly support. Admittedly, companies do reissue older games for newer consoles, which raises a whole host of questions, but not ones for this post.)

Also, if you want to read an earlier Harry Potter novel, it only takes a few hours. But if you want to replay an earlier video game, it will take up ten to forty hours of your life. I only have so much time to play video games; a series has to be pretty good to convince me to spend lots of that time on earlier games in that series. (Especially since I’m not one of the people who waxes nostalgic about earlier generations: I think there have been a lot of important advances in video game design over the years, and in general I far prefer games from recent generations to those from older generations.)

The age demographic may make a difference too, but maybe not; video game players are getting older on average, and it’s much more socially acceptable for people to continue to play video games as they age.

I wonder if books could profit from this same idea? Some books do play with this idea: for example, in Kim Stanley Robinson’s Three Californias trilogy, you see similar groups of characters in different settings in the different volumes. And, of course, mythology uses this idea all the time (e.g. the Coyote stories). Still, there’s probably more that could be done there.

computer unfortunateness

April 5th, 2005

I now have FC3 installed on my home and both work computers. So I spent the morning today upgrading packages on one of my work computers. It took forever, though, because the web proxy that I have to go through at work kept on corrupting data, as far as I can tell. Once I found a mirror site inside of Sun’s network, it went smoothly, but until then, it was like pulling teeth.

That was my desktop machine at work. I’d already finished upgrading my work laptop on Monday; it was in a somewhat iffy state, because I’d done a botched partial install last week (not believing that my CD’s could really all be bad), so it had a bad mixture of packages. I thought it was all sorted out, though.

Today, however, something else seems to have gone wrong with my work laptop: rpm (the package manager) seg faults whenever I try to invoke it. Which is a really unpleasant situation to have to recover from: I can’t just, say, install a fresh copy of it from a package, because I’d have to use rpm itself to do the installation! I have no idea what could have caused this (it was working fine yesterday, after all); maybe the hard drive in that machine is going bad? (It seems to be generally falling apart – last Thursday, its display was behaving very strangely, too.) The botched upgrade and the seg faults might convince me to just reinstall the OS from scratch: I almost never use that machine, so there’s none of my personal stuff on it.
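
(Before wiping it, though, I’ll probably try the standard folk remedy, which assumes that the segfaults come from rpm’s Berkeley DB database files being corrupted, rather than from a bad binary or a dying disk – a sketch, run as root:)

    rm -f /var/lib/rpm/__db.*   # clear rpm's cached database environment files
    rpm --rebuilddb             # rebuild the package database indices
    rpm -q rpm                  # sanity check: can rpm query itself again?

(If it still segfaults after that, the dying-hardware theory starts looking a lot more plausible.)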

And then, when I got home and tried to import some more music into my iPod, Windows complained about a corrupted file. After a bit of poking around, it turned out that the corrupted file was on the iPod itself. (Windows could have told me that in the error message…) Fortunately, the error message told me what to do about it, so I fixed it, and everything seems to be proceeding smoothly. I would chalk this up to the costs of jogging while carrying around a hard drive, except that I haven’t gone jogging since the last time I imported some stuff, and it worked fine then.

So I seem to be a computer disease carrier right now. Maybe I’m giving off really powerful magnetic fields, or something…

fedora core 3 is in tha house

April 3rd, 2005

It turns out that, if you poke around enough at the boot: prompt, you get told about an option for where to find the images to upgrade from. So, a couple of hours of installation later and a couple of hours more of package upgrades later, here I am. Yay.
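
(For the record, since finding it took some poking: the incantation is, if I’m remembering it right,

    boot: linux askmethod

after which the installer asks where the installation tree lives – CD, a hard drive partition holding the images, NFS, FTP, or HTTP.)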

No big differences are yet obvious. There are some different fonts in my browser, and a second toolbar appeared at the top of the screen (which I summarily dismissed); but everything seems to have proceeded smoothly.

Next time I’ll probably not wait as long after the OS is released before upgrading. And I’ll know that I only have to burn one image to a CD, and it doesn’t even have to work very well: it just has to work well enough to get into the installer.

miscellany: fedora, work weeks, ddr, pujols

April 2nd, 2005

I was hoping that I’d be using Fedora Core 3 the next time I wrote an entry here, but ’twas not to be: I can’t get the damn CD’s written. I’ve ruled out every variable I can think of, using two different CD writers, two different CD readers, two different sources of blank CDs, two different downloads of the images (both md5-summed). Of course, it’s the same images both times, but I doubt that the Fedora folks are distributing images that are incapable of working. I’m not sure what my hypothesis is now: I guess my laptop’s CDRW drive must be having problems, but that can’t be the only issue, otherwise everything would have worked fine at work.
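
(For the record, the verify-and-burn steps I’ve been using, more or less. The dev= address is a placeholder for whatever -scanbus reports on your machine, and this assumes you’ve also grabbed the MD5SUM file that sits alongside the isos on the mirror:)

    md5sum -c MD5SUM    # check the downloaded isos against the published checksums
    cdrecord -scanbus   # find the writer's dev= address
    cdrecord -v speed=4 dev=0,0,0 FC3-i386-disc1.iso   # burn slowly; folklore says fast burns fail mediacheck more often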

Sigh. Why do I have to burn them to CD’s in the first place? I guess now I’ll explore avoiding that altogether. Also, I forgot to mention another annoyance yesterday: even if this does work, I’ll have spent all this effort effectively downgrading many/most of the packages on my computer to their status at the time of FC3’s release, and then I’ll have to spend a few hours upgrading the computer again.

Counterpunch has been talking about work weeks recently; one of the least emphasized practices of eXtreme Programming is the forty-hour work week. A good practice, which I try to stick to in my team. I do wonder, though, how much the rule came to be because it’s a good idea for humane reasons, as opposed to a good idea for production reasons. Certainly the former is by far the most important reason for me: I love programming, I thoroughly enjoy my job, but I’m not about to let it start squeezing out my family life, or for that matter my own personal non-work intellectual interests. And Kent Beck does admit that those reasons are important to him, too. But XP does do a pretty good job of making a case for it in terms of production reasons: they work very hard to have people write the highest-quality code they can all the time, and it’s not too much of a stretch to imagine that you can’t keep that up if you regularly work evenings or weekends. (And, of course, they have a C3 project anecdote to back that up.)

And I’m not sure the “treat people humanely” argument is such a bad one, even in productivity terms. It’s expensive to replace people, in terms of cost, time, lost knowledge, and lost morale. I tend to think that those are the sorts of costs that get underestimated, and that a company that treated them seriously could get a real competitive advantage over other companies. (Also, while we’re on the subject, what’s so magic about a 40-hour work week? How about a 35, or 32, or 30-hour work week? The 40-hour work week was one of the great accomplishments of the 20th century; we should be ashamed that we’ve spent the last half-century going up instead of going down.)

We went to an open house at Miranda’s daycare today. In one of the rooms for bigger kids, they have a big TV with a Playstation and a couple of Dance Dance Revolution mats, so I watched some kids play that for a while. (One of them was pretty good, too, certainly better than I am.) I haven’t dragged out my DDR pad since we moved into this house, and I’m not sure why: it’s a fun game, and it’s even decent exercise, so I could have played it on days when I was supposed to jog but it was too wet for me to be thrilled about going outside. Actually, I expect that I’ll hit a bit of a void in video games soon: there aren’t a lot of games coming out soon that I’m excited about, and some of the recent releases that I am interested in are ones that I don’t want to play while Miranda is watching, which drastically cuts down the time when I can play them. (Especially since I’m still playing through Grand Theft Auto.) At one point I expected to make up this void by finally getting around to buying an Xbox, but now that the rumors are that the Xbox 2 will be backwards compatible, I’m holding off on buying that until I know for sure one way or the other.

Albert Pujols hasn’t struck out all spring. (.455 batting average, slugging over .900). Apparently this isn’t all that rare, actually: Eric Young did it just two years ago. Still, he’s really good. Nice to see Jordan last week; too bad that the A’s and O’s are opening against each other…

os upgrade and incremental development

April 1st, 2005

Last year, I upgraded this computer from Red Hat 8 to Fedora Core 2. It was a bigger OS jump than I would have perhaps liked (skipping two OS versions), but now I’ve stuck with FC2 for a while, even though FC3 has been out for several months. At first, I was planning to skip FC3 and jump to FC4 soon after it comes out, but now I’m planning to upgrade to FC3 soon (this weekend, hopefully).

The hope here is that OS upgrades are sufficiently painless that the benefits of taking small leaps, always using a supported OS (I’m not sure FC2 will be supported once FC4 comes out), will outweigh the fear and glitches that come from doing the upgrade. Now that I subscribe to fedora-announce-list, I realize that lots of FC2 package upgrades these days come in parallel with FC3 package upgrades. Which means that

  • The difference between FC2 and FC3 isn’t all that great
  • To the extent that the upgrades don’t come in parallel, the longer I wait the bigger the chance that switching OS’s (whenever I get around to doing it) will cause serious disruption.

On a related note, I’m becoming more and more of a fan of incremental software development every month. Why can’t we develop software in such a way that we always have a working version? And the truth is, as the eXtreme Programming people have taught us, we can develop software that way, and it’s really useful to do so. These days, at work, I get nervous if I have a modified source tree that doesn’t get checked in for two days straight (and if it goes longer than that, I see it as a sign that the code should be thrown away, instead of digging a deeper hole).

So why not apply the same philosophy to OS development, to OS upgrades? The version of FC2 that I’m running now is pretty different from the version that I first installed; every day or so, I do ‘yum update’, and a few packages get upgraded. And many of the updated FC2 packages are the same as the FC3 packages; I’m not sure how different the version of FC2 that I’m currently running is from a current FC3 installation, but I don’t think they’re all that different. So why not go whole hog and release all FC3 packages on FC2, eliminating the difference between the two OS’s?

The big issue is, of course, incompatible changes. If a key library changes its major version number, do you have to upgrade all the packages that depend on that? Or do you leave the old version in place, for old packages to link against? If the latter, when does the old version go away? And libraries are the easy case, because they have a built-in mechanism for having multiple incompatible versions installed simultaneously; other components don’t – if gcc suddenly changes from 3.3 to 3.4 and all your C++ code stops compiling, you might be a bit annoyed. (Or you might be grateful that your nonportable C++ code is being flushed out, of course.) I don’t think this is rocket science, though; I’d be unhappy if, say, 5 years from now I still have to do an OS upgrade in a big hunk. (Then again, it might not be with Fedora: it’s run by Red Hat, which has reasons to have non-incremental upgrades in their enterprise distributions.)
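
(The built-in mechanism being sonames: the major version is baked into the file name that the dynamic linker resolves, so incompatible versions can sit side by side. You can see it in action on any Linux box – libstdc++ is just a handy example here, and the exact versions you’ll see depend on your distribution:)

    ls -l /usr/lib/libstdc++.so*
    # typically shows something like libstdc++.so.5 -> libstdc++.so.5.0.x
    # next to libstdc++.so.6 -> libstdc++.so.6.0.x: old binaries keep
    # resolving the ".5" soname while newly built ones record ".6"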

So far, actually, I’m a bit stymied by just getting the CD’s burned: I keep on trying to burn CD’s, and they keep on failing the mediacheck stage. Sigh. Which is a perfect example of non-incremental upgrades getting in my way for stupid reasons. The release notes make me wonder if it’s possible to download the iso’s, mount them from the hard drive (instead of burning them first), and upgrade directly from that somehow. But they don’t give any instructions for doing all of that, and I don’t feel like figuring it out by myself (and I’m not at all sure that it’s possible).
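
(The mounting half, at least, turns out to be easy – a sketch, run as root, with hypothetical file names:)

    mkdir -p /mnt/fc3
    mount -o loop FC3-i386-disc1.iso /mnt/fc3   # read the image straight off the hard drive
    ls /mnt/fc3                                 # poke at the disc's contents

(Whether the installer can then be pointed at images mounted like that for an upgrade is exactly the part I haven’t figured out.)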

miranda cooking

March 30th, 2005

For the last month or so, Miranda’s been really into helping out with cooking dinner. I’m not quite sure what triggered it; part of it, I suspect, is that with her current bed time, she doesn’t get to spend much time with us in the evenings, and the best way to maximize that time is for her to help us with dinner, since we certainly can’t play with her while cooking! Also, the week before she started helping so much, the cooking segment at school involved her using sharp knives; this may have given her more of a sense of power and accomplishment. (We don’t let her use sharp knives at home, for what that’s worth.) (One of the many nice things about PACT is that kids get to do stuff like cooking – basically, whatever parents are interested in teaching, kids get to do!)

Actually, though, she’s been cooking for a while, and doing it much more creatively than Liesl and I ever do. She designs her own desserts, and they can be quite distinctive. The basic model is ice cream, chocolate sauce, marshmallows, and a couple of colors of sprinkles, but she quite frequently substitutes in other ingredients (chocolate bars, cookies, fruit, whatever else she thinks of). Not always the most coherent of dishes, but they’re fun to eat (and fun to help her with), and I’m really impressed with her desire to design them.

stan freberg

March 25th, 2005

I was just listening to a collection of Stan Freberg singles. Satirical musical comedy from the 1950’s; pretty good stuff, though it is, of course, somewhat dated.

Reading through the booklet that came with the CD, though, it’s amazing how much pop culture we’ve lost from only a half-century ago. A few of us have heard of Stan Freberg and like him; these records were real hits at the time, however, with (for example) St. George And The Dragonet / Little Blue Riding Hood being apparently the fastest-rising single in the history of the record business up to that time. (That time being 1953; on the other hand, it only spent 4 weeks at #1, so it was perhaps a bit of a flash in the pan even at the time.)

But he did more than put out a few comedy singles. (Apparently quite a lot, actually: there are 21 on the CD, but it claims that there are many more where they came from.) For example, it says that “He and Daws [Butler] wrote and performed as principal actor-puppeteers for a live half-hour show [“Time for Beany”] every weekday for the next five years. … The show was popular with all age groups, went on to win three Emmys and a Peabody”. This is a show that apparently produced hundreds of episodes (over a thousand?), and was well received, but I’ve never heard of it, and there’s almost no media available for it. (Are most of the shows still in existence, or have the tapes been lost?) I doubt the show has aged very well, but that’s still a real shame.

At least modern media is digital, so it’s much more likely that there are copies squirreled around somewhere. If only copyright law could get changed so that people could, say, legally get their hands on old video games that are no longer for sale. One of these decades…

school closure: one more year

March 23rd, 2005

The board finally voted last night. Actually, they voted on two things: they changed their vote of a month ago, and agreed to not close any school this year. And they voted on which school they would close next year: they’ll close Slater (my daughter’s school), PACT will move to Castro, but the rest of Castro will stay as-is (instead of moving the dual immersion program away from Castro or closing the neighborhood strand). They’ll try to get a third magnet program at Castro eventually.

All in all, I think the vote went about as well as I could imagine. I’m obviously quite happy that they’re not closing any school this year. I’m sad that Slater is targeted for closure a year from now; but I can’t honestly say that the proposal they approved isn’t the best one for the district as a whole. In particular, it’s the only proposal that actually had a positive vision for Castro, that didn’t treat Castro as a problem to be swept under the carpet somehow.

I hope that something will happen over the next year to remove the need to close any school then, too, though I can’t say that I’m optimistic. So I’ll have to do what I can to make PACT’s probable move to Castro a smooth one. But first, a break; it’s been a busy couple of months.

(A busy one for many people: I have been extraordinarily impressed with the way the Slater community behaved throughout this process. A lot of people worked very hard to get us this result; my heartfelt thanks to all of them.)

blogosphere

March 18th, 2005

Even though I’ve been blogging for half a year now, I get the feeling that I’m not doing it “right”, or at least I’m not doing it the way normal bloggers do. Whenever I read other people’s blogs, they’re usually taking part in actual conversations: I dipped into several blogs a couple of weeks ago, for example, and learned that apparently all the hip bloggers are supposed to have an opinion about Google’s Active Toolbar, and were linking to each other’s arguments, whereas I’d never heard of the thing. Oops.

I’m being flip, of course, but I really do like following the links in other people’s blogs; it increases the chances that I’ll run into something both interesting and unexpected. It reminds me of the early days of the web, when the web sites that were common cultural references were much more individual, idiosyncratic efforts (I still read Dr. Fun regularly…); there was always something new and neat around the corner, but if you kept on tracing through new stuff, you’d find references back to familiar ground. Kind of like Usenet: a huge amount of stuff there, with lots of subcultures, but you’d quickly recognize the regulars on the groups you read, and you’d occasionally see those same regulars in other, unrelated groups.

Still, my lack of links is largely just the way I am: while I do spend lots of time thinking about others’ works, those works aren’t particularly likely to be on the internet; the Amazon links that I provide are just a pathetic pretense of an attempt at electronic reference. Better to choose my topics based on what I actually spend my time thinking about, instead of what happens to be on the internet; there are lots of other people who do the latter much much better than I could.

But part of the reason was technological. When I read other people’s blogs, I often found them interesting; but the irregularity of their updates meant that I didn’t really want to add them to my list of links that I click on daily. (I actually did most of my blog reading at work, largely because Jonathan Schwartz, Sun’s president, has a good one which frequently links to interesting stuff.) This problem, however, has a well-known solution: RSS. I’d put off reading RSS feeds because Galeon, my browser of choice, doesn’t understand them, and I didn’t want to switch browsers. And I wasn’t sure that RSS reading fit most naturally into my browser: better, perhaps, to read RSS feeds in my mail/news reader, Gnus. And, while I’d heard about people using Gnus to read RSS, I couldn’t find it in the manual (as packaged with XEmacs, or maybe it’s Fedora Core’s fault).

A month or so ago, though, I got fed up with this situation, and did some poking around. It turned out that XEmacs was distributing a slightly out-of-date manual; when I looked at the version of the manual available online, it made it clear that the version of Gnus I was using really did support RSS. But when I followed the instructions in the manual, it completely failed to work! Fortunately, gnu.emacs.gnus came to the rescue, and a few G R’s (the command for subscribing to an RSS feed) later in my *Group* buffer, I’m subscribed to RSS feeds, and happily reading blogs regularly.

Not a lot of blogs, though. (As you can see: for now, I’m putting the ones I subscribe to on the links list on the right side of this blog.) I hear about RSS aggregators, but I haven’t yet felt a need for one. (Good thing, too, because I don’t know how to do that in Gnus, though it’s probably possible.) The list will probably grow, though, because there’s actually another weird feedback loop going on here: when I’m in an authorial mood, I log on at home more frequently than I used to, which means that I quickly work through my old, familiar list of regular links (and my regular list of newsgroups), which means that I’m looking for more stuff to read online. (Then again, I might put a damper on that feedback loop by, say, spending less time on the computer at home, or spending more of my computer time at home programming.)

school closure: second castro meeting

March 17th, 2005

Another meeting at Castro last night. Not too much excitement in the community comments. I did admire (?) the chutzpah of a certain group of parents in the dual immersion program who talked about how horrible it was to close a school in that community, and then floated a plan which would turn the school into a collection of magnet programs, closing down the neighborhood program that kids in the community actually attend. (Not that they couldn’t attend the magnet programs, they just wouldn’t get priority.) I liked the guy who talked about how nobody is talking about closing Huff, despite its being as segregated as Castro, even though leaving Huff open mainly helps about a hundred kids in its neighborhood, all of whose parents have multiple cars to drive their kids to school anyways…

I actually missed the most interesting part, which was the budget discussion and voting. The district’s finance director now no longer believes that they’ll be able to rent out a school next year if they close one. So the new budget doesn’t include any actual revenue from closing a school, and is mum on whether or not they’ll reduce costs by closing a school (and, for example, eliminating jobs).

Which is a ray of hope. If they close a school, I still tend to think that they’ll close Slater. (And I can’t say I have an informed opinion about whether it would be less harmful to close Slater or Castro.) But maybe the budget news will give one trustee an excuse to shift her vote away from closing a school next year…

down with the State

March 16th, 2005

When we last left our refactoring saga, I was regretting having done a State extraction too early, and was about to throw it out, doing some more class extractions first. Which is what I did, and it was clearly the right decision; I now have some significantly smaller classes, and they’re a lot easier to test. Not perfect yet: one, in particular, has a .cpp file with about 500 lines, which is larger than I’d like, and I’m not completely confident in my unit tests for that class. But it’s a significant improvement: one turning point is that the tests have been easier to write than I expected, rather than harder to write.

So yesterday, I tried again to do the State extraction. And, again, I’m going to throw that work away! But I’m definitely learning: this time, I only went two hours into the refactoring before throwing away my work, which is much better than throwing away three days of work. And, honestly, that two hours of refactoring was really useful, even though I’m throwing it away: maybe I could have noticed another class to extract without doing that speculative refactoring, but I wouldn’t count on it. Spending two hours and understanding my code better at the end of it sounds like a pretty good deal to me.

At this point, I am wondering if I will ever get to use State, though…

school closure: castro community forum

March 13th, 2005

At the last school board meeting about school closure, they put forward a plan where the school to be closed would be Castro instead of Slater. (The latter being the school that Miranda goes to.) They wanted to give Castro parents time to complain about this, so they’re holding a couple of community forums at Castro, the first of which was Thursday. (Excellent idea – if only they’d done the same thing at Slater…)

Pretty interesting. About 30 minutes into the forum, there was a huge flap set off (mostly) by the fact that the board wouldn’t allow bilingual speakers to do their own translation. They were providing translations of everything into whichever of Spanish and English the speaker didn’t speak in; and they insisted that all speakers go through the provided translators. Which led to an argument which ended with a five-minute break being called in the meeting. Apparently (I’ve subsequently learned) the genesis was that, in the past, people’s own translations haven’t always been accurate (and, in particular, have contained derogatory comments in the Spanish versions but not in the English versions), but the current policy seems like serious overkill to handle that issue: as it is, they’re guaranteeing that the translations are inaccurate. I would feel that way even if the translators were doing an excellent job; I’m sure they were trying their best, but they left a lot out, explicitly resorting to summarizing much of the time.

Anyways. A few people complaining about us Slater whiners. A lot of people talking about how wonderful Castro is. A lot of people talking about how awful an idea it is to close any school. (The last few weeks have seen the district’s financial officer say that she can’t count on being able to rent out a school next year if they close it, and have seen more projections of increasing student enrollment in a couple of years.) Several charges of discrimination. Right before the end were two very strong speeches by Slater teachers. One of the speeches might not have been the most politic in the world, but was interesting to me: our school district, the Mountain View-Whisman school district, got its ungainly name from the merger of two school districts three or four years ago; according to that teacher, the current behavior, motivated by budget fears and No Child Left Behind fears, is much more characteristic of the Whisman school district’s pre-merger behavior than of the Mountain View district’s pre-merger behavior. (And it’s not a coincidence that PACT, the program that we’re part of, came out of the Mountain View district.) The other speaker did a great job of pulling all our points together, switching seamlessly between English and Spanish, and bringing the whole room to its feet at the end.

One more community forum on Wednesday; a school board meeting the week after that. I think the decision is supposed to happen then, but I could be misremembering, and of course we’ve already seen that decisions don’t happen when scheduled.

processor speed

March 7th, 2005

I recently read an article by Herb Sutter that claims that the long rise in processor speed is finally coming to an end. I certainly believe that this is going to happen eventually, maybe within the next decade, because we do seem to be approaching some physical limits; I didn’t think that it was happening quite yet, though. Sutter does present some interesting evidence in favor of his argument: in particular, there’s a graph which shows that the CPU speed of Intel’s processors actually stopped increasing two years ago, and that, if the trends from before 2001 had continued, we’d now have CPU’s approaching 10GHz instead of less than 4GHz.

And it’s true, Intel’s march for higher CPU speed has stalled. The thing is, though, I’m not sure how much weight to give to that argument. My understanding is that, with the Pentium 4, Intel decided that people paid more attention to CPU speed than to other metrics of CPU performance, so they pushed chips’ clock speed even if, say, it sometimes took more clock cycles to carry out the same action. (Which is why AMD started marketing their chips by translating their performance to Intel’s instead of touting their own clock rate.) Given a choice between, say, a 2GHz Opteron and a 3GHz Pentium 4, I know which one I would take. So maybe Intel was playing tricks that are catching up with them now; I’d like to see graphs like that from other manufacturers.

And if you look at the Intel graph in that article, the current plateau isn’t the only change in behavior – at around 1995, the rate of clock speed increase actually increased. If you extend the older line instead of the newer line, then Intel’s current clock speeds don’t look at all out of line. And it does seem that other manufacturers will be hitting 4GHz soon – for example, the recent press releases about IBM/Sony/Toshiba’s Cell processor claim that it will reach that mark. (Admittedly, I’m not sure when it will be released, or how long after release it will take for 4GHz models to appear.)

Still, I do buy the larger point of the article, that to continue to get increased performance, we’ll soon need to switch to other techniques, of which the most interesting is multithreaded code on multicore processors. As a loyal Sun employee, I have to get behind this: my group at Sun is eagerly awaiting the release of dual-core Opteron processors, and Sun’s forthcoming Niagara SPARC processor is going to be a lot of fun to work with. I hope that, one of these years, I have an excuse to program in a multicore environment; my current software team does multithreaded programming, but we do it in a fairly naive way. (And there’s nothing wrong with that, to be sure: simple solutions are better, as long as they get the job done.) Programs are already marching less in lockstep and acting more like a collection of semi-autonomous agents; how far can we take this? Is the number of processors in a computer going to grow exponentially? Are the processors going to get simpler and less powerful while this happens, or is each individual processor going to support as complex a piece of software as those on today’s single processors? If so, it’s going to be very exciting seeing what complex dances the software on these processors trace out, and what unexpected phenomena arise from that.

Down with authoritarianism in software design; long live anarchist collectives!

go bibliography

March 2nd, 2005

I used to play go a lot, and I collected a lot of go books. In fact, by the time I was in grad school, I had copies of all but 10 or 15 or so of the go books that had ever been published in English. (Just under 100 at the time.) The web was relatively young; I decided to start a web site devoted to go books. It was a lot of fun; my first real foray into writing on the web.

When I was a postdoc at Stanford, I didn’t have nearly as much time to play go: sometimes I would try to go to the local club every other week, but much more frequently I wouldn’t show up for months at a time. (My hands don’t allow me to play go online: I can type fine, but mouse usage kills me.) The go bibliography started to slip a bit: whereas before I got and reviewed each new book within a couple of months of publication, my goal was now to not fall more than a year behind. Which I was more or less able to do: I took the bus to and from work each day, and I often read go books on the bus rides.

I also found other things that I wanted to write about. For a little while, I had some pages on teaching; when I got this computer, I put up some pages about the process of getting it set up. I never found the time to keep them up, though; in fact, sometimes they never got far enough for me to publish them to the world at all (e.g. some pages on video games).

When I started work at Kealia, though, I stopped taking the bus to work (since there wasn’t a convenient route), which ate into my book reading time: and go books certainly aren’t my highest reading priority. And, after thinking about it for a while, I decided that while I did miss having an excuse to occasionally write something for public consumption, I didn’t really miss writing about go books. As my other abortive efforts made clear, though, I probably shouldn’t plan on writing about any other specific theme: anything too formal would pose a high enough barrier that I wouldn’t update it regularly, and my interests change frequently enough that a single-topic site would die pretty quickly.

But with blogs mentioned in newspapers almost daily, it was pretty obvious what I should do. So here I am. It’s sad to think that I may never add another review to the go book site, but such is life. (I’ve asked other people to contribute reviews: I don’t mind doing a bit of work on the site, if other people can help.) To be sure, I don’t really have an idea how long I’ll keep up this blog, but it’s lasted for half a year by now, I’m not getting bored yet, and I still have a backlog of things that I’d like to write about. I certainly feel better writing regularly: it gives me an excuse to think a bit more about certain things, which is always welcome.

more iPod comments

March 1st, 2005

A few random iPod-inspired thoughts:

  • The first time I imported a CD with iTunes, it reported doing it at a rate of about 5x. But the next time I imported a CD, it reported a rate of under 2x, and stayed there. And it was a real problem: it took all evening just to import a handful of CDs. At first, I cursed Windows and iTunes, but the truth is that I’d been thinking browsing on Linux was sluggish ever since I upgraded to Fedora Core 2. I’d blamed that on either the OS upgrade or on the browser upgrade, but now I had concrete evidence that the problem was more widespread than that. After a bit of thinking about possible causes, I went into the bios and told it never to adjust the CPU’s speed unless I was running on batteries; the problem was solved. (A quick way to check for this sort of throttling is sketched after this list.) (There must be something buggy going on with speedstep, though – the fans don’t come on all that often, even when I’m running at full speed all the time, so why was it so persistently slow?) In retrospect, what probably happened was that I upgraded the bios at the time I upgraded the OS (because early bios versions on this computer had a bug that caused time problems), and my old bios settings must have been lost. So hurray for iTunes – without that speed rating, I’m not sure if I ever would have gotten around to looking at the bios, and my web browsing (and blogging!) would still be horribly slow.
  • The iPod has this feature where it remembers what songs I’ve listened to, and how often. It can use this to do things like give you a random playlist with your favorites more heavily weighted; nice idea. The thing is, though, it seems to periodically forget that I’ve listened to music: when I sync it with my computer, it forgets stuff that I listened to since my last sync but more than a few days ago. Very strange – you’d think this sort of information would be stored on the hard drive and never lost.
  • It also frequently forgets what I’m in the middle of listening to, if I’ve stopped it in the middle of an album. A bit of experimentation suggests that maybe it remembers better if I pause it and let it go to sleep by itself, but it forgets if I hold down the pause button to put it to sleep more forcefully. But why should it forget in either situation? My car’s CD player can remember where I was last listening to a CD, and it doesn’t have a hard drive to store that information. So why can’t the iPod do just as good a job?
  • I still have yet to stump the CD database that iTunes uses.
  • I’m really glad I got the iPod. Jogging is a lot more fun, and I really do like listening to music. I’m finally buying CD’s again: I accumulated hundreds when I was an undergrad, but had bought almost none in the intervening decade, and that’s a shame.
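
(The throttling check mentioned in the first bullet, from the Linux side: ask the kernel what clock rate it currently sees, and watch whether the number rises to the CPU’s advertised speed when the machine is busy. The value below is made up for illustration.)

    grep MHz /proc/cpuinfo
    # cpu MHz         : 598.186
    # a nominally much faster CPU that keeps reporting a number like
    # this even under load is being clocked down by something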

tokyo godfathers; movies

February 26th, 2005

We just watched Tokyo Godfathers; very good. About three homeless people who find an abandoned baby, and try to track down its mother and learn why she abandoned it; good characters, good plot, good visuals, pleasantly bizarre.

Hmm: that wasn’t much of a discussion of the movie, was it? The problem is, I’m really not very good at talking about movies. I could give a plot description, but I’m not sure what the point of doing so would be. I’d rather have something a bit more insightful to say, or at least something a bit more analytical. I don’t claim to be god’s gift to video game criticism, but at least I can blather along about the things for paragraphs; not so with movies. I’ve seen a reasonable number of movies (perhaps not so many in recent years, but then again the movies I’ve seen, I’ve seen over and over again, which should mean something); I guess the point is that I spend more time thinking about the design of video games as I play them. And, for that matter, I spend lots of time reading video game web sites, so I’m much more exposed to video game criticism than movie criticism. (What are good movie web sites? Also, what are good music web sites?) So I should think more as I watch movies, and not be afraid to write about them, I guess; with practice, I’ll have more to say.

Fortunately, I should have more movie-watching time soon. For years, we’d basically only been able to watch movies that Miranda could watch. But once she started school, we moved her bed time up (or really, gave her a bed time different from ours at all), giving us time that we could watch TV by ourselves. Unfortunately, at about the same time, we bought our disco duro (Spanish for “hard drive” – our name for the video recorder), and we kind of overdosed on Iron Chef and Good Eats. But recently we’ve moved her bed time still earlier, and a significant portion of the Good Eats episodes are ones we’ve seen recently, so we’re plowing through our backlog of recordings.

(The thing I miss most about Boston: the Brattle. Also, why, in my first paragraph, did I not mention that it was either Japanese or animated? I guess I didn’t want to overemphasize either of those facts, given the brevity of the paragraph: I wasn’t up for a comparison of it with anime, or for that matter non-Japanese animation (The Triplets of Belleville; I guess I didn’t talk about that when I first watched it? Maybe I wasn’t blogging yet).)

refactoring twists and turns

February 25th, 2005

(Warning: really boring post follows. This is what’s been on my mind today, but Jordan will probably wish that I would go back to talking about Java.)

There’s this big monster class that I’ve been dealing with at work almost ever since I got there. About a month or two into my job, I had to try to write unit tests for it, and failed miserably. (Because, after all, it’s a big monster class, exactly the sort of class for which the notion of “unit test” is ridiculous.) I did a bit of refactoring at the time, but for various reasons (I wasn’t very good at refactoring, and I wasn’t in charge of the code), it thoroughly repulsed my efforts at civilizing it. I did get a bee in my bonnet that State might be a useful design pattern to use, though.

A year later, I’m now in charge of the code, and it’s been causing me a fair amount of pain (in the form of seg faults, and sleepless nights about the thought of having to add new functionality, as I’ll have to do over the next few months). It actually was a pair of messy classes; I spent January matching wits with the first one. It certainly fought back – near the beginning, for example, I thought I had a nice bit that made sense conceptually to extract as a separate class, and I found a couple of methods that looked like a perfect entry point into that section of code. But after spending a few days trying to get that to work, it just got worse and worse: those two functions called functions that called functions that referred to data that I thought was outside of the class in question, and it was really hard to untangle the code. So I had to throw away those days of work, and try again. (The second time was the charm, though.) And, actually, while that code is hugely better now than it was, I still wasn’t able to properly tame and test some of the core algorithms…

So this month I’ve been dealing with the other one of these messy classes. At the start, finding refactorings to do was like shooting fish in a barrel – anybody who can’t find methods to extract from 100-line functions isn’t looking very hard. (Let’s start by having every method fit on a single screen…) So far, I’ve extracted three nice little classes from it: they’re quite coherent, much easier to understand, much better tested, and I of course found several bugs in the process. One of the extracted classes, in particular, was an absolute joy to refactor: I knew that I had extracted a coherent chunk of data and methods, but I was having the hardest time figuring out what it was actually doing. This made adding unit tests a surprising pain; I eventually stopped thinking about what the class should do, and just mechanically wrote tests that pinned down the class’s behavior, without worrying about how to interpret that behavior. (Well, almost pinned it down – fortunately, I kept around an end-to-end test as a backup, which saved me at one point in my later refactoring.) And when I got to the refactoring proper, I decided to do it strictly by the book, making really small, mindless little changes, and consciously trying to avoid looking too far ahead, never making a jump when I could find baby steps that would get me there instead. And it was beautiful: the badness just melted away, the code almost transformed itself, and at the end of the day, everything made sense, and was clear to anybody reading it. I’m completely sold: tiny refactorings for me from now on.

But a week or so ago, I ran out of obvious classes to extract, so I decided to do the State extraction that I’d had on my brain for the last year and a half. I did it (by tiny steps), but while I liked the code more that way, I was pretty sure that most of my coworkers, when reading the code, would find it less clear. So I didn’t check it in: I wanted to spend more time moving functionality into the new State objects, hoping that other refactorings would become obvious to me as I did so, so that by the time I checked it in, the code would be clearly improved.

A couple of days later (I started a week ago, but most of this past week I was at a conference) I’ve got a lot more functionality moved into the State objects, but it’s still not a clear improvement. As I hoped, though, I am starting to see other refactorings that make sense, that really do improve the code.

The thing is, though, these other refactorings don’t depend in any way on my State extraction: in fact, if I hadn’t had this State bee in my bonnet, I would probably have seen them a week ago. Look at all these methods that take 6 arguments – maybe I could get rid of some of them? Look at these three member variables whose names all start with “uncopied”, with a big comment before them explaining how they’re used together – maybe I could extract those all into a class? And, as I do the extraction, I find what must be a bug in the code, but I’m so deep into refactoring upon refactoring that I have a hard time stepping back, figuring out what’s going on, and writing a test to expose the bug.

So I’ll be throwing away the last three days of programming: starting over, doing the obvious refactorings that I came across today, and probably creating a big parser wrapper class whose State I will only extract after I’ve gotten it into a coherent whole. Sigh.

Which, actually, isn’t so bad. At least I can say that I’ve learned something about programming over the last couple of years: I’ve learned that I should be nervous if I want to go for three days without checking in my code, and that it’s better to throw away that work and start over from scratch than keep on forging into the muck. And I’m sure that I’ll hit the ground running next week: having struggled with the code today, I’ll be able to do the first few refactorings in a flash, and it will only take me a day and a half to recreate the work that I’ve done in the last three days. (But there will probably be another week of work inserted into the middle of that day and a half of recap, as I do more refactoring to get it into shape before doing the State extraction.)

The next post will be a non-programming one, I promise…