
sudoku

August 12th, 2005

One of my coworkers pointed me at The Daily Sudoku. I’ve tried and enjoyed a few; I’m not sure how long I’ll keep it up, but I’m not stopping yet. So far I’ve only tried ones rated easy or medium (and, honestly, I can’t tell the difference between the two levels); apparently the hard-rated ones are a significant step up. I’ll be curious to see if they make me think in interesting ways, or if they just make me go through long tedious searches to make progress. (I tried to order the book from the site – it seems clearly worth two pounds, but I ran into some strange PayPal glitch. Sigh. I thought this electronic payment stuff was supposed to work well by now?)

It reminds me of some other puzzle, but I can’t think of what. The common idea is this: say you have boxes where the choices for each box are 12, 12, 1234, 1234. Then you know that the first two boxes use up 1 and 2, even if you don’t know the order, so you can reduce the choices to 12, 12, 34, 34. Where else is this idea important?
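Here’s a minimal sketch of that reduction in code, purely my own illustration (made-up names, not taken from any real solver): it looks for a “naked pair”, two cells sharing the same two candidates, and removes those two values from every other cell in the group.

  import java.util.*;

  public class NakedPairs {
      // If two cells in a group have the same two candidates, those cells
      // must use up both values between them, so no other cell can hold either.
      static List<Set<Integer>> eliminate(List<Set<Integer>> cells) {
          List<Set<Integer>> result = new ArrayList<>();
          for (Set<Integer> cell : cells) {
              result.add(new HashSet<>(cell));  // mutable copies
          }
          for (int i = 0; i < result.size(); i++) {
              Set<Integer> pair = result.get(i);
              if (pair.size() != 2) continue;
              for (int j = i + 1; j < result.size(); j++) {
                  if (!result.get(j).equals(pair)) continue;
                  // Cells i and j form a naked pair; strip its values elsewhere.
                  for (int k = 0; k < result.size(); k++) {
                      if (k != i && k != j) {
                          result.get(k).removeAll(pair);
                      }
                  }
              }
          }
          return result;
      }

      public static void main(String[] args) {
          // The example from the text: 12, 12, 1234, 1234 becomes 12, 12, 34, 34.
          System.out.println(eliminate(List.of(
              Set.of(1, 2), Set.of(1, 2), Set.of(1, 2, 3, 4), Set.of(1, 2, 3, 4))));
          // prints [[1, 2], [1, 2], [3, 4], [3, 4]]
      }
  }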

copyright office

August 12th, 2005

Just what is the copyright office thinking? I can’t imagine they’re doing this out of bad intentions, but it’s pretty depressing that they’re apparently clueless about this sort of thing…

donkey kong jungle beat

August 9th, 2005

It’s been a while since I discussed a video game I’ve finished, hasn’t it? Not because I haven’t been finishing games; I’ve just been busy writing about other things. (The video games du jour are Shenmue II, on my new Xbox (about which more later), when Miranda is around, and the stunning Resident Evil 4, when she isn’t.)

Oldest on the stack of finished games is Donkey Kong Jungle Beat. I was pretty excited when Donkey Konga, a music game with drums, was announced for US release: another sign that publishers are less likely to prejudge US customers’ tastes in Japanese video games. But when the game itself came out, I ended up not buying it: the song list was full of pop songs that I didn’t particularly like. (I’d much prefer video games with good music that I’ve never heard before; though why did I claim in the linked-to post that DDR is a Namco game? It’s by Konami.) I wouldn’t mind drumming to the Zelda theme, but that’s not enough to get me to buy the game.

Actually, what’s really a sign of the penetration of Japanese games into the US is that there are two drumming video games available here, the other being Namco’s Taiko Drum Master. Which has the Katamari Damacy theme in it, but even that isn’t enough to get me to buy the game by itself. (Even if Miranda and I still spontaneously sing it occasionally.)

But Nintendo had another use for their drum controller: a side-scrolling platformer called Donkey Kong Jungle Beat. Which had been getting positive mentions ever since it appeared at an E3; being a sucker for weird game ideas, I wanted to give it a try.

Not very good, I’m afraid. Ultimately, I just don’t have enough nostalgic fondness for 2D platformers to want to play them, as a rule, in preference to today’s much richer games: they’re not the sort of timeless simple pleasure that, say, a good puzzle game (Tetris) is. And while there are good games with very simple controls (e.g. Super Monkey Ball), the controls in DKJB felt to me like a gimmick. There are only three or four things you can do at any point, so when you get to a weird creature, you just hit the controls at random a little, find the magic effect of clapping (or whatever), and continue. If that doesn’t quite work, you have to manipulate the drums to jump in the air at the right location, and then clap. Whoopee.

And, to add injury to insult, my hands really hurt after playing it. Tip for all of you married people out there: take off your wedding ring before trying this game. But even after that, I suspect your hands need some toughening up. Miranda liked it enough that I went through the first twelve levels (in all of 4 hours or so: not a hard game unless you want to get as many bananas as possible), but I didn’t feel like replaying earlier levels to get better scores on them to unlock the last four levels.

A decade and a half ago, the gameplay would have been fine, and I would have had limited enough options (for reasons of game availability and finances) that I probably would have been happy to replay the levels over and over to earn the top medals on all of them. I’m happy that I can set my standards higher now.

code reviews, tasks

August 6th, 2005

I was unhappy with the result of our pair programming meeting for various reasons: we all agreed that things weren’t going well, and I was pretty sure that we were doing something wrong, but I didn’t know what it was. We’d adopted short-term measures to ease some of the pains, but I didn’t see them as leading to a coherent solution that I’d be happy with.

After thinking about it for a day, I decided that our changes were leading in a direction that I certainly wasn’t happy with: while I’m still not sure of the merits and demerits of pairing, I am sure that it’s good for us to spend more time focusing on the quality of our code, and to spend more time in general talking about code. If we’re going to pull back on pairing, we should still try not to give up on that goal: so I instituted a policy that all non-trivial checkins would require a code review. (If the code was entirely developed while pairing, that counts as the code review, of course.) Code reviews are probably not quite as good as pairing for quality control, but they’re a lot better than nothing: I know that, when I was working on GDB, I got a lot of useful feedback from others’ code reviews, for example.

I felt better after that: people were talking more, the checkins were a bit cleaner. Not a lot cleaner, but that will come: editing, like any other skill, improves with practice.

A week or so later, we ran into another problem: the assignment that one of my team members was working on that week wasn’t done, it wasn’t clear to me when it would be done, and I wasn’t at all confident that I’d like the results when I saw it. (Of course, my lack of confidence may have been largely caused by my lack of information: maybe it was great code, I just had no easy way of telling.)

This wasn’t an isolated instance: when we estimated a story as taking a full week to accomplish, it would turn out to take more than a week most of the time. We were fooling ourselves with our estimates, and we were skimping on design: it’s one thing to be against “Big Design Up Front”, but that doesn’t mean that some amount of design isn’t appropriate.

And now a bunch of things clicked. I’d been aware for several months that we weren’t really planning in the XP way: the relevant issue here is that we were working exclusively in terms of “stories” (basically, features with user value that can be implemented in a week or less), but not breaking them down into “tasks” (individual technical steps necessary to implement the features, each of which can be accomplished in a single pairing session). When I first realized that we were doing the planning wrong, it wasn’t clear to me that this difference was a big deal, but all of a sudden introducing tasks seemed to solve several problems that we were having:

  • Breaking a long story into tasks should make it easier to accurately estimate the story’s duration, with a bit of practice: a six-task story will probably take longer than a four-task story, but that wouldn’t have been so obvious before breaking it up into tasks.
  • The process of breaking a story into tasks gives us a chance to talk about the story together and do an appropriate amount of up-front design.
  • If a task takes longer than expected (in particular, longer than a day), that’s an immediate warning sign that something unexpected has turned up. We can deal with the problem right then, by calling an impromptu design session and breaking up the task into smaller tasks as appropriate.
  • In the unhappy event that a story still takes longer than a week to accomplish, at least I’ll have a much better idea of its current status, because I’ll know what tasks have been accomplished and what tasks haven’t been accomplished.
  • It seems plausible that it will significantly improve our mood towards pairing: it’s not much fun showing up in the middle of somebody else’s project, working on it for a little while without really knowing what’s going on, and then leaving while that person continues. It’s a lot better if you come in at the beginning of a coherent project, work on it together for a few hours, and finish it.

We’ve been doing this for a grand total of a week now; it’s probably largely my imagination, but I’m a lot happier with how things are going. We actually had a pretty bad week in terms of completing stories (we were still underestimating how long the long stories were taking), but the one problematic story was in much better shape: we’d finished 5 of the 6 tasks that we’d broken that story into, and we knew the last task was turning out to be more complicated than we expected, so we found a coherent way to split it into two tasks.

In our weekly meeting on Friday, most of the stories were fairly well-defined, but one of them was pretty amorphous. So we spent about 20 minutes breaking it up into tasks, talking about pros and cons, with lots of people chipping in about what they remembered about the different pieces of affected code. At the end, there was general agreement that the story was significantly less scary than it had seemed before we started talking about it.

And maybe it’s my imagination, but I think I’ve been enjoying pairing more. Yesterday, for example, I had a very pleasant time writing a really solid class. I particularly appreciated my partner’s winces whenever I chose a bad name for a variable: joke all you want, but little things like that are important. (Incidentally, we also tried out programming by intention some more, with good results.)

Not everything is perfect yet, but I’m much more optimistic than I was. We’re still underestimating large stories, but hopefully tasks will give us a better handle on that. Significant issues still remain with pairing: in particular, our differences in familiarity with different parts of the code and in programming background make pairing hard, but I can deal with that, and those differences will lessen over time. As long as we have a plausible path for improvement there, I’m happy.

On the one hand, I feel a bit silly that we didn’t start using tasks a lot earlier: I should have been paying more attention to what the XP books were saying, because the authors of those books have a lot of useful experience. (Incidentally, it’s fascinating reading the XP mailing list.) And I’ll certainly keep on rereading various XP books to find more mismatches between our practice and their descriptions that might shed light on problems we’re having. On the other hand, making mistakes is a classic way to learn, and for good reason: I have a much more active grasp of this issue than I would have if we’d done things right from the start.

My next management issue, aside from monitoring this one: reading about Scrum, to see if we can use that as a blanket methodology for the entire software team (i.e. my group, the other two groups parallel to it, and my manager’s group). It’s compatible with but less specific than XP, and explicitly addresses issues involving multiple groups; with luck, it will be something we can all get behind. But I have some reading to do to learn more about it, to see if I think it is a good match for current and potential problems that the larger group has.

pair programming update

August 5th, 2005

About three months ago, my team started seriously experimenting with pair programming. It’s been more than long enough since then for us to take stock, so we had a meeting three or so weeks ago to talk about our experiences.

The results were mixed, and really hard for me to get a grip on. Some good things:

  • Pairing did help the quality of our code.
  • More people know more about more of the system.
  • The daily standup meetings that we started doing at the same time as we started pairing helped me (as a manager) keep much better track of what was going on midweek.
  • Sometimes, pairing with the right person could save a lot of time debugging an annoying problem.
  • A pair seemed more willing to ask for help quickly than a programmer working alone.

I might have forgotten a few (I’m at home, my notes are at work), but that’s the basic idea. The last one, in particular, interested me: I wasn’t expecting it, though in retrospect it makes sense. After all, the macho programmer ethos means that a lone programmer is loath to admit that he can’t solve a problem by himself; if two programmers both can’t figure something out quickly, though, they’re much more likely to conclude that they need outside help. (When appropriate, of course, especially when there’s specific knowledge that they’re missing.)

The bad side (again, I might have forgotten a few):

  • It wasn’t at all clear that we were more productive pairing than when working alone.
  • We didn’t look forward to pairing.

The first of those isn’t necessarily a show stopper: we agreed that we were willing to trade writing less code for higher quality and more knowledge transfer, and that we weren’t in a situation where we needed to crank out as much code as possible. So it seemed plausible that pairing was a long-term productivity gain for us; still, it was somewhat disconcerting, since the literature suggests that it should be clearer that pairing is improving our productivity.

The second, though, is a real problem: I got the feeling that we (I, certainly) wanted to enjoy pairing, but something really wasn’t working right. And I couldn’t figure out what it was.

Our conclusion was that we saw enough good things that we wanted to keep on trying. But we needed to leave more breathing space, at the very least. We decided to start by drilling down on our feelings of where pairing was more productive and where it was less productive, and then during our standup meetings, we’d use those criteria to figure out who would pair at all that day, not assuming that everybody would always pair.

(It’s getting late, and this is as good a stopping point as any; I’ll post a followup bringing the story up to date in a day or two.)

dan johnson, shanghai crab

August 1st, 2005

I was kind of bummed when Erubiel Durazo got hurt, and Scott Hatteberg’s performance has certainly been nothing to write home about this season. (I still have no idea why he’s gotten the contracts he has from Billy Beane.) But Dan Johnson’s performance has been a pleasant surprise: I’d literally never heard of him, but after 174 plate appearances he’s slugging .500. Who knows how long he’ll keep that up, but I guess he isn’t a complete flash in the pan: looking at his entry in the 2005 Baseball Prospectus, I see “Johnson is ready to step in and take Hatteberg’s job”, and they certainly got that right.

We’re watching the Shanghai crab episode of Iron Chef right now. I’m used to seeing live seafood there (driving nails through the heads of pike eels thrashing around on the cutting board), though seeing the poor crabs put live into a hot wok was a bit much. A first for me, though, was a crab with its shell off, in the process of being disemboweled, and you could still see its heart beating…

livres

July 31st, 2005

I did some book shopping in Paris. A bit silly, in these days of www.amazon.fr, but old habits die hard. And FNAC is still pretty cool, though not quite as impressive to me now as it was the first time I set foot in it.

I bought most of Bruno Latour’s books that hadn’t been translated into English, some comic books (standards: Tintin and Asterix), a few Barbapapa books for Miranda, and SGA1. I felt a little silly about the comic books, not about the ones I did buy (they are both deservedly classic series) but because I didn’t look for anything else: France is one of the great comic book-producing nations, and I walked by several good-looking stores, but I just wasn’t in a very inquisitive mood, I guess.

The new printed version of SGA1 turned out to be the same version that’s available online. Still, it’s nice to have a copy that’s easy to hold in your hand. Who knows when I’ll get around to reading it, but I suspect I will at some point over the next year or two (more likely it than some of the more experimental Bruno Latour books); I think/hope it should be at a level where I can read it without excessive effort, and it’s an important part of mathematical history. I don’t want to lose contact with math entirely, after all, and reading classic works seems like a good way to keep my brain active.

The whole Grothendieck reprint story has to be seen as a victory for the forces of good. I spent some time this weekend reading Free Culture, by Lawrence Lessig, and now I’m really depressed, but it’s great to see some people saying that the current situation is ridiculous and snubbing some of its more odious aspects.

The technical bookstore that I patronized seven years ago seems to have disappeared, more’s the pity. But it remains the case that general-purpose bookstores in Europe have much better math sections than their counterparts in the US. I’m not sure why that is, but I’m not complaining. It was fun browsing; a lot of familiar titles, and some new titles on familiar subjects. Nothing new and exciting leapt out at me; in a decade or two, maybe I’ll go back and catch up on some of the advances in the field. Probably not, to be honest, but who knows what the future will bring; I’ve enjoyed spending the last two or three years catching up with (some of) the advances in computer science that I missed over the previous seven or eight years, after all.

programming by intention

July 29th, 2005

Ever since I read Refactoring to Patterns, I’ve been thinking that I should use Compose Method more. (I should really reread Smalltalk Best Practice Patterns to see what other low-level patterns I’ve missed.) But I’ve been too timid to perform surgery quite that drastic on the thicket of code that I’m working on.

I just finished Extreme Programming Installed, though, and the authors talk about an interesting way to develop your code so that the methods are nicely composed. It’s called “programming by intention”, and works as follows: whenever you sit down to implement a method, you simply write down a method call expressing the first thing you want the method to do, then another expressing the second thing, etc., without worrying yet about whether there are, in fact, methods with those names. If there aren’t, you then go and implement those methods. (Again programming by intention, though the recursion should bottom out after two or three levels.)

I tried this yesterday, and it was great! I wanted to write a method that parsed a series of data structures; I had some ideas about how the low-level details would work, but I decided to just put those out of my mind and program by intention. We were parsing a sequence of data structures for as long as data remained, and the conditions for when data remains were slightly nontrivial in this context, so I started by typing (more or less, details are changed):

  while ( dataRemains() ) {

Each data structure starts with a type field, and a length field, both expressed as a multibyte value that I hadn’t yet had to parse. So:

    int type = nextMultibyteValue();
    int length = nextMultibyteValue();

Next, we start printing out the data. We’d like to output a string representation of the type codes, so:

    printTypeCode( type );

After this, we needed to output the data as a sequence of bytes, with its length given by the number we just read; I already had code to do that, so I just called that code:

    printNextBytes( length );
  }

Once I’d done that, I implemented dataRemains, nextMultibyteValue, and printTypeCode; each of them was easy to implement now that I wasn’t thinking about anything else. (And I knew I wasn’t wasting my time because I’d already shown that, once those were implemented, I’d have exactly the functionality I needed.) And the resulting methods looked great (though Compose Method suggests that I should have gone further and extracted the entire body of the loop into a method, which probably wouldn’t have been a bad idea).
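For concreteness, here’s a speculative, self-contained reconstruction of roughly where that left me. Only the top-level loop above comes from the actual code; the helper bodies, the encoding, and the framing below are all my own guesses, so take them as illustration rather than as what we actually shipped.

  import java.io.ByteArrayInputStream;
  import java.io.IOException;
  import java.io.InputStream;

  public class IntentionParser {
      private final InputStream in;
      private int lookahead;  // one byte of lookahead; -1 means end of data

      IntentionParser(InputStream in) throws IOException {
          this.in = in;
          lookahead = in.read();
      }

      // The top-level method, written by intention: each call names a step.
      void parse() throws IOException {
          while (dataRemains()) {
              int type = nextMultibyteValue();
              int length = nextMultibyteValue();
              printTypeCode(type);
              printNextBytes(length);
          }
      }

      // Hypothetical: in the real code the condition was slightly nontrivial.
      private boolean dataRemains() {
          return lookahead != -1;
      }

      // Hypothetical encoding: seven bits per byte, high bit set on all
      // bytes except the last.
      private int nextMultibyteValue() throws IOException {
          int value = 0;
          int b;
          do {
              b = readByte();
              value = (value << 7) | (b & 0x7f);
          } while ((b & 0x80) != 0);
          return value;
      }

      private void printTypeCode(int type) {
          // A real version would map the code to a descriptive string.
          System.out.print("type " + type + ": ");
      }

      private void printNextBytes(int length) throws IOException {
          for (int i = 0; i < length; i++) {
              System.out.printf("%02x ", readByte());
          }
          System.out.println();
      }

      private int readByte() throws IOException {
          int b = lookahead;
          lookahead = in.read();
          return b;
      }

      public static void main(String[] args) throws IOException {
          // One record: type 1, length 3, then three bytes of data.
          byte[] data = { 0x01, 0x03, 0x0a, 0x0b, 0x0c };
          new IntentionParser(new ByteArrayInputStream(data)).parse();
          // prints: type 1: 0a 0b 0c
      }
  }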

This dovetails very well with test-driven development. One important benefit of TDD is that it focuses your mind on doing one thing at a time: either you’re focused on writing a test to express your next goal, or you’re focused on getting the test to pass, or you’re focused on cleaning up your code. Programming by intention, in turn, helps narrow your focus during the second of those steps: while you’re getting the test to pass, concentrate on what you want your implementation to do on a conceptual level, then drill down and repeat.

Side note: in a recent post on the XP mailing list, Kent Beck talks about how top down / bottom up isn’t a very useful dichotomy for him. Which I agree with to some extent, but programming by intention suggests that a particular form of top down programming is very useful when programming on a small scale. I’ll have to think about the extent to which this is the case at other levels of XP: is top-down the way to go when you’re trying to get an acceptance test to pass, for example? (Probably on the design level, but not on the implementation level, because you’d go far too long without working code.)

upgrade finished

July 28th, 2005

I spent a little more time playing around with doing the upgrade piecemeal; it turned out that, while there were some pleasant groups of packages that came together in a clump of 10-20, most packages either were happy to be upgraded individually or were part of a huge clump that required a few hundred packages to be upgraded simultaneously. (Upgrade ftp, then libreadline has to be upgraded, then all other CLI programs have to be upgraded, and they pull in all sorts of random libraries to upgrade, etc.) And once that happens, you might as well upgrade everything. So I did; worked fine. (I’m still planning to go and look through my list of installed packages just to see what I should consider removing, though.)

I’m a little annoyed at their “Fedora extras” thing. At first, I was happy because it meant that I could get galeon from them instead of having to find it at another repository. (Good thing, too, because the repository I had been using for that doesn’t seem to have an FC4 version available.) But it turns out that they don’t bother to keep the extras repository in sync with their other repositories; they’ve done an upgrade of mozilla since their last galeon upgrade, so right now I can’t install galeon at all because there’s no easy way for me to get the old mozilla version instead of the new one. Sigh. Still, they’ll probably work out the kinks over the next few months.

upgrading to fc4

July 26th, 2005

As threatened after my last OS upgrade, I’m upgrading to FC4 relatively soon after its release. This time, the release notes are very clear about the easiest way to upgrade: install a single RPM (which basically tells yum to look for FC4 packages instead of FC3 packages), and then do yum upgrade.

I’m actually not quite doing that: since I’m not sure how long it will take to download all that stuff, I’m trying to do it piecemeal. Which is sort of a fun game: sometimes, if I want to upgrade a single package, it just upgrades that package plus maybe a handful of others, but sometimes it indirectly pulls in hundreds of other packages.

Unfortunately, there’s some sort of version problem with galeon, my web browser of choice. (It’s included in their ‘extras’; maybe they don’t rebuild those as frequently as they should?) So I’m using firefox for now, which is fine. And there’s some sort of dependency failure with certain java-related packages; I’m not sure what the deal is with that, but for now I don’t mind just removing the packages in question.

I expect that I’ll be doing this over the course of the next week or so; gives me something to do, I guess.

(more baseball)

July 25th, 2005

Despite what I said a week and a half ago, maybe the A’s are going to make the playoffs; they’re tied for the lead (and about to take the lead) in the wild card, after all. Even the AL West title seems not out of reach right now. They will, of course, cool off eventually, but they’ve shown over the last few years that they’re more than capable of ridiculous second-half performance. (Is that luck, or is that a skill that some teams or players have? Any studies one way or another?)

Too bad that the Indians are going in the opposite direction, and are so much further behind their division leader…

howl’s moving castle, families

July 25th, 2005

We went to see Howl’s Moving Castle last weekend. Actually, Liesl and I went the weekend before that, to make sure it was okay for Miranda; we decided that it probably was, though we checked first with Miranda to make sure.

We all enjoyed it, though I don’t think it will end up as one of my favorite Miyazaki movies. One thing that struck me: while I have nothing against love stories, they’re all about falling in love instead of loving people. I don’t have any plans to ever do the former again, while the latter is a huge aspect of my life. And while the movie had its love story aspects, they were muted, and even explicitly questioned at the end. Instead, the relationship aspects of the movie focused on building a family, something very dear to my heart. And quite a family it was, too: I really like the “collection of misfits banding together” trope, families as a group of people who have made an active choice to stay together. (This was something I really liked about the third volume of the Kushiel trilogy, too.)

On a related note, we watched Shrek 2 on DVD this weekend. It has a little bit of the “actively chosen family” theme in it. But it’s also about two people reaffirming that they are very much in love; I for one cried at the end of it. Again, Kushiel does this, in the second volume instead.

literate programming

July 24th, 2005

Prompted by Knuth’s delightful article “The Errors of TeX”, I just read his collection Literate Programming. (Which contains the aforementioned article, among others.) A fascinating read, for multiple reasons: Knuth is a really smart guy, whose opinions I very much respect, but he’s writing from a context that I frequently find very hard to understand.

The most dramatic example of this is the first article, “Structured Programming with go to Statements”. It was written in 1974, three years after I was born, and I never learned Algol 60 (which I think is more or less the language that the article uses). I understand that goto once was used much more, and I can imagine a world in which people had yet to decide that a few iteration constructs (for, while, and the occasional (in my experience very occasional) do-while), combined with a sprinkling of break and continue, were good enough for 99.9 percent of your needs. And while I suspect that some of the other solutions he discusses (Zahn’s iteration constructs) are too complicated to be a part of a well-designed programming language, I can imagine an alternate history in which they would have a more prominent role.
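Here’s the sort of thing I mean by “good enough”, a contrived example of my own (in Java rather than Algol, and not taken from the article): the break and the continue cover what a goto-era version of this loop might have expressed as a pair of jumps.

  public class LoopControl {
      // Find the index of the first nonnegative even value, skipping
      // over negatives along the way.
      static int firstNonnegativeEven(int[] values) {
          int found = -1;
          for (int i = 0; i < values.length; i++) {
              if (values[i] < 0) {
                  continue;  // skip negatives and keep scanning
              }
              if (values[i] % 2 == 0) {
                  found = i;
                  break;     // stop at the first match
              }
          }
          return found;
      }

      public static void main(String[] args) {
          System.out.println(firstNonnegativeEven(new int[] { -4, 3, 7, 10, 12 }));
          // prints 3
      }
  }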

But what I can’t imagine is a world without functions, yet that seems to be the world that this paper was written in. Did programming languages of the time really not have functions? (Certainly some of them did.) Or did they have functions but only allow a single exit from those functions, in which case they were irrelevant to the question at hand? That must have been the case, but I still find it hard to wrap my brain around. Compiler technology must have played a big role, too: the article is very concerned with optimization, and I doubt compilers were too good at inlining at the time.

So I felt like I was in bizarro world when I was reading the article. But it’s always fun visiting other worlds, and there were some hints of worlds that are dear to my heart in the article. At the end, he hints at object-oriented programming (or at least modules). Much more interestingly, earlier in the article he discusses the possibility of automated refactoring tools (without using the term “refactoring”, of course). But the use he proposes for those tools is completely different from the current uses of those tools: these days, we want to apply behavior-preserving transformations to our code in order to make it clearer, while Knuth wanted to start from clear code and apply behavior-preserving transformations to make the code less clear, but more optimized!

A decade later come the articles on literate programming. I knew that this was a way to embed source code and a description of what the code does in the same file, to enable you to present a nicely-typeset, richly-commented version of the code. I hadn’t realized quite how lengthy the comments were that Knuth proposed, however; I also hadn’t realized that literate programming divides code up into fragments that can be portions of functions, instead of entire functions.

At least functions play more of a role here than in the earlier paper. But they still don’t play enough of a role. Over and over again, I had to ask: why aren’t these fragments of code functions in their own right? There are probably a few answers to this. For one, I suspect that short functions still weren’t in the air much at the time. For another thing, I suspect that compiler and programming language support for inlining wasn’t very good, so it would have been an unacceptable performance hit. And a third thing is that the code fragments wouldn’t have always worked on their own because they referred to variables external to the code fragment: you need objects for that style of programming to come into its glory.

So it’s pretty amazing to see how Knuth comes towards modern best practices without modern programming languages and tools. And it’s embarrassing to realize that Knuth probably understood the need for Compose Method two decades ago better than I do now. He has also responded to my objections to the paper I discussed above: his literate programming discussions are in the context of Pascal, which apparently doesn’t allow multiple exits from a function, so Knuth provides a return macro to get the same effect (using goto).

[Side note: I never learned Pascal, for no particular reason. (I’m the right age to have learned it, but I was cool enough to jump straight to C. Or something.) I was aware of its pointless procedure/function distinction; I was not aware that it distinguished between calculating the return value for a function and actually returning from a function. Weird.]

Getting back to the “literate” part of literate programming, the length and quality of exposition of the comments in the code samples that are given is pretty stunning. Looking at this through XP glasses, I suspect that the comments are rather too long (too many artifacts not pulling their weight), but it would be interesting to try it out once or twice. (At the very least, I should get around to reading TeX: The Program: it’s only been sitting on my bookshelf for a decade and a half by now…) My first reaction was that lengthy comments could be good for presenting a finished, polished program to the outside world, but not so great for a live program that is constantly being modified, because the comments are an extra burden and could get out of date too easily: far better to spend your time on making the code itself clear. But, in “The Errors of TeX”, Knuth talks about how the comments made it much easier for him to perform some serious surgeries on the code, so I could easily be wrong there.

Some more cultural shifts that you see in the book: at the start of the book, he’s talking seriously about mathematical correctness proofs of programs. By the time he gets to “The Errors of TeX”, though, he’s using test input to exercise every aspect of his code. A big improvement; personally, I’d rather lean more on unit tests, but he’s working in a context where that isn’t so realistic. Also, data structures sure have changed over the years. When I think of, say, a binary tree, I think of a node as being represented by a chunk of memory with two pointers and a data value in it. But in this book, a binary tree is three global arrays (data, left, and right), with a node being an index into those arrays. (So many global variables!)
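To make that contrast concrete, here’s a tiny sketch of the two representations side by side, my own illustration in Java (obviously not Knuth’s Pascal):

  public class ArrayTree {
      // The representation I’m used to: a node is a chunk of memory with
      // a data value and two child pointers.
      static class Node {
          int data;
          Node left, right;
          Node(int data) { this.data = data; }
      }

      // The book’s style: three global arrays, with a "node" being just an
      // index into them; index 0 serves as the null node.
      static final int MAX = 100;
      static int[] data = new int[MAX];
      static int[] left = new int[MAX];
      static int[] right = new int[MAX];
      static int nextFree = 1;

      static int newNode(int value) {
          int n = nextFree++;
          data[n] = value;
          return n;
      }

      public static void main(String[] args) {
          // The tree 2 <- 5 -> 8, built both ways.
          Node root = new Node(5);
          root.left = new Node(2);
          root.right = new Node(8);
          System.out.println(root.left.data + " " + root.data + " " + root.right.data);

          int r = newNode(5);
          left[r] = newNode(2);
          right[r] = newNode(8);
          System.out.println(data[left[r]] + " " + data[r] + " " + data[right[r]]);
          // both lines print: 2 5 8
      }
  }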

I’m definitely putting more of his books on my “to read” list.

strategy of the weak, revisited

July 24th, 2005

I’ve thought a bit more about the whole strategy of the weak thing. In some sense, actually, the DoD’s analysis is reasonably on-target. The analogy here is to think of the US as a bully: we’re the biggest, strongest kid in the school, and we have no compunctions about beating up people who are substantially less powerful than us (in various ways, not necessarily strictly military ones) if we can see short-term benefits in that.

And, of course, it’s natural for a bully to dismiss those less powerful people who won’t meet him on his own terms as “the weak”. There are various strategies that weaker kids can employ against bullies: banding together with other kids (international fora), telling the teacher (judicial processes), or feeding antifreeze to their dogs (terrorism). From this point of view, the DoD’s analysis actually looks pretty good, though the tone could use some work.

But there’s still a serious problem: it says “a strategy of the weak”, not “strategies of the weak”. And this linkage doesn’t work in the real world or in the analogy: the kid feeding antifreeze to the dog is not the kid telling the teacher, and if Osama bin Laden is complaining to the International Criminal Court, I haven’t heard about it. Pretending that these are all part of one strategy is, well, obscene.

strategy of the weak

July 22nd, 2005

From a DoD policy paper:

Our strength as a nation state will continue to be challenged by those who employ a strategy of the weak using international fora, judicial processes, and terrorism.

Like Scott Rosenberg, I am appalled. Several years ago, I came to the conclusion that the term “terrorist” was almost never a helpful one, but this linkage of it with international fora and judicial processes is simply obscene.

foreign money

July 22nd, 2005

Whenever I go to Europe, I’m happy that the coins reach higher denominations there than in the US: it seems to work pretty well. The thing is, though, I’m not sure that it would work well in the US: my change purse is always a lot fuller than my wallet, and I’m not sure that shifting dollars from the latter to the former would be a good idea.

There are a couple of factors at work that I can identify. For one thing, if they want to charge two euros for something, they charge two euros, not some ridiculous price like €1.95 or €1.99. And, for another thing, the list price includes the sales tax. Works much better; I acquired almost no 1- and 2-cent coins until we went to a pastry shop that was out of 5- and 10-cent coins.

(Though there is an unfortunate flip side to this: the ATMs loved giving out 50-euro bills. I wish I could remember which machine only gave out twenties…)

This was the first trip where we didn’t have to use a single traveler’s check; yay for global networks. We might have used one in the airport, since the ATM there didn’t like our cards, but the rates the change bureau there charged were obscene: convenience is one thing, but not at the expense of a ten percent premium. So we didn’t worry about spending all of our money: that way, the next time we go to Europe, it’ll be easier to get out of the airport.

DEWN

July 17th, 2005

Bonny Doon is the best.

international herald tribune

July 16th, 2005

When in Paris, I read the International Herald Tribune daily. Which was a change of pace: it’s been years since I regularly read any paper other than the Mercury News. (In particular, it’s been a while since I’ve read the New York Times regularly; I read several news magazines regularly, but just the one newspaper.)

And, all things considered, it was a pleasant change of pace. Like all newspapers, it has its target audience, and in some ways I would seem to be not too bad a fit for it: I’ve spent enough time with people who don’t seriously consider any newspaper other than the (New York) Times, and I’m happy with the more international skew on the news that the IHT gives. (To the former point, I would say that I still think it’s weird that the only newspaper that Stanford’s math department subscribes to is the Times: why not the Mercury News or the SF Chronicle? Except that I don’t think that it’s weird at all: academia works very hard to break down loyalty to physical locations, to make you loyal to your discipline as opposed to any other grouping you might be part of, so it’s not so surprising that people are loyal to the paper that does the best job of presenting itself as the paper for the intelligentsia. On a more prosaic note, lots of people who pass through the math department are from elsewhere, so the Times is more likely to be familiar to them than other papers; people who aren’t transient are likely to subscribe to a local paper at home.)

There were times, however, when I would realize that I wasn’t quite the target audience after all. When it moved away from political news, there were a lot more articles on high-end fashion than on baseball, for example. The longest obituary while I was there was about some New York socialite whom I’d never heard of and saw no reason to care about. Even some articles along those lines were vaguely interesting to me, though: I liked the article talking about how you could buy a French chateau for only 700,000 euros (a lot cheaper than chateaus in Mountain View, that’s for sure), and the weekly opera advertising section amused me.

And then the bombing came, which I should probably devote a separate blog entry to, but whatever. I’m glad I had my vacation in Paris instead of London, though we would have been just waking up at the time of the bombings anyways. George Bush is a liar and asshole. Yes, the bombings are horrible; twisting them for your own political ends is not the correct response. “And the contrast couldn’t be clearer between the intentions and the hearts of those of us who care deeply about human rights and human liberty, and those who kill, those who’ve got such evil in their heart that they will take the lives of innocent folks.” Sure, not a single innocent Iraqi has been killed by US forces, right. Of course, our only motivations for our foreign policy are human rights and human liberty, how could anybody think otherwise? But these comparisons are beyond the pale, in the IHT just as much as in Bush’s brain.

Several other columnists were also happy to use the terrorists’ actions for their own ends: in particular, there were digs at Spain for electing an anti-war government after its bombings, claiming that that’s morally wrong because it’s doing what the terrorists want. Sure, if you don’t know whether doing X or Y is a good idea, then doing the one that terrorists don’t want you to do is probably a good strategy. But if you are sure that one of them is better, then changing your mind because a terrorist agrees with you is only marginally more sensible than changing your mind because a terrorist disagrees with you.

rich harden!

July 14th, 2005

It was very pleasant reading the International Herald Tribune every day (except Sundays) over our vacation, and watching the A’s get closer and closer to .500, and finally reach it. (It would have been nice if they’d given baseball a bit more space in their sports section, though; ah well.)

And then we got home, and the annoying all-star break hit. Sigh. But that’s over now, and wow, what a game for Rich Harden. I’m not surprised to see him throw a shutout, but a two-hitter where he was perfect through seven innings is great, and an 81-pitch complete game is stunning.

I still don’t think they’re going to make the playoffs, but I’m a lot less confident of that than I was. (Ditto for the Indians.)

raclette

July 12th, 2005

A couple of trips to Paris ago, Liesl and I discovered the joys of raclette. We had it at a restaurant (attached to a cheese shop) called the Ferme Saint-Hubert; they gave us this huge chunk of cheese, stuck it on a rack with a heating element, gave us some meats and potatoes, and told us to scrape the cheese onto the meats and potatoes as it melted. Which we did; it tasted great, and was quite the sybaritic experience.

So we went back there again on our last trip to Paris; we also tried raclette at another restaurant, but they melted it themselves in the kitchen, so it wasn’t as much fun. We’ve since bought a tabletop grill that can be used as a raclette maker: not the same experience, since you slice the cheese up in advance instead of putting a heating element next to a half-wheel of cheese, but it tastes just as good, and is probably our favorite thing to serve when we have guests over. (Easy and impressive.) We buy the cheese at the excellent Milk Pail Market (honestly, that store is one of the main reasons why we wanted to stay in Mountain View); for what it’s worth, I have a slight preference for French raclette over Swiss raclette, but you can get them both there.

So, of course, we wanted to go back to the Ferme Saint-Hubert again on this trip. But, alas, it had closed: a restaurant specializing in truffles had replaced it. Unfortunate, but it actually led to our most pleasant food discovery of the trip: down the street was a bistro called the Ferme des Mathurins, with very friendly staff, stunning mozzarella, and a quite interesting 1998 white wine (whose name I’ve forgotten) that tasted rather sherry-like and was the yellowest wine I’ve ever seen.

All was not lost, however: our trusty copy of The Food Lover’s Guide to Paris listed several other places where we could get raclette. (It’s about 10 years old, not even the most recent edition of the book, so not too surprising that one restaurant listed in it was closed.) So a couple of nights later, we tried another cheese restaurant. But it had closed, too! Fortunately, another candidate was within (lengthy) walking distance of that one, so we walked there: closed as well. Sigh; we gave up, and had a decent meal at a brasserie nearby.

So we tried another one a day or two later, and our luck continued: four cheese shops out of business. The one thing we most wanted to eat in Paris, and we couldn’t get it. Actually, that’s not quite true: in the den of cheap restaurants near the boulevards St. Michel and St. Germain, there were four restaurants serving raclette; after the aforementioned failures, we tried one of them, and while they did have a tabletop grill, I wouldn’t have guessed that the cheese was raclette if they hadn’t told me.

Oh well; we did have quite a bit of good food on the trip.

Incidentally, we did go out for Japanese food a couple of times, at Miranda’s request. One restaurant was quite good: lovely decor (including some of the serving dishes), some very interesting dishes, and stunning toro. In general, though, the sushi there seems worse than what we can get around here: in particular, the salmon just wasn’t as good, and neither restaurant had flying fish eggs (Miranda’s favorite sushi). At least one of the restaurants (we didn’t check the other) didn’t have any edamame, either, which surprised me.

Museum restaurant guide: the Picasso museum has surprisingly good food, and the restaurant on the top floor of the Pompidou center is quite nice.