When Steve Jobs died, I felt I should write about him. Probably about Apple, really: I don’t know anything about Jobs, but Apple (the company and its products) occupies a surprising amount of my psychic space.
It took me quite some time to get around to writing the post, however; and, when I started typing, I realized why. To dig into Apple’s place in my psyche, I had to explain my history with Apple products, and indeed with computers in general. And, as it turns out, that takes a while. The result is a post where the tail is rather wagging the dog; interesting to me, at least, but one that could most charitably be described as ungainly. (Feel free to skip ahead to the Apple bits.)
At any rate: the computers I have owned, and why I am fascinated with Apple.
Prehistory
My parents bought us an Apple ][+ in May 1982; I was in fifth grade at the time. That was the only computer we had at home through at least 1989, when I went off to college (my brother got a computer when he went to college a few years earlier); hard to imagine these days. I’m not sure when my parents got a second computer, and I know they continued using the Apple ][+ for several years after I left home, at the very least to run a program they wrote to help manage their finances.
I programmed some on that Apple ][+ (the high point being a text adventure that I wrote), but my memory is that I didn’t program particularly seriously on it. I used it to write papers (and for some other writing projects: I went through a phase when I wrote short stories and a novella). And I played quite a few games on it, high points being various Infocom games and the first four Ultima games, but I also think fondly of Robot Odyssey, Le Prisonnier, Lode Runner, and Wizardry.
In 1987 (my junior year of high school) I started hanging out more at Oberlin College, and I spent quite a bit of time in the various computer clusters in the school library. I got to be a rather fluent VAX/VMS user, and (presumably through some of the math courses I was taking?) started hanging out with some computer science majors. They got me interested in learning to program in C and Scheme, and in the 1988–1989 school year I started using Unix more. I also remember helping one of them install GNU Emacs on that VMS cluster. (At the time, the computer science department’s Unix cluster actually had Gosmacs installed instead of (or at least in preference to?) GNU Emacs.)
Oberlin College could send e-mail to other institutions via Bitnet, and had a DECnet connection with a half-dozen or so other colleges. (DECnet was pretty cool.) It also had Usenet feeds. It was not yet on any of the TCP/IP-based networks that became the internet.
College
When I went off to college in the fall of 1989, my parents brought me a Macintosh SE/30; I used it to write papers in non-technical subjects, play games, and do some amount of programming. (I wrote my papers on technical subjects in LaTeX; I’m honestly not sure whether I mostly typed those on my Mac or on one of the clusters mentioned below.) Continuing my habits from the last two years of high school, however, I spent much much more time on the various computer clusters around the college. I begged an account on the math department’s Sun workstation cluster, though the sysadmin and I had an iffy enough relationship that I didn’t spend very much time there. I begged an account on the computer science department’s Sun workstation cluster as well, where I spent more time. (There were probably Ultrix machines in that cluster, too?) And I got a part-time sysadmin helper job on the general school cluster. (Mostly Ultrix machines, initially with dumb terminals, though X terminals showed up fairly soon.)
I probably spent most of my time on the general school cluster: programming, playing around, and doing system administration work. Coming out of that, I was much more comfortable on Unix than in any other computing environment, and had installed various bits of free software (mostly GNU tools of various sorts) over and over again. I also had a friend from Oberlin who was then working at the Free Software Foundation, so I was getting a strong free software philosophical dose from him as well.
I took a couple of computer science courses (an intro theory course, a compilers course), but not many: mostly because I could learn how to program computers just fine on my own, partly because I had enough other interests competing for my course time. Also, at that time Harvard’s computer science department didn’t have the buzz that I’d gotten from Oberlin. (Though there were students and faculty members that I learned a lot from, don’t get me wrong.) I was into programming languages and compilers at the time: I did some sort of undergrad research project on compilers, I was a course assistant for a few courses on programming languages and compilers, and I spent three out of my four summers during that period doing programming-related work. (One summer at MITRE, one at DEC, one being a course assistant at Boston College; the fourth summer was spent at a math research program whose main benefit was that I became a not-hopelessly-incompetent cook.)
During this period, I had access to TCP/IP-based networks: ARPAnet had evolved into NSFnet, with the internet coming. The web poked its head out right at the end of this period, but it certainly wasn’t clear to me that it was anything more than a peer to the various other network protocols that were floating around at the time.
Life as a Mathematician
Then, after a year’s interlude, I went to math grad school in 1994. I still had my old Mac, Jordan bought a new Mac (that I played Marathon on), Liesl bought a 486 machine running Windows 3.1 (I played Myst, System Shock, and Dark Forces on that), and at some point I was given an X terminal that I could use at home. Most of my computer time was spent on the math department machines, though; and I essentially wasn’t programming at all during this time period. Also, a friend of mine gave me an NES, which started me on a spiral of depravity that I still haven’t emerged from. (One of the first things I did after getting my postdoc acceptance letter was to get a Nintendo 64; good thing my thesis was almost completely written by then…) Actually, though, my dominant leisure activity during that time period was reading books: I averaged more than a book a day over the course of grad school.
I can’t remember if I moved my old (9 years old at the time!) Mac with me when we went to Stanford in 1998; we moved Liesl’s computer, but I’m not sure if we ever turned it on. In general, I did my computing on the machine in my office at the math department; I can’t remember its specs (though I believe it had 4 GB of hard drive space?), but it was running an early Red Hat Linux version. I still wasn’t programming significant amounts: I was busy being a mathematician and a parent (Miranda was born in 1999), trying to figure out how to teach well, and playing video games, doing the latter almost exclusively on consoles instead of computers.
Returning to the Apple theme that triggered this post: during this period, my interest in Apple was quite low. I had a Mac, but barely used it; I certainly wasn’t going to use Windows machines, but really my focus was on Unix. (So, in terms of recent computing deaths, Dennis Ritchie’s is a lot more relevant.) I was at least partly anti-Apple at the time: the Free Software Foundation and the League for Programming Freedom had boycotted Apple because of their use of user interface patents, and that had an effect on me.
Transitioning
In 2002, academia and I came to a mutual decision that we weren’t as good a fit as I had thought. Fortunately, the Stanford math department was willing to let me hang around for another year; so I spent half my time that year teaching calculus and half my time brushing up my programming skills. I learned C++ and Java (object-oriented programming was far from dominant when I was an undergraduate), and contributed a fair number of patches to GDB.
It also became clear that I wouldn’t be able to depend on my employer to provide my computing resources; so I bought domains to use for my various internet presences, and, for the first time since 1989 (13 years!), acquired a new computer. It was a Dell Inspiron 8200 laptop, a behemoth that was barely portable (and that, fortunately, I rarely needed to carry anywhere); we set it up to dual-boot Windows and Linux, and I spent the vast majority of the time on the Linux side.
Also, befitting my academic nature, I started reading books and going to talks. A lot of the books that I read were C++-specific (and I learned a lot from them; C++ is an extremely interesting language); in terms of non-language-specific books, the refactoring book had a big impact. The talk that had the most impact on me was one that a couple of researchers in a local corporate think-tank (?) gave about their experiences with something called “eXtreme Programming”; that was my first exposure to Agile software development.
The GDB work led to consulting work at a startup called Kealia, and I started working there full-time when I left academia in the summer of 2003. We got acquired by Sun a year later; soon after the acquisition, I became a manager, albeit a manager who spent a lot of time programming.
Agile
I spent a lot of time trying to understand Agile software development over the next five or seven years. At first, I was just trying to do this on a personal level, practicing refactoring and trying out test-driven development. Kealia’s legacy code provided some interesting challenges on the former front; the company also already had a bit of a testing culture when I showed up, and we experimented with going farther in that direction. And becoming a manager got me interested in other aspects of Agile: the more explicitly people-focused aspects, the planning aspects. And, as part of planning, the idea that programmers don’t make all of the design decisions (which was quite a change from working on GDB!): other people have a better idea of what the end users really value, what will work well in their context.
As an academic, I’d been quite ivory tower (at least aside from my interest in teaching); that changed. I was working at a startup which got acquired by a larger company that had suffered a lot over the last few years; part of startup life is trying to figure out how to make your business work, and Sun was trying to figure that out at a larger scale. Sun also put enough resources behind StreamStar (Kealia’s video server project) that we had quite a lot of room to experiment with different business strategies, trying to find one that would stick. (Far too much room: the fact that Sun didn’t cancel StreamStar years before I eventually left was a sign of Sun’s own management problems.)
My boss was a big fan of Clayton Christensen’s disruption theories, and I got to see both sides of the difficulties of disruption first-hand. Sun was a large company that was already far along the path of being disrupted by commodity hardware running Linux, and was trying to figure out how to deal with that; StreamStar was trying to disrupt the existing broadcast television infrastructure, replacing it with IP-based solutions. In neither case did we navigate the difficulties well, but I have quite a bit of sympathy for both sets of difficulties: surviving being disrupted is extremely difficult, and when it comes to broadcast television, you have to deal not only with the existing technological infrastructure but with the existing broadcasters and existing content providers. So it’s not surprising that we failed to disrupt broadcast television delivery, whereas YouTube was much more successful with its end run around the last two issues.
During this time, I won an iPod (one of the hard-drive based models) and, a couple of years later, an iPod Nano at company raffles. I wouldn’t have bought the first iPod on my own, but its presence made my jogging a lot more pleasant; I probably wouldn’t have bought the iPod Nano on my own, but I was quite surprised how much more I liked its small size, the lack of skipping, and the general elegance of its design.
Our Dell laptop died in 2006, and had been showing its age enough by then that I was already planning to replace it. For my own Linux use, we got a Sun Ultra 20; to have a computer that Liesl could use and that I could run iTunes on, I got a MacBook Pro. This was the first model after the Intel transition; I felt more comfortable going back to the Mac instead of having a Windows machine around, and the fact that there was now Unix underneath MacOS was a real bonus. (Incidentally, back in 2003 I’d turned down a job offer working on GDB for Apple: I like Unix and the GNU toolchain, but I wasn’t really interested in specializing in the latter.)
At some point while I was at Sun (probably in 2008), I got an iPod Touch. That was really a revelation to me: it was wonderful having a little computer in my pocket, one that was already fairly versatile and was becoming more so every year; I had Wi-Fi access most of the places I spent time (there was even spotty Wi-Fi available from Google when wandering around Mountain View), but I could tell that having a phone network provide almost constant network access would be so much better.
But more than that: Tweetie made me sit up and take notice. That was the Twitter client that eventually became the first-party Twitter client; and despite running on this quite small device, I far preferred using it to any Twitter interface I had available on computers that didn’t fit in my pocket. That didn’t make much sense to me; clearly there was something going on with design that I didn’t understand and that could make a real difference.
At this time, I was also getting more and more tired of having Unix on my desktop. I love Emacs, but it’s stuck in the stone age in so many ways: what really drove that home was once when I fired it up on a machine where I didn’t have my standard .emacs file and realized that, by default, Emacs put the scroll bars on the left. That may have been a perfectly reasonable decision when it was first made, but it wasn’t any more and hadn’t been for at least a decade; did I really want to be working with tools that were so willfully ignorant about design conventions? GNOME had helped civilize X Windows, but it had only brought the experience up to a minimally acceptable level, and even so there were too many non-GNOME applications around.
Reaching the Present
So, when I started work at Playdom, I asked for a Mac for my work machine: that way I could have a Unix command line and tools combined with a GUI that accepted the idea that design was a virtue. Which the IT department was oddly hostile to: you’d think that a company with a large contingent of graphics artists that deploys software to Unix servers would be a natural fit for Macs, but Playdom had its quirks, and its IT department was definitely one of those quirks.
At around this time we got a second Mac laptop at home, and I got an iPhone. (My first cell phone; I am a luddite at times.) The Ultra 20 died; I decided that I wanted to continue to run a Linux server (e.g. to host this blog), but that I would prefer to interact with it through an ssh connection, so I got a virtual machine at Rackspace. Also, I was getting older, and carrying around a laptop during GDC 2010 put a surprising strain on my back; the iPad had been announced, so I decided I’d get one the next time I went to a conference. Which happened sooner than I expected, since I decided to go to GLS later that spring.
My back thanked me for the iPad purchase; but my psyche thanked me as well, to a surprising extent. I found that I preferred reading e-mail on the iPad to reading e-mail in a web browser, and that I far far preferred reading blogs in Reeder to reading them through Google Reader’s web interface, whether I used the latter to go to the blogs’ web pages or stuck with the RSS feed. In both cases, the iPad acted like a wonderfully adaptable piece of paper: the words I wanted were right there, with enough style to be pleasant (unlike the Google Reader web interface) but without any surrounding crap (unlike blogs’ web pages). Having a screen that was much smaller than the computer monitors that I was used to, and that was in portrait mode instead of landscape mode, turned out to be excellent for letting me focus on what I was reading. (As it turned out, I even slightly prefer reading blogs through Reeder on my iPhone over reading them through a web interface on a standard computer, despite the rather-too-small size of the former’s screen.)
In early 2011, one of our laptops died; rather than replace it with another laptop, we got an iMac and a second iPad. Our current technology roster is an iMac and a MacBook (one of the white plastic ones); two iPads (one from each generation); three iPhones (one from each of the last three generations, though the oldest one is being used by Miranda as an iPod Touch instead of as a phone); a virtual machine located elsewhere running Linux; and half a dozen game consoles. (My rate of technology purchases has increased enormously since 1998.) Also in 2011, I started working at Sumo Logic; as is typical in startups around here (at least judging from the ones I’ve interviewed at), it’s largely a Mac shop for development (with deployment happening on Linux virtual machines), and my coworkers generally prefer various Apple products for personal use, though there’s more variation on the personal side.
So: that’s the computers and other technology that I’ve used over the course of my life. Apple played a large role when I was young and more recently, but in the middle there was a long phase where my norm was Unix + GNU toolchain, with a strong free software ethos. Why did I shift out of that, what’s behind my recent fascination with Apple’s products and, increasingly, Apple as a company?
Habitable Software
The first reason is the concept of “habitable software”. I talked about this last year: the idea is that there is software that my brain shies away from using, and there’s software that I actively look forward to using, where the thought of using it relaxes me or brings a smile to my face.
I actually think that console gaming gave me my first nudge in this direction. You stick the cartridge into the machine, you pick up a controller with a relatively constrained set of inputs, you turn on the machine, and it just works. Note too that a console controller, unlike a mouse and a keyboard, is explicitly designed for the task at hand: yes, gamepads may have a few too many or too few buttons and sticks for a given game, but at least they’re focused on the domain of playing games. (Hmm, maybe the controller/game match is why I think back on text adventures with so much fondness?) I keep on installing Windows on machines with the thought that I’ll finally play the many important PC games that are missing from my background; and I keep on deciding that no, I really don’t want to put up with the crap that PC gaming makes you deal with.
But shifting from X Windows back to the Mac also gave me a huge shove towards being sensitive to habitable software; and going from the Mac to iPhone/iPad software like Tweetie and Reeder was, in its own way, just as large a leap. Every time I use X, I find something that feels wrong; a Mac feels neutral, but I don’t generally look forward to turning it on; Tweetie and Reeder make me actively happy. It’s not just software that I’m learning from, either: I was surprised how much happier I was with the iPod Nano because of its small size, light weight, pleasant screen, and lack of skipping.
The Unix command line also makes me actively happy. It’s wonderfully coherent; for certain tasks related to writing and, especially, deploying software, it’s just what I want; I love the interface that it presents to me. So it’s no coincidence that I do my programming on machines where a Unix terminal window is one key combination away, and that I use virtual machines running Linux to deploy software on: I feel completely at home in those contexts when working on those tasks.
Designing Software
Habitability is how I like to express the importance of design in software to me as a user. But I’m a programmer too, so I see design from that side as well.
When I was younger, I spent much of my programming time concerned with tools for programmers: thinking about programming languages and compilers, working on GDB. In those contexts, I didn’t have to think too hard about design: I was an acceptable proxy for the software’s end users, so if something felt good for me, then that was good enough.
That’s a relatively unusual subset of software, however; as I started to work on other kinds of products, I realized that my design instincts wouldn’t do a very good job. And, at the same time, I got interested in Agile: and one of Agile’s main tenets is that design concerns (personified as the “Customer”) are paramount when deciding what to work on. Not that the technical details aren’t important as well—you get great benefits from keeping your code flexible and well-architected—but ultimately it’s not programmers’ jobs to decide what’s important to present to the users.
Even though it carves out a space where design can happen, Agile isn’t actually very good at giving you advice on how to design well: specific recommendations are much more focused on the programming side of things (e.g. refactoring, test-driven development) or the programming/design interface (estimating, iterating) than on the design side of things. Also, my talents and instincts are much stronger on programming than on design: I still have a lot of room for improvement, but I’ve got some understanding of what’s involved in writing code that’s clean and functional from a technical point of view, whereas I have much less understanding of what’s involved in developing a product that people are actively happy to use.
And, to produce really great products, I’m not convinced by Agile’s engineering/customer representative split. The Lean concept of a Chief Engineer who’s immersed in both worlds seems much more powerful to me, and I see around me wonderful pieces of software written by single individuals, or startups (including Sumo Logic!) run by people with both a vision for what they want to produce and the technical chops to help bring that into existence.
Apple can probably be cited as evidence on either side of the argument about that split, but there are clearly individuals who made a huge difference in its products. Apple also points out how ludicrous it is to label the designer as the “Customer” if you really want to produce something new and great, and points at the limits of the analytics-focused mindset that I saw so much of at Playdom; in general, Apple’s approach to iteration seems interestingly different from yet related to Agile norms. And their systems approach gives Apple many more design knobs to turn than they would have if they were exclusively a software company. (Or exclusively a hardware company, of course.)
Business Success
Back in my academic days, I didn’t care about practical applications of my research. When I started working for startups, though, that changed: if you don’t have your eyes on how you’re going to make money out of your startup, you’re doing the wrong thing. (Not that startups don’t have a heavy dose of ego satisfaction in them, of scratching your own itch.)
Once I started paying more attention to making money, it turned out to be totally fascinating: if you like complex systems, capitalism is full of them. Just take cash flow: where money is coming in, where money is going out, the difference between those two in quantity and in time. So many possibilities there!
Apple’s business success over the last decade is staggering, of course. But they are fascinating far beyond their simple profit figures: the consequences of their systems approach to design, their use of their savings to buy vast quantities of parts from their component vendors (and even to allow those vendors to purchase tooling!), the role of their physical stores, the list goes on and on. There’s still a stereotype of Apple as making overpriced products, but their competitors are finding it very difficult to build products with the hardware quality of the iPad or MacBook Air while maintaining any sort of profit margin at all.
Of course, lots of startups aren’t focused on being profitable: Silicon Valley is full of companies that are trying to get eyeballs, hoping that profitability will come somehow, and perfectly happy to sell the company to somebody else who can worry about that problem. We see echoes of this in the Android / iPhone fight, and these days I’m generally more interested in making money than in having users without a good business model; but the iPod shows that you don’t always have to compromise, that you can win on both fronts.
Disruption
I mentioned Clayton Christensen’s disruption theory above: living in Silicon Valley, there’s no end of startups trying to remake an industry, no end of once dominant companies that stumbled, got bought, died.
Apple looked like it was following that latter trajectory; it pulled out of its decline like no other company. And did so in a very interesting way: not only did it disrupt other industries, it also disrupted itself, with the iPhone cannibalizing iPod sales and with the iPad cannibalizing laptop sales. This is extremely difficult to do: existing successes almost always lead to institutional antibodies that attack new products, leaving that success to newcomers.
Over the last decade, we’ve all become aware of disruption; the companies that can figure out how to repeatedly harness the powers of disruption will be the ones that flourish (the ones that survive at all!) over the next few decades. They will have to learn from Apple. And if I’m going to continue to build a career working at exciting companies, I’m going to want to learn from Apple, too, to help me figure out what sorts of qualities to look for the next time I’m on the job market, to pick employers that will disrupt successfully!
Repeatable Creativity
Disruption aside, though, there’s something amazing about Apple’s run of products over the last decade: one interestingly new product after another. I wish I knew how they did that.
It’s easy to ascribe this to a solo genius theory; but, while I don’t want to minimize Steve Jobs’s contributions, I don’t think that’s all that’s going on here. Pixar is another relevant datum: they’ve also managed to be consistently creative, and they continued to do that after Jobs sold the company to Disney. Perhaps because of the domain, people don’t credit Jobs with the same influence on Pixar’s repeated creative success as they do with Apple’s; but, to me, the two companies suggest that Jobs had learned something about helping groups to innovate repeatably in a way that goes well beyond his personal contributions.
Over the last couple of years, stories have come out about some sort of Apple University, which seems to be trying to systematize those ideas. This reminds me of Toyota’s conscious efforts to improve themselves as a learning company; Apple is, sadly, much more secretive than Toyota, but I hope more of Apple’s methods will become public over the next decade. And, of course, I hope that Apple will be able to continue to innovate over the next decade, that their innovation really is due in part to a systematizable process.
Bad Apple
During the mid-’90s, I was down on Apple. I had hoped that was behind us with the new decade, however: their user interface patents had gone away, and they were active open source contributors, though that clearly wasn’t the company’s main focus.
Unfortunately, those problems have come back in spades. By far the one that I find most distasteful is their aggressive use of patents: I think software patents are bad for the industry, bad for the world, and while I’m more and more bored by other companies that seem to largely be trying to produce knockoffs of Apple’s products, I very much support allowing those companies to do so.
Apple’s recent systems are also much more closed than the computing platforms I’d used before. I would expect that to bother me; for whatever reason, though, it actually doesn’t particularly. Certainly it would if I didn’t have ample access to other computing platforms, or if the tools to develop for iOS platforms weren’t so readily available; and while Apple teeters on the edge of behaving in a manner I find unacceptable in their application approval process, for whatever reason I generally think they’re okay. (I’m actually more worried about Amazon’s behavior in that regard.)
I’m being ungenerous in saying this, but: these days, when I read Richard Stallman complaining about Apple’s closed systems, part of my brain interprets that as RMS wanting it not to be his fault if other people don’t have software they want to use: RMS has made an open system, so it’s other people’s fault if they don’t take advantage of that. These open systems are, in all seriousness, a great good; but actually having good software on your computer is also worth something, and having software that’s a joy to use is a great good as well. It’s fine if having well-crafted software for the non-programming public isn’t RMS’s concern, there’s no reason why it should be; but I see him as a single-issue voter whose issue is no longer dominant to me, and who is willfully blind to other issues that are important to me.
To those of you who have read this far: I salute you. And to those of you who don’t like Apple’s products, who don’t care about what Apple has done as a company: that’s great; there’s no reason why your interests should be my own. And there’s no question that the company has flaws, does things I really don’t like. But I’m fascinated for many reasons by what Apple has done over the last decade, and I fully expect to be trying to sort out the implications for much of the next decade.
Some Jobs-related posts that I particularly enjoyed:
- John Shook asking Was Steve Lean?
- Another lean-focused post, this time from Evolving Excellence
- Horace Dediu on what Steve Jobs didn’t do.
- A podcast reminiscence from John Siracusa.