As promised, here’s some more information about StreamStar (which is officially known as the “Sun Streaming System”). It’s an extremely high-performance video server, targeted at cable companies and telcos trying to move into the video space; its distinguishing features are that it can pump out an extremely high number of video streams at a much lower cost per stream, and in a much smaller footprint (in terms of rack space), than its competitors, backed by a much larger amount of storage. It’s a distributed system: some of the nodes (handling video ingest, control traffic, etc.) run on general-purpose Opteron servers, but there are also storage nodes (Thumpers, each of which contains 24TB of disk storage plus a couple of Opterons), and one specialized box.

The specialized box, StreamSwitch (whose official name is “Sun Fire x4950”; blech), is quite a beast. Fully loaded, it has 32 10Gb Ethernet ports for streaming video out, another 32 10Gb ports for getting video off of the Thumpers, and up to 2TB of memory for caching video, so we don’t have to hit the disks every time we want to stream.

This is an obscene amount of streaming and video capacity. On the streaming side, we can send out 320Gb of traffic every second. Modern encoders can generate pretty good-looking standard-definition video at 2Mbps using H.264; we can support 160,000 individual streams at that bitrate. (I’m not sure what the best numbers are for high-definition H.264 video – maybe 6-8Mbps? Double those numbers if you’re using MPEG-2, but at this point H.264 is mature enough and has enough momentum that there’s a clear shift in its direction.) And that number isn’t marketing hype: while we do lose a little bit to bandwidth allocation overhead (rounding, fragmentation, etc.), the hardware and software really are capable of saturating those ports. We’ve observed performance of above 310Gbps, which is a huge amount of streaming capacity. (Fun fact: 40,000 StreamStar systems would provide enough video capacity to send a high-definition stream to every single television on the planet! So this will never be a high-volume product – no matter how well it takes off, there’s a cap on how many we can sell…)
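(If you want to check that arithmetic yourself, here’s the back-of-the-envelope version as a little Python sketch; the port counts are the fully loaded configuration, and the HD bitrate is just my guess from above.)

# Back-of-envelope streaming capacity, using the numbers above.
STREAMING_PORTS = 32      # 10Gb Ethernet ports dedicated to outbound video
PORT_RATE_GBPS = 10       # line rate per port

total_gbps = STREAMING_PORTS * PORT_RATE_GBPS      # 320Gbps going out

def streams_at(bitrate_mbps):
    """How many simultaneous streams fit in the outbound bandwidth."""
    return (total_gbps * 1000) // bitrate_mbps

print(streams_at(2))      # standard definition at 2Mbps: 160,000 streams
print(streams_at(8))      # high definition at ~8Mbps (my guess): 40,000 streams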

And this can be hooked up to 32 Thumpers, providing 320Gbps of video data coming into the switch. Again, using the same 2Mbps number, each Thumper can store around 9400 hours of video. So a fully loaded system could conceivably store around 300,000 hours of video. There are a few bits of marketing hype here – for one thing, you’d probably want to store each video twice for redundancy purposes. (Even with RAID providing data security guarantees, you’ll still occasionally want to bring a Thumper down for maintenance or something.) Even if you divide by two for redundancy purposes, though, you can still store 75,000 two-hour movies, or a month’s worth of TV from 200 different channels. (The system is designed to ingest large numbers of video streams in real time, for live TV; it can ingest movies much faster than real time.) Also, we can’t currently saturate the 10GbE ports coming out of the Thumpers; we’ll work on improving that, but I’m not sure how much it matters. Many people will be watching the same content simultaneously, so we don’t actually need the bandwidth coming into the switch to equal the bandwidth going out of the switch; and having a couple of terabytes of cache (almost a minute per stream) gives us a lot of wiggle room to play with there.
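(Again, if you want to check the math, here’s a quick sketch using the figures I just quoted; the per-stream cache number assumes all 160,000 standard-definition streams are active at once.)

# Storage and cache arithmetic, using the figures quoted above.
THUMPERS = 32
HOURS_PER_THUMPER = 9400                    # at 2Mbps

total_hours = THUMPERS * HOURS_PER_THUMPER  # roughly 300,000 hours
usable_hours = total_hours // 2             # store everything twice for redundancy
print(usable_hours // 2)                    # two-hour movies: about 75,000
print(usable_hours / (200 * 24))            # days of TV across 200 channels: about 31

# Cache headroom: 2TB of memory spread across 160,000 streams at 2Mbps.
CACHE_BYTES = 2 * 10**12
STREAMS = 160000
bytes_per_stream = CACHE_BYTES / STREAMS    # 12.5MB per stream
print(bytes_per_stream / (2 * 10**6 / 8))   # about 50 seconds of video per stream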

So what can you do with all of this? It opens up new usage models in two different ways. For one thing, that’s a lot of storage. (768 TB; I imagine that it won’t be too long before we’re up to a petabyte!) If it’s used for movies, then you can go very far indeed down the long tail. I haven’t checked – how many DVDs have ever been released? I assume that Netflix will still have us beat towards the end of the long tail, but we can put up a good fight. Alternatively, you can use some of that storage for live TV. It lets cable companies store weeks’ worth of TV channels, so if you miss a TV show, no problem: it can all be sitting there on the server. (At least if the content providers will let them: as the Cablevision case shows, content providers apparently have a horror of letting people actually, say, watch TV shows when they want to.)

For another thing, the streaming bandwidth means that everybody watching TV can get their own unicast video stream. Right now, TV signals are being broadcast out: a TV station sends out a signal once, and everybody watches that same signal. So everybody is watching the same thing at the same time. Cable companies can refine that model somewhat, so different zip codes might, say, get slightly different ads, but it’s pretty far from being personalized.

If everybody gets their own video stream to play with, though, then there are a few new games you can play. For one thing, it can act as a TiVo (except with a lot more disk storage, and with the disks at the head end instead of at each TV): you can pause, rewind, and fast-forward at will. (Networked Personal Video Recorder, or NPVR, is the buzzword here.) Which is great for TV watchers. For another thing, it opens up new ad models: each person can get their own customized set of ads. Which is great for TV providers and advertisers; with luck, it will even mean that the ads that TV watchers get will be a little more interesting to us, too.
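(To be clear, this is not how StreamStar is actually implemented – I’m not about to post our internals here – but just as an illustrative sketch, the per-viewer state that NPVR-style trick play and per-viewer ads require is surprisingly modest; everything below, including the ad_inventory parameter, is hypothetical.)

# Purely illustrative: hypothetical per-viewer session state for NPVR-style
# trick play and per-viewer ad selection.  This is not StreamStar's design.
import time

class ViewerSession:
    def __init__(self, viewer_id, channel, ad_profile):
        self.viewer_id = viewer_id
        self.channel = channel        # which ingested stream we're playing from
        self.ad_profile = ad_profile  # used to pick this viewer's ads
        self.lag = 0.0                # how many seconds behind "live" we are
        self.paused_at = None

    def pause(self):
        self.paused_at = time.time()

    def resume(self):
        # While we were paused, "live" kept moving; the content is still on the
        # server, so we just fall further behind and keep playing from here.
        if self.paused_at is not None:
            self.lag += time.time() - self.paused_at
            self.paused_at = None

    def rewind(self, seconds):
        self.lag += seconds           # anything still on disk is fair game

    def fast_forward(self, seconds):
        self.lag = max(0.0, self.lag - seconds)   # can't get ahead of live TV

    def next_ad(self, ad_inventory):
        # Each viewer gets their own ad, rather than their zip code's.
        return ad_inventory.pick_for(self.ad_profile)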

All of a sudden, we’re a lot closer to the holy grail of video service: we’re not quite at the point where anybody can watch any movie ever made, or any television show, any time they want, but we’re much, much closer to that than we’ve ever been before. And we can do this all at a reasonable price, one which is more than low enough to fit the cable companies’ business models. (And which is significantly lower than the price of existing low-capacity video servers.) In fact, at this stage the barriers to that holy grail seem to be legal rather than technical: we just have to convince the people who make the movies and TV shows to let people watch them!

This has been a fascinating product to work on; it’s hard to believe that I’ve been working on it for almost four years. I have a great bunch of coworkers, I’m still learning something new every day, and we have some interesting things coming up in our road map. It’s wonderful that the product is finally launching; I expect great things going forward.
