Wednesday, 1:30pm–2:30pm: “Prototype Through Production: Pro Guitar in Rock Band 3,” by Jason Booth and Sylvain Dubrofsky

The slides are available online.

In 2008, music games were a big thing, and Harmonix needed to innovate.

Harmonix has this idea of The One Question that they focus a game’s work around. For Guitar Hero: is it rock? For Rock Band: is it an authentic band experience? Rock Band 3 used the same question, but pushed it further: make it more authentic. Which leads to pro guitar: have people play on a real guitar and really learn how to play!

First, 3 months of early exploration. Question: is the problem space to teach people how to play guitar, or to play RB with a real guitar? These don’t actually intersect that much. Target audience: hard/expert RB guitar players.

Use the illusion: mute/unmute the sound of real tracks, so the first time you get it right it sounds great. This is very unlike learning to play on a normal guitar, where you sound awful for a long time.
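The mute/unmute illusion could be sketched as gating the real guitar stem’s volume on the player’s recent accuracy. This is an illustrative sketch, not Harmonix’s actual audio code; the function name and the smoothing ramp are assumptions.

```python
# Hypothetical sketch of the mute/unmute illusion: the recorded guitar
# stem is audible only while the player is hitting the authored notes,
# so a correct performance sounds like the record. The ramp keeps the
# gain transition smooth instead of clicking on a single miss.

def stem_gain(recent_results, ramp=0.2):
    """Return the guitar stem's volume (0.0-1.0) from recent hit/miss results.

    recent_results: list of booleans, True for a hit, over a short window.
    """
    if not recent_results:
        return 1.0  # nothing to judge yet; let the track play
    hit_ratio = sum(recent_results) / len(recent_results)
    # Fully audible while hitting everything, fully muted after sustained misses.
    return max(0.0, min(1.0, (hit_ratio - ramp) / (1.0 - ramp)))
```

The point of the design, per the talk, is that the first time you get a part right it sounds great, unlike a real practice amp.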

How to notate frets and strings? First idea: fret relative. Columns represent frets, represent strings some other way. Didn’t work so well, so tried having columns represent strings. Worked a little better, still issues to work out.

And some early constraints. It had to: display notes, chords; be pitch accurate; target non-musician RB players. It didn’t have to: work with bass; work with all songs; work in all game modes.

This was enough to give them something to work with when they had time, and to assemble a small strike team. Benefits: implementers make decisions. Focused meetings. Seating proximity. 

How they prototyped. No good hardware solutions: cost, latency, accuracy problems. Do whatever works best for now, use info to inform later decisions.

Question: how to communicate what to play? Approach: focus on building muscle memory, not music theory. Traditional notation is good for conductors and highly compressed (e.g. key signatures). Guitar players use tablature instead: it’s physically centered. But it has its own problems: it doesn’t work well for chords or rhythms. A third possibility is chord charts, but those only work well for chords.

Need something else. The notation should be physically centered, compressed enough to be recognizable at a glance, work with time.

Different chords have different shapes on your hand. Can play same shape at multiple places on fret board, how to represent that? Also, what about riffs, loose strumming, arpeggios? All opportunities for compression.

So: a new notation system, string relative. Should it be horizontal or vertical? All music notation is horizontal, so surely horizontal must be better. Their first version could work either way; they found horizontal moved too fast and made shapes hard to represent. So vertical was better.

Song selection for early prototypes. Did songs that would definitely work, e.g. I Love Rock and Roll. Also simplifies early prototyping if doing a 3-chord song!

Next questions: how to teach chord shapes? What info do players need to understand what to do with their hand? What else do they need to know?

First idea: inline training into songs. Really jarring, though. Next idea: chord book. Tried virtual hand, but players didn’t look at that. Players just need to know what they need to do and what they’re doing wrong.

Added in a “wait mode”: when learning a part in real life, you pause periodically while figuring out what to do next. So they’d do the same thing, where the game would pause while you put your fingers in the right place.
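The wait-mode behavior above could be sketched as a loop that holds song playback until the detected fingering matches the chart. This is a minimal sketch; `detect_fingering` and the frame-based loop are assumptions for illustration, not the shipped implementation.

```python
# Illustrative "wait mode" sketch: playback pauses at an upcoming chord
# until the player's left hand matches the required shape, mirroring how
# you naturally pause while learning a part in real life.

def wait_for_shape(target_shape, detect_fingering, timeout_frames=None):
    """Hold the song until the player's fingering matches target_shape.

    target_shape: set of (string, fret) pairs the chart calls for.
    detect_fingering: callable returning the currently held (string, fret) pairs.
    Returns the number of frames spent waiting.
    """
    frames = 0
    while detect_fingering() != target_shape:
        frames += 1  # the song stays paused on this frame
        if timeout_frames is not None and frames >= timeout_frames:
            break
    return frames
```

A usage sketch: feed it a detector that reports the hand settling into a G5 power-chord shape over a few frames, and it returns how long the game waited.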

They play tested with various internal groups: little musical experience, lots of musical experience, little guitar experience, lots. Asked how fun it was; players weren’t sure that was the right question, but they persevered.

Big question that playtesters had: what is their left hand actually doing? So they wanted to show the shape the hand was actually in.

Still problems: upfront learning, complex chords, screen real-estate, not all songs work well. Won’t be able to solve all of this, but wanted to chip away.

Had done enough to inform hardware requirements: need to know what left hand is doing, need low latency, need low cost. Ideal is actual guitar, but that has problems: never been done, could be expensive, strings break. So maybe plastic guitar approach? Decided to pursue both approaches, and ended up shipping both. 

Prototyping suggestions: reduce team size as much as possible. If an idea / issue keeps on coming up, you need to try it. Don’t skimp on low-hanging fruit.

Cycle: establish core goals and constraints, hack it in, playtest, repeat.

Prototyping took 7 months; on to production. (Which took 13 months.) They were confident coming out of prototyping, and decided to raise the stakes. Do it in every song, every mode; anybody can learn on easy, and every note is charted on expert. Training on all songs at all four difficulty levels, plus bonus music training.

Still needed to figure out: standard RB stuff (HOPOs, solos, …), advanced song techniques, song authoring.

Problem 1: hardware. No MIDI drivers. Seven bridge modules to translate from MIDI to Xbox. Eventually got 7 of the plastic controllers, 3 of the real ones.
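The core of that MIDI-to-Xbox translation is recovering (string, fret) from what the controller reports. A hedged sketch, assuming standard tuning and that the controller indicates which string fired (pro-guitar MIDI controllers commonly route strings to separate channels); none of this is confirmed as Harmonix’s bridge code.

```python
# Sketch of the translation a MIDI bridge performs: given which string
# fired and the MIDI note number, recover the fret. Assumes standard
# tuning; OPEN_STRINGS holds the MIDI note of each open string.

OPEN_STRINGS = [40, 45, 50, 55, 59, 64]  # E2 A2 D3 G3 B3 E4, low to high

def note_to_fret(string_index, midi_note):
    """Return the fret for midi_note on the given string (0 = low E)."""
    fret = midi_note - OPEN_STRINGS[string_index]
    if fret < 0:
        raise ValueError("note is below the open string's pitch")
    return fret
```

For example, MIDI note 67 (G4) on the high E string comes out as fret 3.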

Problem 2: late content. The final setlist isn’t fully picked until shortly before shipping. They decided to chart snippets of famous songs to shake out issues: famous songs, songs that should work well, and songs that would stress the system.

Authoring guidelines solidified for easy, medium, hard, and expert (medium: simple chords; hard: still parseable in real time). Plus 17- vs. 22-fret guidelines.

Worked on general theory lessons, too.

Prototype to production took a lot longer than expected.

Problem 3: slow progress. The team size had increased, so newcomers had no shared history. Needed more external focus.

Got back on track: refocused on target audience. Switched to short-term deadlines.  Pulled playtest dates forward to expose problems sooner.

Playtest revealed lots of little issues, and the chord book wasn’t working as well as they’d hoped. But a lot of things worked well; if they could nail those, they’d work well with their core audience, move on to more advanced stuff.

Added numbers to hand shape; pulled chord learning into songs. Then that gave them time to implement a lot of wishlist items.

I asked about the lack of auditory feedback for mistakes. They thought about it a lot, but decided that preserving the illusion was so important that they wouldn’t even allow it as an option.
