Ira to Go: How We Put Our Amazing Human Head Demo on Our Next-Gen Mobile Processor

by Brian Caulfield

Shock. Awe. Wonder.

Those were the reactions when we unveiled a startlingly lifelike demo of a rendered human head – nicknamed “Ira” – at our GPU Technology Conference earlier this year. Ira’s skin moves and stretches as he blinks and grimaces. His mouth moves realistically as he talks. In short, Ira, based on a model of a real person, looks warm and approachable – a leap forward from the uncanny, rubbery faces generated by older generations of graphics technology.

Now, we’ve built a version of Ira you can take anywhere. Think of it as Ira to Go. That’s because our next-generation mobile technology, Project Logan – which we unveiled this week at the SIGGRAPH visual computing conference – relies on the same Kepler graphics architecture found in the GeForce GTX Titan GPUs we used to generate our original Ira demo.

Take me with you: our next-generation mobile technology relies on the same Kepler graphics architecture found in the GeForce GTX Titan.

“The community has been waiting for NVIDIA to lead the way,” said Kevin Krewell, a senior analyst at The Linley Group. “Now they’re in the position in the industry where they should be: leading.”

Kepler’s unofficial mascot, in other words, has been leading a secret double life. Ever since March – when we first unveiled the demo we created in partnership with the Institute for Creative Technologies at the University of Southern California – our in-house development team has been working to put Ira on our next-generation mobile technology.

For most of that time, the team has been working on a pair of developer kits that replicate what Logan can do. It wasn’t until about 10 days ago that they got Logan itself. The chips – thanks to our bring-up team – worked just as expected. Within days, the demo was ready to show off at SIGGRAPH.

It’s an effort that wouldn’t be possible without Logan’s Kepler-based architecture and its support for high-octane graphics features – full tone mapping, bloom, FXAA 3.0 and full HDR – previously available only on high-powered PCs.

These tools let the team build a demo that smiles and grimaces in real time and that holds up under different lighting conditions, while letting the user zoom in and out or pan side to side.
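To give a rough sense of one of the features mentioned above: tone mapping compresses the wide range of brightness values in a high-dynamic-range (HDR) scene down into the limited range a display can actually show. The sketch below is a generic Reinhard-style operator in Python for illustration only – it is not the shader code used in the Ira demo.

```python
def reinhard_tonemap(radiance, exposure=1.0):
    """Map an HDR radiance value in [0, infinity) to a displayable value in [0, 1).

    A classic Reinhard-style curve: dim values pass through nearly unchanged,
    while very bright values are compressed so they never clip past 1.0.
    """
    x = radiance * exposure
    return x / (1.0 + x)

# Dim light is barely compressed; very bright light is squeezed below 1.0.
print(reinhard_tonemap(0.1))    # stays close to its input
print(reinhard_tonemap(100.0))  # compressed to just under 1.0
```

In a real renderer this kind of curve runs per pixel (usually per color channel) in a shader after the HDR scene has been lit, which is why hardware support for full HDR and tone mapping matters for a demo like Ira.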

While the demo doesn’t include all the features seen in the original Ira demo – remember, Logan is for low-power mobile devices – it does include a number of grace notes the original didn’t. For example, the portable version of Ira includes details such as tear ducts and the slight redness around the edge of the eye where it meets the eyelid. And light bounces off the irises of Ira’s eyes more realistically.

The result, we think: a helping of shock and awe, to go, hold the uncanny aftertaste. “There’s no other mobile processor demo that comes close,” said Krewell. “They’ve raised the bar for the entire graphics industry.”

Video, posted below. Head and skin data courtesy of the Institute for Creative Technologies at USC.