Live: NVIDIA’s CES Press Event
January 5, 2014
NVIDIA CEO Jen-Hsun Huang opened our official presence at the 2014 International Consumer Electronics Show in Las Vegas on Sunday at 8 pm Pacific.
You can catch highlights by scanning our live blog of the event, or watch the whole presentation in the video replay embedded below.
9:37 – Jen-Hsun winds things down quickly. He thanks the audience. People get up, with scattered applause. There’s a reception afterward where the technology will be demo’d, and most of the 300 visitors stream off in that direction.
9:35 – There you have it: our announcements this week. They’re about games, chips and cars – GameStream and G-SYNC; Tegra K1, the heart of GeForce and the soul of Tesla, which will come in two versions, 32-bit and 64-bit; and Tegra K1 VCM. You’ll have a supercomputer in your car, which will move cars toward semi-autonomous driving.
9:34 – The crop circle did bear some hints of Tegra K1: the number 192 – which corresponds to the number of cores in Tegra K1 – was cut into the circle in braille. The reporter who decoded it thought it was an internet address.
9:32 – Ladies and gentlemen, this is Tegra K1. Before I close, Jen-Hsun says, I have a confession. This is the biggest project our company has worked on, taking Kepler and bringing it into mobile.
Such a supercomputer inside your car could bring next-gen cars more quickly. Tegra K1 is impossibly advanced – it’s practically built by aliens. The engineers have worked so hard on this for so many years. So, he told his marketing team: I want you to put a marketing campaign behind this that rivals its technical complexity. I want everyone to do this, but you have no marketing budget.
So, Jen-Hsun says, NVIDIA was behind a 310-foot crop circle cut a few weeks ago into a field of barley two hours south of San Francisco, near Salinas. Some observers thought that aliens could have carved it out.
“When I learned about it, I learned it on CNN. Some people thought aliens had done it,” he said. The coverage has been fantastic. The site of the circle has been visited by fleets of helicopters.
9:28 – Jen-Hsun, who’s been struggling with his voice, just about loses it completely at this point. “This demo just took my breath away,” he jokes.
9:26 – Project Mercury allows you to design and render gauges virtually and bring them inside the car.
A demo is shown of how to configure a car’s control panel: using an Android tablet, you can choose the dimensions, materials, sheen and reflectivity. The rendered gauge looks in some ways more realistic than a physical dashboard, appearing much as it would in a car.
9:22 – Jen-Hsun now shows another initiative. There’s a series of materials that are rendered – ceramic, gold, copper, rubber, marble. The light is bouncing off everything in a highly realistic way.
This uses physically based computer graphics to simulate and render photo-realistic objects.
As a result, Tegra K1 VCM’s Project Mercury is intended as a configurable, customizable tool that brings physically accurate rendered gauges into the car.
9:18 – NVIDIA is inside 4.5M cars, with many millions more coming, in more than 100 models.
Today, Jen-Hsun says, we want to introduce you to the Tegra K1 Visual Computing Module (VCM), which is about bringing a supercomputer into your car. The same Kepler architecture that powers the world’s 10 most efficient supercomputers is in Tegra K1. Here’s why we need it in the car: advanced driver assistance. That includes technologies like pedestrian detection, blind-spot monitoring, lane departure warning, collision avoidance, traffic-sign recognition, and adaptive cruise control.
We’ll need to provide platforms so that when new software is available, it is automatically downloaded. Tegra K1 can do that because it’s fully programmable.
These features will evolve so that cars will ultimately become self-piloting.
9:13 – Let’s talk now about cars. I’ve got a few more surprises for you that Tegra K1 has up its sleeve.
NVIDIA’s been engaged with the auto industry for years – initially for design, then styling, then simulations. And as we’ve improved rendering until it’s almost photo-realistic, it can now be used for marketing purposes as well. With the mentoring and tutoring of great customers, we’ve learned a lot about car interiors.
What you see and experience when you’re driving matters so much. We believe parallel computing will revolutionize how a car is built, how it looks, how it drives.
Let me show you what state-of-the-art, physically based computer graphics looks like. A sports car with a liquid-like surface is being rendered back home in Santa Clara, calculating the movement of every photon of light bouncing through the scene – all simulated on a supercomputer there. So a retail shop could render what a customer’s car would look like in a way that’s almost terrifyingly realistic.
No question, the car looks like it’s sitting right in front of you.
Jen-Hsun now configures the car by color – white, orange, chrome – and it’s physically modeled, so it looks outrageously real.
9:06 – It’s a modest demo, but it gets a round of applause, mainly from geeks in the audience who appreciate what they’ve seen.
9:05 – Jen-Hsun is summing up this section: Tegra K1, Unreal Engine 4, Next-gen graphics on a mobile device for the first time.
Tegra K1 is based on DX11. Its GPU has more horsepower than Xbox 360 and PS3.
If you compare Tegra K1 to another platform, compare it to Apple’s A7: K1 delivers nearly three times its performance. And when you consider its next-gen graphics capability, we’re even further ahead.
Here’s what I’d really like to go over with you: it’s the first 192-core processor. It comes in two versions – one with a quad-core A15, the other based on our Denver CPU, a fully custom CPU built on 64-bit ARMv8 with very high single- and multi-threaded performance. Tegra K1 will come with dual-core Denver.
I know what you’re thinking: it’s probably a PowerPoint launch. But actually, dual-core Denver has been back from the fab for only a few days, so I thought we should show it.
Jen-Hsun shows the Android OS running on Tegra K1 – just the desktop of an Android device. But the fact is, we’ve worked on this for five years, and it’s going to be incredibly fast. It’s the world’s first demo of 64-bit ARMv8 on Android.
8:57 – Next, a pedestrian household scene is shown: a living room with a plain beige couch in UE3; it looks pretty tired. Then another version is shown in Unreal Engine 4 with physically based rendering. The couch is soft, looks kinda cushy, and the wood floor picks up realistic reflections.
8:55 – These tiny details are what increase the production quality of games. They let game developers create something exquisite.
8:53 – Global illumination, high-dynamic range and physics processing can all be done on Tegra K1.
Here’s another demo of Unreal Engine 4 running on K1. Everything in the scene is a light source, which creates a subtle mood in dark moments. There’s a high degree of texture quality; water is dripping off the walls. Brights are bright, with light bouncing sharply off them. Darks are dark, with only glints of light. UE4 calculates what it’s like for your pupil to adapt to different degrees of lighting as you move through a scene.
Jen-Hsun says, “this is real time computer graphics on a tiny mobile chip.”
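The pupil-adaptation effect described above is usually implemented as exponential smoothing of the scene’s average luminance toward a target. Here’s a minimal, illustrative sketch – the function name and the `speed` constant are assumptions for this example, not UE4’s actual API:

```python
import math

def adapt_luminance(adapted, scene, dt, speed=3.0):
    """Exponentially move the eye's adapted luminance toward the scene's.

    (Illustrative only: the name and speed constant are not UE4's API.)
    """
    return adapted + (scene - adapted) * (1.0 - math.exp(-dt * speed))

# Walking from a bright sky (luminance 1.0) into a dark room (0.05):
adapted = 1.0
for _ in range(60):                  # one second of frames at 60 fps
    adapted = adapt_luminance(adapted, 0.05, dt=1.0 / 60.0)
# adapted is now most of the way toward the dark level
```

The exposure used for tone mapping then follows this adapted value, which is what produces the “pupil adjusting” feel as you move between bright and dark areas.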
8:49 – Another demo of global illumination shows a space-age castle towering over a barren world. As you get closer to the building and the bright sky gets blotted out, the scene gets lighter and more discernible.
8:45 – The face has soft shadows under its lips. Not so long ago, NVIDIA showed this same demo on GeForce Titan, a massively powerful GPU for desktops. Now, it’s being shown on a mobile chip. Jen-Hsun says this is a face that only a mother or computer programmer could love.
8:43 – It took eight years for UE3, running DX9, to go from running on a PC in 2002 to being able to run on a mobile device. Games today are still using the look of DX9.
We introduced DX11.1, running UE4 with Kepler, in 2012. A year later, the game console industry brought the same architecture to consoles. Now, a year after that, Tegra K1 brings DX11.1 to mobile – a two-year gap rather than an eight-year one. This is what has Tim Sweeney excited.
Unreal Engine 4 is all about photo-realism. All of the rendering is physically based – we’re capturing how physics bears down on microstructures. There’s something close to global illumination. The result is a photo-realistic, stunning image.
Jen-Hsun now shows what that means with a human face rendering as a demo. That’s hard to do because we all know what a face looks like. Monsters are a little more obscure. Although this face is being rendered, it looks alive – with pores, blood below the surface, eyes that appear to be a window into the soul.
8:35 – What’s the benefit?
Well, first it solves a problem for game developers. Computer graphics is cinematic; gaming stories are cinematic. How do you keep up with this in a cost-effective way? As a game developer, you need to reach more users across a larger footprint. But the architectures of mobile devices and game consoles are radically different, and so are their capabilities. For a game developer, this diversity of platforms creates a degree of complexity that is simply intolerable. We call this the developer’s dilemma.
Today, we’re excited to announce that Epic Games will bring Unreal Engine 4 to Tegra K1.
Tim Sweeney, founder of Epic Games, has the highest expectations of anyone in the game industry. He said Tegra K1 advances the industry by three or four years.
8:32 – This is the first GPU that took a vast jump from the previous generation. It would be inappropriate to call it Tegra 5 because the jump isn’t linear. It’s called K1 because it’s based on the Kepler architecture, our most successful GPU architecture. Because it’s so energy efficient and programmable, this architecture enables us to extend the GPU to the cloud with GRID and to supercomputing with Tesla. One architecture spans computing from a few watts to megawatts. We’re really, really excited that with Tegra we’ve bridged the gap: we’ve brought the heart of GeForce and the soul of Tesla to mobile computing.
8:30 – We believe the Android OS will be the most important platform for gaming. Why wouldn’t you want to be open, connected to your TV, with access to your photos, music and games? It’s just a matter of time before Android disrupts game consoles.
We also believe Android will disrupt the auto industry. We love cars, and we believe the car will become your most personal robot. It happens to have some of the best minds in the industry working on it.
The question is: after Tegra 2, after Tegra 3, after performance was doubled with Tegra 4, what could we do next? We could do eight cores, but that’s pretty obvious. Twelve would just be more of the same.
But we decided to make Tegra K1, the first 192-core processor.
8:27 – Now he switches to chips. Android, Jen-Hsun says, is the most disruptive force in gaming we’ve seen – it’s open source, so it’s accessible and easy. No wonder the computing revolution in China took off based on Android. Android disrupted the phone market, then worked its way to tablets. But it’s not going to end there. It’s free to all; it allows every company, country and industry to innovate freely. There are more engineers working on Android than on any other ecosystem today. It’s hard to imagine stopping it.
When we built Tegra 2, the first dual-core mobile processor, people asked why. When we built Tegra 3, the first quad-core mobile processor, people asked why. The question is: where do we go from here?
8:24 – We’re really excited by G-SYNC. It will be available in Q2 from Acer, AOC, ASUS, BenQ, Philips and ViewSonic.
8:23 – With V-SYNC off, you see not stuttering but tearing. As soon as the GPU is ready, it sends the image to the monitor – but the monitor may not be ready for it. The result is tearing.
G-SYNC avoids both stuttering and tearing: as soon as the GPU is ready, the frame is updated. Tearing is gone, but the lag stays very, very low. The buzz is high because we’ve been living with stuttering and lag for so long.
Jen-Hsun switches to a video of a recent game event where professional gamers are shown G-SYNC. They seem pretty jazzed by it.
8:20 – G-SYNC is next up. Jen-Hsun says we invented this to solve one of the oldest problems in gaming: stutter and lag. It comes from the fact that no game or application has a constant frame rate. Sometimes there’s more or less action, or more or fewer characters, and that changes the frame rate. You have a GPU generating frames at a variable rate – sometimes 100 frames a second, sometimes 8 frames a second – but the monitor is sampling at a constant rate of 60 hertz, or frames per second. What we’ve done, instead of having two separate systems, is synchronize them – hence, G-SYNC.
First he shows “StarCraft II,” a real-time strategy game and one of the most popular titles in e-sports. You pan back and forth, taking in your vast territory in the game. But there’s a stutter in what’s being shown.
You can solve this by buffering, but the more you do that, the more latency you add.
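The mismatch described above – a variable-rate GPU feeding a fixed 60 Hz monitor – can be illustrated with a small simulation. This is a sketch of the general problem, not NVIDIA’s implementation; the function and frame times are invented for the example:

```python
def fixed_refresh_stutters(frame_times_ms, refresh_ms=1000 / 60):
    """Count monitor refreshes that re-show the previous frame (visible stutter)."""
    done, t = [], 0.0
    for ft in frame_times_ms:       # completion time of each GPU frame
        t += ft
        done.append(t)
    stutters, shown, clock = 0, -1, 0.0
    while clock < done[-1]:
        clock += refresh_ms         # the fixed-rate monitor samples here
        latest = max((i for i, d in enumerate(done) if d <= clock), default=-1)
        if latest == shown:
            stutters += 1           # no new frame ready: the old one repeats
        shown = latest
    return stutters

# GPU alternating between fast (10 ms) and slow (25 ms) frames:
print(fixed_refresh_stutters([10, 25] * 30))  # > 0: frames repeat, seen as stutter
# GPU consistently faster than the refresh interval:
print(fixed_refresh_stutters([8] * 60))       # 0: every refresh shows a new frame
```

With a G-SYNC-style adaptive refresh, the monitor instead waits for each frame, so every refresh shows a new frame regardless of how the frame times vary.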
8:15 – The same game, Batman, is now being played right on SHIELD, rather than streamed to the 4K TV. It’s streaming at 720p. The fun part, though, is that the server it’s streaming from is in southern France. What’s been shown is a game being streamed 6,000 miles: each input goes over the ocean, gets processed on the GRID server, and the video gets streamed back over the ocean. The lag is about 30 milliseconds in each direction.
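As a back-of-envelope check on those numbers – assuming the per-direction lag figures simply add, and assuming one frame of server-side rendering at 60 fps (my assumption; only the 30 ms figure was stated on stage):

```python
# Figure quoted on stage: ~30 ms of lag in each direction.
one_way_lag_ms = 30
server_frame_ms = 1000 / 60          # one frame of GRID rendering at 60 fps (assumed)

# Input travels to France, a frame is rendered, video travels back:
input_to_photon_ms = one_way_lag_ms + server_frame_ms + one_way_lag_ms
print(round(input_to_photon_ms, 1))  # roughly 77 ms end to end
```

That’s only a few frames of delay from button press to pixels, which is why the transatlantic demo still feels playable.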
8:12 – First demo: a DX11 game, Batman, is shown being streamed from a PC to a 4K TV via SHIELD. There are some pretty stunning visuals – Batman’s cape behaves naturally (for a superhero’s cape), and shadows look realistic. The two screens, from the PC and the 4K TV, are in complete sync.
8:10 – GameStream is incredibly difficult to do. We’ve worked on it for some time and have nearly perfected it. We created a device called SHIELD that connects to your PC wirelessly, so it’s as if the game is running on SHIELD itself. You can stream 1080p, 60Hz games. And because people like to share their greatest gaming moments, you can capture your games on SHIELD and send them to Twitch: with GeForce you can capture your game, stream it, and share it remotely.
8:07 – Jen-Hsun apologizes for being under the weather and sounds hoarse.
Next-gen gaming has started. For NVIDIA, it starts with GeForce. It’s the most powerful GPU in the world, and it turns a PC into an amazing game console. We’ve added to it GeForce Experience, which makes it possible to enjoy a PC while keeping it as well-behaved as a game console – it adjusts your PC to each game. It’s been downloaded 20M times. This is the core of our PC gaming strategy. But it’s not the end of it: we’ve extended the PC with GameStream, which extends computer graphics beyond the PC. Unlike with video, in gaming there’s little time to compute and recompute each scene and still be convincing. Latency is vitally important.
8:04 – NVIDIA’s dedicated itself in four directions: the relentless pursuit of photo-realism; making the GPU more programmable; extending the reach of visual computing to mobile computing; and putting the GPU in the cloud. He’ll talk about these in the context of games, chips and cars.
8:02 – The room’s darkened. The music’s picked up. Jen-Hsun takes the stage in his trademark leather jacket and black pants. In a break from the past, he’s wearing a navy blue shirt.
8:00 – CEO Jen-Hsun Huang is usually sharply prompt. Sometimes starts early. We’re right at the edge of the appointed hour, so it shouldn’t be long now….
7:55 – This year, the press conference is at the Cosmopolitan, a yawning Vegas venue on the Strip with almost as many restaurants as poker chips. They’re more expensive, though. The room is decked in NVIDIA’s trademark black curtains, with a huge single screen displaying shards of green triangles sailing slowly left to right.
7:50 – We’re still about 10 minutes out from the start of our CES press conference. The room’s set for about 280 and it’s just about full – reporters, industry analysts, Wall Street analysts and a smattering of NVIDIAns in black shirts.