CES 2018 Live: NVIDIA CEO Jensen Huang Press Event

by Bob Sherbin

Our press event is over, but CES has just begun. Here’s a quick recap of tonight’s news:

And if you’re at the show, stop by our booth. We’ve got what we think may become the most talked-about demo at the show. If you have fond memories of the VW Microbus, or look forward to driving the VW I.D. Buzz, it’s a demo you won’t want to miss.

9:56 – And so, that’s it. Jensen leaves the stage, the music comes up, and reporters dash off to write their stories.

The CES show starts Tuesday morning and NVIDIA will be showing off on the show floor much of the technology that Jensen showed on stage.

A highlight will be the VW Microbus and I.D. Buzz that we showed off in the Holodeck, which promises to be one of the best demos on the show floor.

See you there.

9:51 – Jensen tells a story about how he came to Vegas 40 years ago and played a lot of table tennis. To do that, he and his teammates drove from Portland to Vegas to play at a table tennis tournament at Caesars Palace. “I lived in a VW bus with my teammates for a week,” Jensen said.

So we decided we’d bring this to life and let people enjoy it, Jensen tells the crowd. To do that, he uses NVIDIA Holodeck, our VR design studio.

Jensen shows the old bus, which bursts out from a block of NVIDIA green squares. It’s red, not particularly rusty, but boxy and deeply retro. Then it’s replaced in the virtual environment with the new VW I.D. Buzz, which bursts out of the same group of green blocks.

A group of avatars – virtual versions of NVIDIA employees – are in the Holodeck in the new I.D. Buzz, which is as much glass as it is metal.

They then turn on tunes in the Holodeck and start dancing. It’s a party. “This isn’t just about getting there,” laughs Jensen. Point made.

9:44 – I want to share one more thing, Jensen declares. “What we’d like to do is revolutionize how people drive, how goods are moved, but also how we interact with our cars,” Jensen says.

I can’t imagine a more important company to take this technology and change everything we know about cars.

We’re partnering with the largest car company in the world, VW, to infuse AI into its future lineup.

To help us talk about this, Jensen brings on stage the CEO of Volkswagen, Dr. Herbert Diess. Herbert and Jensen hug. Jensen’s in a black leather jacket, and Herbert is looking pretty hip too in a medium-blue blazer and dark blue pants.

Jensen’s recalling one of their first meetings, where he was going to show off image recognition by AI. It turns out NVIDIA’s deep neural network couldn’t recognize the kind of dog, a dachshund, that Dr. Diess had asked it to.

“That was the end of the meeting,” Jensen laughs.

NVIDIA’s made a lot of progress since then.

“I want to congratulate you on your achievements which are stunning,” Dr. Diess says.

“I think,” Dr. Diess says, “that cars will become even more important, even more exciting, even sexier than they are today.”

Dr. Diess also questions the idea that private cars will be replaced by robotaxis. He says you’ll likely want your own environment, your own stuff, your own car. “This is why I think individual cars will have a bright future,” Dr. Diess says.

9:31 – Next we’re going to show you something really cool, Jensen says. Justin, an NVIDIA engineer, is going to enter the Holodeck – our virtual reality design environment.

So Justin enters the Holodeck and we see a LaFerrari, a $1.4 million supercar that goes zero to 60 in 2.5 seconds. Within the Holodeck, the car appears to be cruising down the road, based purely on augmented reality. We can test this car in a virtual world.

9:28 – What you’re seeing here is something we can pre-record – capturing different scenarios so that whenever we have new software stacks, we can run them against those scenarios.

Functional safety is one of the most important things we’re working on. Every one of our software stacks will be ISO certified. We’re working to get the first neural network ISO certified.

We want to turn the whole car into an autonomous machine – turning the whole car into an AI. It has sensors all around it. So you need the AI to be in the car, because the cloud doesn’t have the contextual awareness.

In the future, every car will be self-driving. There will be 100M cars built each year, millions of robotaxis, and several hundred thousand trucks. All will be autonomous. On top of this, what will define the driving experience is the AI.

We’ve created an SDK, a platform, that allows AI to be exposed to developers. It will do surround perception, speech recognition and synthesis, natural language processing, eye tracking, tracking head position, and gesture recognition.

You could tell it to open the window and it will know which window.
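NVIDIA didn’t detail on stage how the SDK decides “which window,” but the basic trick – fusing a spoken command with the eye- and head-tracking signals the platform provides – can be sketched in a few lines. Everything below (function and label names) is hypothetical, purely to illustrate the idea, not NVIDIA’s actual API:

```python
# Hypothetical sketch (not NVIDIA's DRIVE API): resolving an ambiguous
# voice command like "open the window" by fusing it with gaze tracking.
WINDOWS = {"driver", "passenger", "rear-left", "rear-right"}

def resolve_window(command: str, gaze_target: str) -> str:
    """Return which window to open, given the spoken command and where
    the speaker is looking (as reported by an eye/head tracker)."""
    if "window" not in command.lower():
        raise ValueError("not a window command")
    # If the speaker is looking at a specific window, pick that one;
    # otherwise fall back to the speaker's own window.
    return gaze_target if gaze_target in WINDOWS else "driver"

print(resolve_window("Open the window", "rear-left"))  # rear-left
```

The point of the sketch: neither input alone is enough – speech gives the intent, gaze gives the referent – which is why the SDK bundles perception, speech, and tracking together.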

If I have all these capabilities, and I know where I am – I can track where I am and where I’m going – you can imagine taking the driving experience to the next level.

Today, we’re announcing a new platform called NVIDIA DRIVE AR – a software stack focused on augmented reality. It will overlay graphics that highlight what the car sees, so the computer graphics look almost as if they’re right there in the world in front of you.

So, we’re going to show it to you.

9:19 – Functional safety is plenty challenging, Jensen says. The ultimate feature of a self-driving car isn’t that it’s self-driving but that it’s safe. How do you address the possibility of a mistake, an error?

This entire system, to enable functional safety, is incredibly complex. No system has ever been this complex before. People in the airline business realize how different this is.

It requires a holistic approach. Our goal is to achieve the highest level of functional safety, the ISO 26262 ASIL safety levels. Across the entire SoC, we have the ability to achieve traceability, so errors can be traced back to their source.

To do this, we’re partnering with BlackBerry QNX and TTTech, Jensen says. They’ve integrated their software on top of ours.

We want to ensure the computer is designed properly, and to do so we need to test the car. But it’s just not possible to drive far enough to test everything, so you need to simulate it. The way to do this is with software models of our driving stack, run in a simulator.

He shows something called NVIDIA AutoSim. We’d run it to cover billions of miles.

He shows a demo of a virtual environment. Mark, who’s running this, moves the Sun so it creates glare and shadows.

Then he configures a car with four cameras, playing with the front camera – moving it around the vehicle, to the driver’s side, to the passenger’s side. Then he configures another sensor – the right-hand camera – rotating it through various orientations.

With that, Mark runs the simulator – both on the DGX supercomputer and on the NVIDIA simulator.

Mark sets up some dangerous simulations, where another driver veers improperly near our car.
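AutoSim’s actual interface isn’t public, but the steps Mark just walked through – placing the sun for glare, mounting and rotating virtual cameras, scripting a dangerous cut-in – amount to building a scenario description. Here’s a minimal sketch of what such a description might look like; every class and field name below is an illustrative assumption, not NVIDIA’s API:

```python
from dataclasses import dataclass, field

# Illustrative scenario description for a driving simulator like the
# AutoSim demo. All names are hypothetical.

@dataclass
class Sun:
    azimuth_deg: float    # compass direction of the sun
    elevation_deg: float  # a low elevation creates glare and long shadows

@dataclass
class Camera:
    name: str
    position: tuple       # (x, y, z) in meters, relative to the car
    yaw_deg: float = 0.0  # rotation around the vertical axis

@dataclass
class Scenario:
    sun: Sun
    cameras: list = field(default_factory=list)
    events: list = field(default_factory=list)

    def add_camera(self, cam: Camera) -> None:
        self.cameras.append(cam)

    def add_cut_in(self, t_sec: float, lateral_offset_m: float) -> None:
        # another driver veers improperly into our lane at time t_sec
        self.events.append(("cut_in", t_sec, lateral_offset_m))

# Build a rig like the one on stage: four cameras, a low sun, one cut-in.
scenario = Scenario(sun=Sun(azimuth_deg=270.0, elevation_deg=8.0))
for name, pos in [("front", (2.0, 0.0, 1.2)), ("rear", (-2.0, 0.0, 1.2)),
                  ("left", (0.0, -0.9, 1.2)), ("right", (0.0, 0.9, 1.2))]:
    scenario.add_camera(Camera(name, pos))
scenario.add_cut_in(t_sec=12.5, lateral_offset_m=0.8)
```

Declaring scenarios as data like this is what makes the “billions of miles” claim plausible: the same description can be replayed against every new software stack, exactly as the 9:28 entry describes.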

9:08 – These platforms aren’t just powerful, they’re open. We have a whole lot of people developing on the platform – enabling autonomous capabilities is a fantastic thing. We’re working with partners all over the world. Last year, we had about 200 working with us. Today, over 320. They include developers of cars, trucks, mobility services, suppliers, shipping companies, sensor companies, startups and research outfits.

Learn more about NVIDIA Xavier, the world’s most powerful SoC

9:06 – NVIDIA DRIVE Pegasus is up next, which is for robotaxis. It delivers 320 TOPS for AI inference, but uses just 400 watts of power.

Jensen now shows Pegasus, with two Xaviers and two powerful GPUs. With just one or two you can power a driverless car. Although it’s just one board, it replaces an entire supercomputer.

I’m excited to announce that we’re going to be partnering with Aurora, which is partnering with VW and Hyundai, and they’ve selected NVIDIA’s technology.

I’m also delighted to announce that Uber has chosen NVIDIA technology; we’re partnering to create self-driving Ubers.

This draws applause.

9:02 – We’re announcing today that Baidu (China’s internet giant) and Germany’s ZF have selected DRIVE Xavier for their autonomous vehicle work in China.

Every car that’s made needs to be China-compatible, because it’s the largest market in the world. Every car manufactured with DRIVE Xavier and the DRIVE stack will be able to operate in China.

9:00 – He shows DRIVE Xavier running the entire DriveWorks and DRIVE AV stack for self-driving. This is the power of architectural compatibility.

The processor, though, is just the first step. We’ve also been developing the entire self-driving software stack, using radar, lidar, path planning and more.

Jensen says that over the holidays, one of our engineers got into a car and drove eight miles without touching the steering wheel – eight miles replete with 23 intersections, eight hard turns and two stop signs.

He shows a video of the drive — highly sped up, without the driver touching the steering wheel. The amazing thing is that it’s powered just by Xavier – not a larger DRIVE PX, and not a trunkful of processors.

8:56 – How is it possible to build a computer so that, even if a fault is detected, it can still perform well?

Jensen then walks through some of the architectural intricacies of Xavier, which spell out how – through redundancy and diversity.

It has an eight-core CPU and a Volta GPU with 512 CUDA cores and 20 tensor cores. It processes 1.5 gigapixels (whereas a camera might have 30-40 megapixels).

Jensen shows a previous-generation DRIVE PX 2 – which has four chips and delivers 24 teraflops. When we delivered it, it was the fastest. It used 300 watts of power.

The next-gen Xavier delivers 30 teraflops of performance at 30 watts. This can power next-gen vehicles.
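Those two data points make the generational jump easy to quantify. A quick back-of-the-envelope check, using only the numbers quoted on stage:

```python
# Efficiency comparison from the figures Jensen cited:
# DRIVE PX 2: 24 teraflops at 300 W; Xavier: 30 teraflops at 30 W.
px2_tflops, px2_watts = 24, 300
xavier_tflops, xavier_watts = 30, 30

px2_eff = px2_tflops / px2_watts          # 0.08 TFLOPS per watt
xavier_eff = xavier_tflops / xavier_watts  # 1.0 TFLOPS per watt

improvement = xavier_eff / px2_eff
print(f"Xavier is {improvement:.1f}x more energy-efficient")
```

So Xavier isn’t just 25% faster than PX 2 in absolute terms; per watt, it’s roughly 12.5x more efficient, which is what matters in a production vehicle.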

He notes that while developers have been using DRIVE PX 2, everyone can develop on Xavier, as well.

8:50 – Now, Jensen pivots to driving.

AI, he said, will revolutionize driving. And, man, does it need to be revolutionized.

There are 82M accidents each year, with 1.3M fatalities, costing over $500B in damages. Moreover, we spend an hour a day commuting.

“AV (autonomous vehicles) is potentially the greatest revolution that will come to this industry in 100 years.”

He said AV will revolutionize mobility services. And that, too, needs to be revolutionized. Think about it: there are going to be another billion people on Earth, meaning traffic increases 3x. And given limited parking space, it will create all sorts of social problems.

What would a car be like if it’s not something you get into to get to your destination, but it is the destination?

AV will revolutionize trucking, which also needs to be revolutionized. There are 3.5 million truckers in the U.S., transporting 10.5 billion tons and $750B worth of goods, and drivers must work no more than 11 hours on, with 10 hours off. That means there’s a shortage of 50K truckers in the U.S. alone. Keeping up with the Amazon effect would be a huge benefit.

To solve the AV problem, you need to solve problems from the bottom to the top. We’ve built game consoles, supercomputers, PCs. But building a supercomputer for self-driving is a massive challenge. It can never fail. Incredible complexity.

With a challenge like that, because the contribution is so enormous, we have to tear it down one step at a time.

So, today, we’re announcing the world’s first autonomous machine processor, DRIVE Xavier. The silicon is back, and Xavier will be sampled to select customers sometime in the first quarter.

It’s the most complex system on a chip, or SoC, ever made – the largest SoC the world’s ever seen. It took 8,000 engineering years, has 9 billion transistors, and took $2 billion in R&D.

8:42 – Jensen now talks about the company’s mission being to create the enabling platforms to do deep learning and address unsolvable problems.

He talks about the company’s research work in…

  • using AI to predict ray tracing – to predict what color should be in a picture
  • using AI to create audio-driven facial animation
  • using AI-based generative adversarial networks, or GANs, to synthesize images of humans and generate virtual reality imagery
  • using AI to write music – to celebrate the latest Star Wars movie, we partnered with Disney to create GPUs with a Star Wars theme, and to mark it, we used an AI to write John Williams-inspired music

He even plays a clip of the music, composed by an AI and performed by an LA-based symphony orchestra, replete with cellos, violins and violas, woodwinds, brass and timpani.

8:37 – The reason we’re growing so fast is that the value proposition is incredible. It takes four racks with 160 CPU servers to run ResNet-50 at 45K images a second. But NVIDIA is 10x more efficient: a single NVIDIA HGX with eight Tesla V100s delivers the same throughput at one-sixth the cost and one-twentieth the power.

Jensen jokes that it’s so efficient that the more you buy, the more you save.

He now gives a demonstration of identifying flowers. He uses TensorFlow – a common deep learning framework – to recognize flowers. Running on the latest-generation CPUs, it can identify five images a second. The demo shows how ResNet-152 can infer which flower is which – it identifies roses, sword lilies and a bunch of others with unpronounceable names.

“In one second’s time, the network shows it’s a foxglove,” Jensen says. Just a few years ago, that would have been a miracle.

But if you run it on NVIDIA’s V100, it identifies not five images a second but 900+ a second.

This is 200x faster.

But now he shows that if you put eight Volta GPUs in a node, it can identify 7,000 images a second.

The work of one DGX-1 is equivalent, he says, to a stage full of CPU servers.
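The demo numbers hang together arithmetically. Here’s the math, using only the rates quoted on stage (note that 900 ÷ 5 works out to 180x exactly; the round “200x” figure is the stage version of that ratio):

```python
# Throughput math behind the flower-classification inference demo.
cpu_rate = 5       # images/sec, ResNet-152 inference on a CPU (demo)
v100_rate = 900    # images/sec on a single Tesla V100 (demo lower bound)
node_rate = 7000   # images/sec with eight Volta GPUs in one node

speedup = v100_rate // cpu_rate          # 180x, quoted as roughly 200x
per_gpu = node_rate / 8                  # throughput per GPU in the node

print(f"single-GPU speedup over CPU: ~{speedup}x")
print(f"per-GPU rate in the 8-GPU node: {per_gpu:.0f} images/sec")
```

The per-GPU rate in the eight-GPU node (875 images/sec) lands close to the single-V100 figure, which suggests the demo scales almost linearly across GPUs.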

8:30 – The reason gaming grows is that technology continues to improve, and more gamers keep coming into the marketplace. My generation was the first to game. But in my children’s children’s generation, every human will be a gamer. It’s art. It’s sports. It’s how we share.

Now, Jensen talks a bit about NVIDIA serving as the world’s AI platform. In the past year, the company announced:

  • The single most complex processor the world’s ever made – Volta, with 21B transistors, running at one volt and 250 amps. It’s the most compact supercomputer the world’s ever known, with 125 teraflops of performance – so one box with eight V100s gives you one petaflop, which would rank nicely on the Top500 list of the world’s fastest supercomputers.
  • NVIDIA GPU Cloud – a cloud-based registry of containerized software that makes it possible to go to the cloud, download a software stack and have access to multiple frameworks
  • Every cloud, every computer maker – Volta has been adopted by every cloud service provider in the world
  • DGX and DGX Station – contain eight and four V100 GPUs, respectively
  • Inferencing on TensorRT – an optimized compiler that enables exceptional inferencing at very low power
  • TITAN V – a supercomputer for developers that runs in PCs – demand for this is fantastic, even though it has a $3,000 price tag

8:19 – Jensen shows a racing game with incredible smoothness and clarity. The Max-Q laptop offers four times the performance of a MacBook Pro, twice the performance of the highest-performance game console – and no wires.

8:18 – Jensen begins joking about having to clamber up five stairs to get on stage. “Even though my shoes look young, the body is not,” he said, referring to his snazzy Fendi shoes. He takes a swig or two of water and seems ready to push on.

And he does, talking about the growing force of gaming, impelled by VR and esports, the largest sport in the world. VR could become the largest sport of all, because it could be any sport – a racing game, a football game, any game. 600 million people – more than watch NFL football – now watch esports. Video games aren’t just on your PC, they’re also mobile. The fastest-growing game console in the U.S. is the Nintendo Switch. By working with Nintendo, we could make it possible to create a game console that’s amazing and mobile at the same time.

At this show, we’ll announce 10 new gaming platforms, and a BFGD – which stands for Big Format Gaming Display – so you can enjoy PC gaming on a 65-inch G-SYNC screen. The name draws lots of chuckles (it’s a reference the community following the keynote on Discord instantly gets, of course).

Next he talks about Max-Q – a technology that lets NVIDIA put a powerful processor into a super-thin laptop.

Jensen shows a 60mm-thick, 10-pound gaming notebook that was cutting-edge last year. Now, there’s one that’s 18mm thick and weighs five pounds.

8:13 – So, with that, Jensen strides on stage to cheers, whoops, and applause. He’s definitely playing on his home turf, having given news-filled keynotes for years running.

NVIDIA’s dedicated itself, he says, to GPU computing. The company’s spent 25 years focused on this, enabling it to synthesize reality and virtual worlds, and to understand the physical world.

We have three drivers:

  1. Gaming – it’s a $100B industry that continues to grow
  2. AI – a $3T IT industry, recent breakthroughs have made it possible for software to write software in a way humans never could, enabling unsolvable problems to be solved
  3. Autonomous vehicles – $10T industry – we’re applying AI to one of the most difficult challenges of all.

“As we’ll see today, we’re making incredible progress” in autonomous driving, Jensen says.

8:07 – We’re getting close. The voice of God, as it’s known, fills the room and asks attendees – this MGM Hotel ballroom is filled well beyond the gills – to silence their phones and take their seats.

The video that comes up is a specially cut play on NVIDIA’s “I AM AI” theme — a nicely constructed palindrome (now there’s a word that should itself be a palindrome). It’s become something of a larger NVIDIA theme over the past year as artificial intelligence has moved to the center of the company. This version of the video is particularly tailored to the evening’s automotive theme.

Underneath a swelling soundtrack, a female narrator describes some of the AI capabilities within the auto industry. “I am a visionary… I am a learner… I am a protector,” she says. There are some great visuals that play off that, showing how AI can warn a driver of an approaching biker, or open the hatchback for someone walking with packages.

7:56 – Okay, the music’s shifting a bit. Up comes the English singer Thea, liquid-voiced, as ever.

We’ll be starting with the trademark NVIDIA opening video, something the company always puts a lot of shoulder into.

7:46 – Reporters, as usual, are in the first dozen or so rows, with others to be let in a bit later.

They’re all listening to some bouncy soul, and viewing a fairly oversized screen filled with undulating polygons in NVIDIA green and adjacent Pantone shades. Its size, 15×45 feet, isn’t all that surprising. But the quality of the resolution is impressive. It’s fully LED and shows edges sharp enough to cut with.

7:40 – We’re still 15 minutes or so out from Jensen taking the stage at NVIDIA’s CES keynote.

But the doors have opened and the crowd – which has been pretty well fed and watered, well more than watered, over the past hour – is making its way in.

We’d provisionally expected about 400-500 attendees but kept ratcheting that up. We’d blown well through 800 confirmations a few days ago. So, let’s see how this plays out.

This is the 51st annual CES show, which made stops in NYC and Chicago before making Vegas home. The first show, in a creaky hotel, featured 200 exhibitors focused on pocket radios and TVs with integrated circuits. There were 17,500 attendees.

This year, the show’s organizer is looking for 20X as many exhibitors and nearly 10x as many visitors.

And a main theme will be automotive. That’s what Jensen’s here to talk about. A bit of CES trivia: auto first showed up at CES in 2009 when NVIDIA announced its first collaboration with Audi.

NVIDIA gave the official opening keynote last year, introducing GeForce NOW game streaming, a deeper Audi collaboration and an AI copilot. This time Intel has the official opener, but that’s not till tomorrow. And we’ve got a pretty wild evening ahead of us.

+++

NVIDIA CEO Jensen Huang will get things rolling at CES 2018, in Las Vegas, Sunday night at 8pm Pacific.

Stay tuned. We’ll be live blogging from this page throughout the event.

Hit refresh on your browser for updates.