NVIDIA CEO Jen-Hsun Huang kicks off our annual GPU Technology Conference in San Jose this morning.

Stay tuned, we’ll be live blogging throughout the event, starting at 9 o’clock Pacific.

Hit refresh on your browser for updates.

11:35 - Jen-Hsun’s winding down and recapping everything he said in each of the five areas.

“Thanks for coming today. Have a great GTC.”

11:30 - Octane Render is the world’s first photo-realistic, real-time renderer in the cloud. You download a tiny app from Otoy and it lets you work from any computer. Directors can tell their stories more interactively, more quickly. As a result, we should get better movies. It’s now available as a service, so anyone with the chops can tell a story in 3D. This replaces supercomputers and workstations, putting that power on a desktop anywhere. The cost, with GRID VCA: $24,900 with eight GPUs, up to a maximum of $39,900 with 16 GPUs. Software licenses are $2,400 a year or $4,800 a year, respectively.

11:26 - So, Jules goes to a laptop and shows an original clip from ‘Transformers,’ which Otoy helped make.

He now shows pre-visualization of the scene on a wireframe, which is being streamed from Los Angeles.

We’re going to see what 112 GPUs can do for simulations: the scene renders in less than a second, rather than in a few hours as in the past. And it’s being done remotely.

11:23 - So the ability to work more cheaply lets you do more of it. We can create better movies, Jen-Hsun said.

Jules from Otoy rolls out Octane Render, which allows hundreds or thousands of GPUs to be in the cloud and used by renderers.

11:20 - Josh is talking about how you use CGI to build a movie. You take a pre-viz and bid it out to studios. You go from stick figures to an effects house that puts the heart into it. Jules is talking about how his CGI work can now be rendered on GPUs 40-100x faster than on a CPU. An artist now has unlimited rendering power.

The technology of making things go fast is critical.

11:15 - One more thing, he says.

Computer graphics is now standard in Hollywood, and used especially well in ‘Life of Pi.’ The tiger is on screen for 80 percent of the movie’s length, but it was almost never filmed. The whole thing was digitally made with CGI, by a group called Rhythm and Hues. They animated the whole tiger, which involved articulating the jaws, paws and face – it was muscle with a bag of skin over it. But in the movie, it all looks realistic, in the air and in the water. It was all rendered beautifully. There are 10M hairs on the tiger, and several hundred million hours of CPU time were dedicated to this. It was ground-breaking work, but arduous.

Showing off a digital tiger generated for ‘Life of Pi’ by Rhythm and Hues.

Jen-Hsun: I want to introduce someone who can talk about the challenge of creating a movie like this. He brings up Jules Urbach, founder and CEO of Otoy, and Josh Trank, director of ‘Fantastic Four.’

Jen-Hsun Huang, ‘Fantastic Four’ director Josh Trank, and Otoy CEO Jules Urbach.

11:10 - Jen-Hsun: These modern auto dealers of the future are really striving to do this because it lets them reduce real-estate costs, reduce inventory costs, and it’s easier for dealers to up-sell with a system like this.

11:08 - Jen-Hsun shows RTT’s point of sale configurator, being streamed from VCA to a tablet. The choices from the tablet go back to GRID, which takes the options and re-renders the image and delivers it back to the tablet.

Ludwig shows his tablet. They’re going to put together an R8 in 3D. They look at color options, silver, black, red and gray. They keep changing the car’s color. Now, they’re changing the wheels. They zoom in under the hood, under the spoiler. Now, they’re looking to select leather and they customize the look of the seats – different colors of leather, insets, stitching.

A GRID VCA could sit in one dealership and, through the Internet, stream its output to any number of dealerships.

11:02 - The exec who made this possible comes on stage: Ludwig Fuchs, CEO of RTT, a Munich company that provides design software for the auto industry.

Ludwig talks about how Audi was looking for a way to build new-age showrooms. It requires enormous hardware, people and resources. It relies on Audi already having RTT models in place, which they use for their configurators, so not everything had to be made from scratch. “VCA will help us a lot with this,” he said.

Now, you can imagine showrooms like this being around the world, though for now they’re still hard to build. “It will be a new way to interact with consumers,” Ludwig said.

Car shopping gets virtual with the GRID VCA. 

10:59 - Now, I’d like to show you another example of a GRID application – one that shows how it will change the car-purchasing experience.

We get a pretty dreamy video from Audi that shows upscale customers going to a new-age showroom, where their new car has been constructed virtually so they can see just how it works. Very cool: they see different color options glinting in the sun, rotating around an unseen axis.

This kind of experience simply doesn’t exist anywhere in the world right now.

Dawnrunner CEO James Fox talks about GRID VCA.

10:56 - Next up is James Fox, CEO of Dawnrunner and among the earliest users of GRID. He’s been using it to run Adobe and Autodesk applications.

“Earth shattering is what gets talked about in the office,” he said. We don’t have an IT department. With GRID, we launch right into work, whether people are remote or not. We can make a video, take it to a customer and then change it on the spot for the customer, if that’s what they want. “We don’t need to haul around a 75-pound workstation. We do it on a laptop.”

Jen-Hsun: “Sometimes, everyone is using each of the 16 virtual machines on GRID. But at other times, I understand, just a few of you share the full power of GRID from a single workstation.”

10:53 - SolidWorks’ Bassi is talking about how his software is used to design the world’s fastest motorcycle.

Jen-Hsun: “We’ve heard, working with SolidWorks, that [the] workforce changes in size, and the workforce is always changing the type of work it does.” But it’s hard to install and uninstall SolidWorks on various computers. This is a big challenge for small companies. But it gets addressed with GRID.

10:50 - “It’s as if you have your own personal PC under your desk,” Jen-Hsun said. But really it’s like having 16 separate systems. It’s a virtual environment, a virtual machine, that’s GPU accelerated, and compatible with the latest design and creative software.

Jen-Hsun and Gian Paolo Bassi, from SolidWorks. 

Special guest comes on stage, Gian Paolo Bassi, VP of R&D at SolidWorks. He’s young, wearing a gray suit and white open-collar shirt.

His company makes CAD software that helps companies design products in 11 industries – from medical devices to aerospace, automotive, architecture and design products. If you have a Sub-Zero refrigerator, it was designed in SolidWorks. 180K companies in 80 countries use it.

Another look at GRID VCA.

10:44 – With GRID VCA, your remote workspace just turns up. It doesn’t matter if it’s a Mac, PC, Android, ARM, x86 – the experience is that you have your own personal computer.

Ian, an NVIDIA engineer, comes on stage. He shows a GRID VCA connected wirelessly to his MacBook Pro. On Ian’s Mac there are three different screens, each with a different image being streamed. He’s doing image processing in real time, remotely, on just one of his three screens.

Introducing the GRID Visual Computing Appliance. 

10:40 - GRID VCA is our first integrated system. It’s 4U in height and fits into a server rack. Inside it are two of the highest-performance Xeon processors, backed up by eight GRID boards, each with two Kepler GPUs, integrated into a single appliance. It supports 16 virtual machines. Each device just needs to download a single app, the GRID client.

10:37 - We’re now in production with GRID enterprise servers.

But there’s some work that can’t be solved this way – SMBs that outsource work and don’t have an IT department. They buy their computers from the Apple Store, but they still have massive computational challenges. They want to work remotely, on one massive database, without sync’ing and copying all over the place. Plus, the data needs to be completely secure. What if you wanted to sell cars without having every car model available?

What if you have 600M cable TV subscribers? How do you maintain the software on set-top boxes that have been purchased over many years? What if you could put all that in the cloud? When I change the channel, I want it to happen right away. How do you do this?

What if you’re an SMB and you want to work in a modern way, without an IT department but you want to collaborate. You want to walk away from your desk and take all your stuff with you. How do you do that?

What these work areas need isn’t a large rack of servers. It’s a visual computing appliance. It’s NVIDIA’s first end-to-end system, and it’s called the NVIDIA GRID VCA.

10:32 - This was the antithesis of staying in sync. Today, the data is too big. It takes too long to copy data from one site to another. Data has become so large that it makes no sense to copy it to your PC.

You now want to copy your PC to the data – so now we have cloud computing. A decade ago, we invented the virtual GPU to enable the new enterprise, where the data is in the server and we do the computing in the server, but we can process it so quickly that you think the computer is under your desk and in your device. This is remote graphics.

We described the technology last year. This year, I’m pleased to announce that we have partners in our GRID cloud offerings like Microsoft, Citrix, VMware, Dell, Cisco, IBM and HP. There are 75 large-scale trials under way as we speak. You’ll be able to see this from Applied Materials. They build semiconductor manufacturing equipment. Moving the data to each one of their workstations and sync’ing them was a huge challenge. Today their engineers can sit anywhere. All the work is done in the server. It sends the pixel output to your laptop or tablet so fast you think you’re connected to the computer. You can go anywhere with complete security. Everyone’s connected all the time.

10:28 – Next, we’re on remote graphics.

The way we work has changed. It’s now BYOD – bring your own device to work. Back in the old days, we got company cars. In old days, we got a company computer. But why not bring your own computer? The network has now become heterogeneous, and IT departments are going berserk. Our networks were set up to do work the old way. But we’re not going to be working that way any more.

Introducing Kayla

10:26 – So, the guys went off, built a super-low-power GPU, combined it with ARM and built the most powerful ARM computer. This is Logan’s girlfriend, Kayla. It uses a Tegra 3 because of its PCI Express support. What’s amazing is that Logan will be the size of a dime, whereas Kayla is the size of a tablet PC. Let’s look at what Kayla can do. It’s running real-time ray tracing. (This draws more applause.) It’s showing the kind of demos we used to do on massive GPUs. The most advanced computing stack we know of – CUDA 5, Linux, PhysX processing – all running on this little ARM computer.

10:23 – Let me show you one more. Next gen beyond Logan has a peculiar name, Parker. Parker brings three ideas to the market:

First, Denver – the first 64-bit ARM processor. Second, it’s coupled with our next-gen GPU, Maxwell. Third, it’s the first to use FinFET transistors. In five years’ time, we’ll increase Tegra’s performance 100 times; Moore’s Law would suggest an eight-fold increase.

“I can’t wait any longer. I want to see ARM and CUDA now. I want to see ARM on CUDA now.”

10:21 - So, what’s next? Our next-gen Tegra mobile processor is called Logan. It has something we’ve been dying to bring to the world: it incorporates, for the first time, our most advanced GPU, making it the first mobile processor with CUDA. (This draws wide applause from the crowd.) It has a Kepler GPU, with full CUDA 5 and OpenGL 4.3. We’ll see Logan in production early next year.

Jen-Hsun talks about our Tegra roadmap.

10:10 - Tegra Roadmap

We felt the world was changing – that computers would disappear and be everywhere. We wanted to start investing in computers that are in cars, stores, glasses, watches, phones, tablets – anything that has a display. When we entered this world 20 years ago, 100M CRT displays were sold a year. Now, 2.5B high-def displays are sold, and that will double in a few years.

Our first Tegra didn’t turn out that well; we were just learning. Tegra 2 was the first dual-core. Tegra 3 was the first quad-core, with an extra low-power fifth core. Tegra 4 introduced two new ideas: a software-defined radio modem and computational photography, which uses sophisticated mathematics on the CPU and GPU to take sensor info from the camera and, with image processing, create some amazing effects, like HDR or tracking a moving object.

10:15 - Today, I’d like to show you the next click of our roadmap.

In 2008, we introduced Tesla, our first GPU that incorporated CUDA. Two years later, we introduced Fermi. In 2012 we introduced Kepler.

The next two GPUs from us: Maxwell is coming with unified virtual memory, which makes it possible for GPU operations to see the CPU memory and vice versa, so programmability is easier. After that is Volta, which is even more energy efficient and has a new technology called stacked DRAM.
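To give a flavor of why unified memory matters for programmability: today you allocate separate CPU and GPU buffers and shuttle data between them; with a unified address space, one pointer works on both sides. Here’s a minimal sketch using the managed-memory API that later shipped in CUDA 6 (cudaMallocManaged) – the kernel and sizes are illustrative, not NVIDIA’s roadmap code:

```cuda
// Minimal unified-memory sketch: one allocation visible to CPU and GPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *x, int n, float k) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= k;                      // GPU writes the shared buffer
}

int main() {
    const int n = 1 << 20;
    float *x;
    cudaMallocManaged(&x, n * sizeof(float));  // no separate host/device copies
    for (int i = 0; i < n; ++i) x[i] = 1.0f;   // CPU initializes directly
    scale<<<(n + 255) / 256, 256>>>(x, n, 2.0f);
    cudaDeviceSynchronize();                   // let the GPU finish before reading
    printf("x[0] = %.1f\n", x[0]);             // prints 2.0
    cudaFree(x);
    return 0;
}
```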

Jen-Hsun talks about Volta.

Volta is going to solve one of the biggest challenges facing GPUs today: access to memory bandwidth. With Volta, we’ll avoid having to go off-chip onto a PC board – the DRAM will sit on the same silicon substrate, with a whole stack of DRAM dies atop one another. We’ll cut holes through the silicon and connect each layer. We’re going to achieve one terabyte per second of bandwidth.

“This is unbelievable stuff,” Jen-Hsun says.

This is like taking a full Blu-ray disc and, in 1/50th of a second, moving all its data through our chip.
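A rough sanity check on that claim, assuming a single-layer 25 GB Blu-ray disc (the keynote didn’t specify the capacity):

$$ t = \frac{25\ \text{GB}}{1\ \text{TB/s}} = 0.025\ \text{s} \approx \frac{1}{40}\ \text{s}, $$

which is the same ballpark as the 1/50th-of-a-second figure.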

10:08 - Visual Search is based on a lot of imaging technology.

GPUs do image processing much faster – color inversion 5x faster, face detection 6x faster, depth maps 25x faster, point clouds 50x faster. We can re-encode so much faster now. Every minute, 72 hours of new YouTube video are uploaded – how can you search all that for, say, copyright infringement? GPUs can do this very well.
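To make the simplest of those concrete: color inversion is a one-thread-per-pixel job, which is exactly the shape of work GPUs eat up. A minimal CUDA sketch (illustrative, not the benchmark code behind the 5x figure):

```cuda
// Invert an 8-bit RGBA image on the GPU: one thread per byte.
#include <cuda_runtime.h>

__global__ void invert(unsigned char *img, int n_bytes) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n_bytes && (i % 4) != 3)   // leave the alpha channel alone
        img[i] = 255 - img[i];
}

// Launch over a width x height RGBA image already resident on the GPU.
void invert_image(unsigned char *d_img, int width, int height) {
    int n = width * height * 4;
    invert<<<(n + 255) / 256, 256>>>(d_img, n);
}
```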

10:05 - Mike tries searching for a wild pattern that’s black with square speckles of various colors. He searches eBay for it, as well, and up come a bunch of options.

10:04 - Jen-Hsun: Suppose you want to find a particular pattern – can you find it?

Mike selects an Asian floral pattern, searches for it and finds a collection of dresses and tops that have that very pattern. It’s going through 800K items and finding matches within a second or two.

10:02 – What about image search?

Imagine you type in F-150 and up comes the Ford F-150; type in Ferrari F-150 and you get a great car. But a lot of search doesn’t work that way – it works backward. Let’s say you see a beautiful dress or jacket in a magazine, and you want to find dresses or jackets just like it. As humans, we can identify a like thing. But this is a very difficult task to do mechanically.

There’s a company called Cortexica that’s allowing you to search not from smartcodes, the way it once was done, or wine labels and book and CD covers, the way it is now. They’re working on finding results based on captured images of brands, 3D objects and faces. They can recognize things that aren’t precise matches.

Mike Houston, a senior NVIDIA engineer, comes up on stage. He has a copy of In Style magazine in his hand. He flips to a picture of Kate Hudson wearing an Ann Taylor dress. Mike takes a picture of Kate and searches through 800K dresses on eBay, and in a few seconds up come dresses very similar to what she’s wearing.

Shazam CTO Jason Titus takes the stage.

9:56 – Jason talks about putting together fingerprints of all 27M songs, so Shazam can provide millisecond accuracy. We want to be able to recognize songs instantly, without needing a longer sample, he said. “We had to build a system to scale to a billion users.” Jason says GPUs have driven down cost and dramatically increased speed. We’re trying to include more music: more television music, more ethnic folk music.
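Shazam hasn’t published its matching code, but the shape of the problem – scoring one query fingerprint against millions of stored ones in parallel – maps naturally onto a GPU. A toy sketch, assuming fingerprints reduced to 64-bit hashes compared by Hamming distance (the representation and names are illustrative assumptions, not Shazam’s system):

```cuda
// Toy fingerprint scorer: each thread compares the query against one song.
#include <cstdint>
#include <cuda_runtime.h>

__global__ void fingerprint_distances(const uint64_t *db, int n_songs,
                                      uint64_t query, int *dist) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n_songs)
        dist[i] = __popcll(db[i] ^ query);  // Hamming distance: differing bits
}

// The host (or a follow-up reduction kernel) then scans dist[] for the
// minimum; the song whose stored hash is closest to the query wins.
```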

9:53 – Another is Shazam. It’s for people who no longer listen to the radio but still love music. If you don’t remember the name of the band or know the song, you can use the app to identify who’s playing what. Shazam gets 300M user queries a month, growing by 2M a week.

“This isn’t a business model, it’s an epidemic,” Huang says. So, how do you process 10M queries a day, trying to figure out which of 27M songs you’re listening to?

To talk about this, Shazam’s CTO, Jason Titus, walks onto the stage. He has shoulder-length, graying hair, jeans and a dark-brown corduroy jacket with elbow patches. He’s talking about how to find patterns amid the noise.

9:50 – Jen-Hsun talks about three companies focused on processing big data in the consumer space, where real-time results are necessary.

On Twitter, there are 500M tweets a day. Smart companies like Salesforce.com are scanning social media data in real time so companies, like Gatorade, will know how their brand is being perceived. Salesforce has a million expressions they’re looking for in 500M tweets sweeping by, monitored in real time. There’s no way you can save all this information; you just have to deal with it as it arrives. The search has to be done as quickly as possible or the info gets stale. They had the courage to try the GPU, which gave them a 35x speedup in processing, allowing them to scale out their service.
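Salesforce hasn’t detailed its pipeline, but the core operation – testing many short expressions against a firehose of tweets – parallelizes naturally: one thread per (tweet, keyword) pair. A toy sketch with fixed-width buffers and naive substring search (all of it an illustrative assumption, not their implementation):

```cuda
// Toy tweet scanner: thread grid spans (tweet index, keyword index).
// Launch: scan<<<dim3((n_tweets + 255) / 256, n_kws), 256>>>(...)
#include <cuda_runtime.h>

#define TWEET_LEN 144   // zero-padded tweet buffer (assumed layout)
#define KW_LEN     32   // zero-terminated keyword buffer (assumed layout)

__global__ void scan(const char *tweets, int n_tweets,
                     const char *kws, int n_kws, int *hits) {
    int t = blockIdx.x * blockDim.x + threadIdx.x;   // which tweet
    int k = blockIdx.y;                              // which keyword
    if (t >= n_tweets || k >= n_kws) return;
    const char *tw = tweets + t * TWEET_LEN;
    const char *kw = kws + k * KW_LEN;
    for (int s = 0; s < TWEET_LEN && tw[s]; ++s) {   // try each start offset
        int j = 0;
        while (kw[j] && s + j < TWEET_LEN && tw[s + j] == kw[j]) ++j;
        if (!kw[j]) {                                // whole keyword matched
            atomicAdd(&hits[k], 1);                  // count one hit per tweet
            break;
        }
    }
}
```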

Jen-Hsun talks about how GPUs can accelerate services that rely on big data.

9:45 – There will be more than 400 sessions at GTC – this is Mecca for scientific discovery. We have representatives from companies in manufacturing and image processing. One is even using CUDA for a dating site, to match compatibility. There’s going to be a paper on GPU-accelerated diamond cutting.

9:42 – Nothing’s more important than the research being done on GPU computers. It’s in areas like high-energy physics, 3D genomics, gigapixel camera arrays, materials simulations, Alzheimer’s research and centrifuge analysis. At Duke and the University of Arizona, they’re experimenting with a 50 gigapixel camera.

Researchers at Duke and the University of Arizona are experimenting with a 50 gigapixel camera. 

9:40 - Oak Ridge has 40 million CUDA processors coming together simultaneously to deliver 10 petaflops of power.

New news: the Swiss Supercomputing Center is going to start building Europe’s fastest GPU supercomputer, called Piz Daint, for weather forecasting purposes.

Jen-Hsun talks about the Swiss Supercomputing Center’s use of GPUs.

9:39 - If we’re not at the tipping point for GPU computing, we’re racing toward it. There’s been a huge spike in GPU-based computers being built for real work – about 20 percent of the total Top500 horsepower is GPU. Included in this is the world’s most powerful supercomputer, Oak Ridge National Laboratory’s Titan.

9:37 - We’re now on GPU Computing.

The challenge is how to take a brand-new computing model, a new architecture, to the world. On one hand, if there are no applications, why would you buy a computer with parallel processing? And if there are no users, why would you create applications for parallel processing? This is a chicken-or-egg problem.

The GPU had a day job before: computer graphics. It didn’t need new applications to justify itself. With this inspiration, we decided to take CUDA to market – and look what happened. GTC happened. In 2008, there were 150K CUDA downloads, 60 universities teaching CUDA and 4,000 academic papers. This year: nearly 500 million CUDA processors shipped, 1.6M downloads of CUDA, 640 universities teaching it and 37K academic papers.

9:32 - The fidelity of Ira, the rendered face, is pretty staggering.

“Let’s have Ira say a few things. What did you have for breakfast?” Ira complains that he had yogurt for breakfast, but it was largely frozen fruit. He’s asked about Project SHIELD and screams, “Take my money!”

9:29 – The new animation is being done in a partnership between NVIDIA and the University of Southern California. It involves a stage with 150-some lights, loaded with cameras. When you walk in, it takes a bunch of pictures that extract the 3D geometry, as well as subtler geometries. The miracle isn’t the 3D; it’s that all the pics get turned into video. 30 expressions get captured. A new technology called FaceWorks takes 32 gigs of information and condenses it down to 400 megs. What’s left are 3D meshes, which we articulate using our GPUs; they’re synthesized and tessellated in real time. This is a new way to render faces.
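For scale, that condensation is roughly an 80-fold reduction:

$$ \frac{32\ \text{GB}}{400\ \text{MB}} = \frac{32{,}000\ \text{MB}}{400\ \text{MB}} = 80\times. $$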

A new face comes on – Digital Ira, a bald guy being rendered live. It’s not a recording, and it looks hyper-realistic. Something like 2 teraflops of processing is behind it. Great shadows, realistic eyes.

The Uncanny Valley.

9:26 – Now on screen is Dawn, a fairy who’s been rendered. Subtle skin tones and fairly realistic hair. There’s a lifelike look. She’s a breakthrough in many respects – soft shadows, you can see her pores. Dawn now performs but as she moves she’s a bit jerky. We’re dwelling in the uncanny valley now. While she was beautiful when still, now that she’s moving it’s clearly an animation. “A little bit creepy,” he notes.

9:23 - Simulating the ocean is hard. Simulating a face is harder. Our facial expressions are used to communicate in subtle ways. At some point, facial rendering gets so realistic it gets creepy, according to a theory called “The Uncanny Valley.” Jen-Hsun said that ‘Tintin’ gets to the verge of creepiness. In ‘Beowulf,’ some of the characters went deep into creepy. We’re now trying to get to the other side of creepy.

Jen-Hsun talks about the challenge of simulating ocean waves.

9:20 – First he shows how ocean simulations have looked to this point. The waves are consistent, there’s no spray, no foam. It doesn’t respond to wind.

Now, what we’re about to do is in real time. As the wind picks up, the waves respond accordingly. The seas are getting rougher, there’s foam, spray, smoke from the boat. The simulation is reflecting the size of the ship, its speed, the force of the water, the direction and speed of the wind. It’s now cranking up to gale-force winds. The sky is darkening.

Titan, our latest GPU.

9:17 - He’s talking about NVIDIA’s newest GeForce GPU, Titan, which is based on the same technology as that in the world’s fastest supercomputer. What can it do? He’s showing Real-Time Beaufort-Scale Ocean Simulation, which simulates oceans in a way that’s visually realistic and physically believable. This is where science, art and engineering all meet.

9:14 - Five things today: 1) breakthroughs in computer graphics, 2) an update on GPU computing, showcasing some of the audience’s work, 3) a glimpse into the next click of NVIDIA’s tech road map, 4) an update on remote graphics, and 5) a new product announcement.

9:13 – GTC was created, he’s saying, to pull ideas, industry and the field together, so we can connect with each other’s ideas. This will be the best GTC ever – inventions, new technologies, scientific breakthroughs and a lot of great people.

Jen-Hsun takes the stage.

9:12 – Film’s done. Jen-Hsun comes up on stage. Trademark black jacket, black shirt and… off-white pants? He’s looking pretty buoyant.

9:10 - Some neat medical-imaging shots. Audi race vehicles hammering down the track. The new Titan supercomputer, the world’s fastest, comes on screen. Now some Hollywood shots from films with particularly deft special effects.

9:09 – Very cool intro video playing: ‘The GPU and You.’ It’s showing how the GPU is used with rockets, undersea mine searching, satellite tracking and the search for distant galaxies.

9:07 - The Voice of God is coming up. Should be just a few minutes at this point.

9:06 – Beatles still playing. The room, where part of the San Jose Auto Show was held a few months ago, is jammed to the point that people are standing in the back. A promising sign.

9:05 – “What song’s playing?” someone just asked. A guy next to me said, use Shazam. He’s checking the app. It’s ‘Come Together’ by the Beatles. Can’t believe the question had to be asked in the first place.

9:03 - The slogan of the event is flashing up: The Smartest People. The Best Ideas. The Biggest Opportunities. Music’s winding down, slipping into a quieter, funkier gear.

8:59 - Looks like there are about 180 press and analysts in the first rows of tables. Amazing how many have video cams, DSLRs and tablets. Not like an event just a few years ago.

8:58 - The screen has close-up shots of chips in fuchsia, lime green and deep purple. Looks like a future urbanscape.

8:55 – GTC 2013 is about to kick off in five or 10 minutes with Jen-Hsun Huang’s keynote. We’re expecting about 3,000 folks at the show here in San Jose. Feels like most of them are in the convention hall right now.




  • http://www.facebook.com/bigsampson408 Sam Mallory

    “9:05 – “What song’s playing?” someone just asked. A guy next to me said, use Shazam. He’s checking the app. It’s ‘Come Together’ by the Beatles. Can’t believe the question had to be asked in the first place.”   Thanks for the laugh. You summed it up with your response on that one, haha.

  • epaper flip

    Thanks for the update, is there any way I can receive an email sent to me when you write a fresh article?

  • Brian_Caulfield

    Absolutely. Just hit the ‘subscribe via email’ link at the top of this page’s right rail. Thanks! 

  • Brian_Caulfield

    I’m just glad the GPU-powered service could help those without the age and experience to recognize one of the signature songs of the 1960s. ;-) .
