NVIDIA CEO on How Deep Learning Makes Turing’s Graphics Scream

by Brian Caulfield

The deep learning revolution sweeping the globe started with processors — GPUs — originally made for gaming. With our Turing architecture, deep learning is coming back to gaming, and bringing stunning performance with it.

Turing combines next-generation programmable shaders; support for real-time ray tracing — the holy grail of computer graphics; and Tensor Cores, a new kind of processing core that accelerates deep learning tasks of all kinds, NVIDIA CEO Jensen Huang told a crowd of more than 3,000 at the GPU Technology Conference in Europe this week.

This deep learning power allows Turing to leap forward in ways no other processor can, Huang explained.

“If we can create a neural network architecture and an AI that can infer and can imagine certain types of pixels, we can run that on our 114 teraflops of Tensor Cores, and as a result increase performance while generating beautiful images,” Huang said.

“Well, we’ve done so with Turing with computer graphics,” Huang added.

Deep Learning Super Sampling, or DLSS, allows Turing to generate some pixels with shaders and imagine others with AI.

“As a result, with the combination of our 114 teraflops of Tensor Core performance and 15 teraflops of programmable shader performance, we’re able to generate incredible results,” Huang said.
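The division of labor Huang describes, shading some pixels with the programmable shaders and letting a network infer the rest, can be sketched in a few lines. This is a toy illustration, not DLSS itself: a simple bilinear upscale stands in for the trained neural network, the test pattern stands in for a rendered frame, and all function names are hypothetical.

```python
import numpy as np

def shade_low_res(h, w):
    """Stand-in renderer: shade an h x w grayscale frame conventionally."""
    y, x = np.mgrid[0:h, 0:w]
    return np.sin(x / 7.0) * np.cos(y / 5.0)  # arbitrary test pattern

def upscale_2x(img):
    """Placeholder for the AI step: a 2x bilinear upscale.
    In DLSS proper, a trained network running on Tensor Cores
    reconstructs the missing pixels instead."""
    h, w = img.shape
    out = np.zeros((2 * h, 2 * w))
    out[::2, ::2] = img                                    # keep shaded pixels
    out[1::2, ::2] = (img + np.roll(img, -1, axis=0)) / 2  # infer rows
    out[::2, 1::2] = (img + np.roll(img, -1, axis=1)) / 2  # infer columns
    out[1::2, 1::2] = (out[1::2, ::2] +
                       np.roll(out[1::2, ::2], -1, axis=1)) / 2
    return out

low = shade_low_res(540, 960)   # shade only a quarter of the pixels...
final = upscale_2x(low)         # ...and reconstruct a full 1080p frame
print(low.size, final.size)
```

The payoff is the ratio: the expensive shader pipeline touches only a quarter of the final pixels, while the cheap reconstruction step fills in the rest, which is where the performance headroom comes from.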

That translates into an enormous leap in performance.

Chart data: Infiltrator benchmark at 4K resolution with custom settings (max settings plus all GameWorks features enabled). Test system: Core i9-7900X 3.3GHz CPU, 16GB Corsair DDR4 memory, Windows 10 (v1803) 64-bit, NVIDIA driver 416.25.

“In each and every series, the Turing GPU is twice the performance,” Huang said. “This is a brand new way of doing computer graphics — it merges together traditional computer graphics and deep learning into a cohesive pipeline.”

With a stunning demo, Huang showcased how our latest NVIDIA RTX GPUs — which enable real-time ray tracing for the first time — allowed our team to digitally rebuild the scene around one of the moon landing’s iconic photographs: astronaut Buzz Aldrin clambering down the lunar module’s ladder.

The demonstration puts to rest the assertion that the photo can’t be real because Aldrin is lit too well as he climbs down to the moon’s surface while in the shadow of the lunar lander. Instead, the simulation shows how the reflectivity of the lunar surface accounts exactly for what’s seen in the photo.

“This is the benefit of NVIDIA RTX. Using this type of rendering technology, we can simulate light physics and things are going to look the way things should look,” Huang said.
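A back-of-the-envelope version of the light physics the demo simulates helps show why the photo holds up. This is a hedged sketch, not NVIDIA’s demo code: the solar irradiance and albedo figures are assumed typical values, and the geometry is idealized to a vertical figure above an infinite diffuse ground plane.

```python
import math

# Sunlight blocked by the lander never reaches Aldrin directly, but the
# sunlit regolith around him acts as a huge diffuse (Lambertian) reflector.

SOLAR_IRRADIANCE = 1361.0   # W/m^2 at the Moon's distance (assumed value)
LUNAR_ALBEDO = 0.12         # typical lunar regolith albedo (assumed value)

# Radiance leaving a sunlit Lambertian surface patch (W / m^2 / sr):
surface_radiance = LUNAR_ALBEDO * SOLAR_IRRADIANCE / math.pi

# A vertical surface above an infinite diffuse ground plane sees that
# ground over half its hemisphere (view factor 1/2), so the indirect
# irradiance on the astronaut is pi * L * 1/2:
indirect = surface_radiance * math.pi / 2

print(round(indirect, 1), "W/m^2 of indirect light")  # a few percent of direct sun
```

Even this crude estimate yields tens of watts per square meter of bounced light on the shadowed astronaut, plenty for a well-exposed photograph, and a full ray-traced simulation resolves the same effect per pixel.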

To see the benefits for yourself, grab a GeForce RTX 2080 Ti or 2080 now, or a GeForce RTX 2070 on October 17.