Get ready to dig in this week.
SIGGRAPH is here and we’re helping graphics professionals, researchers, developers and students of all kinds take advantage of the latest advances in graphics, including new possibilities in real-time ray tracing, AI, and augmented reality.
SIGGRAPH is the most important computer graphics conference in the world, and our research team and collaborators from top universities and many industries are here with us.
At the top of the list: ray tracing, using NVIDIA’s RTX platform, which fuses ray tracing, deep learning and rasterization. We’re directly involved in 34 of 50 ray tracing-related technical sessions this week — far more than any other company. And our talks are drawing luminaries from around the industry, with four technical Academy Award winners participating in NVIDIA sponsored sessions.
Beyond the technical sessions, we’ll be showcasing new developer tools and giving attendees a first-hand look at some of our most exciting work. One great example is NVIDIA GauGAN, an interactive paint program that uses generative adversarial networks (GANs) to create works of art from simple brush strokes. Now everybody can be an artist.
Never been to the moon? A stunning new demo virtually transports visitors to the Apollo 11 landing site using never-before-shown AI pose estimation that captures their body movements in real time. All this is made possible by combining NVIDIA Omniverse technology, AI and RTX ray tracing.
The story behind all these stories: our 200-person-strong NVIDIA Research team, spread across 11 locations worldwide. The group embodies NVIDIA’s commitment to bringing innovative new ideas to customers in everything from machine learning, computer vision and self-driving cars to robotics, graphics, computer architecture and programming systems.
A Host of Papers, Talks, Tutorials
We’ll be leading or participating in six SIGGRAPH courses that detail various facets of the next-generation graphics technologies we’ve played a leading role in bringing to market.
These courses cover everything from an introduction to real-time ray tracing and the use of the NVIDIA OptiX API to Monte Carlo and quasi-Monte Carlo sampling techniques, the latest in path tracing, open problems in real-time rendering, and the future of ray tracing as a whole.
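To give a flavor of the sampling techniques those courses cover, here’s a minimal toy sketch (our illustration, not course material) contrasting plain Monte Carlo integration with a quasi-Monte Carlo estimate built on the base-2 Van der Corput low-discrepancy sequence:

```python
import random

def van_der_corput(n, base=2):
    """First n points of the base-`base` Van der Corput low-discrepancy
    sequence, a classic building block of quasi-Monte Carlo sampling."""
    points = []
    for i in range(1, n + 1):
        x, f, k = 0.0, 1.0 / base, i
        while k > 0:
            k, digit = divmod(k, base)  # peel off digits of i in `base`
            x += digit * f              # mirror them across the radix point
            f /= base
        points.append(x)
    return points

def estimate_integral(samples, f):
    """Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(x) for x in samples) / len(samples)

# Integrate f(x) = x^2 over [0, 1]; the exact answer is 1/3.
f = lambda x: x * x
n = 4096
random.seed(0)
mc = estimate_integral([random.random() for _ in range(n)], f)
qmc = estimate_integral(van_der_corput(n), f)
print(f"Monte Carlo:       {mc:.5f}")
print(f"Quasi-Monte Carlo: {qmc:.5f}  (exact: {1 / 3:.5f})")
```

Because low-discrepancy points cover the domain more evenly than random ones, the quasi-Monte Carlo estimate typically lands closer to the true value at the same sample count — one reason these samplers matter so much for rendering.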
The common denominator: RTX. The real-time ray-tracing capabilities RTX unleashes offer far more realistic lighting effects than traditional real-time rendering techniques.
We’re also sponsoring seven courses on topics ranging from deep learning for content creation and real-time rendering to GPU ray tracing for film and design.
And we’re presenting technical papers that detail how our latest near-eye AR display demo works and that take the next leap in denoising Monte Carlo rendering with convolutional neural networks, using modern AI techniques to greatly reduce the time required to generate realistic images.
Rendering More than 7,000 Moving Lights in Real Time
We also revealed late-breaking research results in the Open Problems in Real-Time Rendering course Tuesday afternoon. Our researchers showed the first real-time rendering of scenes from a short film, Zero-Day. Amazing what is possible when you can render more than 7,000 moving lights in real time.
“I was blown away by the demo I saw and I’m excited to see where this goes,” said Mike Winkelmann, aka Beeple, the digital artist who created Zero-Day.
The Eyes Have It: Prescription-Embedded AR Display Wins Best in Show Award
You’ll be able to get hands-on with our latest technology in SIGGRAPH’s Emerging Technologies area. That’s where we have a pair of wearable augmented reality displays, technology you need to see, especially if you don’t see very well without regular eyeglasses.
The first, “Prescription AR,” is a prescription-embedded AR display that won a SIGGRAPH Best in Show Emerging Technology award Monday.
The display is many times thinner and lighter than current-generation AR devices and has a wider field of view. Virtual objects appear throughout the natural field of view instead of clustered in the center, and if you wear corrective optics, your prescription is built right into it. This brings us much closer to the goal of comfortable, practical and socially acceptable AR displays than anything currently available.
The second research demonstration, “Foveated AR,” is a headset that adapts to your gaze in real time using deep learning. It adjusts the resolution of the images it displays and their focal depth to match wherever you’re looking, delivering both sharper images and a wider field of view than any previous AR display.
To do this, it combines two displays per eye: a small, high-resolution display that presents images to the portion of the retina where visual acuity is highest, and a low-resolution display for peripheral vision. The result is a high-quality visual experience with reduced power and computation.
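The intuition behind that two-display split can be sketched in a few lines. This is a hypothetical acuity model with made-up parameter values, not the headset’s actual algorithm: shading effort stays at full rate near the gaze point and falls off with angular eccentricity, mirroring how the fovea sees sharply while the periphery does not.

```python
FOVEA_DEG = 5.0    # assumed foveal radius around the gaze point, in degrees
FALLOFF = 0.1      # assumed acuity falloff per degree of eccentricity
MIN_RATE = 0.1     # floor so the periphery is still rendered, coarsely

def shading_rate(pixel_angle_deg, gaze_angle_deg):
    """Fraction of full shading resolution to spend at a given angular
    position, for a given gaze direction (toy 1D model)."""
    eccentricity = abs(pixel_angle_deg - gaze_angle_deg)
    if eccentricity <= FOVEA_DEG:
        return 1.0  # inside the fovea: full resolution
    # Outside the fovea, acuity (and so shading rate) drops off linearly.
    return max(MIN_RATE, 1.0 - FALLOFF * (eccentricity - FOVEA_DEG))

def display_for(pixel_angle_deg, gaze_angle_deg):
    """Route each region to the high-res foveal display or the
    low-res peripheral display (hypothetical routing rule)."""
    ecc = abs(pixel_angle_deg - gaze_angle_deg)
    return "foveal" if ecc <= FOVEA_DEG else "peripheral"

# Looking straight ahead: the center gets full resolution,
# while a point 30 degrees out is shaded far more coarsely.
print(shading_rate(0.0, 0.0), display_for(0.0, 0.0))
print(shading_rate(30.0, 0.0), display_for(30.0, 0.0))
```

Since the coarse region covers most of the field of view, a scheme like this spends the bulk of its pixels and shading work only where the eye can actually resolve detail, which is where the power and computation savings come from.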
TITAN RTX Giveaway
Finally, NVIDIA is thanking SIGGRAPH’s student volunteer community with a daily giveaway of a TITAN RTX GPU while the exhibit hall is open. These students are the future of one of the world’s most vibrant professional communities, a community we’re privileged to be part of.