Pascal and VRWorks Infuse VR with New Level of Presence

With the launch of our first NVIDIA Pascal gaming GPU, we’re dramatically expanding VRWorks to bring new levels of realism to virtual reality through sight, sound and touch.

True presence in VR must be convincing to all your senses. An incorrectly lit object or a missing echo can quickly undermine the 3D reality being created.

So with Pascal, we’ve expanded VRWorks to include not only performance improvements for what you see, but also technologies that enhance presence through what you hear and touch.

VR and the Sense of Sight

Most of our early VRWorks technologies focused on tackling the performance challenges of rendering to dual 1080×1200 displays at 90 frames per second. Any performance dip, even the slightest stutter, could cause discomfort and ruin the VR experience. Today, we’re offering the VR development community new tools to increase graphics performance and deliver even more immersive experiences.

We’ve used the new Simultaneous Multi-Projection architecture of NVIDIA Pascal-based GPUs to create two major new techniques for tackling the unique performance challenges VR creates: Lens Matched Shading and Single Pass Stereo.

Lens Matched Shading improves pixel shading performance by rendering in a projection that closely matches the lens-distorted image the headset actually displays. This avoids shading many pixels that would otherwise be discarded before the image is output to the VR headset.
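
For the curious, here’s a rough sketch of the idea behind lens-matched rendering: the clip-space w component is scaled per viewport quadrant so that pixel density falls off toward the periphery, where the lens would compress or discard detail anyway. The coefficient values, struct names and printout below are illustrative assumptions, not NVIDIA’s actual settings or the VRWorks API.

```cpp
// Illustrative sketch (not the VRWorks API): lens-matched rendering modifies
// the clip-space w component per viewport quadrant, w' = w + A*x + B*y, so
// that fewer pixels are shaded toward the edges of the eye image.
#include <cstdio>

struct Vec4 { float x, y, z, w; };

// Apply a per-quadrant w-scale to a clip-space vertex position.
// The signs of A and B flip per quadrant so the warp is symmetric
// about the center of the eye image.
Vec4 lensMatchedWarp(Vec4 clip, float A, float B) {
    float a = (clip.x >= 0.0f) ? A : -A;   // horizontal half
    float b = (clip.y >= 0.0f) ? B : -B;   // vertical half
    Vec4 out = clip;
    out.w = clip.w + a * clip.x + b * clip.y;
    return out;
}

int main() {
    // A point near the edge of the view: after the warp its w grows, so its
    // normalized device coordinate (x/w) moves toward the center, meaning
    // fewer pixels are shaded at the periphery.
    Vec4 edge{0.9f, 0.0f, 0.5f, 1.0f};
    Vec4 warped = lensMatchedWarp(edge, 0.5f, 0.5f);   // example coefficients
    std::printf("NDC x before: %.3f  after: %.3f\n",
                edge.x / edge.w, warped.x / warped.w);
    return 0;
}
```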

Single Pass Stereo turbocharges geometry performance by allowing the left- and right-eye views to share a single geometry pass. This effectively halves the geometry workload of traditional VR rendering, which requires the GPU to draw the scene twice, once for the left eye and once for the right eye.
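
Here’s a conceptual sketch of why a shared geometry pass halves the work. It is not the VRWorks or driver API; heavyVertexWork and projectForEye are made-up stand-ins showing that the expensive per-vertex work runs once while only the cheap per-eye projection is duplicated.

```cpp
// Conceptual sketch only, not the VRWorks or driver API: with a shared
// geometry pass, the heavy per-vertex work (animation, skinning, attribute
// setup) runs once per vertex instead of once per eye; only the cheap final
// per-eye projection differs between the left and right views.
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Stand-in for the heavy per-vertex shading work; counts its invocations.
static int gVertexWorkInvocations = 0;
Vec3 heavyVertexWork(const Vec3& v) {
    ++gVertexWorkInvocations;
    return v;                                // real work (skinning etc.) elided
}

// Stand-in for the cheap per-eye projection.
Vec3 projectForEye(const Vec3& v, float eyeOffsetX) {
    return Vec3{v.x + eyeOffsetX, v.y, v.z};
}

int main() {
    std::vector<Vec3> mesh(100000, Vec3{0.0f, 0.0f, 1.0f});

    // Traditional stereo: the whole vertex pipeline runs once per eye.
    gVertexWorkInvocations = 0;
    for (int eye = 0; eye < 2; ++eye)
        for (const Vec3& v : mesh)
            projectForEye(heavyVertexWork(v), eye == 0 ? -0.03f : 0.03f);
    std::printf("traditional: %d vertex-work invocations\n", gVertexWorkInvocations);

    // Single-pass idea: heavy work once, both eye positions derived from it.
    gVertexWorkInvocations = 0;
    for (const Vec3& v : mesh) {
        Vec3 shaded = heavyVertexWork(v);
        projectForEye(shaded, -0.03f);
        projectForEye(shaded, +0.03f);
    }
    std::printf("single pass: %d vertex-work invocations\n", gVertexWorkInvocations);
    return 0;
}
```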

Both techniques allow developers to increase performance and visual detail of their VR applications. Combined with the performance of GTX 1080 GPUs, Simultaneous Multi-Projection delivers a dramatic 2x VR performance improvement over the GeForce GTX TITAN X.*

VR and the Sense of Sound

Audio can have a huge impact on presence in VR. Traditional VR audio provides an accurate 3D position of the sound source within a virtual environment. However, sound in the real world conveys more than just the location of its source. It’s changed by the physical environment as the waves move through walls and bounce off objects, creating echoes, reverberations or muffled sound. We expect these subtle changes in real life, so their absence in virtual environments detracts from the realism.

To solve this, NVIDIA has developed VRWorks Audio, our new path-traced audio technology. VRWorks Audio uses the NVIDIA OptiX ray-tracing engine to simulate the movement, or propagation, of sound within an environment, changing the sound in real time based on the size, shape and material properties of your virtual world, just as you’d experience in real life.
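
Conceptually, path-traced audio boils down to finding propagation paths from the source to the listener and converting each path’s length and surface materials into a delayed, attenuated echo. The sketch below hard-codes a few example paths to show that conversion; it illustrates the idea only and is not the VRWorks Audio or OptiX API.

```cpp
// Illustrative sketch only: each propagation path a ray tracer might find is
// reduced to a delay (from path length) and a gain (from distance falloff and
// the absorption of each surface the sound bounced off). The paths below are
// hard-coded stand-ins for what a real tracer would discover in the scene.
#include <cstdio>
#include <vector>

struct Path {
    float lengthMeters;                 // total distance travelled
    std::vector<float> absorptions;     // energy absorbed at each bounce (0..1)
};

int main() {
    const float speedOfSound = 343.0f;  // m/s in air

    // Example: a direct path plus two wall reflections of different materials.
    std::vector<Path> paths = {
        {2.0f, {}},                     // direct sound, no bounces
        {6.5f, {0.10f}},                // one bounce off concrete (low absorption)
        {9.0f, {0.10f, 0.60f}},         // concrete, then a curtain (high absorption)
    };

    for (const Path& p : paths) {
        float delayMs = p.lengthMeters / speedOfSound * 1000.0f;
        float gain = 1.0f / (1.0f + p.lengthMeters);      // simple distance falloff
        for (float a : p.absorptions) gain *= (1.0f - a); // material losses per bounce
        std::printf("echo: delay %.1f ms, gain %.3f\n", delayMs, gain);
    }
    return 0;
}
```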

VR and the Sense of Touch

Visuals and sound are critical to placing you in a believable scene. But just as important are accurate physics and touch interactivity. Virtual objects that don’t respond to your movements or behave as they would in the real world will quickly disconnect you from your virtual world.

Today’s VR experiences deliver touch interactivity through a combination of positional tracking, hand controllers and haptics. The NVIDIA PhysX engine allows developers to detect when a hand controller interacts with a virtual object and enables the game engine to provide a physically accurate visual and haptic response.
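
The pattern looks roughly like this: each frame, ask the physics scene whether the tracked controller volume overlaps a simulated object, then scale the haptic response by how hard the contact was. The sketch below uses a hand-rolled sphere overlap test and a made-up pulseController hook purely for illustration; it is not the PhysX API.

```cpp
// Conceptual sketch only, not the PhysX API: (1) test whether the controller
// volume overlaps a simulated object this frame, and (2) scale the haptic
// pulse by the relative contact speed so touch feedback matches what you see.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float length(const Vec3& v) { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }

struct Sphere { Vec3 center; float radius; Vec3 velocity; };

// Sphere-sphere overlap test standing in for a physics-engine query.
bool overlaps(const Sphere& a, const Sphere& b) {
    Vec3 d{a.center.x - b.center.x, a.center.y - b.center.y, a.center.z - b.center.z};
    return length(d) <= a.radius + b.radius;
}

// Hypothetical haptics hook: strength in [0,1], duration in milliseconds.
void pulseController(float strength, int durationMs) {
    std::printf("haptic pulse: strength %.2f for %d ms\n", strength, durationMs);
}

int main() {
    Sphere controller{{0.0f, 1.0f, 0.0f}, 0.05f, {0.0f, -2.0f, 0.0f}}; // hand moving down
    Sphere ball{{0.0f, 0.93f, 0.0f}, 0.04f, {0.0f, 0.0f, 0.0f}};       // resting object

    if (overlaps(controller, ball)) {
        // Relative speed at contact drives both the visual response in the
        // game engine and the strength of the haptic feedback.
        Vec3 rel{controller.velocity.x - ball.velocity.x,
                 controller.velocity.y - ball.velocity.y,
                 controller.velocity.z - ball.velocity.z};
        float strength = std::fmin(1.0f, length(rel) / 5.0f); // clamp to [0,1]
        pulseController(strength, 20);
    }
    return 0;
}
```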

The PhysX engine also allows developers to model the physical behavior of the virtual world around you, so that all interactions, whether an explosion or a hand splashing through water, are accurately simulated and behave as they would in the real world.

VRWorks and PhysX Converge in Next-Gen VR Experiences

NVIDIA is integrating VRWorks Graphics, Audio and PhysX technologies into a brand new VR experience called VR Funhouse, which is coming soon to Steam. VR Funhouse demonstrates the immersion and fun of bringing together great graphics, physical audio, touch interactivity and a fully simulated environment.

Another great example is Sólfar Studios’ Everest VR. NVIDIA worked with Sólfar to implement our VRWorks Multi-Res Shading technique, resulting in a substantial performance gain. Sólfar took advantage of those gains and added NVIDIA Turbulence to simulate swirling snow being blown off the peaks of Everest. The effect is subtle but powerful, drawing people into the experience of climbing the world’s highest peak.

VR and game developers have already begun working with GeForce GTX 1080 GPUs and VRWorks to bring a new level of immersion to their experiences.

  • “Forward-thinking features in NVIDIA VRWorks enable us to push the limits of photoreal graphics, bringing higher immersion and presence to virtual reality.” — Kjartan Pierre Emilsson, CEO, Sólfar Studios
  • “We are looking forward to bringing NVIDIA’s new VRWorks features to Valkyrie to take the game’s visuals and performance to another level.” — Hilmar Veigar Pétursson, CEO, CCP Games
  • “The new VRWorks Lens Matched Shading and Single Pass Stereo features are smart, novel approaches to the performance challenges VR applications are faced with.” — Brandon Laatsch, co-founder, Stress Level Zero
  • “We took the performance headroom we gained by adding VRWorks technologies to Mars 2030 and used it to bring new levels of immersion and realism to the experience.” — Julian Reyes, lead VR producer, FUSION

We look forward to sharing the work they’re doing soon!

* Performance measured in Mech VR demo.
