Move over, Emmy Awards. Real-time ray tracing just stole the spotlight in broadcast television.
Riot Games used The Future Group’s Pixotope mixed-reality virtual production software to deliver the first real-time ray-traced live broadcast at the League of Legends Pro League regional finals in Shanghai earlier this month.
For the opening ceremony of the esports show, an augmented reality gaming character was shown being interviewed by the host, answering questions in real time and performing a choreographed number with other dancers on stage.
To achieve this level of realism, Pixotope software relied on NVIDIA RTX GPUs for graphics computing and ray tracing, along with Cubic Motion for real-time facial animation, Animatrik for managing motion capture and Stype for camera tracking. The combination creates real-time photorealistic graphics and visual effects on live television — and takes immersive mixed reality to the next level.
The Future Group, based in Norway, provides its software products and services for virtual production around the world. It first popularized immersive mixed reality on broadcast TV when The Weather Channel used its Frontier technology to create a real-time visualization that conveyed the severity of a hurricane.
Making It Real
A big challenge in live production has been maintaining standard broadcast video frame rates while running compute-intensive ray tracing. RTX-powered Pixotope software now makes real-time ray tracing practical for studios.
Pixotope’s architecture is built on Unreal Engine, the game engine developed by Epic Games, and uses a single-pass render pipeline to keep rendering overhead low. The result is real-time ray tracing at broadcast frame rates.
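Neither Riot Games nor The Future Group has published implementation details, but the underlying constraint is easy to illustrate: every frame, ray-traced effects have to fit inside a fixed broadcast frame budget (for example, roughly 16.7 ms at 60 fps). The minimal C++ sketch below shows the idea of a per-frame budget check that scales ray-tracing quality down when the renderer runs long. The frame budget, sample counts, and simulated render cost are assumptions for illustration, not Pixotope's actual pipeline.

```cpp
// Hypothetical sketch: keep ray-traced effects within a broadcast frame budget.
// This does not reflect Pixotope's real architecture; it only illustrates the
// frame-rate constraint described above.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    const double frameBudgetMs = 16.7;   // assumed 60 fps broadcast target
    int samplesPerPixel = 8;             // assumed starting ray-tracing quality

    for (int frame = 0; frame < 10; ++frame) {
        auto start = std::chrono::steady_clock::now();

        // Stand-in for the real render: cost grows with samples per pixel.
        std::this_thread::sleep_for(
            std::chrono::milliseconds(2 * samplesPerPixel));

        double elapsedMs = std::chrono::duration<double, std::milli>(
            std::chrono::steady_clock::now() - start).count();

        // Adapt quality so the next frame stays inside the budget.
        if (elapsedMs > frameBudgetMs)
            samplesPerPixel = std::max(1, samplesPerPixel - 1);
        else if (elapsedMs < frameBudgetMs * 0.7)
            samplesPerPixel = std::min(16, samplesPerPixel + 1);

        std::printf("frame %d: %.1f ms, %d spp\n",
                    frame, elapsedMs, samplesPerPixel);
    }
}
```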
With Pixotope, Riot Games and The Future Group were able to tap into the NVIDIA Turing architecture’s RT Cores for ray tracing, Tensor Cores for denoising, and CUDA cores for shading, rendering realistic details ranging from the varied lighting across the stage to soft shadows.
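Soft shadows are a good example of what that hardware makes affordable. In ray tracing they come from shooting many shadow rays toward different points on an area light and averaging how many are blocked; RT Cores accelerate exactly this kind of ray-intersection testing. The self-contained C++ sketch below shows the principle on the CPU with a single spherical occluder; the scene, light, and sample count are invented for illustration and are unrelated to the actual stage show.

```cpp
// Illustrative soft-shadow estimate: average occlusion over jittered shadow
// rays toward an area light. Scene and numbers are made up for this example.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Does a ray from 'origin' toward 'target' hit the sphere before the target?
bool shadowRayBlocked(Vec3 origin, Vec3 target, Vec3 center, double radius) {
    Vec3 d = sub(target, origin);
    double len = std::sqrt(dot(d, d));
    Vec3 dir = {d.x / len, d.y / len, d.z / len};
    Vec3 oc = sub(origin, center);
    double b = dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - c;
    if (disc < 0.0) return false;              // ray misses the sphere
    double t = -b - std::sqrt(disc);           // nearest intersection distance
    return t > 1e-6 && t < len;                // blocked only between point and light
}

int main() {
    Vec3 shadedPoint  = {0.0, 0.0, 0.0};       // point on the stage floor
    Vec3 lightCenter  = {0.0, 5.0, 0.0};       // center of a square area light
    double lightHalf  = 1.0;                   // half-width of the light
    Vec3 occluder     = {0.3, 2.5, 0.0};       // sphere between point and light
    double occluderR  = 0.5;

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> jitter(-lightHalf, lightHalf);

    const int samples = 256;                   // more samples = smoother penumbra
    int blocked = 0;
    for (int i = 0; i < samples; ++i) {
        Vec3 lightPoint = {lightCenter.x + jitter(rng), lightCenter.y,
                           lightCenter.z + jitter(rng)};
        if (shadowRayBlocked(shadedPoint, lightPoint, occluder, occluderR))
            ++blocked;
    }
    std::printf("visibility: %.2f (0 = fully shadowed, 1 = fully lit)\n",
                1.0 - double(blocked) / samples);
}
```

Points partially visible to the area light get intermediate visibility values, which is what produces the gradual penumbra of a soft shadow rather than a hard edge.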
RTX-powered ray tracing, along with real-time facial animation, helped integrate the animated character into the broadcast, matching its movements and reactions to the presenters and dancers on stage.
“Ray tracing requires an immense amount of rendering power, and we needed a powerful solution in order to make the performance believable and seamless,” said Marcus B. Brodersen, CTO at The Future Group. “With NVIDIA RTX technology, The Future Group and Riot Games are showcasing real-time animation, broadcast graphics, virtual sets and mixed reality. This new technology is elevating the bar of what’s possible for live broadcast.”
Learn more about NVIDIA RTX in design and visualization.