Imagine watching your favorite artist perform and then noticing the crowd isn’t the only thing dancing with the music — the concert backdrop is too.
That’s exactly what attendees experienced at the Pharos festival, a three-day music event that combined irresistible beats with high-tech, immersive effects.
Grammy winner Childish Gambino, the stage name of artist Donald Glover, performed three shows over three nights at Pharos, which took place in November in Auckland, New Zealand.
The concerts were held inside a 160-foot-tall dome. Visuals were projected in a 360-degree view to support the stage design, with Childish Gambino performing at the center. When audiences looked up, they saw different characters moving to the music and colorful scenery lighting up the dome.
Kicking Things Off With Quadro
The projection system was made up of five machines, each responsible for a region of the dome. 2n and Weta orchestrated the five machines into one seamless, circular render using powerful NVIDIA Quadro P6000 graphics.
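The division of labor described above can be sketched in a few lines. This is a hypothetical illustration, assuming the dome's 360 degrees of yaw are split evenly into five sectors, one per render node; the node names and sector math are illustrative, not the actual Pharos configuration.

```python
# Hypothetical sketch: dividing a 360-degree dome projection among five
# render nodes, each covering one yaw sector. Illustrative only.

NUM_NODES = 5

def node_sectors(num_nodes=NUM_NODES):
    """Return (node_id, start_yaw, end_yaw) in degrees for each node."""
    sector = 360.0 / num_nodes  # 72 degrees of yaw per machine
    return [(f"node_{i}", i * sector, (i + 1) * sector)
            for i in range(num_nodes)]

def node_for_yaw(yaw, num_nodes=NUM_NODES):
    """Map a camera yaw (degrees) to the node responsible for it."""
    return f"node_{int(yaw % 360.0 // (360.0 / num_nodes))}"
```

A director's tool could use a lookup like `node_for_yaw(350.0)` to know which machine renders a given viewpoint; the real cluster synchronization is, of course, far more involved.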
VR Brings Visuals to Life
To ensure the visuals were captivating from all viewpoints and angles inside the dome, the companies developed content within a VR implementation. This allowed them to review and iterate the designs before setting up the system.
“Creating visual narratives that were interactive and could respond to Donald’s performance – as well as audience mood and energy – was an amazing opportunity,” said Keith Miller, visual effects supervisor at Weta Digital. “We were able to art-direct movements inside the dome that made every vantage point feel special – and we were able to preview that environment by working in a real-time VR workspace.”
“VR helped us wrap our heads around the 360-degree design,” said Alejandro Crawford, creative director for Pharos at 2n Design. “We could tell if a tree was too close to the camera, or if a creature needed to move a different way. VR helped us compose the shots and choreography the way we wanted.”
Real-Time Visuals During a Real Performance
Once the visuals were ready, the team had to synchronize the graphics to the performance and make sure the entire system was rendering in sync.
The system architecture was developed in Unreal Engine, where images were rendered in real time using the graphics engine’s nDisplay technology. NVIDIA Quadro P6000 GPUs powered high-resolution renders during the live production.
“We wanted seamless transitions from one world to another, as opposed to fading in and out. Waiting for levels to load was not an option,” said Crawford. “Our strategy was to load everything into memory, and the Quadro GPUs had enough video texture memory for this to work.”
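The "load everything up front" strategy Crawford describes can be sketched as an eager asset cache: every asset is read into memory before the show starts, so scene transitions never block on disk I/O. The class and asset names below are hypothetical, purely to illustrate the idea.

```python
# Hypothetical sketch of the load-everything-into-memory strategy:
# assets are loaded eagerly at startup so that transitions between
# worlds never wait on disk. Names are illustrative only.

class AssetCache:
    def __init__(self, loader, asset_names):
        # Eagerly load every asset before the performance begins.
        self._assets = {name: loader(name) for name in asset_names}

    def get(self, name):
        # At showtime, transitions are pure in-memory lookups.
        return self._assets[name]

# Dummy loader standing in for reading textures/levels from disk.
cache = AssetCache(loader=lambda name: f"<data:{name}>",
                   asset_names=["forest", "ocean", "creatures"])
```

The trade-off is exactly the one the quote highlights: this only works when the GPUs have enough memory to hold every world at once.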
The team then built a custom engine that received content from Unreal and stitched the individual feeds together in real time, producing stunning projections at over 5K resolution.
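At its simplest, stitching per-node feeds means concatenating adjacent frames into one panoramic image. The sketch below assumes each node delivers a frame as a list of pixel rows; the production system would also handle edge blending and frame synchronization, which are omitted here.

```python
# Hypothetical sketch of real-time feed stitching: each render node
# delivers a frame (a list of rows of pixels), and the stitcher joins
# the rows side by side into one panoramic frame. Blending and sync
# are omitted for brevity.

def stitch(frames):
    """Concatenate equally tall frames horizontally into one panorama."""
    heights = {len(f) for f in frames}
    if len(heights) != 1:
        raise ValueError("all frames must share the same height")
    height = heights.pop()
    return [sum((frame[y] for frame in frames), []) for y in range(height)]

# Five 2x2 dummy frames stitch into a single 2x10 panorama.
frames = [[[i, i], [i, i]] for i in range(5)]
panorama = stitch(frames)
```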
Audio-reactive parameters were designed and mapped to the system so that the visuals responded to Childish Gambino’s performance.
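One common way to build an audio-reactive parameter is to measure the loudness of each incoming audio block and map it onto a visual value. The sketch below is a minimal, hypothetical example, assuming samples normalized to [-1, 1] and a simple linear mapping from RMS level to a 0-to-1 intensity; the actual Pharos mappings were not described in detail.

```python
# Hypothetical audio-reactive parameter: the RMS level of an audio
# block drives a visual intensity between 0 and 1. The mapping and
# thresholds are illustrative, not the production pipeline.
import math

def rms(samples):
    """Root-mean-square level of a block of samples in [-1, 1]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def audio_to_intensity(samples, floor=0.05, ceiling=0.8):
    """Map the block's RMS level linearly onto a 0..1 visual intensity."""
    t = (rms(samples) - floor) / (ceiling - floor)
    return min(1.0, max(0.0, t))
```

In a live show, a value like this might drive brightness, particle emission, or how energetically the dome's characters move to the music.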
The result: an immersive music event with beautiful graphics projected throughout the dome, all reacting to the live event.
“Real-time technology is changing how content is consumed and created. Epic and NVIDIA are pioneering the latest technology that’s enabling artists to tackle large-scale projects and build extraordinary visual environments,” said Marc Petit, general manager of Unreal Engine and Enterprise at Epic Games. “With the rendering system in Unreal Engine and the power of Quadro GPUs, designers can create immersive, interactive experiences like never before.”
This unique project drew recognition from fans and journalists who attended the event. Last month, 2n Design and Weta won the award for Outstanding Visual Effects in a Special Venue Project at the 17th annual VES Awards.
Get Epic at GTC
See all the sessions on real-time graphics and rendering at GTC. And below, check out more highlights of what to expect in media and entertainment at the show.