This weekend’s Academy Awards show features a twice-nominated newcomer to the Oscars: AI-powered visual effects.
Two nominees in the visual effects category, The Irishman and Avengers: Endgame, used AI to blur the line between human actors and digital characters — de-aging the stars of The Irishman and bringing the infamous villain Thanos to life in Avengers.
Behind this groundbreaking, AI-enhanced storytelling are VFX studios Industrial Light & Magic and Digital Domain, which use NVIDIA Quadro RTX GPUs to accelerate production.
AI Time Machine
From World War II to a nursing home in the 2000s, and every decade in between, Netflix’s The Irishman tells the tale of hitman Frank Sheeran through scenes from different times in his life.
But all three leads in the film — Robert De Niro, Al Pacino and Joe Pesci — are in their 70s. A makeup department couldn’t realistically transform the actors back to their 20s and 30s. And director Martin Scorsese was against using the typical motion capture markers or other intrusive equipment that gets in the way of raw performances during filming.
To meet this requirement, ILM developed a new three-camera rig to capture the actors’ performances on set — using the director’s camera flanked by two infrared cameras to record 3D geometry and textures. The team also developed software called ILM Facefinder that used AI to sift through thousands of images of the actors’ past performances.
The tool located frames that matched the camera angle, framing, lighting and expression of the scene being rendered, giving ILM artists a relevant reference to compare against every frame in the shot. These visual references were used to refine digital doubles created for each actor, so they could be transformed into the target age for each specific scene in the film.
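The article doesn’t disclose how Facefinder works internally, but the behavior it describes — ranking archive frames by how closely they match a shot’s angle, lighting and expression — is the shape of a nearest-neighbor search over per-frame feature vectors. The sketch below is purely illustrative (the feature encoding, frame IDs and function names are invented, not ILM’s):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_reference_frame(query, archive):
    """Return the archive entry whose features best match the query.

    `archive` is a list of (frame_id, feature_vector) pairs; in a real
    system the features might encode head pose, lighting direction and
    facial expression extracted by a neural network.
    """
    return max(archive, key=lambda entry: cosine_similarity(query, entry[1]))

# Toy archive: hypothetical frame IDs with 3-component feature vectors
archive = [
    ("archive_frame_0412", [0.9, 0.1, 0.3]),
    ("archive_frame_1107", [0.2, 0.8, 0.5]),
    ("archive_frame_0033", [0.4, 0.4, 0.9]),
]
best = find_reference_frame([0.85, 0.15, 0.35], archive)
```

With real footage the archive would hold thousands of frames, so a production tool would index the vectors for fast lookup rather than scan linearly.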
“AI and machine learning are becoming a part of everything we do in VFX,” said Pablo Helman, VFX supervisor on The Irishman at ILM. “Paired with the NVIDIA Quadro RTX GPUs powering our production pipeline, these technologies have us excited for what the next decade will bring.”
Building Better VFX Villains
The highest-grossing film of all time, Marvel’s Avengers: Endgame included over 2,500 visual effects shots. VFX teams at Digital Domain used machine learning to animate actor Josh Brolin’s performance onto the digital version of the film franchise’s villain, the mighty Thanos.
A machine learning system called Masquerade was developed to take low resolution scans of the actor’s performance and facial movements, and then accurately transfer his expressions onto the high-resolution mesh of Thanos’ face. The technology saves time for VFX artists who would otherwise have to painstakingly animate the subtle facial movements manually to generate a realistic, emoting digital human.
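Digital Domain hasn’t published Masquerade’s internals, but the core idea described here — driving a high-resolution face mesh from a low-resolution capture — is often framed as estimating blendshape weights from the sparse data and reapplying them to the detailed mesh. A minimal sketch of that pattern, assuming orthogonal blendshape bases for simplicity (all names and data are hypothetical):

```python
def estimate_weights(tracked_offsets, low_res_shapes):
    """Project tracked marker offsets onto each low-res blendshape.

    Assumes the blendshape bases are orthogonal, so each weight can be
    solved with an independent 1-D least-squares projection.
    """
    weights = []
    for shape in low_res_shapes:
        num = sum(t * s for t, s in zip(tracked_offsets, shape))
        den = sum(s * s for s in shape)
        weights.append(num / den)
    return weights

def apply_to_high_res(base_mesh, high_res_shapes, weights):
    """Deform the high-res mesh by the weighted sum of its blendshapes."""
    mesh = list(base_mesh)
    for w, shape in zip(weights, high_res_shapes):
        for i, offset in enumerate(shape):
            mesh[i] += w * offset
    return mesh

# Toy data: two low-res markers, two expressions ("smile", "frown")
tracked = [0.5, 0.2]
low_res_shapes = [[1.0, 0.0], [0.0, 1.0]]
weights = estimate_weights(tracked, low_res_shapes)

# The same weights drive a finer 3-vertex mesh with matching blendshapes
high_res = apply_to_high_res([0.0, 0.0, 0.0],
                             [[2.0, 1.0, 0.0], [0.0, 1.0, 2.0]],
                             weights)
```

In practice the mapping is learned rather than hand-built, and the meshes have thousands of vertices; the point is that a compact expression estimate, not raw geometry, is what carries the performance across resolutions.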
“Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology,” said Darren Hendler, head of Digital Humans at Digital Domain. “We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”
RTX It in Post: Studios, Apps Adopt AI-Accelerated VFX
ILM and Digital Domain are just two of a growing set of visual effects studios and apps adopting AI tools accelerated by NVIDIA RTX GPUs.
In HBO’s The Righteous Gemstones series, lead actor John Goodman looks 30 years younger than he is. This de-aging effect was achieved with Shapeshifter, a custom software that uses AI to analyze face motion — how the skin stretches and moves over muscle and bone.
VFX studio Gradient Effects used Shapeshifter to transform the actor’s face in a process that, using NVIDIA GPUs, took weeks instead of months.
Companies such as Adobe, Autodesk and Blackmagic Design have developed RTX-accelerated apps to tackle other visual effects challenges with AI, including live-action scene depth reclamation, color adjustment, relighting and retouching, speed-warp motion estimation for retiming, and upscaling.
Netflix Greenlights AI-Powered Predictions
Offscreen, streaming services such as Netflix use AI-powered recommendation engines to serve customers personalized content, drawing both on each viewer’s own history and on a similarity index that surfaces content watched by people with similar viewing habits.
Netflix also customizes movie thumbnails to appeal to individual users, and uses AI to help optimize streaming quality at lower bandwidths. The company uses NVIDIA GPUs to accelerate its work with complex data models, enabling rapid iteration.
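Netflix’s actual recommendation models are far more elaborate, but the "similarity index" idea mentioned above can be illustrated with a toy collaborative filter: find the user whose watch history overlaps most with yours, then suggest what they’ve seen that you haven’t. All usernames and titles below are invented examples:

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of watched titles."""
    return len(a & b) / len(a | b)

def recommend(user, histories):
    """Suggest unseen titles from the most similar other user's history.

    `histories` maps each user to the set of titles they have watched.
    """
    best = max(
        (u for u in histories if u != user),
        key=lambda u: jaccard(histories[user], histories[u]),
    )
    return sorted(histories[best] - histories[user])

histories = {
    "ana": {"The Irishman", "Goodfellas", "Casino"},
    "ben": {"The Irishman", "Goodfellas", "Heat"},
    "cam": {"Frozen", "Moana"},
}
suggestions = recommend("ana", histories)
```

A production system would combine many such signals (ratings, watch time, recency) and compute them at scale on GPUs, as the article notes.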
Rolling Out the Red Carpet at GTC 2020
Top studios including Lucasfilm’s ILMxLAB, Magnopus and Digital Domain will be speaking at NVIDIA’s GPU Technology Conference in San Jose, March 23-26.
Feature image courtesy of Industrial Light & Magic. © 2019 NETFLIX