Whether creating realistic digital humans that can express emotion or building immersive virtual worlds, 3D artists can reach new heights with NVIDIA Omniverse, a platform for creating and operating metaverse applications.
A new Blender alpha release, now available in the Omniverse Launcher, lets users of the 3D graphics software optimize scenes and streamline workflows with AI-powered character animations.
Save Time, Effort With New Blender Add-Ons
The new scene optimization add-on in the Blender release enables creators to fix bad geometry and generate automatic UVs, or 2D maps of 3D objects. It also reduces the number of polygons that need to be rendered, which improves the scene’s overall performance and significantly cuts file size, as well as CPU and GPU memory usage.
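For creators comfortable with scripting, the same kinds of cleanup can also be sketched with Blender’s built-in Python API. The snippet below is a minimal, hypothetical example rather than the add-on’s own code: it merges stray vertices, auto-unwraps UVs and decimates the selected mesh to cut its polygon count.

```python
# Minimal sketch of manual scene cleanup in Blender Python (bpy); the Omniverse
# add-on automates and extends this kind of work. Assumes a mesh object is active.
import bpy

obj = bpy.context.active_object
assert obj is not None and obj.type == 'MESH'

# Fix bad geometry: merge duplicate vertices and make normals consistent.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.remove_doubles(threshold=0.0001)
bpy.ops.mesh.normals_make_consistent(inside=False)

# Generate automatic UVs (a 2D map of the 3D object) with a smart projection.
bpy.ops.uv.smart_project(island_margin=0.02)
bpy.ops.object.mode_set(mode='OBJECT')

# Reduce polygon count with a decimate modifier, keeping roughly half the faces.
mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
mod.ratio = 0.5
bpy.ops.object.modifier_apply(modifier=mod.name)
```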
Plus, with a new Audio2Face add-on, anyone can now accomplish what used to require a technical rigger or animator.
A panel in the add-on makes it easier to use Blender characters in Audio2Face, an AI-enabled tool that automatically generates realistic facial expressions from an audio file.
This new functionality eases the process of bringing generated face shapes back onto rigs, or digital skeletons: shapes exported through the Universal Scene Description (USD) framework can be applied to a character even if it is fully rigged, meaning its whole body has a working digital skeleton. Because integrating the facial shapes doesn’t alter the rig, Audio2Face shapes and animation can be applied to characters, whether for games, shows and films, or simulations, at any point in the artist’s workflow.
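As a rough illustration of why the rig stays untouched, the hypothetical Blender Python sketch below (the object and shape names are made up, and this is not the add-on’s own code) adds and keyframes facial shape keys directly on the mesh data, leaving any armature that drives the body alone.

```python
# Shape keys live on the mesh data, not on the armature, so keyframing them
# never modifies the character's rig. Assumes a mesh named "Face" exists.
import bpy

face = bpy.data.objects["Face"]  # hypothetical character mesh; rename as needed

face.shape_key_add(name="Basis", from_mix=False)
jaw_open = face.shape_key_add(name="jawOpen", from_mix=False)

# Keyframe the shape's influence over time, for example with values taken from
# an Audio2Face export; the armature and its bones are never touched.
for frame, value in [(1, 0.0), (12, 0.8), (24, 0.1)]:
    jaw_open.value = value
    jaw_open.keyframe_insert(data_path="value", frame=frame)
```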
Realistic Character Animation Made Easy
Audio2Face puts AI-powered facial animation in the hands of every Blender user who works with Omniverse.
Using the new Blender add-on for Audio2Face, animator and popular YouTuber Marko Matosevic, aka Markom 3D, rigged a Battletoads-inspired character and animated its face from just an audio file.
Australia-based Matosevic joined Dave Tyner, a technical evangelist at NVIDIA, on a livestream to showcase their live collaboration across time zones, connecting 3D applications in a real-time Omniverse jam session. The two used the new Blender alpha release with Omniverse to make progress on one of Matosevic’s short animations.
The new Blender release was also on display last month at CES in The Artists’ Metaverse, a demo featuring seven artists, across time zones, who used Omniverse Nucleus Cloud, Autodesk, SideFX, Unreal Engine and more to create a short cinematic in real time.
Creators can save time and simplify processes with the add-ons available in Omniverse’s Blender build.
NVIDIA principal artist Zhelong Xu, for example, used Blender and Omniverse to visualize an NVIDIA-themed “Year of the Rabbit” zodiac artwork.
“I got the desired effect very quickly and tested a variety of lighting effects,” said Xu, an award-winning 3D artist who’s previously worked at top game studio Tencent and made key contributions to an animated show on Netflix.
Get Plugged Into the Omniverse
Learn more about Blender and Omniverse integrations by watching a community livestream on Wednesday, Feb. 15, at 11 a.m. PT via Twitch and YouTube.
And the session catalog for NVIDIA GTC, a global AI conference running online March 20-23, features hundreds of curated talks and workshops for 3D creators and developers. Register free to hear from NVIDIA experts and industry luminaries on the future of technology.
Creators and developers can download NVIDIA Omniverse free. Enterprises can try Omniverse Enterprise free on NVIDIA LaunchPad. Follow NVIDIA Omniverse on Instagram, Medium, Twitter and YouTube for additional resources and inspiration. Check out the Omniverse forums, and join our Discord server and Twitch channel to chat with the community.