Unreal Engine with Rivermax will enable studios to enhance flexibility and performance for virtual productions, so creative teams can deliver high-quality, synchronized content to multiple displays.
Rivermax is a unique internet protocol (IP)-based solution for media and data streaming. With the Rivermax plug-in for Unreal Engine, users can now deliver synchronized video over IP and improve manageability of hardware resources.
Unreal Engine with Rivermax delivers two key benefits:
- Low-latency, ultra-high-bandwidth, industry-standard interconnection between render nodes and external video devices.
- An optimized stage architecture to provide the highest visual quality productions while minimizing latency.
Unreal Engine 5.1 Adds Experimental Support for SMPTE ST 2110
Many studios and creatives use virtual production and in-camera visual effects (VFX) to elevate storytelling, while iterating on ideas in real time. As more studios adopt virtual production, it’s important to ensure the stages are optimally configured to deliver the highest-quality content.
With the experimental Rivermax integration, Unreal Engine adds support for SMPTE ST 2110, a suite of standards for sending digital content over IP networks. Virtual production video streaming requires a fixed bitrate and scheduled delivery. It also requires packet burst control: left unchecked, a network card will send as much data, in the form of packets, as it can in the shortest possible time.
Some receivers can't handle large bursts of packets, which can result in buffering issues. Teams therefore want to control the number of packets sent over a given interval, as this helps maintain a fixed bitrate while avoiding network congestion and buffer overflow.
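The pacing idea above can be sketched as a simple schedule builder. This is an illustrative sketch in plain Python, not the Rivermax API, and the packet size, bitrate, and frame size are made-up example numbers:

```python
# Illustrative sketch (not the Rivermax API): space packet send times
# evenly so a stream holds a fixed bitrate instead of bursting.

def pacing_schedule(total_bytes, packet_size, bitrate_bps):
    """Return (send_time_s, packet_bytes) pairs spaced so the stream
    averages bitrate_bps rather than sending everything at once."""
    interval = (packet_size * 8) / bitrate_bps  # seconds between packets
    schedule = []
    t = 0.0
    remaining = total_bytes
    while remaining > 0:
        size = min(packet_size, remaining)
        schedule.append((t, size))
        remaining -= size
        t += interval
    return schedule

# Example: a ~5 MB frame paced onto a 1 Gb/s link in 1,500-byte packets,
# one packet every 12 microseconds instead of a single burst.
sched = pacing_schedule(5_000_000, 1500, 1_000_000_000)
```

In real ST 2110 senders this scheduling happens in hardware on the NIC, which is why offload engines like Rivermax matter; the sketch only shows the arithmetic behind a fixed-rate schedule.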
NVIDIA BlueField data processing units (DPUs) running the NVIDIA DOCA Firefly Service and Rivermax SDK help address these challenges by providing accurate timing to any operating system. Rivermax is also the only fully virtualized streaming solution that complies with the stringent timing and traffic-flow requirements of SMPTE ST 2110.
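The accurate timing such services distribute is built on the Precision Time Protocol (IEEE 1588). As a rough illustration in plain Python (not the DOCA Firefly API), a follower clock's offset from the leader is computed from the four timestamps of a Sync/Delay_Req exchange:

```python
# Sketch of the core IEEE 1588 (PTP) offset calculation that precision
# timing services build on. Assumes a symmetric network path.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: leader sends Sync; t2: follower receives it;
    t3: follower sends Delay_Req; t4: leader receives it."""
    offset = ((t2 - t1) - (t4 - t3)) / 2  # follower clock minus leader clock
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way path delay
    return offset, delay

# Example (microseconds): follower runs 1.5 us ahead; one-way delay is 10 us.
offset, delay = ptp_offset_and_delay(100.0, 111.5, 200.0, 208.5)
```

Repeating this exchange lets every render node discipline its clock to the same reference, which is what makes frame-accurate delivery across an ST 2110 stage possible.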
The nDisplay system for Unreal Engine enables users to combine network configuration data, display-hardware details, distributed rendering, and image output for large surfaces and screens. The latest release of Unreal Engine features the new Media I/O Mapping System, which sets the stage for nDisplay render nodes to stream directly to LED wall processors that support SMPTE ST 2110.
Going Behind the Screens of Virtual Scenes
In an in-camera VFX virtual production workflow, the portion of the main LED wall captured by the camera is called the inner frustum, which moves in sync with the camera. The content displayed on the LED volume outside of the camera's view is called the outer viewports, which are commonly used as a dynamic light and reflection source for the physical set.
The latest release of Unreal Engine introduces a new hardware configuration option for nDisplay systems. While the existing architecture requires the inner frustum to be rendered by each machine in the cluster to ensure proper synchronization, the new support for SMPTE ST 2110 offers the potential for the inner frustum to be rendered once by a dedicated machine, then shared with the other machines rendering the outer viewports. This maximizes the resources available for the inner frustum render, helping ensure the best possible image quality is captured by the camera.
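To see why this helps, here is a toy model comparing per-node GPU load in the two architectures. The cost units and node count are hypothetical, chosen only to illustrate the trade-off, not measured data:

```python
# Toy model (hypothetical cost units, illustrative only): per-node GPU
# load in the two nDisplay architectures described above.

INNER = 60  # hypothetical cost of rendering the inner frustum
OUTER = 40  # hypothetical cost of rendering one outer viewport

# Existing architecture: every node renders the inner frustum
# plus its own outer viewport.
legacy_peak = INNER + OUTER        # heaviest node does both jobs

# ST 2110 architecture: one dedicated node renders only the inner
# frustum and streams it; the others render only their outer viewports.
dedicated_peak = max(INNER, OUTER)  # heaviest node does one job
```

In this toy model the busiest node drops from 100 units to 60, headroom that can go toward higher inner-frustum quality, which is the resource argument the new configuration option makes.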
Unreal Engine 5.1 is the pivotal first step in a multi-release effort toward the next generation of in-camera VFX stage deployments. The goal is to improve content performance, increase linear scalability of the render nodes, and reduce latency. Epic Games and NVIDIA will continue to collaborate on this development and look forward to a future where real-world productions can reap the benefits of SMPTE ST 2110.
And check out other NVIDIA technologies for enhancing virtual productions at the upcoming NAB Show.