With Hurricane Florence threatening flash floods, The Weather Channel on Thursday broadcast its first-ever live simulation to convey the storm’s severity before it hit land.
The Atlanta-based television network has adopted, in its productions, graphics processing techniques more commonly used by video game makers. The result, shown in the video below, is a stunning, immersive mixed reality visual that accompanies meteorologists in live broadcasts.
Hurricane Florence slammed into the southeastern shore of North Carolina early Friday morning. Winds of the Category 1 hurricane reached 90 miles per hour, and up to 40 inches of rain were forecast to drench the region.
Warnings for life-threatening storm surge flooding have been in effect along the North Carolina coast.
The Weather Channel began working with immersive mixed reality in 2016 to better convey the severity of conditions through graphically intense simulations powered by high performance computing. The technique has only recently made its way into broadcast news as a means of communicating life-threatening weather.
In June, The Weather Channel began airing immersive mixed reality in its live broadcasts, tapping The Future Group along with its own teams of meteorologists and designers. Their objective was to find new ways to convey the severity of the weather, said Michael Potts, vice president of design at The Weather Channel.
“Our larger vision is to evolve and transform how The Weather Channel puts on its presentation, to leverage this immersive technology,” he added.
The Weather Channel takes the traditional green-screen set, the backdrop used for visual effects, and places the meteorologist at its center for a live broadcast. The weather simulation is rendered onto the green screen, wrapping around the broadcaster with real-time visuals in sync with the broadcast. “It’s a tremendous amount of real-time processing, enabled by NVIDIA GPUs,” said Potts.
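The core idea behind green-screen compositing is simple even though the broadcast pipeline is not: pixels matching the screen color are replaced by the rendered simulation. The following is a minimal, illustrative chroma-key sketch in Python with NumPy; the function name, threshold, and frames are hypothetical and this is not The Weather Channel's or The Future Group's actual pipeline.

```python
import numpy as np

def chroma_key(foreground, background, threshold=1.3):
    """Composite a green-screen foreground frame over a rendered background.

    foreground, background: HxWx3 uint8 RGB frames of the same shape.
    A pixel is keyed out when its green channel clearly dominates
    its red and blue channels.
    """
    fg = foreground.astype(np.float32)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Mask of "green screen" pixels: green brighter than red and blue.
    mask = (g > threshold * r) & (g > threshold * b)
    out = foreground.copy()
    out[mask] = background[mask]  # replace keyed pixels with the render
    return out

# Toy frames: the foreground is pure green, so every pixel is keyed
# out and the composite equals the background.
fg = np.zeros((4, 4, 3), dtype=np.uint8)
fg[..., 1] = 255
bg = np.full((4, 4, 3), 80, dtype=np.uint8)
composite = chroma_key(fg, bg)
```

A production system runs this kind of keying (plus camera tracking and lighting correction) on GPUs at full broadcast resolution and frame rate, which is what makes the real-time wraparound visuals possible.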
It’s science-based: The Weather Channel feeds wind speed, wind direction, rainfall and countless other meteorological data points into the 3D renderings to produce accurate visualizations.
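Conceptually, this means each forecast variable drives a parameter of the rendered scene. Here is a deliberately simplified Python sketch of that mapping; the function, scaling factors, and parameter names are invented for illustration and do not reflect the network's actual models.

```python
def storm_visual_params(wind_mph, rainfall_in):
    """Map meteorological inputs to illustrative render parameters.

    Hypothetical scalings: wave height grows with wind speed,
    standing-water depth with forecast rainfall, and camera shake
    is a normalized intensity cue capped at 1.0.
    """
    wave_height_ft = wind_mph / 10.0          # toy surge proxy
    flood_depth_ft = rainfall_in / 12.0       # naive inches-to-feet pooling
    camera_shake = min(1.0, wind_mph / 150.0) # 0..1 intensity cue
    return {
        "wave_height_ft": wave_height_ft,
        "flood_depth_ft": flood_depth_ft,
        "camera_shake": camera_shake,
    }

# Using the figures reported for Florence: 90 mph winds, 40 in of rain.
params = storm_visual_params(wind_mph=90, rainfall_in=40)
```

In a real engine these parameters would feed material, fluid, and camera systems every frame, so the on-air scene tracks the live forecast data.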
Video game-like production was made possible through The Weather Channel’s partnership with The Future Group, a mixed reality company based in Oslo, Norway, with U.S. offices. The Future Group’s Frontier graphics platform, built on Epic Games’ Unreal Engine 4, was enlisted to deliver photorealistic immersive mixed reality backdrops.
“The NVIDIA GPUs are allowing us to really push the boundaries. We’re rendering 4.7 million polygons in real time,” said Lawrence Jones, executive vice president of the Americas at The Future Group. “The pixels that are being drawn are actually changing lives.”