
Fire, smoke, dust and explosions are staples of motion picture visual effects. A modern effects film like “Star Trek Into Darkness” or “Transformers: Dark of the Moon” contains thousands of these elements. Each requires fluid dynamics simulation and volumetric rendering. It’s computationally intensive work, and it takes a lot of artist time to get right.

That’s why the team behind Industrial Light & Magic’s Plume software – Olivier Maury, Ian Sachs and Dan Piponi – will be picking up an Academy Award on Feb. 15 for scientific and technical achievement for their work.

Because it runs simulation and rendering together on the GPU, Plume delivers the speed and flexibility digital effects artists need to boost their productivity.

Built on the NVIDIA CUDA parallel computing architecture, Plume has shown a 10x speed advantage over existing CPU-based fluid-dynamics software. To achieve this speed, Maury and his colleagues chose algorithms that map well to parallel hardware.
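The source doesn’t say which solver Plume uses, but Jacobi iteration for the pressure step is a classic example of a “parallel-friendly” fluid-dynamics algorithm: every cell’s update depends only on the previous iterate, so all cells can be computed simultaneously, unlike the inherently sequential Gauss-Seidel sweep. A minimal NumPy sketch of the idea:

```python
import numpy as np

def jacobi_pressure_solve(divergence, iters=2000):
    """Solve the 2D pressure Poisson equation laplacian(p) = div
    with Jacobi iteration on a periodic grid (unit cell spacing).
    Each update reads only the previous iterate, so every cell
    can be computed in parallel -- one GPU thread per cell."""
    p = np.zeros_like(divergence)
    for _ in range(iters):
        # Sum of the four neighbours from the previous iterate.
        neighbours = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                      np.roll(p, 1, 1) + np.roll(p, -1, 1))
        p = (neighbours - divergence) / 4.0
    return p
```

On a GPU, each iteration becomes one kernel launch where every thread performs the same neighbour average; this is an illustration of the algorithm class, not ILM’s implementation.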

And because more of the work stays on the GPU, they also avoided having to write simulation results to disk so a separate rendering program could read them back later. Plume renders the final high-resolution frame on the GPU, directly from the full simulation grid. The render includes shadows, single and multiple scattering, and the lighting setup defined by the artist.
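The core of volumetric rendering from a density grid is ray marching with Beer-Lambert absorption: step through the volume, add emitted light attenuated by everything in front, and track how much transmittance remains. A toy single-ray sketch (not Plume’s renderer, which also handles scattering and artist-defined lighting):

```python
import numpy as np

def raymarch_column(density, emission, dz=1.0, sigma=1.0):
    """March front-to-back through one column of density samples,
    accumulating emitted light dimmed by Beer-Lambert absorption."""
    transmittance = 1.0   # fraction of light still passing through
    radiance = 0.0        # light accumulated toward the camera
    for rho, e in zip(density, emission):
        alpha = 1.0 - np.exp(-sigma * rho * dz)  # opacity of this step
        radiance += transmittance * alpha * e    # dimmed by what is in front
        transmittance *= (1.0 - alpha)           # remaining see-through fraction
    return radiance, transmittance
```

One such march runs per pixel, which is exactly the kind of independent, massively parallel work a GPU excels at.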

Plume was first developed for the fluid effects in “The Last Airbender,” and has since been used on such movies as “Battleship,” “The Avengers,” “Star Trek Into Darkness” and “Pacific Rim.” For the film “Transformers: Dark of the Moon,” Plume was used for 13,000 separate effects elements.

At that scale, it’s important that every artist be at their most productive. To get artists what they needed quickly, ILM used a GPU render farm, where an NVIDIA Quadro GPU can be allocated to an individual artist working with Plume on demand.

Plume’s GPU-driven speed also lets artists explore more of what their tools can do, rather than waiting overnight for a render. Plume also has a library system where stored simulation setups can be easily copied and modified for a new shot.
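A setup library like the one described can be as simple as named records that a new shot copies and overrides. This hypothetical sketch (the field names are illustrative, not Plume’s) shows the copy-and-modify pattern with Python dataclasses:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class SimSetup:
    """Hypothetical stored simulation setup; fields are illustrative."""
    name: str
    resolution: tuple
    buoyancy: float
    dissipation: float

# The library maps setup names to stored, immutable setups.
library = {
    "campfire_smoke": SimSetup("campfire_smoke", (256, 512, 256), 1.2, 0.02),
}

# Derive a new shot's setup from a stored one, changing only what differs.
shot_setup = replace(library["campfire_smoke"],
                     name="shot_smoke", buoyancy=2.5)
```

Because the stored setups are frozen, deriving a variant never disturbs the original library entry.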

Key to Plume’s success is its flexibility, which allows artists to set Plume parameters in the Python programming language. Making use of the programmability of NVIDIA CUDA (see “What Is CUDA?”), the Plume team even built a Python bytecode interpreter that runs on the GPU. This gives the artist more ways to tie the simulation to the 3D scene, while everything still runs at GPU speed.
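The idea behind a bytecode interpreter in the simulation loop is that an artist’s expression is compiled once into a flat list of instructions, then evaluated cheaply, per cell or per time step, without calling back into the host interpreter. A toy stack machine illustrating the concept (this is not Plume’s actual interpreter, and the opcodes are invented for the example):

```python
def evaluate(bytecode, env):
    """Run a list of (op, arg) instructions on a simple stack machine.
    Ops: PUSH a constant, LOAD a name from env, ADD, MUL."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "LOAD":
            stack.append(env[arg])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# Artist expression "0.5 + 0.1 * t", compiled once, evaluated each step.
program = [("PUSH", 0.5), ("PUSH", 0.1), ("LOAD", "t"),
           ("MUL", None), ("ADD", None)]
```

On the GPU, every thread would run the same instruction list against its own local values, so artist-authored expressions execute at kernel speed.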

As ILM continues to break new ground in the world of visual effects, it’s no wonder it has received 15 Academy Awards for Best Visual Effects, 26 Scientific and Technical Achievement Awards and was presented with the National Medal of Technology by the President of the United States in 2004.

Congratulations, ILM. For nearly 40 years you have helped set the standard for visual effects and created some of the most stunning images in the history of film.


