Making the World Go Round: How Littlstar Brings VR, 360-Degree Video to the Masses

New NVIDIA DesignWorks features put the world of VR and 360-degree video in developers’ hands.

Ever thought about walking into the ring with John Cena at WrestleMania? Or hanging out on the couch in Elliot’s apartment in Mr. Robot?

Maybe sitting on the ops deck of the Rocinante in The Expanse, or swimming with great white sharks in National Geographic’s Isle of Jaws?

Step right onto the operations deck of the Martian frigate Rocinante.

You can with Littlstar, a content distribution network that’s bringing immersive VR and 360-degree video to viewers, powered by the NVIDIA DesignWorks Video Codec SDK.

This week at the GPU Technology Conference, we’ve launched new updates to DesignWorks that help developers take advantage of an entire suite of tools and technologies designed for cutting-edge 360-degree video and VR.

With DesignWorks, Littlstar streams video from major brands like Sony Music, The Economist, CNN and Showtime.

Individuals can also post their own 360-degree videos to the platform for people to experience from every angle.

Whether you’re using a VR headset, a mobile app, the NVIDIA SHIELD TV or a laptop, Littlstar opens the world of 360-degree content to everyone. And NVIDIA technology helps make it happen.


Littlstar uses Pascal architecture-based NVIDIA Tesla GPU accelerators and the NVIDIA DesignWorks Video Codec SDK to compress source footage into multiple formats, resolutions and bitrates. NVIDIA GPUs with hardware video encoders make fast work of massive amounts of footage.

John Cena, clearing out some room in the ring.

And Littlstar operates at the bleeding edge. It uses the NVIDIA Video Codec SDK through the popular FFmpeg libraries to provide both H.264 and HEVC video content.

“Using the Video Codec SDK via FFmpeg, we’ve been able to encode media for MPEG-DASH delivery using only our GPUs — cutting CPUs out of the picture entirely,” said Andrew Grathwohl, director of media technology at Littlstar. “Our entire encoding stack for DASH, from decoding to scaling and filtering, all the way to the eventual encodes and transcodes, is performed on our NVIDIA Tesla GPUs, dramatically increasing our encoding efficiency and guaranteeing timebase-synchronized outputs.”
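A GPU-only pipeline like the one Grathwohl describes can be sketched with a single FFmpeg invocation. This is an illustrative example, not Littlstar’s actual command: it assumes an FFmpeg build compiled with NVENC/NVDEC support, and the file names, resolution and bitrates are placeholders.

```shell
# Decode on the GPU (NVDEC), scale on the GPU, encode HEVC with NVENC,
# and package the result as MPEG-DASH -- no CPU-side pixel processing.
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i source_360.mp4 \
  -vf scale_cuda=3840:2160 \
  -c:v hevc_nvenc -b:v 20M \
  -c:a aac -b:a 128k \
  -f dash manifest.mpd
```

Swapping `hevc_nvenc` for `h264_nvenc` produces the H.264 rendition; running several such encodes at different resolutions and bitrates from the same decoded frames yields the multi-bitrate ladder that DASH delivery expects.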

New enhancements in DesignWorks enable developers to significantly expand their production quality and performance capabilities. Updates include:

  • GVDB Voxels: This first public release enables rendering of complex particle simulations on sparse voxel grids for medical, manufacturing, scientific and even film applications. GVDB is 10-30x faster than CPU-based rendering. In addition, GVDB can intelligently create, render and generate complex interior structures for 3D printing, providing stability while saving on materials.
  • Video Codec SDK 8: Now supports 12-bit decode with improved quality, giving developers the ability to accept high-dynamic-range content for brighter, more immersive video.
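The 12-bit decode path above is exposed through FFmpeg as well. As a hedged sketch, assuming an NVDEC/NVENC-enabled FFmpeg build, a supported GPU, and illustrative file names, a 12-bit HEVC HDR source could be hardware-decoded and re-encoded to a 10-bit HEVC delivery format like this:

```shell
# Hardware-decode a 12-bit HEVC (HDR) master with NVDEC via the cuvid
# decoder, then encode a 10-bit Main10 HEVC output with NVENC.
ffmpeg -c:v hevc_cuvid -i hdr_12bit_master.mp4 \
  -c:v hevc_nvenc -profile:v main10 -pix_fmt p010le -b:v 25M \
  hdr_output.mp4
```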

Read more about the DesignWorks updates.

The GPU Technology Conference, taking place this week at the San Jose Convention Center, features multiple sessions on DesignWorks.

Check out more 360-degree video from Littlstar.
