NVIDIA Propels Deep Learning with TITAN X, New DIGITS Training System and DevBox

by Kimberly Powell

Addressing one of today’s most sophisticated technology challenges, we just unveiled new hardware and software that bring unprecedented speed, ease and power to deep learning research.

Deep learning – a rapidly growing segment of artificial intelligence – is an engine of computing innovation for areas ranging from advanced medical and pharmaceutical research to fully autonomous, self-driving cars.

NVIDIA CEO and co-founder Jen-Hsun Huang showcased three new technologies that will fuel deep learning during his opening keynote address to the 4,000 attendees of the GPU Technology Conference:

  • NVIDIA GeForce GTX TITAN X – the most powerful processor ever built for training deep neural networks.
  • DIGITS Deep Learning GPU Training System – a software application that makes it far easier for data scientists and researchers to quickly create high-quality deep neural networks.
  • DIGITS DevBox – the world’s fastest deskside deep learning appliance — purpose-built for the task, powered by four TITAN X GPUs and loaded with the intuitive-to-use DIGITS training system.

Another Side of TITAN X

Double life: TITAN X can spin elaborate virtual worlds or blast through heavy-duty science.

TITAN X is our new flagship GeForce gaming GPU, but it’s also uniquely suited for deep learning.

We gave a sneak peek of TITAN X two weeks ago at the Game Developers Conference, in San Francisco, where it drove a stunning virtual reality experience called “Thief in the Shadows,” based on the dragon Smaug, from “The Hobbit.”

The latest AAA titles are breathtaking on TITAN X in 4K. Middle-earth: Shadow of Mordor, for example, runs at 40 frames per second on high settings with FXAA enabled, compared with 30fps on the GeForce GTX 980, released in September.

Built on the NVIDIA Maxwell GPU architecture, TITAN X delivers twice the performance and double the power efficiency of its predecessor. It combines 3,072 processing cores, good for 7 teraflops of peak single-precision performance, with 12GB of onboard memory.

With that processing power and 336.5GB/s of memory bandwidth, it can rip through the millions of pieces of data used to train deep neural networks. On AlexNet, an industry-standard model, for example, TITAN X took less than three days to train the model using the 1.2 million image ImageNet dataset, compared with over 40 days for a 16-core CPU.
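Taking the quoted training times at face value, the speedup works out roughly as follows (a sketch: "under three days" and "over 40 days" are treated as bounds, and the 90-epoch schedule is an assumption based on the original AlexNet training recipe, not a figure from this announcement):

```python
# Back-of-the-envelope speedup from the training times quoted above.
# "Under three days" on TITAN X vs. "over 40 days" on a 16-core CPU.
titan_x_days = 3.0   # upper bound quoted for TITAN X
cpu_days = 40.0      # lower bound quoted for the 16-core CPU

speedup = cpu_days / titan_x_days
print(f"GPU vs. CPU training speedup: at least {speedup:.1f}x")

# Rough throughput, assuming ~90 epochs over the 1.2M-image ImageNet
# dataset (assumption: the canonical AlexNet schedule).
images, epochs = 1_200_000, 90
print(f"Approx. images/sec on TITAN X: {images * epochs / (titan_x_days * 86400):.0f}")
```

The point of the arithmetic is that a single GPU turns a month-plus experiment cycle into one measured in days.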

Available today, TITAN X is selling for just $999.

DIGITS: Quick, Easy Path to the Best Deep Neural Net

Using deep neural networks to train computers to teach themselves how to classify and recognize objects can be an onerous, time-consuming task.

The DIGITS Deep Learning GPU Training System software changes that by giving users what they need from start to finish to build the best possible deep neural nets.

Available for download at http://developer.nvidia.com/digits, it’s the first all-in-one graphical system for designing, training and validating deep neural networks for image classification.


Preparing and loading training data sets with DIGITS – whether on a local system or from the web – is simple, thanks to its intuitive user interface and workflow management capabilities.

It’s the first system of its kind to provide real-time monitoring and visualization, so users can fine-tune their work. And it supports the GPU-accelerated version of Caffe, the popular framework used by many data scientists and researchers today to build neural nets (see “DIGITS: Deep Learning Training System” on our Parallel Forall blog for more details).
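Under the hood, a Caffe training run is driven by a solver definition. The fragment below is an illustrative sketch, not DIGITS’ exact output: the field names come from Caffe’s SolverParameter, but the filenames and values are hypothetical placeholders.

```
# solver.prototxt -- illustrative values, not DIGITS' generated output
net: "train_val.prototxt"      # network definition (hypothetical filename)
base_lr: 0.01                  # initial learning rate
lr_policy: "step"              # drop the learning rate in steps
gamma: 0.1                     # multiply the LR by this at each step
stepsize: 100000               # iterations between LR drops
max_iter: 450000               # total training iterations
momentum: 0.9
weight_decay: 0.0005
snapshot: 10000                # checkpoint interval, in iterations
snapshot_prefix: "alexnet"     # checkpoint filename prefix (hypothetical)
solver_mode: GPU               # use the GPU-accelerated build
```

DIGITS’ value is that it generates and manages files like this for you, so tuning a network becomes a matter of adjusting fields in a web form rather than hand-editing prototxt files.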

DIGITS guides users through the process of setting up, configuring and training deep neural networks – handling the heavy lifting so that scientists can focus on the research and results.

DIGITS DevBox: World’s Fastest Deskside Deep Learning Machine

Built by the NVIDIA deep learning engineering team for its own R&D work, the DIGITS DevBox is an all-in-one powerhouse of a platform for speeding up deep learning research.

Starting with its four TITAN X GPUs, every component of the DevBox – from memory to I/O to power – has been optimized to deliver highly efficient performance for the toughest deep learning research.

Built to Go Deep: Every component of our DIGITS DevBox has been optimized for deep learning research.

It comes preinstalled with all the software data scientists and researchers require to develop their own deep neural networks. This includes the DIGITS software package, the most popular deep learning frameworks – Caffe, Theano and Torch – and cuDNN 2.0, NVIDIA’s robust GPU-accelerated deep learning library.

And it’s all wrapped up in an energy-efficient, quiet, cool-running — and great-looking — package that fits under a desk and plugs into an ordinary wall socket. We’re selling this research powerhouse for $15,000.

Very early results of multi-GPU training show the DIGITS DevBox delivers almost 4X higher performance than a single TITAN X on key deep learning benchmarks. Training AlexNet can be completed in only 13 hours with the DIGITS DevBox, compared to over 2 days with the best single GPU PC, or over a month with a CPU-only system.
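Taking the quoted figures at face value, the scaling works out as follows (a sketch: “over 2 days” and “over a month” are treated as lower bounds of 48 hours and 30 days):

```python
# Rough scaling math from the DevBox figures quoted above.
devbox_hours = 13.0          # AlexNet on the 4x TITAN X DevBox
single_gpu_hours = 48.0      # lower bound for the best single-GPU PC
cpu_hours = 30 * 24          # lower bound for a CPU-only system

print(f"DevBox vs. single GPU: at least {single_gpu_hours / devbox_hours:.1f}x")
print(f"DevBox vs. CPU-only:   at least {cpu_hours / devbox_hours:.1f}x")
# ~3.7x over one GPU -- consistent with "almost 4X" across four TITAN Xs
```

That near-linear scaling across the four GPUs is the headline claim; the larger gap versus CPU-only systems compounds the single-GPU speedup with it.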

Learn more about the NVIDIA DIGITS DevBox.
