AI is still young, but software is available to help even relatively unsophisticated users harness it.
That’s according to Ian Buck, general manager of NVIDIA’s accelerated computing group, who shared his views in our latest AI Podcast.
Buck, who helped lay the foundation for GPU computing as a Stanford doctoral candidate, will deliver the keynote address at GTC DC on Nov. 5. His talk will give an audience inside the Beltway a software-flavored update on the status and outlook of AI.
Like the tech industry, the U.S. government is embracing deep learning. “A few years ago, there was still some skepticism, but today that’s not the case,” said Buck.
Federal planners have “gotten the message for sure. You can see from the executive orders coming out and the work of the Office of Science and Technology Policy that they are putting out mandates and putting money into budgets — it’s great to see that literally billions of dollars are being invested,” he said.
The next steps will include nurturing the wide variety of AI projects to come.
“We have the mandate and budget; now we have to help all the agencies and parts of government, down to state and local levels, take advantage of this disruptive technology in areas like predictive maintenance, traffic congestion, power-grid management and disaster relief,” Buck said.
From Computer Vision to Tougher Challenges
On the commercial horizon, users already deeply engaged in AI are moving from computer vision to tougher challenges in natural language processing. The neural network models needed to understand human speech can be hundreds of thousands of times larger than early image-recognition models, such as those that identified breeds of cats in the seminal 2012 ImageNet contest.
“Conversational AI represents a new level of complexity and a new level of opportunity with new use cases,” Buck said.
AI is definitely hard, he said. The good news is that companies like NVIDIA are bundling 80 percent of the software modules users need to get started into market-specific packages, such as Clara for healthcare and Metropolis for smart cities.
Unleashing GPUs
Software is a field close to Ian Buck’s heart. As part of his PhD work, he developed the Brook language to harness the power of GPUs for parallel computing. His efforts evolved into CUDA, the GPU programming platform at the foundation of offerings such as Clara, Metropolis and NVIDIA DRIVE software for automated vehicles.
Users “can program down at the CUDA level” or at the higher level of frameworks such as PyTorch and TensorFlow, “or go up the stack to work with our vertical market solutions,” Buck said.
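As a rough illustration of the framework-level option Buck describes, here’s a minimal PyTorch sketch that moves a toy model and a made-up batch of data onto a GPU; the layer sizes and data are invented for the example, and the same computation could instead be written as hand-tuned CUDA kernels further down the stack.

```python
import torch
import torch.nn as nn

# Use a GPU if one is available; fall back to the CPU otherwise.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy classifier -- layer sizes are arbitrary, chosen only for illustration.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
).to(device)

# A made-up batch of 32 feature vectors, placed on the same device as the model.
batch = torch.randn(32, 128, device=device)

# The forward pass runs on the GPU, with CUDA handling the parallel work underneath.
logits = model(batch)
print(logits.shape)  # torch.Size([32, 10])
```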
It’s a journey that’s just getting started.
“AI will be pervasive all the way down to the doorbell and thermostat. NVIDIA’s mission is to help enable that future,” Buck said.
To hear our full conversation with Buck and other AI luminaries, tune into our AI Podcast wherever you download your podcasts.
(You can see Buck’s keynote live by attending GTC DC. Use the promotional code GMPOD for a 20 percent discount.)
Help Make the AI Podcast Better
Have a few minutes to spare? Fill out this short listener survey. Your answers will help us make a better podcast.
How to Tune in to the AI Podcast
Get the AI Podcast through iTunes, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. Your favorite not listed here? Email us at aipodcast [at] nvidia [dot] com.