AI Devices that Walk, Roll and Fly — and Tacos — Draw Developers to NVIDIA HQ

At an event that spanned everything from robots to the software and semiconductors that power them, our team in charge of autonomous machines on Thursday hosted the first public meetup at NVIDIA’s new headquarters building.

The evening event drew hundreds of developers to the heart of Silicon Valley to nosh on tacos, network and listen to talks on NVIDIA’s Jetson embedded computing platform, our new TensorRT 3 inference software and the Redtail AI framework for autonomous mobile robotics.

NVIDIA’s Jesse Clayton explained how AI at the edge will be key to building autonomous machines that can tackle a vast range of challenges.

The gathering comes amidst an AI boom — unleashed by the parallel processing power of NVIDIA GPUs. Welcoming developers to the event, NVIDIA’s Jesse Clayton explained how this revolution has given vast numbers of people access to image recognition, voice recognition and real-time translation services that can go beyond what humans are capable of.

The next phase: bringing the power of NVIDIA’s parallel computing platform to devices with a limited power budget. That’s where Jetson comes in. “Deep learning is very computationally intensive; it’s a parallel computing problem, that’s why researchers turn to GPUs,” Clayton said.
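Clayton’s point that deep learning is “a parallel computing problem” comes down to the structure of the underlying math: in a neural network layer, each output is an independent dot product of weights and inputs, so a GPU can compute thousands of them simultaneously. A minimal sketch in plain Python (the layer sizes and numbers here are illustrative, not from the event):

```python
# A tiny dense layer: y[i] = sum_j W[i][j] * x[j]
# Each output y[i] depends only on the weights and the input --
# never on the other outputs -- which is why a GPU can compute
# all of them at once instead of one at a time.
W = [[0.1, -0.2, 0.3],
     [0.5,  0.0, -0.1]]  # weights: 2 outputs, 3 inputs
x = [1.0, 2.0, 3.0]      # input vector

def dense(W, x):
    # one independent dot product per output neuron
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

y = dense(W, x)
# y[0] = 0.1*1.0 - 0.2*2.0 + 0.3*3.0 = 0.6
# y[1] = 0.5*1.0 + 0.0*2.0 - 0.1*3.0 = 0.2
```

At data-center scale these layers have millions of weights, and the same independence is what lets the GPU cores on a power-constrained device like Jetson keep up.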

Spreading parallel computing power to what technologists call the “edge” — or the world outside of data centers and desktops — is key to building devices with the intelligence to tackle last-mile delivery challenges, manage crops to feed a hungry planet, inspect infrastructure such as roads and bridges, and even perform surgery, Clayton explained.

Slightech’s Mynt Sdeno was one of the meetup’s highlights.

The attendees at Thursday’s gathering were eager to get started. Alvise Memo, a software engineer at Palo Alto-based startup Aquifi, explained he already relies on NVIDIA GPUs for machine learning — and is eager to put that knowledge to work on Jetson.

A few highlights from the meetup:

  • A robot that understands you — The Mynt Sdeno Robot is more than just a machine that can find its way around. The curvy, diminutive robot relies on the Jetson TX1 embedded processor — which boasts 256 GPU cores — and offers processing power equivalent to 32,000 PCs running 1997-era Pentium Pros. That turns it into a kind of smart mobile assistant that can recognize your voice, follow you around and not just respond to your commands, but recognize your moods.
  • A rough and ready autonomous rover — Aion Robotics’ ArduRover R1 is both tough and tiny, with knobby 6-inch wheels and a heavy-gauge aluminum chassis. Inside, this rover takes advantage of our Jetson platform — and four powerful DC motors — to tackle a wide range of outdoor autonomous missions.
  • Software that helps drones find their way through forests — Our NVIDIA Redtail project includes deep neural networks, computer vision and control code, hardware instructions and more so users can build a drone or a ground vehicle that can autonomously navigate through challenging environments ranging from forest trails to urban sidewalks.

For more details, see our NVIDIA embedded computing page, and sign up for our Jetson Developer Challenge.
