The number of white knuckles on Silicon Valley highways may be on its way down.
This month, an NVIDIA autonomous test vehicle, nicknamed BB8, completed an 80-kilometer (approximately 50-mile) route near company headquarters in Santa Clara, Calif., without the safety driver ever taking control.
Running on the NVIDIA DRIVE AGX Pegasus AI supercomputer, the car handled highway on- and off-ramps and numerous lane changes entirely on its own, marking a significant step toward a self-driving future.
With industry-leading compute performance, a single DRIVE AGX Pegasus platform ran both NVIDIA DRIVE AV software, which handles autonomous driving functions like object detection and 360-degree surround perception, and the DRIVE IX intelligent experience software stack, which processes voice commands and monitors the driver to ensure they are paying attention.
This revolutionary drive was no science experiment — it was completed using entirely shippable parts. The sensors, embedded DRIVE AGX Pegasus platform and software are all available to autonomous vehicle developers.
“This is not a demo, this is something you can get right now,” said NVIDIA CEO Jensen Huang at GTC Europe Wednesday.
NVIDIA always conducts public road tests with two trained drivers in the vehicle. One supervises the environment from the driver’s seat, while another in the passenger seat observes both the driver and the environment. In addition, this test drive deployed DRIVE IX, which uses an infrared camera to monitor where the driver is looking at all times, as well as their attentiveness and fatigue.
Two Software Stacks, One Platform
In the past decade of development, self-driving test vehicles have relied on bulky PCs and servers wired up in the trunk to run the various deep learning and path planning algorithms for autonomous driving. However, to deploy production-level self-driving cars, autonomous driving hardware must have a significantly smaller footprint and consume much less energy.
At the size of a laptop computer, DRIVE AGX Pegasus is an incredibly energy-efficient hardware solution, capable of 320 trillion operations per second (TOPS) of performance. The supercomputer is able to run multiple deep neural networks at once for autonomous driving, including those for DRIVE AV and DRIVE IX.
This parallel processing capability means the vehicle is able to run deep neural networks for perception — like those found in DRIVE AV — while also running the algorithms for natural language processing to enable voice commands in DRIVE IX.
So, passengers may ask, “What’s the weather in San Francisco?” as their car operates in autonomous mode, with the same computer running both tasks simultaneously.
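The pattern described above — one computer servicing a continuous perception loop and an on-demand voice request at the same time — can be sketched with ordinary concurrency primitives. This is a minimal illustration only: the function names and messages are hypothetical stand-ins, and real DRIVE AV/DRIVE IX workloads run deep neural networks on the GPU rather than toy Python threads.

```python
import threading
import queue
import time

results = queue.Queue()
stop = threading.Event()

def perception_loop(stop, results):
    """Stand-in for a continuous perception workload (as in DRIVE AV)."""
    frame = 0
    while not stop.is_set():
        # Pretend to run lane/object-detection networks on a camera frame.
        results.put(("perception", f"frame {frame}: lanes and objects detected"))
        frame += 1
        time.sleep(0.01)

def voice_command(text, results):
    """Stand-in for an on-demand language workload (as in DRIVE IX)."""
    if "weather" in text.lower():
        results.put(("voice", "Fetching the weather for San Francisco..."))

# Start perception, then handle a passenger request while it runs.
t = threading.Thread(target=perception_loop, args=(stop, results))
t.start()
voice_command("What's the weather in San Francisco?", results)
time.sleep(0.05)
stop.set()
t.join()

# Both kinds of work produced output during the same interval.
kinds = {kind for kind, _ in list(results.queue)}
```

The point of the sketch is only the shape of the workload: the perception loop never pauses for the voice request, which is what running both stacks on a single platform requires.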
Way Beyond Cruise Control
With congested highways, frequent entrances and exits, and numerous construction sites, San Francisco Bay Area traffic is no Sunday drive. However, with a robust hardware and software suite, the NVIDIA BB8 vehicle handled the infamous California traffic with ease.
By using deep neural networks to identify lanes, other vehicles and drivable space, the autonomous test car slowed when cars merged from an on-ramp into its lane, then sped back up to the pace of traffic once the merging car was at a safe distance.
BB8 also seamlessly executed lane changes, safely moving between lanes once it detected ample space.
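The slow-for-a-merging-car, resume-at-a-safe-gap behavior described above can be sketched as a simple gap-based speed rule. To be clear, the thresholds, units and function below are illustrative assumptions for this article, not NVIDIA's actual planner logic or parameters.

```python
# Illustrative gap-keeping sketch (assumed values, not DRIVE AV internals).
SAFE_GAP_M = 40.0      # assumed safe following distance, in meters
TRAFFIC_SPEED = 28.0   # assumed pace of surrounding traffic, in m/s

def target_speed(gap_to_lead_m, lead_speed):
    """Pick a target speed from the gap to the vehicle ahead."""
    if gap_to_lead_m < SAFE_GAP_M:
        # A merging car has closed the gap: slow in proportion to
        # how much of the safe gap remains, never below zero.
        return lead_speed * max(gap_to_lead_m / SAFE_GAP_M, 0.0)
    # Gap is safe again: resume the pace of traffic.
    return TRAFFIC_SPEED

# A car merges 20 m ahead at 22 m/s, then the gap opens to 50 m.
slowed = target_speed(20.0, 22.0)   # 11.0 — below the lead's speed
resumed = target_speed(50.0, 22.0)  # 28.0 — back to traffic speed
```

The design choice the sketch captures is that speed depends on the measured gap rather than on a fixed schedule, which is why the car can both yield to a merging vehicle and recover the pace of traffic without driver input.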
The ability to navigate these complex and unpredictable traffic scenarios without intervention from a human driver is a major milestone in the journey to full autonomy, where roads will be safer, more efficient and open.