Whether they drive themselves or improve the safety of their driver, tomorrow’s vehicles will be defined by software. Much of that software, however, won’t be written line by line by developers; it will be learned from data.
To prepare for that future, the transportation industry is integrating AI car computers into cars, trucks and shuttles and training them using deep learning in the data center. A benefit of such a software-defined system is that it’s capable of handling a wide range of automated driving — from Level 2 to Level 5.
Data Is the Source Code
Speaking in Tokyo at the last stop on NVIDIA’s seven-city GPU Technology Conference world tour, NVIDIA founder and CEO Jensen Huang demonstrated how the NVIDIA DRIVE platform provides this scalable architecture for autonomous driving.
“The future is surely a software-defined car,” said Huang. “It will be a functionally safe operating system, with incredible algorithms and all kinds of applications built on top.”
“Every car company needs a scalable system they can leverage as the requirements grow,” he continued, as he showcased the work of NVIDIA R&D teams in teaching our own test vehicles to drive themselves — incorporating everything from sensor fusion, to AI and parallel computing, to recommending or taking action.
Powering the next-generation NVIDIA DRIVE platform is Xavier, the world’s most complex system on a chip. “Xavier is coming out of the fab and we can’t wait to put it in the hands of car companies and roboticists all over the world,” Huang said.
With Xavier, applications will process sensor data from outside and within the car, using deep neural networks to deliver applications built on 360-degree surround perception coupled with eye tracking, gesture recognition and natural language understanding.
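As a rough illustration of what 360-degree surround perception involves, the sketch below fuses object detections from several cameras into a single vehicle-centric view. The camera names, mounting angles and the `Detection` fields are assumptions made for this example; they are not taken from NVIDIA’s APIs.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    bearing_deg: float   # angle of the object in the camera's own frame
    range_m: float       # estimated distance to the object, in metres

# Assumed mounting yaw of each camera relative to the vehicle's nose
# (hypothetical rig, for illustration only).
CAMERA_YAW_DEG = {"front": 0.0, "left": 90.0, "rear": 180.0, "right": 270.0}

def to_vehicle_frame(camera: str, det: Detection) -> tuple[float, float]:
    """Convert one camera detection into vehicle-frame (x, y) metres."""
    yaw = math.radians(CAMERA_YAW_DEG[camera] + det.bearing_deg)
    return (det.range_m * math.cos(yaw), det.range_m * math.sin(yaw))

def surround_view(per_camera: dict[str, list[Detection]]) -> list[tuple[float, float]]:
    """Merge every camera's detections into one 360-degree object list."""
    return [to_vehicle_frame(cam, det)
            for cam, dets in per_camera.items()
            for det in dets]
```

In a real system this geometric merge would sit downstream of per-camera deep neural networks and would also deduplicate objects seen by overlapping cameras; the sketch only shows the coordinate unification step.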
“AI is changing the way we interact with our cars,” Huang added. “It’s not only the experience outside of the car that will be revolutionized, but how we enjoy the car and how we interact with it will be transformed.”
With the NVIDIA DRIVE IX software development kit for intelligent experience, equipped vehicles will be able to notify the driver of potential safety hazards outside the vehicle. Inside the vehicle, these cars could detect driver drowsiness and distraction, providing the appropriate safety alerts.
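One common way to turn eye tracking into a drowsiness signal is PERCLOS, the fraction of recent frames in which the driver’s eyes are closed. The sketch below is a minimal, hypothetical version of that idea; the window size, thresholds and class name are assumptions for illustration, not values or interfaces from the DRIVE IX SDK.

```python
from collections import deque

class DrowsinessMonitor:
    """Flags drowsiness when the fraction of recent frames with closed
    eyes (PERCLOS) exceeds a threshold over a sliding window."""

    def __init__(self, window_frames: int = 30, perclos_threshold: float = 0.4):
        self.frames = deque(maxlen=window_frames)   # True = eyes closed
        self.threshold = perclos_threshold

    def update(self, eye_openness: float) -> bool:
        """eye_openness: 0.0 (closed) to 1.0 (open), e.g. from a DNN
        watching the driver. Returns True when an alert should fire."""
        self.frames.append(eye_openness < 0.2)      # assumed "closed" cutoff
        if len(self.frames) < self.frames.maxlen:
            return False                            # not enough history yet
        return sum(self.frames) / len(self.frames) > self.threshold
```

A production system would combine a signal like this with head pose and gaze direction to also catch distraction, as the article describes.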
On the highway, DRIVE Xavier can take over driving as a co-pilot, using its full surround perception to enable adaptive cruise control, lane keeping and automatic lane changing. And with over-the-air software updates via Wi-Fi or cellular connections, the system’s capabilities can be expanded throughout the life of the vehicle.
“The car is not just an autopilot, but uses AI as a co-pilot to assist you and keep you safe,” Huang said.
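The adaptive cruise control mentioned above boils down to keeping a safe time gap to the vehicle ahead. The toy controller below shows that core idea; the gain, the two-second target gap and the function name are made-up values for this sketch, not anything from the DRIVE platform.

```python
def acc_speed_command(ego_speed: float, gap_m: float, lead_speed: float,
                      desired_gap_s: float = 2.0, gain: float = 0.5) -> float:
    """Toy adaptive-cruise-control law: command a speed (m/s) that
    nudges the actual gap toward a desired time gap behind the lead car.

    ego_speed:  our current speed (m/s)
    gap_m:      measured distance to the lead vehicle (m)
    lead_speed: lead vehicle's speed (m/s)
    """
    desired_gap_m = desired_gap_s * ego_speed      # 2-second rule, assumed
    gap_error = gap_m - desired_gap_m              # positive: too far back
    return max(0.0, lead_speed + gain * gap_error) # never command reverse
```

For example, at 25 m/s with exactly the desired 50 m gap, the command simply matches the lead car’s speed; if the gap shrinks to 40 m, the commanded speed drops to open it back up. A real co-pilot stack would layer this on top of the surround perception and prediction described earlier.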
DRIVE Partners at GTC Japan
The NVIDIA DRIVE platform is only as powerful as the partners who use it in their vehicles, and several were on display at GTC Japan.
During the event, Pioneer — one of the world’s top automotive electronics manufacturers — announced a collaboration with NVIDIA to combine its 3D-LiDAR sensors with the DRIVE platform. Pioneer recently started supplying samples of the 3D-LiDAR to car manufacturers, ICT-related companies, and others in Japan and around the world.
Tier IV showed off its golf cart-based electric vehicle, called “Milee,” which relies on NVIDIA DRIVE for fully autonomous driving in last-mile scenarios. Milee has a full range of autonomous driving functionality, including a 3D-mapping positioning system with laser scanners, object detection, decision-making algorithms and route mapping. With a top speed of 20 km/h, Milee is designed for short trips.
Making its Japanese debut, our research car, BB8, was at GTC to demonstrate the building blocks of autonomous driving. Event attendees had a chance to see the various sensors that enable self-driving, and to learn how artificial intelligence is revolutionizing transportation.
To learn more, join us for GTC 2018 Silicon Valley in March.