When it comes to creating safer roads, there’s no need to wait for vehicles that can drive themselves. AI has the potential to improve every new car today.
At CES 2019 in Las Vegas, NVIDIA introduced DRIVE AutoPilot, a Level 2+ automated driving solution that uniquely provides both world-class autonomous driving perception and an intelligent cockpit.
We built the advanced platform on the DRIVE AGX Xavier system-on-a-chip (SoC) and DRIVE Software, integrating for the first time our DRIVE AV autonomous driving and DRIVE IX intelligent experience capabilities.
DRIVE AutoPilot is part of the open, flexible NVIDIA DRIVE platform, which is being used by hundreds of companies worldwide to build autonomous vehicle solutions that increase road safety while reducing driver fatigue and stress on long drives or in stop-and-go traffic.
The new Level 2+ system complements the NVIDIA DRIVE AGX Pegasus system that provides Level 5 capabilities for robotaxis, bringing advanced AI safety features to the roads sooner.
Harnessing Compute for a Safer Driver
Most of today’s advanced driver assistance systems rely on the lower-level processing of standard automotive electronic control units. These are capable of powering features like automatic emergency braking — but they can’t catch every possible braking situation. Nor can automakers update them within the vehicle.
With NVIDIA DRIVE AGX Xavier, tier 1 suppliers and automakers can put 30 trillion operations per second of compute to work. Yet it draws just 30 watts of power, half that of a typical incandescent light bulb.
This combination of performance and efficiency makes possible a much wider range of active safety features. It allows your car to run deep neural networks in parallel for surround perception, identifying and reacting to a variety of hazards all around the vehicle.
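To make the parallel-perception idea concrete, here is a minimal sketch that runs several perception tasks concurrently over frames from surround cameras. The detector functions, camera names and values are hypothetical stand-ins, not the actual DRIVE networks.

```python
# Minimal sketch of parallel surround perception, using hypothetical stub
# detectors in place of real DRIVE networks. Illustrative only.
from concurrent.futures import ThreadPoolExecutor

CAMERAS = ["front", "rear", "left", "right"]  # hypothetical camera layout

def detect_obstacles(frame):          # stand-in for an obstacle-detection DNN
    return [{"type": "vehicle", "camera": frame["camera"], "distance_m": 42.0}]

def detect_signs(frame):              # stand-in for a sign/traffic-light DNN
    return [{"type": "speed_limit_60", "camera": frame["camera"]}]

def grab_frame(camera):               # stand-in for a camera capture call
    return {"camera": camera, "pixels": None}

def perceive_surroundings():
    """Run every detector on every camera frame in parallel, then merge results."""
    frames = [grab_frame(cam) for cam in CAMERAS]
    detectors = [detect_obstacles, detect_signs]
    hazards = []
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(d, f) for f in frames for d in detectors]
        for future in futures:
            hazards.extend(future.result())
    return hazards

if __name__ == "__main__":
    for hazard in perceive_surroundings():
        print(hazard)
```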
DRIVE AGX Xavier runs NVIDIA DRIVE Software for object detection and traffic light and sign recognition. Because the platform is open, developers can choose to use all or parts of the DRIVE Software stack. The software is also continually updated over the air.
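As a rough illustration of using only parts of a stack, the snippet below assembles a perception pipeline from a simple feature configuration. The module names and flags are invented for illustration and are not the DRIVE Software API.

```python
# Hypothetical example of enabling only selected perception modules;
# module names are illustrative, not actual DRIVE Software components.
ENABLED_MODULES = {
    "object_detection": True,
    "traffic_light_recognition": True,
    "sign_recognition": False,   # e.g. a supplier ships its own sign module
}

def build_pipeline(config):
    """Return the list of module names the vehicle will actually run."""
    return [name for name, enabled in config.items() if enabled]

print(build_pipeline(ENABLED_MODULES))
# ['object_detection', 'traffic_light_recognition']
```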
DRIVE Software also includes DRIVE IX. This intelligent experience platform enables both driver monitoring and in-vehicle visualization. With intelligent driver monitoring, the system can track whether the driver's attention is on the road, and take action if the driver is drowsy or distracted.
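As a hedged sketch of how drowsiness monitoring is commonly done (not necessarily how DRIVE IX does it), the example below tracks how often the driver's eyes are closed over a sliding window of frames and triggers an alert past a threshold. The thresholds, class and function names are hypothetical.

```python
# Simplified drowsiness check in the spirit of PERCLOS-style monitoring;
# thresholds and the per-frame eye-state signal are hypothetical values.
from collections import deque

WINDOW = 60          # last 60 samples (e.g. ~2 seconds at 30 fps)
DROWSY_RATIO = 0.4   # eyes closed in 40% of recent frames => drowsy

class DriverMonitor:
    def __init__(self):
        self.eye_closed_history = deque(maxlen=WINDOW)

    def update(self, eyes_closed: bool) -> str:
        """Feed one per-frame eye-state estimate and return an action."""
        self.eye_closed_history.append(eyes_closed)
        closed_ratio = sum(self.eye_closed_history) / len(self.eye_closed_history)
        if closed_ratio > DROWSY_RATIO:
            return "alert_driver"      # e.g. chime or seat vibration
        return "ok"

monitor = DriverMonitor()
for frame_eyes_closed in [False, False, True, True, True]:
    print(monitor.update(frame_eyes_closed))
```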
For vehicle positioning, DRIVE AutoPilot also offers a new personal mapping feature called “My Route,” which remembers where you have driven and can create a self-driving route even if no HD map is available. This will enable point-to-point automated driving.
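A minimal sketch of the "remember where you drove" idea, assuming a simple GPS breadcrumb recorder looked up by route name: record points while the human drives, then hand the stored trace to a planner on a later trip. The storage format and API are invented for illustration, not the DRIVE AutoPilot implementation.

```python
# Hypothetical personal-route memory: record GPS breadcrumbs on a manual
# drive, then retrieve them later as waypoints for an automated run.
import json

class RouteMemory:
    def __init__(self):
        self.routes = {}          # route name -> list of (lat, lon) points

    def record_point(self, name, lat, lon):
        """Append one GPS fix to the named route while the human drives."""
        self.routes.setdefault(name, []).append((lat, lon))

    def waypoints(self, name):
        """Return the stored trace to feed a planner on a later trip."""
        return self.routes.get(name, [])

    def save(self, path):
        """Persist recorded routes between trips."""
        with open(path, "w") as f:
            json.dump(self.routes, f)

memory = RouteMemory()
memory.record_point("home_to_office", 37.3861, -122.0839)
memory.record_point("home_to_office", 37.3875, -122.0820)
print(memory.waypoints("home_to_office"))
```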
A Single Architecture
Powering AI-infused automated driving with DRIVE AutoPilot also makes it easier to advance to higher levels of autonomy. DRIVE AGX Xavier can be upgraded to DRIVE AGX Pegasus, capable of 320 trillion operations per second, with few architectural changes.
NVIDIA DRIVE ecosystem partners are already using DRIVE to develop such systems.
Kicking off CES this week was the unveiling of ZF’s ProAI scalable autonomous driving solution, which offers a unique modular hardware concept and open software architecture, and is set for production this year. Built on NVIDIA DRIVE Xavier processors and DRIVE Software, the platform ranges from Level 2+ to Level 5, with the newly announced ProAI Robothink based on DRIVE AGX Pegasus.
Continental also announced a production-level automated driving architecture that will bridge from Premium Assist to future automated functionalities. The Level 2+ system uses Continental’s portfolio of radar, lidar, camera and Automated Driving Control Unit (ADCU) technology, powered by NVIDIA DRIVE, and will go into production in 2020.
And in October, Volvo Cars announced it’s developing Level 2+ systems on DRIVE AGX Xavier for vehicles slated for production in the early 2020s.
Partners like these are using DRIVE AutoPilot to make cars safer now, and autonomous tomorrow. That makes building AI into vehicles a journey well worth starting today.
If you’re at CES, stop by for a live demonstration at NVIDIA booth 6306.