Driverless Car Creeping Closer to Your Driveway, with New Work Shown at Tokyo Motor Show

by Danny Shapiro

Pedestrian and object detection. Automatic braking. Self-parking. Today’s cars feature sophisticated driver assistance systems straight out of science fiction. But we’re still many years away from cars that drive themselves. Aren’t we?

Well, the future may be closer than you think.

Consider what’s being shown this week at the Tokyo Motor Show by Japan’s ZMP, which builds its work on NVIDIA graphics processors.

ZMP is a start-up that develops and sells R&D platforms for autonomous driving. By fusing insights from robotics with automotive engineering, the company aims to enhance safety, preserve the environment and deliver power-saving solutions for the next generation of mobility.

1/10 scale RoboCar 3

Starting out with robotics and moving on to 1/10-scale remote-control cars, ZMP developed and launched a driverless vehicle it calls “RoboCar,” a plug-in hybrid that can be driven autonomously by its onboard computers.

ZMP’s research engineer, Dr. Daniel Watman, started building the brains for RoboCar with field-programmable gate arrays – chips that can be configured by customers after they’re manufactured – but soon realized GPUs are faster and easier to develop on, so he switched to NVIDIA’s GPUs running CUDA, our parallel programming architecture.
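For readers who haven’t seen CUDA, here’s a minimal sketch of the programming model it refers to. This isn’t ZMP’s code, just a toy kernel that scales a buffer of sensor samples across thousands of GPU threads; the buffer size and gain value are placeholders.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: each GPU thread scales one sensor sample.
__global__ void scaleSamples(float* samples, float gain, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        samples[i] *= gain;
}

int main()
{
    const int n = 1 << 20;                       // 1M samples (placeholder size)
    float* d_samples;
    cudaMalloc((void**)&d_samples, n * sizeof(float));
    cudaMemset(d_samples, 0, n * sizeof(float)); // dummy data for the sketch

    // Launch enough 256-thread blocks to cover every sample.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    scaleSamples<<<blocks, threads>>>(d_samples, 2.0f, n);
    cudaDeviceSynchronize();

    printf("kernel finished: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(d_samples);
    return 0;
}
```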

The RoboCar uses cameras, lasers and radar to sense what is happening around the vehicle. The camera data is processed using the complex algorithm known as HOG – Histogram of Oriented Gradients – to detect pedestrians crossing the street. Without the powerful Kepler GPU, the video couldn’t be processed in real time, and the RoboCar would not be able to stop in time.
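To give a flavor of what that pipeline involves, here is a sketch of GPU-accelerated HOG pedestrian detection using OpenCV’s CUDA object-detection module. This is not ZMP’s detector: it assumes OpenCV was built with the cudaobjdetect module, and camera index 0 is a stand-in for the car’s forward camera.

```cpp
#include <vector>
#include <opencv2/core.hpp>
#include <opencv2/core/cuda.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/videoio.hpp>
#include <opencv2/cudaobjdetect.hpp>

int main()
{
    // Open a camera stream (device 0 is a placeholder for the forward camera).
    cv::VideoCapture cap(0);
    if (!cap.isOpened()) return 1;

    // GPU-accelerated HOG detector with the stock pedestrian SVM weights.
    cv::Ptr<cv::cuda::HOG> hog = cv::cuda::HOG::create();
    hog->setSVMDetector(hog->getDefaultPeopleDetector());

    cv::Mat frame, gray;
    cv::cuda::GpuMat gpuGray;
    std::vector<cv::Rect> pedestrians;

    while (cap.read(frame))
    {
        // The GPU HOG detector expects a single-channel image.
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        gpuGray.upload(gray);

        // Run the sliding-window HOG + SVM search on the GPU.
        hog->detectMultiScale(gpuGray, pedestrians);

        // Mark detections; a real system would hand them to the braking logic.
        for (const cv::Rect& r : pedestrians)
            cv::rectangle(frame, r, cv::Scalar(0, 255, 0), 2);
    }
    return 0;
}
```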

Collaborating with Virginia Tech and the University of Technology Sydney, ZMP also uses SLAM (Simultaneous Localization and Mapping) technology – first developed for robots – to guide the car without GPS or road signs.
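A full SLAM system is far beyond a blog snippet, but the predict-and-correct idea it relies on can be shown in one dimension: dead-reckon from odometry, then pull the estimate back toward a landmark the map already contains. Every number below – the landmark position, the noise levels, the simulated reading – is invented purely for illustration.

```cpp
#include <cstdio>

// 1-D illustration of the predict/correct cycle behind SLAM-style
// localization: dead-reckon from odometry, then correct against a landmark.
int main()
{
    double x = 0.0, var = 0.0;          // position estimate and its variance
    const double odomNoise  = 0.04;     // variance added per motion step (made up)
    const double rangeNoise = 0.01;     // variance of the landmark range sensor (made up)
    const double landmark   = 10.0;     // known landmark position on the map (made up)

    for (int step = 0; step < 5; ++step)
    {
        // Predict: move forward ~1 m by odometry; uncertainty grows.
        x   += 1.0;
        var += odomNoise;

        // Correct: a simulated range reading to the landmark.
        double measuredRange  = landmark - (step + 1) * 1.0 + 0.1;  // pretend sensor value
        double predictedRange = landmark - x;
        double k = var / (var + rangeNoise);                        // Kalman gain
        x   += k * (predictedRange - measuredRange);
        var *= (1.0 - k);

        printf("step %d: x = %.2f (var %.3f)\n", step, x, var);
    }
    return 0;
}
```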

Vehicle data is also collected via the car’s CAN (controller area network) bus and monitored through the cloud on a laptop PC or tablet – aiding analysis of the car’s behavior.
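On Linux, tapping a CAN bus for that kind of telemetry typically looks something like this SocketCAN sketch. The interface name “can0” is an assumption, and a real logger would timestamp each frame and forward it to the cloud rather than print it.

```cpp
// Minimal SocketCAN reader: pull raw frames off the vehicle bus so they
// can be logged or forwarded to a dashboard. "can0" is a placeholder.
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int main(void)
{
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
    if (s < 0) { perror("socket"); return 1; }

    struct ifreq ifr;
    strcpy(ifr.ifr_name, "can0");                // CAN interface name (assumption)
    ioctl(s, SIOCGIFINDEX, &ifr);

    struct sockaddr_can addr = {0};
    addr.can_family  = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    if (bind(s, (struct sockaddr*)&addr, sizeof(addr)) < 0) { perror("bind"); return 1; }

    struct can_frame frame;
    while (read(s, &frame, sizeof(frame)) == sizeof(frame))
    {
        // Print ID and payload; a real system would upload these for analysis.
        printf("ID 0x%03X  dlc %d  data:", frame.can_id, frame.can_dlc);
        for (int i = 0; i < frame.can_dlc; ++i)
            printf(" %02X", frame.data[i]);
        printf("\n");
    }
    close(s);
    return 0;
}
```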

No driver, only passengers.

There’s more coming. Watman is using the latest GPU feature, “dynamic parallelism,” to enhance his system. Look for his demonstration at the Tokyo Motor Show. Just don’t expect to spend any time behind the wheel.
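We don’t know the details of Watman’s implementation, but in general dynamic parallelism lets a kernel running on a Kepler-class GPU launch further kernels itself, without a round trip to the CPU. Here is a toy sketch; the segment layout is invented, and it builds with nvcc -arch=sm_35 -rdc=true.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Child kernel: process one variable-sized segment of data.
__global__ void processSegment(float* data, int offset, int count)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < count)
        data[offset + i] *= 2.0f;   // stand-in for real per-element work
}

// Parent kernel: each thread inspects one segment and, from the GPU itself,
// launches a child grid sized to that segment. No CPU involvement needed.
__global__ void processAllSegments(float* data, const int* offsets,
                                   const int* counts, int numSegments)
{
    int s = blockIdx.x * blockDim.x + threadIdx.x;
    if (s < numSegments)
    {
        int threads = 128;
        int blocks  = (counts[s] + threads - 1) / threads;
        processSegment<<<blocks, threads>>>(data, offsets[s], counts[s]);
    }
}

int main()
{
    // Toy layout: 3 segments of different sizes packed into one array.
    const int offsets[] = {0, 100, 1100};
    const int counts[]  = {100, 1000, 50};
    const int total = 1150, numSegments = 3;

    float* d_data;  int* d_offsets;  int* d_counts;
    cudaMalloc((void**)&d_data, total * sizeof(float));
    cudaMalloc((void**)&d_offsets, sizeof(offsets));
    cudaMalloc((void**)&d_counts, sizeof(counts));
    cudaMemset(d_data, 0, total * sizeof(float));
    cudaMemcpy(d_offsets, offsets, sizeof(offsets), cudaMemcpyHostToDevice);
    cudaMemcpy(d_counts, counts, sizeof(counts), cudaMemcpyHostToDevice);

    processAllSegments<<<1, numSegments>>>(d_data, d_offsets, d_counts, numSegments);
    cudaDeviceSynchronize();
    printf("done: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaFree(d_data); cudaFree(d_offsets); cudaFree(d_counts);
    return 0;
}
```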

The Tokyo Motor Show runs from Nov. 22 to Dec. 1 at Tokyo Big Sight in Koto-ku, Tokyo.