You’re driving home from a long day at work. Traffic has been stop and go, and you feel as if you’ve been in the car for hours. Your eyelids get heavier, your mind starts to wander…
This scenario is all too common. Driver distraction and fatigue cause nearly 400,000 accidents in the U.S. each year, according to the National Highway Traffic Safety Administration. A driver who takes their eyes off the road for just two seconds while traveling 65 miles per hour covers about 200 feet without seeing what lies ahead or around them.
Human drivers are imperfect and can make dangerous mistakes when their attention strays from the road. Incorporating AI into the vehicle cockpit can add a robust layer of safety, helping ensure drivers stay focused and taking action when they aren’t.
A trusted travel companion
NVIDIA DRIVE IX is our intelligent experience software stack that gives car makers the flexibility to develop in-vehicle AI in a variety of ways. The system relies on a driver-facing camera and sophisticated deep learning software running on the NVIDIA DRIVE platform in the car.
One such AI-enabled application built on the DRIVE IX system is driver monitoring. Just as deep learning algorithms are trained on driving data to operate a vehicle, algorithms built on DRIVE IX can be trained to identify certain behaviors and infer whether a driver is paying attention. For example, the system can track the driver’s head and eyes to understand where their attention is directed, and monitor blink frequency to assess fatigue and drowsiness. Depending on manufacturer preferences, it can alert the driver with audio, visual or haptic warnings to return their focus to the road.
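To make the idea concrete, this kind of attention logic might look something like the sketch below. It assumes hypothetical per-frame eye-closure and gaze estimates coming from an upstream vision model; the class and field names are invented for illustration and are not part of the DRIVE IX software.

```python
from collections import deque
from dataclasses import dataclass

# Hypothetical per-frame output from a driver-facing vision model.
# These fields are illustrative only, not an actual DRIVE IX interface.
@dataclass
class DriverFrame:
    eyes_closed: bool     # eyelids below an openness threshold this frame
    gaze_on_road: bool    # estimated gaze direction falls on the roadway

class AttentionMonitor:
    """Tracks recent frames and flags fatigue or distraction."""

    def __init__(self, window_frames=900, closed_limit=0.3, off_road_limit=0.5):
        # ~30 seconds of history at 30 frames per second
        self.history = deque(maxlen=window_frames)
        self.closed_limit = closed_limit        # fraction of frames with eyes closed
        self.off_road_limit = off_road_limit    # fraction of frames with gaze off road

    def update(self, frame: DriverFrame) -> list[str]:
        self.history.append(frame)
        n = len(self.history)
        closed_frac = sum(f.eyes_closed for f in self.history) / n
        off_road_frac = sum(not f.gaze_on_road for f in self.history) / n

        alerts = []
        if closed_frac > self.closed_limit:
            alerts.append("fatigue")       # trigger audio/visual/haptic warning
        if off_road_frac > self.off_road_limit:
            alerts.append("distraction")
        return alerts
```

The thresholds here merely stand in for whatever an automaker would tune; the point is that fatigue and distraction are judged over a window of recent frames rather than any single one.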
DRIVE IX can also monitor the environment outside the vehicle, and determine whether the driver is aware of oncoming obstacles. If a driver is about to exit the vehicle without looking as a bicyclist approaches alongside, DRIVE IX can intervene, alerting the driver or preventing the door from opening until the bicyclist has passed.
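As a rough illustration of that kind of rule, the snippet below combines three inputs: whether the door handle has been pulled, whether the driver glanced toward the approaching object, and how soon the object will pass. All names and thresholds are invented for this sketch; exterior perception and cabin monitoring would supply the real signals.

```python
def safe_to_open_door(door_handle_pulled: bool,
                      driver_checked_mirror: bool,
                      approaching_eta_s: float = float("inf")) -> bool:
    """Illustrative exit-warning rule: hold the door if something will pass
    alongside within a few seconds and the driver has not glanced toward it."""
    if not door_handle_pulled:
        return True                   # nothing to decide yet
    if approaching_eta_s > 3.0:       # nothing closing in soon
        return True
    return driver_checked_mirror      # allow only if the driver looked

# Example: a cyclist will pass in 1.5 seconds and the driver hasn't looked.
print(safe_to_open_door(True, False, 1.5))   # -> False, keep the door held
```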
In addition to reading the driver’s facial expressions and body language, DRIVE IX can apply deep learning to decipher the intentions of those outside the car, like a pedestrian on the corner or a person gesturing. This capability helps the system keep tabs on the environment surrounding the vehicle and act as a guardian angel if the driver misses important safety cues.
DRIVE IX extends AI capability beyond the driver to other occupants of the vehicle. It can detect individual passengers in a vehicle, allowing riders to use voice commands for actions like temperature control or rolling down the window, and the system will know for whom to perform the action. Passenger detection also enables DRIVE IX to alert the vehicle owner if a child or pet has been accidentally left in the back seat.
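A simple way to picture per-passenger commands is a routing step that pairs the recognized speech with the seat the speaker occupies, as in this hypothetical sketch. The seat labels, commands and responses are invented for illustration.

```python
# Hypothetical seat map produced by in-cabin occupant detection.
occupants = {"front_left": "driver", "rear_right": "rear passenger"}

def route_command(speaker_seat: str, command: str) -> str:
    """Apply a spoken cabin command to the seat of the person who said it."""
    if speaker_seat not in occupants:
        return "No occupant detected at that seat."
    if command == "roll down the window":
        return f"Lowering the {speaker_seat} window."
    if command == "turn up the heat":
        return f"Warming the {speaker_seat} climate zone."
    return f"Command not recognized: {command}"

print(route_command("rear_right", "roll down the window"))
# -> Lowering the rear_right window.
```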
AI at your service
Once autonomous vehicles and robotaxis hit the road, cars will be built with an emphasis on the passengers rather than the driver. Besides safety, AI assistants enabled by DRIVE IX can provide convenience features for every passenger.
The same deep learning algorithms that detect distracted drivers can also read body language to tell if a rider will need a cupholder after taking a sip of a drink, or alert them if they’ve forgotten a personal item in the car. Passengers’ gaze, gestures and speech will become the primary cabin controls as riders lean back in their seats and move away from buttons and knobs.