Our phones, homes and televisions have all had an IQ upgrade in the past decade. Now, it’s time for the way we travel to get smart.
At the Smart Mobility Conference in Tel Aviv this week, members of the NVIDIA DRIVE ecosystem showed the latest in autonomous vehicle development as the industry moves toward production. Companies also unveiled new designs for traffic management, using connected technologies to improve safety and efficiency for both human-driven and autonomous vehicles.
The week kicked off with Blue White Robotics (BWR), an autonomous vehicle testing and certification company, detailing its collaboration with NVIDIA on a self-driving car test bed.
BWR will use the NVIDIA DRIVE AGX platform — which includes an automotive-grade AI computer and a full software stack with a suite of deep neural networks — to allow for faster integration, testing and assessment of all levels of autonomous capabilities.
And there was even more intelligence on display from the NVIDIA ecosystem on the SMC show floor.
Taking the Self-Driving Future for a Spin
With high-performance compute, the auto industry can incorporate AI into every level of development, from training and testing to fully driverless operation.
Cognata, an Israel-based simulation company that is part of the DRIVE Constellation open platform, provides detailed traffic and scenario models using data captured by traffic cameras around the world.
At SMC, the company rolled out its latest offering, Synthetic Datasets, to improve accuracy, diversity and scale in simulation testing and validation. Cognata’s datasets aim to streamline the training and testing process by providing high-quality, photorealistic synthetic data paired with fully labeled and annotated ground truth data.
When driving in the real world, autonomous vehicles can communicate with their environment using connected technology.
Autotalks delivers an advanced, secure and global vehicle-to-everything (V2X) communication solution for autonomous vehicles. Supported by the NVIDIA DRIVE platform, it complements the information coming from other sensors, especially in rough weather or poor lighting conditions.
Teleoperation company Ottopia provides a remote control platform for autonomous vehicles. This over-the-air connectivity makes it possible for human operators to safely navigate cars that encounter difficult scenarios, like construction zones. The company is integrating its advanced teleoperation platform with NVIDIA DRIVE AGX Pegasus.
When driving without a human, autonomous vehicles must be able to detect the world around them in real time. Self-driving software company VayaVision has developed a highly accurate environmental perception model, applying deep learning and computer vision algorithms to vehicle sensor data. VayaVision is leveraging NVIDIA DRIVE for low-latency processing and sensor fusion.
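VayaVision's actual perception stack isn't public, but the core idea of sensor fusion can be illustrated with a minimal, hypothetical sketch: combining two noisy estimates of the same quantity (say, distance to an obstacle from a camera and from a lidar) by inverse-variance weighting, so the less noisy sensor dominates. The sensor names and noise figures below are illustrative only.

```python
# Toy sensor fusion: combine two independent Gaussian estimates of the
# same distance by weighting each with the inverse of its variance.
# This is a generic technique, not VayaVision's proprietary model.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Fuse two independent Gaussian estimates of one quantity."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # always below either input variance
    return fused, fused_var

# The low-noise lidar reading pulls the fused estimate toward itself.
camera_dist, camera_var = 21.0, 4.0   # metres, metres^2 (illustrative)
lidar_dist, lidar_var = 20.0, 0.25
dist, var = fuse(camera_dist, camera_var, lidar_dist, lidar_var)
```

The fused variance is smaller than either sensor's alone, which is the practical payoff of fusing complementary sensors rather than trusting any single one.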
Traffic Jams No More
Connected technology doesn't just help autonomous vehicles navigate; it can also be integrated into road infrastructure to optimize traffic flow.
As part of the NVIDIA ecosystem, NoTraffic and ITC have developed AI platforms that use data from connected vehicles and sensors to optimize traffic at intersections. By analyzing road users and recognizing traffic patterns, the platforms can adjust traffic light timing to improve the flow of vehicles.
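The platforms' actual optimization logic isn't disclosed, but the basic idea of demand-responsive signal timing can be sketched with a toy example: split a fixed signal cycle's green time across approaches in proportion to detected queue lengths, while guaranteeing each approach a minimum green. The function name, cycle length and queue counts below are hypothetical.

```python
# Toy adaptive signal timing: allocate a fixed cycle's green time in
# proportion to detected queue lengths, with a guaranteed minimum per
# approach. Real platforms such as NoTraffic's are far more
# sophisticated; this only illustrates the concept.

def allocate_green(queues: dict, cycle_s: float = 90.0,
                   min_green_s: float = 10.0) -> dict:
    """Return green seconds per approach for one signal cycle."""
    n = len(queues)
    total = sum(queues.values())
    if total == 0:  # no detected demand: split the cycle evenly
        return {a: cycle_s / n for a in queues}
    spare = cycle_s - min_green_s * n  # time left after minimums
    return {a: min_green_s + spare * q / total for a, q in queues.items()}

# Heavier queues on the north and south approaches get longer greens.
greens = allocate_green({"north": 12, "east": 3, "south": 9, "west": 0})
```

The allocations always sum to the cycle length, and an empty approach still receives its minimum green rather than being starved, which is a common safety constraint in real signal controllers.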
Thermal sensing company Adasky, which develops on NVIDIA DRIVE, is also working to improve safety and efficiency at intersections. The company said this week that its latest thermal sensor integrates into smart intersections to enable detection, classification and positioning of all road users in all visibility conditions — day or night, rain or shine.
The system transmits information to vehicles approaching the intersection as well as to transportation control centers, allowing preventative actions in potentially dangerous situations.
With the combination of AI-powered vehicles and infrastructure, the NVIDIA ecosystem in Tel Aviv is building mobility solutions for a smarter future.