Developing autonomous vehicles with large-scale simulation requires an ecosystem of partners and tools that's just as wide-ranging.
NVIDIA DRIVE Sim powered by Omniverse addresses AV development challenges with a scalable, diverse and physically accurate simulation platform. With DRIVE Sim, developers can improve productivity and test coverage, accelerating their time to market while minimizing the need for real-world driving.
The variety and depth of the companies in the DRIVE Sim ecosystem are core to what makes the platform the foremost solution for autonomous vehicle simulation.
DRIVE Sim enables high-fidelity simulation by tapping into NVIDIA’s core technologies, including NVIDIA RTX, Omniverse and AI, to deliver a powerful, cloud-based simulation platform. It can generate datasets to train the vehicle’s perception system or provide a virtual proving ground to test the vehicle’s decision-making and control logic.
The platform can be connected to the AV stack in software-in-the-loop or hardware-in-the-loop configurations to test the full driving experience.
DRIVE Sim comes with a rich library of configurable models for environments, scenarios, vehicles, sensors and traffic that work right out of the box.
It also includes dedicated application programming interfaces that enable developers to build DRIVE Sim connectors, plugins and extensions to tailor the simulation experience to specific requirements and workflows. These APIs make it possible to leverage past investment and development by allowing integration into pre-established AV simulation toolchains.
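To give a sense of what such an extension can look like, below is a minimal sketch in the style of an Omniverse Kit Python extension, the general plugin mechanism used by Omniverse-based applications. The class name and behavior are purely illustrative; a real DRIVE Sim connector would call additional platform APIs not shown here.

```python
import omni.ext


class ExampleToolchainConnector(omni.ext.IExt):
    """Illustrative extension that could bridge the simulator to an external tool."""

    def on_startup(self, ext_id):
        # Called when the extension is loaded into the application.
        # A real connector might open a socket to an external scenario runner
        # or register simulator callbacks here.
        print(f"[{ext_id}] toolchain connector started")

    def on_shutdown(self):
        # Called when the extension is unloaded; release any external resources.
        print("toolchain connector stopped")
```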
With a broad ecosystem of simulation partners, DRIVE Sim always features the cutting edge in simulation models and rich environments, as well as verification and validation tools.
Ever-Changing Environments
Driving behavior varies with the environment the vehicle operates in. From the dense traffic of urban streets to sparse, winding highway roads, self-driving cars must be able to handle different domains, as well as follow the unique traffic laws of different countries.
DRIVE Sim ecosystem partners provide realistic virtual models of the three-dimensional road environment, including tools to create such environments, reference maps to build accurate road networks, and environment assets such as traffic signs and lights, other vehicles, pedestrians, bicyclists, buildings, trees, lamp posts, fire hydrants and road debris.
NVIDIA is partnering with various 3D model providers to make these assets available for easy download and import via Omniverse into simulated environments and scenarios for DRIVE Sim.
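Because Omniverse content is built on Universal Scene Description (USD), bringing such an asset into a scene can be as simple as adding a USD reference. The snippet below is a generic sketch using the open-source pxr Python bindings; the file paths and prim names are hypothetical, and an actual DRIVE Sim workflow may wrap these steps in higher-level tools.

```python
from pxr import Usd, UsdGeom

# Create a stage representing the simulated scene (path is hypothetical).
stage = Usd.Stage.CreateNew("scene.usda")
UsdGeom.Xform.Define(stage, "/World")

# Reference a downloaded environment asset into the scene as a new prim.
sign = stage.DefinePrim("/World/StopSign", "Xform")
sign.GetReferences().AddReference("./assets/stop_sign.usd")

stage.GetRootLayer().Save()
```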
Modeling Vehicle Behavior
In addition to recreating the real-world environment in the virtual world, simulation must accurately reproduce the way the vehicle itself responds to road inputs and controls, such as acceleration, steering and braking.
Vehicle dynamics models take the control signals sent by DRIVE Sim and return the vehicle's resulting position and orientation given those inputs.
These models simulate the vehicle dynamics to help validate planning and control algorithms with the highest possible fidelity. They can recreate the orientation and motion of sensors as the vehicle turns or brakes suddenly, as well as the sensor reaction to road vibration or other harsh conditions.
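As a rough illustration of the interface idea, control inputs in and vehicle pose out, here is a toy kinematic bicycle model in Python. It is vastly simpler than the high-fidelity dynamics models partners supply, and the wheelbase and timestep values are just placeholders.

```python
import math
from dataclasses import dataclass


@dataclass
class VehicleState:
    x: float = 0.0    # position on the ground plane, meters
    y: float = 0.0
    yaw: float = 0.0  # heading, radians
    v: float = 0.0    # longitudinal speed, m/s


def step(state: VehicleState, accel: float, steer: float,
         dt: float = 0.01, wheelbase: float = 2.9) -> VehicleState:
    """Advance a kinematic bicycle model by one timestep.

    accel: longitudinal acceleration command, m/s^2
    steer: front-wheel steering angle, radians
    """
    x = state.x + state.v * math.cos(state.yaw) * dt
    y = state.y + state.v * math.sin(state.yaw) * dt
    yaw = state.yaw + (state.v / wheelbase) * math.tan(steer) * dt
    v = max(0.0, state.v + accel * dt)
    return VehicleState(x, y, yaw, v)
```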
Vehicle models also help assess the robustness of the autonomous driving system itself. As the vehicle experiences tire and brake wear, varying cargo loads and changes in wheel alignment, it's critical to see how the system responds to ensure safety.
NVIDIA is collaborating with all major vehicle dynamics model providers to ensure that their models can be integrated into DRIVE Sim.
Sensing Simulation
Just as with autonomous vehicles in the physical world, virtual vehicles also need sensors to perceive their surroundings. DRIVE Sim comes with a library of standard models for camera, radar, lidar and ultrasonic sensors.
Through APIs, it’s also possible for users and ecosystem partners to integrate dedicated models for sensor simulation into DRIVE Sim.
These models typically simulate sensor components such as transmitters, receivers, imagers and lenses, and can also include signal-processing software and transcoders.
Multiple camera, radar and lidar suppliers already provide models of their sensors for DRIVE Sim. By incorporating sensor models with this level of granularity, DRIVE Sim can accurately recreate the output a physical sensor would produce in the real world as the vehicle drives.
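For intuition, the sketch below shows a toy lidar post-processing step that turns ideal ray-cast distances from the virtual scene into a noisier, more sensor-like return. Real partner models go much further, simulating optics, transmitters and receivers, and the noise and dropout parameters here are purely illustrative.

```python
import numpy as np


def simulate_lidar_returns(ideal_ranges: np.ndarray,
                           max_range: float = 120.0,
                           range_noise_std: float = 0.02,
                           dropout_prob: float = 0.01) -> np.ndarray:
    """Convert ideal ray-cast ranges (meters) into lidar-like measurements.

    Adds Gaussian range noise and marks out-of-range beams and random
    dropouts as NaN, mimicking missed returns.
    """
    ranges = ideal_ranges + np.random.normal(0.0, range_noise_std, ideal_ranges.shape)
    ranges[ideal_ranges > max_range] = np.nan                   # beams past max range
    dropouts = np.random.random(ideal_ranges.shape) < dropout_prob
    ranges[dropouts] = np.nan                                   # randomly missed returns
    return ranges
```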
Finding the Unknowns
An autonomous vehicle isn't the only one on the road in the real world, and the same holds true in simulation.
With detailed traffic models, developers can play out specific scenarios with the same variables and unpredictability as the real world. Some DRIVE Sim partners develop naturalistic traffic, meaning situations where the end result isn't known in advance, to test and validate autonomous vehicle systems.
Other partners contribute scenario catalogs and scenario-based verification and validation methodologies that evaluate whether an autonomous vehicle system meets specific key performance indicators.
These criteria can be regulatory requirements or industry standards. NVIDIA is participating in multiple projects, consortia and standards organizations across the globe aimed at creating standards for autonomous vehicle simulation.
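As a simplified example of what checking a key performance indicator over a simulated run can look like, the sketch below evaluates minimum time to collision against a threshold. The KPI choice and the 2-second threshold are illustrative, not drawn from any specific regulation or standard.

```python
import numpy as np


def check_min_time_to_collision(ego_pos, ego_speed, lead_pos, lead_speed,
                                ttc_threshold_s: float = 2.0):
    """Evaluate one KPI over a simulated trace: minimum time to collision.

    Inputs are per-timestep 1-D arrays along the lane direction (m, m/s).
    Returns (min_ttc, passed), where passed is True if the ego never closed
    on the lead vehicle with less than ttc_threshold_s to spare.
    """
    gaps = np.asarray(lead_pos) - np.asarray(ego_pos)
    closing_speed = np.asarray(ego_speed) - np.asarray(lead_speed)
    ttc = np.where(closing_speed > 0,                 # TTC only defined while closing
                   gaps / np.maximum(closing_speed, 1e-6),
                   np.inf)
    min_ttc = float(ttc.min())
    return min_ttc, min_ttc >= ttc_threshold_s
```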
Always in the Loop
Finally, the DRIVE Sim ecosystem makes it possible to use simulation to test and validate the full autonomous vehicle hardware system.
The NVIDIA DRIVE Constellation hardware-in-the-loop platform, which contains the AI compute system that runs in the vehicle, allows for bit-accurate, at-scale validation of the AV stack on the target hardware.
System integration partners provide the infrastructure to connect DRIVE Constellation to the rest of the vehicle’s electronic architecture. This full integration with components like the braking, engine and cockpit control units enables developers to evaluate how the full vehicle reacts in specific self-driving scenarios.
With experienced partners contributing diverse and constantly updated models, self-driving systems can be continually developed, tested and validated using the highest quality content.