NVIDIA Expands Global DRIVE Hyperion Ecosystem to Accelerate the Road to Full Autonomy

Leading transportation and technology partners build on NVIDIA’s robotaxi-ready, level 4 platform, underpinned by NVIDIA Halos safety and development tools, to accelerate the shift toward safe, autonomous movement of people and goods.
by Mo Poorsartep

At the CES trade show running this week in Las Vegas, NVIDIA announced that the global DRIVE Hyperion ecosystem is expanding to include tier 1 suppliers, automotive integrators and sensor partners, including Aeva, AUMOVIO, Astemo, Arbe, Bosch, Hesai, Magna, Omnivision, Quanta, Sony and ZF Group.

This builds on collaborations unveiled at NVIDIA GTC Washington, D.C., to advance level 4-ready autonomous passenger vehicles with DRIVE Hyperion, while applying the same platform to long‑haul freight to bring safe and secure full self-driving capabilities across commercial transport.

Together, these partners form an integrated global network developing the critical technologies needed to make autonomous driving safer, smarter and more efficient.

“Everything that moves will eventually become autonomous, and DRIVE Hyperion is the backbone that makes that transition possible,” said Ali Kani, vice president of automotive at NVIDIA. “By unifying compute, sensors and safety into one open platform, we’re enabling our entire ecosystem, from automakers to the AV software ecosystem, to bring full autonomy to market faster, with the reliability and trust that mobility at scale demands.”

This unified ecosystem gives automotive customers the confidence that sensing systems and other hardware are fully compatible with DRIVE Hyperion, ensuring reliable performance and seamless integration while streamlining development, reducing testing time and lowering overall costs.

A Growing Sensor Ecosystem

Leading companies including Astemo, AUMOVIO, Bosch, Magna, Quanta and ZF Group announced they are building DRIVE Hyperion-based electronic control units.

AUMOVIO, Aeva, Arbe, Hesai, Omnivision and Sony are also among the latest partners to qualify their sensor suites on the open, production‑ready DRIVE Hyperion architecture. This growing sensor ecosystem spans cameras, radar, lidar and ultrasonic technologies, enabling automakers and developers to build and validate perception systems optimized for level 4 autonomy.

NVIDIA DRIVE Hyperion’s real-time, safety-certified platform enables cross-domain control of braking, suspension and steering. Through centralized compute and sensor fusion, it supports the synchronized, low-latency actuation essential for advanced automated driving.

By building domain controllers or qualifying sensors and other technologies on DRIVE Hyperion, partners gain seamless compatibility with NVIDIA’s full‑stack AV compute platform, speeding development, simplifying integration and accelerating time to market.

DRIVE Hyperion Delivers Level 4 Autonomy at Scale

At the core of this ecosystem is NVIDIA DRIVE Hyperion, a production‑ready compute and sensor reference architecture designed to make any vehicle level 4‑ready. Featuring two NVIDIA DRIVE AGX Thor systems-on-a-chip built on the NVIDIA Blackwell architecture, DRIVE Hyperion delivers more than 2,000 FP4 teraflops (roughly 1,000 INT8 trillion operations per second) of real‑time compute to fuse a full 360‑degree sensor view.

This performance enables transformer‑based perception, vision language action models and generative AI workloads that can reason about complex driving scenes in real time. By using a common compute and sensor foundation, partners can focus on differentiation at the software and service layers, delivering unique features while benefiting from the safety, scalability and continuous improvements of NVIDIA’s end‑to‑end AV platform.

Safety and Trust With NVIDIA Halos​

DRIVE Hyperion deployments will be underpinned by NVIDIA Halos, a comprehensive safety and cybersecurity framework that spans from the data center to the vehicle. Halos provides tools for independent inspection, system validation and certification, helping partners meet rigorous global automotive and robotics safety standards.

Combined with NVIDIA’s large‑scale simulation and AI data factory workflows, Halos enables continuous testing and improvement across millions of virtual and real‑world driving scenarios, building confidence among developers, regulators and passengers.

New AI Models and Tools

At CES, NVIDIA also released Alpamayo, a new family of AI models and tools purpose‑built to make level 4 development more accessible to the automotive industry.

These models are optimized for real‑time performance on the DRIVE Hyperion platform, accelerating development and deployment of level 4 autonomous systems across both passenger and commercial fleets.

Together, they demonstrate how NVIDIA’s end‑to‑end approach — from high-performance computers and sensor integration to AI training and simulation — streamlines autonomous vehicle development.

Learn more by watching NVIDIA Live at CES.