NVIDIA Research Showcases the Future of Robotics at RSS

At this year’s Robotics: Science and Systems conference, NVIDIA Research is presenting work that advances robot learning across simulation, real-world transfer and decision-making.
by Diego Farinha

NVIDIA Research is advancing methods that combine robotic simulation, optimization and AI to enable more generalizable and adaptable robot behavior.

At this year’s Robotics: Science and Systems (RSS) conference, taking place June 21-25 in Los Angeles, members of the global robotics community are convening to explore breakthroughs in autonomy, perception and physical intelligence.

NVIDIA researchers are presenting cutting-edge work spanning simulation-to-real transfer, agile humanoid robot control, GPU-accelerated planning and foundation models for open-world reasoning.

“The RSS conference stands as a pillar for both foundational research and real-world innovation in robotics,” said Fabio Ramos, principal research scientist at NVIDIA. “This year, NVIDIA’s research — from enabling humanoid robots to learn manipulation skills and agile, full-body motions through real-world data, to advancing reasoning and perception — has brought the robotics community closer to achieving real-time, adaptable and intelligent autonomy in complex environments.”

Below are selected NVIDIA Research papers showcasing advancements in robot learning and control at RSS:

Robotics workshops at RSS featuring NVIDIA speakers include:

Explore the latest work from NVIDIA Research and check out the Robotics Research and Development Digest (R²D²), which gives developers deeper insight into physical AI and robotics breakthroughs.

Robot in featured image courtesy of Unitree.