
AI

NVIDIA’s New AI Just Made Real Physics Look Slow

Two Minute Papers • November 5, 2025

Why It Matters

NeRD dramatically speeds up and generalizes robot simulation, cutting the costly simulation‑to‑real gap and paving the way for adaptable robots to tackle messy, real‑world tasks at scale.

Summary

The video spotlights NVIDIA’s newly unveiled neural physics engine, NeRD (Neural Robot Dynamics), which replaces hand‑crafted equations with a deep‑learning model that predicts robot motion. By ingesting massive amounts of simulated motion data, NeRD learns the underlying physics and can forecast thousands of future steps, promising a faster, more flexible alternative to traditional, brittle simulators.
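The core idea can be sketched in a few lines: a learned function stands in for the analytic physics step and is rolled out to forecast many future states. Everything below (the single‑layer stand‑in "network", the state layout, the function names) is illustrative and assumed for the sketch, not NVIDIA’s code or API.

```python
import numpy as np

# Illustrative sketch, not NVIDIA's code: a learned function replaces the
# analytic physics step  state_{t+1} = f(state_t, action_t).
rng = np.random.default_rng(0)
STATE_DIM, ACTION_DIM = 4, 1   # e.g. cart-pole: [x, x_dot, theta, theta_dot]

# A random single-layer "network" stands in for the trained dynamics model.
W_s = rng.normal(scale=0.5, size=(STATE_DIM, STATE_DIM))
W_a = rng.normal(scale=0.5, size=(STATE_DIM, ACTION_DIM))

def learned_step(state, action):
    """Predict the next state with the (stand-in) neural dynamics model."""
    return np.tanh(W_s @ state + W_a @ action)

def rollout(state, policy, horizon):
    """Forecast many future steps purely inside the learned model."""
    trajectory = [state]
    for _ in range(horizon):
        state = learned_step(state, policy(state))
        trajectory.append(state)
    return np.stack(trajectory)

traj = rollout(np.zeros(STATE_DIM), lambda s: np.ones(ACTION_DIM), horizon=1000)
print(traj.shape)  # (1001, 4): the initial state plus 1000 predicted steps
```

Because the learned step is just a function call, rolling out thousands of steps is a tight loop that batches and parallelizes easily, which is where the speed advantage over a constraint‑solving physics engine comes from.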

Key insights include NeRD’s ability to match or surpass conventional physics engines across classic benchmarks such as cart‑pole balancing and pendulum swings, and to generalize to diverse robot morphologies—from a six‑legged spider to a robotic arm—without any retraining. The model not only replicates the dynamics of a high‑fidelity simulator but does so orders of magnitude quicker, and it even outperformed the Warp simulator that originally generated its training data.
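For contrast, this is the kind of hand‑crafted dynamics a conventional engine encodes for the cart‑pole benchmark mentioned above. The equations and constants follow the standard formulation used in common RL benchmarks, with a simple Euler integrator; it is a reference sketch of the analytic approach, not code from the paper.

```python
import math

# The hand-derived cart-pole equations of motion that a classical engine
# encodes explicitly; a neural simulator learns this mapping from data.
G, M_CART, M_POLE, HALF_LEN, DT = 9.8, 1.0, 0.1, 0.5, 0.02
TOTAL_M, POLE_ML = M_CART + M_POLE, M_POLE * HALF_LEN

def analytic_step(state, force):
    """One Euler step of the analytic cart-pole dynamics."""
    x, x_dot, theta, theta_dot = state
    sin_t, cos_t = math.sin(theta), math.cos(theta)
    temp = (force + POLE_ML * theta_dot**2 * sin_t) / TOTAL_M
    theta_acc = (G * sin_t - cos_t * temp) / (
        HALF_LEN * (4.0 / 3.0 - M_POLE * cos_t**2 / TOTAL_M))
    x_acc = temp - POLE_ML * theta_acc * cos_t / TOTAL_M
    return (x + DT * x_dot, x_dot + DT * x_acc,
            theta + DT * theta_dot, theta_dot + DT * theta_acc)

# With no force applied, a slightly tilted pole falls: upright is unstable.
state = (0.0, 0.0, 0.01, 0.0)
for _ in range(100):
    state = analytic_step(state, force=0.0)
print(abs(state[2]) > 0.01)   # the tilt has grown
```

Every term here had to be derived by hand for this one mechanism; matching such a step with a model trained purely on data, across different robot bodies, is what the benchmark comparisons in the video are testing.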

Notable examples feature a side‑by‑side visual of a spider robot simulated in a physics engine (blue) and in NeRD’s learned “imagined” world (orange), where the motions are strikingly similar. In a cube‑tossing test, NeRD’s predictions aligned more closely with real‑world outcomes than the teacher simulator itself. Most compellingly, a controller trained entirely within NeRD’s virtual environment successfully guided a physical robot to touch target points—a step that historically fails when transferring from simulation to reality.
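The sim‑to‑real result can be illustrated schematically: a controller is tuned using only "imagined" rollouts inside a learned model, never the real system. The toy dynamics, the proportional controller, and the random‑search training below are assumptions for the sketch, not the paper’s method.

```python
import numpy as np

rng = np.random.default_rng(1)

def learned_step(state, action):
    """Stand-in for a trained dynamics model: a simple integrator."""
    return state + 0.1 * action

def imagined_cost(gain, target, horizon=50):
    """Score a proportional controller using only imagined rollouts."""
    state = np.zeros(2)
    cost = 0.0
    for _ in range(horizon):
        action = gain * (target - state)          # P-controller toward target
        state = learned_step(state, action)
        cost += float(np.sum((target - state) ** 2))
    return cost

# "Training" happens entirely in imagination: random search over controller
# gains, never touching the real simulator or the real robot.
target = np.array([1.0, -0.5])                    # target point to touch
candidate_gains = rng.uniform(0.0, 5.0, size=64)
best_gain = min(candidate_gains, key=lambda g: imagined_cost(g, target))

# Deploy the imagination-trained controller; in this toy the "real" dynamics
# happen to match the learned model, so the controller reaches the target.
state = np.zeros(2)
for _ in range(50):
    state = learned_step(state, best_gain * (target - state))
print(np.allclose(state, target, atol=1e-3))
```

The hard part in practice is exactly what this toy assumes away: the learned model must track the real dynamics closely enough that a controller optimized in imagination still works on hardware, which is the transfer step the video highlights.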

The implications are profound: by collapsing the simulation‑to‑real gap, NeRD could slash development cycles, reduce reliance on painstaking hand‑tuning, and accelerate deployment of robots capable of handling unstructured, deformable objects—tasks that have long eluded practical automation. For manufacturers and AI hardware providers, the technology signals a shift toward data‑driven, general‑purpose physics solvers that scale across platforms and applications.

Original Description

❤️ Check out Lambda here and sign up for their GPU Cloud: https://lambda.ai/papers
Guide:
1. Rent one of their GPUs with over 16 GB of VRAM
2. Open a terminal
3. Install Ollama (Linux install command: https://ollama.com/download/linux)
4. Run ollama run gpt-oss:120b (model page: https://ollama.com/library/gpt-oss:120b)
📝 The paper "Neural Robot Dynamics" is available here:
https://neural-robot-dynamics.github.io/
https://github.com/NVlabs/neural-robot-dynamics
📝 My paper on simulations that look almost like reality is available for free here:
https://rdcu.be/cWPfD
Or this is the orig. Nature Physics link with clickable citations:
https://www.nature.com/articles/s41567-022-01788-5
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Benji Rabhan, B Shang, Christian Ahlin, Gordon Child, Juan Benet, Michael Tedder, Owen Skarpness, Richard Sundvall, Steef, Taras Bobrovytsky, Tybie Fitzhugh, Ueli Gallizzi
If you wish to appear here or pick up other perks, click here: https://www.patreon.com/TwoMinutePapers
My research: https://cg.tuwien.ac.at/~zsolnai/
X/Twitter: https://twitter.com/twominutepapers
Thumbnail design: Felícia Zsolnai-Fehér - http://felicia.hu
#nvidia
