Nobel Physicist Warns Humanity May Face Extinction Soon | WION Podcast
Why It Matters
If the projected nuclear risk materializes, it could trigger global economic collapse and render long‑term investments untenable, making immediate arms‑control and AI‑governance actions critical for business continuity.
Key Takeaways
- Nobel physicist David Gross predicts a 35-year existential-risk horizon.
- Nuclear war risk estimated at a 2% annual probability (a 1-in-50 chance each year).
- No major arms-control treaties signed in the past decade, increasing the danger.
- AI could automate weapons systems, adding another dimension of existential threat.
- Ongoing conflicts (Russia‑Ukraine, US‑Israel‑Iran, India‑Pakistan) heighten nuclear tensions.
Summary
The WION podcast features Nobel‑winning physicist David Gross warning that humanity may have only about 35 years before an existential catastrophe, with nuclear war identified as the most immediate threat.
Gross estimates a 2% annual probability of nuclear conflict—a one‑in‑50 chance each year—based on half‑life modeling. With nine nuclear‑armed states and no new arms‑control agreements in a decade, the risk has risen sharply. He notes that the last major treaty between the United States and Russia expired in February 2026, covering over 90% of the world's warheads.
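The ~35-year horizon follows from the 2% annual figure via standard survival-probability arithmetic: if conflict is avoided with probability 98% each year, the chance of having avoided it halves after about 34 years. A minimal sketch of that calculation (the 2% figure is from the article; the half-life formula is standard exponential-decay math):

```python
import math

# Annual probability of nuclear conflict cited by Gross.
p_annual = 0.02

# Probability that any given year passes without conflict.
survival = 1.0 - p_annual

# Half-life: the number of years n at which survival**n == 0.5,
# i.e. the point where avoiding conflict becomes a coin flip.
half_life = math.log(0.5) / math.log(survival)

# Cumulative risk over a 35-year horizon for comparison.
risk_35y = 1.0 - survival ** 35

print(f"Half-life: {half_life:.1f} years")
print(f"Cumulative 35-year risk: {risk_35y:.1%}")
```

The half-life comes out near 34 years, consistent with the article's ~35-year horizon, and the cumulative risk over 35 years is roughly a coin flip.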
In his Live Science interview, Gross cited the Doomsday Clock's move to 85 seconds to midnight and warned that artificial intelligence could soon automate weapon systems, compounding the danger. He invoked the Fermi paradox, suggesting advanced societies may self‑destruct before achieving long‑term survival.
The warning underscores urgent geopolitical and technological governance challenges. Policymakers, defense contractors, and investors must reassess nuclear risk exposure and AI safety frameworks to mitigate a potentially irreversible crisis.