GTC DC '25 Pregame - Chapter 3: AI Infrastructure Ecosystem

The AI Podcast (NVIDIA) • November 11, 2025 • 34 min

Key Takeaways

  • AI data centers need massive power and cooling scale-up by 2030
  • Schneider uses AI digital twins to optimize energy efficiency
  • GE Vernova quadruples gas turbine output, adds nuclear, renewables
  • 800‑volt rack design cuts losses, enables modular AI factories
  • Startups accelerate grid intelligence, power electronics, and software solutions

Pulse Analysis

The episode frames AI infrastructure as the backbone of the emerging American AI economy. Leaders from Vertiv, Schneider Electric, GE Vernova and Prusso explain that data centers are now the factories that turn electricity into AI tokens, and that scaling to the projected 2030 demand will require unprecedented power and cooling capacity. Vertiv’s CEO highlights the labor‑intensive nature of traditional construction and the need for industrial‑scale solutions, while Schneider stresses that compute growth is inseparable from energy availability. GE Vernova adds that gas turbines, nuclear SMRs, wind, solar and future‑ready hydrogen will all feed the grid to sustain this surge.

Efficiency emerges as a competitive lever. Schneider describes a full‑life‑cycle AI factory built on digital twins that simulate power and thermal performance before a single component is fabricated. The 800‑volt rack architecture championed by NVIDIA and implemented by Prusso demonstrates how raising voltage from 48 V to 800 V can slash conversion losses and simplify cooling, while modular platforms such as Vertiv OneCore integrate power, thermal, and control systems into a single prefabricated unit. These co‑design approaches compress build times, boost power density from a few kilowatts per rack toward megawatt‑class racks, and unlock the economies of scale needed for gigawatt‑scale AI workloads.
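The voltage claim is straightforward Ohm's‑law arithmetic, and a short sketch makes the scaling concrete. The sketch below is illustrative only: the 1 MW rack power and the distribution‑path resistance are assumed values, not figures from the episode. For a fixed power draw, current falls linearly with voltage, so resistive loss falls with its square.

```python
# Minimal sketch (assumed, illustrative numbers) of why higher distribution
# voltage cuts conduction losses: for fixed power P, current I = P / V, so
# resistive loss I^2 * R scales as 1 / V^2.

def conduction_loss_w(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """I^2 * R loss in the distribution path for a given rack power and voltage."""
    current_a = power_w / voltage_v
    return current_a**2 * resistance_ohm

RACK_POWER_W = 1_000_000   # hypothetical 1 MW AI rack
BUS_RESISTANCE_OHM = 1e-4  # hypothetical distribution-path resistance

for volts in (48, 800):
    loss_kw = conduction_loss_w(RACK_POWER_W, volts, BUS_RESISTANCE_OHM) / 1_000
    print(f"{volts:>4} V -> {loss_kw:8.2f} kW lost in distribution")
# Prints roughly: 48 V -> 43.40 kW, 800 V -> 0.16 kW, a (800/48)^2 ≈ 278x reduction.
```

The same 1/V² scaling is why the higher‑voltage racks described in the episode can use thinner conductors and shed less waste heat, which in turn simplifies cooling.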

Startups and policy are portrayed as the accelerators of this transformation. GE Vernova points to a vibrant ecosystem in Cambridge and Silicon Valley where startups in power electronics, cable cooling, and real‑time grid‑management software are tackling the volatile supply from renewables and the millisecond‑scale load swings of AI workloads. Schneider notes that federal executive orders and close collaboration with local governments are turning AI infrastructure into a national‑security priority, prompting faster permitting and investment. Together, the established players and agile innovators aim to keep the United States at the forefront of AI data‑center capacity, ensuring energy abundance, resilience, and sustained economic growth.

Episode Description

Bonus coverage from the NVIDIA GTC DC '25 Pregame Show

Chapter 3: AI Infrastructure Ecosystem

Behind every breakthrough is an unseen network of data centers, power systems, and partners. Leaders across energy and infrastructure discuss how they’re building the backbone of the AI economy.

Catch up with GTC DC on-demand: https://www.nvidia.com/en-us/on-demand/
