AI News and Headlines

AI Pulse

Sam Altman Would Like to Remind You That Humans Use a Lot of Energy, Too

AI • ClimateTech

TechCrunch AI • February 21, 2026

Companies Mentioned

OpenAI

Why It Matters

The remarks spotlight the need for transparent energy reporting and sustainable power sources as AI scales, influencing policy, investor scrutiny, and public perception of AI’s climate footprint.

Key Takeaways

  • Altman dismisses AI water‑usage myths as “totally insane”.
  • He acknowledges AI’s total energy demand is growing.
  • Calls for a rapid shift to nuclear, wind, and solar power.
  • Argues AI inference may match human energy efficiency.
  • No legal disclosure rules; scientists estimate usage independently.

Pulse Analysis

The debate over artificial intelligence’s environmental impact has intensified as data‑center footprints expand. Altman’s comments underscore a key technical shift: modern AI clusters now rely on closed‑loop cooling systems, eliminating the evaporative processes that once justified high water‑use estimates. By debunking the viral claim that a single ChatGPT query consumes 17 gallons of water, he redirects attention to the more substantive issue of aggregate electricity consumption across millions of queries daily.

Total energy demand, however, remains a pressing challenge. Altman’s call for a swift transition to nuclear, wind, and solar reflects broader industry pressure to decarbonize compute workloads. As AI models grow in size and inference volume, power‑intensive data centers contribute to rising electricity prices and strain grid stability. Policymakers and investors are increasingly demanding transparent reporting, yet current regulations lack mandatory disclosure, leaving independent researchers to fill the data gap. This regulatory vacuum could spur future legislation aimed at quantifying and limiting AI‑related emissions.

Altman also reframes the efficiency debate by comparing AI inference to human cognition. He argues that training a model consumes energy, but once deployed, each query may consume less power than the lifelong biological processes that enable human reasoning. If this parity holds, AI could become a net‑positive tool for reducing overall societal energy use, provided its deployment aligns with clean‑energy sources. The conversation therefore pivots from sensational per‑query metrics to systemic sustainability strategies, shaping how the tech sector addresses climate responsibility.
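The efficiency comparison above can be made concrete with a rough back‑of‑envelope calculation. The figures below are outside assumptions for illustration (a commonly cited public estimate of roughly 0.3 Wh per chatbot query and an average human metabolic output of about 100 W), not numbers reported in the article:

```python
# Back-of-envelope sketch: per-query AI inference energy vs. daily human
# metabolic energy. Both constants are assumed estimates, not article data.

WH_PER_QUERY = 0.3          # assumed energy per AI query, in watt-hours
HUMAN_METABOLISM_W = 100.0  # assumed average human metabolic power, in watts

# Energy a human body expends in one day, converted to watt-hours
human_daily_wh = HUMAN_METABOLISM_W * 24

# Number of AI queries that one day of human metabolic energy could power
queries_per_human_day = human_daily_wh / WH_PER_QUERY

print(f"Human metabolic energy per day: {human_daily_wh:.0f} Wh")
print(f"Equivalent AI queries: {queries_per_human_day:.0f}")
```

Under these assumed figures, one day of human metabolic energy corresponds to thousands of queries, which is the kind of parity argument Altman is gesturing at; the real numbers depend heavily on model size, hardware, and data‑center overhead.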

Sam Altman would like to remind you that humans use a lot of energy, too

Read Original Article