TMTB Morning Wrap

TMT Breakout
Feb 23, 2026

Key Takeaways

  • OpenAI targets $600B in compute spend through 2030
  • The $1.4T figure reflects broader CAPEX commitments, not operating spend
  • Stargate JV has stalled, with no progress on dedicated data centers
  • Wells Fargo lifts Alphabet PT to $387, citing its compute lead
  • AI compute capacity is emerging as the decisive competitive moat

Pulse Analysis

OpenAI’s recent filing reveals a $600 billion compute budget for the 2026‑2030 period, separating operational compute spend from the broader $1.4 trillion capital‑expenditure pledge, which includes partner investments from Microsoft Azure, AWS, Oracle and SoftBank. By isolating operating spend, the firm signals a more disciplined funding approach, yet the sheer magnitude underscores the escalating cost curve of training next‑generation models. Investors are now watching how OpenAI balances cash flow against the need for ever‑larger GPU clusters, a dynamic that could reshape venture financing across the generative‑AI space.

Compounding the funding challenge, the three‑way Stargate venture, intended to deliver dedicated OpenAI data centers, has stalled. Sources describe a leadership vacuum and unresolved governance questions among OpenAI, Oracle and SoftBank, leaving the partnership without a clear roadmap. The setback forces OpenAI to lean more heavily on existing public‑cloud contracts, potentially inflating per‑unit compute costs and limiting custom hardware optimization. The episode also illustrates a broader industry lesson: even well‑capitalized AI firms struggle to secure bespoke infrastructure without aligned incentives among cloud providers and investors.

Across the AI landscape, Alphabet’s recent upgrade by Wells Fargo reflects a contrasting narrative. Google’s Project Google aims to boost AI‑compute capacity to 35 GW by 2028, more than doubling its 2025 footprint. This aggressive scaling, combined with deep user data and a global distribution network, positions Alphabet as a de facto AI platform for both consumer and enterprise workloads. As hyperscalers vie for compute supremacy, capacity becomes the decisive moat, influencing everything from model training speed to pricing power in the burgeoning AI services market.
