
Aggregate AI chip memory bandwidth has reached 70 million terabytes per second, growing roughly 4.1× per year since 2022. That figure exceeds global internet traffic by a factor of roughly 300,000, underscoring the massive data movement modern inference requires. The surge has strained high‑bandwidth memory (HBM) supply, pushing prices higher in early 2026, with AI chips consuming over 90% of HBM output in 2025. Tracking bandwidth offers a clearer view of the world’s capacity to serve ever‑larger AI models.
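The quoted figures can be cross-checked with simple compound-growth arithmetic. A minimal sketch, assuming the 70 million TB/s total is measured three years after the 2022 baseline (the baseline itself is inferred, not reported in the text):

```python
# Sanity-check the reported bandwidth figures.
# Inputs come from the text; the 2022 baseline is derived, not reported.

TOTAL_TB_S = 70e6        # aggregate AI chip memory bandwidth, TB/s
ANNUAL_GROWTH = 4.1      # reported year-over-year multiplier
YEARS = 3                # assumed span: 2022 -> 2025

# Implied 2022 baseline under constant 4.1x/year growth (~1 million TB/s)
implied_2022 = TOTAL_TB_S / ANNUAL_GROWTH ** YEARS
print(f"implied 2022 baseline: {implied_2022:,.0f} TB/s")

# Implied global internet traffic from the ~300,000x comparison (~233 TB/s)
implied_internet = TOTAL_TB_S / 300_000
print(f"implied internet traffic: {implied_internet:.0f} TB/s")
```

The implied internet-traffic figure of roughly 230 TB/s is the right order of magnitude for published global traffic estimates, which lends the 300,000× comparison some plausibility.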

Analysis of job postings at leading AI labs reveals a sharp shift toward go‑to‑market roles. At OpenAI and Anthropic, sales‑related positions now make up roughly 30% of all openings, while research openings have fallen below 10%. The postings also expose...

The analysis shows that final training runs represent only a minority of AI R&D compute spending: across OpenAI, MiniMax, and Z.ai, final runs account for 9.6%, 22.6%, and 12.3% of total compute respectively. OpenAI’s 2024 R&D compute bill was about...

A team led by Kevin Barreto and Liam Price coaxed GPT‑5.4 Pro into solving a Ramsey‑hypergraph conjecture that has been open since a 2019 paper by Will Brian and Paul Larson. The solution marks the first AI‑generated answer on the...

Microsoft reported a $68 billion increase in property, plant and equipment during the second half of 2025, nearly matching the total addition recorded in the previous full fiscal year. The bulk of the spending (57%) went to IT hardware such as GPUs and...