Inside the World's FASTEST Data Center | Cerebras

Matthew Berman
Oct 23, 2025

Why It Matters

This deployment demonstrates a new approach to scaling AI inference—combining wafer-scale chips and liquid cooling—to boost performance and energy efficiency, potentially reshaping cost and latency dynamics for AI services. It also signals growing capital investment in bespoke infrastructure to meet the compute demands of advanced AI workloads.

Summary

Cerebras opened a purpose-built AI data center in Oklahoma City hosting wafer-scale processors that collectively deliver 44 exaflops of compute, which the company says makes it the fastest AI infrastructure on Earth. Each system is built around a single, dinner-plate-sized wafer-scale engine whose memory sits on the chip itself, eliminating off-chip memory latency and accelerating inference by orders of magnitude. The building is hardened against tornadoes, relies on large-scale liquid cooling fed by a 6,000-ton chiller plant, and is powered primarily by natural-gas-generated electricity backed by battery buffers and 3 MW of generator capacity. The site design also anticipates expanding cooling and power capacity as demand grows.
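The inference-speed claim in the summary is essentially a memory-bandwidth argument: autoregressive decoding streams the model's weights through the processor once per generated token, so peak tokens per second is roughly memory bandwidth divided by model size. The sketch below illustrates the effect with round, purely illustrative numbers (a hypothetical 70B-parameter model at 2 bytes per weight, off-chip HBM at ~3 TB/s vs. a much larger aggregate on-chip SRAM bandwidth); none of these figures are Cerebras or vendor specifications.

```python
def tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound for bandwidth-bound decoding:
    each generated token streams the full weight set through the chip once."""
    return bandwidth_gb_s / model_size_gb

# Illustrative assumptions, not vendor specs:
model_size_gb = 70e9 * 2 / 1e9        # 70B params @ 2 bytes each = 140 GB
hbm_bw_gb_s = 3_000                   # off-chip HBM, ~3 TB/s
on_chip_bw_gb_s = 1_000_000           # aggregate on-chip SRAM, ~1 PB/s

off_chip_tps = tokens_per_second(hbm_bw_gb_s, model_size_gb)    # ~21 tok/s
on_chip_tps = tokens_per_second(on_chip_bw_gb_s, model_size_gb) # ~7,000 tok/s
print(f"off-chip: {off_chip_tps:.0f} tok/s, on-chip: {on_chip_tps:.0f} tok/s")
```

Under these assumptions the on-chip design is faster by a factor of a few hundred, which is the "orders of magnitude" mechanism the summary describes: the wafer-scale engine wins not by higher FLOPs alone but by keeping weights next to the compute.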

Original Description

Join me on a tour of the FASTEST data center in the WORLD
My Links 🔗
👉🏻 Forward Future X: https://x.com/forward_future_
Chapters:
0:00 Intro
0:55 Inside Cerebras
1:31 Location
2:39 Wafer Chip
3:36 How are they so fast?
4:16 Cooling
6:48 Power
10:23 2nd Data Hall
11:15 Building in the US
11:55 The toughest challenge
13:01 What's next?
