Brockman's comments signal OpenAI's focus on balancing model innovation with pragmatic hardware strategies to scale capability, while underscoring the compute and energy constraints that will shape investment and industry competition. That mix of technical realism and bullishness on continued progress frames near-term product rollouts, infrastructure spending, and labor-market disruption risks.
OpenAI president Greg Brockman discussed scaling multimodal models in light of the recent Sora 2 release, saying video and text models share core transformer mechanics even as training techniques (such as diffusion) and inference stacks and hardware optimizations diverge. He argued that continued algorithmic and data scaling will drive progress toward AGI-like capabilities, though the ultimate architecture may evolve beyond today's transformers. Brockman highlighted compute, and ultimately energy, as the central bottleneck, and described pragmatic partnerships (including work to run on AMD MI450 hardware) alongside continued heavy use of NVIDIA. He also warned that bespoke non-GPU chips have proven harder to commercialize than expected, and emphasized AI's disruptive effects on software, jobs, and social contracts.