The post argues that the supposed "scaling wall" in AI pre-training has been disproven by recent developments: Gemini 3 achieved dramatic performance gains without a larger parameter count, suggesting that algorithmic and compute improvements continue to deliver progress consistent with scaling laws, and Nvidia's record earnings and massive projected AI infrastructure spending confirm that compute capacity is expanding rapidly. Together, these two signals, Gemini 3's benchmark breakthrough and Nvidia's trillion-dollar AI hardware outlook, indicate that larger, more capable models will keep improving rather than plateau. The author takes a data-driven, optimistic stance, marshaling benchmark results and financial forecasts to rebut the wall narrative.