
Samsung Networks Boss Wonders if AI-RAN Is Too Hot to Handle
Why It Matters
The hardware choice will shape the cost, energy footprint, and speed of AI‑RAN rollouts, influencing how quickly 5G/6G operators can deliver AI‑driven performance gains.
Key Takeaways
- GPUs generate excessive heat, making them unsuitable for standard RAN base stations.
- Samsung favors CPU‑centric virtual RAN to keep power and cost low.
- Nvidia’s $1 billion stake in Nokia signals a push for AI‑RAN hardware.
- Hot‑spot sites may justify expensive GPU deployment for peak demand.
- Future low‑power GPUs could become viable for widespread RAN use.
Pulse Analysis
The race to embed artificial intelligence into mobile networks is colliding with a practical engineering problem: heat. Nvidia’s graphics processing units, the workhorse behind large‑language‑model training, consume significant power and can reach temperatures that would melt a conventional base‑station chassis. Samsung Networks’ chief, Woojune Kim, highlighted this issue at MWC Barcelona, noting that without a dramatic reduction in cost and power draw, GPUs are unlikely to see mass deployment across the millions of 5G sites that operators manage today.
Telecom vendors are therefore betting on a different architecture. Samsung, like Ericsson, is pushing a virtual RAN model that leverages commodity servers and high‑performance CPUs, which already meet many AI inference needs with far lower energy footprints. Nvidia’s recent $1 billion investment in Nokia underscores the industry’s desire to keep GPUs in the conversation, but the consensus remains that GPUs may only make sense in ultra‑dense environments—stadiums, city centers, or other hotspots—where the performance premium justifies the added expense and cooling requirements.
Looking ahead, the economics could shift. Historically, once‑prohibitive silicon becomes affordable as process nodes improve and demand scales. Analysts predict that by 2030 low‑power GPUs could be cheap enough for widespread RAN integration, eroding the current CPU advantage. Until that tipping point, operators will balance the allure of GPU‑level AI performance against the realities of power budgets, deployment costs, and the maturity of CPU‑centric AI solutions, shaping the next wave of 5G and emerging 6G networks.