
Akamai Inference Cloud Transforms AI From Core to Edge with NVIDIA
Why It Matters
By moving inference to the edge, the solution cuts latency and bandwidth costs, enabling new high‑performance AI use cases at planetary scale and strengthening Akamai’s position as a critical AI infrastructure provider alongside NVIDIA.
Summary
Akamai Technologies announced the launch of Akamai Inference Cloud, a platform that extends AI inference from core data centers to its global edge network of more than 4,200 locations, initially rolling out in 20 sites. The service pairs Akamai's distributed architecture with NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs, BlueField‑3 DPUs (with BlueField‑4 to follow), and NVIDIA AI Enterprise software to deliver low‑latency, real‑time generative AI at the edge. It aims to accelerate agentic workloads such as personalized digital experiences, real‑time financial decisioning, and physical AI for autonomous systems by automatically routing inference tasks to the optimal edge or core location. The platform also abstracts infrastructure complexity through an intelligent orchestration layer that balances workloads between edge sites and central AI factories.