New platform enables agentic and physical AI with low-latency, secure global inference
Akamai Technologies has launched the Akamai Inference Cloud, a globally distributed platform designed to deliver scalable, secure, and low-latency AI inference at the edge. Built in partnership with NVIDIA, the platform redefines how and where AI workloads are processed, bringing intelligent decision-making closer to users, devices, and real-world environments.
“We’re putting intelligence where it matters—at the edge.”
—Dr. Tom Leighton, CEO & Co-founder, Akamai Technologies
Powered by NVIDIA’s Blackwell AI infrastructure and Akamai’s 4,200-location edge network, the Inference Cloud supports next-generation AI applications, including smart commerce agents, streaming financial insights, and real-time physical AI systems. From autonomous vehicles and industrial robotics to fraud detection and personalized digital experiences, the platform enables millisecond-level responsiveness and global scalability.
“Inference is now the most compute-intensive phase of AI.”
—Jensen Huang, Founder & CEO, NVIDIA
Akamai Inference Cloud integrates NVIDIA RTX PRO 6000 GPUs, BlueField DPUs, and NVIDIA AI Enterprise software with Akamai’s distributed cloud infrastructure. It supports agentic AI workflows, streaming inference, and physical AI decisioning with near-instant responsiveness. The platform also features intelligent orchestration, automatically routing tasks between edge and core to optimize performance and accelerate time to value.
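To make the edge-versus-core routing idea concrete, here is a minimal, purely illustrative sketch in Python. Akamai has not published its orchestration logic, so everything below — the `InferenceTask` fields, the `EDGE_MODEL_LIMIT_GB` threshold, and the `route` function — is a hypothetical simplification, not the platform's actual API. The sketch assumes the scheduler weighs two factors: how tight the caller's latency budget is, and whether the model fits on edge GPU capacity.

```python
# Hypothetical sketch of edge/core task routing. The names, fields, and
# thresholds here are illustrative assumptions, not Akamai's real scheduler.
from dataclasses import dataclass


@dataclass
class InferenceTask:
    model_size_gb: float      # memory footprint of the model weights
    latency_budget_ms: float  # deadline the caller can tolerate
    user_region: str          # region where the request originated


# Illustrative per-PoP capacity limit (an assumption, not an Akamai figure).
EDGE_MODEL_LIMIT_GB = 48.0


def route(task: InferenceTask) -> str:
    """Send latency-sensitive tasks whose models fit on edge GPUs to the
    nearest edge PoP; fall back to a core region for large models or
    relaxed deadlines, where throughput matters more than proximity."""
    if task.latency_budget_ms < 100 and task.model_size_gb <= EDGE_MODEL_LIMIT_GB:
        return f"edge:{task.user_region}"
    return "core:central"


# Latency-critical, small model -> served at the edge near the user.
print(route(InferenceTask(8.0, 30, "fra")))    # edge:fra
# Model too large for an edge PoP -> routed to core capacity.
print(route(InferenceTask(160.0, 30, "fra")))  # core:central
```

In a real deployment this decision would also account for current PoP load, model placement, and data-residency constraints; the two-factor rule above is only meant to show the shape of the trade-off between proximity and capacity.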
The platform is now live in 20 global locations, with expansion underway. Akamai and NVIDIA’s collaboration marks a bold step toward decentralizing AI and unlocking new frontiers in responsiveness, scalability, and intelligent automation.