Details
Akamai Inference Cloud runs on Akamai's globally distributed network of over 4,200 locations and integrates with NVIDIA's AI Enterprise ecosystem, including Triton, TensorRT, and NVIDIA BlueField DPUs. The platform supports large language model (LLM) inference and predictive AI workloads, with use cases including video intelligence, context-aware chatbots, personalized product recommendations, and AI-powered consumer products. In March 2025, Akamai announced an updated offering, claiming 3x higher throughput and 86% lower cost compared with traditional hyperscale infrastructure.