Uber Is the Latest to Be Won over by Amazon’s AI Chips

TechCrunch (Main), Apr 7, 2026

Why It Matters

Uber’s adoption of AWS’s custom chips validates Amazon’s AI‑chip push and could reshape the competitive landscape between cloud providers and Nvidia. It also signals a broader industry move toward ARM‑centric, cost‑efficient AI infrastructure.

Key Takeaways

  • Uber adds Graviton ARM servers, expanding AWS compute footprint.
  • New trial of Trainium3 AI chip targets generative AI workloads.
  • Shift reduces reliance on Oracle and Google cloud services.
  • AWS AI chips challenge Nvidia's dominance in cloud AI acceleration.
  • Uber joins Anthropic, OpenAI, Apple using Amazon's custom silicon.

Pulse Analysis

Uber’s latest expansion of its Amazon Web Services contract marks a decisive step in the ride‑hailing firm’s multi‑year migration away from on‑premises data centers toward public cloud platforms. After signing large‑scale agreements with Oracle and Google in 2023, the company now leans more heavily on AWS, deploying additional Graviton ARM‑based instances to run core ride‑sharing services. The move reflects Uber’s pursuit of lower‑cost, energy‑efficient compute and its desire to standardize workloads on a single provider, simplifying operations and freeing engineering resources for product innovation.

The centerpiece of the new deal is a trial of AWS’s Trainium3 processor, Amazon’s home‑grown alternative to Nvidia’s H100 GPUs for generative‑AI inference and training. Built on a custom ASIC architecture, Trainium3 promises higher throughput per watt and tighter integration with the AWS software stack, which could translate into measurable cost savings for data‑intensive models such as demand‑forecasting and dynamic pricing. By pairing Trainium3 with Graviton CPUs, Uber creates a homogeneous ARM ecosystem that rivals the traditional x86‑Nvidia combination dominating most cloud AI workloads.

From a market perspective, Uber’s endorsement adds credibility to Amazon’s ambition to capture a larger slice of the booming AI‑chip services market, currently led by Nvidia and contested by Google’s TPU. As more high‑profile enterprises adopt in‑house silicon, cloud providers may experience a shift in pricing power and service differentiation. Investors will watch Uber’s performance metrics—latency improvements, cost per inference, and carbon footprint reductions—to gauge whether AWS’s custom silicon can deliver the promised economic advantage and accelerate the broader transition to ARM‑centric cloud infrastructures.
