Opinion & Analysis

2026.18: Long-term, Peripheral & Myopic Visions

Amazon bets big on inference with Trainium, adds OpenAI models to AWS.

Deep Dive

Amazon's AI strategy has evolved dramatically. Two years ago, when training dominated AI infrastructure, Amazon seemed behind. But the company—through a mix of vision and good fortune—is now well-positioned for the inference era. Its custom chip, ironically named Trainium, is proving ideal for running AI models rather than training them. Now, Amazon is doubling down by adding OpenAI's models to AWS and collaborating on Bedrock Managed Agents, a new enterprise product that combines AWS's cloud reach with OpenAI's frontier models.

This partnership signals a major shift in the AI landscape. While Microsoft has long been OpenAI's primary cloud partner, AWS's addition of OpenAI models gives enterprises more flexibility. The Bedrock Managed Agents product lets companies deploy AI agents that can take actions across workflows. For Amazon, this validates its bet that inference—not just training—will drive long-term AI demand. For OpenAI, it opens access to AWS's massive enterprise customer base. The move also underscores Amazon's pragmatic approach: rather than relying solely on its own AI models, it is positioning AWS as a neutral platform for multiple AI providers.

Key Points
  • Amazon's Trainium chip, originally for training, is now optimized for AI inference workloads as the market shifts.
  • OpenAI and AWS are launching Bedrock Managed Agents, a new enterprise product combining frontier models with AWS infrastructure.
  • The partnership gives enterprises access to OpenAI models on AWS, ending Microsoft's once-exclusive cloud relationship with OpenAI.

Why It Matters

Enterprise AI is moving to inference and agents; Amazon's platform play with OpenAI could reshape cloud AI competition.