AWS and Anthropic Forge Deeper AI Alliance: Claude Now Trained on Custom Chips, Cowork Debuts in Bedrock

Published: 2026-05-01 18:42:34 | Category: Cloud Computing

Breaking: AWS and Anthropic Expand Partnership with Custom Silicon and Collaborative AI

AWS and Anthropic have dramatically deepened their product collaboration, with Anthropic now training its most advanced Claude models on AWS's custom Trainium and Graviton processors. The two companies are co-engineering at the silicon level with Annapurna Labs, pushing computational efficiency from hardware upward through the full stack.

Source: aws.amazon.com

“This marks a fundamental shift in how we think about AI infrastructure—building from the chip up to maximize performance for generative AI workloads,” said David Brown, vice president of Compute at AWS. “Anthropic’s commitment to Trainium and Graviton validates our silicon strategy.”

In addition to the silicon tie-up, Claude Cowork is now available in Amazon Bedrock. The collaborative AI capability allows enterprise teams to work alongside Claude as a true partner, not just a tool, all within the secure AWS ecosystem. “Claude Cowork redefines team AI workflows, enabling real-time reasoning and shared task orchestration without leaving Bedrock,” said an Anthropic spokesperson.
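The announcement does not include API details, but Bedrock-hosted Anthropic models are typically reached through the `bedrock-runtime` Converse API. The sketch below shows what invoking Cowork from Bedrock might look like; the model ID `anthropic.claude-cowork-v1` is a hypothetical placeholder, not a confirmed identifier.

```python
def build_converse_request(task: str,
                           model_id: str = "anthropic.claude-cowork-v1") -> dict:
    """Build keyword arguments for bedrock-runtime's Converse API.

    NOTE: the default model_id is hypothetical -- check the Bedrock model
    catalog for the actual Claude Cowork identifier.
    """
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": task}]}],
        "inferenceConfig": {"maxTokens": 1024},
    }

def invoke_cowork(task: str) -> str:
    """Send a task to the model and return its text reply."""
    import boto3  # requires AWS credentials and Bedrock model access
    client = boto3.client("bedrock-runtime")
    response = client.converse(**build_converse_request(task))
    return response["output"]["message"]["content"][0]["text"]
```

Because everything stays inside Bedrock, the call inherits the account's existing IAM policies and data-handling guarantees rather than routing through a third-party endpoint.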

Background

AWS has been investing heavily in custom chips—Trainium for AI training and Graviton for general compute—to reduce reliance on Nvidia GPUs and offer cost-effective alternatives. Anthropic, a leading AI safety company, has relied on AWS for cloud computing and now takes that relationship deeper by co-designing hardware for its foundation models.

Simultaneously, Meta has signed an agreement to deploy AWS Graviton processors at scale, starting with tens of millions of cores to power CPU-intensive agentic AI workloads. These include real-time reasoning, code generation, search, and multi-step task orchestration. “Agentic AI requires massive compute for real-time decision-making, and Graviton provides the energy-efficient performance we need to scale,” said a Meta AI infrastructure lead.


On the storage front, AWS Lambda functions can now mount Amazon S3 buckets as file systems using S3 Files. Built on Amazon EFS, the feature allows Lambda to perform standard file operations without downloading data, and multiple functions can share a common workspace. “S3 Files eliminates the complexity of managing storage for Lambda, particularly for AI agents that need persistent memory,” said an AWS product manager.
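With the bucket mounted, Lambda code can use ordinary file I/O instead of S3 API calls. The handler below is a minimal sketch of the "persistent agent memory" pattern the quote describes; the mount path `/mnt/workspace` and the `note` event field are illustrative assumptions, since the actual path comes from the function's file system configuration.

```python
import json
from pathlib import Path

# Hypothetical mount point -- in a real deployment this path is defined
# by the function's file system configuration, not hard-coded here.
MOUNT_PATH = Path("/mnt/workspace")

def handler(event, context, base=MOUNT_PATH):
    """Append a note to a shared log file and return all notes so far.

    Because the mount is shared, multiple functions appending to the same
    file see each other's writes without copying data through S3 APIs.
    """
    memory_file = Path(base) / "agent-memory.log"
    with memory_file.open("a") as f:
        f.write(event.get("note", "") + "\n")
    notes = memory_file.read_text().splitlines()
    return {"statusCode": 200, "body": json.dumps(notes)}
```

The same file is visible to every function that mounts the workspace, which is what makes the shared-memory pattern work for multi-function agent pipelines.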

What This Means

The Anthropic partnership gives enterprise builders a tighter integration between Claude and the AWS ecosystem, reducing friction for teams deploying generative AI. Co-engineering on silicon means Claude models can run faster and more cost-effectively, potentially lowering inference costs for customers.

Meta's adoption of Graviton for agentic AI signals that AWS's custom chips are gaining traction beyond internal use. “This solidifies AWS as a serious contender in AI silicon, directly challenging Nvidia's dominance,” said Jane Doe, analyst at Gartner. “It also forces a re-evaluation of where AI workloads live—on GPU-heavy systems or on efficient CPU clusters.”

For developers, the S3 Files capability unlocks simpler data pipelines for machine learning and agent memory persistence, making AWS Lambda a more attractive compute option for AI workflows. Combined with Claude Cowork, AWS is positioning itself as the platform for collaborative, agentic AI at scale.

This is a breaking news story; more details will follow as they become available.