The Seattle giant is investing an additional 25 billion dollars in Anthropic, but this generosity has a very specific goal. It's not about buying shares. It's about ensuring that Anthropic, creator of the Claude model and one of the strongest names in AI, trains its systems for the next ten years exclusively on American Trainium chips rather than on Nvidia processors. The stakes: breaking a monopoly that today costs every AI company a fortune.
Anatomy of the Deal
The first tranche of 5 billion dollars will land in Anthropic's account almost immediately, with the company currently valued at 380 billion dollars. That is roughly six times more than last March, when Anthropic was worth 61.5 billion. The remaining 20 billion is to be released gradually as the company hits successive "commercial goals", i.e., grows.
In return, Anthropic promises to spend over 100 billion dollars on Amazon Web Services over the next decade. Simply put: Amazon’s money will return to Amazon. This is a standard structure in today’s deals between Big Tech and AI companies — the investor de facto finances its own revenues. Microsoft did the same with OpenAI in 2023. Now Amazon is doing it on an even larger scale.
Why Trainium Chips
To understand what this deal is really about, you need to look at Nvidia. Today, anyone who wants to train a serious AI model buys Nvidia chips: H100, B200, the Blackwell line. Nvidia dictates prices, Nvidia has waiting lists, Nvidia is the most valuable company in the world. For several years, Amazon has been building its own alternative: Trainium chips, designed specifically for AI workloads by Annapurna Labs, a company Amazon acquired in 2015.
The problem: for Trainium to become a real alternative to Nvidia, it needs a serious client who will train the world’s best models on it and publicly demonstrate that it works. Anthropic is ideally suited for this. Claude competes with OpenAI’s GPT-4 and Google’s Gemini — if Anthropic shows that models of this class can be built on AWS Trainium, the rest of the market will follow suit.
The agreement gives Anthropic access to up to 5 gigawatts of compute capacity on Trainium2, Trainium3, Trainium4, and subsequent generations. By the end of 2026, almost a gigawatt of Trainium2 and Trainium3 capacity is to come online, plus tens of millions of Graviton processor cores. This is a scale that will allow training AI models of sizes no one was even talking about two years ago.
Project Rainier and the 200 Billion Bet
The joint Amazon and Anthropic data center in Indiana — Project Rainier — already hosts almost half a million Trainium2 chips. It’s one of the largest AI computing clusters in the world. In February, Amazon announced that in 2026 it would spend approximately 200 billion dollars on capital investments, most of which would be for AI infrastructure. This deal with Anthropic is part of that budget, not something in addition to it.
Andy Jassy, Amazon CEO, stated that “Anthropic’s commitment to using AWS Trainium for the next decade reflects the progress we’ve made together in designing custom chips.” In other words: test with us. We’ll make you better chips. Trust us. Dario Amodei, Anthropic CEO, responded in the same vein — the partnership will allow the company to “continue working on models that are both the most powerful and the most trusted in the world.”

What This Changes for Business
For the average Claude user — nothing. The model will continue to work the same way. But for companies building AI products, a lot is changing. Over 100,000 AWS customers already use Claude in Amazon Bedrock today. Now Claude Platform will be available directly from an AWS account — without additional logins, separate agreements, or invoices. For companies that already have Amazon’s cloud, Claude becomes as natural as launching another virtual machine.
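For developers, "available directly from an AWS account" means Claude can be called like any other AWS service. A minimal sketch with boto3's Bedrock Converse API is below; the model ID and region are assumptions (check the Bedrock console for the identifiers enabled in your account), and a live call requires AWS credentials with Bedrock access.

```python
# Sketch: calling Claude on Amazon Bedrock via boto3's Converse API.
# MODEL_ID is an assumed example identifier; verify availability in your account.
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"

def build_messages(prompt: str) -> list:
    """Shape a user prompt into the Bedrock Converse API message format."""
    return [{"role": "user", "content": [{"text": prompt}]}]

def ask_claude(prompt: str, region: str = "us-east-1") -> str:
    """Send one prompt to Claude on Bedrock and return the text reply.
    Requires AWS credentials with Bedrock access configured locally."""
    import boto3  # third-party: pip install boto3

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(
        modelId=MODEL_ID,
        messages=build_messages(prompt),
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

# Usage (needs credentials):
#   print(ask_claude("Summarize the Amazon-Anthropic deal in one sentence."))
```

The point of the integration is exactly this: no separate Anthropic account, API key, or invoice; billing and permissions flow through the existing AWS setup.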
For Polish tech companies operating in America and Polish software houses working with American clients, this is a clear signal. AI tools are becoming cheaper, more accessible, and integrated into existing infrastructure. The barrier to entry is falling. Competition is growing. Anyone not yet experimenting with Claude in their stack is falling behind.
Independence That OpenAI Lacked
There’s one detail in this agreement that is rarely mentioned in the main narrative: Anthropic has maintained its independence. Amazon’s stake in Anthropic is limited to below 33%. Google, which also invested 3 billion dollars in Anthropic, has similar restrictions on voting rights. None of the big tech companies control Anthropic independently.
For comparison: Microsoft has practically absorbed OpenAI. Sam Altman was briefly ousted and then reinstated, the board was restructured, and the company's strategy is now closely coordinated with Microsoft. Anthropic, at least on paper, has avoided that fate. In a world where AI is becoming strategic infrastructure for the entire West, such independence has its price. And its significance.
Arthur Skok, poland.us
Amazon × Anthropic, April 20, 2026
New investment: 5 billion USD + up to 20 billion USD (25 billion USD total)
Total Amazon investment in Anthropic: 33 billion USD
Anthropic valuation: 380 billion USD (6x in one year)
Anthropic's commitment to AWS: 100+ billion USD over 10 years
Compute capacity: up to 5 GW of Trainium chips
Claude customers on AWS Bedrock: 100,000+
Project Rainier (Indiana): approx. 500,000 Trainium2 chips
Amazon CAPEX 2026: 200 billion USD
Based on Amazon and Anthropic press releases and reports from CNBC, Bloomberg, Reuters (April 20, 2026)