Amazon (AMZN.US) invests an additional $5 billion in Anthropic to secure $100 billion cloud computing power order

On Monday local time, Amazon (AMZN.US) announced a landmark agreement with artificial intelligence company Anthropic: it will immediately invest an additional $5 billion, with up to $20 billion more to follow once commercial milestones are reached. Amazon had previously invested $8 billion in the maker of the Claude large models. In exchange, Anthropic has committed to purchasing over $100 billion of computing resources from AWS over the next ten years and to deploying Amazon's in-house Trainium AI chips at scale.

After the announcement, Amazon's stock rose nearly 3% in after-hours trading to around $250, just shy of its all-time closing high. This is a classic strategic bundle: converting capital into equity while locking in chip demand with computing power commitments. Behind this seemingly routine transaction, however, the global competition among AI model makers is undergoing an unprecedented structural shift.

Anthropic's Computing Power Crunch: The Growing Pains Behind $30 Billion in Annualized Revenue

Under the terms of the agreement, Amazon's additional $5 billion is an immediate "down payment," while the subsequent tranche of up to $20 billion is tied to whether Anthropic reaches specific commercial milestones.

Anthropic's reciprocal commitment is even more striking: it will purchase more than $100 billion in computing resources from AWS over the next ten years, spanning current and future generations of Trainium AI chips as well as tens of millions of Graviton cores. Anthropic will also gain access to up to 5 gigawatts of computing capacity for training and inference of the Claude models, a volume roughly equivalent to the power draw of about ten large data centers.

Anthropic CEO Dario Amodei said bluntly in a statement that the company needs to "build infrastructure to keep up with rapidly growing demand." That statement reflects a warning signal the entire industry has registered.

The most direct driver behind the agreement's timing is the severe computing power bottleneck Anthropic faces. According to earlier reports, Claude's service has recently suffered repeated outages, rate limits, and performance degradation, and some customers have switched to other AI platforms as a result.

Meanwhile, Anthropic's business is expanding at a startling pace. According to the latest figures disclosed by the company, its annualized revenue has exceeded $30 billion, more than tripling in just a few months from about $9 billion at the end of 2025. The core engine of this growth is its programming assistant Claude Code; more than 100,000 enterprise customers currently run Claude models on AWS's Amazon Bedrock platform.

In other words, Anthropic faces a "happy problem": revenue is outrunning computing power. If its infrastructure cannot be expanded dramatically in the short term, its steep growth curve risks being dragged down by its own server capacity. The agreement with Amazon is a fast-acting remedy for this contradiction: additional computing power will come online over the next three months, with total deployment expected to approach 1 gigawatt by the end of 2026. Large-scale Trainium 2 capacity will be released in the second quarter, Trainium 3 capacity is expected to enter service later in the year, and future Trainium 4 generations are included as procurement options.

Amazon's strategic logic: a key leap from "cloud landlord" to "chipmaker"

For Amazon, the deal means far more than a simple equity investment. The first layer of logic is validating the commercial viability of its in-house chips. AWS CEO Andy Jassy emphasized in a statement that Anthropic's commitment to run its large language models on AWS Trainium over the next ten years confirms the joint progress both parties have made in custom silicon. Trainium is Amazon's self-developed AI training chip, positioned as an alternative to Nvidia GPUs; more than 1 million Trainium 2 chips are already deployed at the Project Rainier data center in Indiana. Through the large-scale deployment of a "lighthouse customer" like Anthropic, Amazon proves to the market that its chips are not only technically viable but also commercially dependable enough to support top-tier AI models, which is strategically significant for reducing AWS's supply dependence on Nvidia.

Second, the deal locks in $100 billion of "cloud demand." Anthropic's pledged $100 billion in AWS purchases is roughly equal to AWS's full-year revenue of about $108 billion in 2025, and is locked in over ten years. With cloud computing growth slowing at the margin, this ultra-long-term contract gives AWS remarkably strong revenue visibility.
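As a rough sanity check, the scale comparison above can be sketched in a few lines. The $100 billion commitment and ten-year term come from the article; the AWS full-year revenue figure of roughly $108 billion is an assumption used here for illustration:

```python
# Back-of-the-envelope scale check for the Anthropic-AWS commitment.
# Figures: $100B total over 10 years (from the article);
# AWS full-year revenue of ~$108B is an assumed round number, not a reported one.
commitment_total = 100e9       # Anthropic's pledged AWS purchases, in USD
years = 10
aws_annual_revenue = 108e9     # assumed AWS annual revenue, in USD

annual_commitment = commitment_total / years       # spend per year
share = annual_commitment / aws_annual_revenue     # fraction of one year's AWS revenue

print(f"~${annual_commitment / 1e9:.0f}B per year, "
      f"about {share:.1%} of AWS annual revenue")
```

On these assumptions, the contract works out to about $10 billion per year, i.e. a high-single-digit share of AWS's annual revenue guaranteed for a decade.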

Third is the platform strategy of an "AI model supermarket." Amazon is building a dual-track AI strategy of "both in-house and third-party." On one hand, the company is actively developing its own Nova family of large models, already used in products such as the shopping assistant Rufus; on the other, it continues to offer external models such as Anthropic's Claude and Meta's Llama through the Bedrock platform, giving customers a one-stop AI service. The new agreement lets AWS customers access Claude models natively within AWS without managing additional credentials or billing relationships. This "model supermarket" positioning means AWS does not need to win every model competition itself; it can keep extracting profit from the prosperity of the entire AI industry.

The deep binding between Amazon and Anthropic further clarifies the global AI model competitive landscape. Amazon's $100 billion-scale agreement with Anthropic marks an upgrade of the industry's competition from "who has the better model" to "who has the more complete closed loop of computing power, chips, and ecosystem."

Under this new paradigm, computing power is not merely an R&D input but a core commercial moat. Cloud vendors with in-house chip capabilities and large-scale data center resources are being promoted from "passive computing power providers" to "active shapers of the industry landscape."

For Amazon's rival Microsoft, the exclusivity dividend is fading. The company has invested a total of about $13 billion in OpenAI since 2019, but the market's long-held assumption of Azure's exclusive priority over OpenAI's computing power is being dismantled. Amazon has pledged an investment of up to $50 billion in OpenAI, and Microsoft is even weighing legal action over it. Losing exclusive control of OpenAI's computing workloads means Azure's "halo effect" in AI infrastructure is being diluted.

Another major competitor, Google, has deep technical accumulation but an unclear commercialization path. Google DeepMind recently released Gemini 3.1, the Gemma 4 open-source series, and the scientific discovery model AlphaGenome in quick succession, and continues to lead in basic research. But in the speed of enterprise-grade AI commercialization and the depth of ecosystem binding, Google has yet to show a flywheel effect comparable to AWS's "model supermarket" or Microsoft's Copilot office suite.

With Anthropic's annualized revenue exceeding $30 billion and an IPO planned within the year, this agreement will serve as Wall Street's litmus test for the deeply bound "model company plus cloud vendor" pattern. Anthropic's current valuation of $380 billion is far below OpenAI's $852 billion, but secondary-market data show that demand for Anthropic shares has surpassed that for OpenAI, with buyers willing to pay a valuation premium for Anthropic for the first time. Iconiq Capital partner Roy Luo's remark is representative: "The market can accommodate two companies, but it is essentially a first-and-second-place pattern, and the first place will have a disproportionate advantage. We chose Anthropic."

Disclaimer: Webull uses the external vendor Google Translation Service for news translations. While we endeavour to ensure these are correct, we recommend that you double-check this information. Webull is not responsible for translation errors or issues.