On Monday, Amazon said it would put up to $25 billion more into Anthropic, the maker of Claude. Some $5 billion goes in now at a $350 billion valuation, and another $20 billion follows when certain commercial milestones are hit, according to Amazon's announcement.

That takes Amazon's total commitment to Anthropic to around $33 billion when the earlier $8 billion is included.

The same announcement also said Anthropic has committed to spend more than $100 billion on Amazon Web Services over the next ten years, securing up to five gigawatts of Trainium chip capacity to train and run Claude.

In other words, Amazon invests in Anthropic, Anthropic spends the proceeds with Amazon, and Amazon books the compute revenue in its AWS segment.

A deal of the same shape was struck eight weeks ago with OpenAI, where Amazon committed $50 billion at an $840 billion valuation and OpenAI separately committed $100 billion of AWS spend over eight years. The terms were set out in OpenAI's own announcement at the time.

What to take from this...

The term the market is using is circular financing: companies using their unique positions of leverage to create value for each other.

In this case, cloud providers and chip makers take equity stakes in AI labs, the AI labs sign enormous long-term compute contracts with those same providers and the revenue from those contracts helps justify further capital expenditure and a higher share price for the investor.

Dario Amodei, CEO of Anthropic, has defended the structure publicly, saying that one player has capital and an interest in selling chips while the other is confident it will have the revenue to pay for them in time but does not yet have $50 billion in hand, so the arrangement is a rational way to bridge that gap.

The bear case is that this looks a lot like what happened in telecoms in 1999.

Lucent sold equipment to carriers and lent them the money to buy it, the carriers booked the capacity on their networks, and when end demand failed to arrive at the scale required the whole thing unwound over the following eighteen months.

The counter argument, which is the one being made inside most boardrooms at the moment, is that AI workloads are real and growing fast with one major question.

Pricing.

People are currently using AI at a huge discount to what prices will eventually become. First the AI companies have to get people hooked.

The deals Amazon, Microsoft and Nvidia have written with the major AI labs rest on three stacked assumptions.

  • First, that user acquisition continues at current pace.
  • Second, that users stay once switching costs kick in.
  • Third, that prices can eventually rise to cover true unit costs without losing those users.

The problem is that current pricing is heavily subsidised. OpenAI lost an estimated $5 billion in 2024 on revenues of $3.7 billion, which is to say it spent around $2.35 for every dollar it earned.
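The spend-per-dollar figure falls straight out of those two numbers. A quick arithmetic check, using only the reported estimates quoted above:

```python
# Reported estimates quoted above, in USD billions.
revenue = 3.7
net_loss = 5.0

# Total spend is the revenue plus the loss it failed to cover.
total_spend = revenue + net_loss            # 8.7

# Dollars spent for every dollar of revenue earned.
spend_per_dollar = total_spend / revenue    # ~2.35

print(f"${spend_per_dollar:.2f} spent per $1 earned")
```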

Consumer subscriptions at $20 a month do not cover the compute cost of a heavy user. The reason you are getting Claude and ChatGPT this cheaply is that venture capital and hyperscaler balance sheets are paying the difference, on the bet that acquisition now translates to pricing power later.

What makes this fragile is that step three has never been tested at scale in this market.

When prices start moving toward realistic levels, casual users will churn, which is why OpenAI began testing ads in February. Heavy users will restructure their workflows toward cheaper models and smarter prompting. Enterprise buyers running agentic workloads, where a single task can burn five to thirty times the tokens of a simple chat, will see total bills triple even as per-token prices fall.
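The agentic arithmetic is worth making explicit: token volume multiplying faster than per-token prices fall still pushes the total bill up. A minimal sketch with illustrative numbers only; the six-times token multiplier (within the five-to-thirty range above), the starting price, and the halving of per-token prices are all assumptions for the example:

```python
# Illustrative figures only: none of these numbers come from the deals above.
chat_tokens = 10_000    # tokens for a simple chat task (assumed)
agent_multiplier = 6    # agentic tasks burn 5-30x the tokens; assume 6x here
price_now = 10.0        # $ per million tokens today (assumed)
price_later = 5.0       # per-token price halves (assumed)

bill_chat_now = chat_tokens / 1e6 * price_now
bill_agent_later = chat_tokens * agent_multiplier / 1e6 * price_later

# Per-token prices have halved, yet the total bill is 3x the simple-chat bill.
print(f"bill ratio: {bill_agent_later / bill_chat_now:.1f}x")
```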

If demand turns out to be meaningfully elastic to price, a chunk of the five gigawatts of compute Amazon has just committed to build for Anthropic gets delivered into a market that has rationalised its usage.

The deals were signed on the premise that AI keeps acquiring users. Pricing moving to realistic levels is exactly the thing that would change demand and test whether that premise holds.

Anthropic's annualised revenue has reportedly more than tripled, from around $9 billion at the end of last year to over $30 billion now, with more than 100,000 organisations running Claude on Amazon Bedrock.

The circular money only becomes a problem if end customer demand flattens before the infrastructure commitments clear.

The question is: will that happen? In such a fast, volatile race, who knows.


Want to bring your technology costs down?

We've got an app for that.

Sourcing technology to turn strategy into reality

Proudly sponsors the Executive Summary

Find out more

It's a City CIO Club dinner this evening on AI Return on Investment. We have a lovely group of people meeting at the Walbrook Club. If you'd like to become a member of the CCC (you have to be a CIO or a CTO), please apply on the website.


This week I am helping a well-known e-commerce business to drive up its conversions through better SEO strategy, partnerships and digital experience. All of this comes back to better storytelling. If your growth has flatlined, give me a call.


Have a great Wednesday and I hope the tube strikes don't mess up your dinner party.

Dan