
Rather than measure the funding rounds and valuations of AI startups building foundation models in US dollars, perhaps we should just convert that money right into GPU-hours rented to train models, since such rentals account for the vast majority of the spending that OpenAI, Anthropic, and a handful of others do in their day-to-day operations.
Anthropic is the latest to haul in a massive amount of dough to support its AI training habit, announcing that the latest tranche of its Series E round brought in $3.5 billion from nine different investors, with the lead investor being Lightspeed Venture Partners, which itself kicked in $1 billion.
It is interesting how hard it is to dig out financial information on startups, and how obtuse and obscure companies can be about this, even with the help of sites like Crunchbase, whose funding database has been infused with AI to answer more questions about startups. (Which we find annoying, to tell the truth. We just want data, thank you.) All startups heading towards an initial public offering are terrified of being identified with a “down round,” meaning that the amount of money they raised this time was lower than the last time they shook the coffee cup on the street corner to get investors. There are weird multi-year things that are called a single “series” of investment that, we think, look dubious, and that is because no startup wants to get too far past Series E before going public. If it does, it looks like it took too long.
Having said all that, getting from founding to IPO is incredibly hard for any startup, and we think perhaps harder than coming up with an idea and getting the first product out the door. So we have respect for the tenacity this takes.
So about how many hours of GPU compute is that $3.5 billion worth?
Consider a Microsoft Azure ND H100 v5 instance, which has eight Nvidia “Hopper” H100 GPU accelerators, eight 400 Gb/sec InfiniBand ports, and a pair of 48-core “Sapphire Rapids” fourth-generation Xeon processors from Intel as the host, with 2 TB of memory and 28 TB of storage. Chop that by a factor of eight – one GPU, a dozen CPU cores, 240 GB of main memory, and 3.5 TB of flash – and chop the $98.32 per hour on-demand price of that Azure H100 instance by eight to get $12.29 per hour, and you have a base unit of AI compute as a reference. So the $3.5 billion gets you 284.78 million GPU-hours of H100 compute.
By the way, Amazon Web Services charges the exact same $98.32 per hour for a p5.48xlarge instance with eight H100 GPUs, although its instances have 800 Gb/sec networking and offer 30.7 TB of flash. It’s close enough for the point we are making.
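Here is a minimal sketch of that conversion in Python, assuming the $98.32 per hour on-demand price for an eight-GPU instance quoted above (the prices come from the article, not from a current cloud price list):

```python
# Back-of-the-envelope conversion from dollars to H100 GPU-hours,
# using the on-demand prices quoted above (a sketch, not a price quote).

EIGHT_GPU_INSTANCE_PRICE = 98.32                    # $/hour for an eight-H100 instance
PRICE_PER_GPU_HOUR = EIGHT_GPU_INSTANCE_PRICE / 8   # ~$12.29 per H100 GPU-hour

def dollars_to_gpu_hours(dollars: float) -> float:
    """Convert a pile of money into H100 GPU-hours at the on-demand rate."""
    return dollars / PRICE_PER_GPU_HOUR

series_e = 3.5e9   # the $3.5 billion Series E round
print(f"${series_e / 1e9:.1f}B buys {dollars_to_gpu_hours(series_e) / 1e6:.2f} million GPU-hours")
# -> $3.5B buys 284.78 million GPU-hours
```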
What we really wanted to know, in addition to this figure, was how much money Anthropic has raised to date, what its valuation is and how it has changed over time, and how many H100-equivalent GPU-hours all of this was worth in Hoprcoin.
- Dustin Moskovitz, a co-founder of Facebook (now Meta Platforms), and Jaan Tallinn, a co-founder of Skype, were among the seed investors in Anthropic and also led the $124 million Series A round in May 2021; the company was valued at $550 million ahead of this investment.
- The Series B round in April 2022, which weighed in at $580 million, was led by none other than Sam Bankman-Fried and Caroline Ellison of FTX fame, and Anthropic was valued at $4 billion at the time. (Interestingly, James McClave of trading firm Jane Street also participated in the Series A and B rounds, and is big into cryptocurrency, as SBF and company were.)
- In May 2023, a $450 million Series C bag of cash was gathered up, led by Spark Capital. Google also kicked in $2 billion to get up to a 10 percent stake in Anthropic in 2023. In late 2023, Amazon kicked in $4 billion (paid in two pieces) and the company’s valuation was $18.1 billion by early 2024.
- In February 2024, between the two Amazon tranches, Menlo Ventures led a $750 million Series D round.
- And here, in March 2025, there is a $3.5 billion Series E investment led by Lightspeed, as mentioned above, with contributions from Bessemer Venture Partners, Cisco Investments, D1 Capital Partners, Fidelity Management & Research Company, General Catalyst, Jane Street, Menlo Ventures, and Salesforce Ventures. This Series E round has been interspersed with a $1 billion corporate round from Google a few weeks ago and an $884 million secondary market offering in May 2025.
By our investigative work and math, Anthropic has raised $12.4 billion in cash, which is an enormous amount of money, and which works out to 1,009 million aggregate Hoprcoin GPU-hours on the public cloud.
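A quick tally of the disclosed rounds listed above reproduces that total and the aggregate GPU-hour count; this is a sketch that leaves out the unspecified seed money and the secondary offering, which is not new capital:

```python
# Sum the disclosed Anthropic rounds (in billions of dollars) and convert
# to H100 GPU-hours at $12.29 per GPU-hour; seed money and the secondary
# offering are excluded, so this is an approximation.

ROUNDS = {
    "Series A (May 2021)":            0.124,
    "Series B (Apr 2022)":            0.580,
    "Series C (May 2023)":            0.450,
    "Google (2023)":                  2.0,
    "Amazon (late 2023)":             4.0,
    "Series D (Feb 2024)":            0.750,
    "Google corporate round (2025)":  1.0,
    "Series E (Mar 2025)":            3.5,
}

total_billions = sum(ROUNDS.values())        # ~$12.4 billion
gpu_hours = total_billions * 1e9 / 12.29     # H100 GPU-hours at the on-demand rate
print(f"Total raised: ${total_billions:.1f}B -> {gpu_hours / 1e6:,.0f} million GPU-hours")
# -> Total raised: $12.4B -> 1,009 million GPU-hours
```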
After the Series E round of a total of $3.5 billion, Anthropic has a post-money valuation of $61.5 billion. That’s not too shabby for a company that had on the order of $100 million of revenues in 2023, a run rate of $1 billion as it exited 2024, and growth of about 30 percent so far in 2025.
We can’t wait to read the S-1 document when Anthropic goes public. No word on when that might be. But Series E is when people start to wonder. Series F is when they start to worry.
Viewed through a telescope, the overcapitalization of AI startups looks almost identical to the overcapitalization of the dot-coms at the end of the 1990s.
Amazon thrived because it rented out excess CPU cycles on the same infrastructure needed to run its online retail business. Should one look for an AI startup whose business model includes renting out surplus GPU cycles to the public?
I think so:
https://www.nextplatform.com/2024/11/25/anthropic-and-openai-show-why-musk-should-build-a-cloud/
I think inevitably, Nvidia will have to as well. And it best get going.
https://www.databricks.com/company/newsroom/press-releases/databricks-raising-10b-series-j-investment-62b-valuation what about Series J?!
Sounds pretty far out there in the alphabet to me. . . .
Anthropic’s post-money valuation of $61.5B is astonishing, but OpenAI’s recent post-money valuation of $157B is even more insane. There have been valuation bubbles before, but the size of the current AI valuation bubble is unprecedented. The biggest dot-com flop was Webvan.com, which had a peak valuation of $8B before it went bankrupt less than two years later. Pets.com had a peak valuation of $1.2B before it went bankrupt nine months later.
In October, OpenAI raised $6.6B at a valuation of $157B. That wasn’t insane enough, so now OpenAI is raising $40B more at a valuation of $300B, according to the Wall Street Journal. To justify a $300B valuation at a typical 20X to 30X earnings multiple, OpenAI would need about $10B to $15B per year in earnings. Instead, OpenAI lost $5B in 2024 and is projected to lose $14B per year by 2026. OpenAI’s CEO, Sam Altman, says they will “probably” develop AGI by 2029. I’m amazed that “probably” developing AGI by 2029 is good enough for savvy investors, like Vinod Khosla, who invested in OpenAI at a valuation of $157B.
Pets.com went bankrupt, but Chewy.com, which is in the same business, currently has a market cap of $14.6B. Investors seem to be betting that Anthropic and OpenAI will grow like Amazon. Anthropic and OpenAI could end up like Pets.com if some other company develops a better AGI. Some competitors are Safe Superintelligence Inc. (the company of Ilya Sutskever, OpenAI’s former chief scientist), DeepSeek, and xAI.
To reduce the cost of large language model inference, there needs to be a new generation of chips using High Bandwidth Flash, and there need to be multiple sources for High Bandwidth Flash. High Bandwidth Flash is a stack of flash chips with an HBM interface. See pages 97 to 104 of the presentation linked below.
https://documents.sandisk.com/content/dam/asset-library/en_us/assets/public/sandisk/corporate/Sandisk-Investor-Day_2025.pdf#page=97
Agreed, and interesting on HBF. I will take a look.