Anthropic and Google Cloud: Access to Up to One Million TPUs for Claude

Anthropic, developer of the Claude AI models, has announced a historic expansion of its strategic partnership with Google Cloud. Anthropic will gain access to an unprecedented number of Google Cloud TPU (Tensor Processing Unit) chips and services to train and host the next generations of its advanced AI models.

This agreement represents the largest expansion of Anthropic’s TPU footprint to date. The company will have access to up to one million TPU chips, which is expected to bring “well over a gigawatt of capacity online in 2026,” according to the announcement. The deal, estimated to be worth tens of billions of dollars, will give Anthropic the computing infrastructure it needs to lead AI development in the coming years.

“Anthropic’s choice to significantly expand its usage of TPUs reflects the strong price-performance and efficiency its teams have seen with TPUs for several years,” said Thomas Kurian, CEO of Google Cloud.

Anthropic now serves more than 300,000 business customers, and its number of large accounts—customers that each represent more than $100,000 in run-rate revenue—has grown nearly 7x in the past year. The expansion will help the company serve this rapidly growing customer demand, and the additional computational resources will also power more thorough testing, alignment research, and responsible deployment at scale.

“Anthropic and Google have a longstanding partnership and this latest expansion will help us continue to grow the compute we need to define the frontier of AI,” said Krishna Rao, CFO of Anthropic. “Our customers—from Fortune 500 companies to AI-native startups—depend on Claude for their most important work, and this expanded capacity ensures we can meet our exponentially growing demand while keeping our models at the cutting edge of the industry.”

The partnership, announced in 2023, has enabled Anthropic to offer its models through Google Cloud’s Vertex AI and the Google Cloud Marketplace, where thousands of companies, including Figma and Palo Alto Networks, already use Claude.
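
For developers, the practical upshot of the Vertex AI availability is that Claude can be called through the Anthropic SDK’s Vertex client. The sketch below is a minimal illustration, assuming a Google Cloud project with Vertex AI enabled and the anthropic Python package installed with its Vertex extra; the project ID, region, and model identifier are placeholders to replace with values from your own environment and the Vertex AI Model Garden.

    # Minimal sketch: calling Claude on Vertex AI with the Anthropic Python SDK.
    # Assumes: pip install "anthropic[vertex]" and Google Cloud credentials
    # (e.g. set up via `gcloud auth application-default login`).
    from anthropic import AnthropicVertex

    client = AnthropicVertex(
        project_id="your-gcp-project-id",  # placeholder GCP project ID
        region="us-east5",                 # a region where Claude models are offered
    )

    message = client.messages.create(
        model="claude-sonnet-4-5@20250929",  # example ID; check Model Garden for current versions
        max_tokens=512,
        messages=[{"role": "user", "content": "Summarize our Q3 support tickets in three bullet points."}],
    )
    print(message.content[0].text)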

“We are continuing to innovate and drive further efficiencies and increased capacity of our TPUs, building on our already mature AI accelerator portfolio, including our seventh-generation TPU, Ironwood,” added Thomas Kurian.
