In a landmark move, Anthropic unveiled its $50 billion US data center plan, a strategic investment designed to power future generations of its Claude AI models and escalate the AI infrastructure arms race. The private AI lab’s multi-year initiative will establish specialized facilities in Texas and New York to meet the intensive compute demands of its next-generation AI.
Announced on November 12, 2025, the investment positions Anthropic in an elite group of companies committing billions to proprietary hardware infrastructure, reducing its reliance on third-party cloud providers.
Custom steel for custom silicon
Anthropic is constructing two specialized, liquid-cooled data centers, one in Texas and one in New York, to support its next-generation AI. The multi-year project involves building custom facilities with high-density GPU clusters, giving the company direct control over the hardware needed to train increasingly large Claude models.
To execute this vision, Anthropic has contracted London-based infrastructure specialist Fluidstack to build and operate the initial campuses. The facilities will be engineered for performance, with the Fluidstack blog detailing high-density, liquid-cooled GPU racks designed to exceed 40 kW each. This design anticipates industry trends in which liquid cooling becomes essential as AI rack power approaches 50 kW.
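To make the density figures concrete, the sketch below estimates how many racks a training cluster needs at a given per-rack power budget. The 100 MW cluster size is a hypothetical for illustration, not a disclosed specification.

```python
import math

# Back-of-envelope: racks required to host a GPU cluster's IT load
# at a fixed per-rack power budget. All inputs are illustrative.

def racks_needed(total_it_load_kw: float, rack_capacity_kw: float = 40.0) -> int:
    """Number of racks required for a given IT load, rounding up."""
    return math.ceil(total_it_load_kw / rack_capacity_kw)

# A hypothetical 100 MW training cluster at 40 kW per rack:
print(racks_needed(100_000))        # 2500 racks
# At a denser 50 kW per rack, the same load fits in fewer racks:
print(racks_needed(100_000, 50.0))  # 2000 racks
```

The arithmetic shows why density matters: moving from 40 kW to 50 kW per rack cuts the rack count, and the floor space and cabling that go with it, by a fifth for the same compute.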
CEO Dario Amodei emphasized that owning the facilities will enable the training of larger, safer models while giving researchers precise control over latency and data sovereignty. While Anthropic will maintain its use of public clouds, this owned infrastructure is deemed critical for future Claude versions that could demand tens of exaFLOPS in computing power.
Economic ripple effects
The plan received strong support from state officials, who highlighted the creation of an estimated 3,200 jobs. This includes 800 permanent positions averaging nearly $144,000 in salary and an initial 2,400 construction roles, according to a report from ABC News. Local economies also anticipate significant secondary benefits for suppliers and service providers.
Key project projections include:
- 3.8 million square feet of new industrial space.
- A combined electrical load of 1.1 gigawatts upon full operation in 2026.
- 30% of power from renewable sources, supplemented by on-site battery storage.
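The 1.1-gigawatt figure can be put in perspective with a rough annual-energy calculation. The sketch assumes continuous full-load operation (8,760 hours per year), which real facilities never sustain, so it is an upper bound only.

```python
# Upper-bound annual consumption for the projected 1.1 GW combined load,
# with the stated 30% renewable share. Continuous full-load operation is
# an illustrative assumption, not a projection from the announcement.

HOURS_PER_YEAR = 8760
load_gw = 1.1
renewable_share = 0.30

annual_twh = load_gw * HOURS_PER_YEAR / 1000   # GWh -> TWh
renewable_twh = annual_twh * renewable_share

print(f"Upper-bound annual consumption: {annual_twh:.2f} TWh")
print(f"Of which renewable (30%): {renewable_twh:.2f} TWh")
```

Even as an upper bound (~9.6 TWh per year), the total underscores why the sites were matched to ERCOT wind and Buffalo-area hydropower.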
The larger campus will be in Texas, capitalizing on its wind energy resources and connection to the ERCOT grid. The New York site, located near Buffalo, will leverage the region’s established hydropower capacity.
Context within the AI compute race
While major cloud providers are also expanding, analysts note that Anthropic’s investment per model is among the highest in the industry. This spending aligns with projections that AI will drive one-fifth of the world’s data center electricity consumption by 2026, which is expected to reach 800 TWh globally. By building its own energy-efficient facilities, Anthropic aims for a Power Usage Effectiveness (PUE) of 1.2, significantly better than the industry average of 1.58.
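PUE is defined as total facility power divided by IT power, so the gap between 1.2 and the cited 1.58 average translates directly into avoided overhead. The 800 MW IT load in the sketch is a hypothetical chosen for illustration.

```python
# PUE (Power Usage Effectiveness) = total facility power / IT power.
# Compare the targeted PUE of 1.2 against the cited 1.58 industry average
# for the same hypothetical IT load; the 800 MW figure is an assumption.

def facility_power_mw(it_load_mw: float, pue: float) -> float:
    """Total facility draw (IT load plus cooling/power overhead)."""
    return it_load_mw * pue

it_load = 800.0  # MW, hypothetical
at_target = facility_power_mw(it_load, 1.2)    # 960 MW total
at_average = facility_power_mw(it_load, 1.58)  # 1264 MW total
savings = at_average - at_target               # overhead avoided

print(f"Overhead avoided: {savings:.0f} MW ({savings / at_average:.1%} of total)")
```

At that scale, the better PUE avoids roughly 300 MW of overhead, nearly a quarter of what an average-efficiency facility would draw for the same compute.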
Although the development roadmap for Claude is not public, company executives have suggested the new data centers will enable training for models with over 10 trillion parameters. The first hardware is scheduled for delivery in early 2026, with capacity increasing quarterly.
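To give the 10-trillion-parameter figure some scale, the sketch below estimates the memory footprint of the weights alone. The 2-bytes-per-parameter assumption (fp16/bf16 storage) is illustrative; optimizer state and activations during training multiply this several-fold.

```python
# Illustrative memory footprint of a 10-trillion-parameter model.
# Assumes 2 bytes per parameter (fp16/bf16 weights); training-time
# optimizer state and activations are not included.

params = 10e12
bytes_per_param = 2
weights_tb = params * bytes_per_param / 1e12  # decimal terabytes

print(f"Weights alone: {weights_tb:.0f} TB")  # 20 TB
```

Twenty terabytes of weights cannot fit on any single accelerator, which is one reason training at this scale demands the tightly interconnected, purpose-built clusters the new facilities are designed around.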
This $50 billion initiative is a long-term wager on achieving superior AI performance through purpose-built infrastructure. While investors are optimistic, the project will face regulatory scrutiny over its environmental and grid impact. Ultimately, Anthropic’s move confirms that the future of advanced AI depends as much on custom hardware as it does on algorithms.