
The $700 Billion Problem 

Victoria · 5 min read

Tech companies plan to spend $700 billion on AI infrastructure in 2026. Half of it is going nowhere.

According to Sightline Climate's latest outlook, nearly 50% of the 16 gigawatts of US data center capacity scheduled to come online this year will either be delayed or cancelled outright. Only 5 GW is actually under construction. The rest exists in press releases, investor decks, and planning applications gathering dust in county offices.

The traditional bottlenecks for data centers are well known: power grid readiness, a shortage of domestic manufacturing capacity for large electrical transformers, and communities fighting back.

That last risk, communities fighting back, became much more real last week.

Until last week, no US state had ever enacted a blanket ban on data center construction. That changed on April 14, 2026, when Maine became the first to pass a statewide moratorium, blocking any new facility drawing 20 MW or more of power until November 2027. Maine is not an outlier. It is the first domino.

Fourteen other states now have active moratorium or restriction bills moving through their legislatures. Vermont has proposed going further: its bill would freeze new AI data center development through July 2030, a four-year pause. The motivation in Maine and Vermont is the same: electricity bills. One Bloomberg analysis found that some communities have seen monthly power prices rise by 267% since 2020. When residents see that number on their bills, the AI revolution feels less like progress and more like someone else's problem landing on their doorstep.

Who Wins From This?

Not everyone loses when data centers stall. The constraints are quietly reshaping the competitive landscape.

Hyperscalers (Microsoft, Google, Amazon) are net beneficiaries in the short term. Their existing facilities — permitted years ago — are insulated. As new supply stalls, their capacity becomes scarcer and more valuable. Every cancelled competitor project is an effective price increase for their cloud services.

Crusoe and SB Energy (companies that generate their own power from flared gas, wind, and natural gas plants) bypass the grid interconnection queue entirely. In a world where waiting for the grid can take a decade, owning your energy source is the most valuable infrastructure moat in AI. It is not a coincidence that these are the companies actually delivering on schedule while others stall.

Chip efficiency plays get a structural tailwind. Maine's 20 MW threshold won't catch a Cerebras inference rack. Etched's transformer ASIC does the work of an 8×H100 GPU cluster in a fraction of the power draw. Every kilowatt saved per unit of compute makes the permitting math easier. The "do more with less power" pitch has graduated from marketing to survival.
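To make that permitting math concrete, here is a minimal back-of-envelope sketch. The 20 MW figure is Maine's moratorium threshold as described above; the per-rack power draws are purely illustrative assumptions, not vendor specifications.

```python
# Back-of-envelope sketch: how much compute fits under a 20 MW site cap.
# The 20 MW figure is Maine's moratorium threshold; the per-rack power
# draws below are illustrative assumptions, not vendor specifications.

SITE_CAP_MW = 20  # facilities drawing this much or more are blocked under the moratorium

def racks_under_cap(rack_kw: float, cap_mw: float = SITE_CAP_MW) -> int:
    """Number of racks that fit under a site-level power cap (ignoring cooling overhead)."""
    return int(cap_mw * 1_000 // rack_kw)

if __name__ == "__main__":
    scenarios = {
        "assumed 120 kW GPU training rack": 120.0,
        "assumed 30 kW inference-optimised rack": 30.0,
    }
    for label, kw in scenarios.items():
        print(f"{label}: {racks_under_cap(kw)} racks under a {SITE_CAP_MW} MW cap")
```

Under those assumed numbers, a fourfold efficiency gain per rack translates directly into four times more deployable compute before a project trips the threshold.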

Neoclouds face the most direct risk. They lease existing capacity and have no owned energy advantage. As new supply stalls and existing inventory gets absorbed by hyperscalers, their expansion pipeline — and with it their growth story — is directly constrained.

The Cerebras Angle

Speaking of beneficiaries — the timing of last week's news is hard to ignore.

OpenAI has agreed to pay Cerebras more than $20 billion over three years to use servers powered by its wafer-scale chips — double the figure previously associated with the deal. As part of the agreement, OpenAI will receive warrants for a minority stake in Cerebras, with that ownership growing as it spends more. OpenAI has also agreed to provide Cerebras with around $1 billion to fund the development of the data centers that will run its AI products.

This deal makes more sense when you understand what OpenAI is actually trying to solve. Both OpenAI and Anthropic are training larger models that consume enormous compute. They are also serving more customers than ever before. The solution is to bring on as much compute as possible that is optimised for inference, freeing up GPUs for training. Cerebras chips are purpose-built for exactly that. So are Google's TPUs, which is why Anthropic is lining up a million of them — a chip line, in Google's own words, "purpose-built to power thinking, inferential AI models at scale."

Then, on Friday, Cerebras filed to go public.

The filing showed $510 million in revenue in 2025 — up 76% year-on-year — with reported net income of $237.8 million. The asterisk: strip out a one-time accounting gain and the non-GAAP net loss was $75.7 million. The company is targeting a raise of more than $3 billion at a valuation of at least $35 billion — a 60% premium to its last private round of $22 billion in February.

The $25 billion in remaining performance obligations on the balance sheet tells the real story. Most of it doesn't start flowing until 2028. Cerebras is not a 2026 company. It is a bet on where AI infrastructure is heading — and on who survives the wall that everything else is currently hitting.

