The grid is losing
The number everyone should know is this: the United States faces a net power shortfall of 9 to 18 gigawatts for data centers through 2028, a 12% to 25% deficit in the electricity needed to run the AI buildout, and some projections put the gap far higher. Not a rounding error, not a temporary blip. A structural crisis. We've spent the last two years talking about chip shortages, GPU allocations, and model architectures. But the real bottleneck has quietly shifted. The constraint on AI isn't silicon anymore. It's electricity. And unlike chips, you can't fab more of it in 18 months.
The numbers behind the gap
Morgan Stanley Research projects that U.S. data center demand could reach 74 GW by 2028, with a projected shortfall of roughly 49 GW in available power access. A Department of Energy-backed study from Lawrence Berkeley National Laboratory paints a similar picture: by 2028, data centers could draw between 74 and 132 gigawatts, consuming 6.7% to 12% of total U.S. electricity. Globally, the International Energy Agency forecasts data center electricity consumption will double to around 945 TWh by 2030, growing at 15% per year, more than four times faster than electricity consumption from all other sectors combined. These aren't speculative projections from AI optimists. They're coming from energy analysts, government labs, and the utilities that actually have to deliver the power. And they're all saying the same thing: the grid can't keep up.
The insane economics driving the buildout
To understand why the buildout won't slow down despite the power crunch, look at the economics. Morgan Stanley describes an emerging "15-15-15" dynamic taking hold across the data center sector: 15-year leases guaranteed by hyperscalers, generating 15% unlevered free cash flow yields, resulting in $15 per watt of net value creation. Large technology companies are committing more than $1 trillion in spending in the 2025 to 2026 period alone. Google, Meta, and Amazon are projected to spend a combined $325 billion on capital expenditures this year, most of it directed at data centers. The financial incentives are extraordinary. The physics, however, doesn't care about your cap table. When returns are this attractive and lease commitments this long, developers will do whatever it takes to get power, including things that would have been unthinkable just two years ago.
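The "15-15-15" arithmetic can be made concrete with a back-of-envelope sketch. The capacity, build cost, and yield inputs below are illustrative assumptions, not figures from Morgan Stanley's actual model:

```python
# Back-of-envelope sketch of the "15-15-15" dynamic described above.
# All inputs are illustrative assumptions, not figures from any report.

capacity_w = 100e6   # a hypothetical 100 MW data center, in watts
capex_per_w = 10.0   # assumed all-in build cost, dollars per watt
fcf_yield = 0.15     # 15% unlevered free cash flow yield on capex
lease_years = 15     # 15-year hyperscaler lease

capex = capacity_w * capex_per_w
annual_fcf = capex * fcf_yield
lifetime_fcf = annual_fcf * lease_years

# Undiscounted net value created over the lease, per watt of capacity.
value_per_w = (lifetime_fcf - capex) / capacity_w
print(f"capex: ${capex / 1e9:.1f}B, annual FCF: ${annual_fcf / 1e6:.0f}M")
print(f"undiscounted net value: ${value_per_w:.2f} per watt")
```

Under these assumed inputs the sketch lands at $12.50 per watt undiscounted, in the neighborhood of the $15 figure; real underwriting would discount the cash flows and net out operating costs, but the shape of the incentive is clear either way.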
Companies aren't waiting for the grid
By the end of 2025, an estimated 39% of the gas power capacity under development in the United States was designed to serve data centers on-site. That's up from 5% at the end of 2024. The shift happened almost overnight. Tech companies are building private power plants next to their data centers, firing up natural gas turbines, deploying fuel cells, and constructing behind-the-meter generation systems to bypass grid interconnection queues that can stretch to seven years. The New York Times reported that going off-grid was "no one's first choice," since off-grid power generally costs more and is less efficient than utility-scale generation. But when permitting timelines stretch past seven years in some states, developers are choosing speed over efficiency. Fuel cells are emerging as a serious option. They convert natural gas directly into electricity without combustion, achieving 15% to 20% higher efficiency than open-cycle gas turbines. They're modular, so operators can start small and scale as demand grows. Goldman Sachs has highlighted fuel cells as a key behind-the-meter technology for meeting incremental data center demand. This is the "move fast and break things" era for energy, and it should concern anyone who cares about emissions targets. These are less-efficient, higher-emission ways of generating power. But the AI buildout isn't waiting for clean alternatives.
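The efficiency gap translates directly into fuel burned per unit of electricity. A minimal sketch, reading the 15-to-20-point gap as percentage points and using assumed efficiencies (roughly 38% for an open-cycle turbine, mid-50s for a fuel cell); these are illustrative values, not vendor specs:

```python
# Fuel required per MWh of electricity under assumed conversion efficiencies.
# Both efficiency values are illustrative assumptions, not measured specs.

def gas_input_mwh(output_mwh, efficiency):
    """Thermal energy input needed to produce a given electrical output."""
    return output_mwh / efficiency

turbine_eff = 0.38    # assumed open-cycle gas turbine electrical efficiency
fuel_cell_eff = 0.55  # assumed fuel cell efficiency, ~17 points higher

for name, eff in [("open-cycle turbine", turbine_eff), ("fuel cell", fuel_cell_eff)]:
    print(f"{name}: {gas_input_mwh(1.0, eff):.2f} MWh of gas per MWh of electricity")

# Less fuel in means proportionally less gas burned (and roughly less CO2).
savings = 1 - turbine_eff / fuel_cell_eff
print(f"fuel savings per MWh: {savings:.0%}")
```

Under these assumptions a fuel cell burns about 30% less gas for the same output, which is why behind-the-meter operators take them seriously even though both paths still run on fossil fuel.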
Bitcoin mines become AI factories
One of the more remarkable pivots happening right now is the conversion of Bitcoin mining facilities into AI compute centers. The economics make it straightforward: Bitcoin miners already have the power infrastructure, the cooling systems, and the real estate. What they increasingly lack is profitability. The 2024 Bitcoin halving forced a reckoning. Miners had to ask whether mining Bitcoin was still the best use of their power capacity, especially when high-performance computing contracts offer more stable, dollar-denominated returns. Analysts at CoinShares project that, for companies that have secured AI contracts, Bitcoin mining's share of total revenue could plummet from around 85% in early 2025 to less than 20% by the end of 2026. The transition is already well underway. Wired reported that America's biggest Bitcoin miners are "transforming their data centers into AI factories." Companies like Hut 8 and HIVE are swapping out racks of Bitcoin mining ASICs and replacing them with NVIDIA GPUs. HIVE's facility in Boden, Sweden, originally built for Bitcoin mining, is being converted into a Tier III, liquid-cooled AI data center designed for training and inference workloads. Reuters estimated that 20% of Bitcoin miner power capacity could pivot to AI by the end of 2027. BitGo summed up the shift: miners are moving "from commodity production to industrial real estate," exchanging volatile Bitcoin-linked revenue for fixed rental yields.
Orbital data centers: crazy or inevitable?
In March 2026, Blue Origin filed an application with the FCC for "Project Sunrise," a constellation of up to 51,600 satellites designed to perform advanced computation in orbit. The filing describes the system as a way to "ease mounting pressure on U.S. communities and natural resources by shifting energy and water-intensive compute away from terrestrial data centers." It sounds like science fiction. But Blue Origin isn't alone. SpaceX and Starcloud have filed similar proposals. When three of the most well-funded space companies on the planet are independently pursuing the same idea, it stops being a thought experiment and starts looking like a hedge against terrestrial power limits. The logic is straightforward: space offers unlimited solar energy, passive cooling through radiative heat dissipation, and no permitting battles with local communities. The challenges are enormous: launching tens of thousands of satellites, maintaining them in orbit, and dealing with latency. But the fact that this is being seriously pursued tells you something about how constrained the ground-based options have become.
The Southeast Asia question
For anyone watching from Singapore or the broader Southeast Asian region, the power constraint carries extra weight. Singapore is a tiny island with a small grid, tropical cooling costs that push power usage effectiveness ratios higher, and near-total dependence on imported energy. Vacancy rates for data centers sit below 1.4%. Daytime temperatures hit 33°C with humidity above 80%, among the worst conditions for data center cooling on Earth. Singapore imposed a moratorium on new data center builds in 2019, partially lifting it in 2022. In late 2025, the government announced DC-CFA2, allocating at least 200 MW of new capacity with a mandatory requirement that operators source 50% of power from renewable energy. It's the most aggressive sustainability mandate for data centers in Asia-Pacific. Meanwhile, Malaysia's southern state of Johor has become Southeast Asia's fastest-growing data center hub, with approximately 47 data centers operational or in development. The Johor-Singapore Special Economic Zone offers significantly lower land and energy costs alongside proximity to Singapore's subsea cable ecosystem. A dual-location strategy is emerging: low-latency AI inference in Singapore, high-compute training workloads across the border in Johor. But the fundamental question remains: how does a region heavily reliant on fossil fuels for its grid compete in the AI infrastructure race? The IEA projects that Southeast Asia's data center electricity use will more than double by 2030. Malaysia alone could see a sevenfold increase in data center power consumption, equivalent to adding Singapore's entire 2023 electricity usage.
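The cooling penalty behind those power usage effectiveness (PUE) ratios is simple to quantify: PUE is total facility power divided by IT power, so anything above 1.0 is overhead. A minimal sketch, with the PUE values assumed for illustration rather than taken from any operator's disclosures:

```python
# Power usage effectiveness: total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to compute; cooling, power
# conversion, and lighting push it higher. Values below are assumptions.

def total_power_mw(it_load_mw, pue):
    """Grid draw required to deliver a given IT load at a given PUE."""
    return it_load_mw * pue

it_load = 100.0      # hypothetical 100 MW of IT load
temperate_pue = 1.2  # assumed achievable in a cool, dry climate
tropical_pue = 1.5   # assumed for hot, humid conditions like Singapore's

overhead = total_power_mw(it_load, tropical_pue) - total_power_mw(it_load, temperate_pue)
print(f"extra grid draw for the same compute in the tropics: {overhead:.0f} MW")
```

Under these assumed figures, the same 100 MW of compute costs an extra 30 MW of grid capacity in tropical conditions, which is a meaningful fraction of the 200 MW Singapore just allocated under DC-CFA2.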
The irony no one wants to talk about
AI is being pitched as one of the most powerful tools for fighting climate change. It can improve renewable energy forecasting, optimize electricity networks, and automate grid operations. The IEA estimates that by 2035, AI could enable the energy sector to reduce CO₂ emissions by up to three times the total emissions produced by today's data centers. But right now, the AI industry is consuming more power than some countries. A single ChatGPT query uses roughly 10 times more electricity than a traditional Google search. A large data center consumes as much electricity annually as 100,000 households. And the growth trajectory is steep: data center power demand is expanding four times faster than electricity demand from all other sectors. The EU's 2020 digital strategy called for data centers to become climate neutral by 2030. That target now looks almost quaint. In Ireland, data centers already account for over 20% of national electricity consumption. Across Europe, the challenge isn't just building enough renewable capacity, it's that the rate of data center growth outpaces the rate at which clean energy can be brought online.
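The per-query comparison above compounds quickly at scale. The per-query energy figures below are rough, commonly cited estimates (not measurements), and the daily query volume is a hypothetical chosen for illustration:

```python
# Back-of-envelope: aggregate impact of the ~10x per-query energy gap.
# Per-query figures are rough, commonly cited estimates, not measurements.

search_wh = 0.3        # assumed energy per traditional web search, Wh
chat_wh = 3.0          # assumed energy per LLM query (~10x a search)
queries_per_day = 1e9  # hypothetical daily query volume

def daily_kwh(wh_per_query):
    """Daily energy at the assumed query volume, in kWh."""
    return wh_per_query * queries_per_day / 1000

extra_gwh_per_year = (daily_kwh(chat_wh) - daily_kwh(search_wh)) * 365 / 1e6
print(f"extra energy per year at this volume: {extra_gwh_per_year:.0f} GWh")
```

At a billion queries a day, the gap works out to roughly a terawatt-hour a year under these assumptions; small against national consumption, but the paragraph's point is the growth trajectory, not any single query.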
The solutions exist, just not on the right timeline
Nuclear energy, particularly small modular reactors, is gaining real momentum as a long-term answer. NuScale has the only NRC-approved SMR design. Google signed a deal with Kairos Power for 500 MW of nuclear capacity. Microsoft agreed to restart Three Mile Island. Deep Fission announced 12.5 GW in letters of intent from data centers and other large-load customers. But no SMR is currently operational in the United States. The earliest SMR-powered data center operations are projected for late 2027 to early 2028 on aggressive timelines, with more conservative estimates placing first commercial power at 2030. Nuclear projects in the U.S. average nine years from groundbreaking to commercial operation, often with massive cost overruns. The last nuclear plant completed in the U.S. came in at $30 billion, double its budget, and seven years late. Renewables at scale face similar timing issues. Solar costs are declining rapidly, and hybrid configurations combining firm generation with storage and renewable energy are viable. But they can't be deployed fast enough to close a 49 GW gap in three years. The solutions are real. The mismatch is temporal. AI companies need power now, and the clean alternatives won't arrive at scale until the early 2030s at the soonest. That gap is where natural gas turbines, fuel cells, and converted Bitcoin mines fill the void, and where emissions targets quietly slip.
The real race
For every AI product announcement, it's worth asking one question: where does the power come from? Most companies can't answer this convincingly. They'll point to renewable energy credits, future nuclear partnerships, or vague sustainability commitments. But the electricity flowing into their data centers right now is overwhelmingly coming from the same grid that powers everything else, or increasingly, from private gas plants built specifically to avoid waiting for that grid. The conventional wisdom is that the company with the best model wins the AI race. But models are converging. Architectures are being commoditized. Training techniques are being published and replicated. The company that solves AI power wins everything. Not the company that builds the best model, but the one that secures reliable, scalable, affordable electricity at the pace the buildout demands. Power is the new moat.