4,000 data centers and nowhere to plug in
There are over 4,000 data centers humming across the United States right now. Together, they consumed an estimated 176 terawatt-hours of electricity in 2023, roughly 4.4% of the nation's total. By 2028, that number could double. The International Energy Agency projects that data centers will account for almost half of all U.S. electricity demand growth through 2030. And communities across the country are starting to say: not here. The AI boom's dirtiest secret isn't the models, the training runs, or the benchmark scores. It's the electricity bill. And it's rewriting the politics of where and how technology gets built.
The numbers are staggering
In April 2026, the U.S. Energy Information Administration projected that national power consumption would hit 4,244 billion kilowatt-hours this year, rising to 4,381 billion kWh in 2027, both record highs. The primary driver? AI and data center expansion.

Global data center electricity consumption reached approximately 415 TWh in 2024, about 1.5% of the world's total electricity use, growing at 12% per year. The IEA's Electricity 2026 report found that U.S. electricity demand is now growing at roughly 2% annually, more than double the pace of the previous decade, with data centers responsible for nearly half of that increase.

A single hyperscale data center can draw more power than a small city. An AI training cluster consumes seven to eight times more energy than a typical computing workload. And these facilities aren't slowing down. OpenAI's Stargate project, backed by $850 billion in planned buildouts, represents an energy appetite equivalent to 17 nuclear power plants.
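For a sense of the growth rates these figures imply, here is a quick back-of-envelope sketch. All inputs are the estimates quoted above; the 2023-to-2028 doubling is the article's scenario, not a forecast:

```python
# Back-of-envelope check of the growth figures above.
# All inputs are the article's estimates, not measured data.

us_2023_twh = 176.0      # estimated U.S. data center consumption, 2023
years_to_double = 5      # 2023 -> 2028 "could double"

# Compound annual growth rate implied by a doubling in 5 years
implied_cagr = 2 ** (1 / years_to_double) - 1
print(f"Implied CAGR for a 2023-2028 doubling: {implied_cagr:.1%}")  # ~14.9%

# Global trajectory at the reported 12%/yr growth from a 415 TWh base (2024)
global_twh = 415.0
for year in range(2025, 2031):
    global_twh *= 1.12
print(f"Global consumption by 2030 at 12%/yr: {global_twh:.0f} TWh")  # ~819 TWh
```

Notably, the implied U.S. growth rate (about 15% per year) is faster than the reported global rate, which is consistent with the IEA's finding that the U.S. is where demand growth is concentrated.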
The NIMBY movement meets Big Tech
Between March and June of 2025, advocates and residents delayed or blocked $98 billion worth of proposed data center projects across the United States. A Morning Consult survey found that 41% of American voters now favor banning AI data centers near where they live, up from 37% just a month prior.

The resistance is organized and effective. In Imperial County, California, residents filed paperwork in early 2026 for a ballot initiative called the "Imperial County Data Center Prohibition Act," aiming to block data centers in the county altogether. In Maine, state legislators advanced a bill to temporarily ban new data center construction until November 2027. Senator Bernie Sanders and Representative Alexandria Ocasio-Cortez introduced the Artificial Intelligence Data Center Moratorium Act in Congress.

This isn't abstract activism. Local communities are waging what Harvard researchers described as a David-and-Goliath fight against some of the wealthiest companies in the world, and they're winning. New data center capacity under construction in primary U.S. markets actually declined in the second half of 2025, the first drop since 2020, even as demand for compute surged. The vacancy rate hit a record low of 1.4%. Wall Street has noticed. The New York Times reported that tech companies may not be able to build at the pace they promised investors.

The concerns are concrete. Data centers strain local power grids and drive up electricity prices for surrounding households. They consume vast quantities of water for cooling. They take up land. And the irony isn't lost on the people fighting back: some activists are now using ChatGPT itself to draft their opposition letters, organize campaigns, and analyze zoning documents. AI, powered by data centers, being used to fight the construction of more data centers.
The tradeoff is real
The economic case for data centers isn't zero. The industry created roughly 195,000 jobs between 2016 and 2023, a 60% increase. Construction phases bring surges in local employment and commerce. Specialized trade workers in data center roles (electricians, HVAC technicians, pipefitters) often earn 25% to 30% more than in other industries, with six-figure salaries increasingly common.

But the operational reality is different from the construction boom. Once built, data centers employ remarkably few people relative to their footprint. A University of Michigan study found that data centers frequently negotiate bulk power purchasing agreements at lower rates, effectively shifting energy costs onto residential customers. Some states are now reconsidering the generous tax breaks they offered, realizing the long-term fiscal math doesn't add up. In Virginia's Loudoun County, the data center capital of the world, the state preempts local sales tax authority, meaning localities lose both their tax leverage and their say.

The pattern is familiar: temporary construction boom, permanent infrastructure burden. Communities in places like Beaver County, Pennsylvania, saw exactly this play out with the Shell petrochemical complex. Once construction wrapped, economic trends reverted to baseline. There's no reason to expect data centers will be different.
Jevons paradox strikes again
Here's where the story gets structurally interesting. You might think that making AI more efficient would ease the energy crunch. DeepSeek proved you could build a competitive model for a fraction of the cost. Inference is getting cheaper by the month. Surely that means less energy consumption?

Not if you know your economic history. In 1865, the economist William Stanley Jevons observed that improvements in steam engine efficiency didn't reduce coal consumption. They increased it. More efficient engines made coal-powered industry cheaper, which made it more widespread, which consumed more coal overall. This became known as Jevons paradox.

The same dynamic is playing out with AI. As models become more efficient and inference gets cheaper, usage skyrockets. Microsoft CEO Satya Nadella celebrated this openly after DeepSeek's announcement: "Jevons paradox strikes again! As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can't get enough of."

A Nature Cities study confirmed the pattern in urban data center energy systems: algorithmic efficiency gains in metropolitan data centers are enlarging, not shrinking, the energy footprint of AI. The SIGARCH computing research community put it bluntly: efficiency improvements alone do not guarantee a lower overall carbon footprint. Lower costs per computation simply enable new applications, larger models, and wider adoption across industries, and demand grows faster than efficiency gains can offset. This means every breakthrough in model efficiency is, counterintuitively, an argument for more data centers, not fewer. The constraint on AI scaling isn't algorithms or data. It's watts.
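The rebound logic can be made concrete with a toy model. Total energy use is energy-per-task times number of tasks, and demand for tasks responds to falling cost. The elasticity values below are illustrative assumptions, not measured figures for AI demand:

```python
# A minimal sketch of the Jevons (rebound) dynamic described above.
# Elasticity values are illustrative assumptions, not measured figures.

def total_energy(efficiency_gain: float, demand_elasticity: float) -> float:
    """Relative total energy use after a per-task efficiency improvement.

    efficiency_gain: factor by which energy per task falls (0.5 = halved).
    demand_elasticity: how strongly demand responds to cheaper tasks.
    Demand scales as cost**(-elasticity); cost tracks energy per task.
    """
    demand = efficiency_gain ** (-demand_elasticity)
    return efficiency_gain * demand  # = efficiency_gain ** (1 - elasticity)

# Suppose energy per inference halves (efficiency_gain = 0.5):
print(total_energy(0.5, 0.5))  # inelastic demand -> total energy falls (~0.71)
print(total_energy(0.5, 1.0))  # unit elasticity  -> total energy unchanged (1.0)
print(total_energy(0.5, 1.6))  # elastic demand   -> total energy RISES (~1.52)
```

When demand is elastic (elasticity above 1), halving per-task energy more than doubles usage and total consumption goes up, which is precisely the dynamic Nadella was celebrating.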
Where are the reactors?
In a 2024 meeting with the Biden administration, Sam Altman claimed that by 2026 an extensive network of nuclear fusion reactors across the United States would power the AI boom. It's 2026. The reactors don't exist.

To be fair, progress is happening. Helion Energy, where Altman serves as chairman (having invested $375 million of his own money), achieved a milestone in February 2026 by driving its Polaris reactor to 150 million degrees Celsius, roughly 75% of the plasma temperature needed for commercial operation. The company maintains its target of delivering power to Microsoft by 2028. OpenAI is reportedly in talks to purchase 12.5% of Helion's output, about 5 gigawatts by 2030.

But "on track for 2028" is not "powering data centers in 2026." The gap between fusion ambitions and present-day energy reality is enormous. Private fusion investment hit $12 billion in 2025, but commercial fusion energy remains years away, at minimum. Meanwhile, every new hyperscale facility is being powered by the same grid that powers homes, hospitals, and schools. The tech industry's energy narrative has a pattern: promise revolutionary clean energy in the future, build on fossil fuel-dependent grids in the present, and ask communities to bear the cost in between.
The Singapore question
Singapore offers a useful case study in what happens when a nation takes the infrastructure constraint seriously. In 2019, Singapore imposed a moratorium on new data center construction, one of the first countries to do so. The reasoning was straightforward: Singapore is land-scarce and energy-constrained, and data centers were consuming a disproportionate share of both.

The moratorium lasted three years before being partially lifted in 2022 through a pilot program that allocated just 80 MW of new capacity across four carefully selected projects. In late 2025, Singapore launched a second call for applications, offering at least 200 MW of new capacity, but with stringent requirements. Applicants must power at least 50% of their proposed capacity with "eligible green energy pathways." A new national standard launched in August 2025 aims to reduce IT equipment energy consumption in data centers by at least 30%.

Singapore's approach represents a fundamentally different philosophy: instead of building first and dealing with consequences later, regulate first and build selectively. The result is a dual-location strategy where low-latency inference workloads stay in Singapore while high-energy training workloads move to neighboring Johor, Malaysia, where energy and space constraints are less severe. For a country that stakes its future on being a digital hub, this is a remarkable concession: admitting that even a wealthy, technologically advanced nation cannot host unlimited compute within its borders.
Could resistance decentralize AI?
Here's the most provocative possibility. If communities keep blocking data center construction, and the vacancy rate keeps dropping, and power constraints keep binding, the result might be something that ideology alone could never achieve: the decentralization of AI infrastructure.

The current model is absurdly concentrated. Massive hyperscale facilities, each consuming as much power as a small city, clustered in a handful of regions with favorable tax incentives and grid access. This concentration creates single points of failure, political targets, and the exact kind of community backlash we're seeing now.

Edge computing offers an alternative vision: smaller, distributed computing facilities located closer to where data is generated and consumed. This reduces latency, lowers bandwidth costs, and distributes the energy burden across many locations rather than concentrating it in a few. It's not a replacement for hyperscale training, which genuinely requires concentrated compute, but it could handle the growing volume of inference workloads that make up the vast majority of day-to-day AI usage.

The NIMBY movement might accomplish through political friction what the decentralization movement couldn't accomplish through persuasion: making it economically and politically necessary to distribute compute rather than concentrate it.
What comes next
The data center energy crisis isn't a temporary growing pain. It's a structural feature of how we've chosen to build AI. Every efficiency gain increases demand. Every new capability requires more compute. And every watt consumed has to come from somewhere.

Communities aren't wrong to push back. They're not anti-technology; they're anti-subsidy for infrastructure that raises their electricity bills, strains their water supply, and employs a handful of people after construction wraps. The economic bargain being offered (tax breaks and temporary jobs in exchange for permanent grid strain) is genuinely bad for many of these places.

The question isn't whether AI needs energy. It does, enormously. The question is who pays for it, who decides where it gets built, and whether the companies consuming the energy are willing to bear the true cost rather than externalizing it onto communities that never asked to power the future. Four thousand data centers, and nowhere left to plug in. Something has to give.