Energy is the only moat left
Models are converging. GPT, Claude, and Gemini all score within a few points of one another on most benchmarks now. Data moats are eroding as synthetic data and open datasets proliferate. Distribution matters, but platforms can be replicated. So what's actually hard to copy? Energy. The bottleneck in AI is no longer intelligence. It's infrastructure, and specifically, it's power. The companies that secure cheap, reliable energy at scale will define the next era of AI. Everyone else is building on rented ground.
The convergence problem
The AI model race is hitting diminishing returns. Each new generation of frontier models delivers incremental improvements at exponential cost. The gap between the top players is shrinking. For most practical tasks, the difference between GPT, Claude, and Gemini is negligible. This convergence means the competitive advantage is shifting. When the models themselves become commodities, the question stops being "who has the best model?" and becomes "who can run the most inference, the fastest, at the lowest cost?" That's an energy question.
Power is the new bottleneck
A single hyperscale AI data center can demand 300 to 500 megawatts of electricity, comparable to the consumption of a mid-sized city. Multiply that across dozens of facilities under construction globally, and you start to see the scale of the problem.

The numbers are staggering. Global data center electricity consumption hit around 415 terawatt-hours in 2024, roughly 1.5% of global electricity use. Goldman Sachs projects data center power demand will rise 165% by 2030. In the U.S. alone, data centers already consume more electricity than the entire nation of Pakistan. And it's accelerating. Morgan Stanley notes that hyperscalers could spend over $1 trillion in 2025 and 2026 combined, much of it on energy infrastructure. The International Energy Agency projects global data center electricity demand will surpass 1 trillion kilowatt-hours (1,000 terawatt-hours) per year by 2030.

This isn't a future problem. It's a present one. In Virginia, data centers already consume about 26% of the state's total electricity supply. Power-constrained regions like northern Virginia and Santa Clara are seeing 24 to 36 month wait times for new grid capacity. Companies aren't bottlenecked by chips or talent anymore. They're bottlenecked by watts.
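The figures above hang together under some quick arithmetic. Here's a short sketch; the continuous full-load assumption for a single facility is mine, not from the sources:

```python
# Rough sanity check of the cited figures.
# Assumption (mine, not sourced): a hyperscale campus runs at
# full load around the clock.

HOURS_PER_YEAR = 8760

# A 300-500 MW hyperscale campus running continuously:
for mw in (300, 500):
    twh = mw * HOURS_PER_YEAR / 1e6  # MW * h -> MWh, then /1e6 -> TWh
    print(f"{mw} MW facility: ~{twh:.1f} TWh/year")

# Goldman Sachs projection: 165% growth on the 2024 baseline of ~415 TWh.
baseline_2024_twh = 415
projected_2030_twh = baseline_2024_twh * (1 + 1.65)
print(f"Implied 2030 demand: ~{projected_2030_twh:.0f} TWh")
```

On these numbers, one campus alone draws a few terawatt-hours a year, and the Goldman growth rate implies roughly 1,100 TWh of global data center demand by 2030.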
Hyperscalers are acting like energy companies
The clearest signal that energy is the real moat is what the biggest players are doing about it. They're not waiting for the grid to catch up. They're becoming energy companies.

Nuclear deals are everywhere. Microsoft struck a deal to restart Three Mile Island's reactor. Amazon invested $650 million in a data center campus next to Pennsylvania's Susquehanna nuclear plant and committed over $20 billion to expand it into an AI-ready campus powered by carbon-free nuclear energy. Google signed agreements to purchase power from small modular reactors. Meta is actively seeking nuclear power developers for facilities expected to come online in the early 2030s.

Renewables are being contracted at massive scale. In 2025, Amazon, Google, Meta, and Microsoft signed a combined 16,777 megawatts of corporate renewables contracts, roughly 80% of all corporate renewables deals that year.

Some projects are going off-grid entirely. Less than a quarter of new data center projects that have identified a power source will use on-site or hybrid setups, but together they represent 44% of total planned capacity. The shift has been driven partly by gas turbine shortages and an antiquated grid, opening a path for alternative energy sources.

This isn't philanthropy or ESG posturing. It's strategic positioning. Nuclear operates 24/7, which is exactly what AI workloads demand. You can't run inference on intermittent power.
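The intermittency point becomes concrete once you account for capacity factors: a megawatt of nameplate solar delivers far less energy over a year than a megawatt of nuclear. The capacity factors below are rough typical values I'm assuming, not numbers from the sources above:

```python
# Firm vs. intermittent power: nameplate megawatts are not equal.
# Capacity factors are rough assumed values (approximately in line
# with typical US fleet averages), not sourced from this article:
# nuclear ~0.93, onshore wind ~0.35, utility solar ~0.25.

HOURS_PER_YEAR = 8760
capacity_factors = {"nuclear": 0.93, "wind": 0.35, "solar": 0.25}

nameplate_mw = 1000  # one hypothetical gigawatt of each source
for source, cf in capacity_factors.items():
    twh = nameplate_mw * cf * HOURS_PER_YEAR / 1e6
    print(f"{source:7s}: ~{twh:.1f} TWh/year from {nameplate_mw} MW nameplate")

# The 16,777 MW of 2025 renewables contracts, at an assumed blended
# ~0.30 capacity factor, would deliver roughly:
renewables_twh = 16_777 * 0.30 * HOURS_PER_YEAR / 1e6
print(f"2025 renewables deals: ~{renewables_twh:.0f} TWh/year")
```

Under these assumptions, a gigawatt of nuclear delivers three to four times the annual energy of a gigawatt of solar, and it delivers it continuously, which is why 24/7 AI workloads keep pulling hyperscalers toward firm power.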
The startup energy trap
If you're building an AI startup today and your energy strategy is "use AWS," you're outsourcing your most critical dependency. The hyperscalers control the compute, and increasingly, they control the power that feeds it. Pricing, availability, and priority access will all flow to those who own or control their energy supply. Startups building AI applications are, in a very real sense, building on top of someone else's energy infrastructure. When demand spikes and power gets scarce, guess who gets deprioritized? The startups renting capacity, not the hyperscalers who own the plants. This dynamic creates a new kind of startup risk that most pitch decks don't address. Your model might be brilliant, but if you can't afford to run it at scale, that brilliance doesn't translate into a business.
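To see how directly electricity price feeds into unit economics, consider a back-of-envelope sketch. Every number here is a hypothetical assumption chosen for illustration, not sourced data:

```python
# Illustrative only: all figures are assumptions, not sourced.
# Suppose serving one query costs ~0.3 Wh of electricity and a
# product handles 1 billion queries per day.

wh_per_query = 0.3            # assumed energy per inference request
queries_per_day = 1_000_000_000

mwh_per_day = wh_per_query * queries_per_day / 1e6  # Wh -> MWh
print(f"Daily energy: ~{mwh_per_day:.0f} MWh")

# Annual electricity bill at two hypothetical power prices:
for usd_per_kwh in (0.05, 0.15):
    annual_usd = mwh_per_day * 1000 * usd_per_kwh * 365
    print(f"At ${usd_per_kwh}/kWh: ~${annual_usd / 1e6:.1f}M/year")
```

In this toy scenario, a 3x difference in power price is an annual swing of about $11 million, and that's before the markup a cloud provider charges on top of its own energy costs.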
The energy tech opportunity
TechCrunch recently argued that the best AI investment might be in energy tech, and the logic is sound. Power availability has become the primary bottleneck for new AI data center deployments, creating unprecedented demand for energy infrastructure solutions. A wave of startups is targeting the problem from multiple angles. Companies like Amperesand, DG Matrix, and Heron Power are developing new power conversion technologies, including solid-state transformers. Others like Camus, GridBeyond, and Texture are building software to manage electron flow more efficiently. VCs are repositioning portfolios to capture the energy infrastructure buildout that could define the next decade of AI scaling. The opportunity isn't limited to generation. Grid optimization, battery storage, power conversion, cooling technology, and energy management software are all becoming critical layers of the AI stack. The companies solving these problems aren't just energy companies. They're AI infrastructure companies.
The historical parallel
This pattern isn't new. In the industrial era, whoever controlled infrastructure controlled the economy. Railroads didn't just move goods; they determined which cities thrived and which withered. Oil didn't just fuel machines; it reshaped geopolitics for a century. Energy is playing the same role in the AI era. The companies and countries that secure reliable, affordable power will set the pace of AI development. Everyone else will be constrained by someone else's infrastructure decisions. The geopolitical dimension makes this even more pointed. Energy supply chains are vulnerable to disruption, whether from conflicts affecting shipping lanes or policy shifts around nuclear and fossil fuels. AI's dependency on massive, continuous power means that energy security and AI leadership are becoming the same conversation.
Edge AI as a partial escape
There's one counterargument worth taking seriously: edge AI. Running smaller, optimized models on local devices (phones, laptops, IoT sensors) sidesteps the data center energy problem entirely. If enough AI workloads shift to the edge, the centralized energy bottleneck loosens. But edge AI has limits. The most demanding workloads (training frontier models, running large-scale inference, processing massive datasets) will remain in data centers for the foreseeable future. Edge computing is a complement to centralized AI, not a replacement. It reduces the load at the margins, but the core energy problem persists.
What this means
The AI industry is entering a phase where the most important competitive advantages aren't algorithmic. They're physical. Access to cheap, reliable power. Long-term energy contracts. Relationships with grid operators and nuclear plant developers. These aren't the kinds of moats that get built in a sprint. Energy infrastructure takes years, sometimes decades, to develop. That's precisely what makes it a moat. In a world where a new model can be trained in weeks and a new API can be launched overnight, the slow, capital-intensive work of securing energy is the one thing competitors can't replicate quickly. The best AI companies of the next decade won't just have the best models. They'll have the best power purchase agreements.
References
- TechCrunch, "The best AI investment might be in energy tech" (March 2026) — techcrunch.com
- Goldman Sachs Research, "How AI Is Transforming Data Centers and Ramping Up Power Demand" — goldmansachs.com
- Morgan Stanley, "Powering AI: Markets Race to Invest in AI Energy Solutions" — morganstanley.com
- International Energy Agency, "Energy demand from AI" — iea.org
- Forbes, "Why Microsoft And Amazon Are Turning To Nuclear Power For AI" (February 2026) — forbes.com
- S&P Global, "Hyperscalers continue to dominate corporate renewables contracts in 2025" — spglobal.com
- Pew Research Center, "US data centers' energy use amid the artificial intelligence boom" (October 2025) — pewresearch.org
- McKinsey, "The next big shifts in AI workloads and hyperscaler strategies" — mckinsey.com
- Utility Dive, "AI is outpacing America's power grid. Nuclear must become a national priority." (March 2026) — utilitydive.com
- World Economic Forum, "The AI-energy nexus will dictate AI's future" (December 2025) — weforum.org