$700 billion on vibes
The four largest US hyperscalers (Amazon, Meta, Alphabet, and Microsoft) are on track to spend roughly $700 billion on AI infrastructure in 2026, up more than 60% from the already historic levels reached in 2025. That figure is more than the GDP of Sweden, Poland, or Belgium, and roughly triple what the entire US federal government spends on education in a year. And yet, for all that capital being deployed, the revenue case remains surprisingly thin: only about 39% of enterprises have AI deployed in production at scale, and just 8.6% have AI agents fully operational. We are watching $700 billion of supply chase a demand curve that has not yet materialized at anywhere near that scale. So is this the greatest capital-allocation bet in history, or the greatest bonfire?
The numbers in context
To understand how staggering $700 billion is, it helps to zoom out. Gartner forecasts total worldwide AI spending at $2.52 trillion in 2026, a 44% increase year-over-year. The four US hyperscalers alone account for roughly a quarter of that figure, just on infrastructure. Amazon plans to spend around $200 billion in capex this year. Google is close behind at $175 to $185 billion. Meta and Microsoft round out the group with commitments of their own well north of $100 billion each.

This is not a gradual ramp. It is an acceleration, and it is coming at a cost. In 2024, these four companies generated a combined $237 billion in free cash flow. By 2025, that had dropped to $200 billion, even as revenues grew. The AI buildout is consuming cash faster than these businesses can generate it.

For comparison, the entire US telecom infrastructure buildout of the late 1990s ran at about 1.2% of GDP. Current AI capex is estimated at around 0.9% of US GDP, but that figure is climbing quickly, and unlike telecom towers, GPUs depreciate fast.
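The arithmetic behind these comparisons is simple enough to check. A quick sketch, using only figures from the text above:

```python
# Back-of-envelope check of the figures above.
# All inputs come from the article itself; nothing here is new data.

hyperscaler_capex = 700e9      # combined big-four 2026 capex
worldwide_ai_spend = 2.52e12   # Gartner's 2026 forecast
combined_fcf_2025 = 200e9      # big-four free cash flow, 2025

share = hyperscaler_capex / worldwide_ai_spend
coverage = hyperscaler_capex / combined_fcf_2025

print(f"Share of worldwide AI spend: {share:.0%}")            # ~28%, roughly a quarter
print(f"Capex vs. combined free cash flow: {coverage:.1f}x")  # 3.5x
```

The second number is the striking one: planned capex runs at three and a half times the free cash flow these four businesses produced in 2025 combined.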
The Jevons paradox angle
There is a popular counterargument to the "too much spending" thesis, and it goes like this: as AI gets cheaper, demand will explode so dramatically that total spending will actually increase. This is Jevons paradox, the 19th-century observation that more efficient use of coal led to more coal consumption, not less. Microsoft CEO Satya Nadella invoked Jevons paradox directly after DeepSeek demonstrated that high-quality AI inference could be done at a fraction of the cost. "As AI gets more efficient and accessible, we will see its use skyrocket," he wrote, "turning it into a commodity we just can't get enough of."

The evidence so far supports this. Nvidia's Blackwell architecture has enabled up to 10x cost reductions in inference for some workloads. Healthcare company Sully.ai cut inference costs by 90% after switching to open-source models on Blackwell-powered infrastructure. But rather than spending less, companies are simply running more workloads, analyzing more data, and deploying AI to more use cases. Per-unit costs collapse, but aggregate spending rises. This is not just theoretical. It is the pattern that has played out with storage, bandwidth, and compute for decades. Cheaper always means more, not less.

But there is a catch. Jevons paradox assumes elastic demand, meaning that the number of useful things to do with the resource expands as the price drops. For coal in the 1860s, that was clearly true. For AI inference in 2026, the jury is still out. If enterprises remain stuck in pilot purgatory, with most AI projects never reaching production, then cheaper inference just means cheaper pilots, not a demand explosion.
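The elasticity condition can be made concrete with a toy constant-elasticity demand model. The elasticity values below are illustrative assumptions, not estimates of actual AI demand:

```python
# Toy model of the Jevons argument: total spend = price * quantity,
# with quantity ~ price**(-elasticity). Elasticity above 1 means
# demand grows faster than price falls, so total spend rises.
# The elasticity values used here are illustrative, not measured.

def spend_ratio(price_cut: float, elasticity: float) -> float:
    """Ratio of new total spend to old spend after price falls by `price_cut`x."""
    new_price = 1.0 / price_cut            # normalized: old price = 1
    quantity = new_price ** (-elasticity)  # demand response
    return new_price * quantity            # equals price_cut ** (elasticity - 1)

# A Blackwell-style 10x cost reduction under two demand regimes:
print(f"elastic (eps=1.5):   spend x{spend_ratio(10, 1.5):.2f}")  # spend grows ~3.2x
print(f"inelastic (eps=0.5): spend x{spend_ratio(10, 0.5):.2f}")  # spend shrinks to ~0.32x
```

The whole $700 billion question compresses into which regime AI inference is actually in: above an elasticity of 1, cheaper tokens mean a bigger market; below it, they mean a smaller one.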
Who actually benefits
The obvious winner is Nvidia. Every dollar of hyperscaler capex eventually flows through the GPU supply chain, and Nvidia sits at the top of it. But the second-order effects are where things get more interesting.

Energy is the new bottleneck. AI data centers are driving a genuine nuclear renaissance. In January 2026, Meta signed deals with TerraPower and Oklo to develop roughly 4 gigawatts of nuclear capacity, enough to power nearly 3 million homes. TerraPower's agreement alone covers up to 8 Natrium reactors, with initial units expected as early as 2032. Amazon has partnered with Talen Energy for 1,920 megawatts of nuclear power through 2042. Even France's President Macron has positioned the country's nuclear fleet as an AI infrastructure asset. This is not a niche trend. As one analyst put it, "I would be shocked if every Big Tech company doesn't make some play on nuclear in 2026, whether a strategic partnership or acquisitions."

The energy companies, cooling technology firms, and real estate developers who service data centers are all riding the same wave. Chip equipment makers like Applied Materials and ASML are another second-order play. If TSMC and Micron need to add capacity beyond what they planned, the companies that build the fabrication equipment benefit directly.
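The "nearly 3 million homes" figure checks out under standard assumptions. The capacity factor and per-household consumption below are assumed round numbers, not figures from the deals themselves:

```python
# Sanity check: does 4 GW of nuclear capacity really cover ~3 million homes?
# ASSUMPTIONS (not from the article): ~90% capacity factor for nuclear,
# ~10,500 kWh per year for an average US household.

capacity_gw = 4.0
capacity_factor = 0.9
kwh_per_home_year = 10_500
hours_per_year = 8760

annual_twh = capacity_gw * capacity_factor * hours_per_year / 1000
homes_millions = annual_twh * 1e9 / kwh_per_home_year / 1e6

print(f"{annual_twh:.1f} TWh/year -> ~{homes_millions:.1f} million homes")
```

Nuclear's high capacity factor is exactly why hyperscalers want it: a data center, unlike a household, draws near its peak load around the clock.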
The dot-com parallel (and why it is incomplete)
The comparison to the dot-com bubble is unavoidable, but it is also lazy if you stop at "lots of money being spent on tech."

Here is what is genuinely different. The companies doing the spending are enormously profitable. During the dot-com era, the three most valuable tech companies (Cisco, Microsoft, Intel) peaked at around $500 billion each in market cap. Today's hyperscalers dwarf those figures, and they are funding capex primarily from operating cash flow, not debt or speculative IPOs. The tech sector currently trades at 35 to 40 times earnings, stretched but nowhere near the 80 times earnings of the dot-com peak.

Here is what is genuinely similar. Concentration is intense. In 2000, 80% of venture investments went to internet companies. In 2025, 64% went to AI startups. Both booms feature enormous spending on infrastructure before the demand to justify it fully materializes. And in both cases, the prevailing narrative is "this time is different."

The honest answer is that both things are true. AI is a more mature technology being deployed by more financially sound companies than the dot-com startups of 1999. But $700 billion is still $700 billion, and the gap between infrastructure supply and enterprise demand is real.
The demand gap
This is the part that should make you uncomfortable. Gartner's own framing is telling: "AI adoption is fundamentally shaped by the readiness of both human capital and organizational processes, not merely by financial investment." In other words, you can build all the data centers you want, but if organizations are not ready to use them, the capacity sits idle.

The numbers bear this out. While worker access to AI rose by 50% in 2025, the number of companies with 40% or more of their AI projects in production is expected only to double in the next six months, from a small base. Most enterprises are still stuck between pilot and production. Governance, not model quality, has become the main constraint on scaling.

And then there is the concentration risk. What happens if even one of the big four blinks? If Amazon decides that $200 billion in annual capex is unsustainable and cuts back by 20%, the cascading effects hit Nvidia, hit the chip equipment makers, hit the energy deals, hit the construction firms building data centers. The entire ecosystem is predicated on all four companies continuing to accelerate simultaneously.
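For scale, a quick sketch of what that hypothetical 20% cut would remove from the pipeline, using the figures from the text:

```python
# Illustrative scenario from the text: Amazon trims its ~$200B annual
# capex by 20%. Figures are the article's; the 20% cut is hypothetical.

total_capex = 700e9    # combined big-four 2026 capex
amazon_capex = 200e9   # Amazon's planned share
cut = 0.20             # hypothetical pullback

removed = amazon_capex * cut
print(f"Capex removed: ${removed / 1e9:.0f}B "
      f"({removed / total_capex:.1%} of the combined $700B)")
```

Roughly $40 billion, or about 6% of the combined total: small as a percentage, but larger than the annual capex of most S&P 500 companies, and concentrated on the same handful of suppliers.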
What to make of all this
The $700 billion bet is not irrational. The companies making it have real revenue, real cash flow, and a genuine belief that AI will become as foundational as the internet. The Jevons paradox argument has historical precedent. The nuclear energy deals suggest these companies are planning on a decade-plus time horizon, not a quarter-to-quarter sprint.

But it is also not without risk. The demand side of the equation has not caught up. Enterprise adoption is growing but remains patchy. The gap between "AI is transformative" and "AI generates enough revenue to justify $700 billion in annual infrastructure spending" is enormous. And the dot-com parallel, however imperfect, reminds us that transformative technologies can still produce spectacular capital destruction when spending outruns demand.

The honest framing is this: the $700 billion is a bet on the future, not a reflection of the present. Whether it pays off depends less on the technology itself, which is genuinely impressive, and more on whether the rest of the economy can absorb AI at the pace these companies are building for. History says transformative infrastructure bets usually pay off, eventually, for someone. It also says the builders do not always end up being the winners.
References
- CNBC, "Tech AI spending approaches $700 billion in 2026, cash taking big hit" (February 2026) https://www.cnbc.com/2026/02/06/google-microsoft-meta-amazon-ai-cash.html
- Gartner, "Worldwide AI Spending Will Total $2.5 Trillion in 2026" (January 2026) https://www.gartner.com/en/newsroom/press-releases/2026-1-15-gartner-says-worldwide-ai-spending-will-total-2-point-5-trillion-dollars-in-2026
- TechCrunch, "Amazon and Google are winning the AI capex race, but what's the prize?" (February 2026) https://techcrunch.com/2026/02/05/amazon-and-google-are-winning-the-ai-capex-race-but-whats-the-prize/
- Goldman Sachs, "Why AI Companies May Invest More than $500 Billion in 2026" https://www.goldmansachs.com/insights/articles/why-ai-companies-may-invest-more-than-500-billion-in-2026
- NPR Planet Money, "Why the AI world is suddenly obsessed with Jevons paradox" (February 2025) https://www.npr.org/sections/planet-money/2025/02/04/g-s1-46018/ai-deepseek-economics-jevons-paradox
- Nvidia Blog, "Leading Inference Providers Cut AI Costs by up to 10x With Open Source Models on NVIDIA Blackwell" https://blogs.nvidia.com/blog/inference-open-source-models-blackwell-reduce-cost-per-token/
- TerraPower, "TerraPower and Meta Enter Agreement for 8 Natrium Advanced Nuclear Plants" (January 2026) https://www.terrapower.com/terrapower-announces-deal-with-meta
- Fortune, "Next-gen nuclear's tipping point: Meta and hyperscalers start deals with TerraPower, Oklo" (February 2026) https://fortune.com/2026/02/07/next-gen-nuclear-tipping-point-meta-hyperscalers-bill-gates-terrapower-sam-altman-oklo/
- New York Times, "Why the A.I. Boom Is Unlike the Dot-Com Boom" (December 2025) https://www.nytimes.com/2025/12/09/technology/ai-boom-unlike-dot-com-boom.html
- Janus Henderson, "AI versus the Dotcom Bubble: 8 reasons the AI wave is different" https://www.janushenderson.com/corporate/article/ai-versus-the-dotcom-bubble-8-reasons-the-ai-wave-is-different/
- Deloitte, "The State of AI in the Enterprise, 2026" https://www.deloitte.com/us/en/what-we-do/capabilities/applied-artificial-intelligence/content/state-of-ai-in-the-enterprise.html
- MIT Sloan Management Review, "Five Trends in AI and Data Science for 2026" https://sloanreview.mit.edu/article/five-trends-in-ai-and-data-science-for-2026/
- Al Jazeera, "Visualising AI spending: How does it compare with history's mega projects?" (February 2026) https://www.aljazeera.com/news/2026/2/19/visualising-ai-spending-how-does-it-compare-with-historys-mega-projects
- Guinness Global Investors, "Are we in an AI bubble?" https://www.guinnessgi.com/insights/are-we-in-an-ai-bubble