Intelligence is a commodity
The models are converging. The prices are collapsing. The real question is what happens after intelligence becomes free.

When OpenAI launched ChatGPT in late 2022, access to a frontier AI model felt like a privilege. Only a handful of companies had the resources to train these systems, and using them cost real money. Fast forward to 2026, and the picture looks radically different. Frontier-class models are available from dozens of providers. Prices have dropped by over 80% in barely a year. Open-source alternatives match or beat commercial offerings on most benchmarks. Intelligence, the kind that used to require enormous capital and rare talent, is becoming as generic as electricity.

The race is toward zero: cheaper, smarter, and faster. Every quarter brings another price cut, another efficiency breakthrough, another open-source release that narrows the gap. The question is no longer who can build the smartest model. It's what you do when everyone has one.
How we got here
Three forces drove intelligence toward commodity status.

Open source closed the gap. Meta's release of LLaMA in 2023 cracked the dam. Since then, a steady stream of open-weight models has proven that you don't need billions of dollars in compute to produce world-class AI. Researchers and smaller companies can now fine-tune powerful models on a single GPU cluster, something unimaginable just a few years ago.

Competition compressed pricing. The major cloud providers (Microsoft, Google, Amazon) and a growing roster of challengers have been locked in a fierce pricing war. When DeepSeek demonstrated in early 2025 that frontier-level performance could be achieved at a fraction of the typical training cost, it sent shockwaves through the industry. Nvidia's stock briefly tumbled. OpenAI's valuation assumptions were questioned. The message was clear: the "just add compute" era was ending, and algorithmic cleverness could substitute for brute-force spending.

Standardization made models interchangeable. As model performance converges, switching between providers becomes trivial. An API call to one model looks much like an API call to another. When the underlying product is functionally identical, the only remaining differentiator is price, which is the textbook definition of a commodity.
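The interchangeability argument can be made concrete. A minimal sketch, assuming two providers that expose an OpenAI-style chat-completions route (the URLs and model names below are illustrative placeholders, not real endpoints): swapping providers changes only the routing details, never the request shape.

```python
# Sketch: why switching providers is trivial once APIs standardize.
# The base URLs and model names are hypothetical stand-ins; many
# providers expose an OpenAI-compatible /chat/completions route,
# so only these two fields differ between vendors.

def chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Build the URL and JSON body for a chat-completions call."""
    return {
        "url": f"{base_url}/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Two "different" providers, one identical request shape.
a = chat_request("https://api.provider-a.example/v1", "model-a", "Hi")
b = chat_request("https://api.provider-b.example/v1", "model-b", "Hi")

# Everything except the routing details is byte-for-byte the same.
assert a["json"]["messages"] == b["json"]["messages"]
```

When the switching cost is two string constants, vendor lock-in at the model layer effectively disappears.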
What commoditization actually means
Calling intelligence a commodity is not an exaggeration. In economics, a commodity is a standardized product where any given unit is interchangeable with another. Corn is corn. Crude oil is crude oil. And increasingly, a frontier LLM is a frontier LLM.

This does not mean AI is unimportant. Electricity is a commodity too, and it powers everything. The point is that the model layer is no longer where lasting competitive advantage lives. Value is migrating up and down the stack.

Up the stack means applications. The companies winning in 2026 are not the ones with the best base model. They are the ones building the best products on top of models: deeply integrated into specific workflows, trained on proprietary data, solving real problems for real users. A generic chatbot is a commodity. An AI system that triages support tickets, cross-references a customer's history, queries an internal knowledge base, and drafts three potential replies for a human to review is a competitive moat.

Down the stack means infrastructure: cloud compute, specialized chips, energy supply, and the tooling needed to train, fine-tune, and deploy models. Nvidia's stock recovered quickly after the DeepSeek scare for a reason: even if individual models become cheap, the aggregate demand for compute keeps growing. Cheaper intelligence means more usage, which means more infrastructure. This is the Jevons paradox in action.
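The support-triage workflow described above can be sketched as a small pipeline. Every helper and data source here is a hypothetical stand-in (a real system would call a model API and internal services); the point is the shape of the moat: classification plus context plus human-reviewable drafts, wired into one process.

```python
# Minimal sketch of the triage workflow: classify a ticket, pull
# customer context, and draft candidate replies for human review.
# All rules and data below are illustrative stand-ins, not a real API.

def triage_ticket(ticket: dict, history: list, kb: dict) -> dict:
    # 1. Triage: a trivial keyword rule stands in for a model's judgment.
    urgent = any(w in ticket["text"].lower() for w in ("outage", "refund"))
    priority = "high" if urgent else "normal"
    # 2. Cross-reference the customer's history and the knowledge base.
    context = history[-3:]  # last few interactions
    article = kb.get(ticket["topic"], "no matching article")
    # 3. Draft three candidate replies for a human to review.
    drafts = [
        f"Draft {i}: re '{ticket['text']}' (see: {article})"
        for i in (1, 2, 3)
    ]
    return {"priority": priority, "context": context, "drafts": drafts}

result = triage_ticket(
    {"text": "Refund request for my last order", "topic": "billing"},
    history=["signed up", "upgraded plan", "asked about invoices"],
    kb={"billing": "KB-101: refund policy"},
)
```

None of the individual steps is hard to copy; the defensibility comes from the proprietary history and knowledge base the pipeline is wired into.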
The DeepSeek moment
No conversation about AI commoditization is complete without DeepSeek. The Chinese research lab demonstrated that frontier-level reasoning models could be trained for a reported $6 million in GPU costs, a figure that, even if understated, was orders of magnitude below what US labs were spending.

The real lesson was not about one model. It was a proof of concept: access to the most capital does not guarantee access to the best models. DeepSeek briefly dethroned ChatGPT as the number-one app in the US App Store. It forced every incumbent to reconsider their pricing and their assumptions.

As Satya Nadella put it, invoking the Jevons paradox, cheaper AI would simply mean more AI. But as one analyst noted, what is good for the industry overall might not be good for any individual incumbent. If users can switch freely between functionally identical models, loyalty evaporates.
Where value lives now
If the model is not the moat, what is?

- Proprietary data. A model trained on a decade of your customer interactions, your supply chain data, or your scientific research will produce insights that no generic model can replicate. Data is the new defensible asset.
- Workflow integration. The real unlock is not a standalone AI tool but AI woven invisibly into existing business processes: automating multi-step tasks across systems, reducing friction, and compounding efficiency gains over time.
- Vertical expertise. General-purpose models are powerful but generic. The companies that deeply understand a specific domain (healthcare, legal, finance, manufacturing) and build tailored experiences around that expertise will capture disproportionate value.
- Trust and brand. In a world where 90% of online content could be synthetically generated by 2026, authenticity becomes a premium. Brands and institutions that establish trust and credibility will stand out in a sea of AI-generated noise.
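Turning proprietary data into a model asset usually starts with reshaping it into training examples. A hedged sketch, assuming the widely used chat-style JSONL convention for fine-tuning data (the `{"messages": [...]}` field layout is a common format, but individual providers' schemas vary, and the log entries here are invented):

```python
import json

# Sketch: converting proprietary support logs into chat-style JSONL
# fine-tuning examples. The record layout follows the common
# {"messages": [...]} convention; check your provider's exact schema.

def to_training_example(question: str, answer: str) -> str:
    """Serialize one Q/A pair as a single JSONL line."""
    record = {
        "messages": [
            {"role": "user", "content": question},
            {"role": "assistant", "content": answer},
        ]
    }
    return json.dumps(record)

# Hypothetical proprietary logs; the value is in having years of these.
logs = [
    ("How do I reset my device?", "Hold the power button for 10 seconds."),
]
jsonl_lines = [to_training_example(q, a) for q, a in logs]
```

The code is trivial by design: the defensible part is not the conversion script but the decade of interactions only you can feed into it.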
The human side
The commoditization of intelligence does not mean the end of human relevance. If anything, it raises the stakes for distinctly human skills. When everyone has access to the same powerful tools, the differentiators are judgment, creativity, ethical reasoning, and the ability to ask the right questions. The most valuable professionals in 2026 are not the ones who can operate AI. They are the ones who can direct it: who understand what to build, whom to serve, and why it matters.

Some organizations are already experimenting with "AI-free" periods to preserve critical-thinking skills. The concern is real: if you outsource every decision to a model, you risk atrophying the very capabilities that make human oversight valuable.

The winning formula is not human or machine. It is human with machine, where AI handles the repetitive, data-intensive work and humans focus on strategy, empathy, and the messy, contextual decisions that algorithms still struggle with.
What comes next
The trajectory is clear: intelligence will continue getting cheaper. Open-source models will continue improving. Competition will continue compressing margins. Within a few years, access to a powerful AI model will feel as unremarkable as access to a search engine.

The interesting questions are no longer about the technology itself. They are about what we build on top of it. Who captures the value when the foundation is free? How do we govern a world where anyone can spin up a powerful AI with minimal resources? What happens to the business models that assumed intelligence would remain scarce and expensive?

Andrew Ng famously said that "AI is the new electricity." He was right, perhaps more literally than he intended. Electricity transformed every industry, created entirely new ones, and became something we barely think about. Intelligence is following the same path. The companies, institutions, and individuals who thrive will be the ones who stop marveling at the technology and start focusing on what to do with it. The race to zero is nearly over. The race to build something meaningful on top of it is just beginning.
References
- Trent Kannegieter, "Taking AI Commoditization Seriously," TechPolicy.Press, March 2025
- Gianluca Mauro, "The Race to Zero: How AI Will Become a Zero-Cost Commodity," Medium, March 2023
- "AI Models Are Becoming a Commodity: Second-Order Effects Reshaping Industries by 2026," Mixflow.AI, November 2025
- Joe McKendrick, "As AI Rapidly Becomes A Commodity, Time To Consider The Next Step," Forbes, February 2024
- "Is AI Already Heading Down the Path to Commoditization?" Fierce Network, September 2025
- "Big Tech Set to Spend $650 Billion in 2026 as AI Investments Soar," Yahoo Finance, February 2026
- "The State of AI: Global Survey 2025," McKinsey, 2025
- "How AI Shook the World in 2025 and What Comes Next," CNN, December 2025
- Siddharth Bhalsod, "The Commoditization of AI Models: Implications for Innovation," LinkedIn
- "Defensive Moats in the Age of LLMs: When AI Commoditizes Your IP," Strategeos