AI stole the internet from 2 billion people
Somewhere in a data center in Iowa, a rack of GPUs is chewing through high-bandwidth memory chips to train the next large language model. Somewhere in Lagos, a would-be first-time internet user is priced out of a smartphone because the memory inside it just got more expensive. These two facts are connected, and the connection should make us uncomfortable. The GSMA, the global trade body representing over 1,000 mobile operators, just sounded an alarm: the AI boom's insatiable appetite for memory chips is directly hindering efforts to bring 2.2 billion unconnected people online. The hardware that could bridge the digital divide is being consumed by data centers instead. This isn't a temporary supply hiccup. It's a structural reallocation of resources, and it reveals an uncomfortable truth about who benefits from AI's rise and who pays the price.
The numbers behind the squeeze
An estimated 2.2 billion people, roughly a quarter of the global population, had no internet connection at all in 2025, according to the United Nations. But here's the critical detail: only 4% of them live in true connectivity blackspots where mobile coverage doesn't reach. The vast majority live within range of existing mobile networks but simply can't afford the devices to connect. Now the affordability problem is getting worse, not better.

Data centers consumed around 50% of global DRAM production in 2025, up from 32% just five years earlier. By 2026, they're projected to devour 70% of the world's memory chip supply. The three companies that dominate global memory production, Samsung, SK Hynix, and Micron, are racing to meet AI demand by shifting fabrication capacity toward high-bandwidth memory (HBM), the specialized chips that power AI accelerators and data center GPUs. The problem is that producing HBM is extraordinarily capacity-intensive. Manufacturing one bit of HBM effectively displaces several bits of conventional DRAM, the same commodity memory that goes into smartphones, laptops, routers, and the mobile infrastructure equipment that connects people to the internet.

Samsung hiked memory chip prices by up to 60% in late 2025. SK Hynix committed $15 billion to expanding HBM production. Synopsys CEO Sassine Ghazi told CNBC the chip crunch will last through at least 2027. Micron's CEO echoed the forecast, saying markets will remain tight well past 2026. Bloomberg called it a "historic shortage." Yahoo Finance coined a punchier term: RAMageddon.
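The displacement effect is easier to feel with a back-of-envelope calculation. This sketch uses purely illustrative numbers, not sourced figures: fab capacity is normalized to 100 wafer-units, and one HBM wafer is assumed to yield roughly a third of the bits of a conventional DRAM wafer (a commonly cited ballpark for the displacement penalty, not a figure from the reporting above).

```python
# Back-of-envelope sketch of the HBM displacement effect.
# All numbers are illustrative assumptions, not sourced figures.

TOTAL_WAFERS = 100   # normalized fab capacity
DISPLACEMENT = 3.0   # assumed bits-per-wafer penalty for HBM (hypothetical)

def bit_output(hbm_wafer_share: float) -> tuple[float, float]:
    """Return (conventional DRAM bits, HBM bits) for a given HBM wafer share."""
    hbm_wafers = TOTAL_WAFERS * hbm_wafer_share
    dram_wafers = TOTAL_WAFERS - hbm_wafers
    # Conventional wafers yield 1 bit-unit each; HBM wafers yield 1/DISPLACEMENT.
    return dram_wafers * 1.0, hbm_wafers / DISPLACEMENT

for share in (0.10, 0.25, 0.40):
    dram_bits, hbm_bits = bit_output(share)
    print(f"HBM wafer share {share:.0%}: "
          f"conventional DRAM {dram_bits:.0f} bit-units, HBM {hbm_bits:.1f}")
```

The asymmetry is the point: shifting a quarter of wafers to HBM cuts conventional DRAM output by a quarter, but yields far less than a quarter of the capacity back in HBM bits.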
How AI is eating the supply chain
To understand how we got here, you need to understand the memory chip value chain. The global memory market is an oligopoly. Samsung, SK Hynix, and Micron collectively control the vast majority of DRAM production worldwide. When demand shifts in one direction, the entire market feels it.

The AI infrastructure buildout has shifted demand dramatically. Major technology companies are collectively spending roughly $650 billion on AI-related infrastructure in 2026. Hyperscalers like Microsoft, Meta, and Google have locked in multi-year capacity reservations for future DRAM wafer output, effectively claiming chips before they're even manufactured.

This creates a cascading squeeze. Memory fabrication plants have finite capacity. Every wafer allocated to HBM for Nvidia's latest GPU is a wafer that doesn't produce the conventional DRAM that goes into a $40 Android phone, a cellular base station, or a broadband router. GSMA Director General Vivek Badrinath put it bluntly: "It is a very tight situation" and "many manufacturers have reduced their efforts on low-end devices." The risk, he said, is that "there are fewer available low-end devices, which in Africa in particular is going to hurt. It is a serious issue."

The NCTA, the U.S. broadband industry's trade group, confirmed the downstream effects in an April 2026 report, warning that the memory shortage is already affecting the cost and availability of equipment needed to build and maintain high-speed networks in America. IDC's analysis was even more stark: "What began as an AI infrastructure boom has now rippled outward, with tightening memory supply, inflating prices, and reshaping product and pricing strategies across both consumer and enterprise devices."
The irony of "AI for everyone"
This is where the story gets uncomfortable. Silicon Valley's pitch for AI has always centered on democratization. AI will make information accessible to everyone. AI will be the great equalizer. AI will bring the benefits of technology to billions of people who currently lack access.

But the supply chain tells a different story. The physical resources required to build AI, specifically the memory chips that make it run, are being pulled from the very infrastructure that connects people to the digital world in the first place. It's a textbook second-order effect. The technology that promises to democratize knowledge is, through its hardware demands, gatekeeping access to the internet itself. You can't benefit from ChatGPT if you can't get online.

The value chain makes the trade-off visible:

- Path A: Chip fabrication → HBM → AI accelerators → hyperscalers → AI products for the already-connected
- Path B: Chip fabrication → conventional DRAM → smartphones and telecom equipment → mobile operators → the 2.2 billion unconnected

Right now, Path A is winning, and it's not close. With HBM expected to account for roughly 25% of total DRAM wafer production in 2026 and demand growing around 70% year-on-year, the squeeze on Path B will only intensify.

This mirrors a pattern we've seen before. The AI energy debate followed the same logic: data centers consuming enormous and growing shares of electricity, straining power grids and raising costs for everyone else. That was energy. This is chips. Different resource, same structural dynamic.
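The compounding dynamic can be sketched with a toy projection. The 25% starting HBM wafer share and ~70% annual HBM demand growth are figures reported above; the 10% annual fab capacity growth is a purely hypothetical assumption chosen for illustration, not a forecast.

```python
# Toy projection of how the Path A / Path B wafer split could evolve.
# Start share and HBM demand growth come from the article's figures;
# the capacity growth rate is a hypothetical illustrative assumption.

def hbm_wafer_share(years_out: int,
                    start_share: float = 0.25,
                    hbm_demand_growth: float = 0.70,
                    capacity_growth: float = 0.10) -> float:
    """HBM's share of DRAM wafer output after `years_out` years, assuming
    HBM bit demand and total fab capacity each grow at fixed annual rates."""
    share = start_share
    for _ in range(years_out):
        # When HBM demand outgrows capacity, HBM claims a larger wafer
        # share, and conventional DRAM's share shrinks correspondingly.
        share = min(1.0, share * (1 + hbm_demand_growth) / (1 + capacity_growth))
    return share

for years in range(4):
    share = hbm_wafer_share(years)
    print(f"{2026 + years}: HBM {share:.0%}, conventional DRAM {1 - share:.0%}")
```

Under these toy assumptions, conventional DRAM's wafer share erodes from three-quarters to a small minority within a few years, which is what "the squeeze will only intensify" means in concrete terms.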
A systems design failure, not an anti-AI argument
It's important to be precise about what this is and what it isn't. This isn't an argument that AI is bad, or that we should stop building data centers. AI will likely create enormous value, and some of that value will eventually reach currently unconnected populations. The problem isn't AI itself. The problem is that we're building AI infrastructure without accounting for the second-order effects on global connectivity. It's a resource allocation failure.

The market is functioning exactly as designed: memory manufacturers are following the money, and the money is in AI. HBM commands dramatically higher margins than commodity DRAM. No rational manufacturer would choose to produce cheap phone memory when they can sell HBM at a premium. But markets don't optimize for equity. They optimize for profit. And when the resource in question is a critical input for both AI infrastructure and basic internet connectivity, the market's answer, give it all to AI, has real human consequences.

The bottleneck is specifically memory chips, not all semiconductors. Logic chips, processors, and other components have their own supply dynamics. But memory is the pinch point because the same fabrication lines produce both the high-end HBM that AI needs and the commodity DRAM that affordable devices and telecom equipment require. You can't easily separate the two supply chains.
What's being done, and what's still missing
There are efforts underway to address the connectivity gap, but they're working against the tide. The GSMA announced partnerships in March 2026 to pilot $40 smartphones in six African countries: Congo, Ethiopia, Nigeria, Rwanda, Tanzania, and Uganda. The goal is to prove that ultra-cheap 4G devices can bring tens of millions of people online. But even these efforts face headwinds from rising component costs.

Low-earth-orbit satellite networks represent another potential backstop. SpaceX's Starlink, Amazon's upcoming Leo service (expected to launch mid-2026), and other operators are developing direct-to-device satellite connectivity that could bypass ground infrastructure entirely. The World Economic Forum estimates LEO satellites could bring high-speed internet to nearly three billion people without reliable connectivity. But satellites are a longer-term play. They don't solve the device affordability problem, which is the primary barrier for most of the unconnected. You still need a phone to connect to a satellite.

The GSMA is pushing for policy interventions: tax relief on low-end devices, financing programs, and device recycling initiatives. Some analysts have suggested that chip manufacturers could be incentivized to maintain minimum production levels of commodity DRAM, similar to how pharmaceutical companies are sometimes required to produce essential medicines even when more profitable options exist.

But none of these measures address the core tension. As long as AI demand for memory grows faster than total fabrication capacity expands, and capacity takes a minimum of two years to bring online, the squeeze on everything else will continue.
The uncomfortable question
We're making a choice, even if it doesn't feel like one. Every memory chip that goes into an AI data center is a chip that doesn't go into a phone, a router, or a base station. Every dollar of fabrication capacity reserved for HBM is capacity unavailable for the commodity memory that affordable devices depend on.

The AI industry talks about building technology for everyone. But "everyone" currently excludes the 2.2 billion people who can't even get online, and the hardware choices being made today are actively making their situation worse. If AI is supposed to be the great equalizer, we need to reckon with the fact that its supply chain is doing the opposite. The question isn't whether to build AI. It's whether we can build it without stealing the internet from the people who need it most.
References
- GSMA, via AFP, "AI-driven chip shortage slowing efforts to get world online" (April 15, 2026) https://techxplore.com/news/2026-04-ai-driven-chip-shortage-efforts.html
- Bloomberg, "AI Chip Manufacturing Demand Creates Historic Shortage" (March 8, 2026) https://www.bloomberg.com/graphics/2026-ai-boom-memory-chip-shortage/
- Tom's Hardware, "Data centers will consume 70 percent of memory chips made in 2026" (January 18, 2026) https://www.tomshardware.com/pc-components/ram/data-centers-will-consume-70-percent-of-memory-chips-made-in-2026-supply-shortfall-will-cause-the-chip-shortage-to-spread-to-other-segments
- Bloomberg Intelligence, via Yahoo Finance, "AI Memory Chip Crunch Emerges as Tech Spending Targets $650 Billion in 2026" https://finance.yahoo.com/news/ai-memory-chip-crunch-emerges-123826248.html
- CNBC, "Memory chip shortage to last through 2027: Synopsys CEO" (January 26, 2026) https://www.cnbc.com/2026/01/26/memory-chip-shortage-synopsys-lenovo-ai-data-centers.html
- Reuters, "Samsung hikes memory chip prices by up to 60% as shortage worsens" (November 14, 2025) https://www.reuters.com/world/china/samsung-hikes-memory-chip-prices-by-up-60-shortage-worsens-sources-say-2025-11-14/
- AnySilicon, "SK hynix Commits $15 Billion to Expand Advanced Memory Production" (February 25, 2026) https://anysilicon.com/news/sk-hynix-commits-15-billion-to-expand-advanced-memory-production/
- NCTA, "A Growing Memory Chip Shortage Is Pressuring America's Digital Infrastructure" (April 13, 2026) https://www.ncta.com/news/growing-memory-chip-shortage-pressuring-americas-digital-infrastructure
- IDC, "Global Memory Shortage Crisis: Market Analysis and the Potential Impact on the Smartphone and PC Markets in 2026" https://www.idc.com/resource-center/blog/global-memory-shortage-crisis-market-analysis-and-the-potential-impact-on-the-smartphone-and-pc-markets-in-2026/
- Investing.com, "Why 2026 Marks a Structural Shift in Tech Economics" https://www.investing.com/analysis/the-end-of-cheap-memory-why-2026-marks-a-structural-shift-in-tech-economics-200675634
- Rest of World, "Africa to pilot $40 smartphones to close 4G digital divide" (March 16, 2026) https://restofworld.org/2026/gsma-cheap-smartphone-africa/
- Yahoo Finance, "AI's memory chip shortage is quietly taxing the entire..." (March 19, 2026) https://finance.yahoo.com/sectors/technology/articles/ai-memory-chip-shortage-quietly-133000039.html