Humans are doomed
We used to learn things. We'd sit with a guitar until our fingers hurt, stare at broken code until the logic clicked, sketch terrible drawings until our hands caught up with our eyes. The process was slow, often painful, and deeply human. Now we type a prompt and get a song, an app, a logo. The output looks polished. But something is missing, and a growing body of research suggests that what we're losing might be more important than what we're gaining.
We stopped struggling, and that's the problem
Learning has always been rooted in struggle. You can't understand harmony without fumbling through chord progressions. You can't truly grasp programming without debugging your own terrible code. The difficulty is the mechanism through which knowledge takes root in the brain. But in 2025 and 2026, the default behavior for a growing number of people looks like this: instead of learning how to make music, open Suno and generate a track in seconds. Instead of learning how to code, use a vibe coding tool like Cursor or Copilot and let the AI write it. Instead of learning how to design, let an image generator handle it. The output arrives instantly. The skill never does.
Your brain on ChatGPT
This isn't just a philosophical concern. Researchers at the MIT Media Lab ran a four-month study in which 54 participants wrote SAT-style essays under three conditions: unaided, using Google Search, or using ChatGPT, while EEG monitored brain activity across 32 regions. The results were stark. ChatGPT users showed the lowest brain engagement of the three groups and "consistently underperformed at neural, linguistic, and behavioral levels." Over successive sessions, the ChatGPT group grew progressively lazier, with many resorting to copy-and-paste by the final essays. The researchers described this as an accumulation of cognitive debt: a gradual erosion of the mental effort required to transform information into actual knowledge. The study is still a preprint with a small sample, but its findings align with a broader pattern that other institutions have observed.
Microsoft and Carnegie Mellon confirm the trend
A separate study from Microsoft Research and Carnegie Mellon University surveyed 319 knowledge workers about their use of generative AI in everyday tasks. The findings were consistent: the more people relied on AI, the less critical thinking they applied to their work. The researchers warned that "used improperly, technologies can and do result in the deterioration of cognitive faculties that ought to be preserved." They described a key irony of automation: by handling routine tasks for us, AI deprives us of the regular practice that keeps our judgment sharp, leaving us "atrophied and unprepared" when complex situations arise. Higher confidence in AI correlated with less critical thinking. Higher self-confidence in one's own abilities correlated with more. In other words, the people who trust the tool the most are the ones losing the most.
Skill decay is real, and it's invisible
A theoretical analysis published in Computers in Human Behavior examined how AI assistants might accelerate skill decay among experts and hinder skill acquisition among learners. The most unsettling finding was that AI may prevent people from even recognizing that their skills are deteriorating. When the output looks good, it's easy to assume you're still competent. This echoes what researchers at Harvard have called a defining question of our time. As the Harvard Gazette framed it, the question isn't whether AI can make us dumber or smarter. It's whether we engage with it "as a crutch or a tool for growth." A randomized experiment studying how developers learned a new programming library found that those who relied heavily on AI assistance showed impaired conceptual understanding, weaker code reading abilities, and worse debugging skills, all without delivering significant efficiency gains on average. Participants who fully delegated coding tasks saw some productivity improvements, but at the direct cost of learning.
The creative crisis no one wants to talk about
The cognitive cost extends beyond technical skills. In creative fields, AI is quietly hollowing out the incentive to develop craft. Music teachers using Suno AI in classrooms have reported that while the tool raises student motivation and lowers entry barriers, it also risks reducing practice of core skills like melody, harmony, and aural training. When a student can generate a full song from a text prompt, the motivation to learn music theory drops sharply. In software development, the coding bootcamp industry is collapsing: entry-level hiring for developers has reportedly fallen about 50% from pre-pandemic levels. Anthropic CEO Dario Amodei has suggested that AI could wipe out half of all entry-level white-collar jobs within one to five years. The pipeline that used to turn curious beginners into skilled professionals is being dismantled before our eyes. And in design, illustration, and writing, the same pattern holds: anything in your portfolio can now be dismissed as AI-generated. The perceived value of human creative work has dropped, not because the work is worse, but because no one can tell the difference anymore.
If thinking is what makes us human, what happens when we stop?
This is the question that sits beneath all the productivity metrics and efficiency gains. If the thing that distinguishes humans from other species is our capacity for thought, reasoning, and creative expression, what happens when we systematically outsource those capacities? The MIT study's lead researcher, Dr. Nataliya Kosmyna, found that AI use doesn't just reduce effort. It reduces germane cognitive load, the specific type of mental work that converts information into understanding. That's not a shortcut. That's a loss. The philosopher in us might frame it like this: we are not just the products of our thinking. We are our thinking. When we delegate that to machines, we don't become more efficient humans. We become less human, full stop.
What we can actually do about it
The research doesn't suggest we should abandon AI entirely. The MIT study itself noted that integrating AI after the brain has deeply engaged with material may actually support cognitive performance. The key distinction is timing and intention. Some practical principles:
- Struggle first, automate later. Do the hard thinking before reaching for the AI tool. Write the first draft yourself. Debug your own code. Sketch your own ideas. Use AI to refine, not to replace.
- Protect your learning hours. Set aside time where you deliberately work without AI assistance. Treat it like exercise for your brain.
- Be honest about what you actually know. If you can't explain how something works without the AI, you don't understand it yet. That gap matters.
- Value process over output. The song you struggled to write teaches you more than the one Suno generated in four seconds, even if the AI version sounds better.
- Stay skeptical of your own competence. The research shows that skill decay from AI use is often invisible to the person experiencing it. Check yourself regularly.
The real danger isn't that AI replaces us
The real danger is that we let it replace the parts of us that matter most. Not our labor, but our learning. Not our productivity, but our growth. Not our output, but our understanding. We are not doomed because AI is too powerful. We are doomed if we forget why struggling to learn something was ever worth doing in the first place.
References
- Kosmyna, N., et al. "Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task." MIT Media Lab, arXiv preprint arXiv:2506.08872, 2025. https://www.media.mit.edu/publications/your-brain-on-chatgpt/
- Lee, H.P., et al. "The Impact of Generative AI on Critical Thinking: Self-Reported Reductions in Cognitive Effort and Confidence Effects From a Survey of Knowledge Workers." Microsoft Research and Carnegie Mellon University, 2025. https://www.microsoft.com/en-us/research/publication/the-impact-of-generative-ai-on-critical-thinking-self-reported-reductions-in-cognitive-effort-and-confidence-effects-from-a-survey-of-knowledge-workers/
- "Does Using Artificial Intelligence Assistance Accelerate Skill Decay and Hinder Skill Development Without Performers' Awareness?" Computers in Human Behavior, 2024. https://pmc.ncbi.nlm.nih.gov/articles/PMC11239631/
- Mineo, L. "Is AI Dulling Our Minds?" Harvard Gazette, November 2025. https://news.harvard.edu/gazette/story/2025/11/is-ai-dulling-our-minds/
- "How AI Impacts Skill Formation." arXiv:2601.20245, 2025. https://arxiv.org/abs/2601.20245
- "From Bootcamp to Bust: How AI Is Upending the Software Development Industry." Reuters, August 2025. https://www.reuters.com/lifestyle/bootcamp-bust-how-ai-is-upending-software-development-industry-2025-08-09/
- "Suno AI: Opportunities and Challenges of AI-Generated Music and Lyrics in Secondary Music Education." TechRxiv, 2025. https://www.techrxiv.org/users/964510/articles/1333189-suno-ai-opportunities-and-challenges-of-ai-generated-music-and-lyrics-in-secondary-music-education
- "ChatGPT May Be Eroding Critical Thinking Skills, According to a New MIT Study." TIME, 2025. https://time.com/7295195/ai-chatgpt-google-learning-school/
- "Generative AI: The Risk of Cognitive Atrophy." Polytechnique Insights, 2025. https://www.polytechnique-insights.com/en/columns/neuroscience/generative-ai-the-risk-of-cognitive-atrophy/