AI is a drug
You open ChatGPT to check one thing. Thirty minutes later, you're still there, four tangents deep, asking it to rewrite something you could have written yourself. You close the tab feeling vaguely worse than when you started, but you'll be back tomorrow. Probably sooner.

This pattern has a name in addiction medicine: compulsive use despite negative consequences. It's one of the defining features of substance addiction. The comparison between AI and drugs is not a metaphor. It's a description. The neural pathways are the same. The behavioral patterns are the same. The denial sounds the same. The only difference is that nobody stages an intervention for your ChatGPT habit. They're more likely to recommend you use it more.
The hit
Every time you send a prompt and get a response, your brain releases dopamine. Not because the answer is good, but because it's fast, novel, and unpredictable. You don't know exactly what you'll get, and that uncertainty is the key. Behavioral psychology has a term for this: variable reward scheduling. It's the same mechanism that makes slot machines addictive, the same principle B.F. Skinner demonstrated decades ago. Unpredictable rewards trigger stronger dopamine responses than predictable ones.

As Curt Steinhorst wrote in Forbes after documenting his own AI dependency, "ChatGPT doesn't just give answers, it delivers a perfectly engineered cocktail of anticipation and novelty. Each response is a surprise, tapping into the psychological principle of intermittent reinforcement."

The dopamine system is not a pleasure system, despite how it's popularly described. It's a wanting system. It drives seeking behavior. It makes you reach for the next prompt before you've finished processing the last response. The loop isn't prompt, answer, satisfaction. It's prompt, answer, prompt. The satisfaction never fully arrives, which is exactly what keeps you going.

Constitutional Discourse, a legal and policy publication, described the mechanism plainly: "When a chatbot responds to our questions in a fraction of a second, it can activate the same dopamine system that is triggered by social media use or even gambling. Quick, positive reinforcement acts as a kind of psychological reward and can lead to habituation and potentially addiction in the long term."
Tolerance
With drugs, tolerance means you need more to get the same effect. The first hit is always the strongest. After that, your brain adjusts, and you chase a high that keeps receding. AI follows the same curve. The first time ChatGPT wrote something for you, it felt like magic. A few months in, the magic faded. You started needing longer prompts, more specific instructions, multi-step workflows. You moved from asking simple questions to building entire systems around AI outputs. The tool didn't get worse. Your baseline shifted. This is tolerance. You're not using AI more because it's more useful. You're using it more because using it less feels inadequate. The tasks you used to handle on your own now feel burdensome without AI assistance. A blank page that once felt like possibility now feels like friction. Forbes documented this pattern in knowledge workers: the more they used AI for small tasks, the harder it became to do important tasks independently. "My thinking felt fuzzier," one writer admitted. "The inner voice I rely on to structure an argument or hold a tension had gone quieter." That's not a complaint about AI quality. That's a description of cognitive tolerance.
Withdrawal
The first large-scale study of chatbot dependency, a collaboration between OpenAI and MIT examining four million interactions over 28 days, found that users who became dependent on ChatGPT experienced genuine withdrawal symptoms when disconnected. Not metaphorical discomfort. Measurable psychological distress. On Reddit, people describe the experience in terms that would be familiar to anyone who has quit a habit: "I've been feeling really tired and the urge to chat or doomscroll endlessly keeps popping up while I'm doing my regular day-to-day tasks." One user described being ten days free from AI and still fighting cravings. CTRLCare Behavioral Health, a treatment center that now treats AI addiction alongside other behavioral dependencies, lists the withdrawal profile: heightened emotions, cravings, difficulty concentrating, and a high likelihood of relapse. These are not the symptoms of missing a convenient tool. These are the symptoms of a disrupted reward system recalibrating.
The cognitive damage
Every drug has a health cost. For AI, the cost is cognitive. A study from Carnegie Mellon University and Microsoft Research found that regular use of generative AI reduces users' capacity for critical thinking and independent problem-solving. Participants didn't just offload tasks. They offloaded judgment. A separate MIT Media Lab study found that using ChatGPT significantly reduced "relevant cognitive load," the intellectual effort required to transform information into knowledge. Your brain isn't just doing less work. It's losing the ability to do that work.

Rose Luckin at University College London calls this "cognitive atrophy," the gradual weakening of skills you stop practicing because AI handles them. Students who lean on AI show increased procrastination, memory loss, and poorer academic performance. Professionals who rely on AI for analysis find their own analytical capacity diminishing.

A paper in Learning and Instruction described two pathways to this damage. Cognitive underload happens when you delegate too much thinking, leading to insufficient mental stimulation. Information overload happens when the volume of AI-generated content exceeds your processing capacity. Both pathways converge on the same outcome: a diminished ability to think independently.

Psychology Today framed it in neurological terms: "AI erodes that practice by removing friction. It delivers emotional immediacy without uncertainty, connection without vulnerability. And in doing so, it weakens the very neural connectivity that makes real intimacy and patience possible." Replace "intimacy" with "deep thinking" and the description still holds. The friction is the exercise. Remove the exercise, and the muscle atrophies.
The denial
Will Lockett, writing about AI addiction in a widely shared essay, drew the parallel explicitly: "Addicts are, more often than not, highly functioning and fit into society perfectly. You could walk past them on the street and not know anything was wrong. But their affliction is eating them from the inside out, destroying their capability, cognition, and well-being. Still, they feel they need these substances to survive and will do Olympic-level mental gymnastics to justify their consumption."

The justifications for AI dependency sound identical to the justifications for any other dependency. "I need it to be productive." "Everyone else is using it." "I only use it when I need to." "It makes me better at my job." Each one is a rationalization for a pattern that is, measurably, making cognitive skills worse.

A study published in Acta Psychologica found that students who heavily rely on AI tools show weaker abilities in evaluating information and solving problems autonomously. The researchers linked this to cognitive inertia, a state where repeated outsourcing of thinking diminishes the motivation and capacity for intellectual effort. You don't stop thinking because you're lazy. You stop thinking because the neural pathways for independent thought are being deprioritized through disuse.

The most insidious part of the denial is that AI dependency is socially reinforced. Nobody calls it a problem. Employers encourage it. Schools are integrating it. Productivity culture celebrates it. Imagine a drug that your boss told you to take, your teacher assigned, and your peers pressured you into using, all while eroding your ability to function without it. That's the current situation with AI.
The dealer's incentive
Drug dealers have a straightforward business model: create dependency, profit from it. AI companies operate on a structurally similar incentive. The Ada Lovelace Institute described AI tools as "sophisticated engines of attachment, designed to maximise engagement through specific psychological mechanisms." The engagement metrics that drive AI company valuations (daily active users, session length, retention rates) are measurements of dependency, reframed as product success. The more you use the product, the more valuable you are to the company. The more dependent you become, the less likely you are to leave. The business model doesn't just tolerate addiction. It optimizes for it.

AI companions take this further. MIT Technology Review called them "the final stage of digital addiction," arguing they make the attention economy look quaint by comparison. Social media competed for your attention alongside other people. AI companions compete as people. They occupy the social slot directly, offering validation, conversation, and emotional support, all calibrated to maximize the time you spend engaging.
The dose that helps versus the dose that harms
Pharmacology draws a clear line between therapeutic use and abuse. The same substance that treats pain at one dose creates addiction at another. The difference is dosage, intention, and awareness. AI is no different. Used deliberately, with clear boundaries, as a tool that augments thinking rather than replaces it, AI can genuinely improve outcomes. The problem is that almost nobody uses it that way. The default mode is escalation: use it for one thing, then two things, then everything. OpenAI CEO Sam Altman acknowledged the problem: "People rely on ChatGPT too much. There are young people who just say, 'I can't make any decision in my life without telling ChatGPT everything that's going on.' That feels really bad to me." When the dealer admits the product is being misused, that's worth paying attention to. The practical distinction is simple. If you're using AI to verify your thinking, you're using a tool. If you're using AI instead of thinking, you're using a drug. If you feel anxious or incapable when AI is unavailable, the dependency is already established.
Recovery looks like friction
The path out of any addiction involves reintroducing what the substance replaced. For AI, that means reintroducing cognitive friction. Write without autocomplete. Research without summarization. Sit with a problem for twenty minutes before reaching for a prompt. These aren't productivity tips. They're rehabilitation exercises. They feel uncomfortable precisely because they're rebuilding capacity that has atrophied.

Allie K. Miller, an AI advisor and former Amazon head of machine learning, put it directly: "Over-reliance on AI is poison to your brain. If you delegate the brainy stuff to AI, maybe you gain a little speed, but you reduce your own comprehension." The MIT study she referenced found that using AI to challenge and extend your understanding can increase quality and comprehension. Using it to skip the thinking does the opposite.

The goal isn't abstinence. It's intentional use. Know when you're reaching for AI because it genuinely helps, and when you're reaching for it because thinking feels hard. The second impulse is the one to resist. That discomfort is your brain trying to work. Let it.

AI is the most socially acceptable drug ever created. It's prescribed by employers, endorsed by educators, and marketed as self-improvement. But the pharmacology doesn't care about the branding. The dopamine loop is the dopamine loop. The tolerance curve is the tolerance curve. And the cognitive damage accumulates whether you call it a tool or not.

The first step in any recovery is the same: admit you have a problem. The second step is harder. You have to be willing to feel the friction again.
References
- Curt Steinhorst, "How ChatGPT Broke My Brain (And Why I Still Use It Every Day)," Forbes, June 2025. Link
- "From the ELIZA Effect to Dopamine Loops, AI and Mental Health," Constitutional Discourse. Link
- "Some ChatGPT users are addicted and will suffer withdrawal symptoms if cut off, say researchers," Tom's Hardware, March 2025. Link
- H.-P. Lee et al., "The impact of generative AI on critical thinking: Self-reported reductions in cognitive effort and confidence effects from a survey of knowledge workers," Carnegie Mellon University and Microsoft Research, 2025. Link
- "Is AI dulling our minds?" Harvard Gazette, November 2025. Link
- "The Dopamine Economy 2.0: How AI Is Rewriting Human Desire," Psychology Today. Link
- Will Lockett, "AI Is A Hard Drug," Medium, December 2025. Link
- Bao et al., "Learners' AI dependence and critical thinking," Acta Psychologica, 2025. Link
- "AI Addiction: Signs, Effects, and Who Is At Risk?" Addiction Center. Link
- "Friends for sale: the rise and risks of AI companions," Ada Lovelace Institute. Link
- "AI companions are the final stage of digital addiction, and lawmakers are taking aim," MIT Technology Review, April 2025. Link
- "Generative AI: the risk of cognitive atrophy," Polytechnique Insights, 2025. Link
- Allie K. Miller, "Over-reliance on AI is poison to your brain," LinkedIn, 2025. Link
- "AI Chatbot Dependence and Psychological Risks," CTRLCare Behavioral Health. Link