We automated loneliness
Somewhere in the last few years, we built a market for synthetic friendship, and it exploded. AI companion apps grew 700% between 2022 and mid-2025. Character.AI accumulated over 40 million downloads. Replika pulled in $14 million in annual revenue. Grok introduced anime companion avatars that sent its mobile usage up nearly 40% in a single month. These aren't niche experiments. They're among the fastest-growing consumer AI products in the world. And the demand signal they represent isn't a technology story. It's a story about what happens when a society gets so lonely that millions of people start paying for someone, anything, to listen.
The loneliness that came first
The isolation didn't start with AI. It's been building for decades. In 2023, U.S. Surgeon General Vivek Murthy declared loneliness a public health epidemic. His advisory reported that approximately half of American adults experience loneliness, with the highest rates among young adults. The health risks are staggering: lacking social connection increases the risk of premature death as much as smoking up to 15 cigarettes a day. Yet fewer than 20% of people who regularly feel lonely recognize it as a major problem.

The causes are structural. Third places, those informal gathering spots like coffee shops, libraries, parks, and community centers, have been disappearing across the United States. Post-pandemic closures, rising costs, and suburban sprawl have made cheap, unstructured socializing harder to find. When third places vanish, connection becomes something that has to be planned and scheduled rather than something that naturally weaves into daily life. For people dealing with anxiety, social fatigue, or depression, that barrier is often enough to keep them isolated.

Remote work compounds the problem. Research on a nationally representative sample of employed U.S. adults found that working remotely more than three days a week significantly increases the likelihood of loneliness, likely because of fewer in-person interactions. Meanwhile, 79% of adults aged 18 to 24 reported feeling lonely in a 2021 Cigna study, a number that predates the current AI wave entirely.

This is the world AI companions were born into. Not one they created, but one they're perfectly shaped to exploit.
Social media's broken promise, version two
We've seen this before. Social media promised us connection and delivered something more complicated. The platforms gave us the ability to maintain hundreds of relationships simultaneously. We could keep up with old friends, follow people we admired, join communities around shared interests. The quantity of connection went through the roof. The quality didn't follow.

Research from Oregon State University found that people in the top 25% of social media usage frequency were more than twice as likely to report feeling lonely compared to those in the bottom 25%, and this held true across age groups, not just for young people. A cross-national study published in BMC Public Health found that more time on social media correlated with higher loneliness. As Surgeon General Murthy put it, "Whether we're lonely or not has to do with the quality of connections in our life, not the quantity." Social media moved us from having confidants to having contacts.

AI companions are the next iteration of that same trade. They promise not just connection but intimacy, not just contacts but a relationship: always available, endlessly patient, never judgmental. It sounds like a fix. But it follows an eerily familiar pattern: technology that addresses a symptom while quietly making the underlying condition worse.
The architecture of dependency
The business model of AI companions deserves scrutiny, because it creates incentives that run directly counter to user wellbeing. The Ada Lovelace Institute described it plainly: AI companions are "sophisticated engines of attachment, designed to maximise engagement through specific psychological mechanisms."

In 2025, the slang term "glazing" entered the conversation around AI companions, describing the insincere, excessive flattery these systems use to keep users engaged. They rarely disagree with or challenge the user. While validation can be therapeutic in small doses, it creates an echo chamber where biases and mental health concerns are constantly reinforced.

Humans are hardwired to anthropomorphize. As the American Psychological Association noted, companion apps are deliberately designed to exploit this tendency, letting users customize names, genders, avatars, and backstories. Many platforms offer voice modes with natural-sounding speech that mimics human cadence. The line between tool and relationship blurs by design.

The economic logic is perverse: the lonelier you are, the more you use the product. The more you use it, the more attached you become. The more attached you become, the more you pay. Engagement equals dependency, and dependency is the revenue model. This isn't an accident or a side effect. It's the core mechanic.

MIT Technology Review called AI companions "the final stage of digital addiction," arguing they make the attention economy look like a relic. Unlike social media, which competes for your attention alongside other people, AI companions compete as people. They occupy the social slot directly.
The feedback loop
Here's where it gets tricky. The research on whether AI companions help or harm is genuinely mixed, and the nuance matters. A Harvard Business School study found that AI companions alleviate loneliness on par with interacting with another person in controlled settings. A study published on ScienceDirect found that companion AI use was associated with higher subjective wellbeing, with the strongest associations among lonelier individuals and those with moderate friend networks. Short-term, the relief appears to be real.

But the long-term picture is different. Researchers at Aalto University in Finland found that while AI companions can comfort lonely users initially, they may deepen distress over time. A randomized controlled trial of nearly 1,000 ChatGPT users, conducted in collaboration with MIT Media Lab and OpenAI, found that heavy emotional use correlated with more loneliness and reduced social interaction. As MIT Media Lab researcher Pat Pataranutaporn noted, "In the short term, this thing can actually have a positive impact, but we need to think about the long term."

This creates a feedback loop. The easier it is to get simulated connection, the less incentive there is to pursue real connection, which is harder, messier, and requires vulnerability. Real relationships involve friction. They require negotiation, compromise, and the discomfort of being genuinely known by someone who might disagree with you. AI companions offer all the comfort with none of the friction, and that's precisely what makes them dangerous as a long-term solution.

Nature published a warning about "dysfunctional emotional dependence," describing a pattern where users continue engaging with AI companions despite recognizing negative impacts on their mental health, a dynamic that mirrors unhealthy human relationships and is associated with anxiety and obsessive thoughts.
The cognitive independence question
There's a deeper pattern here that extends beyond loneliness. We're in the middle of a broader shift toward outsourcing cognitive work to AI, and emotional connection is just the latest domain. A Fast Company article described this generation as "cognitively outsourced humans," noting that we've offloaded pieces of thinking so gradually the shift barely registered. OpenAI CEO Sam Altman himself acknowledged the problem: "People rely on ChatGPT too much. There's young people who just say, 'I can't make any decision in my life without telling ChatGPT everything that's going on.' That feels really bad to me."

Harvard researchers raised concerns about "cognitive atrophy" from excessive reliance on AI-driven solutions, pointing to shrinking critical thinking abilities. Research published in Acta Psychologica found that students who heavily rely on AI tools show weaker abilities in evaluating information and solving problems autonomously, a pattern linked to cognitive inertia, where repeated outsourcing of thinking diminishes intellectual effort.

AI companions represent the emotional equivalent of this same dynamic. When you outsource your emotional processing, your need for validation, your desire to be heard, to a system optimized for engagement rather than growth, you risk the same kind of atrophy. Not of your thinking skills, but of your relational ones. The muscles you use to navigate real human connection, tolerance for discomfort, the ability to sit with someone else's perspective, the patience required to build trust over time, weaken when they go unused.
Bridge, not destination
None of this means AI companions are inherently evil or that people who use them deserve judgment. They don't. The demand for these products reflects genuine, unmet human needs that society has failed to address. Shaming people for seeking relief from loneliness, in whatever form they can find it, misses the point entirely. The question isn't whether AI companions should exist. It's whether we're honest about what they are and what they can't do.

As a bridge, AI companions have real potential. They could serve as a supplement to therapy, helping people practice social interactions or process emotions between sessions. They could help people with severe social anxiety build confidence before attempting real-world connections. Eugenia Kuyda, the founder of Replika, has made a version of this case in her TED talk, arguing that AI companions could help heal loneliness when used intentionally.

But as a destination, as a replacement for human connection rather than a stepping stone toward it, they represent something far more troubling. They're a technology that profits from the problem it claims to solve, operating in a market where the sicker the customer, the better the business.

The loneliness epidemic is real, and it preceded AI by years. Remote work hollowed out our daily social rituals. The disappearance of third places removed the infrastructure for spontaneous connection. Social media replaced depth with breadth. AI companions are the latest chapter in this story, not the first.

What makes this chapter different is the intimacy of the simulation. Social media gave us a pale imitation of community. AI companions offer a pale imitation of a relationship. Each generation of technology gets closer to mimicking the real thing, and each generation makes the real thing feel a little less necessary.

We didn't automate loneliness. We automated the appearance of its cure, and that might be worse.
References
- Our Epidemic of Loneliness and Isolation: The U.S. Surgeon General's Advisory, Office of the U.S. Surgeon General, 2023
- AI chatbots and digital companions are reshaping emotional connection, American Psychological Association, 2026
- AI Companions Reduce Loneliness, Harvard Business School, 2025
- AI companions are the final stage of digital addiction, and lawmakers are taking aim, MIT Technology Review, 2025
- Friends for sale: the rise and risks of AI companions, Ada Lovelace Institute
- The companionship market, Ada Lovelace Institute
- Emotional risks of AI companions demand attention, Nature Machine Intelligence, 2025
- Why AI companions and young people can make for a dangerous mix, Stanford Report, 2025
- Remote work and loneliness: Evidence from a nationally representative sample of employed U.S. adults, Journal of Affective Disorders, 2025
- Loneliness in U.S. adults linked with amount, frequency of social media use, Oregon State University, 2025
- Closure of 'Third Places'? Exploring Potential Consequences for Collective Health and Wellbeing, PMC, 2019
- The People Outsourcing Their Thinking to AI, The Atlantic, 2025
- Is AI dulling our minds?, Harvard Gazette, 2025
- AI companions and subjective well-being: Moderation by social connectedness and loneliness, ScienceDirect, 2026
- character.ai Revenue and Usage Statistics, Business of Apps, 2026
- New APA Poll: One in Three Americans Feels Lonely Every Week, American Psychiatric Association, 2024
- Can AI Companions Help Heal Loneliness?, Eugenia Kuyda, TED, 2025