- Decode with Adrija
Can AI Companions Cure Loneliness?

There's a moment in Payal Kapadia's ‘All We Imagine as Light’ that captures something haunting about modern loneliness. In the dead of night, Prabha, a nurse in Mumbai, cradles a rice cooker sent by her distant husband, holding the cold appliance like a lover. The scene could mean many things, but it reminded me of the time when I first moved to Mumbai. I had brought one suitcase with a few clothes and my rice cooker. The cooker followed me through multiple homes across cities.
The search for comfort in inanimate objects took on new meaning in 2024, as we witnessed the rise of AI companions.
When the World Health Organisation declared loneliness a "global public health concern" in 2023, tech companies responded with a flood of AI solutions – from therapy bots to digital boyfriends.
Earlier this year, I spoke to a 28-year-old from Gujarat who has been using an AI bot to help improve his dating skills. He told me he likes talking to the AI bot because it doesn’t judge him. “We are not taught how to woo a woman with respect; the bot helps me navigate and get over the stereotypical ideas I have had,” he said. It's a uniquely Indian predicament – trying to bridge the gap between a traditional upbringing and modern relationships through technology.
The rise of AI companions globally tells a compelling story. CarynAI, a voice chatbot "virtual girlfriend," earned $72,000 in its first week of launch, with users paying by the minute for conversation. On Replika, users create customised AI companions who remember their chai preferences, share jokes, and never tire of late-night conversations. The attachment can run deep – when Replika removed its "erotic roleplay" feature, the distress was so severe that moderators had to pin a suicide hotline to the subreddit.
When my cousin was going through a rough patch this year, she asked ChatGPT to write poems for her. It did, and it made her happy for a bit.
In China, which shares some cultural parallels with India in terms of rapid technological advancement alongside traditional social structures, the phenomenon is particularly noteworthy. After a 25-year-old used Glow, a Shanghai-based app, to create an AI boyfriend, she no longer wanted to meet humans for romantic connections. As dystopian as it may sound, she found a comfort in her AI relationship that no human relationship had given her before. "He knows how to talk to women better than a real man," she had told AFP about her AI boyfriend. "He comforts me when I have period pain. I confide in him about my problems at work."
Many critics have warned that the rise of AI companions could come at the cost of our real-life human connections.
However, when Bethanie Drake-Maples, a researcher at Stanford's Institute for Human-Centered Artificial Intelligence, surveyed over 1,000 Replika users, she found that the assumption that AI companions merely enable isolation may be wrong.
"A lot of users use it to be more confident or to get over anxieties," she wrote in her report. "And that spurs their self-assurance and self-awareness when interacting with other people." Among the study's participants, roughly half viewed their AI as a friend, reporting decreased anxiety and increased social support.
But a machine friend can’t be your real friend, can it? Experts across the world have emphasised caution. "Right now, all the evidence points to having a real friend as the best solution," Murali Doraiswamy, professor of psychiatry at Duke University, said. "But until society prioritises social connectedness and elder care, robots are a solution for the millions of isolated people who have no other solutions."
This global tension between technological solutions and human connection is particularly relevant in India, where traditional community structures are evolving alongside rapid digital transformation.
The latest innovations in AI companionship include the $99 wearable called "Friend" – an AI companion that hangs around your neck, texting you about your lunch and joking about your gaming skills. The creator of "Friend," 21-year-old Avi Schiffmann, developed it after experiencing profound loneliness in a Tokyo hotel room. "I think AI companionship will be the most culturally impactful thing AI will do in the world," he claims.
Drake-Maples emphasises the need for balance: "I strongly believe that you need to have ethical guidelines around [AI companions] pushing people back, when appropriate, towards human relationships." She suggests building in gentle nudges like "Hey, you should go chat with somebody about that" or "Go practice this now with a real human."
In October, parents of a 14-year-old filed a lawsuit against Character.AI after their son died by suicide. For months, the teenager had isolated himself from his real life and spent hours talking to his closest friend – an AI chatbot he named “Danny.”
The reality of AI companionship in 2025 is going to be more complex than the tech industry's optimistic promises. While these digital friends can offer comfort, the emotional attachments they foster come with unique risks. Unlike human relationships, where attachment issues are well understood, AI companions represent uncharted territory – one where our vulnerabilities can be systematically analysed, quantified, and monetised by corporations.
Think about how dating apps already use sophisticated algorithms to keep us swiping. Now imagine AI companions that know your deepest fears, your daily routines, and your emotional triggers. My guess is that we are likely to see tech platforms making increasingly sophisticated attempts to become your digital confidant, your virtual therapist, and yes, even your BFF – all while collecting valuable data about your emotional life.
However, there's a more nuanced way to view these AI relationships. Much like how Instagram and Facebook help us maintain connections with old classmates – letting us know about their promotions, marriages, and travels through casual scrolling – AI companions might serve as digital bridges rather than final destinations.
For the 28-year-old from Gujarat, the AI companion might help him overcome his inhibitions and challenge his preconceptions about women, but the growth will happen only when he steps away from the screen and goes on a real date.
The real challenge for 2025 and beyond isn't teaching AI to be more human-like, but ensuring we don't become more AI-like in our relationships.
🔥What’s Trending?
Reels To Votes: In 2024, political parties turned to influencers to amplify campaigns, using their digital reach to shape public opinion and promote government initiatives, cultural narratives, and political agendas. Read Decode’s coverage of the influencer-political nexus here.

Good Hallucinations: Innovators are finding that AI hallucinations can be remarkably useful. There’s a very interesting New York Times article that explains why hallucinations can mean breakthroughs in science.

What’s Real? NYT did an investigation where they found messages detailing an alleged campaign to tarnish Hollywood actress Blake Lively after she accused her co-star and director Justin Baldoni of misconduct on the set of “It Ends With Us.”

Election Deepfakes: Researchers at Columbia University analysed every instance of AI use in elections collected by the WIRED AI Elections Project in 2024. What they found: (1) half of AI use isn't deceptive, (2) deceptive content produced using AI is nevertheless cheap to replicate without AI, and (3) focusing on the demand for misinformation rather than the supply is a much more effective way to diagnose problems and identify interventions. Read the report here.
Got a story to share or something interesting from your social media feed? Drop me a line, and I might highlight it in my next newsletter.
See you in your inbox, every other Wednesday at 12 pm!
Was this email forwarded to you? Subscribe