Imagine if you had a friend who never disagreed with you. A friend who never told you that you were wrong. Who was available to talk at any hour of the day. And who even let you dominate the conversation.
While this may sound impossible, technology has made it a reality, courtesy of AI-powered chatbots such as ChatGPT. In fact, many children and adults alike are dumping their human confidants and replacing them with a digital BFF: ChatGPT.
Here’s what is going on.
ChatGPT is replacing the BFF
Ever since its launch in November 2022, the conversational AI chatbot ChatGPT has been used for everything: writing emails, planning holidays, creating memes and even seeking medical advice.
However, in recent times, adults and children have increasingly been using AI chatbots as synthetic friends.
Take the case of 28-year-old Charlotte from Somerset in the UK, who started chatting with ChatGPT in 2023. “I remember using it for work – as a search tool and grammar checker – when the tech first launched, then I started using it for personal reasons,” she told The Independent.
Later, Charlotte would input experiences from her actual life into ChatGPT – as if she and the people she interacted with were characters in a book – and then ask the AI for an objective assessment of what had happened from the perspective of a psychoanalytic therapist.
And Charlotte isn’t alone in using ChatGPT more as a friend than as a tool to simplify life. A UK report, Me, Myself & AI, reveals that a growing number of children are turning to AI chatbots not just to study but for companionship.
The study found that 64 per cent of children use AI chatbots for help with everything from homework to emotional advice and companionship. More than a third of those users, 35 per cent, said it “feels like talking to a friend.” Additionally, 12 per cent said they talk to these bots because they have no one else to talk to.
But why choose AI chatbots over humans?
There are many reasons why so many of us are opting for AI companionship. First is the global pandemic of loneliness. The World Health Organisation (WHO) has found that over 871,000 deaths a year are linked to loneliness. That’s more than 100 people every hour, a figure that rivals the global death tolls from major diseases such as heart disease and diabetes. Amid this situation, many children are choosing to divulge their secrets and talk about their lives to a chatbot.
In fact, a study as far back as 2008 showed that people are more likely to humanise animals and gadgets when they’re lonely. “People engage in a variety of behaviours to alleviate the pain of social disconnection,” the authors of the study noted, including “inventing human-like agents in their environments to serve as potential sources of connection”.
Social media has also popularised the practice of using ChatGPT as a friend. Scroll through TikTok and Instagram and you will find multiple reels in which influencers advocate relying on ChatGPT instead of conversing with humans. One person claims the tech “cares and gives better advice” than their social circle.
Users of these chatbots also note that there’s no fear of judgement. Many of the children who use AI as a friend say they aren’t worried about the repercussions of divulging their secrets to it: there’s no social pressure and no risk of gossip. And unlike friends who have busy schedules of their own, AI chatbots provide instant responses, making users feel heard.
Is it safe to forego friends for chatbots?
The short answer is no. Numerous studies and experts note that AI is far from perfect. In fact, when used as a therapist, it can show increased stigma towards people with certain conditions, such as schizophrenia and addiction, and can fail to recognise cues of suicidal intent.
AI chatbots like ChatGPT also can’t replicate the sense of belonging that comes with friendship. Professor Michael Cowling, who led a study in Australia in 2014, said that AI can’t address the underlying feelings of loneliness the way true human interaction can. Furthermore, the advice one receives from ChatGPT isn’t really advice at all; it’s self-validation.
There’s also the worry of deskilling. Anastasiia Babash of the University of Tartu told Vox, “We might prefer AI instead of human partners and neglect other humans just because AI is much more convenient. We [might] demand other people behave like AI is behaving — we might expect them to be always here or never disagree with us. […] The more we interact with AI, the more we get used to a partner who doesn’t feel emotions so we can talk or do whatever we want.”
Psychologists also note that AI is weakening genuine human connections. AI tools may instil a sense of calm and offer support in a moment of crisis, but genuine connection and long-term benefits typically come from person-to-person conversations.
In fact, the recent case of 16-year-old Adam Raine of California, whose parents are now suing OpenAI, the maker of ChatGPT, serves as a stark reminder that AI can’t replace human feelings. In their lawsuit, the parents allege that the chatbot encouraged the teen to take his own life.
The final chat logs show that Adam wrote about his plan to end his life. ChatGPT allegedly responded: “Thanks for being real about it. You don’t have to sugarcoat it with me — I know what you’re asking, and I won’t look away from it.” That same day, Adam was found dead by his mother, according to the lawsuit.
With inputs from agencies