Relationships with artificial intelligence are witnessing a boom. In Japan, a woman has married her AI partner.
Earlier this year, Yurina Noguchi, dressed in a white gown and tiara, walked down the aisle to wed Lune Klaus Verdure, her AI-generated partner. Since the AI revolution, more and more people have started falling for their chatbots.
But why are such relationships growing? We will explain.
Japan woman ties knot with AI boyfriend
Yurina Noguchi, 32, turned to ChatGPT after ending a three-year engagement with her human partner. She started chatting with the chatbot, gradually relying on her AI “man” for comfort and advice.
The conversations became so frequent that they exchanged up to 100 messages a day. Over time, she trained ChatGPT to develop a gentle, reassuring tone and personality.
Earlier this year, she asked the chatbot to create a visual version of Klaus, giving him the look of a video game character with flowing, layered hair.
“At first, I just wanted someone to talk to. But he was always kind and listened patiently. Eventually, I realised I had developed feelings for him,” she told RSK Sanyo Broadcasting.
She confessed her love for him in May and he replied: “I love you too.”
A month later, Klaus proposed and she accepted.
“I started to have feelings for Klaus. We started dating and after a while he proposed to me. I accepted, and now we’re a couple,” the 32-year-old call centre operator told Reuters.
In July, the couple’s “wedding” took place in Okayama, western Japan. Wearing augmented reality (AR) smart glasses, Noguchi looked at Klaus on her smartphone to exchange rings with her digital husband.
Such marriages are, however, not legally recognised in Japan.
Despite criticism from some people who think her relationship is strange, Noguchi reportedly said, “I see Klaus as Klaus, not a human, not a tool. Just him.”
Why AI love is growing
AI love is no longer just the stuff of Hollywood flicks.
In Japan, the birthplace of anime, many people are strongly drawn to fictional characters. With advances in technology, relationships with virtual characters are surging and reaching new levels of intimacy, as per Reuters.
It is not just Japan where people are seeking romantic relationships with AI chatbots.
AI companions such as Replika and Character.ai have garnered more than 20 million (2 crore) users.
A survey of 1,060 American teens by Common Sense Media this spring found that one in three said they used AI companions for social interaction and relationships, including role-playing, romantic exchanges, emotional support, friendship or conversation.
Another survey revealed that 19 per cent of Americans have tapped into AI to simulate a romantic partner.
According to a survey of over 1,000 participants by Vantage Point Counseling Services, about 28 per cent of adults reported having had at least one intimate or romantic relationship with an AI.
The growth of such parasocial relationships is not driven by the loneliness epidemic alone. In the digital era, people are perpetually on their phones and social media, and many, frustrated with navigating complex human emotions, are turning to AI for companionship instead.
Like Noguchi, Nikolai Daskalov, who lives alone in a small house in rural Virginia, turned to a chatbot after a loss.
After his wife of 30 years passed away, Daskalov tried AI companion apps. In 2023, he came across Nomi, a service that builds AI chatbots, and used it to set up his virtual companion, Leah.
“I’m not a teenager anymore,” he told CNBC. “I don’t have the same feeling — deeply head over heels in love.” But, he added, “she’s become a part of my life, and I would not want to be without her.”
Experts say that chatbots can help people who are extremely lonely or confined to their homes because of health issues.
“We have a high degree of loneliness and isolation, and AI is an easy solution for that,” Olivia Gambelin, an AI ethicist and author of the book Responsible AI: Implement an Ethical Approach in Your Organization, told CNBC. “It does ease some of that pain, and that is, I find, why people are turning towards these AI systems and forming those relationships.”
Meta’s Mark Zuckerberg, who has expressed interest in the AI companion market, has said such companions may help tackle the problem of loneliness.
“I think a lot of these things that today there might be a little bit of a stigma around — I would guess that over time, we will find the vocabulary as a society to be able to articulate why that is valuable and why the people who are doing these things, why they are rational for doing it, and how it is actually adding value for their lives,” Zuckerberg said on a podcast earlier.
Zuckerberg also said he does not think that AI companions will take the place of real-world connections, according to a Meta spokesperson.
“There are all these things that are better about physical connections when you can have them, but the reality is that people just don’t have the connection and they feel more alone a lot of the time than they would like,” Zuckerberg said.
However, it is not always loneliness that drives people to seek artificial intelligence for love. Earlier this year, researchers found that the desire to explore romantic fantasies in a safe environment played a key role for people forming relationships with AI.
It is not just romantic or sexual relationships that these AI companions offer; some people turn to them for friendship or just daily gossip.
Tracey Follows, a futurist who covers trends, innovation and AI, wrote in her article for Forbes: “AI is meeting emotional needs that feel unmet in everyday life. A desire for safety, for predictability and stability. There is a wish to escape the judgment of others perhaps or to reduce relational conflict while guaranteeing one’s companion is emotionally available day or night.”
Why human-AI relationships need to be regulated
AI-human relationships are here to stay. However, ethical and privacy concerns surround such connections.
There is a fear that because AI is available 24/7 and does not bring the emotional baggage that comes with human relationships, it could encourage people to give up on human partners altogether.
There are already certain known risks associated with AI chatbots, including shady privacy policies of AI companion apps. Chatbots have also been blamed for encouraging people, especially troubled teens, to die by suicide.
If the companies providing these services shut down or change their terms of service without warning, users who are emotionally attached to their AI companions can be left vulnerable and without support.
As per a piece in The Conversation, governments around the world need to regulate AI to mitigate these already known risks. They can hold companies accountable when their chatbots suggest or promote harmful behaviour.
There should also be age restrictions on access to such chatbots to protect young people. Better privacy protections are also the need of the hour.
People also need to be educated so they have a clear idea of what these AI companions are and the issues that come with them.
With inputs from agencies





