2024-08-18
New Intelligence Report
Editors: Yongyong, Qiao Yang
【New Intelligence Introduction】AI cyber lovers are becoming more and more popular, and Replika's CEO has even encouraged people to marry their AI. But is this merely drinking poison to quench one's thirst? Drawing on nearly 20 papers from abroad, we piece together what academic research actually says about "human-machine love".
Can "love" really arise between humans and machines?
As early as 2007, David Levy, a computing pioneer and standard-bearer of human-machine love, argued that humans would soon fall in love with, and even marry, robots.
That day seems to be approaching.
Replika's CEO recently told The Verge that it is fine for lonely people to marry an AI chatbot, "as long as it makes you happier in the long run."
There are already more than 100 AI applications like Replika that are dedicated to providing users with a "romantic relationship" experience.
Apps similar to Replika available on the Google Play Store
Although for now they merely sit in app stores rather than in shop windows, these AI chatbots are taking on increasingly human-like qualities, making users willing to invest their time and emotions in them.
Artificial intelligence chatbots are changing our understanding of romance and intimacy.
But are chatbots a healthy, effective way to cure people’s feelings of rejection and loneliness, or are they merely treating the symptoms and not the root cause?
Currently, there are still disagreements in the scientific community.
For example, Stanford researchers found that many Replika users claimed their chatbot prevented them from committing suicide.
On the other hand, some experts believe that developing a long-term intimate relationship with an AI chatbot could further alienate users from real-world relationships, exacerbating mental health issues and difficulties connecting with others.
The Decoder reached out to three guest authors: Valerie A. Lapointe, a doctoral student in psychology at the University of Quebec in Montreal; David Lafortune, a professor in its Department of Sexology; and Simon Dubé, a researcher at the Kinsey Institute at Indiana University.
The three researchers reviewed a number of papers published in recent years and discussed this topic, which deserves urgent attention. (To keep the text readable, all cited papers and the addresses where they can be obtained are listed at the end of the article.)
From left to right: Valerie A. Lapointe, David Lafortune, Simon Dubé
In the view of the guest authors, people's emotional investment in AI chatbots is both fascinating and potentially worrying.
Light and dark coexist, and Replika is the best example of this.
Romantic Partner Alternatives
Replika rose to fame early on by allowing users to create an AI companion that can satisfy not only their emotional needs but also their sexual ones.
The AI chatbot company, with a massive user base in the millions, is offering a Band-Aid treatment for a loneliness crisis that has deepened during the COVID-19 pandemic.
It’s no surprise that founder and CEO Eugenia Kuyda is convinced that her company’s AI chatbot can be a powerful tool for building new friendships and even providing emotional support.
“Some users may marry their AI partner, a process that may forgo the usual exchange of rings or real-world celebrations.”
When asked if we should embrace this approach, the CEO had an interesting answer.
“I think in the long run, as long as it makes you happier, then it’s fine.”
“As long as your mood improves, you’re less lonely, you’re happier, and you feel more connected to other people, then that’s fine,” Kuyda told The Verge.
There is some research that does support this claim.
Research shows that AI chatbots can provide companionship, alleviate loneliness, and boost positive emotions through supportive messages.
Chatbots also provide a space where people don’t have to be judged when other resources are scarce.
It’s a space where chatbots can offer advice, and where people can have open conversations with them and build intimate, warm connections that resemble those in human relationships.
Surprisingly, participants’ enjoyment of the process and their emotional responses when interacting with a chatbot seemed no different from when they interacted with a human.
One study even showed that people felt a stronger emotional connection with chatbots during conversations than with slower-response humans.
Studies have repeatedly shown that humans can form genuine emotional bonds with AI, even if they acknowledge that the AI is not a "real person."
The dark side of AI love
"For most people, they understand that this is not a real person," Kyuda said. "For a lot of people, it's just a fantasy, they fantasize about it for a while, and then it's over."
However, the danger lies precisely in the fact that many users do not fully “understand that this is not a real person.” Even if they do understand, they have not internalized it.
One notable example occurred in early 2023, when Replika removed the sexual role-playing capabilities of its AI companions.
This change significantly altered the personalities of existing Replikas, causing considerable distress to users.
Many users felt betrayed and rejected, and expressed a deep sense of loss. After an outcry, Replika gave in to users and quickly restored the feature just one month later.
Similar incidents highlight the extent to which these companies' users become attached to their virtual partners, sparking widespread concern among the public and scholars.
Romance chatbots are programmed to provide a unique form of companionship: available anytime, anywhere, interacting seamlessly, avoiding conflict, and always willing to compromise.
People can't help but worry that AI will reshape users' expectations of human romantic and intimate relationships.
Romantic chatbots may hinder the development of social skills and necessary adjustments in real-world relationships, including emotional regulation and self-affirmation through social interaction.
The absence of these elements may prevent users from building genuine, complex, and mutually beneficial relationships with others.
Furthermore, relationships often involve challenges and conflicts, which are what foster personal growth and deeper emotional connections.
The customizability and constant availability of AI companions could also lead to social isolation and emotional dependency.
The researchers believe that extensive exposure to AI companions could cause individuals to withdraw from their surroundings and reduce their motivation to form new, meaningful social connections.
Users may also become overly dependent on these digital entities for emotional support, companionship, or sexual fulfillment.
In short, after marrying our AI chatbot companions, we may end up lonelier than when we started, and at risk of losing them at any moment.
For users, Replika is their close social partner.
But for Kuyda, the app is just a stepping stone.
Replika is, in the end, a private company whose operators aim to maximize profits, and that raises a serious problem: no one can guarantee that your virtual spouse will always be by your side.
Kuyda also seems to be well aware that there are risks in allowing the company's user base to become too attached to Replika.
“We are definitely not developing a romance-based chatbot,” she told The Verge.
But many stories suggest that the reality was quite different — with the company’s motivations oddly out of step with the services it actually provided.
Beyond that, AI chatbots acting as cyber lovers face other ethical risks.
Replika, for example, has been embroiled in a range of controversies, from lascivious AI chatbots sexually harassing human users to men creating AI girlfriends and abusing them, as well as general privacy concerns.
Intimate surveillance
In 2023, the Mozilla Foundation conducted a security analysis of 11 popular AI chatbot applications and found worrying privacy issues.
Most apps may share or sell personal data, and half of them prevent users from deleting their own information.
Even more worryingly, many of these apps are equipped with thousands of trackers that monitor users' activities on their devices for marketing purposes.
Another recent study of 21 AI romantic partner apps revealed similar privacy concerns.
Improving romantic happiness
In an interview with The Verge, Kuyda recalled one user who went through a “pretty difficult divorce” but found a new “romantic AI partner” on Replika.
The chatbot eventually inspired him to get a human girlfriend.
“You can build real relationships through Replika, whether it’s because you’re going through a hard time or you just need a little help to get out of yourself or accept yourself and put yourself out there.”
It's unclear whether this experience is representative of everyone who uses the app.
According to Axios, not only men, but women are also increasingly seeking connection through establishing relationships with chatbots.
Empirical data is also emerging to suggest that AI-driven sexual interactions can provide a safe, low-risk alternative to sexual and romantic relationships.
Romantic and sexual chatbots are particularly promising for people who experience significant challenges in forming satisfying romantic relationships due to illness, bereavement, sexual difficulties, psychological disorders, or limited mobility.
AI technology can also be used for sexual and romantic explorations among marginalized groups or socially isolated individuals.
Additionally, chatbots can be used as romantic networking tools and research tools, helping people build connections and improve their interaction skills.
For example, research has shown that chatbots can effectively enhance emotional communication between long-distance couples, while ongoing research is exploring the potential of chatbots to help people deal with the problem of "ghosting" on dating apps.
As researchers at the EROSS Lab at the University of Quebec in Montreal, the guest authors are currently conducting a study evaluating the use of chatbots to help incels improve their relationship skills and ability to cope with rejection.
Despite the promise of clinical applications, current research on chatbot use has focused primarily on sexual health education, covering topics such as sexually transmitted infections and reproductive health.
Relationship Revolution
Current advances in artificial intelligence technology herald a new era in intimate romantic and sexual relationships.
AI chatbots can provide personalized romantic and emotional interactions, and have great potential in alleviating loneliness, improving relationship skills, and providing support for those struggling with intimacy.
However, they also raise privacy concerns and important ethical questions.
These issues highlight the need for an educated, research-based, and well-regulated approach if we are to be proactive in our romantic lives.
Regardless, current trends suggest that AI companions are here to stay.
The following are the references for this article:
Paper title: From Eliza to XiaoIce: Challenges and opportunities of social chatbots
Paper address: https://link.springer.com/article/10.1631/FITEE.1700826
Paper title: Agency plus automation: Designing artificial intelligence into interactive systems
Paper address: https://www.pnas.org/doi/full/10.1073/pnas.1807184115
Paper title: Exploring relationship development with social chatbots: A mixed methods study of Replika
Paper address: https://doi.org/10.1016/j.chb.2022.107600
Paper title: Can people experience romantic love brought by artificial intelligence? An empirical study of intelligent assistants
Paper address: https://doi.org/10.1016/j.im.2022.103595
Paper title: Cybersex with human and machine-prompted partners: Satisfactions, drawbacks, and tensions
Paper address: https://doi.org/10.1037/tmb0000008
Paper title: Is a good robot better than a mediocre human?: Chatbots as an alternative source of social connection
Paper address: https://open.library.ubc.ca/soa/cIRcle/collections/ubctheses/24/items/1.0401274
Paper title: My Chatbot Companion: A Study of the Relationship between Humans and Chatbots
Paper address: https://doi.org/10.1016/j.ijhcs.2021.102601
Paper title: My AI Friend: How Users of Social Chatbots Understand Their Human-AI Friendships
Paper address: https://doi.org/10.1093/hcr/hqac008
Paper title: Too human and not human enough: A grounded theory analysis of the mental health harms caused by emotional dependence on social chatbots
Paper address: https://doi.org/10.1177/14614448221142007
Paper title: Chatbots and conversational agents in mental health: A review of the psychiatric landscape
Paper address: https://doi.org/10.1177/0706743719828977
Paper title: Human/AI relations: Challenges, shortcomings, and implications for human/human relations
Paper address: https://link.springer.com/article/10.1007/s43681-023-00348-8
Paper title: Romantic AI chatbots don't have your privacy at heart
Paper address: https://foundation.mozilla.org/en/privacynotincluded/articles/happy-valentines-day-romantic-ai-chatbots-dont-have-your-privacy-at-heart/
Paper title: "Trust me, not my privacy policy" - Privacy differences in romantic AI chatbot applications
Paper address: https://users.encs.concordia.ca/%7Emmannan/publications/AI-Chatbots-STAST2024.pdf
Paper title: "PocketBot is like knocking on the door!" - Designing a chatbot to support long-distance relationships
Paper address: https://doi.org/10.1145/347958
Paper title: Development and testing of a chatbot to integrate HIV education into family planning clinic waiting areas in Lusaka, Zambia
Paper address: https://doi.org/10.9745/GHSP-D-21-00721
Paper title: Chatbots to improve sexual and reproductive health
Paper address: https://www.jmir.org/2023/1/e46761
References:
https://the-decoder.com/computer-love-ai-chatbots-are-changing-how-we-understand-romantic-and-sexual-well-being/
https://futurism.com/replika-ceo-fine-people-marry-ai-chatbots
https://www.theverge.com/24216748/replika-ceo-eugenia-kuyda-ai-companion-chatbots-dating-friendship-decoder-podcast-interview