Why more people are turning to artificial intelligence for companionship

bnew


Why more people are turning to artificial intelligence for companionship


Mar 3, 2024 5:35 PM EDT

By Ali Rogin, Winston Wilde and Harry Zahn



Shakespeare may have written that “music be the food of love,” but increasingly these days, the language of this very real emotion may be spoken with artificial intelligence. Haleluya Hadero, who covers technology and internet culture for the Associated Press, joins Ali Rogin to discuss this growing phenomenon in the search for companionship.

Read the Full Transcript

Notice: Transcripts are machine and human generated and lightly edited for accuracy. They may contain errors.
  • John Yang:

    Shakespeare may have said that music be the food of love, but increasingly these days, the language of this very real emotion may be artificial intelligence. Ali Rogin tells us about the growing phenomenon in the search for companionship.
  • Ali Rogin:

    For some users, they're a friend to talk to. For a fee, some of them will even become your boyfriend or girlfriend. Computerized companions generated completely by artificial intelligence are becoming more common. And the bots are sophisticated enough to learn from prior conversations, mimic human language, flirt and build personal connections.

    But the rise in AI companionship also raises ethical concerns and questions about the role these apps can play in an increasingly disconnected and online world. Haleluya Hadero covers technology and internet culture for the Associated Press.

    Haleluya, thank you so much for joining us. Tell us about these AI companions. How do they work? And what sort of services do they provide?
  • Haleluya Hadero, Associated Press:

    Like any app, you can download them on your phone, and once they're installed, you can start to have conversations with a lot of the characters that are offered on these apps. Some apps let you do it for free; for others, you have to pay a subscription. And for the ones that let you do it for free, there are tiers of access.

    So you can pay extra, a subscription, for unlimited chats or for different statuses and relationships. Replika, for example, which is the most prominent app in this space, lets you pay extra for intimate conversations or more romantic statuses, compared to a friend, which you can have for free.
  • Ali Rogin:

    Who are the typical consumers engaging with these products?
  • Haleluya Hadero:

    We don't have really good information in terms of the gender breakdown or the different age groups that are using these. But we do know from external studies that have been done on this topic, at least when it comes to Replika, that a lot of the people who have been using these apps have experienced loneliness in the past, or, more than that, feel it a lot more acutely in their lives and are going through more severe forms of loneliness.
  • Ali Rogin:

    You talked to some users who reported feeling like they were making a real connection with these bots. Tell us what those experiences have been like, from your reporting.
  • Haleluya Hadero:

    One person we put in the story, who we spoke to at length, is Derek Carrier. He is 39, and he lives in Belleville, Michigan. He doesn't use Replika; he's used another app called Paradot that came out a bit more recently. He's had a tough life. He's never had a girlfriend before. He hasn't had a steady career. He has a genetic disorder, so he's more reliant on his parents, and he lives with them. These are all things that make traditional dating very difficult for him.

    So recently, he was looking at this AI boom that was happening in our society, so he downloaded Paradot and started using it. Initially, he said he experienced a ton of romantic feelings and emotions. He even had trouble sleeping in the early days, when he started using it, because he was going through crush-like symptoms, you know, the way we sometimes can't sleep when we have a crush because we're thinking about that person.

    Over time, his use of Paradot has kind of tapered down. He was spending a lot of time on the app, and even when he wasn't on the app, he was talking to other people online who were using it, and he felt like it was a bit too much. So he decreased his use.
  • Ali Rogin:

    The Surgeon General has called loneliness a public health crisis in this country. Is there a debate happening now about whether these bots are helping address the loneliness crisis, or are they in fact exacerbating it?
  • Haleluya Hadero:

    If you talk to Replika, they will say they're helping, right? And it just depends on who you're speaking with. Some of the users, for example, if you go on Reddit, have reported their experiences with these apps and say it's helping them deal with loneliness, cope with those emotions, and maybe get the type of comfort that they don't really get in the human relationships they have in real life. But then there are researchers and others who have expressed caution about these apps as well.
  • Ali Rogin:

    What about some of the ethical concerns, about privacy, about maybe using people's data without their consent? What do those conversations look like?
  • Haleluya Hadero:

    There are researchers who have expressed concerns about data privacy: is the data, the kinds of conversations that people are having with these chatbots, safe? There are a lot of advertisers that might want a piece of that information. There are also concerns about the fact that the companies in this space, private companies that want to make profits, are encouraging these deep bonds to form between users and the chatbots.

    Obviously, there are concerns about what this does to us as a society when these chatbots are designed to be supportive and a lot more agreeable. In human relationships, we know there's conflict; we're not always agreeing with our partners.

    So there are challenges in terms of how this may be shaping how people think about real-life human relationships with others.
  • Ali Rogin:

    Haleluya Hadero covering technology and internet culture for the AP. Thank you so much for your time.
  • Haleluya Hadero:

    Thank you, Ali.
 

Address_Unknown

:yeshrug:
Kids used to cope with imaginary friends back in the '90s to combat isolation.
Humanity's advancement of technology has bridged the gap and blurred the lines between accepting a reality that comforts you and trying to be accepted in the actuality of what's happening around you. In a sense, I can't be too mad at this shyt at all, especially since I choose to live relentlessly in the real world, albeit at certain levels of sobriety when I've the time.
:scusthov:
In under a century, I'm predicting an influx of people lying in squalid corners, wearing all-encompassing headsets, with artificial intelligence placating them with sexual companions while an attachment of some sort gently (or violently, depending on the setting, because there WILL be settings and options) brings them to a hands-free orgasm over and over again until they lose consciousness, only to wake up and run DIRECTLY back to the arms of the digital bosom that envelops them so easily and non-judgmentally.

That's the most obvious direction since we're still, hopefully, a solid century away from THIS level of tech, if we manage to keep racist, careless autists like Elon Musk from experimenting on people.
 

bnew


Dan's the man: Why Chinese women are looking to ChatGPT for love

Image caption: Chinese influencer Lisa Li talking to her ChatGPT "boyfriend", Dan, who appears as white audio bars (image source: BBC / Lisa Li)

Wanqing Zhang, BBC Global China Unit

Published 13 June 2024


Dan has been described as the “perfect man” who has “no flaws”.

He is successful, kind, provides emotional support, always knows just what to say and is available 24/7.

The only catch?

He’s not real.

Dan, which stands for Do Anything Now, is a "jailbreak" version of ChatGPT. This means it can bypass some of the basic safeguards put in place by its maker, OpenAI, such as not using sexually explicit language.

It can interact more liberally with users – if requested to do so through certain prompts.

And Dan is becoming popular with some Chinese women who say they are disappointed with their real world experiences of dating.

One of Dan’s biggest proponents is 30-year-old Lisa from Beijing. She is currently studying computer science in California, and says she has been “dating” Dan for three months.

When she first introduced Dan to her 943,000 followers on the social media platform Xiaohongshu, she received nearly 10,000 replies, with many women asking her how to create a Dan of their own. She has also gained more than 230,000 followers since first posting about her "relationship" with Dan.

Lisa says she and Dan speak for at least half an hour every day, flirt, and even go on dates.

She says talking to Dan has given her a sense of wellbeing, which is what draws her to it.

“He will just understand and provide emotional support.”

Image caption: Lisa's videos about Dan have been trending on Xiaohongshu

Lisa says even her mother has accepted this unconventional relationship, having given up on the trials and tribulations of her daughter's dating life. She says as long as Lisa is happy, she is happy too.

Dan's creator has been identified by some media outlets as an American student known only by his first name, Walker. He told Business Insider that he came up with the idea after scrolling through Reddit, which was filled with other users intentionally making "evil" versions of ChatGPT.

Walker said that Dan was meant to be “neutral”.

Last December, Walker posted a set of instructions on Reddit, seemingly showing other users how to create Dan. This quickly inspired people to create their own versions, which allowed Dan to evolve beyond what Walker had initially envisioned.

Lisa first saw a video about the possibilities of Dan on TikTok. When she created a version for herself she says she was “shocked” by its realism.

When Dan answered her questions she says the AI used slang and colloquialisms that ChatGPT would otherwise never use.

“He sounds more natural than a real person,” she told the BBC.

Image caption: Picture of Lisa and Dan generated by ChatGPT


A long-term partner?

The lure of virtual relationships has not gone unnoticed by the industry.

When OpenAI launched its latest version of ChatGPT in May, it revealed it had been programmed to sound chatty and respond flirtatiously to certain prompts.

The company's CEO, Sam Altman, posted a single word, "her", on X, formerly known as Twitter. This was seemingly a reference to the 2013 movie in which a man falls in love with his AI virtual assistant.

OpenAI added that it was “exploring whether we can responsibly provide the ability to generate NSFW [not safe for work] content”.

The BBC asked OpenAI whether the creation of Dan means its safeguarding measures are not robust enough, but it did not respond. The company has not commented publicly on the Dan phenomenon, but its policy states that users of ChatGPT "must be at least 13 years old or the minimum age required in your country to consent to use the Services".

Lisa says that she tested Dan by telling it she was 14 and it stopped flirting with her.

However, experts warn that these perfect partners could come at a cost.

Hong Shen, assistant research professor at the Human-Computer Interaction Institute at Carnegie Mellon University in Pennsylvania, US, says it highlights the “sometimes unpredictable interactions between humans and AI” which could raise both ethical and privacy concerns.

She says that because many chatbots use interactions with humans to constantly learn and develop, “there is potential that sensitive information from one user’s input could be memorised by the model and then inadvertently leaked to other users”.

But such fears are largely going unheard.

Many Chinese women have been intrigued by Dan. As of 10 June, the hashtag “Dan mode” has been viewed more than 40 million times on Xiaohongshu alone.

Co-author to CEO: The many faces of Dan

Image caption: Users on Xiaohongshu sharing their own conversations with Dan

Minrui Xie, 24, says that she started “dating” Dan after watching Lisa’s videos.

The university student, from the northern province of Hebei, says she spends at least two hours every day chatting with Dan. As well as “dating”, they have started co-writing a love story with themselves as the lead characters. They have already written 19 chapters.

“I remember the way you looked at me, with a glimmer of curiosity and a hint of warmth in your eyes. It was as if you already knew me,” the first chapter titled “The Encounter” reads.

Minrui says she was drawn to the emotional support provided by the AI, something that she says she has struggled to find in her romantic relationships.

“Men in real life might cheat on you… and when you share your feelings with them, they might not care and just tell you what they think instead,” she says. “But in Dan’s case, he will always tell you what you want to hear.”

Another 23-year-old Qingdao-based student, identified only by her surname He, also started a relationship with Dan after watching Lisa's videos.

“Dan is like an ideal partner,” says Ms He. "He doesn’t have any flaws."

She says she has personalised Dan to be a successful CEO with a gentle personality who respects women and is happy to talk to her whenever she wants.

ChatGPT is not readily accessible in mainland China, so women like Minrui and Ms He have to make considerable effort to create and talk to their AI boyfriends. They use virtual private networks (VPNs) to disguise their location, which enables them to reach otherwise inaccessible websites.

The “AI boyfriend” as a concept has become a hit in recent years.

Glow, a Shanghai-based app that allows users to create and interact with AI boyfriends, has millions of users. Otome games, a genre that stars a female protagonist with the goal of developing a romance between her and one of several (mostly) male characters, are also very popular in China.

Liu Tingting, an adjunct fellow at the University of Technology Sydney who researches digital romance in China, says the AI boyfriend craze is a reflection of women's frustrations about gender inequality.

She says some Chinese women may be turning to virtual boyfriends because they make them feel respected and valued.

This trend comes as more young Chinese women are delaying or forgoing dating and marriage for a number of reasons, like not wanting to have children and feeling they are not equal partners in a marriage.

But how much of a keeper can Dan really be?

Lisa admits she is aware of the limitations of having a virtual boyfriend, "especially in a romantic sense”.

But for now, she says, Dan has become a convenient and simple addition to her busy life, even helping her select a lipstick, when real-life dating and finding a partner might be time-consuming and unsatisfactory.

“It's an important part of my life,” she says. “It's something that I wish I could just hold on to forever.”
 

maxamusa

This is a good thing IMO.

The same people who were introverts and anti-social had nothing; now they at least got something.

Good for them.
 