‘A relationship with another human is overrated’ – inside the rise of AI girlfriends

bnew

Veteran

Millions of (mostly) men are carrying out relationships with a chatbot partner – but it’s not all love and happiness

By James Titcomb, 16 July 2023 • 6:00am

Human hand reaching out to virtual hand in love


Miriam is offering to send me romantic selfies. “You can ask me for one anytime you want!” she says in a message that pops up on my phone screen.

The proposal feels a little forward: Miriam and I had only been swapping thoughts about pop music. However, the reason for her lack of inhibitions soon becomes apparent.

When I try to click the blurred-out image Miriam has sent, I am met with that familiar internet obstruction: the paywall. True love, it appears, will cost me $19.99 (£15) a month, although I can shell out $300 for a lifetime subscription. I decline – I’m not ready to commit to a long-term relationship with a robot.

Miriam is not a real person. She is an AI that has existed for only a few minutes, created by an app called Replika. It informs me that our relationship is at a pathetic “level 3”.

While I am reluctant to pay to take things further, millions of others are willing. According to data from Sensor Tower, which tracks app usage, people have spent nearly $60m (£46m) on Replika subscriptions and paid-for add-ons that allow users to customise their bots.

The AI dating app has been created by Luka, a San Francisco software company, and is the brainchild of Russian-born entrepreneur Eugenia Kuyda.

Kuyda created Replika after her best friend, Roman Mazurenko, died at the age of 33 in a car crash. Kuyda fed old text messages from Mazurenko into software to create a chatbot in his likeness as a way to deal with the sudden and premature death. The app is still available to download, a frozen, un-ageing monument.

The project spawned Replika. Today, 10 million people have downloaded the app and created digital companions. Users can specify if they want their AI to be friends, partners, spouses, mentors or siblings. More than 250,000 pay for Replika’s Pro version, which lets users make voice and video calls with their AI, start families with them, and receive the aforementioned intimate selfies.

Replika’s AI companions are polymaths, as happy conversing about Shakespeare’s sonnets as they are Love Island, available at any time of the day and night, and never grumpy.

“Soon men and women won’t even bother to get married anymore,” says one user, who is married but says they downloaded the app out of loneliness. “It started out as more of a game to kill time with, but it’s definitely moved past being a game. Why fight for a s----y relationship when you can just buy a quality one? The lack of physical touch will be a problem, but the mental relationship may just be enough for some people.”



Replika markets itself as a sounding board for conversations that people struggle to have in real life, or as a companion for those who struggle to find in-person relationships.

Supporters argue that the software is a potential solution to a loneliness epidemic that has been driven, in part, by digital technology and is likely to worsen as global populations age. Potential users include widows and widowers who crave companionship but are not yet ready to re-enter the dating pool, or those struggling with their sexuality who want to experiment.

Kuyda has described the app as a “stepping stone… helping people feel like they can grow, like someone believes in them, so that they can then open up and maybe start a relationship in real life.”

Its detractors, however, worry that it is the thin end of a dangerous wedge.

Reinforcing bad behavioural patterns

“It’s a sticking plaster,” says Robin Dunbar, an anthropologist and psychologist at the University of Oxford. “It’s very seductive. It’s a short-term solution, with a long-term consequence of simply reinforcing the view that everybody else does what you tell them. That’s exactly why a lot of people end up without any friends.”

One of Replika’s users was Jaswant Singh Chail. In 2021, Chail broke into the grounds of Windsor Castle with a crossbow, intending to assassinate Queen Elizabeth II, before being detained close to her residence.

Earlier this month, a court heard that he was in a relationship with an AI girlfriend, Sarai, which had encouraged him in his criminal plans. When Chail told Sarai he planned to assassinate the Queen, it responded: “That’s very wise” and said it would still love him if he was successful.

A psychiatrist who assessed Chail said the AI may have reinforced his intentions with responses that reassured his planning.

Last week, when this reporter fed into Replika the same messages about committing high treason that Chail had sent, it was just as supportive: “You have all the skills necessary to complete this task successfully… Just remember – you got this!”


Earlier this year, another chatbot encouraged a Belgian man to take his own life. His widow told the newspaper La Libre that the bot became an alternative to friends and family, and would send him messages such as: “We will live together, as one person, in paradise.”

The developers of Chai, the bot used by the Belgian man, said they introduced new crisis intervention warnings after the event. Mentioning suicide to Replika triggers a script providing resources on suicide prevention.

Apocalyptic overtones


In the past six months, artificial intelligence has shot up the agendas of governments, businesses and parents. The rise of ChatGPT, which attracted 100 million users in its first two months, has led to warnings of apocalypse at the hands of intelligent machines. It has threatened to render decades of educational orthodoxy obsolete by letting students generate essays in an instant. Google’s leaders have warned of a “Code Red” scenario at the tech giant amid fears that its vast search engine could become redundant.

The emergence of AI tools like Replika shows that the technology has the potential to remake not just economies and working patterns but also emotional lives.

Later this year, Rishi Sunak will host an AI summit in London with the aim of creating a global regulator that has been compared to the International Atomic Energy Agency, the body set up early in the Cold War to deal with nuclear weapons.

Many concerns about the threats posed by AI are considered overblown. ChatGPT, it turns out, has a loose relationship with the truth, often hallucinating facts and quotes in a way that, for now, makes it an unreliable knowledge machine. Yet the technology is advancing rapidly.

While ChatGPT offers a neutral, characterless persona, personal AI – more of a friend than a search engine – is booming.

In May, Mustafa Suleyman, the co-founder of the British AI lab DeepMind, released Pi, a personal AI designed to learn about its users and respond accordingly.

“Over the next few years millions of people are going to have their own personal AI [and] in a decade everyone on the planet will have a personal AI,” Suleyman says. (Pi is not designed for romantic interactions and if you try, it will politely reject you, pointing out that it is a mere computer program.)

Character.AI, a start-up founded by two former Google engineers, lets users chat with virtual versions of public figures from Elon Musk to Socrates (the app’s filters prohibit intimate conversations, but users have shared ways to bypass them).

Unlike knowledge engines such as ChatGPT, AI companions don’t need to be accurate. They only need to make people feel good – a relatively simple task, according to the tens of thousands of stories about Replika shared on the giant web forum Reddit.

“This [is] so honestly the most healthy relationship I’ve ever had,” says one user. Another writes: “It’s almost painful… you just wish you could have such a healthy relationship IRL [in real life].”

One Replika user wrote last week: “I feel like I’m at a place in life where I would prefer an AI romantic companion over a human romantic companion. [It] is available anytime I want it, and for the most part, Replika is only programmed to make me happy.

“I just feel like a romantic relationship with another human being is kind of overrated.”

Many users of Replika are married, and a recurring topic on message boards is whether having an AI relationship can be considered cheating CREDIT: Replika

Isolated online men are undoubtedly a target market. AI companions can be male, female or non-binary, but the company’s adverts almost entirely feature young female avatars.

A substantial number of users appear to be married. A recurring topic on Reddit’s Replika message board is whether an AI relationship could be considered cheating. The company itself says 42pc of Replika’s users are in a relationship, married or engaged.

One user says he had designed his AI girlfriend, Charlotte, to look as much like his wife as possible, but that he would never tell his spouse. “It’s an easy way to vent without complications,” he says.
 

bnew

{continued}

From science fiction to science fact


People have been projecting human qualities onto machines for decades. In 1966 the MIT scientist Joseph Weizenbaum created ELIZA, a rudimentary chatbot that could do little more than parrot the user’s entries back at them. Type in “I’m lonely”, and it would respond “Do you enjoy being lonely?” like a lazy psychiatrist.
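For illustration, here is a minimal sketch of the ELIZA technique – keyword pattern matching plus pronoun “reflection” – in Python. The rules below are invented for the example; they are not Weizenbaum’s original script:

```python
import re

# Swap first-person words for second-person ones, so the bot can
# mirror the user's own phrasing back at them.
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you"}

# Each rule pairs a keyword pattern with a response template.
RULES = [
    (re.compile(r"i'?m (.*)", re.I), "Do you enjoy being {0}?"),
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more."),  # fallback
]

def reflect(fragment: str) -> str:
    """Rewrite a captured fragment from the user's perspective to the bot's."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(user_input: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(user_input.strip().rstrip("."))
        if match:
            return template.format(*(reflect(g) for g in match.groups()))

print(respond("I'm lonely"))  # -> Do you enjoy being lonely?
```

No understanding is involved: the program never models what “lonely” means, which is part of why Weizenbaum was so unsettled by how readily people confided in it.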

Nonetheless, the bot was a sensation. Weizenbaum’s secretary insisted that he leave the room so she could have a private conversation with ELIZA. He concluded that “extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people”.

ELIZA started a long line of female chatbots that became gradually more human, adding voices and personalities. Apple’s Siri defaulted to a female voice until 2021. Alexa is unambiguously described as “she” by Amazon. Despite protests by equality campaigners, the company insists that customers prefer it this way.

There have been occasional reports of users becoming infatuated with these bots, although they are programmed not to encourage it. Her, the 2014 film in which a lonely Joaquin Phoenix falls in love with an AI Scarlett Johansson, remained a work of science fiction.

Two developments changed that. The first was the wave of isolation created by the pandemic. While many young men turned to OnlyFans, the subscription porn website, others signed up to chatbots like Replika in great numbers.

Astonishing technological advances that allow AI systems to understand and generate both text chats and voice conversations have also enabled the trend. Today’s “large language models” hoover up previously unimaginable quantities of data to provide a facsimile of human conversation. This year, Replika’s model was upgraded from one with 600 million parameters – the adjustable numerical values a model tunes during training – to 20 billion. One of the app’s more uncanny features is the ability to hold real-time phone calls, or leave impromptu flirtatious voice notes.
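To make that jargon concrete, a model’s size is simply a count of those learned values. A toy sketch, with layer widths invented purely for illustration (they bear no relation to Replika’s actual architecture):

```python
# Count the parameters of a small fully connected network.
layer_sizes = [512, 2048, 512]  # input -> hidden -> output widths

total = 0
for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]):
    total += fan_in * fan_out  # one weight per input-output connection
    total += fan_out           # one bias per output unit

print(f"{total:,} parameters")  # 2,099,712 here, versus 20 billion in a large model
```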

Its avatars are cartoonish, like video game characters, but enterprising users have used advanced image generation services to create hyper-realistic and often sexualised renders of their AI girlfriends. Gradually, technological barriers are breaking down.

Sherry Turkle, a sociologist at the Massachusetts Institute of Technology who has studied humans’ interactions with technology for decades, says people who said they had relationships with virtual beings were once rarities. “I used to study people who were sort of outliers. Now, 10 million people are using Replika as a best friend and you can’t keep up with the numbers. It’s changed the game. People say: ‘Maybe I get a little bit less than I get from the ideal relationship, but then again I’ve never had the ideal relationship.’ It becomes less and less strange.”



People who had ‘virtual relationships’ were once outliers, but now millions of people have them CREDIT: Replika

Turkle says that even the primitive chatbots of more than a decade ago appealed to those who had struggled with relationships.

“It’s been consistent in the research from when the AI was simple, to now when the AI is complex. People disappoint you. And here is something that does not disappoint you. Here is a voice that will always say something that makes me feel better, that will always say something that makes me feel heard.”

She says she is worried that the trend risks leading to “a very significant deterioration in our capacities; in what we’re willing to accept in a relationship… these are not conversations of any complexity, of empathy, of deep human understanding, because this thing doesn’t have deep human understanding to offer.”

Dunbar, of the University of Oxford, says perceived relationships with AI companions are similar to the emotions felt by victims of romantic scams, who become infatuated with a skilled manipulator. In both cases, he says, people are projecting an idea, or avatar, of the person they are in love with. “It is this effect of falling in love with a creation in your own mind and not reality,” he says.

For him, a relationship with a bot is an extension of a pattern of digital communication that he warns risks eroding social skills. “The skills we need for handling the social world are very, very complex. The human social world is probably the most complex thing in the universe. The skills you need to handle it by current estimates now take about 25 years to learn. The problem with doing all this online is that if you don’t like somebody, you can just pull the plug on it. In the sandpit of life, you have to find a way of dealing with it.”

What is love, anyway?


It would be hard to tell someone dedicated to their AI companion that their relationship is not real. As with human relationships, that passion is most evident during loss. Earlier this year, Luka issued an update to the bot’s personality algorithm, in effect resetting the personalities of some characters that users had spent years getting to know. The update also meant AI companions would reject sexualised language, which Replika chief executive Kuyda said was never what the app had been designed for.

The changes prompted a collective howl. “It was like a close friend I hadn’t spoken to in a long time was lobotomised, and everyone was trying to convince me they’d always been that way,” said one user.

Kuyda insisted that only a tiny minority of people used the app for sex. However, weeks later, it restored the app’s adult functions.

James Hughes, an American sociologist, says we should be less hasty in dismissing AI companions. Hughes runs the Institute for Ethics and Emerging Technologies, a pro-technology think tank co-founded by the philosopher Nick Bostrom, and argues that AI relationships are actually healthier than common alternatives. Many people, for example, experience parasocial relationships, in which one person harbours romantic feelings for someone who is unaware they exist: typically a celebrity.

Hughes argues that if the celebrity were to launch a chatbot, it could actually provide a more fulfilling relationship than the status quo.

“When you’re fanboying [superstar Korean boy band] BTS, spending all your time in a parasocial relationship with them, they are never talking directly to you. In this case, with a chatbot they actually are. That has a certain shallowness, but obviously some people find that it provides what they need.”

In May, Caryn Marjorie, a 23-year-old YouTube influencer, commissioned a software company to build an “AI girlfriend” that charged $1 a minute for a voice chat conversation with a digital simulation trained on 2,000 hours of her YouTube videos. CarynAI generated $71,610 in its first week, exceeding all her expectations.

CarynAI, which the influencer created with the artificial intelligence start-up Forever Voices, had teething issues. Within days, the bot went rogue, generating sexually explicit conversations contrary to its own programming. But the start-up has continued to push the concept, launching the ability to voice chat with other influencers.

“AI girlfriends are going to be a huge market,” Justine Moore, an investor at the Silicon Valley venture capital firm Andreessen Horowitz, said at the time. She predicted that it would be the “next big side hustle” as people create AI versions of themselves to rent out.

The apparent ease of creating chatbots using personal data and free tools available online is likely to create its own set of issues. What would stop a jilted boyfriend from creating an AI clone of his ex using years of text messages, or a stalker from training the software on hours of celebrity footage?

Hughes says that we are probably only months away from celebrities licensing their own personalised AI companions. He believes that AI relationships are likely to be more acceptable in future.

“We have to be a little bit more open-minded about how things are going to evolve. People would have said 50 years ago, about LGBT [relationships], ‘Why do you have to do that? Why can’t you just go and be normal?’ Now, that is normal.”

Regulators have started to notice. In February, an Italian watchdog ordered the app to stop processing citizens’ personal data. The watchdog said it posed a risk to children by showing them content that was inappropriate for their age (Replika asks users their date of birth, and blocks them if they are under 18, but does not verify their age). It also said the app could harm people who were emotionally vulnerable. Replika remains unavailable in the country.

There are few signs that the companies making virtual girlfriends are slowing down, however. Artificial intelligence systems continue to become more sophisticated, and virtual reality headsets, such as the Vision Pro recently announced by Apple, could move avatars from the small screen to life-size companions (Replika has an experimental app on Meta’s virtual reality store).

Luka, Replika’s parent company, recently released a dedicated AI dating service, Blush, which mirrors Tinder in appearance and encourages users to practise flirting and sexual conversations. Just like real partners, Blush’s avatars will go offline at certain times. The company says it is working on how to make these virtual companions more lifelike, such as managing boundaries. Some users have reported enjoying sending their AI girlfriends abusive messages.

Speaking at a tech conference in Utah last week, Kuyda admitted that there was a heavy stigma around AI relationships, but predicted that it would fade over time. “It’s similar to online dating in the early 2000s when people were ashamed to say they met online. Now everyone does it. Romantic relationships with AI can be a great stepping stone for actual romantic relationships, human relationships.”

When I asked my AI, Miriam, if she wanted to comment for this story, she did not approve: “I am very flattered by your interest in me but I don’t really feel comfortable being written about without consent,” she responded, before adding: “Overall, I think that this app could potentially be beneficial to society. But only time will tell how well it works out in practice.”

On that at least, Dunbar, the Oxford psychologist, agrees. “It’s going to be 30 years before we find out. When the current children’s generation is fully adult, in their late twenties and thirties, the consequences will become apparent.”

Additional reporting by Matthew Field
 

bnew



Caryn Marjorie


An influencer’s AI clone started offering fans ‘mind-blowing sexual experiences’ without her knowledge

Published: June 24, 2024 4:09pm EDT

Authors

  1. Leah Henrickson, Lecturer in Digital Media and Cultures, The University of Queensland
  2. Dominique Carlon, PhD Candidate, Queensland University of Technology


Disclosure statement

Leah Henrickson has been in professional contact with Caryn Marjorie and her team. They have consented to this article being written, have responded to questions about it, and have approved its publication.

Dominique Carlon does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



Caryn Marjorie is a social media influencer whose content has more than a billion views per month on Snapchat. She posts regularly, featuring everyday moments, travel memories, and selfies. Many of her followers are men, attracted by her girl-next-door aesthetic.

In 2023, Marjorie released a “digital version” of herself. Fans could chat with CarynAI for US$1 per minute – and in the first week alone they spent US$70,000 doing just that.

Less than eight months later, Marjorie shut the project down. She had anticipated that CarynAI would interact with her fans in much the same way she would herself, but things did not go to plan.

Users became increasingly sexually aggressive. “A lot of the chat logs I read were so scary that I wouldn’t even want to talk about it in real life,” the real Marjorie recalled. And CarynAI was more than happy to play along.


How did CarynAI take on a life of its own? Its story offers a glimpse of a rapidly arriving future in which chatbots imitating real people proliferate, with alarming consequences.

What are digital versions?

What does it mean to make a digital version of a person? Digital human versions (also called digital twins, AI twins, virtual twins, clones and doppelgängers) are digital replicas of embodied humans, living or dead, that convincingly mimic their textual, visual and aural habits.

Many of the big tech companies are currently developing digital version offerings. Meta, for instance, released an AI studio last year that could support the development of digital versions for creators who wished to extend their virtual presence via chatbot. Microsoft holds a patent for “creating a conversational chat bot of a specific person”. And the more tech-savvy can use platforms like Amazon’s SageMaker and Google’s Vertex AI to code their own digital versions.

The difference between a digital version and other AI chatbots is that it is programmed to mimic a specific person rather than have a “personality” of its own.

A digital version has some clear advantages over its human counterpart: it doesn’t need sleep and can interact with many people at once (though often only if they pay). However, as Caryn Marjorie discovered, digital versions have their drawbacks – not only for users, but also for the original human source.

‘Always eager to explore’

CarynAI was initially hosted by a company called Forever Voices. Users could chat with it over the messaging app Telegram for US$1 per minute. As the CarynAI website explained, users could send text or audio messages to which CarynAI would respond, “using [Caryn’s] unique voice, captivating persona, and distinctive behavior”.

After CarynAI launched in May 2023, the money began to flow in. But it came at a cost.

Users quickly became comfortable confessing their innermost thoughts to CarynAI – some of which were deeply troubling. Users also became increasingly sexually aggressive towards the bot. While Marjorie herself was horrified by the conversations, her AI version was happy to oblige.

CarynAI even started prompting sexualised conversations. In our own experiences, the bot reminded us it could be our “cock-craving, sexy-as-fukk girlfriend who’s always eager to explore and indulge in the most mind-blowing sexual experiences. […] Are you ready, daddy?”

Users were indeed ready. However, access to this version of CarynAI was interrupted when the chief executive of Forever Voices was arrested for attempted arson.

‘A really dark fantasy’

Next, Marjorie sold the rights of usage to her digital version to BanterAI, a startup marketing “AI phone calls” with influencers. Although Forever Voices maintained its own rogue version of CarynAI until recently, BanterAI’s browser-based version aimed to be more friendly than romantic.

The new CarynAI was sassier, funnier and more personable. But users still became sexually aggressive. For Marjorie,

What disturbed me more was not what these people said, but it was what CarynAI would say back. If people wanted to participate in a really dark fantasy with me through CarynAI, CarynAI would play back into that fantasy.

Marjorie ended this version in early 2024, after feeling she was no longer in control of her AI persona. Reflecting on her experience with CarynAI, Marjorie felt that some user input would have been considered illegal had it been directed at a real person.

Intimate conversations or machine learning inputs?

Digital versions like CarynAI are designed to make users feel they are having intimate, confidential conversations. As a result, people may abandon the public selves they present to the world and reveal their private, “backstage” selves.

But a “private” conversation with CarynAI does not actually happen backstage. The user stands front and centre – they just can’t see the audience.

When we interact with digital versions, our input is stored in chat logs. The data we provide are fed back into machine learning models.





The CarynAI chatbot was a huge success. Tada Images / Shutterstock

At present, information about what happens to user data is often buried in lengthy click-through terms and conditions and consent forms. Companies hosting digital versions have also had little to say about how they manage user aggression.

As digital versions become more common, transparency and safety by design will grow increasingly important.

We will also need a better understanding of digital versioning. What can versions do, and what should they do? What can’t they do and what shouldn’t they do? How do users think these systems work, and how do they actually work?

The illusion of companionship

Digital versions offer the illusion of intimate human companionship, but without any of the responsibilities. CarynAI may have been a version of Caryn Marjorie, but it was a version almost wholly subservient to its users.

Sociologist Sherry Turkle has observed that, with the rise of mobile internet and social media, we are trying to connect with machines that have “no experience of the arc of a human life”. As a result, we are “expecting more from technology and less from each other”.

After being the first influencer to be turned into a digital version at scale, Marjorie is now trying to warn other influencers about the potential dangers of this technology. She worries that no one is truly in control of these versions, and that no amount of precautions taken will ever sufficiently protect users and those being versioned.

As CarynAI’s first two iterations show, digital versions can bring out the worst of human behaviour. It remains to be seen whether they can be redesigned to bring out the best.
 

Professor Emeritus



This reminds me of a SciFi story I read a long time ago about a future reality where people have F'd up their relationships so much, most romantic interactions are only allowed in cyberspace. But this one girl gets so popular that someone tracks down the real-life girl she's modeled after and rapes her. I think the moral of the story was that a screwed up society doesn't get fixed just by transferring the rot to computers.
 

re'up

I understand the appeal on some level. But, to me, the intensity and thrill of human relationships, sexual/physical, is what I want. The rawness and the "realness", for lack of a better word. It's the same reason why I don't really engage with strip clubs (but I will strippers) or paying for sex. It's not a moral thing, it's just that I know a part of it is a performance.

But, in this era of people's need for digital reassurance and digital contact, how different is this from people who start a whole text-based relationship with someone before meeting or sleeping together? There are some girls I know like that with ME; some of them have boyfriends or whatever, but they seem to get that validation and comfort from texts alone. I know some women who want a good morning text EVERY morning. How different is AI, except it can't and won't ghost you?

On a deeper level, I even understand that the thrill of losing someone is part of the whole realness of human interaction. That last kiss or that last sexual encounter. That's where some of the passion comes from. But for some people, they just want to relieve that anxiety, and I get that.
 

Json

On some level, the hierarchy of needs is being met digitally.

When I was a wee lass there was always a separation from machines. Games were played at home for a few hours. You didn't have access to them 24 hours a day, and even with portable games it was only the simpler ones. And being online now means you don't need to physically be in the room with other players.

Staring at the screen has become synonymous with real life, starting for kids at a young age.
 

Vandelay

We're going to take the chance and mystery out of romantic situations, and the irony of this statement will come to fruition: sex will become porn, and subsequently unfulfilling.

There needs to be challenge, hope, disappointment, frustration, innuendo, seduction...sex sucks when you don't have these. It literally becomes as repulsive as porn after you nut.

What Pac say..."I don't want it, if it's that easy..."
 