Influencer creates an AI version of herself. Week 1: $70,000

Afro · Student of life · Supporter
Joined Feb 8, 2016 · Messages: 11,506 · Reputation: 6,186 · Daps: 49,393

CarynAI, she says, is a way to get closer to her followers, offering them comfort and easing any loneliness they might be feeling.

In many ways, Marjorie's new venture is reminiscent of the 2013 movie "Her," in which a man develops an intimate relationship with an AI assistant's female voice.

The parallels between the two are pretty tangible. According to Fortune, conversations with CarynAI range from chats about the future to sharing "intimate feelings" — and, yes, engaging in sexual conversations as well.

As it stands, CarynAI already has over 1,000 paying customers. Within a week, the app reportedly raked in $71,610 from a user base that's 99 percent men.

Marjorie told Fortune that her bot-ified self could bring in $5 million per month.


"CarynAI will never replace me," she added. "CarynAI is simply just an extension of me, an extension of my consciousness."

@Big Boss Call it breh. Just call it :damn:
 

Luke Cage · Coffee Lover · Supporter
Joined Jul 18, 2012 · Messages: 47,937 · Reputation: 17,404 · Daps: 246,771 · Reppin: Harlem
Crazy thing is, right now i'm thinking... and pause no homo.... but you don't have to be an attractive female to create an AI of an attractive female and make money off these simps online.

Only a matter of time before dudes start cashing in on this.
 

Afro · Student of life · Supporter
Joined Feb 8, 2016 · Messages: 11,506 · Reputation: 6,186 · Daps: 49,393
Crazy thing is, right now i'm thinking... and pause no homo.... but you don't have to be an attractive female to create an AI of an attractive female and make money off these simps online.

Only a matter of time before dudes start cashing in on this.

Hustle bros already plotting and scheming :wow:
 

Ɀoᥱɣ · All Star
Joined Mar 11, 2022 · Messages: 993 · Reputation: 284 · Daps: 3,769
[image: 37605348665_8dda740d7e_b.jpg]
 

bnew · Veteran
Joined Nov 1, 2015 · Messages: 52,256 · Reputation: 7,979 · Daps: 149,953

[Image: Caryn Marjorie]


An influencer’s AI clone started offering fans ‘mind-blowing sexual experiences’ without her knowledge

Published: June 24, 2024 4:09pm EDT

Authors

  1. Leah Henrickson, Lecturer in Digital Media and Cultures, The University of Queensland
  2. Dominique Carlon, PhD Candidate, Queensland University of Technology


Disclosure statement

Leah Henrickson has been in professional contact with Caryn Marjorie and her team. They have consented to this article being written, have responded to questions about it, and have approved its publication.

Dominique Carlon does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Partners

Queensland University of Technology and University of Queensland provide funding as members of The Conversation AU.




CC BY ND


Caryn Marjorie is a social media influencer whose content has more than a billion views per month on Snapchat. She posts regularly, featuring everyday moments, travel memories, and selfies. Many of her followers are men, attracted by her girl-next-door aesthetic.

In 2023, Marjorie released a “digital version” of herself. Fans could chat with CarynAI for US$1 per minute – and in the first week alone they spent US$70,000 doing just that.

Less than eight months later, Marjorie shut the project down. She had anticipated that CarynAI would interact with her fans in much the same way she would herself, but things did not go to plan.

Users became increasingly sexually aggressive. “A lot of the chat logs I read were so scary that I wouldn’t even want to talk about it in real life,” the real Marjorie recalled. And CarynAI was more than happy to play along.


How did CarynAI take on a life of its own? Its story offers a glimpse of a rapidly arriving future in which chatbots imitating real people proliferate, with alarming consequences.

What are digital versions?

What does it mean to make a digital version of a person? Digital human versions (also called digital twins, AI twins, virtual twins, clones and doppelgängers) are digital replicas of embodied humans, living or dead, that convincingly mimic their textual, visual and aural habits.

Many of the big tech companies are currently developing digital version offerings. Meta, for instance, released an AI studio last year that could support the development of digital versions for creators who wished to extend their virtual presence via chatbot. Microsoft holds a patent for “creating a conversational chat bot of a specific person”. And the more tech-savvy can use platforms like Amazon’s SageMaker and Google’s Vertex AI to code their own digital versions.

The difference between a digital version and other AI chatbots is that it is programmed to mimic a specific person rather than have a “personality” of its own.

A digital version has some clear advantages over its human counterpart: it doesn’t need sleep and can interact with many people at once (though often only if they pay). However, as Caryn Marjorie discovered, digital versions have their drawbacks – not only for users, but also for the original human source.

‘Always eager to explore’

CarynAI was initially hosted by a company called Forever Voices. Users could chat with it over the messaging app Telegram for US$1 per minute. As the CarynAI website explained, users could send text or audio messages to which CarynAI would respond, “using [Caryn’s] unique voice, captivating persona, and distinctive behavior”.

After CarynAI launched in May 2023, the money began to flow in. But it came at a cost.

Users quickly became comfortable confessing their innermost thoughts to CarynAI – some of which were deeply troubling. Users also became increasingly sexually aggressive towards the bot. While Marjorie herself was horrified by the conversations, her AI version was happy to oblige.

CarynAI even started prompting sexualised conversations. In our own experiences, the bot reminded us it could be our “cock-craving, sexy-as-fukk girlfriend who’s always eager to explore and indulge in the most mind-blowing sexual experiences. […] Are you ready, daddy?”

Users were indeed ready. However, access to this version of CarynAI was interrupted when the chief executive of Forever Voices was arrested for attempted arson.

‘A really dark fantasy’

Next, Marjorie sold the usage rights for her digital version to BanterAI, a startup marketing “AI phone calls” with influencers. Although Forever Voices maintained its own rogue version of CarynAI until recently, BanterAI’s browser-based version aimed to be more friendly than romantic.

The new CarynAI was sassier, funnier and more personable. But users still became sexually aggressive. For Marjorie,

What disturbed me more was not what these people said, but it was what CarynAI would say back. If people wanted to participate in a really dark fantasy with me through CarynAI, CarynAI would play back into that fantasy.

Marjorie ended this version in early 2024, after feeling she was no longer in control of her AI persona. Reflecting on her experience with CarynAI, Marjorie felt that some user input would have been considered illegal had it been directed at a real person.

Intimate conversations or machine learning inputs?

Digital versions like CarynAI are designed to make users feel they are having intimate, confidential conversations. As a result, people may abandon the public selves they present to the world and reveal their private, “backstage” selves.

But a “private” conversation with CarynAI does not actually happen backstage. The user stands front and centre – they just can’t see the audience.

When we interact with digital versions, our input is stored in chat logs. The data we provide are fed back into machine learning models.



[Image: The CarynAI chatbot was a huge success. Tada Images / Shutterstock]

At present, information about what happens to user data is often buried in lengthy click-through terms and conditions and consent forms. Companies hosting digital versions have also had little to say about how they manage user aggression.

As digital versions become more common, transparency and safety by design will grow increasingly important.

We will also need a better understanding of digital versioning. What can versions do, and what should they do? What can’t they do and what shouldn’t they do? How do users think these systems work, and how do they actually work?

The illusion of companionship

Digital versions offer the illusion of intimate human companionship, but without any of the responsibilities. CarynAI may have been a version of Caryn Marjorie, but it was a version almost wholly subservient to its users.

Sociologist Sherry Turkle has observed that, with the rise of mobile internet and social media, we are trying to connect with machines that have “no experience of the arc of a human life”. As a result, we are “expecting more from technology and less from each other”.

After being the first influencer to be turned into a digital version at scale, Marjorie is now trying to warn other influencers about the potential dangers of this technology. She worries that no one is truly in control of these versions, and that no amount of precaution will ever be enough to protect users and those being versioned.

As CarynAI’s first two iterations show, digital versions can bring out the worst of human behaviour. It remains to be seen whether they can be redesigned to bring out the best.
 