‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

O.Red

So that's where all these nude Beyonce photos are poppin up :snoop:

cats gonna get sued like shyt because of this and it's gon be ugly.
Sue who? Sue how?

I remember a few years ago Scarlett Johansson was begging nikkas to stop making porn fakes of her because it got so out of hand. She only made nikkas go harder
:francis:
 


I walk around a little edgy already
This shyt gon be a problem
nikka nerd really selling them shyts?
:gucci:
 

Umoja

This shyt lame but that's the era we're in. Wild that we have a bunch of nerds who watched or read science fiction about advanced societies and scientific breakthroughs and said "yea but wouldn't it be cool if we just made everything worse with AI in order to make money." The scamming and bs we're gonna see will be wild. Just wait until they start using AI recreations of people's voices to access banking and medical info.
That isn't even the shyt that worries me.

What worries me is A.I.'s unfettered access to social media. We see targeted ads. The thing that concerns me is A.I. being used to identify and influence at-risk individuals.

Feels like we're on the Titanic right before the captain decided to steam through the iceberg.
 

MikelArteta

Y’all giving AI easy guesses. Chicks who already been naked or wear skimpy clothes

Tell me when it can make a naked Oprah or Helen Mirren realistically

Helen Mirren been naked many times :heh:
 

bnew



1/2
Jason Koebler

The simple truth is that nonconsensual porn AI "nudify" apps are viable because they get almost all of their traffic from social media ads. One of the largest ones is getting *90 percent* of its traffic from Instagram ads. 90 percent! This is only going to get worse.

Instagram Ads Send This Nudify Site 90 Percent of Its Traffic



2/2
remblanc.com @remblanc.com

instagram is the only platform on the internet that manages to consistently advertise me drugs no matter if i actually want them or not. their ads are so cooked





Instagram Ads Send This Nudify Site 90 Percent of Its Traffic


Jan 15, 2025 at 7:52 AM

A service for creating AI-generated nude images of real people is running circles around Meta’s moderation efforts.


An AI app for creating nonconsensual nude images of anyone is getting the vast majority of its traffic directly from Meta platforms, where the app is buying thousands of explicit ads featuring nonconsensual nudity of celebrities and influencers. The blatant and repeated violation of Meta’s policies over the course of months is making a mockery of the company’s ability or willingness to moderate a known bad actor that at the moment appears to get the majority of its users by paying Meta directly for ads.

The app, known as Crushmate or Crush AI, has been buying ads on Facebook, Instagram, and other Meta platforms since at least early September. As first reported by Alexios Mantzarlis in his Faked Up newsletter, according to internet traffic analysis firm Similarweb, three of the domains Crush uses had around 240,000 visitors combined, with 90 percent of that traffic coming from Facebook or Instagram.

I’ve seen Meta remove some of these ads since September, but at the time of writing there were around 350 active ads, and more than 5,000 ads overall, promoting the same three domains that redirect to Crushmate’s services.

Most of the recent ads use the same format. They take a video a woman posted to Instagram or TikTok and show how a user can pause the video on any frame and create a nude image of her. Many of the ads, which are still active, do this to videos of the extremely popular OnlyFans creator Sophie Rain, who made headlines recently for making $43 million in one year. As Mantzarlis points out, one ad nudifies Mikayla Demaiter, a model with 3.2 million followers on Instagram. Rain and Demaiter did not respond to a request for comment.



Two of the Crushmate ads

Other ads feature other real women I wasn’t able to identify, as well as AI-generated women with their clothes being “erased” by the app.

In early September, a 404 Media reader also tipped me that Crushmate was advertising its services on Facebook Marketplace.



A marketplace ad for Crushmate

I’ve confirmed that all of these ads lead to the same Crushmate service, which creates nonconsensual nude images and offers some of its services via a subscription plan.



Promotional copy from Crushmate's site.

I’ve recently reported about Meta running ads that feature explicit nudity, including dozens of ads that are just close-up images of vaginas. I’ve also reported repeatedly about “nudify” apps buying ads on Meta platforms. When we’ve flagged these ads to Meta in the past, it removed them. Meta has also removed associated Facebook pages that are buying the ads, but Crushmate has found an easy workaround that is clearly paying off: It creates multiple Facebook pages with AI-generated profile images that look like normal people, then buys ads promoting new, different URLs that redirect to Crushmate.



Two of the fake Facebook profiles buying Crushmate ads.

Meta did not respond to specific questions about why it’s not detecting and removing the offending ads for featuring nonconsensual nudity. As I reported last week, extensive testing by AI Forensics, a European non-profit that investigates influential and opaque algorithms, found that nudity uploaded to Instagram and Facebook as a normal user was promptly removed for violating Meta’s Community Standards. The same exact visuals were not removed when they were uploaded as ads, showing that Meta has a different standard for enforcement when it’s getting paid to push images in front of users.

“Meta prohibits ads that promote adult sexual exploitation. We have removed the violating content, enforced against violating urls, and have taken action against the associated accounts and users,” a Facebook spokesperson told me in a statement. “This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content.”


Do you know anything else about Crushmate? I would love to hear from you. Using a non-work device, you can message me securely on Signal at emanuel.404. Otherwise, send me an email at emanuel@404media.co.

Meta removed the ads promoting the three Crushmate domains after Mantzarlis flagged them to the company. Around 230 of the same ads promoting a fourth Crushmate domain, which Mantzarlis found after reaching out for comment, are still live on Meta’s platforms.

As we’ve previously reported, these nudify apps are some of the most harmful applications of generative AI because they make it so easy to create nonconsensual images of anyone. In the last two years, we’ve seen several examples of these apps being used by minors to create images of other minors. Last year, a survey found that 1 in 10 minors reported that their friends or classmates had used AI tools to generate nudes of other kids. As the Crushmate ads show, minors don’t need to go to the dark corners of the web in search of these tools. Meta is getting paid to popularize them.
 