JBO Breh Makes Taylor Swift AI Photos, Goes Viral

Cakebatter

All Star
Joined
Mar 11, 2022
Messages
3,007
Reputation
791
Daps
10,423
Wrong.. You can't SELL or make a profit... You can definitely draw, paint or computer create anything you want
This part is true. It's a work of art. The problem lies with the sexual or explicit nature of the distributed work. If Taylor Swift's legal team can show damages to her revenue stream or reputation, she can take legal action without fear of it getting thrown out. The artist would also have to worry about a judge conflating this with existing revenge porn laws.
 
Joined
Mar 11, 2022
Messages
277
Reputation
120
Daps
1,761
So uhhh not only is the fake content being deleted, whoever created it should be going to damn jail, yea?
To be honest, the way Twitter works nowadays, you get rewarded financially the more people engage with your tweets. There could be a case that the platform or the person behind the tweets could be sued for using AI porn of a celeb to make money, or at least laws could be created against that.

And honestly, I'm pretty disturbed by the whole thing. I know the platform is full of bots, but shyt, it's a sign-of-the-times type of event. We as a society really need to draw the line somewhere and not promote this kind of content or take it lightly. :francis:
 

bnew

Veteran
Joined
Nov 1, 2015
Messages
46,441
Reputation
7,543
Daps
138,339
Wrong.. You can't SELL or make a profit... You can definitely draw, paint or computer create anything you want

won't be the case for long..


Congress Is Trying to Stop AI Nudes and Deepfake Scams Because Celebrities Are Mad

Lawmakers are introducing new bills to protect famous actors and musicians from ‘AI Fraud’—and maybe the rest of us.


By Janus Rose

NEW YORK, US

January 16, 2024, 11:12am

IMAGE: TIKTOK

If you’ve been on TikTok lately, you may have noticed weird videos of celebrities promoting extremely shady products, such as a robotic-sounding Taylor Swift promising viewers a free cookware set. All of these videos are scams created with generative AI—the latest example of how the technology is being used to create disturbing virtual clones of people without their consent.

Needless to say, this kind of thing has pissed off a lot of famous people. And now, Congress is proposing new legislation that aims to combat AI deepfakes—specifically when it comes to things like fake celebrity endorsements and non-consensual AI-generated nudes, which have become a problem online and in high schools. Despite the surging popularity of websites and apps designed to generate deepfakes, there's no comprehensive law on the books banning the creation of AI images.

The new bill, called the No AI FRAUD Act and introduced by Rep. María Elvira Salazar (R-FL) and Rep. Madeleine Dean (D-PA), would establish legal definitions for “likeness and voice rights,” effectively banning the use of AI deepfakes to nonconsensually mimic another person, living or dead. The draft bill proclaims that “every individual has a property right in their own likeness and voice,” and cites several recent incidents where people have been turned into weird AI robots. It specifically mentions recent viral videos that featured AI-generated songs mimicking the voices of pop artists like Justin Bieber, Bad Bunny, Drake, and The Weeknd.



The bill also specifically targets AI deepfake porn, saying that “any digital depiction or digital voice replica which includes child sexual abuse material, is sexually explicit, or includes intimate images” meets the definition of harm under the act.

The proposed Act is a companion to a similar bill in the Senate, called the Nurture Originals, Foster Art, and Keep Entertainment Safe Act (NO FAKES Act), which was introduced last October in the aftermath of the viral deepfaked Drake song. The new bill was also introduced the same day as another measure proposed by lawmakers in Tennessee, called the Ensuring Likeness Voice and Image Security Act (ELVIS Act).

Given that these bills seem to be a response to celebrities getting mad, either in whole or in part, the big question is whether or not they would in practice protect normal people—and not just the intellectual property rights of pop stars with multi-million dollar record deals.

“It’s really drafted with an eye toward the property rights that celebrities and recording artists have in their likeness and voices,” Carrie Goldberg, an attorney who specializes in deepfakes and other internet-based harassment, told Motherboard. “However, our legal system treats the intellectual property of celebrities differently than those of people not in the public eye.”

The most common example is paparazzi photos, Goldberg said. The law allows some redress for celebrities when their photos are taken without permission and used for commercial gain. But for the average person, the rights to their photos belong solely to the person who took them, and there’s not much they can do about someone reproducing their image for reasons other than profit—unless they have the money to spend on an expensive and often lengthy legal process.

“For normal people, when their image is exploited, it’s not usually for commercial gain but instead to embarrass or harass them; and the wrongdoer in these situations is rarely somebody who has the resources to make a lawsuit worthwhile for the victim,” said Goldberg.

The new bill states that everyone has a right to control their own voice and likeness against deepfakes, but the provisions for non-famous people depend heavily on the victim proving harm. Specifically, that means proving that the deepfake has resulted in “physical or financial injury,” caused “severe emotional distress,” or is sexually explicit in nature.

Of course, all of this is an attempt to regulate a symptom of a larger problem, which is that tech companies are building massive AI systems with data scraped from the internet and no robust mitigations against the harm they inevitably cause. In an ongoing lawsuit against ChatGPT creator OpenAI, the company recently argued that it shouldn’t be punished for training its AI models with illegal and copyrighted material because it’s “impossible” to create AI systems without doing so.

But the nature of black box AI systems built by companies like OpenAI, Microsoft, and Meta virtually guarantees that these bad things will happen. Recently, researchers found over 3,000 images of child sexual abuse material in a massive dataset used to train almost every major AI system on the market. Companies are also struggling to ensure that their generative AI systems will filter out illegal content, and deepfake porn has been found at the top of Google and Bing image search results. A major issue is that there are numerous apps made by smaller companies or individuals that are designed solely to create non-consensual AI nudes, which advertise their services on major social media platforms and are available on app stores.

Ultimately, says Goldberg, these problems won’t be fully addressed until the companies building these AI systems are held responsible.

“What our society really needs is to be attacking AI and deepfakes on a systemic level and going after the malicious products that are available on mainstream places like the App Store and Google Play that are on the market solely to manipulate images,” said Goldberg. “We need to pressure search engines to not guide people to these products or promote sites that publish these images and we need to require that they make content removal simple for victims.”

 

bnew

Veteran
Joined
Nov 1, 2015
Messages
46,441
Reputation
7,543
Daps
138,339
So uhhh not only is the fake content being deleted, whoever created it should be going to damn jail, yea?

depends on whether or not the creator is within U.S. jurisdiction. I doubt anybody would get extradited over this.
 

Givethanks

Superstar
Joined
Dec 4, 2015
Messages
7,486
Reputation
858
Daps
15,111
The man dem called me a "coward" because I think it's weird to make sexual videos of pictures/videos of women using AI, when there's over 50 years' worth of porn and an unlimited amount of it on the internet
:francis:

This shyt is weirdo behaviour
:francis:
 