AI clones teen girl’s voice in $1M kidnapping scam: ‘I’ve got your daughter’

bnew


By Ben Cost | April 12, 2023, 10:31 a.m.

It was a dead “ringer” for her daughter.

Artificial intelligence has taken phone scams to a frightening new level.

An Arizona mom claims that scammers used AI to clone her daughter’s voice so they could demand a $1 million ransom from her as part of a terrifying new voice scheme.

“I never doubted for one second it was her,” distraught mother Jennifer DeStefano told WKYT while recalling the bone-chilling incident. “That’s the freaky part that really got me to my core.”

This bombshell comes amid a rise in “caller-ID spoofing” schemes, in which scammers claim they’ve taken the recipient’s relative hostage and will harm them if they aren’t paid a specified amount of money.

The Scottsdale, Ariz., resident recounted how she received a call from an unfamiliar phone number, which she almost let go to voicemail.

Then DeStefano remembered that her 15-year-old daughter, Brie, was on a ski trip, so she answered the call to make sure nothing was amiss.

That simple decision would turn her entire life upside down: “I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” the petrified parent described. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”


Mom Jennifer DeStefano said the voice on the ransom call sounded just like her daughter Brie’s.
Jennifer DeStefano/Facebook


“I never doubted for one second it was her,” said DeStefano.
Jennifer DeStefano/Facebook

Her confusion quickly turned to terror after she heard a “man’s voice” tell “Brie” to put her “head back” and “lie down.”

“This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter,’ ” DeStefano explained, adding that the man described exactly how things would “go down.”

“You call the police, you call anybody, I’m going to pop her so full of drugs,” the mysterious caller threatened, per DeStefano, who was “shaking” at the time. “I’m going to have my way with her, and I’m going to drop her off in Mexico.”


Brie was on a ski trip the whole time.
Briana DeStefano/Facebook

All the while, she could hear her daughter in the background pleading, “‘Help me, Mom. Please help me. Help me,’ and bawling.”

That’s when Brie’s faux kidnapper demanded the ransom.

He initially asked for $1 million, but then lowered the figure to $50,000 after DeStefano said she didn’t “have the money.”

The nightmare finally ended after the terrified parent, who was at her other daughter’s studio at the time, received help from one of her fellow moms.

The fellow mom called 911 and DeStefano’s husband, and they quickly confirmed that Brie was safe and sound on her skiing excursion.

However, for the entire call, she was convinced that her daughter was in peril. “It was completely her voice,” the Arizonan described. “It was her inflection. It was the way she would have cried.”

As it turned out, her progeny never said any of it; the voice was an AI simulation, a case of long-distance ventriloquism.



The identity of the cybernetic catfish is unknown at this time, but computer science experts say that voice-cloning tech has evolved to the point that someone’s tone and manner of speaking can be re-created from the briefest of soundbites.

“In the beginning, it would require a larger amount of samples,” explained Subbarao Kambhampati, a computer science professor and AI authority at Arizona State University. “Now there are ways in which you can do this with just three seconds of your voice. Three seconds. And with the three seconds, it can come close to how exactly you sound.”

With a large enough sample size, the AI can mimic one’s “inflection” as well as their “emotion,” per the professor.
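To illustrate how low the barrier has become, here is a minimal sketch using the open-source Coqui TTS library and its XTTS v2 model, one of several freely available cloning tools. The file names and text are placeholders; this is an illustration of the capability the professor describes, not the scammers' actual tooling:

```python
# Minimal voice-cloning sketch with the open-source Coqui TTS library
# (pip install TTS). File names are placeholders; this illustrates how
# little reference audio modern cloning needs, per Prof. Kambhampati.
from TTS.api import TTS

# XTTS v2 supports zero-shot cloning from a short reference clip
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    text="Hello, this is a voice-cloning test.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target voice
    language="en",
    file_path="cloned_output.wav",
)
```

That a convincing clone can come out of a dozen lines and a short clip is exactly why a few seconds of publicly posted audio is enough raw material.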
 

bnew



Brie DeStefano’s kidnapper originally asked for $1 million in ransom.
Instagram/briedestefano

Think how Robert Patrick’s sinister T-1000 robot from the sci-fi classic “Terminator 2: Judgment Day” parrots the voice of John Connor’s mom to try to lure him home.

DeStefano found the voice simulation particularly unsettling given that “Brie does NOT have any public social media accounts that has her voice and barely has any,” per a post on the mom’s Facebook account.

“She has a few public interviews for sports/school that have a large sampling of her voice,” described Brie’s mom. “However, this is something to be extra concerned with kids who do have public accounts.”

Indeed, FBI experts warn that fraudsters often find their targets on social media.


Jennifer DeStefano was particularly disturbed by the fact that her daughter doesn’t even have a big social media presence.
Instagram/briedestefano

“If you have it [your info] public, you’re allowing yourself to be scammed by people like this,” said Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office. “They’re going to be looking for public profiles that have as much information as possible on you, and when they get ahold of that, they’re going to dig into you.”

To avoid being hornswoggled, he advises asking the caller questions about the “abductee” whose answers a scammer wouldn’t know.

Mayo also suggested looking out for red flags, such as a call from an unfamiliar area code or an international number, a check simple enough to automate, as sketched below.
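As a rough sketch (not an FBI tool), the open-source Python library `phonenumbers` can flag international or out-of-region caller IDs. The expected region here is an assumption for the example, and since caller IDs can be spoofed, the result is a hint rather than proof:

```python
# Rough sketch of Mayo's "unfamiliar area code / international number"
# red flag, using the open-source `phonenumbers` library
# (pip install phonenumbers). EXPECTED_REGION is an assumption for this
# example; caller IDs can be spoofed, so treat results as hints only.
import phonenumbers

EXPECTED_REGION = "US"  # where you'd expect a family member to call from

def caller_red_flags(caller_id: str) -> list[str]:
    flags = []
    try:
        number = phonenumbers.parse(caller_id, EXPECTED_REGION)
    except phonenumbers.NumberParseException:
        return ["caller ID could not be parsed"]
    if not phonenumbers.is_valid_number(number):
        flags.append("not a valid number in any region")
    elif phonenumbers.region_code_for_number(number) != EXPECTED_REGION:
        flags.append("international or out-of-region number")
    return flags

# e.g. a UK-formatted number gets flagged when US calls are expected
print(caller_red_flags("+44 20 7946 0958"))
```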
Meanwhile, DeStefano warned people on Facebook to alert authorities if the scam she described happened to them or anyone they knew.



Jennifer said that the AI was even able to mimic Brie’s inflection.
Instagram/briedestefano

“The only way to stop this is with public awareness!” she said. “Also, have a family emergency word or question that only you know so you can validate you are not being scammed with AI! Stay safe!”

Her public service announcement is particularly timely given the recent spate of fake-kidnapping schemes.

Last month, TikToker Beth Royce allegedly received a call from a mysterious man who demanded that she pay him $1,000 or he’d kill her sister. All the while, a woman could be heard sobbing in the background.

Meanwhile, in December, social media user Chelsie Gates received a similar call from a man threatening to kill her mom — whom she also heard weeping in the background — if she didn’t shell out the same amount.

In both instances, the victims forked over the ransom, terrified that the caller would harm their family members.
 

bnew

can't post, too many server errors... :francis:

edit:
the site is becoming a chore to post on... it's not as simple as copy/paste anymore.
 

bnew



Phone network employs AI "grandmother" to waste scammers' time with meandering conversations



Scambaiting, Abe Simpson-style


By Rob Thubron




In brief: Human-like AIs have raised plenty of justifiable concerns about replacing human workers, but one company is turning the tech against one of humanity's biggest scourges: phone scammers. The AI imitates the criminals' most popular target, a senior citizen, and keeps the fraudsters on the phone as long as possible in conversations that go nowhere, à la Grandpa Simpson.

Daisy, or dAIsy, the creation of O2, the UK's largest mobile network operator, is an AI built to trick scammers into thinking they are talking to a real grandmother who likes to ramble. If and when the AI does hand over the demanded bank details, it reads out fake numbers and names.

The software is designed to keep people on the line for as long as possible. Not only does this mean less time for the scammers to target real humans, but O2 is also using the conversations to learn the favorite tricks and techniques used in these schemes.

As you can hear in the video, the tricksters aren't happy about being tricked – they become increasingly angry and sweary. The bot is so convincing that it has managed to keep some people on the phone for 40 minutes at a time.

If you've seen any of the several YouTube channels that scam scammers, sometimes by using a voice changer to sound like an old lady, you'll know what to expect. Daisy has been trained with the help of one of the platform's most popular scambaiters, Jim Browning.




"So I tied an onion to my belt, which was the style at the time"

Daisy works by listening to a caller and transcribing their voice to text. Responses are generated through a custom LLM complete with a character personality layer, and are then fed back through a custom AI text-to-speech model to generate a voice answer. All of this takes place in real time.
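O2 hasn't published Daisy's internals, but the loop described above (transcribe, reply in character, speak) is easy to outline. Below is a hedged sketch in Python using the open-source `whisper` library for the speech-to-text step; `persona_llm_reply` and `synthesize_daisy_voice` are hypothetical stand-ins for O2's custom LLM and TTS models, which are not public:

```python
# Hedged sketch of a Daisy-style pipeline as described: transcribe the
# caller, generate an in-character reply with an LLM, speak it back.
# `persona_llm_reply` and `synthesize_daisy_voice` are hypothetical
# stand-ins for O2's custom models, which are not public.
import whisper  # open-source speech-to-text (pip install openai-whisper)

stt_model = whisper.load_model("base")

def persona_llm_reply(history: list[str], caller_text: str) -> str:
    """Hypothetical: custom LLM with a rambling-grandmother persona layer."""
    raise NotImplementedError

def synthesize_daisy_voice(text: str) -> bytes:
    """Hypothetical: custom text-to-speech tuned to an elderly voice."""
    raise NotImplementedError

def handle_turn(history: list[str], caller_audio_path: str) -> bytes:
    # 1. Transcribe the scammer's latest utterance to text
    caller_text = stt_model.transcribe(caller_audio_path)["text"]
    history.append(f"Caller: {caller_text}")
    # 2. Generate a meandering, time-wasting reply in character
    reply = persona_llm_reply(history, caller_text)
    history.append(f"Daisy: {reply}")
    # 3. Convert the reply to audio to play back on the call
    return synthesize_daisy_voice(reply)
```

The hard part O2 describes, doing all of this fast enough to hold a live phone conversation, is exactly what this turn-by-turn sketch leaves out.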

O2 customers aren't being given access to Daisy so they can wage their own campaign of vengeance against scammers. Instead, the AI tool has been added to a list of 'easy target' numbers used by scammers. Daisy is able to interact with callers 24/7 without any input from human controllers.

Murray Mackenzie, Director of Fraud at Virgin Media O2, said: "We're committed to playing our part in stopping the scammers, investing in everything from firewall technology to block out scam texts to AI-powered spam call detection to keep our customers safe. But crucially, Daisy is also a reminder that no matter how persuasive someone on the other end of the phone may be, they aren't always who you think they are."

Daisy was created in response to O2 research that found 71% of Brits would like to get revenge on scammers who have tricked them or their loved ones, but most said they wouldn't engage in scambaiting because they didn't want to waste their own time.

While the work being done by the AI can be applauded, its ability to converse with someone so convincingly is unnerving. Ironically, similar technology is also being used by scammers to trick people into thinking they are talking to their relatives.
 