Arizona mom who fell victim to deepfake kidnapping scam gives harrowing testimony

An Arizona mother has given emotional testimony about a harrowing ordeal in which con artists used artificial intelligence to mimic her daughter’s voice and fake a kidnapping to demand a ransom.

Speaking before the Senate Judiciary Committee at a hearing on Tuesday, Jennifer DeStefano described her fear when she received a call in April from scammers demanding $1 million for the safe return of her 15-year-old daughter, Brie.

Though the ruse fell apart within minutes, after DeStefano contacted Brie and confirmed she was safe on a ski trip, the sheer terror the mother felt upon hearing what sounded like the girl’s plea for help was completely real.

While Brie doesn’t have any public social media accounts, her voice can be heard in a handful of interviews for school and sports, her mother said.

DeStefano testified that it was “a typical Friday afternoon” when she received a call from an unknown number, which she decided to answer, thinking it might be a call from a doctor.

Speaking before the Senate Judiciary Committee at a hearing on Tuesday, Jennifer DeStefano described a kidnapping scam in which scammers used AI to replicate her daughter’s voice

DeStefano’s 15-year-old daughter Brie (pictured above) was safe with her father on a ski trip, but scammers briefly convinced her mother that they had kidnapped the girl

“I picked up the phone and answered, ‘Hello?’ On the other end, our daughter Briana was sobbing and crying and said, ‘Mommy,’” DeStefano told the Senate panel.

The mother thought at first that her daughter had hurt herself during the skiing trip, and kept calm by asking the girl what had happened.

“Briana went on, ‘Mom, I messed up,’ with more crying and sobbing. Without thinking twice, I asked her again, ‘Okay, what happened?’” the mother continued.

“Suddenly a man’s voice barked at her to ‘lie down and put your head back’. At that point I started to panic. My anxiety escalated and I demanded to know what was happening, but nothing could have prepared me for her reaction.

“Mom, these bad men got me, help me, help me!!” She begged and begged as the phone was taken from her.

“A menacing and vulgar man took over the call: ‘Listen, I have your daughter. If you tell everyone, if you call the police, I’m going to pump her stomach so full of drugs, I’m going to have my way with her, drop her off in Mexico, and you’ll never see her again!’

“All the while, Briana was desperately begging in the background, ‘Mommy help me!!!'”

At the time of the call, DeStefano was at another daughter’s rehearsal. She muted the call and screamed for help, drawing other moms who began calling 911 and trying to reach her husband and Brie.

“‘Mom, these bad men got me, help me, help me!!’ She begged and begged as the phone was taken from her,” DeStefano testified

Meanwhile, DeStefano did her best to keep the “kidnappers” talking until the police arrived.

The “kidnappers” demanded a $1 million ransom, but when a panicked DeStefano told them this was impossible, they quickly lowered their demand to $50,000.

She testified, “At this point, the mother who had called 911 came in and told me that 911 was aware of an AI scam in which scammers can replicate a loved one’s voice.

Brie was safe on a ski trip, completely unaware of the terror her mother had endured

“I didn’t believe this was a scam. It wasn’t just Brie’s voice; it was her cries, her sobs, that were unique to her. It did not seem possible that they could be faked.

“She told me that AI can also mimic inflection and emotion. That gave me a little hope, but it wasn’t enough.”

She continued, “I asked for wiring instructions and routing numbers for the $50,000, but was denied. ‘Oh no,’ the man demanded, ‘that’s traceable; this isn’t going to happen like that. We’re coming to pick you up!’

“‘What?’ I yelled. ‘You agree to be picked up in a white van with a bag over your head so you don’t know where we’re taking you. You better have all $50,000 in cash, or both you and your daughter are dead! If you don’t agree, you’ll never see your daughter again!’ he screamed.”

Despite her horror, DeStefano remained calm and continued to negotiate the details of her own kidnapping to buy time.

Then another mother approached her and confirmed that she had reached Brie by phone and the girl was absolutely safe, on a ski trip with her father.

“My mind was spinning. I can’t remember how many times I needed reassurance, but when I finally realized she was safe, I was furious,” DeStefano testified.

Friends were able to quickly confirm Brie's safety within minutes of the hoax call

Meanwhile, the outraged mother was still on the phone with the hoax kidnappers.

“I lashed out at the men for such a horrific attempt to defraud and extort money. Faking my daughter’s abduction for money was beyond the lowest of the low.

“They kept threatening to kill Brie. I promised that I would stop them, that not only would they never hurt my daughter, but that they would not continue to harm others with their plan.”

DeStefano said that when she tried to report the incident to police, the case was dismissed as a “prank call.”

She called on Congress to take action to help prevent criminal misuse of emerging AI technology.

“As our world moves at a breakneck pace, the human element of familiarity that forms the foundation of our social fabric, of what is ‘known’ and what is ‘truth,’ is being revolutionized by artificial intelligence, some for the better and some for the worse,” she said.

“If left unchecked, unguarded, and without consequences, it will rewrite our understanding and perception of what is and what is not truth. It erodes our sense of ‘trust’ because it undermines our confidence in what is real and what is not.”

DeStefano called on Congress to take action to help prevent criminal misuse of emerging AI technology

Senators and witnesses appear at Tuesday’s hearing, titled “Artificial Intelligence and Human Rights”

AI voice cloning tools are widely available online, and DeStefano’s experience is part of an alarming rash of similar hoaxes that have swept the country.

“AI speech cloning, now almost indistinguishable from human speech, enables threat actors such as scammers to more effectively obtain information and money from victims,” Wasim Khaled, CEO of Blackbird.AI, told AFP.

A simple web search yields a wide variety of apps, many of them free, that can create AI voices from a small sample, sometimes just seconds, of a person’s real voice, which can easily be lifted from content posted online.

“With a small audio clip, an AI voice clone can be used to record voicemails and voice texts. It can even be used as a live voice changer for phone calls,” said Khaled.

“Scammers may use different accents or genders, or even mimic the speech patterns of loved ones. [The technology] makes it possible to create convincing deepfakes.”

In a global survey of 7,000 people from nine countries, including the United States, one in four people said they had experienced or knew someone who had experienced a voice cloning AI scam.

Seventy percent of respondents said they weren’t sure they could “tell the difference between a cloned voice and the real voice,” according to the survey published last month by US-based McAfee Labs.

US officials have warned of an increase in what is popularly known as the “grandparent scam,” in which an impostor poses as a grandchild urgently in need of money.

“You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in big trouble: he wrecked the car and ended up in jail. But you can help by sending money,” the US Federal Trade Commission said in a March warning.

“It sounds just like him. How can it be a scam? Voice cloning, that’s how.”

The comments below the FTC’s warning included multiple testimonials from seniors who had been defrauded in this way.