A Georgia mom was the latest to face a shocking AI phone scam, in which fraudsters used the voice of her 22-year-old daughter to claim she had been kidnapped and demand a $50,000 ransom for her safe return.
So-called imposter scams, where a fraudster poses as someone else to steal money, are the most common type of scam in the US, costing Americans $2.6 billion in 2022 alone, according to the Federal Trade Commission.
Debbie Shelton Moore received a six-minute phone call from what she thought was her daughter Lauren, 22, who lives separately from her.
“It just sounded so much like her. It was 100 percent believable,” Moore said. “Enough to almost give me a heart attack from sheer panic.”
The scammers demanded money for the daughter’s return – but she was safe all along and had never been kidnapped.
DailyMail.com previously reported that fraudsters can mimic a victim’s voice with just a three-second snippet of audio, often stolen from social media profiles.
The cloned voice is then used to call a friend or relative of the victim to convince them that their loved one is in trouble and in urgent need of money.
Shelton Moore had initially thought Lauren had been in a car accident and was asking for help, until she heard three male voices.
“The man had said, ‘Your daughter has been kidnapped and we want $50,000.’ Then they had her crying, like ‘Mom, Mom,’ in the background. It was her voice and that’s why I went crazy,” she told 11Alive.
Shelton Moore grew even more panicked when she checked the location of Lauren’s phone and saw that it placed her on a highway.
“I [was] thinking she’s in the back because he said, ‘We got her in the back of the truck.’”
Fortunately, her husband – who works in cybersecurity – overheard the call and sensed something was wrong. He FaceTimed Lauren, who confirmed she was in no danger and that her mother was being conned.
“It was all kind of a blur because all I was thinking was, ‘How am I going to get my daughter? How on earth are we supposed to get him the money?’” she added.
The couple eventually called the county sheriff’s office, which confirmed Lauren’s safety.
“My heart was pounding and I was shaking,” she said, recalling the moment she received the call. “I tremble now when I think about it.”
The scam has affected a surprising number of Americans: one in four respondents to an April McAfee survey said they had some experience of an AI voice scam, and one in ten said they had been personally targeted.
“I’m very knowledgeable about scammers and scams, IRS scams and fake jury duty scams,” Moore said. “But when you hear their voice, of course you don’t think clearly and you panic.”
Police recommend agreeing on a “safe phrase” with your family that can be used to prove a caller is real and not artificial.
The proliferation of accessible and advanced AI makes scams faster and easier to execute, said Steve Grobman, McAfee Chief Technology Officer.
“One of the things that’s most important to acknowledge with the advancements in AI this year is that it’s largely about bringing these technologies within the reach of many more people, including really enabling the scale within the cyber actor community,” Grobman cautioned.
“Cybercriminals can use generative AI to fake voices and create deepfakes in ways that used to require much more sophistication.”
Vice President Kamala Harris also told CEOs of leading technology companies in May that they have a growing moral responsibility to limit the societal harm of their AI products.
Vonny Gamot, Head of EMEA at McAfee, said: “Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person’s voice and trick a close contact into sending money.
“Artificial intelligence offers incredible opportunities, but with any technology there is always the possibility that it will fall into the wrong hands and be used maliciously,” she added.
“This is what we’re seeing today with the access and ease of use of AI tools that help cybercriminals scale their efforts in increasingly compelling ways.”