Georgia mom gets fake ransom call where scammers used AI to mimic her 22-year-old daughter’s voice

A Georgia mom became the latest victim of a shocking AI phone scam, in which fraudsters used her 22-year-old daughter’s cloned voice to claim she had been kidnapped and to demand a $50,000 ransom for her safe return.

So-called imposter scams, in which a fraudster poses as someone else to steal money, are the most common scams in the US, costing Americans $2.6 billion in 2022 alone, according to the Federal Trade Commission.

Debbie Shelton Moore received a six-minute phone call from what she thought was her daughter Lauren, 22, who lives separately from her.

“It just sounded so much like her. It was 100 percent believable,” Moore said. “Enough to almost give me a heart attack from sheer panic.”

The scam demanded money for the daughter’s return – but she was safe all along and had not been kidnapped.

Debbie Shelton Moore (pictured right) ended up getting a six-minute phone call from what she thought was her daughter Lauren (pictured left), 22, who lives apart from her

DailyMail.com previously reported that fraudsters can mimic a victim’s voice with just a three-second snippet of audio, often stolen from social media profiles.

It is then used to call a friend or relative to convince them that they are in trouble and in urgent need of money.

Shelton Moore had initially thought Lauren had been in a car accident and was asking for help, until she heard three male voices.

“The man said, ‘Your daughter has been kidnapped and we want $50,000.’ Then they had her crying, ‘Mom, Mom,’ in the background. It was her voice and that’s why I went crazy,” she told 11Alive.

Shelton Moore grew even more panicked when she checked the location of Lauren’s phone and saw she was stopped on a highway.

‘I [was] thinking she’s in the back because he said, “We got her in the back of the truck.”’

Fortunately, her husband – who works in cybersecurity – overheard the conversation and sensed something was off. He FaceTimed Lauren, who confirmed she was in no danger, revealing that the call was a con.

“It was all kind of a blur because all I was thinking was, ‘How am I going to get my daughter? How on earth are we going to get the money for him?’” she added.

They eventually called the county sheriff’s office, which confirmed Lauren was safe.

Shelton Moore initially thought Lauren had been in a car accident and was asking for help until she heard three male voices

Fortunately, her husband – who works in cybersecurity – overheard the conversation and sensed something was off. He FaceTimed Lauren, who confirmed she was in no danger, revealing that the call was a con

“My heart was beating and I was shaking,” she said, recalling the moment she received the call. “I still shake when I think about it.”

The scam has affected a surprising number of Americans. One in four respondents to an April McAfee survey said they had some experience with an AI voice scam, and one in 10 said they had been personally targeted.

“I’m very knowledgeable about scammers and scams – IRS scams and fake jury duty calls,” Moore said. “But when you hear their voice, of course you don’t think clearly, and you panic.”

Police recommend agreeing on a “safe phrase” with family members that can be used to prove a caller really is who they claim to be and not an AI fake.

The proliferation of accessible and advanced AI makes scams faster and easier to execute, said Steve Grobman, McAfee Chief Technology Officer.

“One of the things that’s most important to acknowledge with the advancements in AI this year is that it’s largely about bringing these technologies within the reach of many more people, including really enabling the scale within the cyber actor community,” Grobman cautioned.

“Cybercriminals can use generative AI to fake voices and create deepfakes in ways that used to require much more sophistication.”

Keep an eye out: AI technology is fueling an explosion of voice cloning scams, experts warn (stock image)

Vice President Kamala Harris also told CEOs of leading technology companies in May that they have a growing moral responsibility to limit the societal harm of their AI products.

Vonny Gamot, Head of EMEA at McAfee said: “Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person’s voice and trick a close contact into sending money.

“Artificial intelligence offers incredible opportunities, but with any technology there is always the possibility that it will maliciously end up in the wrong hands,” she added.

“This is what we’re seeing today with the access and ease of use of AI tools that help cybercriminals scale their efforts in increasingly compelling ways.”

HOW TO AVOID FALLING FOR THE SCAM: EXPERT TIPS

McAfee has shared a series of tips to prevent people from being caught out by the AI fraudsters.

They are…

1. Set up a ‘code word’ with children, relatives or trusted close friends that only they know. Make a plan to always ask for it when they call, text, or email asking for help, especially if they are older or more vulnerable.

2. Always question the source: If it’s a call, text or email from an unknown sender – or even from a number you recognize – stop, pause and think. Asking pointed questions can scare off a scammer.

For example: “Can you confirm my son’s name?” or, “When is your father’s birthday?” Not only may this surprise the scammer, but they may also have to generate a new response, which can add unnatural pauses in the conversation and arouse suspicion.

3. Don’t let your emotions get the better of you. Cybercriminals rely on your emotional connection to the person they are impersonating to get you to take action.

Take a step back before responding. Does that really sound like them? Is this something they would ask of you? Hang up and call the person directly or try verifying the information before responding.

4. Think twice before answering unexpected calls from unknown phone numbers. It is generally good advice to let calls from strangers go to voicemail; if they leave a message, that gives you time to think and to independently contact loved ones to confirm their safety.