AI is fuelling a rise in online voice scams, study warns

AI technology is fuelling an explosion of voice cloning scams, experts warn.

Fraudsters can now mimic a victim’s voice with just a three-second audio clip, often stolen from social media profiles.

The cloned voice is then used to call a friend or relative and convince them that the victim is in trouble and in urgent need of money.

One in four Britons say they or someone they know has been targeted by the scam, according to cybersecurity specialist McAfee.

The scam is so convincing that the majority of those affected admit they lost money as a result, with losses exceeding £1,000 for around a third of victims.

Keep an eye out: AI technology is fuelling an explosion of voice cloning scams, experts warn (stock image)

Fraudsters can now mimic a victim’s voice with just a three-second audio clip, often stolen from social media profiles. Cybersecurity specialist McAfee surveyed people around the world to see how many share their voices online

A report from the company said AI had “already changed the game for cybercriminals,” with the tools needed to run the scam freely available on the internet.

HOW DOES THE SCAM WORK?

  • Fraudsters steal a snippet of a person’s voice from their social media profile, sometimes as short as three seconds
  • They then use this to mimic the victim’s voice
  • The fraudster calls a friend or relative of the victim and uses the faked voice to trick them
  • They then convince the victim’s loved one that the victim is in trouble and in urgent need of money
  • It’s so compelling that a third of victims admit they’ve lost more than £1,000 as a result

Experts, academics and bosses from across the tech industry are leading calls for tighter regulation of AI, fearing the industry is spiralling out of control.

U.S. Vice President Kamala Harris is meeting today (Wednesday) with the CEOs of Google, Microsoft and OpenAI, the company behind ChatGPT, to discuss how to develop AI responsibly.

They will address the need for safeguards that can mitigate potential risks and emphasise the importance of ethical and trustworthy innovation, the White House said.

McAfee’s report, The Artificial Imposter, said cloning someone’s voice has become a “powerful tool in a cybercriminal’s arsenal”, adding that it is not hard to find victims.

A survey of more than 1,000 UK adults found that half share their voice online, via social media or voice memos, at least once a week.

The research uncovered more than a dozen AI voice cloning tools openly available on the web, many of which are free and require only a basic level of expertise.

In one case, just three seconds of audio was enough to produce an 85 per cent voice match, and the tools had no trouble mimicking accents from around the world.

A person’s voice is the spoken equivalent of a biometric fingerprint, yet 65 per cent of respondents admitted they weren’t sure they could tell a cloned version from the real one.

The mimicked voice is used to call a friend or relative to convince them that they are in trouble and in urgent need of money (stock image)

A report from the company said AI had “already changed the game for cybercriminals,” with the tools needed to run the scam freely available on the internet. The company surveyed people to see how many had experienced an AI voice scam themselves, or knew someone who had

Worrying: More than three in 10 Britons said they would reply to a voicemail or voice memo purportedly from a friend or loved one who needed money – especially if they thought it was from a partner, child or parent

The cost of falling for an AI voice scam can be significant, with 78 per cent of people admitting to losing money. About 6 per cent were cheated out of between £5,000 and £15,000

More than three in 10 said they would respond to a voicemail or voice memo supposedly from a friend or loved one who needed money, especially if they thought it was from a partner, child or parent.

Messages most likely to elicit a response were those claiming the sender had been involved in a car accident, been robbed, lost their phone or wallet, or needed help while travelling abroad.

One in 12 said they had been personally targeted by some form of AI voice scam, and a further 16 per cent said it had happened to someone they knew.

The cost of falling for an AI voice scam can be significant, with 78 per cent of people admitting to losing money. About 6 per cent were cheated out of between £5,000 and £15,000.

Vonny Gamot, Head of EMEA at McAfee, said: “Advanced artificial intelligence tools are changing the game for cybercriminals. Now they can clone a person’s voice with very little effort and trick a close contact into sending money.”

She added: “Artificial intelligence brings incredible opportunities, but with any technology there is always the possibility that it could be used maliciously in the wrong hands.

“This is what we’re seeing today with the access and ease of use of AI tools that help cybercriminals scale their efforts in increasingly compelling ways.”

HOW TO AVOID FALLING FOR THE SCAM: EXPERT TIPS

McAfee has shared a series of tips to prevent people from being caught out by AI fraudsters.

They are…

1. Set up a ‘code word’ with children, relatives or trusted close friends that only you and they know, and make a plan to always ask for it when they call, text or email asking for help, especially if they are older or more frail.

2. Always question the source: if it’s a call, text or email from an unknown sender, or even from a number you recognise, stop, pause and think. Asking pointed questions can scare off a scammer.

For example: “Can you confirm my son’s name?” or, “When is your father’s birthday?” Not only may this surprise the scammer, but they may also have to generate a new response, which can introduce unnatural pauses into the conversation and arouse suspicion.

3. Don’t let your emotions get the better of you. Cybercriminals rely on your emotional connection to the person they are impersonating to get you to take action.

Take a step back before responding. Does that really sound like them? Is this something they would ask of you? Hang up and call the person directly or try verifying the information before responding.

4. Think twice before answering unexpected calls from unknown phone numbers. It is generally good advice not to answer calls from strangers. If the caller leaves a voicemail, it gives you time to think and to contact loved ones independently to confirm they are safe.