AI is fuelling a rise in online voice scams, study warns
AI technology is fuelling an explosion of voice cloning scams, experts warn.
Fraudsters can now mimic a victim’s voice with just a three-second audio clip, often stolen from social media profiles.
It is then used to call a friend or relative to convince them that they are in trouble and in urgent need of money.
One in four Britons say they or someone they know has been targeted by the scam, according to cybersecurity specialist McAfee.
The scam is so convincing that the majority of those affected admit they lost money as a result, with the cost exceeding £1,000 for around a third of victims.
A report from the company said AI had “already changed the game for cybercriminals,” with the tools needed to run the scam freely available over the internet.
Experts, academics and bosses from across the tech industry are leading the call for tighter regulation of AI as they fear the industry is spiralling out of control.
U.S. Vice President Kamala Harris is meeting today (Wednesday) with the CEOs of Google, Microsoft and OpenAI, the company behind ChatGPT, to discuss how to develop AI responsibly.
They will address the need for safeguards that can mitigate potential risks and emphasize the importance of ethical and trustworthy innovation, the White House said.
McAfee’s report, The Artificial Imposter, said cloning what someone sounds like has become a “powerful tool in a cybercriminal’s arsenal” — and that it’s not hard to find victims.
A survey of more than 1,000 UK adults found that half share their voice data online, via social media or voice memos, at least once a week.
The research uncovered more than a dozen AI voice cloning tools openly available on the web, many of which are free and require only a basic level of expertise.
In one case, just three seconds of audio was enough to produce an 85 per cent match, and the tools had no trouble mimicking accents from around the world.
A person’s voice is the spoken equivalent of a biometric fingerprint, yet 65 per cent of respondents admitted they weren’t sure they could tell a cloned version from the real one.
More than three in 10 said they would respond to a voicemail or voice memo supposedly from a friend or loved one who needed money, especially if they thought it was from a partner, child or parent.
Messages most likely to elicit a response were those claiming the sender had been involved in a car accident, been robbed, lost their phone or wallet, or needed help while traveling abroad.
One in 12 said they had been personally targeted by some form of AI voice scam, and a further 16 per cent said it had happened to someone they knew.
The cost of falling for an AI voice scam can be significant, with 78 per cent of those targeted admitting to losing money. About 6 per cent were cheated out of between £5,000 and £15,000.
Vonny Gamot, Head of EMEA at McAfee, said: “Advanced artificial intelligence tools are changing the game for cybercriminals. Now they can clone a person’s voice with very little effort and trick a close contact into sending money.”
She added: “Artificial intelligence brings incredible opportunities, but with any technology there is always the possibility that it could end up maliciously in the wrong hands.
“This is what we’re seeing today with the access and ease of use of AI tools that help cybercriminals scale their efforts in increasingly compelling ways.”