I cloned my voice using AI and the results were terrifying… DailyMail.com tries app that replicated President Joe Biden’s speech to scam voters in New Hampshire

It captured everything from the way I tend to say “Umm” and “Aah” between words, to the way I raise my voice when I ask a question

New Hampshire residents received a strange phone call telling them to skip the primaries, and while it sounded like Joe Biden on the other end of the line, it was an AI clone of his voice.

An anonymous fraudster used the app Eleven Labs last month to replicate Biden’s voice for the robocall attack. I tested the app to see how believable an AI-cloned voice is.

The AI-generated voice tricked a friend into thinking a message was really from me.

‘Why did you send me a voice note?’ my friend replied to the message. ‘Normally you just email, but it’s nice to hear from you!’

My father also admitted that the fake voice might have fooled him, and when my wife heard a short message she said: ‘Oh my God, I want to throw it off a bridge.’

I’d heard about such apps before, but naively assumed the clones would always have giveaways and telltale signs. Now I’m certain I could scam anyone, from close family to friends to coworkers.

Using the Eleven Labs app requires a 10-minute audio recording of your voice, and the more audio you feed the AI, the more accurate it becomes.

The results captured everything about my tone and word usage: how I tend to say “umm” and “aah” between words and how I raise my pitch when asking questions.


During the attack in New Hampshire, residents heard a message made with the same app: ‘Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again. Your vote makes a difference in November, not this Tuesday.’

The scary thing is that the recordings can be generated in real time, so I could easily hold a live conversation or run a fake robocall campaign like last month’s.

For example, I could call my father and ask him to transfer money to me in an emergency.

In fact, anyone can use the app against me to clone my voice and commit fraud under my identity.

For anyone with a large amount of public voice recordings, such as actors and politicians like President Biden, there is already enough voice data ‘in the wild’ to create an eerily convincing clone.

Eleven Labs is just one of many apps that can do this (and it should be noted that it has a smart security feature before you can create one of its ‘Professional’ voices, which requires you to read out words shown on the screen, like a Captcha for your voice).

But scams where cloned voices are used to defraud people are ‘increasingly common’, said Adrianus Warmenhoven, a cybersecurity expert at NordVPN.

Research from cybersecurity company McAfee shows that almost a quarter of respondents have experienced some form of AI voice scam, or know someone who has been targeted – with 78 percent of victims losing money as a result.

Last year, elderly couple Ruth and Greg Card received a call from their grandson saying he was in jail and needed money, but the voice was fake.

Microsoft also demonstrated a text-to-speech AI model that same year, which can synthesize anyone’s voice from a three-second audio clip.

Warmenhoven said the technology behind ‘cloned’ voices is rapidly improving and also dropping in price, making it accessible to more scammers.

To access Eleven Labs’ ‘Professional’ voices, you will need to pay a monthly subscription of $10.

Other AI apps may have less protection, making it easier for criminals to commit fraud.

“A user’s vulnerability to these types of scams really depends on the number of voice fragments that criminals can use for voice cloning,” Warmenhoven said.

‘The more they have, the more convincing voice clones they can make. Politicians, public figures and celebrities are therefore very vulnerable, because criminals can use recordings of events, media interviews, speeches and so on.’

He also warned that people who upload videos of themselves to social networks such as Instagram and TikTok could be at risk.

‘There is also a huge amount of video content that users voluntarily upload to social media. So the more publicly available videos users have on social media, the more vulnerable they are,’ Warmenhoven continued.

‘Be careful what you post on social media. Social media is the largest publicly available source of voice samples for cybercriminals. You should be concerned about what you post on social media and how it could affect your safety.’

He also said scammers may make phone calls to you simply to collect voice data for cloning.

‘Scammers are not always out to extort money and data on the first phone call. Collecting enough voice fragments for voice cloning can also be the purpose of the call,’ Warmenhoven explained.

‘Once you know you’re talking to a scammer, hang up and don’t give him or her a chance to record your voice. The more you talk during the call, the more samples of your voice criminals will have, and the better clones they will produce.’