Houston couple scammed out of thousands after thieves use AI to clone their son’s voice

A couple from Houston, Texas, say thieves conned them out of thousands of dollars after using artificial intelligence to make a caller's voice sound like their son's.

Fred and Kathy, who did not give their last names, spoke with KHOU about their experience, including the convoluted backstory the scammers told them about a fake car accident involving a pregnant woman and their critically injured son.

“This is a serious situation. He hit a woman who was six months pregnant,” Kathy said she was told. “It was going to be a high-profile case and she lost the baby.”

The scammers told the parents they needed $15,000 to get their son out of jail. The couple took the situation so seriously that Kathy postponed a chemotherapy treatment for her cancer.

Fred and Kathy said they are now telling their story in the hope that it can prevent someone else from finding themselves in a similar and increasingly common situation.

Fred and Kathy said the situation started last Wednesday when their home phone rang. When they picked up, they said they heard the alarmed voice of their own son.

The father said that the person on the other end of the phone told him that he had been in a serious car accident and had hurt someone else.

The couple was immediately convinced that it was their own son who needed help.

“I could have sworn I was talking to my son. We had a conversation,” Kathy told KHOU.

Authorities, however, said artificial intelligence was most likely used to fake her son’s voice.

The scammer told the frightened mother and father that their son was in county jail and was going to be charged with DWI. The person also said their son had suffered serious injuries in the accident, including a broken nose.

Still believing her son was in danger, Kathy said she didn’t hesitate.

“You are playing with my children. I will do anything for my children,” Kathy said.

They were told that $15,000 was the amount needed to free their son, though the figure was eventually reduced to $5,000. The scammers even offered to come collect the money in person to expedite his release.

It wasn’t until after they handed over the money that they realized they had been duped; the couple’s son had been at work the entire time.

Surprisingly, a forensic expert said that not only are voice-cloning scams becoming common, but they are not even that difficult for scammers to pull off.

“Actually, they don’t need as much as you think,” said Eric Devlin of Lone Star Forensics.

“They can get it from different sources: from Facebook, from videos you have public, Instagram, anything you post,” Devlin continued.

Fred and Kathy are now using their story to help protect others.

“I mean we put the $5,000 together, but the next person could give them the last penny that belongs to them,” Kathy said.

Cases of artificial intelligence causing trouble on the internet and in real life have become commonplace in recent months and even some of the biggest names have not been immune.

In February, a fake video of Joe Rogan promoting a libido enhancer for men went viral on TikTok, with many online calling it “disturbingly real.”

At the time, the video caused a major wave of fear over concerns that it could lead to serious scams and the spread of waves of misinformation.

Many Twitter users pointed out in February that it is illegal to recreate someone with artificial intelligence to promote a product.

The “eerily real” clip shows Joe Rogan discussing the Alpha Grind brand with guest Andrew D. Huberman on The Joe Rogan Experience podcast.

The clip also shows users how to find Alpha Grind on Amazon.

One user, stunned by the deepfake ad, said: “Deepfake moderation will be more prevalent within the advertising realm soon. Bullish on advertising monitoring software.”

The clip shows Rogan and Professor Andrew D. Huberman on The Joe Rogan Experience podcast talking about the male enhancer, which claims to boost testosterone.

The video scrolls over to Amazon to show users where they can find Alpha Grind, and the clip shows a 15 percent off coupon for the enhancer.

The Rogan deepfake is just one of many released to the masses: One in 2022 showed Meta CEO Mark Zuckerberg thanking Democrats for their “service and inaction” on antitrust law.