Houston couple scammed out of thousands after thieves used AI to clone their son’s voice
A Texas couple say they were scammed out of thousands of dollars by thieves who used artificial intelligence to clone their son’s voice.
Fred and Kathy from Houston, who have not given their last name, spoke to KHOU about their experience, including the convoluted backstory the con artists fed them about a fabricated car accident that supposedly left both a pregnant woman and their son badly injured.
“This is a serious situation. He hit a lady who was six months pregnant,” Kathy said. “It was going to be a high-profile case and she lost the baby.”
The thieves told the parents they needed $15,000 to bail their son out of jail, and the couple took the call so seriously that Kathy postponed chemotherapy for her cancer.
Fred and Kathy said they are now telling their story in the hope that it might prevent someone else from finding themselves in a similar and increasingly common situation.
Fred and Kathy said the ordeal began last Wednesday when their home phone rang. When they answered, they said they heard their own son’s alarmed voice.
The father said the person on the other end of the line told him that he had been in a bad car accident and that he had hurt another person.
The couple was immediately convinced that it was their own child who needed help.
“I could have sworn I was talking to my son. We had a conversation,” Kathy told KHOU.
However, authorities said it was most likely artificial intelligence that faked their son’s voice.
The scammer told the frightened mother and father that their son was in jail and would be charged with DWI. The person also said their son suffered serious injuries in the crash, including a broken nose.
Still convinced that her child was in danger, Kathy said she didn’t hesitate.
“You play with my children. I’ll do anything for my kids,” Kathy said.
They were told that $15,000 was the amount needed to save their son, though the figure was eventually reduced to $5,000. The scammers even offered to collect the money themselves to expedite the son’s release.
It was only after the money had been handed over that they realized they had been tricked; the couple’s son had been at work the entire time.
Shockingly, a forensic expert said that voice-cloning scams are not only becoming more common, but are not even that hard for scammers to pull off.
“They don’t really need as much as you think,” said Eric Devlin of Lone Star Forensics.
“They can get it from a variety of sources — from Facebook, from videos you have public, Instagram, anything you publish,” Devlin continued.
Fred and Kathy are now using their story to help protect others.
“I mean, we scraped together the $5,000, but the next person can give them the last penny they own,” Kathy said.
Instances of artificial intelligence causing problems on the internet and in real life have become commonplace in recent months and even some of the most notable names have not been immune.
In February, a deepfake video of Joe Rogan promoting a male libido booster went viral on TikTok, with many online calling it “uncannily real.”
At the time, the video sparked a major wave of fear over concerns that it could lead to serious scams and waves of misinformation being spread.
Many Twitter users noted in February that it is illegal to use artificial intelligence to impersonate someone in order to promote a product.
One user, puzzled by the deepfake ad, said, “Deepfake moderation will soon become more common in the advertising world. Bullish about ad monitoring software.”
The clip shows Rogan and Professor Andrew D. Huberman on The Joe Rogan Experience podcast, with Huberman talking about the male enhancer, which claims to increase testosterone.
The video then cuts to Amazon to show users where to find Alpha Grind, complete with a 15 percent off coupon for the product.
The Rogan deepfake is just one of many released to the masses — one in 2022 showed Meta CEO Mark Zuckerberg thanking Democrats for their “service and inaction” regarding antitrust law.