NewsNation

AI voice cloning scammers’ new trick to dupe your loved ones

(NewsNation) — Cybercriminals are using artificial intelligence to swindle people out of thousands of dollars with a new tactic: cloning voices.

Here’s how it works — scammers will typically make a fake phone call to you, and once you answer, they’ll record your voice. Then they feed the recording into AI software that mimics your voice, intonation and inflections.

“I’ve seen people do it with 30 seconds to a minute. It’s not ideal but the more they have on you, the more they can synthesize your voice,” explained R. Harvey Castro, an AI author.

The cybercriminals will then create convincing voicemails to trick people into sending money.

Family and friends think their son, daughter or grandchild is in jail and needs help with bail, or companies are told they need to wire thousands of dollars; only later do they find out they’ve been conned.

In 2022, more than 36,000 people reported imposter scams in which someone pretended to be a friend or family member, according to Federal Trade Commission data. The agency also reported that 5,100 of those incidents happened over the phone, accounting for more than $11 million in losses.

The Better Business Bureau urges the public to verify all payment requests and to agree on a family code word, so a caller can prove their identity before any money is wired.

“I don’t care if it’s for $10. If I don’t say Rumpelstiltskin in the conversation, then it’s not me; don’t send any money,” said Barry N. Moore of the Better Business Bureau.

Another way to keep yourself safe is to avoid answering calls from numbers you don’t recognize. The moment you suspect a scammer, hang up.

The FTC suggests that if you get a money request from someone who sounds like a loved one — particularly a request for a wire transfer or gift card — you hang up and call the person directly to verify the story.

The BBB said you may also want to avoid posting videos of yourself on social media, since they give scammers voice samples to work with.