(NewsNation) — An Arizona mom will testify Tuesday in a congressional hearing about her experience with an AI deep fake extortion scam.
In April, scammers used AI to clone the voice of Jennifer DeStefano’s daughter and demanded a $1 million ransom in a fake kidnapping scheme.
This comes amid a rise in virtual kidnapping schemes, in which scammers claim to be holding a recipient’s relative hostage and threaten to harm them unless a ransom is paid.
DeStefano will be the only person testifying in front of the Senate Judiciary Committee Tuesday who doesn’t work in or with artificial intelligence. She said it’s important lawmakers hear about her experience from her point of view to prevent others from falling victim to these types of calls.
“I think they need to hear the firsthand perspective of how evil this can be, what this can be used for,” DeStefano said. “The depths of the terror, that it’s being used right now to traumatize the public and the people. I’m not the only story, and I’m not going to be the last if there isn’t any kind of consequence or action.”
DeStefano said the experience has left her distrustful of everyday phone calls. She said it “undermines the very idea of what is familiar anymore.”
DeStefano recounted receiving a call from an unfamiliar number, which she worried might be a medic or hospital calling to tell her that her daughter had been injured on a ski trip. Instead, answering the phone marked the start of a terrifying ordeal for her family.
“I answered the phone and it was my daughter crying and sobbing, saying ‘Mom,'” DeStefano said. “I said, ‘What happened?’ And she goes, ‘Mom, I messed up,’ crying and sobbing. So I thought she just got herself a little banged up.”
But all of a sudden DeStefano heard a man’s voice tell her daughter to “lay down, put your head back.”
DeStefano said the experience was “very haunting,” especially because the call sounded so realistic.
“Especially when you’re trying to figure out where did they get these cries? Where do they get the sobs?” she said. “You know, there are so many unanswered questions.”
She said the faux kidnapper initially asked for $1 million but then lowered the figure to $50,000 after DeStefano said she didn’t have the money.
The nightmare ended after DeStefano, who was at her other daughter’s studio at the time, got help from one of the other mothers there. They called 911 and DeStefano’s husband and confirmed that her daughter, Brie, was safe and sound on her ski trip.
DeStefano said the call was treated as a “prank call,” because no money was transferred and no kidnapping physically took place.
“So it was a dead end. At that point, no police report was taken,” she said.
The scammer’s identity remains unknown, but computer science experts say voice-cloning technology has advanced to the point that someone’s tone and manner of speaking can be recreated from the briefest of soundbites.