AI-generated voices in robocalls can deceive voters. The FCC just made them illegal
The Federal Communications Commission on Thursday outlawed robocalls that contain voices generated by artificial intelligence, a decision that sends a clear message that exploiting the technology to scam people and mislead voters won’t be tolerated.
The unanimous ruling declares that robocalls made with AI voice-cloning tools count as "artificial" voices under the Telephone Consumer Protection Act, a 1991 law restricting junk calls that use artificial and prerecorded voice messages.
The announcement comes as New Hampshire authorities are advancing their investigation into AI-generated robocalls that mimicked President Joe Biden’s voice to discourage people from voting in the state’s first-in-the-nation primary last month.
Effective immediately, the regulation empowers the FCC to fine companies that use AI voices in their calls or block the service providers that carry them. It also opens the door for call recipients to file lawsuits and gives state attorneys general a new mechanism to crack down on violators, according to the FCC.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters,” FCC Chairwoman Jessica Rosenworcel said in a news release. “We’re putting the fraudsters behind these robocalls on notice.”