CONCORD, N.H. (AP) — A political consultant who sent artificial intelligence-generated robocalls mimicking President Joe Biden’s voice to voters ahead of New Hampshire’s presidential primary faces a $6 million fine and more than two dozen criminal charges.
The Federal Communications Commission said the fine it proposed Thursday for Steven Kramer is its first involving generative AI technology. The company accused of transmitting the calls, Lingo Telecom, faces a $2 million fine, though in both cases the parties could settle or further negotiate, the FCC said.
Kramer has admitted orchestrating a message that was sent to thousands of voters two days before the first-in-the-nation primary on Jan. 23. The message played an AI-generated voice similar to the Democratic president’s that used his phrase “What a bunch of malarkey” and falsely suggested that voting in the primary would preclude voters from casting ballots in November.
Kramer is facing 13 felony charges alleging he violated a New Hampshire law against attempting to deter someone from voting using misleading information. He also faces 13 misdemeanor charges accusing him of falsely representing himself as a candidate by his own conduct or that of another person. The charges were filed in four counties and will be prosecuted by the state attorney general’s office.
Attorney General John Formella said New Hampshire was committed to ensuring that its elections “remain free from unlawful interference.”
“I am pleased to see that our federal partners are similarly committed to protecting consumers and voters from harmful robocalls and voter suppression,” said Formella, who was appointed by Republican Gov. Chris Sununu.
Lingo Telecom said it strongly disagrees with the FCC’s action, which it called an attempt to impose new rules retroactively.
“Lingo Telecom takes its regulatory obligations extremely seriously and has fully cooperated with federal and state agencies to assist with identifying the parties responsible for originating the New Hampshire robocall campaign,” the company said. “Lingo Telecom was not involved whatsoever in the production of these calls and the actions it took complied with all applicable federal regulations and industry standards.”
The New Hampshire calls falsely showed up to recipients as coming from the personal cellphone number of Kathy Sullivan, a former state Democratic Party chair who helped run the Biden write-in campaign. She said in an email Thursday that she hopes Kramer is learning “there is a steep price for trying to rig an election.”
“The swift, decisive action by the New Hampshire Department of Justice and the FCC hopefully will deter other bad and/or stupid actors who don’t respect democracy,” she said.
Kramer, who owns a firm that specializes in get-out-the-vote projects, did not respond to an email seeking comment Thursday. He told The Associated Press in February that he wasn’t trying to influence the outcome of the election but rather wanted to send a wake-up call about the potential dangers of artificial intelligence when he paid a New Orleans magician $150 to create the recording.
“Maybe I’m a villain today, but I think in the end we get a better country and better democracy because of what I’ve done, deliberately,” Kramer said in February.
Voter suppression carries a sentence of 3 1/2 to 7 years in prison. Impersonating a candidate is punishable by up to a year in jail.
In an interview days after he was publicly identified as the source of the calls, Kramer said he disagreed that his robocall suppressed voter turnout, noting that Biden won the Democratic primary by a wide margin as a write-in candidate. While he did some ballot access work for another former Democratic presidential hopeful, Rep. Dean Phillips of Minnesota, Kramer said he acted alone.
“I wrestled in college. I’m ready for the fight,” said Kramer, who is scheduled to appear in court on June 5. “If they want to throw me in jail, good luck.”
Since the New Hampshire robocalls, the FCC has taken steps to combat the growing use of artificial intelligence tools in political communications. In February, it confirmed that AI voice-cloning tools in robocalls are banned under existing law, and on Wednesday, it introduced a proposal to require political advertisers to disclose when they use content generated by artificial intelligence in broadcast television and radio ads.
If adopted, the new rules would add a layer of transparency that many lawmakers and AI experts have been calling for as rapidly advancing generative AI tools churn out lifelike images, videos and audio clips that threaten to mislead voters in the upcoming U.S. election.
FCC Chairwoman Jessica Rosenworcel said Thursday that regulators are committed to helping states go after perpetrators. In a statement, she called the New Hampshire robocalls “unnerving.”
“Because when a caller sounds like a politician you know, a celebrity you like, or a family member who is familiar, any one of us could be tricked into believing something that is not true with calls using AI technology,” she said in a statement. “It is exactly how the bad actors behind these junk calls with manipulated voices want you to react.”
___
Swenson reported from New York.
___
The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. The AP is solely responsible for all content.