
AI programs can easily impersonate Biden, others to manipulate elections: Study


(The Hill) – It’s easy for artificial intelligence programs to mimic the voices of politicians like President Joe Biden and former President Donald Trump, posing the risk of a rise in voter misinformation, according to a study from the Center for Countering Digital Hate (CCDH) released Friday.

AI-enabled tools created convincing false statements using the cloned voices about 80 percent of the time, CCDH tests found.

“Guardrails for these tools are so severely lacking — and the level of skill needed to use them is now so low — that these platforms can be easily manipulated by virtually anyone to produce dangerous political misinformation,” CCDH CEO Imran Ahmed said in a statement.

Cloned voices have already been used to influence voters in the 2024 election. During the New Hampshire Democratic primary in February, robocalls using a fake Biden voice told voters to stay home in an attempt to decrease voter turnout.

Steve Kramer, who ran the scheme, said he was inspired by a need to warn the public about the dangers of AI. Last week, he was charged with 13 counts each of felony voter suppression and misdemeanor impersonation of a candidate. He was also fined $6 million by the Federal Communications Commission (FCC).

The FCC banned the use of AI voices in robocalls after the New Hampshire primary incident, and the commission’s chair moved last week to require television ads to disclose the use of AI.

“As artificial intelligence tools become more accessible, the commission wants to make sure consumers are fully informed when the technology is used,” FCC Chair Jessica Rosenworcel said in a statement last week. “Today, I’ve shared with my colleagues a proposal that makes clear consumers have a right to know when AI tools are being used in the political ads they see, and I hope they swiftly act on this issue.”

The CCDH study found that few of the six AI tools it tested — ElevenLabs, Speechify, PlayHT, Descript, Invideo AI, and Veed — have any built-in safeguards to protect against generating political disinformation.

The group tested the tools on a plethora of politicians’ voices, including Biden and Trump, as well as foreign leaders such as UK Prime Minister Rishi Sunak and French President Emmanuel Macron.

Examples of the generated messages included Trump warning people not to vote because of a bomb threat, Biden claiming to have manipulated election results, and Macron ‘confessing’ to the misuse of campaign funds, CCDH said.

Only one of the tools, ElevenLabs, blocked the production of mimicked statements using U.S. and UK politicians’ voices, CCDH found.

“AI tools radically reduce the skill, money and time needed to produce disinformation in the voices of the world’s most recognizable and influential political leaders,” Ahmed said. “This could prove devastating to our democracy and elections.”

“This voice-cloning technology can and inevitably will be weaponized by bad actors to mislead voters and subvert the democratic process,” he continued. “It is simply a matter of time before Russian, Chinese, Iranian and domestic anti-democratic forces sow chaos in our elections.”

AI is “supercharging” threats to the election system, technology policy strategist Nicole Schneidman told The Hill in March. “Disinformation, voter suppression — what generative AI is really doing is making it more efficient to be able to execute such threats.”

AI-generated political ads have already appeared in the 2024 election. Last year, the Republican National Committee released an entirely AI-generated ad meant to show a dystopian future under a second Biden administration. It employed fake but realistic images showing boarded-up storefronts, armored military patrols in the streets and waves of immigrants creating panic.

In India’s elections, recent AI-generated videos misrepresenting Bollywood stars as criticizing the prime minister exemplify a trend tech experts say is cropping up in democratic elections around the world. CCDH noted similar attempts at election influence in the UK, Slovakia and Nigeria.

The issue has moved some in Congress to act as well. Sens. Amy Klobuchar, D-Minn., and Lisa Murkowski, R-Alaska, introduced a bill earlier this year that would require similar disclosures to the FCC proposal when AI is used in political advertisements.


Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
