
Deepfake fears surface ahead of 2024 election

  • Deepfakes can be nearly impossible to distinguish from real videos
  • Experts fear deepfakes can undermine trust in the democratic process
  • Deepfake robocalls have already been used in the election


(NewsNation) — In the lead-up to the 2024 presidential election, a recent picture of former President Donald Trump and what appear to be supporters has gotten a lot of pushback.

The reason? It’s a deepfake. Obtained by the BBC, the photo was generated by artificial intelligence, and there is no evidence linking the photo to the actual Trump campaign.

But the image is concerning because deepfakes are only likely to improve and become harder to separate from reality as the November election approaches.

Deepfakes are relatively cheap and in some cases free to make, but they can cause real damage during a political campaign. Voice cloning can be used to mimic a person in ads or robocalls. AI can also generate fake news reports.

These videos can trick an unsuspecting person if they don’t know what to look for: things like lips not quite syncing up with speech, unnatural speech cadence, skin that seems too smooth to be real and sometimes small glitches around boundaries, like between a person’s face and the background.

While some AI-generated content seems harmless, like fun face filters, the fear is that in the wrong hands, it could have the power to become undetectable.

By the November election, it’s entirely possible it will be extremely difficult to tell what’s real and what’s not.

Cybersecurity expert Nicole Tisdale said the biggest issue with deepfakes is the way they can undermine trust in the democratic process.

“People start to fear that they can’t believe what they are seeing and what they are hearing,” Tisdale told NewsNation’s Nichole Berlie. “So when we get to a place where you can’t believe your eyes, which is where we are, and you also can’t believe your ears when you’re talking about audio, it can really make people kind of question their decision to vote or to participate in a democratic process at all.”

Deepfake robocalls have already gone out to voters trying to trick them into not voting. Even for experts, Tisdale said, it can be hard to identify deepfakes. There are even more convincing examples that aren’t available to the public, she explained, showing just how realistic they can be.

“One of the best examples is actually a music video by Kendrick Lamar that has over 45 million views on YouTube,” Tisdale said.

The video, which she specifically gave as a nonpolitical example, includes deepfakes of Kanye West and Kobe Bryant, among others.

“Viewers have to understand, you’re not going to be able to detect this on your own,” Tisdale said. “You have to have technology, very advanced technology, to spot a really good deepfake.”


Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.


