NewsNation

Pro-China videos appear to be ‘deepfake’ misinformation


(NewsNation) — Pro-China videos that resemble a newscast appear to be spreading misinformation with the help of artificial intelligence, according to multiple reports.

In the videos, “Alex,” an anchor from “Wolf News,” criticizes the lack of U.S. action against gun violence, while another anchor discusses the importance of good China-U.S. relations, according to the research firm Graphika, which first discovered the videos.

However, neither the anchors nor Wolf News is real. They were generated using artificial intelligence in a first-of-its-kind, pro-China campaign.

According to The New York Times, the videos use “deepfake” technology, in which artificial intelligence learns from existing photos and videos to produce realistic-looking footage.

The videos were distributed by bot accounts late last year on Facebook and Twitter, in the first known instance of deepfake video technology being used as part of a state-aligned information campaign.

Graphika reported the videos were intended to promote the interests of the Chinese Communist Party and undercut the United States in the eyes of English-speaking viewers.

“This is the first time we’ve seen this in the wild,” Jack Stubbs, vice president of intelligence at Graphika, told the Times.

Deepfakes have gone beyond edited videos of public figures to AI-generated influence campaigns, raising concerns about the use of AI to spread disinformation. As the technology advances, it becomes harder to tell real from fake.

Facebook parent company Meta reported that more than two-thirds of the influence operations it found and took down in 2022 used artificially generated profile pictures.

A study published last year found that people have just a 50% chance of correctly identifying whether a face is real or AI-generated.