KANSAS CITY, Mo. (AP) — There are plenty of places to turn for accurate information about COVID-19. Your physician. Local health departments. The U.S. Centers for Disease Control and Prevention.
But not, perhaps, your local government’s public comment session.
During a meeting of the St. Louis County Council earlier this month, opponents of a possible mask mandate made so many misleading comments about masks, vaccines and COVID-19 that YouTube removed the video for violating its policies against false claims about the virus.
“I hope no one is making any medical decisions based on what they hear at our public forums,” said County Councilwoman Lisa Clancy, who supports mask wearing and said she believes most of her constituents do too. The video was restored, but Clancy’s worries about the impact of that misinformation remain.
Videos of local government meetings have emerged as the latest vector of COVID-19 misinformation, broadcasting misleading claims about masks and vaccines to millions and creating new challenges for internet platforms trying to balance the potential harm against the need for government openness.
The latest video to go viral features a local physician who made several misleading claims about COVID-19 while addressing the Mount Vernon Community School Corporation in Fortville, Indiana, on Aug. 6. In his 6-minute remarks, Dr. Dan Stock tells the board that masks don’t work, vaccines don’t prevent infection and state and federal health officials don’t follow the science.
The video has amassed tens of millions of online views and prompted the Indiana State Department of Health to push back. Stock did not return multiple messages seeking comment.
“Here comes a doctor in suspenders who goes in front of the school board and basically says what some people are thinking: the masks are B.S., vaccines don’t work and the CDC is lying — it can be very compelling to laypeople,” said Dr. Zubin Damania, a California physician who received so many messages about the Indiana clip that he created his own video debunking Stock’s claims.
Damania hosts a popular online medical show under the name ZDoggMD. His video debunking Stock’s comments has been viewed more than 400,000 times so far. He said that while there are legitimate questions about the effectiveness of mask requirements for children, Stock’s broad criticism of masks and vaccines went too far.
YouTube removed several similar videos of local government meetings in North Carolina, Missouri, Kansas and Washington state. In Bellingham, Washington, officials responded by temporarily suspending public comment sessions.
The false claims in those videos were made during the portion of the meeting devoted to public comment. Local officials have no control over what is said at these forums, and say that’s part of the point.
In Kansas, YouTube cited "medical misinformation" when it pulled video of a May school board meeting in the 27,000-student Shawnee Mission district, in which parents and a state lawmaker called for the district to drop its mask mandate.
The district, where a mask mandate remains in effect, responded by ending livestreaming of the public comment period. District spokesman David Smith acknowledged the difficulty of keeping board meetings accessible without amplifying falsehoods.
“It was hard for me to hear things in the board meeting that weren’t true and to know that those were going out without contradiction,” Smith said. “I am all about free speech, but when that free speech endangers people’s lives, it is hard to sit through that.”
After hearing from local officials, YouTube reversed its decision and put the videos back up. Earlier this month the company, which is owned by Google, announced a change to its COVID misinformation policy to allow exceptions for local government meetings — though YouTube may still remove content that uses remarks from public forums in an attempt to mislead.
“While we have clear policies to remove harmful COVID-19 misinformation, we also recognize the importance of organizations like school districts and city councils using YouTube to share recordings of open public forums, even when comments at those forums may violate our policies,” company spokeswoman Elena Hernandez said.
The deluge of false claims about the virus has challenged other platforms, too. Twitter and Facebook each have their own policies on COVID-19 misinformation, and say that like YouTube they attach labels to misleading content and remove the worst of it.
Public comment sessions at local government meetings have long been known for sometimes colorful remarks from local residents. But before the internet, if someone were to drone on about fluoride in the drinking water, for instance, their comments weren't likely to become national news.
Now, thanks to the internet and social media, the misleading musings of a local doctor speaking before a school board can compete for attention with the recommendations of the CDC.
It was only a matter of time before misleading comments at these local public forums went viral, according to Jennifer Grygiel, a communications professor at Syracuse University who studies social media platforms.
Grygiel suggested a few ways to minimize the impact of misinformation without muzzling local governments: clear labels on government broadcasts would help viewers understand what they're watching, and keeping video on a government's own website, rather than on YouTube, would let local residents watch without making the footage easy to share widely.
“Anytime there is a public arena – a city council hearing, a school board meeting, a public park – the public has the opportunity to potentially spread misinformation,” Grygiel said. “What’s changed is it used to stay local.”