WASHINGTON, D.C. (NewsNation) — Embattled and wildly popular social media platform TikTok unveiled updated community guidelines just days before the company's CEO, Shou Zi Chew, is set to testify before the U.S. House Committee on Energy and Commerce. However, a new study claims the site is serving dangerous content to young teens.
The new rules, which take effect April 21, will affect the company's reported 1 billion users worldwide. The company says it has 150 million users in the United States alone, nearly half the country's population.
“These principles are based on our commitment to uphold human rights and aligned with international legal frameworks,” the company wrote Tuesday. “These principles guide our decisions about how we moderate content, so that we can strive to be fair in our actions, protect human dignity, and strike a balance between freedom of expression and preventing harm.”
TikTok, owned by China-based ByteDance, has come under fire in recent months over national security and data privacy concerns, with several states and the federal government banning the app from employees' devices. Last week, President Joe Biden's administration threatened to ban the app entirely in the U.S. if TikTok's Chinese owners didn't sell their stakes.
The company says its new policy will focus on four pillars:
- Removing violative content
- Age-restricting mature content (18 and older)
- Restricting videos for the “For You” feed if they are not appropriate for a broad audience
- Empowering TikTok users with tools and resources to “stay in control of their experience”
However, even as those changes roll out, new research from international consumer watchdog group Eko suggests TikTok may be especially harmful to younger users, readily serving up violent content.
According to the study, spending just 10 minutes on the app "effectively triggered TikTok's algorithm to target our researcher's 13-year-old accounts with content explicitly promoting suicide and violence."
The report goes on to say the platform then suggested posts with hashtags related to self-harm — under the heading labeled “For You.” Researchers say they found those videos had been viewed more than 8 billion times by TikTok users worldwide and included hashtags like #ImDone, referencing suicide.
The study did note, however, that algorithms promoting harmful content are a staple of all major social media platforms.
In a statement to NewsNation, a spokesperson for TikTok said:
“The experiment this report is based on does not reflect genuine behaviour that we see on TikTok. The safety of our community is a top priority, and we work hard to prevent the surfacing of harmful content on our platform by removing violations, publishing regular transparency reports, adding permanent public service announcements on certain hashtags, and surfacing helpful resources with guidance from experts. We also balance these efforts by supporting people who come to TikTok to share their personal experiences in a safe way, raise awareness and find community support.”
If you or someone you know needs help, resources or someone to talk to, support is available through the National Suicide Prevention Lifeline website or by calling 988. Counselors are available to talk 24/7.
NewsNation’s Tulsi Kamath contributed to this report.