New research suggests that TikTok is suppressing content that encourages people to vote.
Nonprofit political group Accelerate Change claimed in a report released Sunday that TikTok videos receive fewer views when creators use election-related phrases than otherwise similar videos without those terms.
The group had influencers upload 20 pairs of nearly identical videos, then compared view counts within each pair. In total, the videos accumulated 370,000 views.
In some of the videos, creators spoke political phrases such as “mid-terms” and “get out and vote” aloud; in the paired versions, the same phrases appeared written on signs instead. The videos with spoken election terms reportedly received roughly a third as many views as the videos with written phrases.
Accelerate Change President Peter Murray said the findings show that TikTok’s algorithm is suppressing election content.
“Often with an algorithm performance experiment like this, you struggle to see a pattern in the data, but in this case the result was dramatic and clear: TikTok is suppressing more than 65% of voting video views,” Murray said in the report.
The study comes months after TikTok announced several changes in September to its content policy on politically related material ahead of the U.S. midterm elections.
The changes include prohibiting videos that contain political fundraising efforts and requiring U.S.-based government and political accounts to get verified.
While the study acknowledged these policy changes, it questioned the platform’s limited efforts to encourage voter participation.
In a statement to Gizmodo, a TikTok representative challenged Accelerate Change’s methodology and noted that the platform has no special guidelines for “political content.”
The representative pointed out that the paired videos were posted at different times and on different days, which could affect how many views each generated, and asserted that the discrepancy in views had nothing to do with the use of political terms.
“All content — audio, visuals, text, stickers, captions, etc. — is moderated in accordance with our Community Guidelines which apply to everyone and everything on the platform, and we strive to consistently and accurately enforce these policies,” the representative was quoted as saying.
TikTok has existing policies regarding accounts owned by governments, politicians and political parties, although they are not related to Accelerate Change’s research.
The spokesperson further said that none of the paired videos in Accelerate Change’s research were “moderated” by TikTok’s algorithm, and pointed out that one of the videos with spoken election terms was taken down after amassing 15,000 views.
Murray acknowledged that the video had been deleted in error, but said that even after it was reposted, its combined view count remained far lower than that of the nearly identical version with the election terms written out. He also addressed the criticism of the methodology, explaining that the research team randomized the timing of the influencers’ posts to control for such variables.
On the spokesperson’s claim that none of Accelerate Change’s videos were moderated, Murray said the company was not being truthful.
“TikTok only moderates a small fraction of videos, but their algorithm is tuned to downgrade videos with voting words (and likely political words as well),” he argued. “Not being moderated doesn’t mean that they weren’t automatically suppressed in the algorithm.”
Acknowledging its impact on elections, TikTok also announced in August that it is taking extra measures to limit misinformation and violations of its policies by rolling out an Elections Center to “connect people who engage with election content to authoritative information and sources in more than 45 languages, including English and Spanish.”
The platform also unveiled tools that allow users to “automatically filter out videos with words or hashtags they don’t want to see in their For You or Following feeds.”