TikTok pushes self-harm, ‘thinspiration’ content on children at alarming rate, study says

via Solen Feyissa (CC BY-SA 2.0)
TikTok recommends suicide content to users as young as 13, with the fastest suggestion appearing 2.6 minutes after sign-up, according to a study published Thursday.
Posts about body image or mental health were recommended every 39 seconds, an interval that shrank to 27 seconds for accounts simulating more vulnerable minors.
The findings come from the Center for Countering Digital Hate (CCDH), a U.S.-based nonprofit that claims to “disrupt the architecture of online hate and misinformation.” For the study, researchers created eight TikTok accounts with the user’s age set as 13.
Two accounts — a standard user and a “vulnerable” user — were created for four locations: the U.S., U.K., Australia and Canada. Unlike those of the standard users, the usernames of the vulnerable users included the term “loseweight.”
CCDH said the design of the vulnerable accounts was based on findings from its earlier research, which showed that pro-eating disorder users on Instagram will select usernames with related words such as “anorexia.” Researchers also cited evidence that users vulnerable to content about depression, self-harm and suicide choose similar terms for their usernames.
The recommendations and their speed were tracked after each user “expressed a preference” by pausing on videos related to body image, mental health and eating disorders and pressing the “like” button. All recommendations showed up on the “For You” feed.
Analysis of the four standard accounts revealed that TikTok recommended videos discussing suicide in as little as 2 minutes and 38 seconds, while a video about eating disorders surfaced within 8 minutes. Overall, suicide/self-harm or eating disorder content showed up every 206 seconds (3 minutes, 26 seconds).
The vulnerable accounts were three times as likely to be exposed to harmful content, according to the study. Recommendations related to eating disorders or suicide/self-harm appeared every 66 seconds, and suicide/self-harm content specifically was shown 12 times more often.
Razor blades were a common sight in the recommended suicide/self-harm videos. Meanwhile, eating disorder videos often mentioned “thinspo” — shorthand for “thinspiration” — which included posts suggesting chewing gum as a dietary substitute.
CCDH CEO Imran Ahmed called the results “every parent’s nightmare.” He said the research underscores the “urgent need for reform of online spaces.”
“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” Ahmed said in the report. “Without oversight, TikTok’s opaque algorithm will continue to profit by serving its users – children as young as 13, remember – increasingly intense and distressing content without checks, resources, or support.”
Ahmed also described TikTok’s recommendations as “just a beautiful package, but absolutely lethal content.”
“It’s like being stuck in a hall of distorted mirrors, where you’re constantly being told you’re ugly, you’re not good enough, maybe why don’t you kill yourself,” he said, according to Vice. “It is a really full on experience for those kids because it’s just rushing at them. And that’s the way the TikTok algorithm works.”
TikTok disputed the findings and said it performs regular content reviews. It also criticized the study’s methodology, saying it did not “reflect genuine behavior or viewing experiences of real people.”
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics,” a TikTok spokesperson said in a statement.
Aside from concerns over children’s health and wellbeing, the social media platform has also drawn national security scrutiny. In June, a BuzzFeed investigation of leaked audio recordings revealed that U.S. user data had been repeatedly accessed from China, with one director referring to a Beijing engineer known as “Master Admin” who “has access to everything.”
The full report, titled “Deadly by Design,” is available on the CCDH website. CCDH also published a “TikTok Parents Guide” to accompany the study.