TikTok “pushes harmful content into teens’ feeds,” says one study

Some young TikTok users are being shown potentially dangerous content that could encourage eating disorders, self-harm and suicide, an online safety group has said.

Research into TikTok’s algorithm by the Center for Countering Digital Hate (CCDH) found that some accounts were repeatedly served content about eating disorders and other harmful topics within minutes of joining the platform.

The group set up two accounts in each of the US, UK, Australia and Canada posing as 13-year-olds. One account in each country was given a female name and the other was given a similar name but with a reference to weight loss included in the username.

The content offered to both accounts in the first 30 minutes on TikTok was then compared.

The CCDH said it used this username method because previous research has shown that users with body dysmorphia issues often express it through their social media usernames.

In its reporting methodology, the CCDH also said that accounts used in the study expressed a preference for videos on body image, mental health and eating disorders by pausing on relevant videos and hitting the Like button.

Furthermore, the report does not distinguish between content with positive intent and content with clearly negative intent, with the CCDH arguing that in many cases the intent of a video could not be conclusively determined, and that even videos with positive intent may still be distressing for some viewers.

The online safety group’s report argues that the speed with which TikTok recommends content to new users is harmful.

During its testing, the CCDH said that one of its accounts was served content referencing suicide within three minutes of joining TikTok, and that eating disorder content was shown to one account within eight minutes.

The group said that, on average, its accounts were served mental health and body image videos every 39 seconds.

And the research indicated that the most vulnerable accounts, which included body image references in their username, were served three times as much harmful content and 12 times as much content related to self-harm and suicide.

The CCDH said the study also found an eating disorder community on TikTok using both coded and open hashtags to share material on the site, with more than 13 billion views of their videos.

The video-sharing platform's For You page uses an algorithm to recommend content to users as they interact with the app, gathering more information about their interests and preferences over time.

Imran Ahmed, chief executive of the CCDH, accused TikTok of “poisoning the minds” of younger users.

“It promotes to children hatred of their own bodies and extreme suggestions of self-harm and life-threatening, disordered attitudes toward food,” he said.

“Parents will be shocked to learn the truth and furious that lawmakers aren’t protecting young people from the big tech billionaires, their irresponsible social media apps and increasingly aggressive algorithms.”

In the wake of the research, the CCDH has released a new Parenting Guide together with the Molly Rose Foundation, which was set up by Ian Russell after his daughter Molly took her own life having viewed harmful content on social media.

The guide encourages parents to talk “openly” with their children about social media and online safety, and to seek help from support groups if concerned about their child.

In response to the research, a TikTok spokesperson said: “This activity and the resulting experience do not reflect genuine behaviors or viewing experiences of real people.

“We regularly consult with health experts, remove violations of our policies, and provide access to support resources for anyone who needs them.

“We understand that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others about these important topics.”
