How TikTok content on extreme diets is fueling eating disorders among young people: the disturbing investigation

Videos themed around “diet” and “weight loss” on TikTok appear to be fueling eating disorders among young users. That is the finding of a disturbing Wall Street Journal investigation into the extreme challenges many young girls undertake to lose weight to the point of becoming skeletal, contributing to the spread of eating disorders as a social scourge. Journalists created around a hundred fake accounts that browsed the Chinese-owned app automatically, with minimal human intervention, simulating the behavior of minors.

Once the accounts had spent time watching content related to alcohol, gambling and weight loss (which the platform does not hide), TikTok’s algorithm adjusted accordingly, serving diet and weight-loss videos in greater numbers and with greater frequency in the For You feed.

The results of the investigation

At the end of the experiment, of the roughly 255,000 videos the automated accounts had watched in total, 32,700 contained a description or metadata matching a list of hundreds of weight-loss keywords: 11,615 had text descriptions containing keywords tied to eating disorders, while 4,402 featured a combination of keywords that normalize eating disorders.

And that is not all: to avoid being flagged by the platform, some video descriptions used altered spellings of eating-disorder keywords, replacing certain letters with a number or an asterisk, for example.

In response to the newspaper’s report, TikTok announced that it is working on new ways to let users browse the platform and its content safely. The idea is to develop a strategy for recognizing video content that may not violate TikTok’s policies but could be harmful if watched excessively. The company is also considering a tool that would let users (or their parents, in the case of children using the app) block videos containing certain words or hashtags from appearing in the For You feed.

“While the WSJ experiment doesn’t reflect the experience most people have on TikTok, even one person having that experience is one too many,” a TikTok spokesperson said. “We allow educational or recovery-oriented content because we understand it can help people see that there is hope, but content that promotes, normalizes, or glorifies disordered eating is prohibited.”

TikTok is not the first social network to come under fire for its negative influence on users, especially younger ones: another investigation, also conducted by the WSJ, showed that Instagram can seriously harm the mental health of adolescents and contribute to undermining girls’ self-esteem and body image.

Following that research, the platform announced the introduction of a feature that steers teenagers away from potentially harmful content, as well as the addition of a “Take a break” prompt that invites users to close the app once they have spent a set amount of time on the platform (10, 20 or 30 minutes).

Sources: TikTok / Wall Street Journal
