In recent hours, many users have come across decidedly graphic content on social networks. Fatal accidents, mutilations and violence of various kinds appeared in succession on the screens of thousands of users, including teenagers, for long enough to disturb many of them and prompt formal reports.
According to Meta, an algorithm error caused hundreds of violent videos to appear in Instagram feeds. The company acknowledged the problem, intervened to correct it and apologized to the public.
However, it is not clear how many users were affected or how many videos were erroneously inserted among the suggested contents. What is certain is that numerous users reported the problem directly to the platform.
The “fault” had serious consequences: the videos reached even the timelines with restrictions for sensitive content, including those created specifically to protect minors. The images circulated were particularly shocking, as reported by some US media: road accidents, workplace mutilations, fatal falls from rides in amusement parks. The videos were shown in sequence, without interruption, and came from pages unknown to the user, with names such as Blackpeoplebeinghurt, Shockingtraugedies and Peopledyinghub.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake,” said a Meta spokesperson.
However, the company did not provide details on the nature of the error, nor on why its content-control systems failed to work as expected.
The incident rekindles the debate on Meta's moderation policies, already under scrutiny after the recent decision to end the fact-checking program in the United States on Facebook, Instagram and Threads. Together, these three platforms count over 3 billion users worldwide.
We covered it here: Facebook, Instagram and Threads: what it means that Zuckerberg will eliminate fact-checking (and what the consequences will be for you)
In general, violent and explicit content is prohibited by Meta's policies, and the company usually removes it to protect the community. In some circumstances, however, Meta allows the publication of videos with graphic images if they are deemed useful for raising awareness of issues such as human rights violations or armed conflicts.