Social networks

Murder video recommendations force Meta to apologize to users of its networks

Meta has now fixed the bug that caused thousands of users, including minors, to be recommended videos showing death, violence and blood

Aida Xart

Barcelona. Meta, the parent company of Facebook, Instagram and WhatsApp, has apologized after Instagram users reported that the Reels feed – the algorithm’s personalized recommendations for short videos – was flooded with violent videos. On Friday, users around the world found that the app was recommending violent videos featuring, among other things, animal abuse, murder and corpses. Some of these posts had thousands of likes and comments.

Among the complaints, users reported seeing shocking scenes, such as a man crushed by an elephant, a person dismembered by a helicopter, shootings, a pile of corpses and a boy plunging his face into boiling oil. Others filled their profiles with censored screenshots bearing the message “sensitive content,” a format designed to hide a video’s graphic content so that the user must press an acceptance button before viewing the post.

These errors occurred after recent changes in Meta’s approach to content moderation: the company has replaced fact-checkers with a system of “community notes” written by users, and has introduced new policies on content about sexual orientation and gender, moves that several online-rights associations have criticized. The bug caused Instagram’s moderation systems to accidentally prioritize violent content instead of filtering it out, even when users had reported the videos or the accounts spreading them.

The technology outlet 404 Media reported the case of a man whose account focused on cycling and who suddenly began seeing violent videos and recommended posts from an account called PeopleDeadDaily. In statements to the outlet, Meta said the flood of videos was unrelated to its review of the platform’s content rules, which includes scaling back moderation across its platforms.

"We have corrected a bug that caused some users to see in their Instagram feed reels that should not have been recommended. We apologize for the error," Meta said in a statement shared on CNBC. Associations such as the Molly Rose Foundation, which was founded in 2017 when teenager Molly Rose committed suicide after seeing distressing content on Instagram, have demanded a full explanation for this. said Andy Burrows, director of the foundation.