Labor

Complaint against Meta: Former Barcelona moderators report lasting harm from viewing violent images

Twenty-nine former employees of CCC Barcelona Digital Services claim to suffer from psychological disorders.

ARA

Barcelona. A group of 29 former Facebook and Instagram moderators in Barcelona have filed a complaint against Meta and the company CCC Barcelona Digital Services over the psychological disorders they suffered after having to review hundreds of violent images daily, including murder, rape, and child pornography. In the complaint, to which Efe has had access, they accuse CCC Barcelona Digital Services, a subcontractor of Meta, of a continuing offense against workers' rights, another of serious injuries caused by gross negligence, and a third against moral integrity.

The plaintiffs, represented by lawyer Francesc Feliu, want the complaint to be joined to the proceedings currently being conducted by Barcelona's Investigating Court No. 29 following an earlier complaint, as both involve the same crimes and the same company. The case being investigated by that court was reported at the end of 2023, Feliu told ARA, and concerns a former employee who worked from Barcelona as a moderator of content published in Brazil, which included live images of violence and even terrorism.


The decision on whether the court will take on the new complaint from the 29 moderators could come before the August recess, according to Feliu's calculations. The ruling on whether it is admitted, however, could take longer, given that the complaint and the accompanying documentation submitted by the lawyer run to more than a thousand pages. All of the affected parties who signed this latest complaint are former employees of CCC Barcelona Digital Services, which announced the closure of its headquarters in the Catalan capital in April and laid off virtually its entire workforce at the end of May.

"Inhuman and indecent" conditions

The complaint alleges that the moderators had to review platform content, ranging from videos to violent comments, under "absolutely inhumane and indecent working conditions, as well as at a work pace completely unbearable for any human being." This situation, according to the complaint, left the workers "exposed to psychosocial risks," both because of the audiovisual content they moderated and because of their working conditions.


Furthermore, they accuse the company of not informing them in advance about the type of content they would be reviewing and of providing "opaque" information about the task they would be performing: "Before hiring them, they were not informed that they would be viewing extremely violent content, much less continuously throughout the task." The complaint also alleges that the workers were allowed only five minutes of visual rest per hour, which could not be accumulated, and that they could not be absent "under any circumstances from the chair in which they sat for 55 minutes at a time." In other words, over an entire workday they had only 35 minutes away from the violent content.

"The viewing load—for each moderator—started at 100 pieces of content per day," but "progressively increased to 800," the complaint states. As a result of this work, according to the text, the moderators ended up suffering from various pathologies, such as anxiety, panic attacks, nightmares, tachycardia, dizziness with fainting, hopelessness, suicidal thoughts, insomnia, vomiting, irritability, and feelings of guilt, according to the complaint.


Prior to this complaint, each of these workers, who were employed by the company between 2018 and 2022, had filed a complaint with the Labor Inspectorate regarding their working conditions, the lack of rest, the type of content viewed, and the company's lack of preventive measures.