Multimillion-dollar fine for Meta for knowingly endangering minors
The company has several other legal proceedings open in parallel
Barcelona
Historic judicial precedent against Meta, the parent company of Facebook and Instagram. A jury in Santa Fe, New Mexico (USA), has ordered the technology giant to pay 375 million dollars in damages, finding that the company prioritized profits over safety and concealed what it knew about the potentially harmful effects of its applications. According to the state attorney general, who brought the case, "the jury's verdict is a historic victory for every child and every family who paid the price for Meta's choice to prioritize profits over child safety. Today, the jury has joined families, educators, and child safety experts to say: 'Enough is enough.'" The company has announced that it will appeal the ruling, as it "respectfully disagrees" with the verdict.
After seven weeks of hearings and a single day of deliberation, the jury determined that Meta violated the state's consumer protection laws by failing to disclose the potential harm its applications can cause. Specifically, it pointed to risks of sexual exploitation of minors, as well as consequences for young people's psychological development. New Mexico law allows a maximum of 5,000 dollars per infraction of this kind, so the total stems from the 37,500 infractions the jury found, based on the number of minors who may have suffered these consequences. Beyond the amount, relatively small for a corporation that reported 200 billion dollars in revenue last year, this is the first time a state has prevailed against a major technology company over harm to minors. And it opens the door to similar lawsuits across the United States.
The ruling arrives while Meta faces two other major proceedings. The first is in California, where Meta is on trial for allegedly allowing minors access to its platforms. In that case, the company's CEO, Mark Zuckerberg, testified for more than six hours and admitted the company was slow to implement effective age controls: Instagram did not ask users for their birth date until 2019, waited until 2021 to require existing profiles to declare an age, and did not begin verifying those declarations until 2022. An internal company document estimated that, in 2015, there were more than four million Instagram accounts belonging to children under 13.
Beyond this proceeding, a major federal trial begins in June, initiated by various school districts across the United States, in which Meta, Google, Snapchat, and TikTok will be in the dock. The lawsuits were filed in response to the rise in mental health disorders among young people, which plaintiffs believe is exacerbated by prolonged exposure to social networks.