Meta Fined $375M for Harming Teens’ Mental Health & Hiding Exploitation Risks

by Chief Editor

A New Mexico jury found on Tuesday that Meta deliberately harmed the mental health of children and concealed what it knew about child sexual exploitation on its social media platforms, thereby facilitating such criminal conduct.

Meta Found Liable in New Mexico

During deliberations, the jury worked through a checklist of accusations presented by the prosecution, which maintained that Meta failed to disclose what it knew about problems enforcing its ban on use by those under 13, the prevalence of teen suicide-related content on social media, and the role of Meta’s algorithms in prioritizing sensational or harmful content, among other issues.

The decision follows a nearly seven-week trial and comes as a federal court jury in California continues deliberating, now for over a week, on whether Meta and YouTube should be held responsible in a similar case.

Jurors sided with prosecutors in New Mexico, who argued that Meta—owner of Instagram, Facebook, and WhatsApp—prioritized financial gain over safety. The jury found that Meta violated several sections of New Mexico’s Unfair Practices Act, based on accusations that the company concealed what it knew about the dangers of child sexual exploitation on its platforms and the impact on children’s mental health, according to AP.

Did You Know? The case was based on a state undercover investigation in which agents created social media accounts posing as minors to document cases of sexual harassment and Meta’s response.

The jury agreed with allegations that Meta made false or misleading statements, and also found that the company engaged in “unconscionable” business practices that unfairly exploited the vulnerability and inexperience of children.

Jurors concluded that thousands of violations occurred; each was counted separately, producing a total penalty of $375 million.

“We respectfully disagree with the verdict and will appeal,” a Meta spokesperson said. “We work hard to keep people safe on our platforms and are forthright about the challenges of identifying and removing malicious actors or harmful content. We will continue to vigorously defend ourselves and remain confident in our track record of protecting teens online.”

Meta’s lawyers asserted that the company discloses risks and makes efforts to eradicate harmful content and experiences, while acknowledging that some inappropriate material evades its safety filters.

Broader Implications for Social Media

The New Mexico case, which went to trial on February 9th, is among the first in a wave of lawsuits over social media platforms’ impact on children to reach a jury, and it comes as school districts and legislators demand greater restrictions on smartphone use in classrooms.

More than 40 state attorneys general have filed lawsuits against Meta, alleging that the company contributes to a mental health crisis among young people by deliberately designing addictive features in Instagram and Facebook.

Expert Insight: This verdict underscores the growing legal scrutiny of social media companies and their responsibility for the well-being of young users. The focus on algorithms and the prioritization of engagement over safety represents a significant shift in the legal landscape.

Meta has not acknowledged the existence of social media addiction; however, during the trial, its executives admitted to “problematic use” and stated their goal is for users to feel satisfied with the time they spend on Meta platforms.

“The evidence shows that Meta invests in safety not just because it’s the right thing to do, but because it’s good for business,” Meta attorney Kevin Huff told the jury during closing arguments. “Meta designs its apps to help people connect with their friends and families, not to facilitate contact with predators.”

Technology companies have enjoyed legal protection from liability for material posted on their social media platforms, shielded by Section 230—a 30-year-old provision in the US Communications Decency Act—as well as the protection of the First Amendment.

Prosecutors in New Mexico argue that Meta should nonetheless be held accountable for its role in disseminating such content through complex algorithms that push potentially harmful material to minors.

“We know the goal of that is to generate engagement and increase the amount of time kids spend on the platform,” prosecutor Linda Singer stated. “That decision by Meta has profound negative repercussions for children.”

A second phase of the trial, scheduled for May—and to be held before a judge, without a jury—will determine if Meta has committed a “public nuisance” and whether it should be ordered to rectify its course and bear the costs of corrective measures.

Frequently Asked Questions

What did the jury find Meta guilty of?

The jury found that Meta deliberately harmed the mental health of children and concealed what it knew about child sexual exploitation on its platforms, violating New Mexico’s Unfair Practices Act.

What is the total penalty Meta faces?

The jury assessed a total penalty of $375 million, calculated by counting each violation separately.

What is Meta’s response to the verdict?

Meta stated it respectfully disagrees with the verdict and will file an appeal, maintaining its commitment to platform safety.

As this case unfolds, other states may pursue similar legal action against social media companies over the well-being of young users.
