Meta knowingly harmed children’s mental health, US jury decides in landmark ruling

by Chief Editor

Meta’s $375 Million Verdict: A Turning Point for Big Tech Accountability?

A New Mexico jury has delivered a landmark verdict against Meta, ordering the company to pay $375 million (€317 million) for failing to protect children from online exploitation and for misleading users about the safety of its platforms – Facebook, Instagram, and WhatsApp. The decision, reached on Tuesday, March 24, 2026, marks the first time a jury has found Meta liable in such a case.

The Core of the Case: Exploitation and Deceptive Practices

New Mexico Attorney General Raúl Torrez initiated the lawsuit in 2023, citing evidence that Meta platforms were “prime locations for predators” and facilitated the exchange of child pornography. The Attorney General’s office used undercover accounts posing as 14-year-olds to gather evidence, revealing that the platforms directed young users to sexually explicit content and recommended participation in unmoderated groups linked to commercial sex.

The jury found that Meta engaged in “unconscionable” trade practices, exploiting the vulnerabilities of children. Evidence presented included internal Meta correspondence and reports concerning child safety, as well as testimony from executives, engineers, and safety consultants. Jurors weighed statements made by Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta global head of safety Antigone Davis regarding platform safety.

Beyond Exploitation: Addiction and Mental Health Concerns

The lawsuit also addressed concerns about social media addiction, alleging that Meta had not adequately disclosed or addressed the potential dangers. While Meta does not explicitly acknowledge social media addiction, executives admitted to “problematic use” and a desire for users to have positive experiences on their platforms. The jury considered Meta’s failure to enforce its minimum age requirement of 13 and the role of algorithms in promoting harmful content, including content related to teen suicide.

A Wave of Litigation: Meta Faces Broad Challenges

New Mexico’s case is part of a larger trend of legal challenges against Meta and other social media companies. More than 40 state attorneys general have filed lawsuits claiming that Meta contributes to a mental health crisis among young people through addictive platform features. A separate “bellwether trial” is underway in California, focusing on whether Meta and Google platforms are harmful and addictive to children.

The California lawsuit, brought by a 19-year-old, alleges that early exposure to Instagram and YouTube exacerbated depression and suicidal thoughts. The claim centers on deliberate design choices intended to maximize platform engagement and profits, mirroring techniques used in the gambling industry.

What Happens Next?

Meta plans to appeal the New Mexico verdict. While no immediate changes to Meta’s practices are required, a second phase of the trial in May will determine whether Meta platforms constitute a “public nuisance” and if the company should fund public programs to address related harms.

The Future of Tech Accountability: Potential Trends

Increased Regulatory Scrutiny

The Meta verdict signals a potential shift towards greater regulatory scrutiny of social media platforms. Governments worldwide are likely to intensify efforts to hold tech companies accountable for the safety of their users, particularly children. This could lead to stricter laws regarding content moderation, age verification, and data privacy.

Focus on Algorithmic Transparency

The role of algorithms in amplifying harmful content is coming under increasing scrutiny. Future regulations may require greater transparency in how algorithms operate and how they are designed to influence user behavior. Companies may be forced to demonstrate that their algorithms are not intentionally promoting harmful content or exploiting vulnerabilities.

Shifting Legal Landscape: Negligence and Product Liability

The New Mexico case establishes a precedent for holding social media companies liable for negligence in protecting users from harm. Future lawsuits may increasingly frame social media platforms as “products” subject to product liability laws, potentially opening the door to broader claims of harm.

Rise of “Duty of Care” Legislation

The concept of a “duty of care” – requiring companies to take reasonable steps to prevent foreseeable harm to users – is gaining traction in legal and policy circles. Legislation based on this principle could impose a legal obligation on social media platforms to proactively protect users from harm, including exploitation, addiction, and mental health issues.

FAQ

Q: Will this verdict immediately change how Meta operates?
A: Not immediately. Meta is appealing the decision. A second phase of the trial in May will determine if further changes are required.

Q: What is a “bellwether trial”?
A: A bellwether trial is an early test case that helps determine the likely outcome of many similar lawsuits.

Q: Are other social media companies at risk?
A: Yes. The legal challenges facing Meta are part of a broader wave of litigation targeting social media platforms, and other companies could face similar lawsuits.

Did you know? Over 100,000 children were reportedly exploited daily across Meta’s platforms, according to evidence presented in the New Mexico case.

Pro Tip: Parents should actively monitor their children’s online activity and educate them about the risks of social media.

This verdict represents a significant moment in the ongoing debate about the responsibility of social media companies. As legal and regulatory pressures mount, the future of these platforms will likely be shaped by a greater emphasis on user safety and accountability.

What are your thoughts on the Meta verdict? Share your opinions in the comments below!
