Starmer vows to tackle social media’s ‘addictive features’ to protect children

by Chief Editor

Social Media Addiction: A Turning Tide in Tech Regulation?

A recent US court verdict holding Meta and YouTube liable for the harms caused by addictive features on their platforms is sending ripples across the globe, prompting renewed calls for stricter regulation. UK Prime Minister Keir Starmer has signaled his government’s intention to address these concerns, stating he is “remarkably keen” to tackle addictive features and protect children online.

The Landmark US Ruling: What Happened?

Jurors found Meta (owner of Instagram, Facebook and WhatsApp) and Google (owner of YouTube) negligent in the design and operation of their platforms, concluding they intentionally built addictive technologies. The case centered on a 20-year-old woman who alleged she became addicted to social media as a child, leading to mental health problems. She was awarded $6 million (£4.5m) in damages, with Meta liable for 70% and Google for the remaining 30%.

UK Government Response: A Shift in Policy?

The verdict has spurred the UK government to consider more aggressive regulation. Prime Minister Starmer emphasized the need to go “further” than the status quo, pointing to ongoing consultations on a possible social media ban for under-16s. The move reflects growing public expectation of greater accountability from tech companies.

Beyond the US and UK: Global Implications

The implications of this ruling extend far beyond the US and UK. Campaigners, including the Duke and Duchess of Sussex, have hailed the verdict as a “reckoning,” urging platforms to prioritize child safety over profit. In Brussels, the European Commission’s digital chief, Henna Virkkunen, noted the case would send “a very clear message” about the need for platforms to address the risks they pose.

The Tech Industry’s Reaction

Both Meta and Google have expressed disagreement with the verdict and announced plans to appeal. Google maintains that YouTube is a “responsibly built streaming platform, not a social media site,” while Meta stated teen mental health is “profoundly complex and cannot be linked to a single app.” However, these responses are facing increasing scrutiny as more cases come to light.

Similar Cases and the Future of Tech Litigation

With numerous similar cases pending in US courts, legal experts predict a shift in the landscape of tech litigation. Sacha Haworth, executive director of the Tech Oversight Project, stated, “The era of big tech invincibility is over.” This suggests a potential wave of lawsuits challenging the practices of social media companies.

The Molly Rose Foundation and Calls for Safer Tech

The Molly Rose Foundation, established after the death of 14-year-old Molly Russell, who was exposed to harmful content on Instagram, has welcomed the ruling. It argues that governments need to legislate for safer tech, making safety and wellbeing a prerequisite for tech firms operating within their borders.

Ethical Considerations and Platform Responsibility

Experts such as Thomas Lancaster at Imperial College London emphasize tech companies’ ethical responsibility to enforce their own safety policies. He argues that policies are ineffective if they cannot be adequately enforced, leaving vulnerable users at risk.

Frequently Asked Questions

  • What does this ruling mean for social media users? It could lead to changes in platform design and features aimed at reducing addiction and protecting mental health.
  • Will social media be banned for under-16s in the UK? The government is currently consulting on this possibility, but no final decision has been made.
  • Are other tech companies likely to face similar lawsuits? Yes, with numerous cases pending, other platforms could be held liable for the harms caused by their products.
  • What is being done at the European level? The European Commission is investigating Snapchat over concerns about children’s safety.

Pro Tip: Parents can use the parental control features offered by many devices and platforms to limit screen time and filter content.

Did you know? The jury determined that Meta and Google acted with “malice, oppression, or fraud” in operating their platforms, leading to the punitive damages awarded.

What are your thoughts on the potential for increased regulation of social media? Share your opinions in the comments below!

Explore more articles on digital wellbeing and tech regulation.