Josh Simons: Meta & Google Not ‘Serious’ About Social Media Safety

by Chief Editor


Josh Simons, a former Cabinet Office minister, has hit out at tech giants.

The landmark ruling holding Meta and Google liable for social media addiction is likely to trigger a wave of legal challenges and a fundamental shift in how tech companies approach platform design. The case, resulting in a $6 million judgement, has ignited debate about accountability and the potential for stricter regulation.

The Rise of Addiction Litigation

The recent verdict, where a jury found Meta and Google negligent in designing addictive platforms, sets a key precedent. Thousands of similar cases are now expected to follow, potentially costing tech giants billions. This isn’t simply about financial penalties; it’s about forcing a reckoning with the psychological impact of social media.

From AI Ethics to Accountability

Josh Simons, a former minister who previously worked on AI ethics at Meta, has been vocal about the issue. He resigned from Meta because recommendations from his team weren’t being followed, stating that the companies prioritized engagement and revenue over user wellbeing. Simons believes tech bosses haven’t been “serious” about safety.

His experience highlights a critical tension: the inherent conflict between maximizing user engagement – the core of the social media business model – and protecting vulnerable users from addictive behaviors. Simons’ testimony in the case underscores the internal awareness of these harms within Meta itself.

The Hunt for Confidential Data and Journalistic Integrity

Separately, Simons has faced scrutiny over the targeting of journalists. A PR firm, APCO Worldwide, was commissioned by Labour Together, the think tank Simons formerly led, to investigate journalists following a story about undeclared donations. The report included information about a journalist's Jewish faith and ideological leanings, raising serious questions about press freedom and ethical boundaries.

Simons has apologized for commissioning the report, stating he was “naive” and never intended for journalists to be investigated. This incident adds another layer to the scrutiny of those in positions of power and their relationship with media oversight.

Government Responses and Potential Bans

Governments worldwide are grappling with how to address the potential harms of social media, particularly for young people. In the UK, ministers are currently consulting on a potential ban on social media for under-16s and piloting restrictions on tech use in schools. The US ruling is likely to add further momentum to these discussions.

The Future of Platform Design

The legal pressure and public scrutiny are likely to force tech companies to rethink their platform designs. Expect to see:

  • Increased Transparency: Greater disclosure of algorithms and data collection practices.
  • User Control: More robust tools for users to manage their time on platforms and filter content.
  • Age Verification: Stricter age verification measures to prevent underage access.
  • AI-Driven Safeguards: Utilizing AI to detect and mitigate addictive behaviors.

The Role of Regulation

Although self-regulation by tech companies is possible, many believe stronger government intervention is necessary. This could include stricter data privacy laws, regulations on algorithmic amplification, and increased liability for harmful content.

FAQ: Social Media Addiction and Legal Recourse

  • What does the Meta and Google ruling mean for users? It opens the door for individuals harmed by social media addiction to seek legal redress.
  • Could social media be banned for under-16s in the UK? The government is currently consulting on this possibility.
  • What is Labour Together’s role in the journalist investigation? The think tank commissioned a PR firm to investigate journalists who reported on its funding.
