Newsy Today
news of today
Tag: social media ban

News

Indonesia to Enforce Child Social Media Restrictions from March 2026

by Rachel Morgan, News Editor | March 5, 2026

The Indonesian government is preparing to enforce new age restrictions on social media platforms beginning in March 2026, under Government Regulation Number 17 of 2025, known as PP TUNAS. This regulation, a derivative of Indonesia’s Personal Data Protection Law, aims to enhance child protection in the digital realm.

New Regulations for Digital Platforms

PP TUNAS introduces stricter requirements for Electronic System Providers (PSE) operating in Indonesia. These providers will be obligated to implement age verification systems, restrict access for underage users, and strengthen safeguards to protect children online. Minister of Communication and Digital Affairs Meutya Hafid stated the policy reflects the government’s priority to enhance child protection amid rapid digital expansion.

Did You Know? PP TUNAS was ratified by President Prabowo Subianto on March 28, 2025, and came into effect on April 1, 2025, with a one-year adjustment period prior to enforcement.

Platforms classified as “high risk” under the regulation must either restrict access for users under 16 or implement parental supervision mechanisms. Minister Hafid confirmed that the classification of platforms, technical procedures, and monitoring systems were developed after consultations with stakeholders. She also said the government examined global practices before drafting the regulation.

International Trend Towards Child Protection

Indonesia’s move aligns with a growing international trend to strengthen child protection in the digital world. Several countries are introducing age limits and stricter regulations for social media platforms. Malaysia will prohibit children under 16 from registering for social media accounts starting in 2026. Australia is proposing a ban on social media for children under 16, with significant penalties for non-compliance. Similar measures are being considered or implemented in New Zealand, Norway, the Netherlands, the United Kingdom, and Belgium.

Expert Insight: The increasing global focus on age verification and digital safeguards reflects a growing recognition of the unique vulnerabilities children face online and the need to balance their right to access information with their right to protection from harm.

Under PP TUNAS, all PSEs are required to filter harmful content, provide accessible reporting mechanisms, and ensure swift remediation. The regulation also prohibits profiling children’s data for commercial purposes, requiring platforms to prioritize child protection over commercial interests.

Frequently Asked Questions

When will Indonesia enforce child social media restrictions?

Indonesia will enforce child social media restrictions starting March 2026. Once enforced, access to social media features will vary depending on age.

What is the minimum age for social media under Indonesia’s new regulation?

Under the regulation, high-risk platforms must either restrict access for children aged 13–16 or implement parental supervision systems for those users.

What does PP TUNAS require from digital platforms?

Platforms must implement age verification, filter harmful content, provide reporting mechanisms, protect children’s data, and avoid profiling minors for commercial purposes.

As Indonesia prepares for full enforcement in March 2026, the success of PP TUNAS will depend on the willingness of digital platforms to comply with the new regulations.

World

France considering social media ban as devastated families launch legal action against TikTok

by Chief Editor | February 24, 2026

France Considers Social Media Ban: A Global Reckoning?

The tragic story of Marie Mistre, a 15-year-old French girl who took her own life after being exposed to harmful content on TikTok, has ignited a fierce debate about the responsibility of social media platforms and the protection of young people. Her mother, Stephanie Mistre, is now at the forefront of a landmark class action lawsuit against TikTok in France, alleging the platform served up content promoting self-harm, eating disorders, and suicide.

The Rising Tide of Legal Challenges

The Mistre family is one of seven French families pursuing legal action, a move spurred by devastating consequences: two teenage girls lost their lives, and five others are receiving treatment for severe eating disorders. This legal battle mirrors a growing trend of holding social media companies accountable for the well-being of their users. Similar lawsuits are underway in the US, and five British families are also suing TikTok over a dangerous choking challenge that claimed the lives of five teenage boys.

France’s Potential Ban: A Bold Move

The French government is now considering a ban on social media for children under 15, a measure inspired in part by Australia’s recent actions. A bill has already passed the National Assembly and is currently before the Senate. This potential ban has sparked debate, with some arguing it’s a necessary step to protect vulnerable youth, while others express concerns about limiting freedom and access to peer support networks.

The Algorithm Under Scrutiny

Stephanie Mistre powerfully argues that TikTok isn’t simply a platform hosting content, but actively chooses and prioritizes what young users see. “TikTok is not just a host, it chooses, it selects and it prioritises the content that our children see and that’s what makes it serious,” she stated. This focus on the algorithm – created by humans – is central to the legal argument, suggesting a deliberate curation of content with potentially harmful effects.

Beyond Bans: The UK’s Online Safety Act

While France contemplates a ban, the UK is taking a different approach with its newly implemented Online Safety Act. This legislation aims to hold companies accountable for removing harmful content and could result in fines or even being taken offline if they fail to protect underage users. Ian Russell, whose daughter Molly tragically died after viewing harmful content on social media, believes this approach is more sustainable than outright bans.

The Dilemma of Online Support Networks

Russell highlights a crucial point: social media can also provide vital support networks, particularly for marginalized groups like neurodiverse individuals and LGBTQ+ youth. Bans could inadvertently cut off access to these communities. He emphasizes the need to educate young people about online safety rather than simply removing them from the digital world.

A Global Conversation

The debate extends beyond France and the UK. Several other European countries, including Ireland, are also considering similar measures. Australia’s actions are being closely watched as a potential model for other nations grappling with the impact of social media on youth mental health.

What’s Being Done to Protect Children?

The French government conducted a parliamentary inquiry into the harm caused by social platforms, leading to the declaration of a public health emergency to expedite the introduction of the proposed ban. Laure Miller, a French deputy, credits Australia with “pioneering and inspiring” this movement towards greater regulation.

Frequently Asked Questions

  • What is France considering? France is debating a law to ban social media for children under 15.
  • What is the Online Safety Act? It’s a UK law holding social media companies accountable for harmful content and potentially imposing fines.
  • Are there lawsuits against TikTok? Yes, lawsuits are ongoing in both France and the US.
  • What is the concern about algorithms? The algorithms used by social media platforms are accused of prioritizing harmful content for young users.

Learn More: Explore 7.30 on ABC iview and ABC TV for further insights into this critical issue.

Business

Why young people say they are not the only ones hooked on screens

by Chief Editor | February 8, 2026

Beyond the Bubble Pop: Why Screen Time Concerns Aren’t Just for Teens Anymore

The narrative around excessive screen time has long focused on young people. But a growing conversation, fueled by observations from Gen Z and backed by emerging data, suggests that older generations are just as susceptible to the allure of digital devices. From Candy Crush to endless scrolling, the habits are spreading, prompting a re-evaluation of who’s *really* hooked.

The Accusation Reversal: When Parents Become the Problem

Bailee, 24, experienced a common frustration: being lectured about her phone use by her mother. “My mum is always like, ‘it’s that phone,’ like every single time I do something wrong, she’s like, ‘it’s that phone,’” she told triple j hack. However, a closer look revealed a similar pattern in her mother’s behavior. “My mum’s addicted to Candy Crush,” Bailee observed, noting her mother’s difficulty disengaging even during brief conversations. This experience highlights a growing trend: young people noticing their parents’ own screen time habits.

A Generational Shift in Digital Habits

This isn’t simply anecdotal. A YouGov survey conducted in the United States last year found that over half of adults aged 45 to 64 spend five or more hours daily looking at screens, with one in five estimating between seven and eight hours. While younger adults (18-29) still report the highest screen time – 70% exceeding five hours, and nearly a third hitting nine or more – the gap is narrowing. This suggests a broader societal shift, rather than a problem confined to younger generations.

The Dopamine Loop: Why We’re All Vulnerable

The appeal is understandable. As Bailee pointed out, apps are designed to be addictive, offering “a quick dopamine hit” for everyone. This is particularly true with games like Candy Crush and the endless scroll of social media feeds. The reward cycles built into these platforms are engineered to keep users engaged, regardless of age.

‘Digital Natives’ as Guides: A Role Reversal?

Interestingly, neuropsychologists suggest that younger, “digital native” generations may be uniquely positioned to help older adults recognize problematic screen use. Melbourne-based neuropsychologist Michoel Moshel believes younger people are more comfortable navigating the digital landscape and understanding the manipulative features embedded in technology. “I think there is some place for younger people…to have a remarkably frank conversation with their parents,” he said, encouraging open dialogue about the intentional design of these technologies.

Beyond Time Limits: Recognizing Loss of Control

Dr. Moshel defines problematic screen use as a “loss of control over screen time with negative consequences.” This isn’t just about the *amount* of time spent, but the inability to intentionally manage it. He estimates that around 3-5% of people are clinically addicted to screens, experiencing a genuine inability to disconnect despite recognizing the negative impact.

Social Media Bans and a Search for Balance

The growing concern over screen time has prompted legislative action. Australia recently prohibited access to some social media platforms for children under 16, and countries like Spain, Greece, Britain, and France are considering similar measures. While these bans primarily target teenagers, they reflect a broader societal awareness of the potential harms of excessive screen use.

Finding Alternatives: A Summer Disconnect

For some, disconnecting is proving beneficial. Jazmin, 15, experienced a reduction in screen time after being “booted off” most social media apps before summer. She found herself engaging in more real-world activities and feeling less pressure to stay constantly connected. Her friend, Blaize, 16, also noted a positive impact on his own screen time, even though he wasn’t directly affected by the ban.

Frequently Asked Questions

  • Is screen time really that bad? It depends. Problematic screen use is defined by a loss of control and negative consequences, not just the amount of time spent.
  • Are older generations more addicted than they admit? Many young people believe so, observing similar patterns of engagement in their parents and grandparents.
  • Can younger people help their parents with screen time? Neuropsychologists suggest they can, due to their greater familiarity with digital technology and its manipulative features.
  • What are the signs of problematic screen use? Difficulty controlling time spent, neglecting other activities, and experiencing negative consequences are all indicators.

Pro Tip: Start small. Instead of aiming for a complete digital detox, try setting specific time limits for certain apps or designating screen-free zones in your home.

What are your experiences with screen time, both your own and those of your family? Share your thoughts in the comments below!

Tech

Australia has banned social media for young children. Could the U.S. be next?

by Chief Editor | December 11, 2025

Why Nations Are Rethinking Teens’ Access to Social Media

From the sun‑drenched beaches of Hawaii to the bustling streets of Sydney, lawmakers, parents, and teens are watching a new social media ban for users under 16 take shape in Australia. The policy forces platforms such as Facebook, TikTok, Instagram, YouTube and Snapchat to block minors or risk multi‑million‑dollar fines. While the law is fresh, the conversation it ignited has already crossed oceans and is prompting a wave of future‑focused trends in digital regulation, mental‑health advocacy, and tech workarounds.

Trend #1 – Age‑Based Platform Restrictions Become the Norm

Australia’s under-16 ban is quickly becoming a template. In the United States, at least ten states are drafting similar bills, and Senator Brian Schatz is championing the federal “Kids Off Social Media Act,” which would block children under 13 and ban algorithmic feeds for anyone under 17. A Pew Research Center study found that 73% of U.S. teens say they feel “pressured” to be constantly online, fueling bipartisan calls for age‑based safeguards.

Trend #2 – Mental‑Health Metrics Drive Policy

Data from the World Health Organization (WHO) shows a 15% rise in anxiety disorders among adolescents globally since 2019. Calls for stricter regulations are no longer grounded in moral panic; they are backed by measurable health outcomes. As a result, future bills are likely to reference public‑health thresholds—for example, requiring platforms to report teen‑engagement statistics to an independent health board.

Did you know? In Finland, schools that introduced “digital‑wellbeing weeks” saw a 22% drop in self‑reported stress among students, according to a 2024 Ministry of Education report.

Trend #3 – VPN Use and the “Tech‑Savvy Kid” Phenomenon

Tech experts like Doc Rock warn that any blockade can be sidestepped with a Virtual Private Network (VPN). Recent data from Statista shows that 38% of U.S. teens have used a VPN at least once. Future regulations will therefore likely pair bans with educational campaigns for parents, not just punitive measures for platforms.

Trend #4 – Growth of “Private‑Channel” Communication

Parents like Rep. Lisa Marten argue that “texting, private photo sharing, and face‑to‑face interaction” can replace public feeds. Already, messaging apps with end‑to‑end encryption (e.g., Signal, Telegram) are seeing a surge in teen usage. A UNICEF report notes a 30% increase in private‑app adoption among 12‑ to 15‑year‑olds between 2022 and 2024.

Trend #5 – Platform‑Level “Age‑Gate” Innovations

In response to regulatory pressure, tech giants are piloting AI‑driven age‑verification tools that scan facial features or behavioral cues. While privacy advocates remain skeptical, the push for non‑intrusive verification could become a market differentiator, especially if legislation mandates proof of age before algorithmic content is shown.

What This Means for Parents, Educators, and Policy‑Makers

1. Start Early Conversations: Discuss digital footprints with children before they sign up for any platform.
2. Leverage Parental Controls: Most devices now include built‑in limits for app usage and screen time.
3. Stay Informed About VPNs: Knowing how VPNs work helps you guide kids toward safe alternatives, rather than “forbidden‑fruit” behavior.
4. Advocate for Data Transparency: Demand that platforms share anonymized teen‑engagement data with schools and health officials.

Pro Tip: Set a weekly “digital‑free night” where the whole family puts devices away. Research shows that consistent offline time reduces cortisol levels in teens by up to 18%.

Frequently Asked Questions

Will the Australian ban apply to all social‑media sites?
It currently targets the ten largest platforms, but the legislation allows the regulator to add more services if they reach a certain user threshold.
Can parents legally force their child to stop using social media?
Parents can set household rules, but enforcement varies by state. Some U.S. states are introducing “digital‑guardian” statutes that give parents stronger legal standing.
How effective are age‑verification tools?
Early trials show a 45% reduction in under‑13 sign‑ups, but false positives and privacy concerns remain challenges.
What are the risks of kids using VPNs?
VPNs can mask location, but they also hide risky behavior from parental controls, potentially exposing kids to unmoderated content.
Will banning algorithms improve mental health?
Studies indicate that reduced algorithmic push‑notifications lower screen‑time and improve sleep quality, which are linked to better mental health outcomes.

Looking Ahead: A Balanced Digital Future

The push for age‑based bans is just the first chapter in a longer story about digital wellbeing. As governments tighten regulations, tech companies will innovate with smarter verification, and families will adapt with new communication habits. The ultimate goal is a balanced ecosystem where teens enjoy the benefits of connectivity without compromising their mental health.

Want to stay ahead of the latest trends in teen digital safety? Subscribe to our newsletter for weekly insights, or join the conversation in the comments below.

Explore more:

  • How Social Media Affects Teen Mental Health
  • The Ultimate Guide to Parental Controls in 2025
  • World Mental Health Day – UN Resources
