Indonesia Joins Global Trend: Restricting Social Media Access for Teens
Indonesia has become the second country worldwide, following Australia, to implement regulations restricting social media access for individuals under the age of 16. This move signals growing global concern over the impact of social media on youth mental health and well-being.
The New Regulations: What You Need to Know
Effective March 28th, the Indonesian government will prohibit the creation of new accounts on platforms deemed “high-risk,” including YouTube, TikTok, Facebook, Instagram, X (formerly Twitter), Threads, and Roblox. The decision, signed by Indonesia’s Minister of Communication and Digital Affairs, Mutia Hapsari, aims to protect young people from harms such as exposure to pornography, cyberbullying, online scams, and addiction.
Addressing Parental Concerns
Minister Hapsari emphasized the government’s role in supporting parents, stating that the regulations are designed to alleviate the burden of battling algorithms alone. “We are taking this action to restore the autonomy of children’s futures,” she explained. Local reactions, as reported by AP, indicate widespread support from parents worried about their children’s unrestricted access to online content. Jakarta resident Mariana expressed concern over the “too much freedom” given to minors on social media, calling for better management of these platforms.
Australia Leads the Way in Youth Online Safety
Australia pioneered this approach in December 2023 with its own restrictions on social media accounts for those under 16. Building on this, Australia is expanding online safety regulations to encompass all online services, including websites, search engines, app stores, games, and AI chatbots. The eSafety Commissioner, Julie Inman Grant, announced that these services must block access to inappropriate content for users under 18, with potential fines of up to AUD 49.5 million (approximately USD 33 million) for non-compliance.
Age Verification and Content Blocking
The Australian regulations require robust age verification processes, moving beyond simple “18+” click-throughs. For example, searches related to suicide, self-harm, or eating disorders will prioritize mental health support services in search results. This proactive approach demonstrates a commitment to creating a safer online environment for young people.
South Korea’s Approach: Dialogue and Data Collection
Unlike the more restrictive measures taken by Indonesia and Australia, South Korea is currently focusing on gathering data and engaging in dialogue with stakeholders. Broadcasting and Communications Commission Chairman Kim Jong-cheol has stated that a solution cannot be achieved through unilateral regulation, emphasizing the need to understand actual user experiences.
High Youth Social Media Usage in South Korea
Despite this cautious approach, South Korea recognizes the potential risks. A 2023 study revealed that 67.6% of South Korean youth use social media, with 40.1% of those aged 10-19 considered at risk of smartphone addiction. A 2025 study by the Korea Press Foundation found that 70.1% of Korean teenagers use social media daily, with 48.8% being constant users. Research increasingly links social media use to issues like decreased concentration, sleep disturbances, addiction, anxiety, and depression.
The Broader Implications: A Global Shift in Digital Parenting
These developments reflect a growing global awareness of the need to protect children and adolescents in the digital age. Indonesia’s recent decision to temporarily block the AI chatbot ‘Grok’ over the generation and dissemination of explicit deepfake images further illustrates this commitment. Although the ban was lifted after xAI promised improvements, the episode underscored the potential harms of unregulated AI technologies.
FAQ
- What platforms are affected by Indonesia’s new regulations? YouTube, TikTok, Facebook, Instagram, Threads, X (formerly Twitter), and Roblox.
- What is Australia doing to protect young people online? Australia is expanding online safety regulations to all online services, including AI chatbots, and enforcing strict content blocking and age verification measures.
- What is South Korea’s approach to youth social media use? South Korea is currently focusing on data collection and dialogue with stakeholders to inform future policy decisions.
Pro Tip: Parents can utilize parental control features offered by social media platforms and device manufacturers to monitor and limit their children’s online activity.
What are your thoughts on these new regulations? Share your opinions in the comments below!
