Indonesia sets strict age restrictions for children's social media access
President Prabowo has enacted new regulations limiting children's social media access, requiring parental approval and classifying platforms by risk level.
By Anna Fadiah and Hayu Andini
President Prabowo Subianto has officially issued a new government regulation on the governance of electronic systems for child protection. The regulation, enacted on March 28, 2025, introduces strict controls on children's access to social media platforms based on age classifications and parental oversight.
Minister of Communication and Digital, Meutya Hafid, emphasized that the initiative aims to create a safer digital environment for Indonesian children. “We want the digital space to be safe, healthy, and supportive of children's growth. This is more than just a policy—it is a collective effort by our nation,” she stated in Jakarta.
The regulation, known as PP Tunas, was ratified at a special event held at the Merdeka Palace, attended by child protection organizations and youth representatives. It introduces a structured framework to assess platform risks and enforce access limitations based on children's age groups.
Key provisions of Indonesia’s child protection regulation
The newly introduced PP Tunas establishes five primary rules to regulate children's engagement with digital platforms:
- Risk classification of digital platforms: Social media and other digital services will be categorized based on seven risk assessment factors, including exposure to inappropriate content, personal data security threats, potential addiction risks, and the impact on children’s mental and physical well-being.
- Age-based account creation: Platforms must adhere to three distinct age categories—under 13 years, 13 to under 16 years, and 16 to under 18 years—each requiring different levels of parental approval and oversight.
- Mandatory digital education: Social media platforms must educate both children and parents about safe and responsible internet usage.
- Ban on commercial profiling: Platforms are prohibited from using children's data for commercial purposes unless it serves the best interest of the child.
- Enforcement and penalties: Violations of the regulation will result in administrative sanctions, ranging from warnings and fines to service restrictions and access termination.
Meutya Hafid explained that the access restrictions are based on a detailed evaluation of platform risks. Children under 13 will be limited to low-risk platforms, while those aged 13 to under 16 will have controlled access to services classified as having low to moderate risks. Meanwhile, children aged 16 and above will be allowed to create accounts with parental supervision until they turn 18.
Global trends in regulating children's social media access
Indonesia’s new social media governance for children aligns with measures implemented in other countries. Many nations have recognized the risks of unrestricted digital access for minors and have enacted or proposed strict regulations.
Australia
Australia recently introduced legislation requiring platforms like Facebook, Instagram, and TikTok to block access for minors or face fines of up to 49.5 million Australian dollars ($32 million). The law includes an enforcement trial beginning in January 2025, with a full rollout scheduled within a year.
Despite existing platform-imposed age restrictions (typically 13 years), Australian child protection advocates argue that enforcement has been weak, with many underage users still accessing social media.
Singapore
Since 2022, Singapore has mandated that social media platforms remove harmful content swiftly or face intervention from the Infocomm Media Development Authority (IMDA). In addition, Instagram has introduced new restrictions, placing under-18 users in a more controlled environment. As of January 2025, teenage users are being automatically shifted to “Teen Accounts” with stricter privacy settings, and attempts to change their age require verification.
European Union
The European Union enforces General Data Protection Regulation (GDPR) rules requiring parental consent for processing data of children under 16. However, individual member states can lower this threshold to 13 years, depending on national policies.
India
Under the Digital Personal Data Protection Act 2023, children under 18 in India must have verifiable parental consent for data processing. While India has not yet imposed social media access restrictions, the government is evaluating measures to curb misinformation and harmful content.
United Kingdom
The UK has yet to implement strict age-specific regulations, but it is actively investigating the impact of social media on children. Digital Minister Peter Kyle has emphasized that child protection is a government priority. The Online Safety Act, whose child-safety provisions take effect later this year, will introduce minimum age requirements and mandate stricter content moderation policies.
Impact of Indonesia’s regulation on digital platforms
With the enactment of PP Tunas, social media companies operating in Indonesia must adapt to new compliance measures. Platforms that fail to adhere to the regulations risk significant penalties, including fines and possible service restrictions.
Indonesia’s decision to impose stricter social media rules reflects a broader global movement to enhance child protection in the digital space. As concerns about data privacy, online safety, and mental health continue to grow, governments worldwide are taking steps to regulate children's access to digital platforms.
Moving forward, the success of Indonesia’s new regulation will depend on effective enforcement and collaboration between the government, digital platforms, and parents.
