
Co-Regulation in Child Protection in the Digital Space

Source: KOMPAS (translated from Indonesian) | Regulation

The steps taken by the Ministry of Communications and Digital Affairs (Komdigi) in strengthening child protection in the digital space merit recognition. The state can no longer be merely a spectator amid the rapid penetration of the internet among children and adolescents.

Digital platforms are no longer simply a means of communication: they also influence children’s psychological development, shape patterns of social interaction, and affect the consumption behaviour of young people.

It was therefore appropriate for the Communications Minister to issue Ministerial Regulation Number 9 of 2026 regarding the Implementation of Government Regulation Number 17 of 2025 on the Administration of Electronic Systems in Child Protection (PP Tunas).

Judging from the public consultation (socialisation) of this regulation during its drafting phase, Komdigi has introduced a risk assessment framework for digital services used by children, built on several indicators. These include interaction with strangers, exposure to harmful content, exploitation of children as consumers, personal data security, digital addiction, psychological disturbance, and physiological impact.

Based on these indicators, digital services can subsequently be categorised within specific risk profiles. Such an approach is known as risk-based regulation, a model that aligns industry obligations with the level of risk generated by an activity or product.
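The risk-based approach described above can be sketched in code. The following is a minimal illustrative model only: the indicator names are taken from the article, but the scoring scale, weights, and tier thresholds are hypothetical assumptions, not the official Komdigi scheme.

```python
# Hypothetical sketch of risk-based categorisation of digital services.
# Indicator names follow the article; the 0-3 scoring scale and the
# tier thresholds below are illustrative assumptions only.

INDICATORS = [
    "stranger_interaction",
    "harmful_content_exposure",
    "consumer_exploitation",
    "data_security",
    "digital_addiction",
    "psychological_disturbance",
    "physiological_impact",
]

def risk_profile(scores: dict) -> str:
    """Map per-indicator scores (0 = none, 3 = severe) to a coarse risk tier."""
    total = sum(scores.get(name, 0) for name in INDICATORS)
    if total >= 15:
        return "high"
    if total >= 8:
        return "medium"
    return "low"

# Example: a short-video platform whose recommendation algorithm drives
# heavy content exposure and habitual use (scores are invented).
example = {
    "stranger_interaction": 2,
    "harmful_content_exposure": 3,
    "digital_addiction": 3,
    "data_security": 1,
}
print(risk_profile(example))  # medium (total = 9)
```

The point of such a model is that obligations scale with the tier: a "high" profile would trigger heavier duties than a "low" one, which is the essence of risk-based regulation.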

Research on regulation by Julia Black (2005) demonstrates that modern regulators increasingly adopt risk-based approaches to prioritise oversight of activities with the greatest potential impact on public interest. A similar approach is evident in various global regulations. The European Union’s Digital Services Act requires large digital platforms to conduct systematic risk assessments of their service impacts, including child safety, harmful content dissemination, and effects on democratic processes.

However, the principal challenge emerges at the implementation stage. Digital platforms do not share the same design; business models and technology architectures vary considerably. Short-video platforms rely on content recommendation algorithms, private communication platforms depend on direct messaging, and gaming platforms have competition systems and reward mechanisms. These differences shape the kinds of risk that emerge.

This condition is known as the implementation gap—the gap between policy objectives and their implementation reality in the field. The classical study by Pressman and Wildavsky (1973) demonstrates that well-intentioned policies often face difficulties when implemented due to the complexity of coordination between actors and limitations in implementation capacity. This underscores the importance of collaborative approaches in formulating and implementing digital policy.

The regulation of modern platforms can no longer be conducted by the state alone. Robert Gorwa (2023) in the journal Policy & Internet demonstrates that practices in regulating digital platforms are formed through interaction among various actors, including government, technology companies, and civil society. Terry Flew and Nicolas Suzor (2023) in the journal New Media & Society likewise demonstrate that digital platform regulation increasingly moves towards a co-regulation model—an approach combining the role of the regulator with the operational responsibility of platforms in managing digital risks.
