
Ensuring a Safe Digital Space for Children

Source: MEDIA_INDONESIA | Translated from Indonesian | Regulation

The government prioritises maintaining the quality of education. Through Ministerial Regulation on Communication and Digital Affairs (Permen Menkomdigi) No. 9 of 2026, the government has taken a significant step by restricting social media account ownership for children under 16 years old. This policy underscores the state’s efforts to respond to the growing risks faced by children in the digital space, from exposure to harmful content and cyberbullying to various forms of online exploitation.

This measure deserves appreciation. In recent years, the digital space has evolved into a social arena that shapes interactions, identities, and even value orientations for the younger generation. Data from the Indonesian Internet Service Providers Association (APJII) shows that nearly half of internet users in Indonesia come from the young age group. Children are not only active users of digital technology but also the most vulnerable group to various forms of information manipulation and unhealthy online relationships.

Nevertheless, child protection in the digital space is not solely about restricting access to social media. The challenges are far more complex. Many interactions that affect children occur through private and socially based communication spaces, such as instant messaging apps or closed digital groups.

Several recent cases illustrate this dynamic. Security authorities have discovered teenagers learning to assemble explosive materials from the internet after experiencing prolonged social isolation and bullying.

In another case, a vocational school student was arrested for allegedly being exposed to extremist ideologies through online chat groups containing hate narratives and bomb-making guides. Such incidents demonstrate that the digital space is no longer merely a medium for information distribution but also a space for forming relationships that can influence children’s thinking and behaviour.

DIGITAL RELATIONSHIPS

In digital security studies, the recruitment and manipulation of children rarely happens instantly. It develops through a process known as grooming, which is the gradual building of closeness with the target through seemingly normal interactions.

In the digital space, this process can occur through casual conversations, humour, use of emoticons, and personal attention that makes the child feel accepted in a group. Over time, the established relationship becomes increasingly intense. Older individuals can position themselves as trusted authority figures or mentors.

In such situations, the flow of information received by the child becomes directed. Certain narratives are reinforced, while other information sources are questioned or weakened. At a certain stage, the group’s values and norms can begin to be internalised by younger members.

This process shows that digital risks to children are not only related to the content they consume but also to the social relationships formed within digital platforms.

A child protection approach that only focuses on restricting access to certain platforms becomes inadequate if it does not consider these relational dynamics.

REGULATORY LIMITATIONS

Age restrictions on social media use are an important policy instrument. This regulation provides an initial framework for the state to ensure that digital platforms prioritise child safety.

However, the policy's effectiveness depends heavily on how digital technology is actually used day to day. Age verification mechanisms on many platforms remain relatively easy to circumvent through inaccurate data entry or the use of adults' accounts. This makes age restrictions difficult to enforce consistently.

On the other hand, many digital platforms provide access to content without requiring account creation. Children can still consume various digital materials without going through the age verification system intended by the policy.

Changes in digital behaviour also need to be considered. When monitoring of social media becomes stricter, interactions often shift to more closed communication spaces, such as instant messaging apps or private online communities. These spaces are far harder for platform security systems or law enforcement authorities to monitor.

This situation indicates that age restriction regulations are an important initial step but not sufficient to address the full complexity of risks faced by children in the digital space.

PROTECTION ECOSYSTEM

Child protection in the digital era ultimately requires building a broader ecosystem than just technical regulations. Such efforts must involve families, schools, technology platforms, and state institutions with child protection mandates.

Strengthening digital literacy among parents and educators is an important part of these efforts. Many parents lack an adequate understanding of how their children interact online. Without proper guidance, children can enter risky communication spaces without anyone in their immediate environment realising it.

Digital platforms also need to be encouraged to develop child-friendly service designs. Behaviour-based security approaches can help detect suspicious interaction patterns, including systematic attempts to contact or recruit children into certain groups through online communication.

In addition, cross-sector collaboration between the government, technology companies, educational institutions, and child protection organisations needs to be strengthened. Such cooperation enables faster responses to various evolving digital threats.

Ultimately, the digital space is an inseparable part of the younger generation’s life.

The challenges c
