New paper questions effectiveness of under-16 social media ban

Researchers argue bans alone miss the mark.

With Australians now getting used to life with an under-16 social media ban, a new Science paper argues that restricting access alone will not make children safer online.

Instead, the authors say digital child safety should be built into platforms through design, with a stronger focus on children’s rights, agency, and well-being.

Rather than treating child safety as a question of bans and controls, the authors call for what they describe as a child-centred, research-driven approach. The paper is written by Sandra Cortesi, Director of Youth and Media at the Berkman Klein Center for Internet & Society at Harvard University, and Urs Gasser, Professor at the Technical University of Munich and a former executive director of the Berkman Klein Center.

Their argument draws on the work of the year-long Frontiers in Digital Child Safety expert group, which examined how digital products can better protect children while still supporting learning, creativity, and participation.

Why the paper pushes back on blanket restrictions

Cortesi and Gasser argue that broad restrictions can appear protective, but often fail in practice.

The paper cites policies such as parental control laws in the US, the UK’s Online Safety Act, school smartphone bans, and Australia’s under-16 social media rules as examples of measures shaped by concerns about risk.

The authors say these approaches can flatten important differences between children of different ages and stages of development.

They also warn that heavy-handed controls may erode trust, reduce agency, and push technology use out of sight rather than making it safer.

The paper does not dismiss online harms. It notes the risks children can face online, including cyberbullying, harassment, grooming, exposure to harmful content, and platform features that shape attention and habits.

But it argues that protection should not come at the expense of children’s ability to participate in digital life.

Four design approaches

The article sets out four practical areas where platforms, educators, caregivers, and policymakers could focus their efforts:

  • Designing for trust and gradual autonomy: tools that support conversations between children and adults, with responsibilities that expand as children mature.
  • Improving help-seeking and reporting: clearer, more accessible, and more confidential ways for children to report harm or ask for help.
  • Using on-device supports: real-time nudges, prompts, and guardrails that respond when risks appear, without removing choice entirely.
  • Building resilience through education and participation: integrating digital safety into broader learning and involving children in the design of the tools they use.

The authors argue these measures offer a more durable response than prohibition alone. In their view, digital environments should be designed to help children build resilience and confidence, not simply to block access.

What this means for platforms and policymakers

A central theme in the paper is that safety should be treated as a design challenge, not just a regulatory afterthought.

Cortesi and Gasser argue that companies should be expected to demonstrate that child-facing products are safe by design, while policymakers should establish accountability mechanisms to test whether those measures work.

The paper also calls for more transparency, independent audits, and better evaluation of tools such as filters, reporting systems, and AI-based detection features. It argues that many interventions are already in use, but too few are assessed for effectiveness or unintended consequences.

For Australia, the paper lands as debate continues over whether age-based restrictions can deliver meaningful protection on their own. The authors suggest the stronger long-term answer is a combination of smarter product design, education, evidence-based policy, and clearer accountability for platforms.

Evidence gaps remain

The authors also acknowledge the limits of the current research base. They say more longitudinal studies are needed to understand how trust, resilience, and digital habits develop over time, and argue that researchers need better access to platform data.

They note, too, that much of the existing work comes from Europe and North America, leaving major gaps in understanding children’s digital experiences elsewhere. That, they argue, makes the case for more cross-sector and cross-border collaboration among researchers, governments, industry, and children themselves.

For media, policy, and tech audiences, the paper sharpens a debate that is only likely to intensify: whether child safety online is best addressed by keeping children away from digital platforms or by requiring those platforms to work harder to earn children’s trust.
