The Youth Online Safety Movement Needs to Respect Children’s Autonomy
Amelie Ortiz De Leon / Nov 21, 2025
Mark Zuckerberg, founder and CEO of Meta, center right, addresses the audience, including parents of children injured or lost in events involving social media, during a Senate Judiciary Committee hearing in Washington, DC, on Wednesday, Jan. 31, 2024. Photographer: Kent Nishimura/Bloomberg via Getty Images
When Mark Zuckerberg turned to apologize to parents holding photos of children lost to online harms during a Senate hearing last year, the moment captured the national anxiety surrounding kids’ safety online. Today, 96% of teens in the United States use the internet daily, making digital life inseparable from youth itself. Yet while the internet offers children benefits like connection, learning and creativity, those benefits come alongside exposure to real risks, from cyberbullying and eating disorders to sexual exploitation.
In response, lawmakers and advocacy groups have mobilized to hold platforms accountable. Parents have filed lawsuits accusing companies of designing addictive products that endanger children’s mental health, while legislators have pushed for stricter safety standards. Under intense scrutiny, tech firms have raced to roll out new parental control tools, from TikTok’s “Family Pairing” feature to OpenAI’s “Parental Controls”. These features promise to protect young users from harm by enabling parents to monitor and manage their children’s activity, set time limits, filter content and adjust privacy settings.
But beneath this rhetoric of safety lies an underappreciated trade-off: the erosion of children’s digital autonomy. While these safety measures are well-intentioned, they raise concerns about the rights of young users in digital spaces.
Parental controls are built on a familiar logic: adults know best. While they can prevent exposure to harmful content, their effectiveness is inconsistent. Studies show that rule-setting and privacy-invasive monitoring often backfire, heightening tensions between parents and teens. Such tools also promote the idea that young people’s digital habits require containment and control rather than cultivation.
A more promising path lies in co-regulation, offering a collaborative approach that encourages open communication and shared responsibility between parents and teens.
The Nintendo Switch 2 as a stepping stone toward co-regulation
Nintendo’s Switch 2 parental controls app offers a useful window into the video gaming industry’s evolving approach. The company’s family-friendly image and hardware-based business model give it little incentive to chase engagement metrics, unlike most public-facing digital platforms. Social media’s business model, by contrast, hinges on targeted advertising, which means maximizing screen time, collecting behavioral data and sustaining continuous user engagement, especially among its core demographic of teens and young adults.
Nintendo released the parental controls application to complement its gaming console, offering a suite of tools for stricter oversight of children’s use of the Nintendo Switch. The app reflects a model rooted in consumer trust and co-regulation between parent and child. This foundation in Nintendo’s business logic shapes its design philosophy: the features are proactive, transparent and tailored to strengthen parent-child relationships.
While social media tools often adopt a one-size-fits-all approach that leaves little room for customization or parent-child negotiation, the app’s age-based restriction levels (young child, child and teen) reflect the principle of evolving capacities, which recognizes that protection should scale with maturity. Rather than applying uniform controls to all users under 18, the app allows parents to adjust gameplay access based on their child’s developmental stage. Each setting aligns with third-party age ratings, giving parents a structured way to adjust oversight as their child grows. These customization tools are clear nods to the UN Committee on the Rights of the Child’s General Comment No. 14, which calls for protective measures that account for children’s age, identity and maturity, and to General Comment No. 25, which emphasizes the need for age-attuned digital environments that provide adequate pathways for children to safely navigate online spaces.
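To make the idea of protection that scales with maturity concrete, here is a minimal sketch of how age-tiered defaults could be represented in code. The tier names, rating caps, feature flags and time budgets are illustrative assumptions for this sketch, not Nintendo’s actual settings.

```python
from dataclasses import dataclass

# Illustrative age tiers loosely mirroring the app's "young child", "child"
# and "teen" levels. The rating caps, feature flags and time budgets are
# assumptions for this sketch, not Nintendo's actual values.
@dataclass(frozen=True)
class RestrictionTier:
    name: str
    max_content_rating: str   # cap keyed to a third-party rating system (e.g. ESRB)
    in_game_chat: bool        # whether communication features are enabled
    screenshot_sharing: bool  # whether posting captures to social media is enabled
    daily_play_minutes: int   # default screen-time budget

TIERS = {
    "young_child": RestrictionTier("young_child", "E", False, False, 45),
    "child": RestrictionTier("child", "E10+", False, False, 60),
    "teen": RestrictionTier("teen", "T", True, True, 120),
}

def default_tier_for_age(age: int) -> RestrictionTier:
    """Pick a starting tier by age; parents can then adjust each setting individually."""
    if age < 8:
        return TIERS["young_child"]
    if age < 13:
        return TIERS["child"]
    return TIERS["teen"]
```

The point of the structure is that each setting is independently adjustable, so oversight can be negotiated feature by feature rather than toggled on or off wholesale.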
While the application makes strides toward a model that affirms the agency of youth, it continues to lack mechanisms that would center the child as an active rights-holder in the governance of their digital environment.
First, the app has no built-in system for teens to request additional screen time or contest restrictions; TikTok’s “Family Pairing” feature, by contrast, allows teens to request additional screen time within the app. Additionally, certain interaction settings, such as the ability to block in-game messaging or prevent screenshot sharing to social media, raise important questions about proportionality and potential risks to the child’s right to freedom of expression.
While intended to minimize exposure to online risks, these restrictions risk over-correcting, particularly for older teens who may use in-game features to engage creatively or socially with peers. Blocking the ability to share screenshots, for example, could curtail a child’s ability to express themselves publicly and build community, raising concerns about the extent to which the restrictions are “lawful, necessary and proportionate.”
Nintendo’s parental controls application represents a thoughtful and rights-conscious approach to platform safety, especially when contrasted with more punitive, engagement-focused models. It emphasizes progressive realization, parent-child dialogue and age-based customization. However, to fully uphold the rights of the child, the system would benefit from greater participatory infrastructure and reciprocal transparency.
Designing features for children, by children
Parental control features remain a necessary component of digital platforms, offering crucial protections for children in a complex and unpredictable online environment. However, rather than a model that positions children as passive recipients of adult oversight, I advocate for a co-regulatory approach that centers minors as stakeholders in their own digital experiences.
To operationalize this shift, I introduce what I call the CRIB (Choice, Respect, Inclusion, and Balance) toolkit, a rights-based design framework that balances children’s rights with parental involvement:
- Choice: Children should have ways to express their preferences and co-manage their online experiences. Platforms could introduce shared dashboards, allowing both teens and guardians to view usage data and restrictions transparently. Unlike traditional models that grant parents unilateral access to such data, this shared dashboard would include real-time notifications when new restrictions are applied, promoting transparency and mutual trust. A rough sketch of how such a dashboard might work appears after this list.
- Respect: Rules should be fair, developmentally appropriate and contestable. Platforms could create appeals processes that allow older teens to request access to restricted features or content. Restrictions should scale back as youth demonstrate maturity, restoring access to non-sensitive content by default for users aged 16 and up. This developmentally attuned approach affirms children’s growing autonomy while maintaining protective oversight.
- Inclusion: Parental controls can unintentionally harm marginalized youth, especially LGBTQ+ teens in unsupportive homes. Platforms should assess whether their tools restrict access to affirming or educational spaces. One solution could be a “trusted topics” filter, an invisible whitelist that preserves access to verified identity-affirming content without alerting parents. To ensure accuracy and safety, companies should co-design such lists with civil society experts and youth advisory groups. This filter promotes harm reduction and informational access while minimizing the risk of family-level conflict.
- Balance: Digital safety works best when supported by dialogue. Platforms could offer parental education guides that explain threat models, encourage open conversation and contextualize online risks. These materials should shift the focus from monitoring to mentorship, helping parents understand how to talk about digital wellbeing rather than outsource it to an app. By contextualizing online risks, such tools can help align parental concerns with young users’ lived experiences.
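None of these mechanisms require exotic technology. As a rough illustration of how the CRIB principles could translate into platform logic, the sketch below models a shared dashboard in which every restriction change notifies both parties, older teens can file appeals, and a “trusted topics” allowlist stays reachable even when filters are on. All class names, age thresholds and topic entries are hypothetical placeholders, not any platform’s actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical trusted-topics allowlist; in practice it would be co-designed
# with civil society experts and youth advisory groups.
TRUSTED_TOPICS = {"mental_health_support", "lgbtq_resources", "sexual_health_education"}

@dataclass
class Restriction:
    feature: str        # e.g. "screenshot_sharing"
    set_by: str         # "guardian" or "platform_default"
    appealable: bool = True

@dataclass
class Notification:
    recipient: str      # "teen" or "guardian"
    message: str
    sent_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class SharedDashboard:
    teen_age: int
    restrictions: list[Restriction] = field(default_factory=list)
    notifications: list[Notification] = field(default_factory=list)
    appeals: list[str] = field(default_factory=list)

    def apply_restriction(self, restriction: Restriction) -> None:
        """Choice: every new restriction is visible to both parties in real time."""
        self.restrictions.append(restriction)
        for party in ("teen", "guardian"):
            self.notifications.append(Notification(
                party, f"Restriction on '{restriction.feature}' set by {restriction.set_by}"))

    def file_appeal(self, feature: str, reason: str) -> bool:
        """Respect: older teens can contest a restriction, and guardians are notified."""
        if self.teen_age < 13:   # assumption: appeals open to teens 13 and up
            return False
        if any(r.feature == feature and r.appealable for r in self.restrictions):
            self.appeals.append(f"{feature}: {reason}")
            self.notifications.append(Notification("guardian", f"Appeal filed for '{feature}'"))
            return True
        return False

def is_blocked(topic: str, content_filter_on: bool) -> bool:
    """Inclusion: trusted topics stay reachable even when filters are on; in a full
    system these lookups would also be kept out of the guardian-visible log."""
    if topic in TRUSTED_TOPICS:
        return False
    return content_filter_on
```

The design choice worth noting is symmetry: the same event stream feeds both the teen’s and the guardian’s view, which is what separates a shared dashboard from one-way surveillance.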
Assessing impact
Embedding youth participation in design is effective. Studies show that children prefer mediation tools that promote shared responsibility over opaque surveillance. Participatory and gamified systems also improve digital literacy.
The United Kingdom’s communications regulator, Ofcom, for example, piloted a digital etiquette game that taught children principles like “think before you share.” Researchers found measurable improvements in both awareness and behavior, as children became more likely to review privacy settings and reflect before posting. Gamification, when used thoughtfully, transforms abstract safety rules into interactive learning, increasing engagement while reinforcing trust.

Youth-led moderation experiments offer further evidence. On Discord, teen moderators aged 13 to 17 helped manage online communities, improving their conflict resolution and cross-cultural communication skills. These participatory spaces not only nurtured youth capacity but also fostered more respectful community norms. When young people help shape online governance, safety becomes a shared cultural value.
Such examples underscore a central truth: young users are capable co-designers of safer digital spaces. As they face growing pressure from legislators and parents alike, platforms stand at a crossroads between control and collaboration. Companies that invite young users into the design process will help rebuild the trust that technology too often erodes, building digital worlds grounded in dignity and shared responsibility.