New 'Screen Time' Guidelines Bolster Case for Child Safety Guardrails
Jenny Radesky / Jan 21, 2026

Dr. Jenny Radesky acted as chair of the American Academy of Pediatrics (AAP) Council on Communications and Media.

A child on a tablet. (StockSnap)
For many parents and policymakers alike, children’s and teens’ relationships with digital media are a top-of-mind concern right now.
Australia is just weeks into implementing its law requiring that users be at least 16 years old to access social media; the United Kingdom and European Union are considering similar age restriction laws; and at a Senate hearing last week, lawmakers discussed ideas ranging from comprehensive child online safety and privacy standards to social media bans for those under 16 or even 18.
Conversations about youth and media have always been fraught. They reflect our society’s anxiety about the future, older generations’ discomfort with the ways newer technologies disrupt established ways of life, and adults’ tendency to exert control when they are worried about kids rather than supporting autonomy (which is associated with better youth mental health outcomes).
Conversations are also dominated by simplistic concepts like “screen time.” As I argued in my recent written testimony for the Senate Commerce Committee, this framing ignores both mechanisms of risk and precise solutions. Key aspects of those mechanisms are the business models, design practices and engagement goals that dominate the products popular with youth, yet misalign with their developmental needs.
Adding to the conversation, the American Academy of Pediatrics (AAP) on Tuesday released its newest digital media policy statement for children and adolescents. Although I am not an author of these guidelines, I acted as chair of the AAP Council on Communications and Media and oversaw the policy writing process. In addition to providing guidance for pediatricians and families on healthy media use, it called for:
- prioritizing children’s health and well-being over engagement-based designs that encourage excessive use, datafication and commercialization;
- platforms used by minors to recognize when children are using their products, to have child safety teams, and to maintain a governance structure in which these teams have power and report directly to company leadership;
- digital companies to include safety and privacy features as the default setting, including turning off autoplay, not targeting advertising to minors, providing options to turn off algorithmic feeds or content that has not undergone human review, preventing harmful content from being displayed to minors, minimizing designs that prolong engagement, and turning off chat features; and
- funding nonprofit, child-centered media such as PBS KIDS.
While often called the “screen time guidelines,” the latest entry heralds a paradigm shift, emphasizing that we can’t just think of screen time in a vacuum. Whether youth have a healthy relationship with screens is shaped by systems-level forces like childcare access, school-device policies or afterschool-program availability.
Another one of those systems is the digital ecosystem.
The new AAP policy statement asks the question: are these designs actively centering youth developmental needs and well-being, or engagement metrics, platform profits and market share?
Pediatricians are caring for kids in the midst of a child mental health crisis and high rates of parental burnout. Parents feel overwhelmed by managing tech that wants so much of their children’s time and attention, while many teens struggle to maintain healthy relationships with technology or feel that they spend more time online than they want. We take care of infants who are watching hours of YouTube nursery rhyme videos per day, children being contacted by strangers in online video games, and teens being fed posts that make them feel worse about their bodies, “brain rot” content or gambling ads.
Pediatricians would love our patients to sleep more, read more and have strong family relationships — and crucially, we think better tech design is an important way to make that happen.
The idea of child-centered design isn’t new. The 5Rights Foundation and the Digital Futures for Children centre in the UK, the international group Designing for Children’s Rights and UNICEF have been translating this idea into policy and design guidance for years.
At its core, developmentally-aligned design embraces the idea that children have digital rights to participate safely in the online world, free from exploitation or harmful contacts and content.
The best mechanism for this is a duty of care, which is why the AAP has advocated for the Kids Online Safety Act and COPPA 2.0, which would help shift the design of the digital ecosystem to be less data-extractive and engagement-based and more accountable to the types of harms pediatricians see in their clinics.
Platforms need to care that youth are using their products, and they need to measure and mitigate harms that are clearly defined, so they can’t be reinterpreted when the political tides shift. So that the government isn’t coercing specific design changes, platforms can prevent and mitigate harms however they want, for example by reducing engagement-prolonging designs that invade important spaces like school, sleep or family time, and KOSA leaves room for these solutions to shift naturally as tech evolves. For example, new research out of the Netherlands suggests that longer daily use of the short-form, feed-based social media designs that have evolved over the past five years is linked with worse well-being, self-esteem and friendship closeness, while use of chat-based platforms correlates with greater friendship closeness. A duty of care would have mandated that platforms such as TikTok, Instagram and YouTube examine the risks of engagement-optimized short-form feeds before releasing them to millions of United States youth.
The important part is centering on the question: how are kids doing on our platform?
Mechanisms of accountability are important: when platforms look for the problems happening on their sites and are transparent about how they are mitigating them, it creates a culture of quality rather than avoidance. As physicians who regularly practice “QI” (quality improvement) in our practices and hospitals, pediatricians know this process well. Imagine the day when platform experiments are designed around optimizing well-being rather than engagement, and results are shared rather than stifled internally until whistleblowers release them.
Social media age restrictions neither meet these policy goals nor acknowledge the developmental reasons why teens are driven to connect with peers and culture. While they may seem like a cut-and-dried solution to policymakers, they omit aspects of the larger digital ecosystem where youth spend time (such as video games) and fail to create an ongoing regulatory framework with accountability processes, which is sorely needed in the US. We need a market with competition between humanely designed platforms, not just competition for attention.
I am open to examining the data out of Australia as it puts its own age restriction law into practice. This could help us understand whether limiting social media platforms’ access to teens leads to better outcomes, or instead more use of alternate, unsafe platforms such as AI companions.
In the absence of that data, I agree with the AAP, leading nonprofits, and survivor parents who want KOSA passed, and I think the latest guidelines buttress this case. Kids deserve a digital world that respects their right to participate while also respecting their time, attention, emotions and privacy, so that they can lead their lives with agency and purpose.