Perspective

Congress’ Child Safety Bills Sound Good. Families Suggest They Won't Work.

Aliya Bhatia, Michal Luria / Mar 5, 2026

The United States Capitol building. (Martin Falbisoner)

The United States House Energy & Commerce Committee on Thursday will mark up bills, first discussed at a December hearing, that aim to protect children online. The bills run the gamut from funding initiatives to increase research and education on online platforms’ impacts, to restricting children’s access to certain online features, to mandating that online services and app stores verify users’ ages.

The committee is right to tackle the needs of kids, teens and their families, but it risks advancing bills that, while well-intentioned, may be neither effective nor in line with what some parents and teens actually want.

While polls have shown that people support passing some sort of kids’ safety legislation, including “bans” on social media, we conducted research that put these claims to the test. We took 15 proposed interventions from various bills, put them into visual scenarios for context, presented them to 45 teens and parents and asked them to reflect and discuss.

One scenario walked families through taking a selfie to estimate a family member’s age before they could gain access to a new application, a method both parents and teens overwhelmingly dismissed as ineffective or inaccurate. Another walked through equipping parents with the ability to approve contacts before teens could message them, which many parents described as burdensome. When teens and parents imagined actually uploading a photo for age verification or approving every new contact, enthusiasm often cooled. In its place, concerns about privacy, security and unintended consequences quickly surfaced. Policies that sounded reassuring at first began to feel intrusive or impractical in daily life.

While this research is just one depiction of what parents and teens want, it nonetheless serves as a rare qualitative snapshot of how parents and teens think about policy approaches to protect children once they walk through what those approaches mean in practice. Our qualitative interviews do something that polls cannot — they move beyond quick responses and allow people to deeply consider how these policies would actually play out in their own lives.

In our research we asked teenagers and their parents what they would have wanted from online safety interventions. Their answers were consistent: more flexibility to match individual and family needs, and well-designed tools to control their online experiences.

Here’s what families had to say:

Users want flexibility

Policymakers are considering how and where age verification should occur before users are granted access to online services, but the parents and kids we spoke to said families are the ones who should decide which apps their children access. These parents and teenagers asserted that they often jointly decide to download apps and, at times, circumvent age restrictions — for example, to use Zoom. Parents said they wanted discretion to decide which apps their teenagers access, based on maturity and mutual trust.

Certain bills don’t reflect that preference. The most prominent age verification bill before the committee, the App Store Accountability Act, requires app stores rather than individual applications to verify age via a “commercially available method.” While the bill also requires app stores to obtain parental consent for app use, it risks limiting user discretion by having app stores decide which applications are available. Moreover, the bill’s requirement that app stores use a “commercially available method” to verify users’ ages runs counter to these parents’ stated desire for options in how to designate their children’s ages.

This matters because parents and teenagers we spoke with were skeptical of commonly used age estimation methods, particularly those based on face scans. One teenager said, “If I downloaded an app and it told me I had to take a picture of myself for my age, I would think it’s a scam.”

These parents and teens overwhelmingly preferred flexibility in choosing apps for teens, along with minimization of the data necessary to provide age-appropriate experiences. It’s a shame, then, that instead of pushing for approaches like the Parents Over Platforms Act (POPA), which give parents and teens flexibility and heed families’ preferences for less invasive methods of determining age, the committee is pushing forward one-size-fits-all age verification requirements.

Users want access and controls

While parents and teens alike registered a desire for choices in how they engage online, the committee is considering limiting access to features and the recommendation of some content it deems harmful. Because what counts as harmful to a child depends on the child and differs from beholder to beholder, this approach creates an extraordinary requirement for companies, not parents or kids themselves, to decide what kids should and should not access. Two examples are the Safe Messaging for Kids Act, which would ban disappearing messages specifically, and the latest derivative of the Kids Online Safety Act, now rolled into the Kids Internet Digital Safety Act (KIDS Act), which, while greatly improved in the House, still risks overcensoring content recommended by algorithmic feeds.

Contrary to these bills’ approaches, teenagers in our research generally reported satisfaction with existing platforms and trusted algorithmic feeds to show interesting and appropriate posts. As one teen put it, “part of what [the algorithm] shows you is up to you [to correct]. It’s not all on the app.” Teenagers unanimously preferred algorithmic feeds to chronological feeds, or at least wanted the ability to choose between the two. Overall, teens spoke confidently about the tactics they use to control their feeds.

Parents were primarily concerned with “endless scrolling,” or never-ending content recommendations. Levers to limit the volume of content, alongside granular tools to define what content the feed recommends, were more appealing to them than outright bans on recommendation feeds.

User privacy is top of mind for parents and teenagers alike

Parents and teenagers repeatedly raised privacy issues, expressing distaste for invasive parental tools and highlighting concerns with security breaches and platform data collection.

The committee should address these concerns by prioritizing bills tackling privacy online, preferably for all users but at least for children. Possibilities include an improved version of the Children’s and Teens’ Online Privacy Protection Act, which limits data collection and retention for users under 15 and ensures critical state-level protections remain in place; the Don’t Sell Kids’ Data Act of 2025, which generally prohibits third-party actors from collecting, selling or transferring personal data of minors; or a version of the Parents Over Platforms Act (POPA), which minimizes the data collected about users’ ages to provide useful signals to apps without impeding privacy or free expression. While some bills noticed today call for a study in service of limiting the collection and use of children’s data, stronger limits in the form of the Don’t Sell Kids’ Data Act of 2025 and user-choice-focused approaches to age verification like POPA remain, according to the notice on the committee website, off the markup agenda.

Parents’ and teenagers’ views can vary widely, and lawmakers should advance solutions that allow for this reality. While we don’t cover all the bills likely to be discussed today, a few of the one-size-fits-all mandates highlighted here risk flattening the very differences families are asking policymakers to respect. Families we talked to wanted flexible, well-designed tools they could tailor to their own values, kids and circumstances. If Congress truly wants to protect kids online, it should start by building policies that work alongside how families actually function.

Authors

Aliya Bhatia
Aliya Bhatia is a policy analyst at the Center for Democracy & Technology’s Free Expression Project. She works to protect and promote internet users’ free expression rights in the United States and around the world. Her areas of focus include automated content moderation, kids safety, and speech by ...
Michal Luria
Dr. Michal Luria is a Research Fellow at the Center for Democracy & Technology, specializing in qualitative and human-centered design research. She holds a Ph.D. in Human-Computer Interaction from Carnegie Mellon University. In her work, Dr. Luria explores and critiques interactions with emerging te...
