
Are Children’s Online Safety Bills the Future?

John Perrino / Feb 16, 2023

John Perrino is a policy analyst at the Stanford Internet Observatory.

Log Off founder Emma Lembke testifies in a Senate Judiciary Committee hearing on online safety on February 14, 2023 at the U.S. Capitol in Washington, DC.

At a Senate Judiciary Committee hearing on children’s online safety on Tuesday, senators attempted to breathe new life into past legislative proposals and highlight bipartisan momentum to take new action on issues ranging from social media’s impact on mental health to child sexual abuse material (CSAM), cyberbullying, human trafficking and illicit drug sales.

The nearly three-hour hearing featured witnesses from civil society groups, academia and activists focused on these issues. Senators on both sides of the aisle reiterated calls for Section 230 reform that would remove liability protections for harmful user content, particularly CSAM, and discussed the challenge of monitoring end-to-end encrypted messaging tools. The EARN IT Act, legislation designed to address the online sexual exploitation of children that was reintroduced last year, received more than a dozen mentions.

An alternative path forward also came to the fore with discussion of the potential for safety by design legislation, such as the Kids Online Safety Act, and proposals to address gaps in federal law for reporting and taking action on non-consensual intimate imagery (NCII) and CSAM.

“We have to give kids and parents — yes both kids and parents — the tools, transparency and guardrails they need to take back control over their own lives,” said Sen. Richard Blumenthal (D-CT). “That is why we must and we will double down on the Kids Online Safety Act.”

Senate Judiciary Chairman Dick Durbin (D-IL) used the occasion to release a draft bill, the STOP CSAM Act, that he says would make reporting CSAM easier for victims and nonprofit youth safety organizations.

Senators found unanimous support for legislative proposals such as the EARN IT Act among the invited witnesses; no critical voices from industry or civil liberties organizations were in the room to raise concerns over implications for privacy and security. Many civil liberties organizations oppose the EARN IT Act on those grounds, including its potential implications for encrypted messaging applications.

Potential reform to Section 230 dominated the hearing, but content-focused policy proposals face thorny challenges, including politicization and First Amendment concerns.

Despite the apparent consensus at the hearing, legislation may continue to stall amid a more complex dialogue outside the Capitol over how to balance protecting children’s privacy and access to information with giving users more control over their online experiences, all while preserving free expression and digital rights.

Sen. Sheldon Whitehouse (D-RI) cautioned that there is a partisan split over how Section 230 should be reformed, driven by different views of the problem. Republicans generally want platforms to permit more speech while Democrats want platforms to take more action against potentially harmful speech.

As an alternative to chipping back protections for online user content, new legislative proposals aim to address the design of social platforms with safety guardrails, strong default privacy and safety settings, internal risk assessments, and external auditing and transparency requirements.

This safety by design approach, introduced by Australia’s eSafety Commissioner and modeled on the United Kingdom’s Age Appropriate Design Code, was recently adopted in California but now faces a legal challenge from the tech industry. Similar bills have since been introduced in Maryland, New Mexico, New Jersey, New York and Oregon.

The Kids Online Safety Act, introduced by Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), had a strong push for inclusion in last year’s must-pass federal spending bill, but faced strong civil society and industry opposition.

The bill would require social media platforms to set default privacy and security settings to the strongest option for children 16 or younger and provide access to parental control settings. It would also implement a program for independent researchers to access social media data to study effects on young users.

“When you have a majority of children that are experiencing adverse impacts from social media platforms, you have to step in and do something. And that is what we are working to do,” said Sen. Marsha Blackburn (R-TN).

Notably, senators on both sides of the aisle raised proposals for transparency, researcher access and funding to study the effects of social media on young users, which could help refine obligations under a duty of care.

Weighing the costs and benefits of any legislation designed to regulate technology always requires making difficult judgments, often with imperfect information. But when it comes to harms to children and teens, evidence is piling up.

A CDC report released on Monday found that “teen girls are experiencing record high levels of violence, sadness, and suicide risk.” The report was raised by Sens. Blackburn and Blumenthal and addressed by American Psychological Association Chief Science Officer Mitch Prinstein, who said it underscored the urgent need for federal funding to conduct “research on the effects of social media on mental health.”

Prinstein and youth advocate and Log Off Movement Founder Emma Lembke highlighted pop-ups about time spent on platforms, turning off auto-play, and muting notifications as clear actions for platforms to address addictive features.

Prinstein added that “there is clearly a dependency on social media which we can see in kids suffering from many of the same symptoms that we see in the DSM, the diagnostic manual for addiction to substances. It seems to apply quite well to the description of kids' behavior and dependency on social media.”

Sen. Chris Coons (D-DE) promoted his Platform Accountability and Transparency Act as a measure to inform future online safety legislation. “We have limited research about exactly what the effects are of the design choices that social media platforms are making on childhood development and on children's mental health,” Sen. Coons said.

"Transparency and researcher access is a critical piece of the equation and we shouldn't have to rely on courageous whistleblowers like Frances Haugen to understand what the companies already understand," Fairplay Executive Director Josh Golin told Sen. Coons.

Sen. Josh Hawley (R-MO) announced two children's online safety bills. The first would bar users under age 16 from social media, and the second would fund a government study on the effects of social media on children.

Senators disagreed over the need for ID-based age verification and bans on social media use for children under 16.

Lembke cautioned that children always find a way around restrictions to access technology and suggested that safety guardrails for users would be more effective than a ban on social media access.

Notably, privacy legislation was nearly absent from senators’ remarks during the hearing. That issue falls within the Senate Commerce Committee’s jurisdiction, but there remains a divide between the House and Senate over whether to address children’s privacy and safety before passing comprehensive privacy legislation.

In his concluding remarks, Chairman Durbin said a markup on children's online safety legislation will be held soon and highlighted the bipartisan collaboration across the committee. Whether there is enough momentum to pass reforms in a divided Congress remains to be seen.
