
Child Online Safety Law Clears the US Senate, But Faces Uncertainty in the House

Gabby Miller / Jul 30, 2024

Senate press briefing following the passage of the Kids Online Safety and Privacy Act on July 30, 2024.

After a years-long battle in the US Senate to pass legislation to protect children online, and with less than a week to go before August recess, the Kids Online Safety and Privacy Act (KOSPA) cleared the chamber in a 91-3 vote on Tuesday. It was a widely expected outcome after a cloture vote late last week, where the Senate voted near-unanimously to hold a floor vote without debate or amendment, clearing the way for the bill’s passage.

Senate Majority Leader Chuck Schumer (D-NY) gave floor remarks following the vote, where he applauded the Senate for not letting partisanship get in the way of “its promise to every parent who has lost a child because of the risks of social media.” He also urged the House to pass KOSPA “as soon as they can.”

The Act combines amended versions of the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA), a pair of bipartisan bills that had struggled to reach the Senate floor for a full vote. The combined package makes some definitional and other changes to both bills, and also folds in the Filter Bubble Transparency Act.

COPPA is a 1998 law, enforced by the Federal Trade Commission (FTC), that gives parents greater control over what information online service providers can collect on minors and imposes certain requirements on operators of websites or online services directed to children under 13 years of age. An update to the law (COPPA 2.0) was reintroduced by Sens. Ed Markey (D-MA) and Bill Cassidy (R-LA) last May to “stop the data practices fueling today’s youth mental health crisis.” The updated bill bans targeted advertising to kids, creates an “eraser button” that allows parents and kids to delete their personal information when possible, and establishes a “Youth Marketing and Privacy Division” at the FTC.

Since 2021, when KOSA was first introduced in the Senate by Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), there’s been a steady push to build broad support for the bill. Its “duty of care” provision requires platforms to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate” harms to minors, including mental health disorders, addictive-like behaviors, and violence or online bullying.

KOSA also requires platforms to establish safeguards for minors, such as enabling the most protective privacy and safety settings by default, and to give parents more tools to manage minors’ privacy and account settings, including the ability to disable or limit “addictive features” such as personalized algorithmic recommendations.

Despite bipartisan support for KOSA, some advocates and civil society groups starkly oppose the legislation, and many remain opposed despite lawmakers’ efforts this spring to address their concerns. Among them is Stop KOSA, a project of the group Fight for the Future, which maintains that the bill’s expanded use of parental monitoring and age verification tools is “needlessly invasive” and, in the wrong hands, could endanger access to “online content about LGBTQ resources, reproductive healthcare, and other lifelines for marginalized communities.” The initiative had support from nearly sixty organizations heading into the floor vote on Tuesday, including the American Civil Liberties Union and the Center for Democracy and Technology (CDT).

Last week, ahead of the cloture vote, CDT Policy Analyst Aliya Bhatia said in an emailed statement that the Senate should let “productive negotiations to improve KOSA” continue rather than push through a rushed bill that could “harm the very children it aims to protect.”

The ACLU was also quick to condemn KOSPA after its final passage Tuesday. “As state legislatures and school boards across the country impose book bans and classroom censorship laws, the last thing students and parents need is another act of government censorship deciding which educational resources are appropriate for their families,” said Jenna Leventoff, senior policy counsel, in a press release calling on the House to block the bill this fall, when Congress reconvenes and a companion bill will be up for consideration.

Industry groups like the Information Technology and Innovation Foundation (ITIF) disapproved of the bill’s passage, too. “This country needs children’s online safety and privacy legislation that strikes the right balance between protecting consumers without infringing on their free speech rights or stifling innovation, which KOSA and COPPA 2.0 fail to do,” according to ITIF Senior Policy Manager Ash Johnson. She added that banning targeted advertising to users under 17, as COPPA 2.0 does, would cut off revenue for ad-supported services.

Advocates for KOSPA’s approach to child online safety legislation, however, are optimistic about the bill’s potential. “We are particularly excited by the possibility that additional research is possible,” said Children and Screens Executive Director Kris Perry. “We could go from where we are today, with data that’s opaque and unavailable, to a more transparent, easily studied environment where researchers could go in and really start to look at how these design changes are impacting children.” Children and Screens, which aims to shape public policy on digital media and child development, released a first-of-its-kind report earlier this year assessing the impacts of the United Kingdom’s Age Appropriate Design Code, another legislative approach to keeping the internet safer for kids that has been adopted by US states such as California and Maryland.

The Tech Oversight Project, a Big Tech accountability group, characterized the bill’s passage as a win for kids and parents, and a sharp rebuke to Silicon Valley. “The Kids Online Safety Act will shift responsibility to platforms, demanding that companies stop prioritizing engagement metrics and ad revenue over children’s well-being, and instead give young people and parents better controls and a safer online experience,” said Sacha Haworth, executive director of TOP, in a statement. She also framed the vote as a win for holding companies like Meta, TikTok, and Snap accountable in order to protect America’s youth.

While the Kids Online Safety and Privacy Act has officially cleared the Senate, a companion bill must also pass the House, which is in recess until September. After last week’s cloture vote, House Speaker Mike Johnson (R-LA) said he was “committed to working to find consensus in the House.”

