Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), the Chair and Ranking Member of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, today unveiled the Kids Online Safety Act, which would create a new regulatory regime to govern how minors interact with technology platforms.
“This measure makes kids’ safety an internet priority,” said Senator Blumenthal in a statement. “Big Tech has brazenly failed children and betrayed its trust, putting profits above safety. Seared in my memory—and motivating my passion—are countless harrowing stories from Connecticut and across the country about heartbreaking loss, destructive emotional rabbit holes, and addictive dark places rampant on social media. The Kids Online Safety Act would finally give kids and their parents the tools and safeguards they need to protect against toxic content—and hold Big Tech accountable for deeply dangerous algorithms. Algorithms driven by eyeballs and dollars will no longer hold sway.”
“Protecting our kids and teens online is critically important, particularly since COVID increased our reliance on technology,” said Blackburn. “In hearings over the last year, Senator Blumenthal and I have heard countless stories of physical and emotional damage affecting young users, and Big Tech’s unwillingness to change.”
The proposed legislation would establish a “duty of care” requiring that platforms “act in the best interests” of minors, preventing “heightened risks of physical, emotional, developmental or material harms to minors posed by materials on, or engagement with, the platforms,” with a particular focus on these specific harms:
1. Promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor;
2. Patterns of use that indicate or encourage addiction-like behaviors;
3. Physical harm, online bullying, and harassment of a minor;
4. Sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material;
5. Promotion and marketing of products or services that are unlawful for minors, such as illegal drugs, tobacco, gambling, or alcohol; and
6. Predatory, unfair, or deceptive marketing practices.
The bill requires platforms to provide "safeguards for minors," giving "a minor, or a parent acting on a minor's behalf," "readily-accessible and easy-to-use safeguards" to govern their experience using the platform and what happens to their personal data. These safeguards include limitations on the ability of "individuals," particularly "adults with no relationship to the minor," to contact the user; the prevention of data sharing; limitations on features that might drive addictive or unhealthy use of the platform; an opt-out of "algorithmic recommendation systems that use a minor's personal data;" the ability to delete an account and remove all personal data; restrictions on geolocation; and limits on time spent on a particular platform.
It also requires covered platforms to give parents "easy-to-use parental tools" to "appropriately supervise the use" of platforms by minors. These include the ability to control privacy and commerce settings, the safeguards mentioned above, time spent on the platform, and more. Notably, platforms would be required to "provide clear and conspicuous notice to a minor when parental tools are in effect."
The bill introduces a range of transparency measures, including public reporting and accounting related to the use of platforms by minors, the dissemination of “illegal or harmful content involving minors,” and other harms. It requires platforms to conduct audits of risks to minors to assess potential harms, such as addiction, as well as the efficacy of safeguards.
A section on "independent research" also establishes a mechanism for "eligible researchers" associated with universities or nonprofits to conduct "public interest research" using platform data; qualified researchers would apply to get access to datasets. A subsequent section puts limitations on the market research that platforms themselves can conduct with regard to minors.
Under the legislation, the National Institute of Standards and Technology would be directed to work with the Federal Communications Commission, the Federal Trade Commission (FTC), and the Secretary of Commerce to "conduct a study evaluating the most technologically feasible options for developing systems to verify age at the device or operating system level." The bill also creates the "Kids Online Safety Council," which would be convened by the Secretary of Commerce and include a variety of interest groups, from parents to tech executives to government officials, as well as "youth representation."
Ultimately, the FTC is given responsibility for enforcing the provisions of the proposed Act, while the authorities of state attorneys general are also preserved.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.