KOSA is Good Tech Policy, But the House Has an Opportunity to Make it Even Better
Mariana Olaizola Rosenblat / Aug 7, 2024

Mariana Olaizola Rosenblat is a policy advisor on technology and law at the NYU Stern Center for Business and Human Rights.
Last week the U.S. Senate overwhelmingly passed the Kids Online Safety and Privacy Act (KOSA) in a 91-3 vote. This moves the bill one step closer to becoming the first significant piece of federal legislation to address harms inflicted by online platforms on users — in this case, children. Now it’s up to the House of Representatives to pass an improved version of this historic piece of legislation, which it would need to do in the weeks between its return from recess on September 9 and the general elections on November 5 — a difficult, but not impossible, feat.
Past efforts to regulate online platforms have evaporated in the face of Congressional stagnation and strong opposition from the tech lobby. But KOSA, driven by arguably the most committed and relentless of constituencies — parents who have lost children to harms perpetrated through online media — might overcome both Congressional inaction and potential future legal challenges.
The version of KOSA that passed the Senate would regulate online platform design rather than content. The bill creates a “duty of care” for online platforms, which includes gaming sites, requiring them to take reasonable steps to prevent and mitigate specific harms to children stemming from platform use. These harms, as enumerated by the bill, include anxiety, depression, eating disorders, substance use disorders, suicidal behaviors, online bullying and harassment, the promotion of addiction-like behaviors, and sexual exploitation.
Reasonable steps involve changes to the platform architectures and features that platforms know, or have reason to know, contribute to these harms. These design aspects include limiting addictive product features like infinite scroll, autoplay, platform rewards, and personalized algorithmic recommendations. In addition, the bill requires companies to implement the highest privacy and safety settings by default for accounts they believe belong to minors. As such, the bill does not regulate third-party content, nor does it even regulate the content moderation policies of the platforms, which the Supreme Court recently affirmed are covered by First Amendment protection.
KOSA is a long overdue piece of legislation that has been thoughtfully drafted in consultation with vulnerable groups, such as the LGBTQ+ community, to avoid infringing on freedom of expression and access to information online. Still, influential civil society groups such as the American Civil Liberties Union, the Electronic Frontier Foundation, and the Center for Democracy and Technology continue to oppose the bill, arguing that KOSA would “incentivize social media companies to over-filter content over fear of legal risks” and “lead to broad online censorship of lawful speech.”
These fears are overblown and based on a conflation of content moderation policy and platform design. Platforms’ removal or modification of addictive features like infinite scroll and autoplay has no bearing on their content policies, moderation decisions, or resulting speech online. Similarly, rules that limit the platforms’ ability to target content and advertisements to children and teens are unlikely to infringe on their right to access information. Children and teens can still search for content that interests them; the difference is they will now either have more control over a platform’s personalized recommendation system or the ability to opt out of it altogether. And, with respect to gaming sites, the prohibition of so-called “dark patterns,” or invisible design elements that incentivize children to spend more time or money on their platforms, should not impact the content of the games or discourse around them.
Granted, the Senate version of KOSA is not perfect, and the House has an opportunity to improve the text in at least a couple of ways. Specifically, lawmakers should address concerns about potential misapplication of the duty of care provision to content moderation choices, which could indeed have the effect of silencing disfavored opinions. The bill already provides that the duty of care pertains to “the creation and implementation of any design feature,” but the text could explicitly state that content policies and individual moderation decisions are not considered design features. Additionally, lawmakers could hone the provision on default settings to clarify that where there is a conflict between achieving maximum privacy and maximum safety – as there sometimes is – platforms can exercise their judgment in balancing both interests.
Aside from these needed changes, KOSA is, on the whole, a good law. Notably, three major industry players — Microsoft, Snap, and X — have endorsed the bill, likely in recognition that they can still make money while taking some reasonable steps to protect children. It is time that the leaders of other tech companies, many of whom have children, do the same.