Considering KOSA: A Bill to Protect Children from Online Harms

Tim Bernard / Dec 1, 2022

Tim Bernard recently completed an MBA at Cornell Tech, focusing on tech policy and trust & safety issues. He previously led the content moderation team at Seeking Alpha, and worked in various capacities in the education sector.

Senator Richard Blumenthal (D-CT) and Senator Marsha Blackburn (R-TN), August 2021.

This article has been updated to reflect the Senate Commerce Committee’s amended version of KOSA as of Dec 13. Links to sections refer to the earlier version available on congress.gov. There have also been more recent letters of support and opposition from notable groups.

On November 15, Senator Richard Blumenthal (D-CT) and Senator Marsha Blackburn (R-TN) met with a group of parents whose children were harmed after exposure to social media content. The parents shared their stories and lobbied the senators to take prompt legislative action.

Sens. Blumenthal and Blackburn are Chair and Ranking Member of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, and sponsors of the Kids Online Safety Act (S.3663). Introduced earlier this year, this proposed legislation passed out of committee in July with unanimous support, and its proponents hope that it will be voted on during the present lame duck session, most likely appended to a defense or spending bill. But what would the Kids Online Safety Act (KOSA) do?

What Is In the Bill?

Before delving into the requirements of KOSA, it is important to note two definitions (§2) that underlie the scope of its impact should it become law:

  • A minor is defined as a user aged 16 or below
  • An internet platform appears to be in scope of the legislation if it is for-profit, “is used, or is reasonably likely to be used, by a minor” and is not a school or common carrier.

Reminiscent of the California Age-Appropriate Design Code Act and the UK’s Online Safety Bill, KOSA (§3) establishes a duty of care for platforms “to act in the best interests of” users who are minors. These “best interests” are defined as prevention of harm from content, such as self-harm material, bullying, sexual exploitation, and advertising of goods and services that are illegal for minors, such as gambling and alcohol, as well as from the design of the platform itself, including “addiction-like behaviors” and deceptive marketing practices (also known as dark patterns).

The other most substantive section of the bill (§4) requires platforms to institute “safeguards” to protect minors from harm. Interestingly, most of these are intended to be in the form of settings that can be adjusted by the minor user or their parent or legal guardian. These controls must default to the most restrictive setting if the platform “knows or should know” that the user is a minor. The settings are to control findability; time on platform, including disabling features that tend to extend it; geolocation; and the use of personal data in recommendation engines, as well as deletion of the account and its data.

Additionally, this section requires giving parents additional control over account settings (including restricting payments) and access to the user’s activity on the platform that may relate to online harms, as well as tracking time spent. If these parental tools are in use, the platform must alert the minor to this. These settings are to be at their most restrictive by default if the user is under 13. Platforms must also provide minors and parents with a functioning reporting mechanism for the harms designated in the bill.

The disclosures mandated in §5 cover the aspects of platform policies and affordances relevant to the online harm categories the bill has established, and “whether the covered platform ... pose[s] any heightened risks of harm to a minor.” These should be displayed, presumably in the registration or purchase flow, in an age-appropriate way, and the platform must also make “reasonable effort” to communicate them to a parent and receive a parental acknowledgement before initial usage.

Public reports based on third-party audits are required by §6 for platforms with at least 10 million MAUs in the United States. These cover the basics: an overview of platform usage, a risk assessment, and details of actions taken to mitigate risks. However, the platforms must also conduct their own research about minors’ experience of the platform, solicit input from parents and experts, and account for outside research in the product space.

The next few sections of the bill deal with various research-related topics:

  • §7 creates a system to approve qualifying researchers to get access to platform data in order to study online harms to minors (again for platforms with at least 10 million MAUs in the United States).
  • §8 regulates market research that platforms can conduct on minors, requiring parental consent and instructing the FTC to draw up guidelines.
  • §9 directs NIST to research options for age verification that are effective, feasible, and not unduly intrusive.
  • §10 requires the FTC to produce guidance for platforms on how to ensure that they are in compliance with the bill and associated best practices.

Two avenues of enforcement are established in §11: via the FTC’s authority to regulate any “unfair or deceptive act or practice,” and by civil suits brought by State Attorneys General on behalf of residents of their states.

Lastly, an advisory Kids Online Safety Council is to be convened (§12), consisting of a very broad range of stakeholders.

Support for KOSA

Concerned parents, like those who met with Sens. Blumenthal and Blackburn, along with some child safety and mental health advocates, support this bill. Their arguments typically consist of recounting the harms suffered by children and citing various studies that suggest that use of social media, and the interactions that occur there, can result in harm.

It is very difficult to draw direct lines between usage of online platforms and harm at scale, but there does at least appear to be some correlation between mental health problems and intensive social media usage among minors. (See Social Media and Mental Health: A Collaborative Review by Jonathan Haidt and Jean Twenge.) Additionally, not all platforms have always taken even elementary steps to protect children from harm. For example, the video chat platform Omegle has facilitated numerous instances of child sexual exploitation, and has been criticized—and sued—for its system design (it instituted higher age limits and a moderated mode after some of these cases, though these measures are trivially easy to circumvent).

Criticism of KOSA

Much of the criticism of the bill has come from civil liberties organizations and those supporting LGBTQ+ minors. Points of contention include:

  • The duty of care provision may lead to overfiltering, which may prevent even 15- and 16-year-olds from accessing appropriate sex education materials and much-needed resources for LGBTQ+ teens. A section added to the newest version of the bill (§3(b)) clarifies that it does not require platforms to block any content that is specifically being sought out. However, it may be easier for platforms to simply block certain content altogether. Filtering, critics argue, is also of dubious value, as teens are adept at working around keyword restrictions, as is already common on TikTok; and if a platform relies on the newly-added limitation, it would legally provide access to, for instance, self-harm material that a minor may search for.
  • For those suffering from domestic or parental abuse, the internet gives access to important sources of support, both person-to-person communications and information. This bill would compromise these resources, as well as basic privacy for typical teenagers. The amended bill gives the FTC responsibility for training platforms on how to avoid parental misuse of the tools, though doubts remain as to how feasible this is.
  • Many services, including essential educational tools, make use of individual-specific algorithmic ranking, so using the control to opt out of this aspect of platform functionality would render these platforms unusable.
  • The “reasonably likely to be used by minors” benchmark gives the bill an incredibly wide scope. Platforms are therefore incentivized to make use of age verification services (with their associated privacy concerns) and either close the platform to under-17s, escaping the bill’s scope and its burdens such as notice, research, and reporting (for large platforms) altogether, or give minors a radically pared-down service to avoid a complete rebuild.
  • Another danger of this standard is that services would retain more personal information for all users in the attempt either to verify ages or to have usage information available for review by parents (in compliance with §4(b)(2)(e)).
  • Additionally, it is unclear how far into the stack this scope extends. Is AWS in scope as minors access applications and content there? Cloudflare or Akamai, as they direct minors to resources? The specific exclusion of common carriers suggests that all other services that host content or could otherwise be construed as “platforms” are covered.
  • At least one State Attorney General has been suspected of taking action against social media companies for political reasons in the past. When the topic is content that children are accessing, the temptation for culture war-attuned AGs is all too evident.
  • Finally, how would this all work? How does PBS Kids know when a parent hands their tablet to a child for her to watch Daniel Tiger? How will Google know how to send the parent a notice before their child first uses Search on a computer at school? And is it really plausible that the FTC will figure this all out in one year, as suggested by §10?

Another Approach

Several of these critiques reflect either a discomfort or a practicality problem with KOSA’s age definition of a minor. 17 is not a typical age of majority in the US, and it is not in line with the Children's Online Privacy Protection Act (COPPA), perhaps the most comparable existing federal law, which applies to those under 13, or with the California Age-Appropriate Design Code Act, which establishes discrete age categories that go up to under-18.

Indeed, KOSA’s list of safeguards includes several measures, many in accord with Safety by Design and Privacy by Design principles, that would doubtless be very popular with users of all ages. Who would not want more control over their data, how ranking algorithms use it, and who can contact them? Wouldn’t all users benefit from tools to limit overuse and avoid manipulation by dark patterns? And why should we limit qualified independent researcher access to research about only those harms that pertain to minors?

Two bills already under congressional consideration would take care of some of these priorities for all users: the American Data Privacy and Protection Act (ADPPA), which would provide the comprehensive data privacy framework that has been sorely lacking in the US for far too long, and the Platform Accountability and Transparency Act (PATA), which would provide researcher access to the social media platforms of most concern, without reference to age. As journalist Karina Montoya suggested in Tech Policy Press, these bills are well-designed, have bipartisan support, and could transform US tech regulation for the better.

Rather than rushing through a bill that impacts a very large proportion of the websites and applications on the Internet and is opposed by serious, independent organizations that care about the wellbeing of vulnerable minors, it may be advisable for Congress to focus on enacting ADPPA and PATA, while carefully observing how the California Age-Appropriate Design Code Act changes the services under its aegis for the better, and perhaps for the worse.
