
Eight Questions to Ask About Children’s Online Harms Legislation (and Beyond)

Tim Bernard / Apr 1, 2024

Almost every week, it seems, there is news on legislative efforts to protect children from online harms in the United States and around the world. Despite legal challenges and political roadblocks to enacting such laws, the issue remains salient for advocates and lawmakers, who appear ready for a long fight.

In my recent white paper for the Stanford Cyber Policy Center’s Program on Platform Regulation, I lay out the background to these proposed and enacted laws, the harms they seek to remedy, the variety of approaches taken, and their drawbacks and challenges. As the state of the field continues to shift, I recommend that interested parties ask a number of questions of any new piece of legislation proposed in this area—several of which apply more broadly to tech regulation.

1. What are its ideological roots?

Laws aimed at child online safety often emerge from a few distinct starting points, in particular:

  • The obscenity tradition, which is concerned with material considered “harmful” to children from a conventional morality perspective, without necessarily being grounded in empirical data.
  • The children’s rights perspective, which emerges from international human rights discourse, and is concerned with balancing the protection of children from harm with their rights as internet users with some degree of autonomy.
  • The parental rights movement, which asserts that parents should be empowered to control influences on their children.

The laws themselves, and even individual measures within them, may well be influenced by more than one of these conceptual starting points. Even so, considering their origins can be useful when examining legislation for inconsistencies or other weaknesses.

2. Are the harms or remedies child-specific?

For several well-recognized online harms to children, such as cyberbullying, sexual exploitation, self-harm, and privacy violations, adults are also frequently victims. Legislation rarely spells out how these harms pertain differently to children, or whether the particular remedies prescribed in a bill might also be applied to protect adults. When there are trade-offs with other values (e.g. free expression – more on that below), would these be unacceptable for adults? If so, what does that say about how we conceptualize children’s rights differently from those of adults, or how we see the place of children in society?

3. How are freedom of expression and privacy impacted?

Many proposals have some negative impact on one or both of these values. Whenever age assurance or verification of parental identity is required, explicitly or by implication, some personal data will need to be supplied. And almost any change in access to content that results from a law will impact freedom of expression, which includes the rights to seek and consume content as well as to promulgate it. The rights of both adults and children, and of any subsection of these populations, should be considered. Some civil society groups may oppose any compromise on one or both of these principles; other interest groups will nearly always prioritize children’s protection. Most people, I suspect, will look for some balance. These trade-offs may also create legal obstacles, especially regarding the First Amendment in the US; European privacy regulations may also be pertinent.

4. Is compliance well-defined?

These laws introduce a range of generalized obligations that platforms owe to children. Age-appropriate design codes, including the UK’s and California’s, require companies to take into account “the best interests” of child users; the Kids Online Safety Act (KOSA) in the US and the UK Online Safety Act impose a “duty of care;” the EU Digital Services Act requires “appropriate and proportionate measures;” and several US state bills impose civil liability for harms to individual children resulting from their use of a platform. Only sometimes are parameters given for these obligations, and it is not always evident that legislators have a clear idea of what compliance with these laws would mean in practice. (A new report by proponents of age-appropriate design codes proposes an interpretation of “best interests,” grounded in the framework of the United Nations Convention on the Rights of the Child.) For many of these harms, drawing causal lines from online experiences to specific outcomes is notoriously difficult. Furthermore, if these laws do go into effect and platforms face serious legal risk, they may err on the side of caution and allow minors only very limited experiences and selections of content.

5. How have similar laws fared?

Legislators around the world have rushed to consider bills with very similar measures—in US states, these are sometimes almost identical. But what has happened with these parallel bills? In the US, several significant bills have been copied by multiple states but seem unlikely to survive judicial review, in whole or in part. Even where laws have gone into effect, some are proving very difficult for regulators to enforce. These types of laws are mostly very new, and we have little evidence at this point as to which, if any, will prove effective at achieving their policy goals. (Some plausible claims can be made for the UK’s Age-Appropriate Design Code, but now that platforms have implemented new features in response to it globally, it is unclear what copycat bills will achieve.) In many cases, it may be more prudent for lawmakers to wait a year or two and then focus on those measures that have been shown to be (i) legal, (ii) enforceable, and (iii) effective.

6. Who carries the burden of compliance?

Not all these laws require action by online platforms; it may be ISPs, device makers, or even retailers who have new responsibilities. Other laws require parents to approve their children’s app downloads or account creation. There are some laws aimed at libraries, education departments, schools, or vendors engaged by these institutions. Finally, a few pieces of legislation direct law enforcement or related agencies to stand up new departments or take on additional responsibilities. There are reasonable questions as to whether this is equitable when any of these parties may claim (with varying degrees of plausibility) not to be truly responsible for the harms or for their remedies. When public institutions are given the burden, only sometimes are they allocated funding for it; when they are, it is worth examining whether that funding will be sufficient.

7. Who are the enforcers?

Where laws impact the details of complex internet company operations, their enforcement may require extensive expertise. This is especially important when many practical details are left unclear in the text of the law. If a regulator is given this task, it should ideally have the expertise already, or be given the resources to staff up appropriately. Former platform workers are obvious hires, but regulators ought not to be overly sympathetic to the industry they supervise. When state or local public attorneys are responsible for enforcement, their capacity and expertise may be in question, and their good faith too: one of the greatest fears about earlier versions of KOSA (based on some evidence) was that state attorneys general would misuse their enforcement power to discourage platforms from making LGBTQ content accessible to young people. Other proposed laws would create private rights of action for harms to children, which would enable anyone to bring a case, potentially creating an undue burden for both companies and courts.

8. Are there industry winners?

There are certain measures that industry players may actively support for business reasons. Some laws can further entrench large incumbents, by imposing regulations that are either beyond the capacity of entrants or smaller competitors to easily meet, or that require specific actions that the big players are already taking. Some companies are also eager to place the compliance burden on another part of the ecosystem. For example, Meta has been outspoken in proposing that it is app stores – rather than apps – that should have to assure the age of users and handle parental consent. There are also vendors who are only too happy to provide compliance services in areas such as harmful content classification, age assurance, and content filtering. These vendors are emerging as their own constituency; in the UK, their interests are promoted by specific industry associations and supportive government departments.

***

Regardless of one’s general opinion of the recent run of proposed and enacted legislation in this area, given the range of efforts and the complexity of the subject matter, the need for scrutiny is evident. Applying the above questions should assist the reader with that process, and hopefully lead us to more thoughtful, measured, and effective policymaking and regulation.
