A Bill Designed to Protect Kids Could Change the Internet for the Better

Jennifer King / Sep 15, 2022

Dr. Jennifer King is the Privacy and Data Policy Fellow at the Stanford University Institute for Human-Centered Artificial Intelligence.

This month, California Governor Gavin Newsom will decide whether to sign AB-2273, the California Age Appropriate Design Code (Cal-AADC), into law. If adopted, it will be one of the more significant pieces of internet legislation that you’ve likely not heard of, as the legislation’s backers have run an understated campaign, and its opponents didn’t begin to weigh in publicly until the bill had already garnered unanimous approval from the California Assembly.

Supporters argue that the bill will force changes in the design of online services that promote not only children’s data privacy, but also children's overall health and welfare, while benefiting adults as well. This is because the bill casts a broad net by taking aim at online services “likely to be accessed by children,” not just sites specifically aimed at them. The bill’s safe harbor encourages online services to adopt the same baseline changes for all users, not just children. Detractors suggest that the bill will do no less than blow up the internet, most significantly by requiring websites to implement age verification or ‘gating’ mechanisms that could make it far more difficult to use online products and services, invade privacy through identity verification services that require personal data, and disproportionately impact small to medium-sized businesses.

These concerns sit at the edges of how this bill might be implemented, and could be addressed and mitigated in the regulatory process should the governor sign it. But age-gating any online site that could be accessed by a child is not the end goal of this legislation. In fact, it’s an attempt to inculcate values directly into the design of the online services and sites most frequently accessed by kids by centering the needs and experiences of one of society’s most vulnerable constituencies: children. It does this by forcing a paradigm shift towards the framework of Privacy by Design (PbD), something that both federal (specifically, the Federal Trade Commission) and California regulators have been attempting for years. In doing so, the Cal-AADC may shift the Overton window of what we consider minimally acceptable design for protecting not only privacy, but also online wellbeing, for adults as well as children.

Background on the Bill

Briefly, the Cal-AADC was introduced by California Assemblymembers Buffy Wicks (D) and Jordan Cunningham (R), and supported by many advocacy groups, most notably Accountable Tech, Common Sense Media, and the 5Rights Foundation (which championed the similar Children’s Code in the UK). The California state Senate and Assembly both posted impartial analyses of the bill, and lawyers at Wilson Sonsini wrote an overview of the bill that highlights some of the potential concerns. It’s also been covered by Wired, the New York Times, and by Marketplace Tech (featuring me). Finally, Techdirt’s Mike Masnick and Santa Clara law professor Eric Goldman have posted scathing critiques of the bill, primarily voicing their concerns about the possibility of widespread age verification.

In the UK, the Children’s Code went into effect just over a year ago, on September 1, 2021. This month marks the Code’s first anniversary, which I mention to underscore the point that after a year in existence, the internet has not collapsed in the UK under the burden of its requirements. In particular, the UK user experience hasn’t been crippled by a proliferation of age-gating or identity verification mechanisms. In terms of successes, the UK’s Information Commissioner's Office (ICO) points to specific design changes as wins: the elimination of autoplay and the introduction of bedtime and break reminders on YouTube; limitations on ad targeting of children by Facebook and Instagram and the introduction of parental controls; and, on Google, the disabling of location history for users under the age of 18, as well as letting users under 18 request the removal of their images from image search.

Those changes illustrate the substantive difference between the new approach of the California and UK rules and the usual focus of kids’ tech regulation, which asks companies to manage (still important!) macro-level harms such as bullying, inappropriate content, child sexual abuse material (CSAM), and exploitation. The new approach also goes beyond pushing self-management tools and parental controls. Instead, the Cal-AADC, like the UK code, aims for fundamental changes in both the architecture and design of platforms and their business practices, focusing on the micro-level experiences and outcomes that, together, build an ecosystem of child-centered data privacy and behavioral wellness.

The primary ways it accomplishes substantive change are by forcing companies to:

  • Set default settings for kids to the highest level of privacy possible (a rough sketch of what this could look like follows this list);
  • Make “prominent and accessible” tools for managing privacy preferences;
  • Complete data protection impact assessments before releasing new products or services that may be used by children;
  • And, very importantly, address behavioral manipulations at the system design level that “increase, sustain, or extend use of the online product, service, or features by children, including the automatic playing of media, rewards for time spent, and notifications.” ¹
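To make the first of these requirements concrete, here is a minimal, purely illustrative sketch (in TypeScript) of what privacy-protective defaults for a minor’s account could look like. The setting names and the `defaultsFor` helper are hypothetical, invented for this sketch; they are not drawn from the bill text or from any real platform’s configuration.

```typescript
// Hypothetical account settings, invented for illustration only.
interface AccountDefaults {
  profileVisibility: "private" | "friends" | "public";
  personalizedAds: boolean;          // behavioral ad targeting
  preciseLocation: boolean;          // location history / precise geolocation
  autoplayMedia: boolean;            // automatic playing of media
  engagementNotifications: boolean;  // "someone liked your post"-style nudges
  usageReminders: boolean;           // bedtime / take-a-break prompts
}

// The highest-privacy defaults, applied to users known or estimated to be under 18.
const minorDefaults: AccountDefaults = {
  profileVisibility: "private",
  personalizedAds: false,
  preciseLocation: false,
  autoplayMedia: false,
  engagementNotifications: false,
  usageReminders: true,
};

// A rough expression of the safe-harbor choice: apply the protective defaults
// to everyone unless the service can estimate, with reasonable certainty,
// that the user is an adult.
function defaultsFor(estimatedAge?: number): AccountDefaults {
  const likelyAdult = estimatedAge !== undefined && estimatedAge >= 18;
  return likelyAdult
    ? { ...minorDefaults, autoplayMedia: true, engagementNotifications: true }
    : minorDefaults;
}
```

A service that never estimates age at all simply ships `minorDefaults` to every user, which is exactly the trade-off the bill’s safe harbor is meant to encourage.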

Much like the perennial battle over opt-out defaults from online behavioral tracking, these requirements put companies in the position of having to shift the baseline assumptions under which they conduct their business. Rather than defaulting to both affordances and system architectures that rely upon data collection and engagement maximization, companies will be required to think ahead and anticipate harms and consequences. Instead of tinkering at the margins of the system, this bill seeks to shift the design of platforms and online services in fundamental ways that alter the system itself, with changes that can not only benefit children, but also adults.

A Chance to Realize Privacy by Design

Cal-AADC makes this shift by concretely enshrining PbD into law. First proposed by Dr. Ann Cavoukian, the former Information and Privacy Commissioner of Ontario, Canada, in 2006, PbD is a set of principles that support privacy, rather than a specific methodology. The simplest summary of PbD is that it calls for product developers and designers to consider and incorporate their customers’ privacy needs from the inception stage of design, rather than after a service is created, finalized, or (worse) launched. (Remember the FTC’s 2011 Google Buzz settlement?)

But despite the presence of the term ‘design’ in PbD, there is still little in the way of methods or standards in the design world to guide PbD development, particularly in the practitioner fields of user experience research and design. (Privacy engineering, in contrast, is a maturing field that takes a technical approach to PbD in the engineering development process. There’s also academic scholarship on PbD, but it hasn’t been adopted widely by practitioners.)

Despite the passage of two privacy laws in California (the CCPA and CPRA), neither codified PbD. The closest the laws come to addressing design practices is their narrow attention to so-called ‘dark patterns’; neither required companies to make fundamental changes in the design of their services. The singular exception is the California Attorney General’s support for the browser-based Global Privacy Control (GPC) signal as a valid opt-out from data sales under the CCPA. GPC is an example of a technical implementation of PbD, one that shifts the privacy status quo (in this case, your data being routinely acquired and sold) by letting users express a persistent opt-out once, at the browser level, rather than site by site.
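Mechanically, GPC is simple: a participating browser or extension attaches a `Sec-GPC: 1` header to each request (and exposes `navigator.globalPrivacyControl` to page scripts), and a covered business is expected to treat that signal as a request to opt out of the sale of the user’s data. The sketch below shows, in TypeScript on Node’s built-in http module, roughly how a server might honor the header; the `recordOptOutOfSale` function is a hypothetical placeholder for whatever a real service would do with the signal.

```typescript
import { createServer, IncomingMessage, ServerResponse } from "http";

// Hypothetical placeholder: however a real service would record a
// "do not sell" opt-out for the current user or session.
function recordOptOutOfSale(req: IncomingMessage): void {
  console.log(`GPC received for ${req.url}; treating request as an opt-out of data sale`);
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  // Browsers and extensions that support Global Privacy Control send the
  // "Sec-GPC: 1" header automatically, with no per-site action by the user.
  const gpcEnabled = req.headers["sec-gpc"] === "1";
  if (gpcEnabled) {
    recordOptOutOfSale(req);
  }
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(gpcEnabled ? "Opt-out signal honored" : "No GPC signal present");
});

server.listen(8080);
```

The point of the example is the default: the user sets the preference once, and every compliant site is expected to respect it, instead of the user having to hunt down a ‘Do Not Sell My Personal Information’ link on each site.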

PbD was intended to promote a cultural shift in companies, obliging them to internalize a respect for privacy and to change product development processes, under the assumption that designing for privacy would in turn drive fundamental positive changes in the design and operation of products and services. However, over fifteen years after Dr. Cavoukian published the principles, it is clear that merely embracing the term isn’t enough to drive change. Enter the Cal-AADC. If signed, the law will give PbD a substantial boost. As the Assembly analysis describes:

“[T]he bill would require online platforms likely to be accessed by children to turn privacy and safety settings up to a level that is protective of children's mental and physical health and well-being unless the online platform can, with an appropriate level of certainty, determine the age of the consumer. In other words, existing law generally permits online platforms to treat all consumers as adults unless there is actual knowledge that the consumer is under 13 years of age. This bill would instead require that websites and other online services likely to be accessed by children offer privacy and safety protections by default, unless there is reasonable certainty that the consumer is an adult.”

The Cal-AADC, if enacted, would go much further than the current status quo in protecting children on the web, the federal Children's Online Privacy Protection Act (COPPA), in two main ways. First, it would ensure that children receive a greater level of protection from both data collection and profiling by opting them into privacy-protective defaults. Second, the Cal-AADC protects children up to age 18, whereas COPPA only applies to children under 13 (I’ll leave the arguments as to whether the Cal-AADC is preempted by COPPA to the lawyers). As a result, the Cal-AADC will promote PbD in ways that COPPA has failed to do.

Another aspect of the bill that could promote PbD is its age verification requirement, noted above. Critics have objected that the bill would force any website with visitors under the age of 18 to implement age verification, arguing that doing so would achieve the opposite of the bill's intent because websites would need to collect even more personal information to verify the ages of their visitors. These concerns are exaggerated for two reasons. First, they rely on an absolutist interpretation of the bill’s applicability to services ‘likely to be accessed by children’; in fact, applicability is qualified by several factors in this section of the legislation and isn’t simply reduced to having any visitors under age 18. And again, sites that elect to apply design changes to all users irrespective of age would be exempted from any age verification requirement. Second, because children often use their own dedicated devices to access online content, any age verification that is necessary to access some sites or services could potentially be delegated to the device, rather than requiring every online service to enact its own verification measures.

Admittedly, this bill could raise critical questions for publishers who operate websites that depend on third-party ad tracking for their revenue, and it will certainly cause hand-wringing inside tech firms not already covered by the UK version of this code over how to implement the bill’s subjective requirements. For example, will local news sites like the San Francisco Chronicle or even my local Berkeleyside be impacted if, say, fewer than 1% of their readers are under 18? I’m skeptical this will be the case, but even so, with third-party data collection under attack on multiple fronts, including in the American Data Privacy and Protection Act (ADPPA) proposed in Congress and the FTC’s advance notice of proposed rulemaking, this law is unlikely to be what undoes the surveillance advertising economy.

Finally, the other area where this law could have a real impact beyond privacy (and where the Children’s Code seemingly already has in the UK) is manipulative and addictive design. This is a broad space that encompasses not only dark patterns (especially those aimed at children in edtech software and kids’ games) but also algorithmically driven interactions such as content feeds. To date, content feeds and similar algorithmic features have largely been targeted for regulation based on the content they display, rather than on the behavioral impacts and harms they foster by design in order to increase and sustain engagement. So far, this content-based approach has seen little success because companies are able to shield themselves from liability under Section 230 of the Communications Decency Act. However, we are witnessing some of the first legal cases attempting to hold companies liable for the harms caused by the design of these systems, rather than the content they host. This approach could be a bellwether for how we as a society regulate algorithmic systems that are designed for behavioral impact and that cause harms to wellbeing.

In sum, critics might be right that the Cal-AADC could upend the internet status quo that we rely on today. It just may do so in ways they don’t like: not because the bill’s changes are bad policy, but because they attempt to prioritize both privacy and wellbeing for all users. For nearly thirty years, the internet’s development has been driven by design that masquerades as values-agnostic. Do we really want to defend this status quo? Is this the online world we want, not only for children, but for all of us? Should Governor Newsom sign the Cal-AADC into law, a new era will begin.

- - -

Thanks to Caroline Sinders, Dr. Jenny Radesky, and Jael Makagon for their feedback.

¹ 1798.99.31(a)(1)(B)(vii)
