
Evaluating the Argument Over the California Age Appropriate Design Code Act

Gabby Miller / Oct 20, 2023

Gabby Miller is a staff writer at Tech Policy Press.

On Wednesday night, California Attorney General Rob Bonta filed an appeal of a preliminary injunction issued in federal court last month that blocked the California Age-Appropriate Design Code Act (CAADCA). The Act is billed by its supporters as a first-in-the-nation law to protect children online, but like other state laws with similar aims, it faces opposition on First Amendment grounds.

The expected appeal, filed in the US Court of Appeals for the Ninth Circuit, marks an important step in determining whether the CAADCA stands a chance at moving forward. In a press release accompanying the appeal, Attorney General Bonta argued that parents should be able to protect children as they use the internet. “Big businesses have no right to our children’s data: childhood experiences are not for sale,” Bonta said. A coalition of nearly twenty organizations – such as Fairplay, Common Sense, ParentsTogether, and 5Rights Foundation, which originally sponsored the Act – issued a letter thanking Bonta for appealing a lower court ruling that they say ignores obvious harms to kids in favor of protecting the interests of tech firms.

The California Age-Appropriate Design Code Act (AB-2273) was signed into law by Governor Gavin Newsom, a Democrat, on Sept. 15, 2022. The bipartisan legislation, introduced by Assemblymembers Buffy Wicks (D-CA14) and Jordan Cunningham (R-CA35) and passed unanimously by the California legislature, was set to take effect on July 1, 2024. The Act would require online businesses likely to be accessed by children – defined as any user under the age of 18, as determined through “age assurance” methods – to set default privacy settings to the highest level and complete an impact assessment before any new products or features are made publicly available. Failure to comply can result in steep fines.

The Act was challenged by NetChoice, a tech lobbying group whose members include Google, Meta, and TikTok, in a lawsuit it filed last December (NetChoice v. Bonta). NetChoice argues that despite proponents’ claims that the Act was designed to protect minors, it does so by “replacing parental oversight with government control.” One of the core claims in the suit is that the CAADCA would violate its member companies’ expressive rights: according to NetChoice, the Act restricts businesses’ ability to exercise their own editorial discretion, imposes strict liability, and chills speech.

A month ago, Judge Beth L. Freeman of the US District Court for the Northern District of California issued a preliminary injunction blocking the CAADCA. In the 45-page decision, she concluded that the plaintiffs demonstrated a likelihood of success in proving the Act facially unconstitutional because it would violate the First Amendment, and that such “speech restrictions” would fail strict or even lesser scrutiny.

The crux of this legal showdown is best understood as a tug-of-war between data privacy law, consumer protections, and the First Amendment, with overlapping questions and incompatible answers. Proponents of the bill generally frame the CAADCA as a regulation aimed at protecting children through platform design, whereas its opponents argue that the Act imposes content-based speech restrictions that are unconstitutional.

Opponents' perspectives

In its motion for a preliminary injunction, NetChoice argued that the CAADCA represents a content-based restriction on speech that would “subject a global communications medium to state supervision and hobble a free and open resource for ‘exploring the vast realms of human thought and knowledge.’” The Act also purportedly violates the “expressive rights” of NetChoice and its members under the First Amendment.

Indeed, some see the CAADCA as an outright form of censorship. Eric Goldman, a professor at Santa Clara University School of Law who focuses on internet law, argues that the current moment is merely an extension of “almost 30 years of legislative attempts here in the United States to justify censorship by the claim that it protects children.” Goldman referenced the Communications Decency Act of 1996, which prohibited users under 18 from accessing “obscene or indecent” material before that provision was struck down by the US Supreme Court on grounds that the law was overbroad and criminalized constitutionally protected speech. “To me, it's like Groundhog Day,” Goldman said. He filed an amicus brief in support of the preliminary injunction in NetChoice v. Bonta.

NetChoice has also filed suit against a more restrictive age-verification law on grounds similar to its challenge to the California Age-Appropriate Design Code Act. In Arkansas, NetChoice filed a suit (NetChoice v. Griffin) in June challenging that state's Social Media Safety Act (SB 396), which would require parents to consent to their child’s use of social media, verified through age verification methods. US District Judge Timothy L. Brooks blocked the Act shortly thereafter, finding it likely unconstitutional and reasoning that requiring users to upload driver's licenses would deter adults from using the internet, thus chilling speech. Chris Marchese, the director of NetChoice’s litigation center, recently told the New York Times that laws like the CAADCA and others that require age verification would essentially require tech companies to “sanitize the internet on behalf of young people.”

Goldman expressed stark disapproval of all age assurance and age verification methods. “I’m blown away by how acquiescent the privacy community has been to broad-based mandates to do things like facial scans or disclosures of private information into the hands of all kinds of businesses, legitimate or not,” argued Goldman. He chalked this up to a bad trade-off made for the purpose of suing Big Tech. (The CAADCA does not define what age assurance methods companies must use, only stipulating that the methods are “proportionate to the risks that arise from the data management practices of the business, privacy protective, and minimally invasive.”)

Proponents' arguments

Proponents of the California Age-Appropriate Design Code often argue that legislation based on this framework is not about speech or content, but rather is designed to address various consumer protection concerns by requiring online platforms to take “common sense measures” to protect children and teens.

Some point out that the law’s framework is widely considered to be uncontroversial when applied to offline, brick-and-mortar businesses. Ryan Calo, a professor at the University of Washington School of Law and founding co-director of the UW Tech Policy Lab, characterized the CAADCA as “pretty routine regulation” of companies within California. “It’s not requiring the companies to say anything, nor is it censoring their speech, it is merely requiring them to be attentive to their design choices when children are involved,” Calo said. In other words, it polices the conduct rather than the speech of businesses.

Calo takes issue with NetChoice’s “kitchen sink approach,” with claims he says are “truly all over the place.” For instance, the CAADCA requires covered companies to produce reports regarding their policies to protect children and to adopt other transparency measures. The NetChoice suit cites the Fourth Amendment, which protects against unreasonable searches and seizures, in challenging the constitutionality of these disclosures. “That’s a pretty wild argument because so many laws require reporting by companies,” Calo noted.

Neil Richards, a law professor at the Washington University School of Law who specializes in privacy, information, and freedom of expression, believes the logic that says “data is speech” is “utter nonsense” from a technical, policy, and First Amendment perspective, “because everything we do is with data and everything we do is with words.” This idea, Richards fears, could make the governance of a digital society impossible.

And if the First Amendment is implicated every time the government attempts to regulate online product design, then everything is constitutionalized, according to Calo. “It treats digital tech companies as exceptional, merely because their service happens to take place in a mediated environment using, you know, pixels and bytes and stuff like that.”

Woodrow Hartzog, a professor at the Boston University School of Law who focuses on privacy and technology law, is also worried about the secondary effects a ruling that adopts the plaintiff's logic could have on other regulations impacting product and service design on the internet. This includes the enforcement of Federal Trade Commission (FTC) regulations against deceptive practices, which has long been considered uncontroversial. The claim that design rules interfere with a company’s expressive rights, Hartzog argued, is both inconsistent with theoretical approaches to the First Amendment and all too convenient for the tech companies swatting at various privacy laws. The breadth of the opinion also “keeps us from appreciating how other data privacy restrictions may have wildly different outcomes,” Hartzog said.

Hyperfixation on the Act’s First Amendment implications could also obscure the harmful privacy practices that platforms regularly use, such as profiling kids by default or deploying dark patterns to manipulate their choices. In a digital press briefing held by a group of civil society organizations calling itself The Kids’ Code Coalition shortly after the CAADCA was blocked, Megan Iorio, senior counsel at the Electronic Privacy Information Center (EPIC), argued that Judge Freeman didn’t take seriously the list of widely recognized harms often baked into a platform’s design. “The First Amendment doesn't let a judge strike down laws simply because they disagree with the policy goals,” Iorio said.

Looking ahead

When pressed on whether there was any legislation he could support that focused on designing safer user experiences, Goldman rejected the question’s premise altogether. “The idea of a safer internet, from whom and for whom? I can't answer that because I don't know what dimension of safety we're trying to optimize,” remarked Goldman. “I will tell you that government interventions are often the thing we need to be protected the most from. And that's why things like constitutional indication, unfortunately, become so important, because those are the only mechanisms that we can use to keep safe from governments trying to undermine our ability to talk to each other in important ways.”

Goldman wants to prioritize teaching children to be smarter consumers of internet content and services in lieu of laws that require safer internet design. “Couldn't you imagine that we would create an entire curriculum built around how to use the internet in order to become a savvy consumer of information and to become a well-informed citizen?” he asked.

This mirrors recent comments by NetChoice Vice President and General Counsel Carl Szabo. At the August 2023 National Conference of State Legislatures in Indianapolis, Szabo took square aim at child online safety laws advancing across the country. In a heated back-and-forth with state lawmakers during a panel titled “Protecting Kids on Social Media,” Szabo argued that these bills take away the rights of parents and put them into the hands of the state. Better model legislation, he said, focuses on teaching children internet literacy and the safe use of technology, such as a recent Florida measure. Szabo also placed a significant amount of responsibility on parents, saying they need to set better boundaries for their children by taking phones away and modeling better internet usage behaviors.

But with an appeal filed, the Ninth Circuit must now consider the legal questions surrounding the California Age-Appropriate Design Code Act. While it’s unclear how receptive the court will be to the idea that data privacy and consumer protection regulations are a First Amendment issue, other state attorneys general and lawmakers, and perhaps the US Supreme Court, will be tuning in as they weigh similar questions.

A revised version of the CAADCA could, in theory, rest on firmer constitutional ground, according to Hartzog. He wonders, though, whether the Age-Appropriate Design Code incorrectly assessed users’ experiences on the web. In other words, the Act may presume kids are less internet-literate than they actually are, and adults more so. A more comprehensive federal data privacy bill, which might ensure a safer experience not only for children but for all users, is an alternative solution. While a law establishing fundamental privacy protections would be consistent with global trends, it does not yet have the same bipartisan energy propelling it forward as child online safety does.

Either way, Richards says that tech companies in the current information age are following a playbook similar to that of industrial-age tycoons, who used the law to protect their own interests over those of the public. Tech firms are “pursuing policy through litigation, rather than respecting democratic choices,” said Richards. “The argument they're making is that anything you do with data is somehow expressive. And if that's the case, we cannot have any laws in the information economy. None.”
