Big Tech Should Embrace California's Age Appropriate Design Code
Hannah-Beth Jackson, Jordan Cunningham / Apr 18, 2023

Hannah-Beth Jackson is a Democrat who served in the California State Assembly from 1998-2004 and in the State Senate from 2012-2020. Jordan Cunningham is a Republican who served in the California State Assembly from 2016-2022 and was a co-author of the California Age Appropriate Design Code.
Big Tech is one of the most powerful forces in Washington, DC, and in state capitals around the country. Using their armies of lobbyists and the power of campaign contributions, major technology companies routinely sway public policy decisions in their favor.
As if that weren’t enough, an organization representing tech giants including Amazon, Meta, and TikTok announced in late March that it is setting up a “litigation center” to fight new regulations and lawsuits that seek to hold them to account for harms to children.
The evidence of tech’s malfeasance is overwhelming: a recent CDC study found that one in three girls reported seriously considering suicide in the past year, a 60% rise from a decade ago. Hospitalizations among kids under age 19 increased 61% from 2016 to 2021, and the connection to social media is increasingly stark. Tech companies have made children the most tracked, profiled, and monetized generation; childhood increasingly happens online, without any rules.
As lawmakers who worked for a combined 20 years to enact privacy protections for consumers, we find it appalling to see Big Tech companies use their power and money to try to wash their hands of the harms their products have inflicted. But that shouldn’t stop us from continuing to fight for responsible design and safe digital products and services.
The law is not static. It must be reformed and modernized to keep pace with changing markets and technology, just as it has in the past. We require product safety guardrails for virtually every other product our children consume — including food, toys, and even TV advertising. So why not digital products? The simple reason is that Big Tech has successfully lobbied to block any serious consideration of child safety.
There is one exception. Last year, California legislators unanimously passed the Age Appropriate Design Code (AADC): a law focused on holding tech companies accountable by designing products with children’s privacy and safety at the forefront.
Throughout the history of this country, we have updated our laws to address previously unforeseen risks posed by innovative products. In the late 19th century, courts initially refused to hold manufacturers liable for harms to consumers caused by defects in product design and manufacturing. In the early 20th century, however, American courts instituted product liability rules that held companies liable for defective designs. This forced companies to take responsibility for the risks their products created and to design safer products and services. The result was a safer world for all of us.
Similarly, car manufacturers initially resisted installing seatbelts, but as it became apparent that seatbelts saved lives, laws were updated to require this critical feature. Yet even as we have learned that social media companies possess internal research showing their products are harming our kids, we have failed as a country to take necessary action.
When the market doesn’t self-correct, the government must act. Now, as then, the law must be reformed to keep pace with market conditions.
It was a paradigm shift last year when California passed the AADC to create safety standards for our kids. The AADC requires companies to stop unnecessarily collecting children’s data and profiting from their digital identities. It puts an end to a model that incentivizes companies to use, share, or sell children’s data for all kinds of purposes.
Instead of recognizing their responsibility to protect our children, certain Big Tech companies are blaming parents, floating ludicrous “solutions” such as genius bars for parents, and filing lawsuits claiming First Amendment and other legal violations. But it’s not parents’ fault that these platforms track their children’s movements for profit. And it’s not parents’ responsibility to design products that are safe for kids to use.
Importantly, the AADC is not a bill about content – indeed, it doesn't dictate anything about third-party content, or whether companies should put up or take down content. Instead, it incentivizes heightened safety and privacy for children upstream, at the point of design. The AADC focuses squarely on product features within platform control, such as algorithmic design, prioritization mechanisms, and information collection and use. It appropriately encourages platforms to make choices that prioritize child safety over wringing out every possible dollar of profit regardless of the human cost.
The California AADC is the first attempt in the United States to put the responsibility back on companies to design digital products that take child safety into consideration from the start. It is no different in kind from demanding that oil companies take precautions to prevent spills, that rail companies work to prevent derailments that put entire towns at risk, or that car manufacturers install seat belts and other safety features.
Not surprisingly, the bill has drawn the ire of Big Tech and its “litigation center,” which filed a lawsuit to halt the California AADC just months after it was signed into law. But we cannot let Big Tech firms bully lawmakers and courts into submission — with our children as the ultimate victims.
California lawmakers stepped up last year in passing the AADC. Big Tech should step down — for once. Or, even better, Big Tech should step forward by embracing the opportunity to make online platforms safer for children. Our children deserve nothing less.