
The California Bill to Combat Online Child Abuse Nobody Seems to be Talking About

John Perrino / Sep 20, 2023

John Perrino is a policy analyst at the Stanford Internet Observatory.

California State Capitol, Sacramento, California. Shutterstock

The California legislative session came to an end last Thursday with two online safety bills killed in committee: SB 646, on child abuse reporting, and SB 680, which would have added requirements to the Age-Appropriate Design Code Act, a 2022 law that a federal district court recently enjoined.

But one piece of legislation to combat online child sexual abuse material (CSAM) passed unanimously out of both chambers, despite drawing little attention and facing industry pushback that led to late amendments. The legislation, AB 1394, would require social media companies to create user reporting systems and to conduct regular risk assessments and reports on how their platforms address child sexual exploitation and abuse.

The bill now heads to Governor Gavin Newsom’s desk; he has until October 14 to sign or veto it. If signed, the requirements would go into effect on January 1, 2025.

If it becomes law, the legislation may face legal challenges. Techdirt’s Mike Masnick argues that AB 1394 would be preempted by Section 230 of the Communications Decency Act, which provides platforms with liability protections for user-generated content.

Parts of the bill may also be challenged under the First and Fourth Amendments, or may conflict with laws governing CSAM reporting to the National Center for Missing and Exploited Children (NCMEC), a congressionally chartered organization that processes online child abuse reports and maintains a database of known abuse material that online services use to scan for and remove such content.

Nevertheless, if signed into law, the legislation would require social media sites to provide a user reporting mechanism, with fines of up to $250,000 if companies fail to respond to reports, permanently block user-reported CSAM, and make “reasonable efforts” to block future instances of that content.

Under the law, platforms would have to send California users who submit reports a confirmation notice within 36 hours and take action on reports within 30 days. Weekly status updates on user reports would be required until the platform determines whether the content is CSAM and confirms to the user that the content has been blocked and removed.

In cases where a platform fails to comply with the user reporting and response requirements, potential penalties would be limited if the social media company actively participates in NCMEC’s Take It Down service. The program allows users to report sexually explicit media created when they were under 18 for removal across participating platforms.

The law would also require social media companies to conduct risk assessments of design, recommendation, and other features to protect against child sexual exploitation at least biannually, and to take steps to mitigate identified issues within 30 days. Significant failures to do so could result in fines ranging from $1 million to $4 million.

The user reporting system and statutory damages for failures to remove reported CSAM would apply only to California residents, and only for content uploaded on or after January 1, 2025, in which they or a dependent are “identifiable.” The legislation applies only to social media companies and would not cover standalone messaging services such as Meta's Messenger or WhatsApp.

Recent amendments to the bill lowered the required frequency of auditing and reporting, gave social media companies more time to respond to user reports, and reduced penalties for companies that comply with other safety measures. The amendments also changed the mitigation requirement from “correction” of identified risks for child sexual abuse and exploitation to taking “action” on them.

“Despite many attempts by big tech lobbying organizations to stop or water down this legislation, the California Senate did the right thing today, taking a hugely important step to help stamp out the insidious and deeply harmful problem of online child sex trafficking,” Common Sense Media Founder and CEO Jim Steyer said in a statement.

Tech industry trade association TechNet responded to recent concerns raised by Steyer and Common Sense Media by saying the group was working in good faith with Assemblymember Buffy Wicks (D-CA14) to strengthen the legislation and make it actionable for social media companies. It is unclear whether the trade group opposes the final bill, which was amended in early September.
