Child Safety-Focused REPORT Act Passes US Senate
Riana Pfefferkorn / Dec 19, 2023

On December 14, the full United States Senate passed the REPORT Act, S.474, by unanimous consent. The bill would amend federal law governing the reporting of suspected child sexual exploitation and abuse (CSEA) offenses. Many of its changes are common-sense improvements to the reporting process that will be welcomed by providers, child safety organizations, and law enforcement alike. However, the bill’s expansions to the reporting process come with trade-offs, and may carry downsides for online speech and privacy. These aren’t fatal flaws, but they deserve to be addressed as the bill heads to the House.
Current Law
Under existing federal law, electronic service providers are required on pain of monetary penalties to report apparent violations of certain CSEA laws that they find on their services – for example, the posting of child sex abuse material (CSAM), also known as child pornography. Currently, violations of six specific CSEA laws are subject to the reporting requirement. Reports go to the CyberTipline, a hotline run by the National Center for Missing and Exploited Children (NCMEC), which then routes the reports to the appropriate law enforcement agency. Pursuant to this clearinghouse function, NCMEC is generally immune from liability for CSAM possession and transmission that would otherwise be illegal. After making a CyberTipline report, providers must preserve the reported content for 90 days, after which they may delete it. The CyberTipline received over 32 million reports in 2022, the bulk of which came from a handful of major providers of social media, communications, cloud, and search services with very large user bases.
What Would the Bill Do?
The REPORT Act would make several changes to current law:
- It extends providers’ 90-day preservation period to 1 year, and allows voluntary retention beyond that period for purposes of combating online CSEA.
- It extends NCMEC’s limited liability to the vendors with which NCMEC contracts to support its duties, subject to carve-outs for misconduct.
- It immunizes children depicted in CSAM, or their representatives (such as a parent or guardian), from liability if they report the imagery to the CyberTipline, again subject to carve-outs for misconduct.
- It imposes minimum cybersecurity requirements on NCMEC vendors as well as the providers making CyberTipline reports.
- It increases providers’ statutory penalties for knowing and willful failure to report as required. Fines that currently range from $150,000 to $300,000 would rise to between $600,000 and $1 million, depending on the size of the offending provider and whether the failure to report is an initial or repeated offense.
- It expands the reporting requirements to cover apparent violations of two more federal laws, those against child sex trafficking and coercion and enticement of minors.
- Pursuant to those new additions, it allows NCMEC to issue guidelines to providers about how to identify content that may indicate child sex trafficking or enticement.
Overall, I view this bill as mostly positive, but it comes with clear trade-offs and, depending on how providers respond, the potential to negatively affect online privacy and speech.
The Good:
Common-sense measures that are long overdue. Extending the preservation period is a common feature of multiple recent child safety bills. Why? Because everyone – NCMEC, law enforcement, and providers alike – agrees that 90 days is simply not long enough. With tens of millions of CyberTips per year, NCMEC and law enforcement agencies experience delays in processing and responding to reports. By the time investigators follow up with the reporting provider, the material has often been deleted, impeding the investigation of a serious crime. Extending mandatory retention to 1 year should alleviate that problem. Providing limited liability for NCMEC’s vendors, on par with what NCMEC and its employees already receive, is likewise a common-sense measure that should help NCMEC, a relatively small organization, extend its capacity to carry out its anti-CSEA mission.
Requiring decent cybersecurity. In an age of frequent hacks, leaks, and data breaches, it’s a no-brainer to make NCMEC vendors and reporting providers follow cybersecurity best practices for the highly sensitive material in their custody – especially once providers must preserve reported content for far longer before deleting it. The original version of the REPORT Act immunized NCMEC vendors but didn’t impose cybersecurity requirements on them, a gap which would have let vendors dodge liability for breaches. That’s fixed in the as-passed version, for which we can thank privacy- and cybersecurity-conscious voices in the Senate.
Allaying child victims’ fears of liability. Immunizing the individuals depicted in CSAM from civil or criminal liability for reporting their own imagery is another important change, one made even more urgent by NCMEC’s recent Take It Down initiative to help minors scrub such content from the internet. Whether it’s a teenager’s nude selfie that was nonconsensually shared beyond the intended recipient, or documentation of the worst moment of a child’s life, nobody should have to endure the spread of their own imagery online out of fear that they themselves (or those from whom they seek help, such as their parents) might get arrested for reporting it.
The Bad:
More obligations without more resources. As noted, the CyberTipline system is already overburdened beyond NCMEC’s and law enforcement’s capacity to keep pace with the volume of reports. Adding more reportable offenses means there will be even more reports, yet the bill doesn’t allocate any funding or other resources to support NCMEC and law enforcement in handling the foreseeable increase.
Exacerbating the overreporting problem. Under current law, providers face stiff monetary penalties for not reporting CSEA violations as required, but are mostly immune from liability for reporting material that doesn’t actually violate the law. The obvious incentive is to report it all and let NCMEC or law enforcement sort it out. By sharply increasing the monetary penalties for failure to report (and adding more reportable offenses subject to those penalties), the REPORT Act will exacerbate this incentive. Overreporting contributes to the strain on the CyberTipline system and distracts investigators from focusing on legitimate reports. It also ensnares innocent users in nightmarish scenarios, as tech journalist Kashmir Hill has documented in a series of recent articles for the New York Times.
What’s more, the overreporting phenomenon is about to metastasize: The advent of generative AI means that photorealistic AI-generated CSAM is starting to be traded online. Rather than risk fines for failing to report an authentic image they mistook for an AI fake, providers will err on the side of caution and report AI-generated images too. Put another way, providers already overreport real images that aren’t CSAM, and soon they’ll start reporting CSAM that isn’t real. Again, it will be up to the overtaxed report recipients to separate the wheat from the chaff. If enacted, the REPORT Act may supercharge the flood of AI-CSAM reports that’s about to start hitting the CyberTipline.
The Unknown:
How would providers comply with new reporting obligations? Most of the apparent violations currently reported by providers to the CyberTipline are child sex abuse images and videos. The laws against CSAM are effectively strict-liability: it doesn’t matter why someone possesses or shares CSAM; it’s still illegal. Context and intent are irrelevant, and there’s no “fair use” like for copyrighted material. Therefore, anytime a provider finds CSAM on its service, that’s an “apparent violation” that must be reported. While the reporting law doesn’t (indeed, can’t) force providers to proactively look for content to report, many providers do so voluntarily (hence the tens of millions of CyberTips per year). Various tools have been developed over time for detecting both known and previously unknown CSAM.
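To illustrate why detecting known imagery is a “comparatively easy call” (as discussed below), here is a minimal sketch of hash-list matching, the approach underlying tools such as Microsoft’s PhotoDNA and Meta’s open-source PDQ. It is illustrative only: the function names and hash list are hypothetical, and I’ve substituted a cryptographic hash for the perceptual hashes real systems use to catch re-encoded or lightly edited copies.

```python
import hashlib

def file_cybertip_report(image_bytes: bytes) -> None:
    """Stub standing in for a provider's CyberTipline reporting workflow."""
    ...

def is_known_match(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Check an uploaded image against a hash list of previously identified
    material. Simplified: SHA-256 catches only byte-identical files; real
    deployments use perceptual hashes (e.g., PhotoDNA, PDQ) so that altered
    copies of a known image still match."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

def scan_upload(image_bytes: bytes, known_hashes: set[str]) -> None:
    # A hash-list hit is a per-file, context-free decision: no need to read
    # anything else the user has said or posted.
    if is_known_match(image_bytes, known_hashes):
        file_cybertip_report(image_bytes)
```

The property that matters for the analysis below is that this decision is per-file and context-free – a property that free text does not share.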
The REPORT Act would add a duty for providers to report apparent child sex trafficking, coercion, and enticement. It’s not clear to me how providers would handle that new obligation, or what NCMEC would recommend in any guidelines it might issue. Whether an image is apparently CSAM is a comparatively easy call – and still there are false positives, as Kashmir Hill’s stories document. But trafficking, coercion, and enticement typically play out in written form (though images may be involved). Compared to CSAM, free text – such as a posting by one user, or a conversation between two users – is far harder to detect and classify as trafficking, coercion, or enticement. CSAM is illegal in any circumstance, whereas evaluating whether free text violates the law requires context, judgment, language competency, and more. That complicates providers’ task of determining whether they must report given content, and it’s harder to build automated tools for proactive detection.
Importantly, compliance may require providers to access more user content than CSAM detection does. Determining an apparent trafficking or enticement offense may entail reading or scanning an entire conversation, or series of conversations, between users. The risk, then, is that providers will comply with the REPORT Act’s expanded reporting obligations by more intensively monitoring and scrutinizing their users’ private and public posts and interactions.
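To make the contrast with image hashing concrete, here is a sketch of what text-side screening might look like. Everything in it is hypothetical – the classifier is an empty stub, and the names and threshold are mine – but it illustrates the structural point: the unit of analysis is an entire conversation rather than a single file, and the reporting threshold is exactly where the overreporting incentive bites.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    text: str

def score_conversation(messages: list[Message]) -> float:
    """Stand-in for a trained classifier estimating how likely a conversation
    is to evidence trafficking, coercion, or enticement. A real model would
    have to weigh context, participants' ages, slang, and language -- and
    would still produce false positives. Returns 0.0 here as a stub."""
    return 0.0

def should_report(messages: list[Message], threshold: float = 0.5) -> bool:
    # Unlike a per-image hash lookup, scoring requires reading the entire
    # exchange between users -- the broader access discussed above. The
    # threshold embodies the incentive problem: steep fines for missed
    # reports push providers to set it low, sweeping in innocent speech.
    return score_conversation(messages) >= threshold
```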
What’s more, given the increased incentives to overreport, the REPORT Act might make it more likely that innocent online speech – say, flirting between two teens, or a user quoting song lyrics about pimping – gets reported, first to NCMEC and from there to the police. The same is true of user content that, while abusive or harassing toward an underage user, nevertheless falls short of qualifying as a federal child sex crime punishable by years in prison. As the NYT series demonstrates, being erroneously reported as a suspected pedophile creates Kafkaesque predicaments for innocent people. Federal policy should not push online service providers to plunge more of their users into that situation.
These potential risks are not guaranteed to materialize should the REPORT Act become law. Rather, the impact on users will depend on the tools and policies that providers would implement in response to expanded reporting obligations. Many providers surely already act on child sex trafficking, enticement, and coercion as terms of use violations. Nevertheless, requiring them to report those offenses would inevitably affect providers’ approach to handling them.
Conclusion
In a year replete with child safety bills, the REPORT Act is noteworthy for its narrow ambit: to make some changes to providers’ CSEA reporting obligations. That distinguishes it from other bills in Congress with a far broader scope and concomitantly graver risks. Bills such as the EARN IT Act, STOP CSAM Act, and Kids Online Safety Act (KOSA), while ostensibly about improving child safety online, would in fact pose a significant threat to Americans’ cybersecurity, digital privacy, and online speech rights, especially those of at-risk groups such as sex workers, abortion seekers, and LGBTQ+ individuals.
Those bills have foundered under their own ponderous weight, bogged down by highly contentious provisions that have (thankfully) stalled them in Congress. That the REPORT Act attracted little attention and passed the Senate by unanimous consent suggests that if they truly want to improve child safety online, legislators can be more effective by crafting narrow bills focused on dry but impactful changes (such as extending a mandatory preservation period) than by showboating with messaging bills that generate too much controversy to pass.
Yet it is the REPORT Act’s comparatively benign nature that makes its potential drawbacks harder to see. The current system for reporting CSEA faces two major issues: the overwhelming number of reports to the CyberTipline putting untenable strain on NCMEC and law enforcement, and the incentives for service providers to report non-violating content, at their users’ expense. The REPORT Act would not ameliorate these known issues; indeed it would make them worse. But amidst this year’s heated policy fights over other, more prominent pieces of legislation, this little bill’s risks largely slipped under everyone’s radar (including mine).
I think these flaws are fixable. With some tweaking, the REPORT Act could be that rarest of creatures: a “child safety” bill that actually helps children without sacrificing everyone’s digital rights. There’s an opportunity for consensus among stakeholders who have been intractably at loggerheads over KOSA, EARN IT, and their ilk. With the bill now in the House’s hands, it’s time to pay more attention to the REPORT Act. If we’re careful, we might just get this one right.