Tim Bernard recently completed an MBA at Cornell Tech, focusing on tech policy and trust & safety issues. He previously led the content moderation team at Seeking Alpha, and worked in various capacities in the education sector.
WIRED recently published a leaked document containing feedback solicited by the Council of the European Union from EU member states regarding the proposed Child Sexual Abuse regulation. At the heart of several of the questions asked of the national governments was how willing they would be to compromise on privacy in service of the goal of combatting child sexual abuse (CSA).
At present, the draft regulation would cover messaging platforms like WhatsApp and iMessage. A regulator would have the power to run its own detection protocols and/or impose technological solutions if self-assessments and mitigation strategies submitted by platforms were deemed inadequate. The privacy and security afforded by applications using end-to-end encryption (E2EE) may be at risk under the regulation as originally drafted if regulators choose to issue a detection order that interferes with it.
Some of the technologies required by detection orders may also constitute general monitoring obligations, including scanning all uploaded content for known or suspected child sexual abuse material (CSAM), and even more controversially, for solicitation of children (grooming). This would be a significant break with current EU law, which prohibits the imposition of general monitoring obligations in Article 8 of the recently enacted Digital Services Act and in Article 15 of the E-Commerce Directive.
As WIRED reported, Spain is on one end of the spectrum regarding privacy and encryption, proposing the outlawing of E2EE within its national borders, while Germany is on the other, insisting that the regulation forbid not only the weakening or breaking of E2EE but also its circumvention. A German MEP has also suggested amendments to protect E2EE and prohibit general monitoring obligations. The other countries stake out positions between these poles. The table below summarizes the position of each member state, as far as can be ascertained from the document, on key questions about encryption and privacy.
This consultation specifically asked for responses on the following:
- Excluding E2EE from detection order measures;
- Including audio communications in the scope of the regulation; and
- Including interpersonal communication (as well as public content) in the scope of the regulation.
(Other discussed issues less pertinent to this analysis include: extending the current regime of voluntary platform reporting of CSAM; processes and timelines; jurisdictional questions; and blocking of CSA-containing websites and services at the country level.)
It appears that the EU has shared details of proposed technologies that promise some level of detection for E2EE communications in, among other contexts, its Impact Assessment for the proposed regulation (pp. 291-309). The Netherlands cites variants of client-side scanning—a means of scanning content on a user’s device before it is encrypted or after it is decrypted—from the Impact Assessment as a solution to the problem of detecting CSAM in E2EE communications. (Apple proposed rolling out a version of this technology in 2021, but put a hold on those plans after opposition from civil liberties organizations.) The primary dangers of this solution are:
- Privacy: false positives could expose innocent private communications to platforms or authorities.
- Civil liberties: once these systems are in place, authoritarian regimes or other powerful interests may insist on adding non-CSA material to the registries of prohibited content.
Comments from some governments indicate that they do not yet have clarity about the effectiveness of these technologies. Despite these drawbacks, however, several member states appear to be pinning their hopes on these as-yet-theoretical technological solutions, or on others to be developed in the near future. This allows them to express some level of support for protecting E2EE alongside confidence that the regulation will successfully impede the spread of CSA content on encrypted platforms. Some countries took a different tack, suggesting that regulators should not actively interfere with E2EE while still holding platforms liable for any CSA material or activity, regardless of their use of E2EE.
Member states repeatedly raised questions about the feasibility of detection in the contexts at issue, in terms of technology, human resources, cost, and conflicts with standing privacy law at the Union and national levels. This uncertainty, combined with the broad range of opinions within the EU, leaves the future shape of the regulation unsettled.