Perspective

Moving Past 'Chat Control' to Solutions that Truly Protect Kids and Privacy

Ella Jakubowska / Nov 18, 2025

The CSA Regulation, the EU’s proposed law laying down rules to prevent and combat child sexual abuse, has been one of the most sharply contested legal proposals of recent years. In 2022, aiming to curb the spread of child sexual abuse material (CSAM) online, the EU executive proposed a bill that would scan digital communications and storage, mainly using artificial intelligence (AI) tools, to find and report CSAM. In autumn 2025, public awareness of this controversial law reached an all-time high across EU countries, as an important vote signaled a possible crunch point.

The vote was ultimately canceled due to a lack of support from EU Member State governments, and what came next surprised everyone. Following a significant shift by the Danish government, the possibility of moving forward productively is now within the EU’s reach: ambassadors for the EU’s Member States are set to agree next week to rule out the most controversial elements of their draft position, those that would amount to encryption-breaking ‘chat control’. For the first time, it seems possible that lawmakers could strike a deal to address the grave issue of online CSAM in ways that uphold the privacy and digital security of the general population.

Critics of the draft law have called its proposed scanning of private communications ‘chat control’. Proponents of the law lament that attention has been taken away from protecting children. Both of these things are true. Numerous expert analyses have found that the Commission's proposal relies on a fundamentally flawed understanding of technology, would likely amount to unlawful mass surveillance under EU human rights law, and might even harm those it seeks to protect.

It has taken significant resources to contest these serious flaws in the proposal, as well as to undertake the vital democratic exercise of holding the European Commission to account for repeated breaches of its duties, several confirmed counts of maladministration, and a formal admonishment for an illegal ad-targeting campaign.

Undoubtedly, this time would have been better invested in working together on how to tackle the terrible crime of child sexual abuse online, which severely violates many children’s rights. Whilst different stakeholders have made several attempts to build consensus over the years, in such a polarised landscape it has been hard to move past the false ‘children versus privacy’ dichotomy.

----

Once the three-and-a-half-year impasse is officially broken, many of the key details that have been overshadowed by the controversies will once again become the focus of discussions. This article does not claim to have all the answers (indeed, fully sufficient answers are lacking), but it seeks to chart possible directions for further exploration from a tech policy perspective.

First of all, some measures on the table in the draft CSA Regulation would already have a positive impact. As the European Parliament has recognized in its position, the law could establish an EU Centre for coordination, education and survivor empowerment. The EU Centre would also build a secure European database of existing abuse material that can be used to help identify and remove CSAM. Whilst experts are clear that this database should not be embedded on people’s devices due to significant security risks, and that strict guardrails are needed, such technology can be used in more limited and proportionate ways to flag, triage and otherwise tackle the spread of CSAM.
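To make that distinction concrete, below is a minimal, purely illustrative sketch in Python (the names and structure are hypothetical, not drawn from the proposal) of how server-side matching against a database of known material generally works: a provider compares a cryptographic hash of content it already hosts against hashes of verified material, rather than scanning anything on people’s devices.

```python
import hashlib

# Hypothetical illustration: a set of SHA-256 hashes of previously verified
# abuse material, as might be maintained centrally by an EU Centre. Real
# deployments often also use perceptual hashing to catch re-encoded copies;
# this sketch shows exact matching only.
KNOWN_MATERIAL_HASHES: set[str] = set()  # populated from the secure database


def flags_known_material(file_bytes: bytes) -> bool:
    """Return True if hosted content exactly matches known, verified material.

    This runs server-side, on content a provider already stores, not on
    users' devices. It cannot detect new, previously unseen abuse: it only
    recognizes copies of material that humans have already reviewed.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_MATERIAL_HASHES
```

The design choice is the point: matching of this kind is targeted at already-known material and leaves private communications and personal devices untouched, which is the sort of guardrail experts have called for.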

Europe urgently needs more research into what the smart, effective and proportionate use of tech to tackle CSAM could look like. This should be a key area of collaboration between child rights, child protection, technology, and digital rights groups.

The draft law’s proposal to strengthen obligations on providers to remove CSAM when they are made aware of it is also vital. Material that remains online lowers the bar for re-circulation, whereas getting it off the net has been shown to be extremely effective in limiting the spread of CSAM. Any such measures, however, need to be matched by an equal commitment from governments to do their part. Germany provides one example: huge volumes of CSAM that were reported to police were left online for years due to a lack of resources to take them down, and unofficial reports suggest that this is a systemic issue across Europe. Removal is a relatively simple action that would make a big difference.

Secondly, when it comes to internet policy, there is a lot that platforms should already be doing to make their services safer, but aren't. Big Tech platforms, for example, routinely nudge users, including young people, toward choices and default settings (such as public profiles) that risk their safety and expose them to possible abusers. Many social media providers are notorious for making their platforms deliberately addictive and harmful by design.

EU laws like the General Data Protection Regulation (GDPR), Audiovisual Media Services Directive (AVMSD) and the Digital Services Act (DSA) provide a wide range of tools and measures to tackle these problems. They require providers to embed privacy, safety and security from the ground up, including some enhanced measures to protect children. Enforcement of these laws is a vital mechanism in the fight against CSAM and for a safer digital world more broadly, but across EU Member States, enforcement of tech rules generally lacks political commitment and resources. Worse still, the European Commission’s current deregulation drive threatens these vital laws, which could take us even further away from a safer internet.

Third, the crucial remaining measure is to better equip the national authorities specialised in child protection, many of which are chronically under-resourced to fight CSA, to get to the root of the problem. In particular, this means equipping them to find and prosecute the criminal networks that profit from abuse. What many people don't know is that, as long as they follow due process, there are already ways for police to access encrypted conversations, for example through lawful, targeted access to a suspect’s device. The problem that police repeatedly report, however, is that this is too time-consuming and expensive.

Small-scale investigations like L'Enfant Bleu’s 2022 Undercover Avatar have further shown that, with sufficient resources, traditional-style undercover investigations against those suspected of creating or disseminating CSAM online can work incredibly well. Once again, this is expensive and time-consuming, but it provides real answers to the problem of CSAM without resorting to mass scanning or undermining encryption.

----

There is no single technological "silver bullet" for the serious problem of online CSAM. But that doesn't mean we must choose between "chat control" and nothing. As experts in child protection have long highlighted, fighting online CSAM requires a broad spectrum of solutions as part of an all-of-society approach. Taking a socio-technical lens to the problem can also help us understand where technology can play an assistive role, whilst recognizing its limits in tackling complex societal issues.

A meaningful solution requires measures at the level of the family, school, community, technology (including investing in vital child protection hotlines), policing, justice, and society. Alone, none of these measures will be enough. But together, they can be effective in preventing abuse and curbing the spread of this deeply harmful material.

We also desperately need more large-scale research and investment in how to prevent offending in the first place, stop people who have offended from offending again, and move towards “healing, not policing”. With the crime of CSA most often committed by adults known, often well known, to the child, solutions must center on primary prevention, rather than on digital manifestations of the crime once the abuse has already happened.

If the EU truly cares about tackling online CSAM, effective measures should never be considered too expensive. Substantial investment into multi-disciplinary online investigation capacity would go a long way to address the root, rather than the symptoms, of child sexual abuse. However, this must also be matched with reforms towards child-friendly and trauma-informed access to justice, given the long history of police and criminal justice systems failing survivors.

Now that the current CSA Regulation impasse is breaking, it's clear that there is a lot to do. Europe needs action on removal, bolstered by the savvy, targeted use of innovative tech, and a new EU Centre. Europeans need better enforcement of digital laws, increased funding and personnel for investigations into abuse, and whole-of-society reforms aimed at supporting survivors and, ultimately, preventing abuse in the first place.

Authors

Ella Jakubowska
Ella Jakubowska is head of policy at EDRi, Europe's biggest network of digital human rights groups, where she leads the organisation's strategic advocacy. She has worked on a range of EU laws relating to surveillance, encryption, privacy and non-discrimination, and frequently provides input and testimony.
