
Detecting CSA While Respecting Encryption: A Delicate Balance To Strike for the EU  

Main takeaways

  1. Fighting child sexual abuse (CSA) is crucial, but it can’t come at the expense of encryption
  2. Measures like client-side scanning risk undermining Europeans’ right to privacy
  3. Scanning online content for unknown child sexual abuse material (CSAM) and grooming poses unique challenges

Imagine the following: you go to a public toilet and are surprised to see that – due to recent legislation to promote “transparency” – all of the stalls now have see-through glass walls. You can still use your own cubicle, but everyone else can see you. Proponents say they have “nothing to hide” – and neither do you. But would you feel comfortable?

Now imagine the same happening to your private communications, such as your instant messaging apps. Well, there’s a real risk that new EU rules might end up doing exactly that.

The detection and prosecution of crime are without a doubt fundamental to giving people a sense of security and peace of mind. This is all the more true for crimes affecting children, such as CSA. Every part of society must therefore step up its efforts to prevent such heinous material from circulating on the internet.

However, the EU rules to prevent and combat child sexual abuse (CSA Regulation) proposed by the European Commission could soon require providers of online services to deploy invasive technological solutions in order to execute detection orders. This would effectively destroy the unique protections that encrypted communications, including end-to-end encryption, provide to users.

Right now, Members of the European Parliament and EU Member States are in the process of finalising their respective positions on the Commission’s controversial CSA proposal.

In trying to find a compromise, some have floated “alternative” solutions, such as client-side scanning: scanning content on the sender’s device, checking it against known harmful material, before the message is sent to the intended recipient. What this overlooks is that such scanning would still seriously undermine the robust privacy and confidentiality protections that end-to-end encryption provides, as the sketch below illustrates.
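
To make the mechanism concrete, here is a minimal, purely illustrative sketch of client-side scanning; the blocklist, hash choice, and message flow are hypothetical simplifications, not any real system’s design. Actual deployments would use perceptual hashes (which tolerate minor image changes) rather than a cryptographic hash, but the structural point is the same: the content is inspected in plaintext on the device, before encryption ever happens.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known harmful content
# (illustrative placeholder value, not real data).
BLOCKLIST = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

def fingerprint(data: bytes) -> str:
    # SHA-256 stands in for a perceptual hash, to keep the sketch self-contained.
    return hashlib.sha256(data).hexdigest()

def send_message(attachment: bytes) -> bool:
    # The scan runs on the sender's device, on the plaintext content,
    # *before* end-to-end encryption is applied.
    if fingerprint(attachment) in BLOCKLIST:
        # A real deployment might block the message or file a report here.
        return False
    # ...only now would the client encrypt and transmit the message...
    return True
```

Whatever matching technique is used, the decisive feature is that the inspection happens before encryption, which is precisely why experts argue it hollows out the guarantees end-to-end encryption is meant to provide.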

Indeed, in recent months experts and academics have told us that current detection technologies are simply not accurate enough to detect CSA or exploitation reliably. Given the sheer volume of material to be scanned, processing false positives would also require extensive resources and human effort. And what happens to people whose falsely flagged content is still awaiting human review? Will they be blocked from sending messages in the meantime?
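
A back-of-the-envelope calculation shows how quickly false positives add up at the scale of modern messaging platforms. The volume and error rate below are assumptions chosen purely for illustration, not measured figures:

```python
# Hypothetical figures, chosen only to illustrate the base-rate problem.
messages_per_day = 10_000_000_000  # assumed daily volume on a large platform
false_positive_rate = 0.001        # assumed 0.1% error rate, likely optimistic
                                   # for unknown CSAM or grooming detection

false_alarms = messages_per_day * false_positive_rate
print(f"{false_alarms:,.0f} messages wrongly flagged per day")
# -> 10,000,000 wrongly flagged messages per day, each of which would
#    need human review before any action could responsibly be taken.
```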

Not only is today’s technology too inaccurate, but it can also be circumvented. Over the past few years we’ve seen multiple instances where simple manipulation of images led to the content not being detected as CSAM. This is exactly why those debating the proposed CSA Regulation should exercise additional caution when calling for the detection of unknown CSAM and the solicitation of children (also known as “grooming”), especially given the added difficulty of detecting such content and the risk of breaching the ban on general monitoring enshrined in the Digital Services Act.
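
The brittleness of exact matching is easy to demonstrate: changing a single byte of a file yields a completely different cryptographic hash, so re-encoding, cropping, or slightly editing an image defeats an exact-match blocklist. (Perceptual hashes are more robust, but researchers have shown they too can be evaded with modest perturbations.) This sketch uses SHA-256 purely for demonstration:

```python
import hashlib

original = b"example image bytes"
modified = b"example image bytez"  # a one-byte change stands in for a
                                   # re-encode, crop, or tiny edit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(modified).hexdigest())
# The two digests are entirely unrelated, so an exact-match blocklist
# no longer recognises the modified file.
```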

Lately, we have seen many different stakeholders and parts of society sound the alarm about the dangers of introducing such backdoors or weakening encrypted systems. Warnings have been issued by computer scientists, academics, and a wide range of industry and digital rights representatives, among others.

They all agree: it is absolutely crucial that the EU co-legislators ensure the CSA Regulation’s obligations end up being proportionate to the known risks and explicitly exclude any weakening or prohibition of encryption, including (but not limited to) end-to-end encryption.

It goes without saying that we all want a world without CSAM. However, it’s unacceptable to rely on detection technology that is not mature enough to scan (personal) content, let alone to mandate measures that would make our private communications less secure.

EU lawmakers need to introduce appropriate safeguards, so that CSA prevention, detection, and removal can still take place in a proportionate way, without undermining users’ rights.
