
Protecting Children While Safeguarding Privacy and Fostering New Safety Tools

September 27, 2022

In May 2022, the European Commission proposed new rules to prevent and combat child sexual abuse (CSA) both online and offline. The proposal puts forward new obligations for online service providers to detect, remove, and report, in real time, any known or new child sexual abuse material (CSAM), as well as the grooming or solicitation of children. 

In response, a broad range of experts and stakeholders voiced concerns, including Germany’s privacy chief and Minister of Digital Affairs, European data protection authorities, and civil society organisations. They all fear that the proposed EU rules, while focused on a worthy goal, would inadvertently establish an oppressive surveillance system that puts the privacy of Europeans at risk. 

The European Parliament and EU Member States are now each assessing and amending the Commission’s proposal. Given the concerns raised, EU policymakers and decision-makers should keep three elements in particular in mind: protect children, respect privacy, and allow for innovative detection technologies that can keep pace with bad actors’ attempts to game the system.

Protect children

Child protection is a collective responsibility that we, as a society, share. How to achieve this goal in practice, however, is a complex and sensitive matter — especially when it comes to taking measures to protect children online. 

The tech industry is already actively addressing the dissemination of CSAM. Indeed, leading digital service providers work together with NGOs to develop solutions to disrupt the online exchange of CSAM and prevent the sexual exploitation of children. Think, for example, of the Technology Coalition, ICT Coalition, WeProtect Global Alliance, INHOPE, and the Fair Play Alliance.

Tech firms have also developed a wide range of proactive initiatives tailored to their respective online services. These include voluntary scanning for known CSAM, partnering with expert bodies, developing innovative new technologies to detect previously unknown CSAM, and funding research into the detection of behaviour indicative of child exploitation (such as grooming). 
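To make the idea of scanning for known CSAM more concrete, here is a minimal sketch of how an uploaded file could be checked against a hash list supplied by an expert body. The hash value, file contents, and use of a plain cryptographic hash are illustrative assumptions only; real deployments typically rely on more sophisticated techniques, such as perceptual hashing, that also match visually similar images.

```python
import hashlib

# Hypothetical hash list of known CSAM, as maintained by an expert body
# (e.g. a hotline or clearinghouse). The value below is a made-up example.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def compute_hash(file_bytes: bytes) -> str:
    """Hash an uploaded file (a simplified stand-in for perceptual hashing)."""
    return hashlib.sha256(file_bytes).hexdigest()

def is_known_csam(file_bytes: bytes) -> bool:
    """Return True if the upload matches an entry on the hash list."""
    return compute_hash(file_bytes) in KNOWN_HASHES

# Example: a provider checks an upload before it is distributed further.
if is_known_csam(b"...uploaded file bytes..."):
    print("Match found: flag for human review and reporting.")
```

The key limitation the sketch makes visible is that matching against a curated hash list only identifies material that is already known; detecting new material or grooming behaviour requires different, and typically more intrusive, techniques.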

Likewise, digital service providers have developed all kinds of innovative tools that enable children to use the Internet, access content, and interact with others online in a safe, secure, and private manner. These tools and industry initiatives have already delivered significant public benefits, contributing to the successful investigation and prosecution of sex offenders around the world.

The EU’s proposed CSA Regulation holds great potential to build on these industry efforts and complement existing frameworks to combat and prevent online child sexual abuse. One way to do that is to ensure that companies can still perform scanning voluntarily.

Respect privacy

Equally important, the new CSA rules need to clarify how scanning and filtering obligations would be consistent with the EU-wide ban on general monitoring, recently reconfirmed in the EU Digital Services Act (DSA), as well as with privacy and data protection laws. Adoption of the current proposal would introduce a de facto monitoring obligation for online service providers. Without further amendments, the CSA Regulation would thus contradict important EU privacy safeguards and principles. 

The encryption of data traffic (including end-to-end encryption) should not be threatened or undermined either, as it plays a crucial role in providing the private and secure communication that today’s users, including minors, demand and expect. Clear safeguards have to be put in place to ensure that any scanning obligations neither compromise the security of a service nor require it to be redesigned.

The proposed CSA Regulation also needs to make clear how tech firms should carry out risk assessment and mitigation measures in line with EU data protection rules. For example, the Commission proposal requires age verification of users, which would involve collecting a substantial amount of sensitive personal information from all users. In practice, users might have to provide legal documents, such as a passport, simply to use a digital service. Alternative, less privacy-intrusive age assessment measures exist. Age assurance, for instance, allows companies to infer a user’s likely age from signals such as their behaviour, language use, content browsed, and network of friends, as illustrated in the sketch below. This more proportionate alternative to age verification can achieve the same policy goals while still respecting users’ privacy. 
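The sketch referenced above contrasts the two approaches. The signal names, thresholds, and function names are hypothetical and chosen purely for illustration; the point is that age assurance works from aggregate behavioural signals a service may already hold, whereas document-based verification requires collecting sensitive identity data from every user.

```python
from dataclasses import dataclass

@dataclass
class BehaviouralSignals:
    """Hypothetical, aggregated signals a service might already hold."""
    avg_session_minutes: float
    uses_teen_slang: bool
    follows_school_accounts: bool

def verify_age_with_document(passport_number: str, birth_date: str) -> bool:
    """Document-based age verification: requires sensitive identity data
    (passport number, date of birth) from every single user."""
    raise NotImplementedError("would call an identity-verification provider")

def is_likely_minor(signals: BehaviouralSignals) -> bool:
    """Age assurance: estimate a likely age band from behavioural signals,
    without collecting identity documents. Thresholds are illustrative only."""
    score = 0
    score += 1 if signals.uses_teen_slang else 0
    score += 1 if signals.follows_school_accounts else 0
    score += 1 if signals.avg_session_minutes > 120 else 0
    return score >= 2

# Example: apply stricter safety defaults when a user is likely a minor.
user = BehaviouralSignals(avg_session_minutes=150,
                          uses_teen_slang=True,
                          follows_school_accounts=False)
if is_likely_minor(user):
    print("Apply minor-safety defaults (e.g. restricted messaging, no ads).")
```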

Allow innovative detection technology to keep up with perpetrators

Perpetrators and other malicious actors are always trying to find new ways to bypass protections and abuse the system. Ensuring that detection technologies can evolve and remain effective over time is therefore crucial to fighting the dissemination of child sexual abuse material and the exploitation of children.

That is why the CSA proposal should not prescribe the use of the specific technical solutions or methods that exist today. Doing so would only hamper future innovation, while perpetrators will not stop trying to game the current system. Instead, the new rules should introduce a transparent framework that incentivises the development of innovative safety tools, as this is the only way to guarantee future-proof solutions. 

The CSA Regulation proposed by the Commission has an important role to play in combatting and preventing the sexual abuse of children online in the European Union and beyond. At the same time, however, it is important that these new rules respect the EU ban on general monitoring and safeguard people’s privacy, as well as other fundamental rights, without undermining data encryption. Innovation should also be allowed to play a bigger role in order to deliver future-proof rules and tools. 

Digital service providers stand ready to work with EU decision-makers to develop effective and workable rules, reiterating their strong commitment to the fight against child sexual abuse material.
