Illegal Content: What Cloud Providers Can and Can’t Do To Detect and Remove It

October 18, 2022


Sexual crimes against children are a scourge, and in May 2022 the European Commission proposed new rules to prevent and combat child sexual abuse online (the CSA Regulation). These draft rules would require a wide range of companies to scan and monitor online private messages and user-generated content. Oddly enough, cloud infrastructure vendors would also have to scan their customers’ data to detect illegal content, despite being technically unable to do so. To ensure effective and swift action against child sexual abuse material (CSAM), the new EU rules should target those actors that are actually technically capable of detecting and removing such content.

Detection orders should target “data controllers”

As things stand now, the rules proposed by the European Commission mandate that all hosting service providers detect nefarious content their customers might store. In practice, however, not all hosting providers have the technical ability to access – or even control – the data hosted on behalf of their customers. The new Regulation should acknowledge this reality and clearly exempt hosting service providers from detecting CSAM whenever it is technically unfeasible.

For example, providers of cloud infrastructure and platform services only offer computing resources that enable business customers to build and run their own IT operations. Cloud vendors generally have no control over – let alone knowledge of – the content and nature of the data they process on behalf of their customers.

And for good reason: for the past two decades, cloud products have been specifically designed to leave cloud customers with sole control over their data, in order to guarantee a level of confidentiality and integrity for European data and IT systems that is equivalent to, if not higher than, that of processing performed on in-house IT infrastructure. Suddenly requiring cloud vendors to scan, monitor, and filter their customers’ data would undoubtedly jeopardise EU and global customers’ trust in data processing services in Europe.
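To illustrate why this is so, consider client-side encryption, a design commonly used to give customers sole control over their data: content is encrypted before it ever reaches the provider, which stores only ciphertext. The sketch below is purely illustrative, not any vendor’s actual system; the key and variable names are hypothetical, and it uses Python’s third-party cryptography package.

```python
# Illustrative sketch only: shows why a provider holding ciphertext
# cannot scan or classify customer content. All names are hypothetical.
from cryptography.fernet import Fernet

# The customer generates and keeps the key; it never leaves their systems.
customer_key = Fernet.generate_key()
cipher = Fernet(customer_key)

document = b"private customer file"
ciphertext = cipher.encrypt(document)  # encrypted *before* upload

# The infrastructure provider stores and serves only this opaque blob:
print(ciphertext)  # unreadable bytes without the customer's key

# Only the key holder (the customer) can recover the plaintext:
assert cipher.decrypt(ciphertext) == document
```

Under such a design, a scanning obligation imposed on the provider is a dead letter: the provider can neither read the data nor distinguish CSAM from lawful content.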

So, all this begs the question: how are cloud vendors supposed to comply with an order to detect illegal content on their servers and networks if they are technically incapable of doing so?

In order to avoid frustration for all parties involved – victims of online sexual abuse, public authorities, cloud vendors, and their customers – detection orders under this Regulation should exclusively be addressed to those who can effectively access and control the data they hold: the “data controllers”, not those merely providing data processing services.

Risk of excessive removal of lawful content

Infrastructure providers also typically receive much less precise orders when it comes to the removal of specific pieces of content. In practice, it is not technically possible for an infrastructure provider to take down or disable access to one specific piece of content because, as explained above, the provider does not have access to its customers’ content.

Should IT infrastructure providers face removal orders for pieces of content flagged by an authority, they may in practice be forced to remove thousands of pieces of lawful content in the process, especially under tight deadlines. Providers might even have to disable access to all content stored by their own customers, or uploaded by those who resell hosting services to other users (who are not direct customers of the provider concerned). This may be the case, for instance, for content uploaded by individuals or companies to a cloud storage service built on third-party infrastructure.
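To see why removal at the infrastructure layer is so coarse, consider a minimal sketch: an infrastructure provider typically addresses storage as opaque units (volumes, buckets, disk images) rather than individual files, so the smallest action available to it may be disabling an entire unit. The data structures and names below are hypothetical assumptions for illustration, not any provider’s actual system.

```python
# Hypothetical sketch: infrastructure-level takedowns are coarse because
# the provider sees only opaque storage units, not the files inside them.
from dataclasses import dataclass

@dataclass
class StorageUnit:
    unit_id: str
    blocks: list  # opaque blobs -- contents unknown to the provider
    enabled: bool = True

def comply_with_removal_order(units: dict, unit_id: str) -> None:
    """The provider cannot locate one flagged file inside a unit, so the
    only available action is disabling the whole unit -- taking down every
    lawful object stored alongside the flagged one."""
    units[unit_id].enabled = False

# One customer volume may hold thousands of unrelated, lawful objects.
units = {"vol-123": StorageUnit("vol-123", blocks=[b"..."] * 10_000)}
comply_with_removal_order(units, "vol-123")
assert units["vol-123"].enabled is False  # all 10,000 objects now offline
```

Disabling the unit in this toy model complies with the order, but at the cost of every other object it contains – precisely the over-removal the Regulation should avoid.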

To avoid excessive removal of lawful content, the CSA Regulation ought to clarify that removal orders for specific content should first be issued to the customer of the cloud service. Only when all other avenues have been exhausted should a removal order be addressed directly to the cloud infrastructure or platform provider. This would be a far more effective and proportionate procedure for removing illegal content, especially compared to the Commission’s current proposal.

This is exactly the approach the EU has already taken in other legislation, such as the rules on law enforcement access to electronic evidence (the e-Evidence Regulation). A similar approach for the CSA Regulation would ensure that removal orders, particularly urgent ones, reach the appropriate stakeholders – allowing them to react swiftly and remove CSAM expeditiously.

When tabling amendments to the Commission’s original proposal, the European Parliament and EU Member States should ensure that the new obligations set by the CSA Regulation target the right digital service providers – that is, those actually able to act swiftly. This would best serve the overarching objective of creating a safer and more trustworthy Internet.
