Protecting Children Online: Three Ways to Improve the New EU Framework to Fight CSAM
The protection of children on the Internet is a collective responsibility that requires all parts of society to work together. The European Commission, for its part, proposed new EU rules to prevent and combat child sexual abuse (CSA) in May 2022. Complementing today’s existing frameworks to fight online CSA, the EU proposal would introduce a new, harmonised European structure for assessing and mitigating the spread of child sexual abuse material (CSAM) online.
The European Parliament’s Internal Market and Consumer Protection (IMCO) Committee recently published its draft opinion on the Commission’s proposal, while the Council’s examination of the proposed CSA Regulation is still ongoing. If EU co-legislators want an effective CSA framework, then they should consider three key elements: allow voluntary scanning for CSAM to continue, clarify the scope of the new rules, and make sure that detection orders are targeted and effective.
1.) Allow voluntary scanning for CSAM
Regrettably, there is no one-size-fits-all solution to limit the spread of CSAM across the Internet. Providers of interpersonal communication services (ICS) – such as direct messaging and social media platforms – should therefore be allowed to continue the voluntary CSAM detection they currently perform as part of their existing risk-mitigation measures.
ICS providers are already proactively deploying a number of voluntary measures in this regard, which include the processing of communications metadata to scan for known CSAM, having regular exchanges with expert bodies, researching behaviour indicative of grooming, and developing new technologies to detect previously unknown CSAM.
That is why the proposed EU framework should provide a specific legal basis allowing ICS providers to continue performing these voluntary actions as part of the risk assessment and mitigation within their respective platforms. The current proposal, however, suggests that ICS providers would have to wait to receive a detection order before acting, which they would only receive after failing a risk assessment. That is because, under the ePrivacy Directive, ICS providers require an explicit legal basis to process communications metadata. In practice, this would mean that all the proactive work currently taking place would come to an end, likely leading to further abuse material being spread online.
2.) Clarify the scope of the proposal
In order to ensure the overall effectiveness of the CSA proposal, it is vital that risk-assessment obligations target those actors best placed to act. In other words, only those online services that present a high risk of abuse should fall within the scope of the new Regulation. Due consideration also needs to be given to the capabilities of each of the concerned services to scan, detect, and remove CSAM.
Cloud service providers, for example, encrypt the data on their servers as part of the service they provide and customers pay for. They have neither knowledge of, nor control over, the nature of the content they host. Requiring them to assess the risks of their service would force providers to scan all content in the cloud, likely breaching customers’ trust in the service. And if third-party messaging services with end-to-end encryption were required to scan all content shared by users, trust in these services would decline, not to mention the risk of the data transmitted being tampered with.
This means that the assessment of risks should be done by those parties who control the data – cloud customers and software application developers, for example – rather than by those who merely process it. Customers of cloud service infrastructure are, after all, the ones in control of the content they upload. Similarly, developers of software applications are best suited to properly identify the risk of CSAM being transmitted through their services. Of course, providers of relevant “information society services” (a term used by the Commission in its proposal) would still regularly perform risk assessments and mitigate any identified risks, as they already do today, but this specification would help clarify the difference in responsibilities between those who control data and those who merely process it.
3.) Make sure detection orders are targeted and effective
Detection orders for CSAM should be issued as a measure of last resort, only when it has become evident that a service provider has failed to meaningfully address and reduce the risks of harm on their platform. In the proposal as it stands, the process for issuing detection orders is still too complex and lengthy, and imposes burdensome requirements without taking into account the different capabilities of each service provider based on their size and resources.
Particular attention should also be paid to detecting previously unknown CSAM and the solicitation of children (known as “grooming”) via interpersonal communication services. In these cases, detection is difficult and heavily relies on probability, while the technology used still shows high error rates. Concern over the inclusion of unknown CSAM and grooming in the CSA Regulation has also been voiced by the European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) in their joint opinion on the proposal.
Given the importance of protecting Europeans’ right to privacy, the chosen approach will need to strike a careful balance between fighting these heinous crimes and making sure that the CSA Regulation is aligned with the ban on general monitoring recently introduced by the Digital Services Act (DSA).
Indeed, when evaluating the Commission’s original proposal, the European Parliament and EU Member States should make sure that the CSA Regulation allows service providers to continue with their proactive prevention measures, which have a proven track record of already making a difference.
While some of the latest proposals by the European Parliament’s IMCO Committee appear encouraging at this point, it remains vital to make the proposal more proportionate. Ensuring that the new rules apply to those digital service providers best equipped to identify and mitigate risks will bring more clarity for all involved, and allow Europe to further ramp up the fight against CSAM.