DSA: Proposed Redress Obligations Could Put Good Content Moderation Practices at Risk
Even with the best intentions and first-rate procedures or tools in place, mistakes can occur when platforms moderate content. If problems are quickly detected and resolved, however, their adverse impacts can largely be mitigated. An effective, user-friendly, and timely dispute resolution system is therefore essential. The EU’s Digital Services Act (DSA) aims to create this framework, but concerns remain that the current proposals could have major unintended consequences.
Impeding, not improving, content moderation practices
European politicians often say that the same rules should apply online as offline. When a consumer seeks redress for a problem with an offline service, they have several layers of potential escalation. First comes an informal attempt to find a mutually satisfactory solution between the service provider and the user. If that proves difficult, the complaint might be escalated within the hierarchy of the service provider. Litigation in court is a last resort once all other options have been exhausted. By the same logic, when a consumer wants to contest a decision made by an online platform, it makes sense to turn first to the internal complaint-handling system of the platform in question. Such systems follow the basic principles of customer service while being tailored to be cost-effective and time-efficient for both parties. When common ground cannot be found, an out-of-court solution could then be the way forward.
Sadly, not everyone engaging with online platforms acts in good faith. Platforms invest significant resources in internal complaint-handling systems to ensure a safe online environment and protect consumers. The DSA proposals on internal complaint-handling and out-of-court redress mechanisms currently contain loopholes that risk being exploited by abusive or criminal parties, or “bad actors”. For example, bad actors could use out-of-court redress mechanisms to arbitrate every content removal decision at a company’s expense, slowing down the process for legitimate seekers of redress. Rogue actors could also flood a platform with illegal content with the specific purpose of slowing down the platform’s content moderation efforts.
Recent amendments to the DSA dilute and extend the scope of the internal complaint-handling system to the point of jeopardising the functionality of the redress mechanism itself. The proposals grant users, consumers, and businesses the possibility to appeal and challenge online platforms’ content moderation decisions, from reductions in visibility to decisions to take no action. These last-minute amendments go far beyond the European Commission’s initial proposal.
EU Member States have proposed requiring human review at the various appeal steps. While human oversight is a pivotal component of the content moderation process, it would be almost impossible to review every complaint manually, given the volume of content uploaded and the broad range of decisions that can be appealed.
EU lawmakers want to ensure that the guiding principles of the offline world also apply online, while leaving flexibility to accommodate the specific features of the digital environment. The amount of content uploaded to the thousands of online platforms is massive, so the rules for online content moderation need to be operationally scalable across the millions of decisions platforms have to make. This is particularly important for small and medium-sized platforms operating in Europe. The DSA rules need to enable digital services to scale by providing them with a clear and proportionate framework for content moderation practices.
Legal fragmentation and uncertainty
The current proposals could lead to fragmentation, as the same content could end up being treated differently from one Member State to the other, depending on the decision applied by the national certified out-of-court bodies in that territory. Such a scenario would be extremely challenging for businesses, and in particular for the growth of small and medium-sized companies.
Furthermore, decisions made by out-of-court bodies will be binding on digital service providers. Surprisingly, neither users nor digital services will be able to appeal such decisions before regular courts. If an out-of-court body consistently applies an overly lax or overly stringent interpretation, harmonized application of the DSA throughout Europe cannot be ensured. It must be possible to challenge decisions in court to ensure legal certainty and a future-proof approach.
The DSA redress mechanism should safeguard the rights of European consumers and of businesses operating in the EU. It should avoid duplication and ensure consistency with existing European frameworks, e.g. the General Data Protection Regulation, the Audio-Visual Media Services Directive, the Copyright Directive, the Platform to Business Regulation, and the Directive on better enforcement and modernisation of Union consumer protection rules (Consumer Omnibus).
Lastly, when policymakers develop an effective, user-friendly, and timely dispute resolution system, they should take into account the heightened due diligence requirements and the systemic approach that the DSA already contains.
Europe will succeed in its digital transition by creating rules for the online world that are inspired by the offline world but leave room for digital specificities. Policymakers should ensure that the thousands of companies in scope, big and small, have a fair chance of actually complying with the new DSA rules. Redress and appeal systems should empower users and consumers while ensuring good content moderation practices in one harmonized approach across the EU single market.