Disruptive Competition Project

Content moderation practices: Where will the DSA draw the line?

April 13, 2022


We’re entering crunch time for the EU negotiations on the Digital Services Act (DSA), and there are still some important details to iron out. The DSA will define what users can see and how consumers buy and interact online. It will frame how digital services operate and moderate content in a transparent way for years to come.

Amidst the global pandemic and an unstable geopolitical environment, there has been growing scrutiny of how content can shape public opinion and inform or misinform society. Moderating online content is a delicate and complex process that must tread a fine line between free expression and timely takedowns of “illegal” and “legal but harmful” content. Even with the best intentions and first-rate procedures or tools in place, mistakes can occur. However, if problems are quickly detected and resolved, any adverse impacts can be mitigated.

The DSA must create the right conditions to support free speech, enable the takedown of problematic content, and provide an effective redress mechanism. However, a number of proposals on the table could undermine platforms’ ability to moderate content effectively and in a timely manner, keep users safe, and promote trust online.

Information without inundation

One of policymakers’ key objectives is to help consumers and users better understand content moderation practices; in other words, why and how some content is displayed. However, it is crucial that any system providing this information does not produce a stream of notifications that overwhelms both the user and the system itself. As it stands, recent amendments propose notifying users of any action affecting content ranking, even when the action is benign or simply reflects personalisation of the user experience.

On top of this, online platforms can demote problematic content to limit user exposure to it. Notifying users about the demotion of harmful content, e.g., disinformation, may have the opposite effect to what policymakers are aiming for. It could equip bad actors with information on how to game the system and drive large-scale misinformation narratives.

Efficient redress mechanisms

Despite the good intentions, there is a risk that the new rules could inadvertently overwhelm companies’ redress mechanisms at the expense of legitimate claims. If platforms had to offer redress options for any content demotion or to any flagger, complaint mechanisms would become bogged down, and intermediary services might be unable to manage the resulting tsunami of user redress claims in practice. Their actions against harmful content might also be jeopardised. A more balanced approach would be a redress mechanism focused on protecting free expression and limited to removals or account suspensions.

Let out-of-court settlement succeed

The phrase “what is illegal offline should also be illegal online” floats around Brussels often and has been visible throughout the negotiations on the DSA. While the two environments are quite different, there are characteristics and practices in the offline world that should be replicated online. Offline, there is a pathway of escalation when seeking redress: it can range from a simple complaints process, to mediation, to legal action. By the same logic, when a consumer wants to contest a decision made by an online platform, it makes sense to first have recourse to the internal complaint-handling system of the platform in question. Such systems are tailored to be cost-effective and time-efficient for both parties. The out-of-court dispute settlement mechanism should be designed as a second-line option.

There also needs to be clarity on what exactly can be taken to an out-of-court settlement procedure. Ideally, it should be reserved for serious cases, such as the termination of consumer accounts or of service provision to consumers.

The DSA represents a big change for the wider digital ecosystem in Europe. It is critically important that policymakers enable effective implementation and help contribute to a successful and innovative digital economy for many years to come.
