Dark Patterns: Four Key Principles the EU Must Get Right

Pop-ups claiming that you have won a ‘free prize,’ fake countdown timers for a special offer, or the automatic billing of users after a free trial without notice are not only manipulative ploys; they also fundamentally degrade the online user experience. Such practices, designed to trick users into doing something they would not otherwise do, are known as ‘dark patterns’, and there is wide agreement that they should be brought to an end.

However, what exactly constitutes a dark pattern must be clearly defined in order to avoid holding back intuitive and consumer-friendly interface design that serves legitimate purposes. Whether dark patterns are addressed under the existing EU consumer protection framework or as part of the review of the Omnibus Directive, there are four key principles the EU should get right.

Clearly define dark patterns

Everyone seems to agree that psychological tricks, deceit, and manipulation are deceptive design practices that should be prohibited. When defining the concept of dark patterns, the key question is where to draw the line between legitimate user interface design and design that is not.

Dark patterns should be understood as manipulative design choices that significantly distort the behaviour of the average user. Any bans should not target practices designed in good faith without the intention to undermine consumers, nor practices that are justified in specific circumstances. Think, for example, of requests for location access that allow users to update their preferences or awareness tools aimed at improving safety and privacy. 

Policymakers should thus ensure that the scope of any measures is limited to dark practices that have no legitimate purpose in any scenario, and they should aim to address the problem at large across the Internet. An inherently vague concept with no legal foundation would simply create confusion and legal uncertainty, which is why examples of dark patterns provided for guidance should always be supported by robust research.

Support innovation in product design 

Taking an unclear definition as the starting point for addressing this issue could unintentionally limit platforms’ functionality, security, and usefulness. Prescriptive, one-size-fits-all rules would harm technological progress and degrade the user experience of millions of Europeans. Brand and visual identity are important for product differentiation and user experience, for consumers and businesses alike. Policymakers therefore have to recognise that services often need to use a multitude of different visual identities and brands in order to remain identifiable and accessible to users.

Empower users to revisit their choices

Interfaces that remind users of their past choices can be legitimate and well intentioned. These choices can vary over time and context, reflecting different use cases and intentions. Users should indeed be able to revisit their choices when there is clear demand or user interest. This may be the case, for example, when users are occasionally asked to review their privacy settings, as per the European data protection authorities’ advice.

Strengthen and harmonise the EU framework, rather than creating conflicting rules 

Even though we are talking about a concept that is yet to be defined in EU law, the European Commission and data protection regulators are expected to issue no fewer than three separate sets of guidelines on dark patterns within the span of two years: the European Commission guidelines on consumer protection rules, new comprehensive guidelines on the General Data Protection Regulation from the European Data Protection Board (EDPB), and the upcoming Commission guidelines on the Digital Services Act.

Absent any effort to bridge the gap between all relevant policy considerations, these guidelines risk overlapping and contradicting each other, or, even worse, negating otherwise important policy goals. For instance, the draft EDPB guidelines effectively suggest prohibiting two-factor authentication for securing consumers’ online accounts, even though two-factor authentication is considered the ‘gold standard’ under the EU’s consumer protection framework.

As Europe’s web of digital regulation continues to expand, so does the risk of overlap and contradiction, unless all policy considerations, whether on dark patterns or any other cross-disciplinary issue, are scrutinised across the board and addressed in concert.

Indeed, any initiative to address dark patterns should clearly define what exactly constitutes a dark pattern, support innovation in product design, empower users to revisit their choices, and harmonise the EU framework instead of introducing conflicting rules.
