New Proposals Weaken the EU Data Protection “Gold Standard”
Not a year goes by without the European Union inadvertently rewriting its own General Data Protection Regulation (GDPR), from the e-Privacy Regulation all the way to the Digital Markets Act. While the political agenda of the day may motivate those changes, bridging the gap between the disciplines involved is becoming increasingly important. Absent that, the EU risks letting its data protection "gold standard" turn rusty.
The GDPR came into force four years ago, the fruit of intense negotiations to strike a careful balance between protecting individuals' data and leaving breathing room for innovation. For businesses and public bodies handling personal data, three important principles sit at the heart of the GDPR: transparency (say what you do), accountability (do what you say), and user empowerment (let users control their data).
Fast forward to the present day: new EU proposals and laws are set to fundamentally disrupt Europe's data protection rulebook in all kinds of ways. While those proposals and laws often state that they are "without prejudice" to the GDPR, a closer look reveals laws that (i) selectively pick certain parts of the GDPR while purposely omitting others, (ii) miss important data protection safeguards, (iii) openly conflict with Europe's data protection rules, and (iv) upset the careful balance the GDPR strikes between innovation and data protection.
Consent pop-ups everywhere
A common theme across new proposals is the generalization of pop-up consent boxes in all kinds of scenarios. Following some of the latest language circulated in various proposals, users could be confronted with consent obligations to:
- Run any connected products or services via the most popular virtual assistants (Digital Markets Act)
- Allow basic email functionalities such as storage of received and draft emails, spelling corrections, predictive typing, text-to-speech, speech-to-text, follow-up actions suggestions (e-Privacy Regulation)
- Store email attachments onto a separate cloud storage service run by the email service provider (Digital Markets Act)
- Receive protection against bad actors’ fraudulent attempts to lure them outside an ecommerce platform’s chat and payment system (e-Privacy Regulation)
- Use any app, news media, or online service funded by ads (Digital Markets Act). This would come on top of the cookie banners we have all been experiencing (e-Privacy Directive / Regulation).
- Sync browsing histories and account information across devices’ browsers (Digital Markets Act)
This is a non-exhaustive list. Suffice it to say that the frustration people experience with cookie banners is nothing compared to the tsunami of consent boxes about to hit European shores. While some of those proposals are designed to disable large tech companies' business models, requiring consent for every bit of innocuous and expected processing would equally disable effective data protection. Every one of us knows that the proliferation of consent forms on every new website we visit is counterproductive. Not only does it degrade the user experience, it numbs people into clicking on whatever is presented to them first, without any real sense of empowerment or meaningful transparency. Users who have empowered themselves with privacy-enhancing browsers, technologies, or plugins would suffer this consent fatigue every time they visit a service.
Consent mechanisms are a useful tool for users in certain situations: when the service does not already offer privacy-enhanced options, or when users could not reasonably expect the processing to take place. But as data protection authorities put it, "[the EU's data protection framework] does not exclude the possibility, depending on the context, of other legal grounds perhaps being more appropriate from both the [company's] and from the [individual's] perspective." The GDPR indeed contains other grounds of processing, each with its own conditions, safeguards, and limitations, alongside a comprehensive set of contractual, organisational, and technical obligations for companies to comply with, and different user control mechanisms for each ground of processing.
A narrow cut-and-paste of the GDPR misses the broader picture and does, in fact, prejudice Europe's data protection framework. To put it bluntly, if consent were a silver bullet against harmful data processing, the GDPR would not be 99 articles long.
Weakening safeguards for personal data accessed by public bodies
Several recent proposals seek to facilitate public bodies' access to various kinds of data, including personal data, for instance to fight illegal content online or to ensure the safety of products sold online.
In an ideal world, those proposals would meet the conditions set out in Article 23 GDPR, including identifying the purpose of the disclosure, the safeguards to prevent unlawful use or further disclosure, the risks for individuals, and the categories of personal data processed.
In reality, none of those proposals meet the GDPR conditions for lawful access to personal data held by companies.
Take, for example, Article 15(c) of the Data Act proposal, which allows public bodies to request access to personal data for an undefined range of purposes in the absence of specific legislation. While the proposal does attempt to provide some high-level safeguards against abuse and unlawful access, public bodies retain full discretion to decide for which purpose and which public interest they may obtain personal data. Further, the draft Data Act neither defines the categories of personal data involved nor spells out clear retention periods or the measures public bodies should take to safeguard the rights and freedoms of data subjects.
Yet the Data Act scores better than the proposed General Product Safety Regulation (GPSR), which does not seem even to consider the personal data and security implications when market surveillance authorities seek access to the interfaces of online marketplaces, or when they require online marketplaces to facilitate data scraping from their platforms under Article 20(5) and (6).
Under Article 31 of the Digital Services Act (DSA), "vetted researchers" and potentially "vetted not-for-profit bodies" would be able to access data necessary to conduct research on "systemic risks". A similarly broad provision applies to the DSA enforcement agencies, allowing them to "monitor and assess compliance with [the DSA]." A handful of paragraphs do set minimum requirements for who may qualify to receive data from platforms, but the exact conditions under which data may be shared with "vetted researchers" would be spelled out in a delegated act, without adequate legislative scrutiny.
Catch-22 on conflicting legal obligations
This is probably the most worrisome development, where competition and industrial policy objectives openly clash with users' data protection rights. Under the GDPR, users in Europe are entitled to request that their data be moved from one service to another, regardless of the service provider. The Data Act, however, prohibits users from moving their data to any company designated as a gatekeeper under the DMA. Since the GDPR makes no such exception, companies sending and receiving personal data at a user's request would have to decide whether to break the Data Act or the GDPR.
Other proposals such as the proposed EU Artificial Intelligence Act suggest reinventing the GDPR wheel, including how companies should assess AI technologies involving personal data processing that may create a “high risk” for users.
All these developments pose serious compliance challenges for companies and undermine users' ability to enjoy meaningful protection of their personal data. However well-intended the motivations behind those proposals may be, the repeated attempts to rewrite the GDPR are likely to cast doubt on Europe's ability to adopt laws that withstand the test of time and, ultimately, on its global normative influence.
While the European Data Protection Supervisor is leading efforts to bridge the gap across multiple disciplines to ensure effective enforcement in the digital world, similar efforts need to be pursued from the outset of each legislative proposal as Europe’s web of digital regulation continues to grow.