Chapter 30: Changes afoot in the European Union

The European Union is putting considerable energy into preventing deceptive patterns. Existing legal frameworks, namely the General Data Protection Regulation (GDPR) and consumer law, including the Unfair Commercial Practices Directive (UCPD), are already being deployed extensively against platforms that use deceptive patterns. What’s more, the EU has been working on two huge new laws that will extensively regulate big tech. By the first quarter of 2024, the Digital Services Act and the Digital Markets Act will both have come into force, and both contain specific provisions regulating deceptive patterns and manipulative design. This is a step forward from the GDPR and the UCPD, which require concepts like consent, transparency and unfairness to be interpreted and legally applied to deceptive patterns.

The Digital Markets Act

The Digital Markets Act (DMA) was created in March 2022 with the goal of ensuring fair and open digital markets in the EU.1 It targets big tech companies like Microsoft, Apple, Google, Meta and Amazon. Any company with either more than 45 million monthly active EU users or over €7.5 billion in annual EU turnover may be designated a ‘gatekeeper’ and is subject to obligations under the DMA. There are also some qualitative criteria that appear to be designed to prevent influential companies from sneaking under the size requirements.2 In the words of the European Commission, the DMA aims at ‘preventing gatekeepers from imposing unfair conditions on businesses and end users and at ensuring the openness of important digital services’.3

The DMA bans deceptive patterns when they’re used to undermine the other rules in the DMA (article 13). The other rules are very wide-ranging, so this means that the DMA is powerful in its scope regarding deceptive patterns. For example, the recitals of the DMA (the part of the legislation that explains how to interpret the provisions) clearly state that deceptive patterns are forbidden if they are used by gatekeepers to do any of the following:

  • Interfere with a user’s choice to be tracked or not for targeted advertising outside the gatekeeper’s main platform (recital 36, 37).
  • Nag users; that is, prompt them more than once a year to give consent for data processing, having previously ignored or refused the request (recital 37).
  • Interfere with a user’s choice to install third-party apps or app stores (recital 41).
  • Interfere with a user’s choice of settings, or their choice to uninstall any pre-installed apps (recital 49).
  • Interfere with a user’s ability to export their data from the gatekeeper’s platform in a format that can be imported to third parties (recital 59).
  • Make it more difficult for a user to unsubscribe from a service than it was to subscribe (recital 63).

This demonstrates how enormously consequential the DMA will be for tech companies that get categorised as gatekeepers. If a gatekeeper breaks the rules, the sanctions are potentially huge: up to 10% of the company’s total worldwide annual turnover, or 20% if they are repeat offenders. There are other sanctions possible too, ranging all the way up to having the gatekeeper broken up or kicked out of the EU entirely.

The Digital Services Act

The EU Digital Services Act (DSA) contains even more good news about deceptive patterns.4 It entered into force in November 2022, and is gradually being rolled out, with the parts about deceptive patterns becoming fully applicable by June 2023. The DSA has a layered system with the rules becoming stricter at each successive level.

The DSA contains provisions about deceptive patterns, but they only apply to the two highest tiers: ‘online platforms’ and – something of a mouthful – ‘very large online platforms’ and ‘very large online search engines’ (VLOPs and VLOSEs):

  • Online platforms: A service ‘that, at the request of a recipient of the service, stores and disseminates information to the public’. This includes online marketplaces (like Amazon), app stores (like Apple’s App Store and Google Play), collaborative economy platforms (like Uber) and social media platforms (like Facebook).
  • Very large online platforms (VLOPs) and very large online search engines (VLOSEs): VLOPs are the same as online platforms, only bigger, with 45+ million monthly active users. VLOSEs are search engines (like Google) that have 45+ million monthly active users.

The DSA’s provisions about deceptive patterns don’t apply to the lower tiers. So, the following are excluded:

  • Micro and small enterprises: Any business with fewer than 50 employees and a turnover of less than €10 million (unless the size of their user base makes them a VLOP or VLOSE).
  • Intermediary services: Offering network infrastructure – things like VPNs, DNS services, domain name registries, registrars, VOIP services, CDNs, and so on.
  • Hosting services: Offering cloud and web hosting, such as GoDaddy or Amazon Web Services (AWS).

As you can see, the layered nature of the DSA is a bit complicated, but the main point to take away is that the provisions about deceptive patterns apply to lots of big tech companies. Apple, Amazon, Uber, Google, Facebook – they’re all regulated by the DSA in some capacity.

Now that we’ve established who the DSA applies to, we can move on to the actual provisions in the DSA that regulate deceptive patterns. The term ‘dark pattern’ is defined in the DSA recitals (recital 67). To quote:

‘Dark patterns on online interfaces of online platforms are practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them. Providers of online platforms should therefore be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof. This should include, but not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests’

It’s notable that the recital states that deceptive patterns do not have to be intentional; they only have to be shown to have an effect on users (‘either on purpose or in effect’). This is also the case for unfair commercial practices under the UCPD, and it will make enforcement more straightforward. The recital then goes on to explicitly forbid certain deceptive patterns – though bear in mind that a ‘recital’ in EU law is not legally binding; it’s only intended to clarify how the law should be interpreted:

  • Misdirection: ‘presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory, or other components, when asking the recipient of the service for a decision’.
  • Nagging: ‘It should also include repeatedly requesting a recipient of the service to make a choice where such a choice has already been made’.
  • Hard to cancel: ‘making the procedure of cancelling a service significantly more cumbersome than signing up to it, or making certain choices more difficult or time-consuming than others, making it unreasonably difficult to discontinue purchases’.
  • Obstruction: ‘default settings that are very difficult to change, and so unreasonably bias the decision making of the recipient of the service, in a way that distorts and impairs their autonomy, decision-making and choice’ (recital 67).

Unlike recitals, ‘articles’ of an EU law are legally binding. In article 25 of the DSA, deceptive patterns are expressly forbidden:

‘Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.’

Although this provision is rather brief, the article immediately goes on to state that the European Commission may issue further guidelines to expand upon these rules in the future. Specifically...

      Since 2010, Harry Brignull has dedicated his career to understanding and exposing the techniques that are employed to exploit users online, known as “deceptive patterns” or “dark patterns”. He is credited with coining a number of the terms that are now popularly used in this research area, and is the founder of the website. He has worked as an expert witness on a number of cases, including Nichols v. Noom Inc. ($56 million settlement) and FTC v. Publishers Clearing House LLC ($18.5 million settlement). Harry is also an accomplished user experience practitioner, having worked for organisations that include Smart Pension, Spotify, Pearson, HMRC, and the Telegraph newspaper.