Chapter 25: Our attempts so far have not been successful

Codes of ethics

Today, the ACM,1 AIGA,2 APA,3 UXPA4 and other industry bodies all have codes of ethics that – either directly or indirectly – forbid deceptive patterns. These codes provide standards to aim for, but we are nowhere near meeting them: despite our best efforts, they are generally ignored by the tech industry.

In Europe, article 5 of the Unfair Commercial Practices Directive (UCPD) states that a commercial practice is unfair if it distorts the behaviour of consumers and is ‘contrary to the requirements of professional diligence’.5 The UCPD Guidance goes on to state that the notion of professional diligence ‘may include principles derived from national and international standards and codes of conduct’.6 This means that codes of ethics may become a powerful instrument in preventing deceptive patterns in the EU. However, this hasn’t been applied in practice yet, so it’s hard to say how it will play out. If one day a judicial decision interprets codes of ethics as ‘the requirements of professional diligence’ under UCPD article 5, it will be a game changer: codes of ethics will suddenly become enormously important in the fight against deceptive patterns.

Today, businesses often use codes of ethics as a form of ‘ethics washing’. It’s commonplace to see the message ‘We care about your privacy’ on a user interface, immediately before the business tricks you into letting it track you and sell your personal data.

Education

Education plays a vital role in raising awareness and equipping people with the knowledge to recognize and push back against deceptive patterns. For example, in higher education for design and HCI, it’s commonplace to be taught about user-centred design, persuasion and design ethics. These subjects have been standard parts of the curriculum for years. Yet given the proliferation of deceptive patterns today, it’s safe to say that this education hasn’t stopped them from happening. The economic incentives for businesses to continue using deceptive patterns are just too strong. In other words, education is necessary but not sufficient. Of course we need education, but we need something more if we’re going to solve the problem of deceptive patterns.

Bright patterns

A number of voices have suggested responding to the problem of deceptive patterns with bright patterns7 or fair patterns8 – design patterns that are fair towards users. The idea is to fight against deceptive patterns by creating the opposite type of pattern, and then sharing these recommendations as widely as possible.

The sad fact is that we already have a lot of materials that teach designers and business owners how to engage in user-centred or human-centred design processes that result in helpful, usable and useful design patterns that assist users in achieving their goals. Hundreds of university courses, bootcamps and textbooks teach the concepts. There’s even an ISO standard on this topic.9 On their own, bright patterns are really just more educational materials that appeal to the reader’s moral code to do the right thing, and so far that approach hasn’t worked.

It’s tempting to respond to this by suggesting that perhaps bright patterns should be mandatory. The trouble here is the almost infinite variety of design possibilities for any given problem. Consider all the possible configurations of words, images, layouts, buttons and interactive components a design team might want to use – and then also consider all the possible goals they have been asked to bring together. They have their business objectives, various internal stakeholders asking for things, and of course they have to make the thing useful, usable and appealing for end users, otherwise it won’t get successfully adopted and used. Then, once the product goes live, it gets iterated. Data is collected from research and analytics, giving clues as to how to make the product perform better. Designs evolve. They’re improved, added to, tweaked and trimmed. Design in the digital age is never done, and innovation is an ongoing process.

If you forced the tech industry to use a mandatory bright pattern, you’d stop all of that from happening. You’d kill innovation and improvement overnight. So a legally mandated bright pattern should only be deployed in a very narrow situation – a key juncture when there’s a very high risk of harm to users. In fact, this is not a new idea. If you think back to your most recent major financial product (e.g. an investment, loan or mortgage), you were probably given a standardised document that was designed according to regulatory requirements. These documents might not look like much, but they’re intended to be bright patterns. Businesses use them because they’re legally required to, and the documents help prevent those businesses from bamboozling you into signing a contract that’s against your best interests.

So bright patterns aren’t quite as transformative a concept as they initially seem. They’re a useful educational tool and they’re already mandatory in certain narrow situations, but to stop deceptive patterns we need to go deeper, and look more closely at the business processes and practices that cause them to occur.

Naming and shaming

Naming and shaming is useful because it can lead to legal consequences. For example, if hundreds of users complain about a provider, this can draw attention from consumer protection groups, regulators and law firms, which can lead to enforcement actions or class action lawsuits.

One of the shortcomings of naming and shaming is that many users don’t do it, so the number of complaints can be far smaller than the number of people who have suffered negative consequences. Deceptive patterns are usually designed to be subtle. This means many users don’t even know they’ve suffered negative consequences (perhaps a few dollars for an add-on they didn’t intend to buy), so they’re not aware they’ve got anything to complain about. In other words, a carefully designed deceptive pattern may never get named and shamed, precisely because it is so well hidden. Also, not everyone wants to speak out publicly – some people are shy or introverted. They might blame themselves for ‘being stupid’ enough to be taken in, and may feel a sense of shame or embarrassment. If they complain privately to the business, the world never finds out about it (unless the business is forced to reveal it in a legal case). Other people might intend to name and shame, but are unable to find the time. If the consequences are minor (e.g. just a few dollars lost), they might notice and feel irked, but not irked enough to justify the effort of complaining.

All of this means that naming and shaming seems to be effective only for the most noticeable deceptive patterns. It’s reasonable to assume that many deceptive patterns never get named and shamed. In summary, naming and shaming is useful, but it’s not powerful enough – we need something more.

Industry self-regulation

Supporters of industry self-regulation tend to claim that it is faster and more flexible than government regulation, takes advantage of contemporary industry expertise, and reduces the administrative burden on governments. Anyone who has been on the receiving end of tedious government bureaucracy knows what bad regulation feels like, so this perspective has some appeal. However, the idea of self-regulation is popular among industry lobbyists because it leaves the door open for superficial gestures and performative compliance, while continuing with whatever profitable practices went before...


Since 2010, Harry Brignull has dedicated his career to understanding and exposing the techniques that are employed to exploit users online, known as “deceptive patterns” or “dark patterns”. He is credited with coining a number of the terms that are now popularly used in this research area, and is the founder of the website deceptive.design. He has worked as an expert witness on a number of cases, including Nichols v. Noom Inc. ($56 million settlement), and FTC v. Publishers Clearing House LLC ($18.5 million settlement). Harry is also an accomplished user experience practitioner, having worked for organisations that include Smart Pension, Spotify, Pearson, HMRC, and the Telegraph newspaper.