Chapter 20: Forced action

Forced action is a category of deceptive pattern in which a business offers users something they want – but forces them to do something in return. This is a problem when the forced action runs contrary to a reasonable user’s expectations, or when it contradicts laws or regulations.

One of the most well-known and amusingly named types of forced action is ‘privacy zuckering’, named, of course, after Mark Zuckerberg.1 The user is tantalised by a service or product, and in the process of trying to get it they are tricked into sharing personal data with the business – and into giving the business permission to use that data for profit-making endeavours, like selling it, sharing it or using it for targeted advertising.

The issue here isn’t that data sharing, data sales or targeted advertising are necessarily bad – done correctly, they are legitimate business models. The issue is the lack of the user’s consent. It doesn’t count as consent if the user has been tricked or coerced: consent must be ‘freely given, specific, informed and unambiguous’ – the exact language used, in fact, in the EU’s GDPR.

Here’s an example of forced action, observed by security researcher Brian Krebs.2 When a user installs Skype on their iPad, they are taken through a series of log-in steps. One of these steps requires the user to upload their personal address book from their iPad to Skype (a division of Microsoft). As the screenshot below shows, there is no option to decline, and the page does not explain that the next step – the iOS permissions dialog – will actually give them the choice to decline, or that declining will not affect their ability to use Skype.3

Screenshot of the forced action deceptive pattern in the Skype iPad app (2022).
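For readers curious about the mechanics, it helps to know what iOS itself does at the next step. The sketch below is illustrative only – it is not Skype’s actual code, and the function name and completion-handler shape are my own – but the CNContactStore calls are Apple’s real Contacts framework API. The point it demonstrates: the system permissions dialog that this call triggers always offers a decline option, no matter what the app’s own priming screen implies.

    import Contacts

    // Illustrative sketch only – not Skype's code. Requesting contacts access
    // on iOS always goes through the system permissions dialog, which includes
    // a decline option ('Don't Allow') regardless of the app's priming screen.
    // (The app must also declare NSContactsUsageDescription in its Info.plist.)
    func requestContactsAccess(completion: @escaping (Bool) -> Void) {
        let store = CNContactStore()
        switch CNContactStore.authorizationStatus(for: .contacts) {
        case .notDetermined:
            // Triggers the iOS permissions dialog; the user can freely decline here.
            store.requestAccess(for: .contacts) { granted, _ in
                completion(granted)
            }
        case .authorized:
            completion(true)
        default:
            // Denied or restricted: the app should simply work without contacts.
            completion(false)
        }
    }

Declining at this point has no effect on Skype’s core functionality – which is exactly why a priming screen that offers no way to refuse is a deceptive pattern rather than a technical necessity.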

If we look at a subsequent step (below), we can see that the designers certainly know how to design a clear opt-out when they want to.4 The options ‘Yes, contribute’ and ‘No, do not contribute’ are equally weighted, obvious and easy to understand – a contrast that further highlights the forced action and coercive wording of the ‘Find Contacts Easily’ step (above).

Screenshot of a dialog box in the Skype iPad app asking the user for permission to share diagnostic and usage data. As well as two links, ‘Privacy & Cookies’ and ‘Learn more’, there are two blue buttons with white text, clearly labelled with ‘Yes, contribute’ and ‘No, do not contribute’.
Screenshot of a different Skype iPad app dialog, in which opting out is easy (2022).

So why is contact sharing something that users may want to opt out of? This is essentially a question about the right to privacy. The book Privacy’s Blueprint by Woodrow Hartzog (2018) covers this,5 including...

