The CNIL found Google liable for providing information in a fragmented and generic manner, and for using pre-ticked boxes in the account's personalization settings.
According to the CNIL's findings, Google scattered information about its data processing operations across several documents, some of which are not easily accessible. Furthermore, Google designed its interfaces so that the information is fragmented, requiring users to click through buttons and links to reach the relevant details. The CNIL noted that the volume of information users must read before they can identify the data processing operations is excessive. This makes it difficult for users to understand which processing operations are being carried out and what their scope and consequences are.
The information Google provides to users is neither easily accessible nor comprehensible, in violation of Article 12 of the GDPR. The CNIL acknowledged that providing exhaustive information at the first level might not be practical, but held that the information provided must still be clear and comprehensible. In particular, the CNIL found that the purposes of the processing operations are described far too generically given their scope and consequences, and that these descriptions do not allow users to gauge the extent of the processing or the degree of intrusion into their private sphere. The description of the data collected was also found to be imprecise and incomplete, in violation of Article 13 of the GDPR.
Google's wording does not distinguish between personalized advertising and other forms of targeting, and the account's personalization settings are pre-ticked by default, which is treated as the user's agreement to the processing of their data. This violates Article 4(11), Article 6(1)(a), Article 7, Article 12, and Article 13 of the GDPR, which require that information provided to users be clear and comprehensible, and that users' consent to the processing of their data be freely given, specific, informed, and unambiguous.
The CNIL concluded that the consent obtained by Google was not valid, due to the absence of a positive action and the use of an opt-out mechanism. It also found that Google's terms and conditions did not allow for specific consent and prevented users from making granular choices about processing. As a result, the CNIL imposed a penalty of €50 million and required the decision to be published. The decision was confirmed by France's supreme administrative court, the Conseil d'État, on 19 June 2020.
Parties: NGO La Quadrature du Net (LQDN) and noyb (complainants); Google LLC (respondent)
Related deceptive patterns
Obstruction is a type of deceptive pattern that deliberately creates obstacles or roadblocks in the user's path, making it more difficult for them to complete a desired task or take a certain action. It is used to exhaust users and make them give up, when their goals are contrary to the business's revenue or growth objectives. It is also sometimes used to soften up users in preparation for a bigger deception. When users are frustrated or fatigued, they become more susceptible to manipulation.
Forced action involves a provider offering users something they want - but requiring them to do something in return. It may be combined with other deceptive patterns like sneaking (so users don't notice it happening) or trick wording (to make the action seem more desirable than it is). Sometimes an optional action is presented as a forced action, through the use of visual interference or trick wording. In cookie consent interfaces, forced action is sometimes carried out through "bundled consent". This involves combining multiple agreements into a single action, and making it hard or impossible for a user to selectively grant consent.
Preselection employs the default effect cognitive bias: a psychological phenomenon where people tend to go with the option that is already chosen for them, even when other choices are available. Providers know this and often use it to take advantage of consumers. A common approach is to show a pre-ticked checkbox, though there are various other ways of doing this, including putting items in the user's shopping cart or pre-selecting items in a series of steps. There are several reasons why this is a powerful deceptive pattern. Firstly, there's simply the matter of awareness: users have to notice the preselection, read it, and work out what it means. If they don't, they'll scroll past completely unaware of the implications. Other cognitive biases may also be employed in this deceptive pattern. For example, the content may be written to make users feel that other people like them would accept the default, so they should too (targeting the social proof bias). Alternatively, the content may use an authority figure to pressure users into accepting the default (targeting the authority bias).
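As a minimal illustration of why preselection is so effective, the sketch below models a settings form where untouched fields keep their default value. The field and function names are invented for illustration and do not come from any real interface:

```python
# Hypothetical form model: shows how a pre-ticked (preselected) option
# converts user inaction into apparent agreement.

def submit_form(defaults: dict, user_changes: dict) -> dict:
    """Merge the user's explicit changes over the form's defaults.

    Any field the user never touched keeps its default value, so with a
    pre-ticked box, doing nothing counts as opting in.
    """
    return {**defaults, **user_changes}

# Deceptive design: personalization is pre-ticked (opt-out).
pre_ticked = {"ads_personalization": True}

# A user who submits the form without changing anything "agrees" by default.
result = submit_form(pre_ticked, user_changes={})
print(result["ads_personalization"])  # True: no positive action was taken

# Opt-in design: the default is unticked, so agreement requires
# an explicit, affirmative action by the user.
opt_in = {"ads_personalization": False}
result = submit_form(opt_in, user_changes={"ads_personalization": True})
print(result["ads_personalization"])  # True only after an explicit action
```

The asymmetry is the whole pattern: in the opt-out version, silence is counted as agreement; in the opt-in version, agreement only exists if the user acts.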
Sneaking involves intentionally withholding or obscuring information that is relevant to the user (e.g. additional costs or unwanted consequences), often in order to manipulate them into taking an action they would not otherwise choose.
Consent is a voluntary agreement by an individual to the processing of their personal data, given after being informed of its specific purposes and conditions.
The legal bases for processing personal data are performance of a contract, compliance with legal obligations, protection of vital interests, the controller's legitimate interests, and the data subject's consent.
For consent to be valid it must be freely given, specific, informed, and unambiguous, and the data subject must be able to withdraw it at any time.
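The conditions above can be sketched as a simple validity check. This is an illustrative model only, assuming invented record fields; it is not a legal test:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Illustrative model of a consent event (fields are assumptions)."""
    freely_given: bool     # no bundling or coercion
    specific: bool         # granular, per processing purpose
    informed: bool         # clear information provided beforehand
    positive_action: bool  # unambiguous affirmative act (no pre-ticked box)
    withdrawable: bool     # can be withdrawn at any time

def is_valid_consent(c: ConsentRecord) -> bool:
    # Every condition must hold for consent to serve as a legal basis.
    return all([c.freely_given, c.specific, c.informed,
                c.positive_action, c.withdrawable])

# A pre-ticked box fails the positive-action condition, echoing the
# CNIL's finding that an opt-out mechanism cannot produce valid consent.
pre_ticked = ConsentRecord(freely_given=True, specific=False, informed=False,
                           positive_action=False, withdrawable=True)
print(is_valid_consent(pre_ticked))  # False
```

Note that the check is conjunctive: failing any single condition, such as the absence of a positive action, invalidates the consent as a whole.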
Individuals must be given transparent information about, and easy access to, the processing of their personal data, including the right to obtain a copy in a clear and commonly used format.
When collecting personal data, controllers must provide their identity and contact details, the purposes and legal basis of the processing, information about recipients, the retention period, and the data subject's rights.