'Dark Patterns': The Case for Regulatory Pluralism

Dr M.R. Leiser1

Abstract

'Dark patterns' is a term commonly used by the web collective to describe a user interface that manipulates users into doing something that they would not normally do. It is a coercive and manipulative design technique used by web designers when some sort of action is needed from a user - typically the initiation of personal data processing or an indication of agreement to a contract. As dark patterns can compromise legal requirements like consent and privacy-by-design, as well as legal principles found in both regimes, like fairness and transparency, this article analyses 'dark patterns' from a regulatory perspective. Two frameworks are critiqued: the European Union's regimes for data privacy and consumer protection. The paper also provides an overview of the enforcement measures available for the regulation of dark patterns. It concludes that a pluralistic approach, one that mixes the strengths of one regulatory regime while compensating for its weaknesses through the use of the other, is needed to harness dark patterns.

Keywords: dark patterns, consumer protection, data protection, law, regulation

Introduction

'Dark patterns' are 'interface design choices that benefit an online service by coercing, steering, and/or deceiving users into making decisions that, if fully informed and capable of selecting alternatives, they might not make'.2 Via specific design choices, many e-commerce and social media platforms prioritize capturing user attention while obfuscating information that might help users make more informed decisions. These techniques, including information overload, fine-tuned personalization, and distorted social cues, pave the way for the manipulation of users and compromise rational decision-making.
Scholarship from the fields of computer science, consumer law, and data protection has categorized the different ways users are tricked, dazzled, and decoyed into everything from providing consent to the processing of personal data to entering into transactions and making purchases that they would not undertake without the use of a dark pattern.3 By adopting creative metaphors like 'roach motel', 'misdirection', 'confirm-shaming', 'bait-and-switch' and so on to describe the tricks of the trade, scholarship from design ethics is largely responsible for increased awareness of the kinds of dark patterns used to manipulate users.4 However, the adoption of these terms by European regulators is inappropriate for a number of reasons. First, term inflation risks amalgamating all forms of manipulative design under the scope of one regulatory regime. Second, the term 'dark patterns' runs the risk of being overused to describe features of websites that we simply do not like or that are a matter of poor UX design. Third, the design literature is very precise: dark patterns are deceptive and manipulative choices made by designers to get users to do something; however, both the consumer and data protection frameworks are clear that one does not have to prove intent to hold a data controller or a trader accountable for failing to live up to their respective obligations under either framework. Accordingly, this paper moves away from the terminology of the design community and adopts the vernacular of European regulators for data and consumer protection.

The preventative nature of the regulatory framework for the protection of personal data requires data controllers to satisfy one of several conditions before processing.5 For example, users are required to provide a positive indication of their consent to processing6 or of their agreement to enter into a contract.7 Unsurprisingly, dark patterns have emerged as a manipulative means of nudging users toward satisfying the legal conditions required to process personal data.

1 Dr M.R. Leiser is Assistant Professor of Law and Digital Technologies at Leiden University. The author would like to thank Dr Mireille Caruana of the University of Malta, Dr Damian Clifford of the Australian National University, and Egelyn Braun, Team Leader on Unfair Commercial Practices at the EU Commission, for their helpful comments on this article.
2 Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M., & Narayanan, A. (2019). Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 81. See also the Norwegian Consumer Council (Forbrukerrådet) report on how Google uses dark patterns to discourage users from exercising their rights: 'Deceived by Design. How Tech Companies Use Dark Patterns to Discourage Us from Exercising Our Rights to Privacy' (27 June 2018) 19, at DECEIVED BY DESIGN (last visited 10 April 2020).
3 On computer science: Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns. Proceedings on Privacy Enhancing Technologies, 2016(4), 237-254; Fritsch, L. (2017). Privacy Dark Patterns in Identity Management. In Open Identity Summit (OID), 5-6 October 2017, Karlstad, Sweden (pp. 93-104). Gesellschaft für Informatik. On data protection: Nouwens, M., Liccardi, I., Veale, M., Karger, D., & Kagal, L. (2020). Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence. arXiv preprint arXiv:2001.02479; Clifford, D. (2017). On consumer protection: Lior Strahilevitz et al., Subcommittee Report: Privacy and Data Protection, Stigler Center Committee for the Study of Digital Platforms 22-23 (2019).
4 Dark patterns: submission by design? (last visited 18 April 2020).
Because it is seen as regulating data controllers' behaviour directly, the data protection regime is increasingly, and arguably erroneously, seen as applicable to the entirety of the user experience.8 This is not necessarily a shortcoming of the GDPR: regulating controllers is one way of ensuring the GDPR is enforceable, effective, and relevant as technology progresses. But these formal requirements contain very complex standards and are intended to be enforced by specialised government agencies.9 While the GDPR may offer the virtues of dependability and predictability (if adequately enforced), it often proves inflexible and inefficient, and its proponents are left to resort to creative interpretations of its provisions in order to get the right outcome. Against this backdrop, this paper critiques the ability of the European Union's regulatory frameworks (consumer and data protection) to constrain design techniques embedded in websites that facilitate both anti-privacy functionality and the entering into of contractual agreements that would not have happened without the use or influence of a dark pattern.

What are 'dark patterns'?

Most dark patterns can be categorized into those that obfuscate interface elements that could theoretically be used to protect user privacy, and those that hide disclosures that, had they been known and understood, might affect whether a typical user would enter into a transaction.

Why are dark patterns so problematic?

One aspect of the digital single market is to provide a competitive environment for data-driven businesses. Dark patterns contribute to a new kind of 'race to the bottom' as businesses become more willing to forgo other obligations in order to capture user consent and/or agreement. As this article will argue, while both consumer and data protection law apply at all stages of data collection, it is really only consumer law that will apply ex post.
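The way a single interface default can engineer the 'positive indication of consent' discussed above can be made concrete with a minimal, hypothetical sketch (not from the article; the function and parameter names are illustrative). Under the GDPR, silence or a pre-ticked box does not amount to consent, yet a pre-checked default converts user inaction into captured 'agreement':

```python
# Hypothetical sketch: how a pre-ticked checkbox (a classic dark pattern)
# turns user inaction into recorded "consent", whereas an unticked
# default requires an affirmative act from the user.
from typing import Optional


def consent_captured(user_action: Optional[bool], pre_ticked: bool) -> bool:
    """Return True if the interface records consent.

    user_action: True/False if the user actively ticked or unticked the
                 box, None if the user did nothing at all.
    pre_ticked:  the default state chosen by the designer.
    """
    if user_action is None:      # user takes no action whatsoever
        return pre_ticked        # dark pattern: the default does the work
    return user_action           # otherwise the user's own act decides


# Dark pattern: inaction is converted into "consent".
assert consent_captured(None, pre_ticked=True) is True
# Unticked default: inaction never yields consent.
assert consent_captured(None, pre_ticked=False) is False
# An affirmative act is respected under either default.
assert consent_captured(True, pre_ticked=False) is True
```

The sketch illustrates why regulators scrutinise defaults rather than intent: the designer's choice of `pre_ticked` alone determines the outcome for every user who does nothing, regardless of what any individual user wanted.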
Most dark patterns manifest themselves in user interfaces, but some can be found at system level. The risks arising from this type of system architecture design are amplified when its sources and intentions remain hidden: when users are subjected to customized architecture and dark-pattern tactics, these practices not only escape regulatory scrutiny, but users make decisions that go against their interests and, on occasion, harm the entire digital ecosystem. Other, more nefarious dark patterns exist within system architecture and are used to push users in certain directions. Some use design to deflect users away from making rational decisions about the use of their personal data (e.g. persistent pop-ups, attention-grabbing graphics, and insights from eye-tracking and neuromarketing) to nag users into accepting terms and conditions or granting consent. Others use information overload to overwhelm users with complex and confusing offers or choices. Some dark patterns brazenly place goods into e-commerce baskets, hide information from users, or make it appreciably harder to find out how to exercise one's rights against the data controller or trader. By creating manipulative menus, hiding costs away from users, and using pre-checked boxes and specific colouring and graphics in the UI, users can find it considerably harder to withdraw from something than it was to accept it.

5 General Data Protection Regulation, Art 6(1).
6 Art 6(1)(a).
7 Art 6(1)(b).
8 de Hert, P., & Gutwirth, S. (2009). Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action. In S. Gutwirth, Y. Poullet, P. de Hert, C. de Terwangne, & S. Nouwt (Eds.), Reinventing Data Protection? (pp. 3-44), at p. 44.
9 Rhoen, M. (2016). Beyond Consent: Improving Data Protection through Consumer Protection Law. Internet Policy Review, 5(1), 1-15.
These are used to benefit the trader's interests rather than the user's.10 Even more specific typologies of dark patterns have emerged from the fields of design ethics11 and psychology12. According to a classification system proposed