Written Comments Received During the Third 15-Day Comment Period


From:
To: Privacy Regulations
Subject: RE: Comment on proposed regulatory amendments
Date: Tuesday, October 13, 2020 2:43:22 AM

From:
Sent: Monday, October 12, 2020 12:58 PM
To: '[email protected]' <[email protected]>
Subject: Comment on proposed regulatory amendments

The proposed addition to § 999.315 seeks to further complicate the opt-out right regulations, which are already overreaching and invite fraud and abuse, by imposing an additional series of confusing stipulations that would make it significantly harder for online businesses to comply in good faith with the regulations.

First, the stipulation proposed in (h)(1) for counting a number of steps is confusing, nonsensical, and completely arbitrary. It would significantly penalize, to no good purpose, businesses like mine that use webforms as a means of processing CCPA requests. By setting an arbitrary standard for number of clicks or number of steps, this stipulation would arbitrarily penalize the use of CAPTCHAs or other means to ensure that the webform is being submitted by a human user rather than bots, who are drawn to webforms like moths to a flame. Since the regulations are written to require that businesses respond promptly to ALL requests, even obviously fraudulent ones, this expectation would devastate small businesses like mine. I have only modest online traffic, but if I post a webform without a CAPTCHA or other means of separating human users from bots, I may get HUNDREDS of obviously fraudulent spam submissions a day. As a sole proprietor, I simply do not have the time to handle that volume of responses.

Furthermore, this stipulation would effectively require that all means of submitting requests involve the same number of steps, which is obviously and fundamentally ridiculous. For a consumer who has an established ongoing relationship with a business -- for example, a customer who logs into an account with an online retailer -- the process of submitting an opt-in or opt-out request may be as simple as a single click on their account settings page; in that case, the business already knows who the consumer is and has mechanisms in place for managing their information. For a website visitor who does NOT have an established relationship with the business, they will almost certainly need to indicate to the business who they are (and that they’re human) so that the business can respond to and process their request. Once a business has received an opt-out request from a given consumer, processing an opt-in request from the same individual is an inherently simpler process. These procedures clearly, logically, NECESSARILY involve a different number of steps, so to stipulate that they not only shouldn’t but may NOT by law require a different number of steps is absurd. That is not practical, practicable, or enforceable, and represents a further unwarranted overreach by OAG. I strongly recommend that (h)(1) be struck in its entirety. If OAG attempts to revise the wording of this provision in an effort to clarify this mess, it’s likely to compound rather than resolve the issues it presents.

The proposed example in (h)(5) is in some respects even more concerning.
I grasp that the intent is to discourage businesses from “burying” opt-out instructions in voluminous text, but stipulating that “the business shall not require [the consumer] to search or scroll through the text of a privacy policy or similar document or webpage” would effectively allow OAG to set arbitrary, undefined expectations for what constitutes excessive “searching or scrolling.” Even a fairly straightforward Do Not Sell My Personal Information webpage, containing specific instructions for submitting requests, may require a fair bit of scrolling if a consumer accesses the page from a mobile phone rather than a desktop computer. It also threatens to penalize businesses for minor technical errors, such as an anchor link (that is, a hyperlink pointing to a specific anchor at a specific position on a given webpage) that fails to correctly resolve due to connection issues beyond the business’s reasonable control.

I do not object in principle to the proposed text of section (h), but the illustrative examples offer a disturbing indication that OAG’s intention is to find ways to arbitrarily penalize businesses for minor procedural issues. Many businesses are striving in good faith to meet the often confounding expectations established in these regulations, but OAG seems determined to make that as difficult as possible. My recommendation is to strike (h)(1) and (h)(5) in their entirety.

Regarding the proposed addition to § 999.326, the proposed change is, refreshingly, a straightforward and sensible clarification of the existing text.

From: Adam Schwartz
To: Privacy Regulations
Subject: EFF comments on proposed Cal DOJ regulations re CCPA (OAL file no. 2019-1001-05)
Date: Tuesday, October 20, 2020 12:46:38 PM
Attachments: 2020-10-20 - EFF comments re Cal DOJ proposed regs re dark patterns.pdf

Salutations. EFF submits the attached comments in support of the "dark patterns" regulations proposed by the California DOJ at Section 999.315(h) of the third set of proposed modifications of CCPA regulations, published on October 12.

Sincerely,
-Adam

--
Adam Schwartz | Senior Staff Attorney
Electronic Frontier Foundation
815 Eddy St. | San Francisco, CA 94109
Pronouns: he/him/his

ELECTRONIC FRONTIER FOUNDATION

October 20, 2020

BY EMAIL ([email protected])

Re: EFF comments on proposed Cal DOJ regulations on “dark patterns” (OAL File No. 2019-1001-05)

Salutations:

The Electronic Frontier Foundation (EFF) writes in support of the proposed regulations from the California Department of Justice (DOJ) to protect against what are commonly called “dark patterns.” These are manipulative user experience designs that businesses use to trick consumers into surrendering their personal data. Specifically, we support the proposed regulations at Section 999.315(h), within the third set of proposed modifications of CCPA regulations, which the California DOJ published on October 12.

The California Consumer Privacy Act (CCPA) created a right of consumers to opt out of the sale of their personal data. Businesses might use dark patterns to hamstring this CCPA right. The proposed DOJ regulations will secure this right by stopping dark patterns.
Among other things, the proposed regulations would:

• Require opt-out processes to be “easy” and to “require minimal steps.”
• Ban opt-out processes “designed with the purpose or having the substantial effect of subverting or impairing a consumer's choice to opt-out.”
• Limit the number of steps to opt out to the number of steps to later opt back in.
• Ban “confusing language” such as “double negatives” (like “don’t not sell”).
• Bar businesses from requiring consumers to search or scroll through a document to find the opt-out button.

For more on EFF’s opposition to dark patterns, please see: eff.org/deeplinks/2019/02/designing-welcome-mats-invite-user-privacy-0.

For the DOJ’s proposed regulations, please see: oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-text-of-third-set-mod-101220.pdf?

Sincerely,
Adam Schwartz
Senior Staff Attorney

815 EDDY STREET, SAN FRANCISCO, CA 94109 USA
phone +1.415.436.9333  fax +1.415.436.9993  eff.org

From: Zoe Vilain
To: Privacy Regulations
Cc: Pierre Valade
Subject: To the attention of Deputy Attorney General Kim - Comments with regards to CCPA
Date: Tuesday, October 27, 2020 9:06:48 AM
Attachments: 20201027 - 2121 Atelier Inc - comments 3 on CCPA to California GA.pdf

Dear Deputy Attorney General Kim,

Please find attached a letter to your attention containing our comments regarding the third set of proposed modifications to the CCPA regulation. I am available for any queries.

Best regards,
Zoé Vilain
Jumbo Privacy
www.jumboprivacy.com

Jumbo Privacy
2121 Atelier Inc.
32 Bridge Street, 2nd Floor
Brooklyn, NY 11201
USA

Lisa B. Kim
Deputy Attorney General
California Department of Justice
Consumer Law Section – Privacy Unit
300 South Spring Street, 1st Floor
Los Angeles, CA 90013
USA

October 27th, 2020

By email ([email protected])

Subject: Written comments regarding the proposed CCPA regulations

Dear Deputy Attorney General Kim,

We write to you concerning the third set of proposed modifications to the California Consumer Privacy Act (“CCPA”) made on October 12th, 2020. As mentioned in our previous letters to you, dated respectively February 25, 2020, and March 27th, 2020, 2121 Atelier Inc. d/b/a Jumbo Privacy has been acting as registered Authorized Agent in California for California residents, thanks to the introduction of such a role in the CCPA on Feb 1, 2020. Jumbo Privacy notably represents California consumers who request deletion of their personal information from consumer-selected businesses falling under the scope of the CCPA. Requests sent to a business by Jumbo Privacy on behalf of a consumer all contain the identification of the consumer and a signed mandate, executed through and stored by a trusted third-party certifier, authorizing Jumbo to act on behalf of the consumer.

As of the date of this letter, 73% of the businesses we send Requests to are refusing to comply with our Requests, on the ground that they do not accept third-party requests to delete personal information and/or that they require the consumer to take further action directly. Jumbo Privacy has therefore been pushing back against such refusals by quoting section 1798.135 of the CCPA and § 999.315(e) of the California Attorney General text of Regulations and indicating that such refusals are a restriction of consumers’ rights. We are concerned that the proposed modifications to the CCPA might severely restrict the efficiency and opportunity for consumers to mandate an Authorized Agent.
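Editor's note on the mechanics described in the letter above: neither the CCPA nor the OAG regulations prescribe a format for authorized-agent requests, but a minimal sketch of the kind of payload Jumbo describes — the consumer's identification plus a signed mandate held by a trusted third-party certifier — might look like the following. All field names, the certifier URL, and the example data are hypothetical.

```python
# Illustrative only: a hypothetical structure for an authorized-agent deletion
# request, mirroring the elements the letter describes (consumer identification
# and a reference to a signed, third-party-stored mandate). Not a prescribed format.
from dataclasses import dataclass, asdict
from datetime import date
import json


@dataclass
class SignedMandate:
    certifier_url: str   # where the executed mandate is stored (hypothetical certifier)
    mandate_id: str      # identifier the business can use to verify the mandate
    signed_on: str       # ISO date the consumer signed


@dataclass
class AgentDeletionRequest:
    agent_name: str        # the registered authorized agent
    consumer_name: str     # identification of the California consumer
    consumer_email: str    # contact point the business already holds
    request_type: str      # "delete" per Cal. Civ. Code § 1798.105
    mandate: SignedMandate # proof of authority to act on the consumer's behalf


request = AgentDeletionRequest(
    agent_name="Jumbo Privacy",
    consumer_name="Jane Doe",
    consumer_email="[email protected]",
    request_type="delete",
    mandate=SignedMandate(
        certifier_url="https://certifier.example/mandates/abc123",
        mandate_id="abc123",
        signed_on=date(2020, 10, 1).isoformat(),
    ),
)

# A business receiving such a request could verify the mandate with the
# certifier before acting, rather than refusing third-party requests outright.
print(json.dumps(asdict(request), indent=2))
```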