End User Accounts of Dark Patterns as Felt Manipulation

COLIN M. GRAY, JINGLE CHEN, SHRUTHI SAI CHIVUKULA, and LIYANG QU, Purdue University

Manipulation defines many of our experiences as a consumer, including subtle nudges and overt advertising campaigns that seek to gain our attention and money. With the advent of digital services that can continuously optimize online experiences to favor stakeholder requirements, designers and developers increasingly make use of “dark patterns”—forms of manipulation that prey on human psychology—to encourage certain behaviors and discourage others in ways that present unequal value to the end user. In this paper, we provide an account of end user perceptions of manipulation that builds on and extends notions of dark patterns. We report on the results of a survey of users conducted in English and Mandarin Chinese (n=169), including follow-up interviews from nine survey respondents. We used a card sorting method to support thematic analysis of responses from each cultural context, identifying both qualitatively-supported insights to describe end users’ felt experiences of manipulative products, and a continuum of manipulation. We further support this analysis through a quantitative analysis of survey results and the presentation of vignettes from the interviews. We conclude with implications for future research, considerations for public policy, and guidance on how to further empower and give users autonomy in their experiences with digital services.

CCS Concepts: • Security and privacy → Social aspects of security and privacy; • Human-centered computing → Empirical studies in HCI; Empirical studies in interaction design.
Additional Key Words and Phrases: dark patterns, ethics, manipulation, user experience

Authors’ address: Colin M. Gray, [email protected]; Jingle Chen, [email protected]; Shruthi Sai Chivukula, [email protected]; Liyang Qu, [email protected], Purdue University, 401 N Grant Street, West Lafayette, Indiana, 47907.

Draft: October 20, 2020 (arXiv:2010.11046v1 [cs.HC], 21 Oct 2020)

1 INTRODUCTION

Digital products create and reinforce structures that guide our modern lives, providing a means to connect with others, share our experiences, perform mundane everyday tasks, and purchase goods and services. Modern accounts of these technologies range from the techno-optimistic to the downright dystopian—alternately channeling the power of technology to encourage equality and ease of access or the power of technology to corrupt and undermine individual agency. While modern Science and Technology Studies (STS) scholars point out that society has been shaped by the forces of marketing and advertising since the dawn of the industrial revolution, never before have systems been able to better track consumer behavior (via the “surveillance economy” [57]) and “intelligently” respond with customized and personalized prompts to optimize for monetary gain [31, 34, 45]. This desire for market and consumer control is not new, however; as far back as the 1960s, advertising designers recognized the tensions between contributing to societal good and working as shills for “trivial purposes”:

“We do not advocate the abolition of high pressure consumer advertising: this is not feasible. Nor do we want to take any of the fun out of life. But we are proposing a reversal of priorities in favour of the more useful and more lasting forms of communication. We hope that our society will tire of gimmick merchants, status salesmen and hidden persuaders, and that the prior call on our skills will be for worthwhile purposes.” [21]

Of course, little has changed in modern advertising, even after this statement—known as the First Things First manifesto—was renewed using similar language in 2000 [2]. Today, slick advertising appeals have been replaced by “growth hacking” approaches that seek to maximize business and shareholder value by acquiring clients at the lowest cost [7]. This market tendency—when paired with modern use of overtly persuasive and manipulative design patterns—has led to the emergence of “dark UX” and “dark patterns” as key drivers for building and sustaining a customer base [9–11, 26, 38]. These dark patterns are strongly linked to their manipulative counterparts in advertising over the past 50 years, while taking form in increasingly rapid and tailored ways [37].

In this paper, we build upon the concepts of manipulation and persuasion from the perspective of the technology end-user, seeking to supplement accounts of design ethics from designer [8, 38], organizational [24, 41, 42, 44], educational [15, 54, 55], and professional ethics [12, 22] perspectives with a description of how these persuasive systems impact user experiences and point towards felt qualities of manipulation. We seek to connect the discourses of dark patterns with end user experiences, with the goal of better articulating potential uptakes for public policy, user empowerment, and design ethics. We report on the results of a survey study distributed to English and Mandarin Chinese-speaking participants (n=169), augmented by nine follow-up interviews. Through these data collection methods, we sought to elaborate participants’ experiences of digital technologies that they felt were manipulative, including the qualities that caused them to suspect manipulation at varying levels of awareness, the emotions they felt when encountering these manipulative digital products, and the ways in which these participants were aware of the creation and creators of these manipulative experiences.
While we did not directly ask participants about their experiences with “dark patterns,” we used the participants’ “felt manipulation” as a proxy to investigate how dark-patterns-informed digital products might be experienced by end users. Building upon our survey and interview results, we offer an end user perspective on dark patterns, building on minimal existing literature from Maier and Harr [33] in this space to identify opportunities to better support user agency through public policy and designer/organizational practices.

The contribution of this paper is two-fold: 1) We document the felt experiences of manipulation shared by users of digital products, providing a new perspective on “dark patterns” that facilitates further work that builds upon the intersections of user trust, designer intent, and public policy. 2) We describe users’ perceptions of the creator of these manipulative interfaces or technology systems from multiple perspectives, including the projected role of designer ethics and responsibility in the creation of manipulative technologies.

2 BACKGROUND WORK

2.1 Manipulation in Digital Systems

Manipulation has been studied in numerous disciplinary contexts, often building upon and further specifying the conditions by which an act or experience is manipulative—beyond what we already colloquially understand to “feel” or “be” manipulative.
From a philosophy perspective, Ackerman [6] contends that for an act to be considered manipulative it must first involve “getting someone to go against what he finds natural or appropriate or is otherwise inclined to do.” From a critical discourse perspective, van Dijk [47] further positions manipulation as simultaneously dealing with “a form of social power abuse, cognitive mind control and discursive interaction”—implicating both social dimensions of interaction and a bi-directional act of discursive practice that is shaped “by cognitive and social dimensions.”

From a legal perspective, Waldman [49] focuses on aspects of cognitive bias that impact the kinds of strategies that digital products use to manipulate user behaviors. Waldman [49] links these legal concerns to their description of common psychological forms of manipulation that impact users’ decision making processes, including: anchoring (“the disproportionate reliance on the information first available when we make decisions”); framing (“the way in which an opportunity is presented to consumers—namely, either as a good thing or a bad thing”); hyperbolic discounting (“the tendency to overweight the immediate consequences of a decision and to underweight those that will occur in the future”); overchoice (“the problem of having too many choices, which can overwhelm and paralyze consumers”); and metacognitive processes (e.g., “perceiv[ing] difficulty as a signal of importance”).

In an HCI and CSCW context, relatively little work has directly addressed manipulation as a construct in relation to digital systems, although such a concept is clearly implicated in the study of restrictive online community practices, disinformation campaigns, and even in mechanisms used to discourage direct action against platforms.
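Of these biases, hyperbolic discounting admits a simple quantitative illustration. The sketch below is not drawn from Waldman [49]; it uses the standard one-parameter hyperbolic form V = A / (1 + kD) (Mazur) alongside exponential discounting for comparison, and the discount parameters k and r are arbitrary illustrative values, not empirical estimates:

```python
import math

def hyperbolic_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Subjective present value under hyperbolic discounting: V = A / (1 + k*D).

    k is an illustrative discount parameter, not an empirical estimate.
    """
    return amount / (1 + k * delay_days)

def exponential_value(amount: float, delay_days: float, r: float = 0.05) -> float:
    """Standard exponential discounting, V = A * exp(-r*D), for comparison."""
    return amount * math.exp(-r * delay_days)

# Hyperbolic discounting falls fastest at short delays and flattens at long
# ones -- the pattern behind "overweighting the immediate consequences" of a
# decision relative to its future consequences.
for d in (0, 1, 7, 30):
    print(f"delay={d:2d}d  hyperbolic={hyperbolic_value(100, d):6.2f}  "
          f"exponential={exponential_value(100, d):6.2f}")
```

A design that foregrounds an immediate benefit (e.g., “free trial today”) while deferring the cost (a charge after 30 days) exploits exactly this steep short-delay devaluation.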
Frequently, “nudging” is used to describe a light form of manipulation whereby a system maintains user agency but privileges one type of behavior or outcome over another [51, 52]. As stated by Weinmann et al. [51], “Even simple modifications of the choice environment in which options are presented can influence people’s choices and ‘nudge’ them into behaving in particular ways,” concluding that “there is no neutral way to present choices.” In contrast, other scholars such as Fogg [19] have actively called for the identification and harnessing of principles of persuasion, with the contention that these principles can be used to promote user experiences where behavior modification is desirable (e.g., increasing motivation, reducing addictive behaviors). However, the