The Ethics of Persuasion: Persuasive Technology and the Power Behind Design
A Research Paper submitted to the Department of Engineering and Society

Presented to the Faculty of the School of Engineering and Applied Science
University of Virginia • Charlottesville, Virginia

In Partial Fulfillment of the Requirements for the Degree
Bachelor of Science, School of Engineering

Danielle Newman
Spring 2020

On my honor as a University Student, I have neither given nor received unauthorized aid on this assignment as defined by the Honor Guidelines for Thesis-Related Assignments.

Advisor: Kathryn A. Neeley, Associate Professor of STS, Department of Engineering and Society

Introduction

Over the last decade, technology usage has increased significantly. The development of technology and of new technological systems has allowed the general public to access more information, accomplish more tasks, and engage in more opportunities. In particular, technology has opened up new possibilities for building information systems that influence users. This has led to the development of the Behavior Change Support System (BCSS), "a socio-technical information system with psychological and behavioral outcomes designed to form, alter or reinforce attitudes, and behaviors or an act of complying using coercion or deception" (Kulyk, p. 2), as a key construct for research in persuasive technology. Persuasive technology is a research field that studies how people are persuaded while interacting with computer technology. Although persuasive systems and technology can be effective in treating patients, they can also become biased and raise ethical issues, depending on the developer of the system. The introduction of design biases can hinder patient outcomes because people are susceptible to decision bias, which often makes it difficult for them to make self-beneficial choices.
According to Lee, behavioral economists have shown that an individual's decision-making process is influenced by various situational factors, such as the manner in which options are presented and the individual's emotional state at the time of the decision. Biased systems may appear to treat patients temporarily, but they will not permanently reduce symptoms; users may come to believe the system is effective when it is not actually helping them. By understanding biases and how they affect people's decisions, developers can apply tools to persuasive technology that promote healthy treatment. In this paper, I discuss the different design strategies in persuasive technology and assess their overall effectiveness in the pursuit of mental health treatment.

Part I: How Persuasive Technology Influences Human Behavior

The most important aspect of the effectiveness of persuasive technology is its ability to persuade users to accomplish a given task. By definition, persuasion is "human communication designed to influence the autonomous judgements and actions of others" (Simons). Persuasion is a form of attempted influence that seeks to alter the way others think, feel, or act (Simons). This modification of human behavior shows that the process of persuasion is, fundamentally, a non-rational process, dominated much more by the emotional and impulsive part of our nature than by the rational (MacPherson, 2018). By channeling behavior through emotion, persuasion has the opportunity to promote implicit bias. When an individual employs rational logic, the objective is either to discover or to demonstrate. The very act of trying to discover a conclusion implies that no preconceived conclusion already exists.
When logical demonstration is the aim, the whole course of reasoning is directed toward experimentation and furnishing proof of validity, whereas persuasion proceeds from previously held beliefs (MacPherson, 2018). Although persuasion can have positive effects, its use in persuasive technologies can have "undesirable consequences, employ problematic methods of persuasion, or persuade users to do things which cannot be morally justified" (Verbeek). The first step in analyzing the moral aspects of persuasive technologies is to conceptualize their impact on human beings. It is often assumed that persuasive technology affects only the behavioral aspects of an individual, but these systems can have unintended effects as well. For example, automobiles and highways helped create American suburbs, but they were not invented with the intent of persuading millions of people to commute daily (Berdichevsky, 1999). The unintentional outcomes that have occurred through the use of technology highlight the intricate connections between technology design and user behavior.

Naturally, human beings are efficient but imperfect information processors and decision makers. As a result, human behavior and decisions are often marked by systematic departures from logical, rational "norms." The interaction between technology and design can influence human behavior in extremely complex ways. As technology has become more involved and developed, more people have come to realize that there is a critical need to better understand the human side of emerging technologies, rather than just the technical side (Yan, 2019). The philosopher Bruno Latour has offered many interesting concepts for analyzing how technological artifacts can mediate human action. Latour has continuously pointed out that human behavior is, in many cases, prompted by the things people use.
Therefore, actions are not only a result of individual intentions but also a reflection of people's material environment. The concept Latour introduced to describe the influence of technology or artifacts on human action is the "script." Much as a script directs actors in a movie or a play, technological artifacts direct their users to act in a specific way when they use them. For instance, a speed bump or traffic signal carries the script "slow down" for those approaching it (Verbeek, 2006). Specific properties of technological devices and systems correlate closely with the behavior of the individuals using them. Technology has always had consequences for human behavior; therefore, in order to accurately comprehend human behavior, it is necessary to take into account the ways in which technology influences it. For this reason, developers and designers of technologies must focus on policy making and pay attention to the technological context of behavior in order to create technology that can positively influence users (Verbeek). Developers are required to conceptualize the ethical dimensions of technologies and the consequences that might result.

Part II: Applying Ethics to the Design of Persuasive Technology

Ethics is considered "a rational, consistent system for determining right and wrong, usually in the context of specific actions or policies" (Berdichevsky, 1999). Specifically, the ethics of engineering design aims to analyze the moral aspects of technology design in a systematic way. The largest consideration in design is the social impact the technology will have upon its arrival into society. Recent research in science and technology studies has shown that technologies profoundly influence the behavior and experiences of users (Verbeek). This directly forces developers to conceptualize this influence and to anticipate it in their designs, while taking into account the ethics of engineering design.
When introducing persuasive technologies that explicitly aim to influence behavior, there are specific ethical requirements that must be met in order to achieve morality. One of the most important requirements is that users be able to trust the technology they are using. In this context, trust means that the technology achieves its objectives and that the consequences of using it are not harmful and do not produce undesirable results for the user, unless the user is adequately informed (Verbeek). Incorporating ethical principles into the design process might force developers to make difficult decisions about which potential design features to keep, but maintaining a balance between morals and design is important. The more principles a designer violates, the more susceptible the final product becomes to being ethically problematic (Berdichevsky, 1999).

Analyzing the ethics of any specific persuasive act or design requires a systematic approach, beginning with a standard breakdown of persuasion and gradually incorporating persuasive technologies. To support this approach, Berdichevsky has proposed a framework that serves to analyze acts of persuasion "according to their motivations, methods, and outcomes – intended and unintended" (p. 54). The development of the framework (Figure 1) begins with a basic relationship between a persuader and the person being persuaded. This instance distributes responsibility to only two parties. However, persuasive technologies are intended to persuade; therefore, there need to be active intermediaries between the persuader and the persuaded individual (Berdichevsky, 1999). Central to this framework is the interaction among the persuasive technology, the persuader, and the persuaded (Figure 2). Each element in this interaction incorporates a specific point of reflection at which the individual can discuss morals: the designer with