
The Ethics of Persuasion: Persuasive Technology and the Power Behind Design

A Research Paper submitted to the Department of Engineering and Society

Presented to the Faculty of the School of Engineering and Applied Science, University of Virginia • Charlottesville, Virginia

In Partial Fulfillment of the Requirements for the Degree Bachelor of Science, School of Engineering

Danielle Newman Spring 2020

On my honor as a University Student, I have neither given nor received unauthorized aid on this assignment as defined by the Honor Guidelines for Thesis-Related Assignments

Advisor: Kathryn A. Neeley, Associate Professor of STS, Department of Engineering and Society

Introduction

Throughout the last decade, there has been a significant increase in technology usage. The development of technology and of different technological systems has allowed the general public to access more information, increase productivity, and engage in more opportunities. In particular, technology has opened up opportunities to develop new information systems for influencing users. This has led to the development of Behavior Change Support Systems (BCSS), defined as "a socio-technical information system with psychological and behavioral outcomes designed to form, alter or reinforce attitudes, behaviors or an act of complying without using coercion or deception" (Kulyk, p. 2), as a key construct for research in persuasive technology. Persuasive technology is the research field that studies how people are persuaded while interacting with computer technology.

Although persuasive systems and technologies can be effective in treating patients, they can also become biased and raise ethical issues, depending on the developer of the system. The introduction of design biases can hinder patient outcomes because people are susceptible to decision bias, which often makes it difficult for them to make self-beneficial choices. According to Lee, behavioral economists have shown that an individual's decision-making process is influenced by various situational factors, such as the manner in which options are presented and the individual's emotional state at the time of the decision. Biased systems may appear to treat the patients that use them temporarily, but they will not permanently reduce their symptoms; such systems can, in effect, convince users that the system is helping rather than actually being of assistance. By understanding biases and how they can affect people's decisions, developers can effectively apply tools to persuasive technology that promote healthy treatment. In this paper, I discuss the different design strategies used in persuasive technology and assess their overall effectiveness in the pursuit of mental health treatment.

Part I: How Persuasive Technology Influences Human Behavior

The most important aspect of the effectiveness of persuasive technology is its ability to persuade users to accomplish a given task. By definition, persuasion is "human communication designed to influence the autonomous judgements and actions of others" (Simons). Persuasion is a form of attempted influence that seeks to alter the way others think, feel, or act (Simons). This modification of human behavior shows that the process of persuasion is, fundamentally, a non-rational process, dominated much more by the emotional and impulsive part of our nature than by the rational (MacPherson, 2018). Because persuasion licenses behavior through emotion, it has the potential to promote implicit bias. When an individual employs rational logic, by contrast, their objective is either to discover or to demonstrate. The very act of trying to discover a conclusion implies that no preconceived conclusion exists. When logical demonstration is the aim, the whole course of reasoning is directed toward experimentation and furnishing proof of validity, rather than toward persuasion, which is indicative of previously held beliefs (MacPherson, 2018).

Although persuasion can have positive effects, using persuasion in persuasive technologies can have "undesirable consequences, employ problematic methods of persuasion, or persuade users to do things which cannot be morally justified" (Verbeek). The first step in analyzing the moral aspects of persuasive technologies is to conceptualize their impact on human beings. It is often assumed that persuasive technology affects only the behavioral aspects of an individual, but these systems can have unintended effects as well. For example, automobiles and highways helped create American suburbs, but they were not invented with the intent of persuading millions of people to commute daily (Berdichevsky, 1999). The unintentional outcomes that have occurred with the use of technology highlight the intricate connections between technology design and user behavior.

Naturally, human beings are efficient but imperfect information processors and decision makers. Because of this, human behavior and decisions are often marked by systematic departures from logical, rational "norms." The interaction between technology and design can significantly influence human behavior in extremely complex ways. As technology has become more involved and developed, more people have come to realize that there is a critical need to better understand the human side of emerging technologies, rather than just the technical side (Yan, 2019). The philosopher Bruno Latour has offered many useful concepts for analyzing how technological artifacts mediate human action. Latour has repeatedly pointed out that human behavior is, in many cases, prompted by the things people use. Actions are therefore not only the result of individual intentions but also a reflection of people's material environment.

The concept that Latour introduced to describe the influence of technology or artifacts on human action is the "script." Just as a script directs actors in a movie or a play, technological artifacts direct their users to act in specific ways when they use them. For instance, a speed bump or traffic signal carries the script "slow down" for those approaching it (Verbeek, 2006).
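To make the notion of a script concrete, the short sketch below models an artifact's script as a property of the designed object itself, so that the behavior it solicits can be read directly off the design. This is a hypothetical illustration of Latour's concept, not drawn from any of the cited sources, and all names in it are invented.

    from dataclasses import dataclass

    @dataclass
    class Artifact:
        """A designed object carrying a 'script': the behavior its design
        prompts from users, independent of any user's prior intention."""
        name: str
        script: str  # the action the design solicits, e.g. "slow down"

    def prompted_behavior(artifact: Artifact) -> str:
        # The prompted behavior is read off the artifact's design,
        # not off the user's individual intentions.
        return f"A user approaching the {artifact.name} is prompted to {artifact.script}."

    if __name__ == "__main__":
        for a in (Artifact("speed bump", "slow down"),
                  Artifact("traffic signal", "stop or slow down"),
                  Artifact("medication reminder app", "take the scheduled dose")):
            print(prompted_behavior(a))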

Specific properties of technological devices and systems correlate closely with the behavior of the individuals using them. Technology has always had consequences for human behavior; therefore, in order to accurately comprehend human behavior, it is necessary to take into account the ways in which technology influences it. For this reason, developers and others who design technologies must focus on policy making and pay attention to the technological context of behavior in order to create technology that can positively influence users (Verbeek). Developers are required to conceptualize the ethical dimensions of technologies and the consequences that might result.

Part II: Applying Ethics to the Design of Persuasive Technology

Ethics is considered "a rational, consistent system for determining right and wrong, usually in the context of specific actions or policies" (Berdichevsky, 1999). The ethics of engineering design, specifically, aims to analyze the moral aspects of technology design in a systematic way. The most important aspect to consider in design is the social impact that the technology will have upon its arrival in society. Recent research in science and technology studies has shown that technologies profoundly influence the behavior and experiences of their users (Verbeek). This obliges developers to conceptualize this influence and to anticipate it in their designs, while taking into account the ethics of engineering design.

When introducing persuasive technologies that explicitly aim to influence behavior, specific ethical requirements need to be met. One of the most important is that users are able to trust the technology they are using. In this context, trust means that the technology achieves its objectives and that the consequences of using it are not harmful and do not cause undesirable results for users, unless they are adequately informed (Verbeek). Incorporating ethical principles into the design process might force developers to make difficult decisions about which potential features to keep, but maintaining a balance between morals and design is important. The more principles a designer violates, the more susceptible the final product becomes to being ethically problematic (Berdichevsky, 1999).


Analyzing the ethics of any specific persuasive act or design requires a systematic approach, beginning with a standard breakdown of persuasion and gradually incorporating persuasive technologies. To support this approach, Berdichevsky has proposed a framework for analyzing acts of persuasion "according to their motivations, methods, and outcomes – intended and unintended" (p. 54). The framework (Figure 1) begins with a basic relationship between a persuader and a person being persuaded, which distributes responsibility to only two parties. Persuasive technologies, however, are intentionally designed to persuade, and they therefore act as active intermediaries between the persuader and the persuaded individual (Berdichevsky, 1999). Central to the extended framework is the interaction between the persuader, the persuasive technology, and the persuaded (Figure 2). Each element of this interaction offers a specific point of moral reflection: the motivations of the designer, the methods of the persuasive technology, and the intended or unintended outcomes of the persuasion (Verbeek).

Figure 1. Framework for evaluating the ethics of a persuasive interaction in a traditional persuasive context.

Figure 2. Framework for evaluating the ethics of the more complex interaction of persuader, persuasive technology, and the party or parties being persuaded.


By introducing an intermediary relationship between technology and developer, the framework draws attention to the separation between the motivations attributed to the designer and the persuasive intent attributed to the technology. An actively persuasive technology becomes both a method of persuasion and the direct executor of persuasive methods. This calls into question the distribution of responsibility and whether the technology holds any share of it. Because technology can neither form its own intentions nor make rational decisions, it is incapable of being a moral agent; therefore, when a persuasive technology goes wrong, the blame falls on the designer rather than the device itself. This further underscores that design practices and the ethical principles associated with a technology are crucial to take into account during implementation, because they can be the determining factors in the technology's success (Berdichevsky, 1999).
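As a hypothetical illustration of how Berdichevsky's framework might be operationalized, the sketch below breaks a persuasive act into the designer's motivations, the technology's methods, and the intended and unintended outcomes, and flags the points of moral reflection discussed above. The structure follows the framework, but every name, field, and example value is my own assumption rather than part of the original.

    from dataclasses import dataclass, field

    # Methods the literature treats as always unethical (see Part III).
    ALWAYS_UNETHICAL = {"deception", "coercion"}

    @dataclass
    class PersuasiveAct:
        """One persuasive interaction, broken down per Berdichevsky (1999):
        the designer's motivations, the technology's methods, and the
        intended and unintended outcomes of the persuasion."""
        designer_motivation: str
        methods: list[str]
        intended_outcomes: list[str]
        unintended_outcomes: list[str] = field(default_factory=list)

    def ethical_concerns(act: PersuasiveAct) -> list[str]:
        """Collect points of moral reflection. Responsibility for every
        flag rests with the designer: the technology is not a moral agent."""
        concerns = [f"method '{m}' is always unethical"
                    for m in act.methods if m in ALWAYS_UNETHICAL]
        concerns += [f"unintended outcome to anticipate: {o}"
                     for o in act.unintended_outcomes]
        return concerns

    if __name__ == "__main__":
        act = PersuasiveAct(
            designer_motivation="improve adherence to a therapy plan",
            methods=["reminders", "self-monitoring"],
            intended_outcomes=["daily exercise completed"],
            unintended_outcomes=["over-reliance on the app"],
        )
        for concern in ethical_concerns(act):
            print(concern)

Keeping the unintended outcomes as an explicit field reflects the framework's point that designers, not devices, must anticipate and answer for them.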

The implementation of these ethical design practices can be seen in Behavior Change Support Systems (BCSS). BCSS combine properties of interpersonal interaction and mass communication in order to promote behavioral change (Ludden). As a whole, society faces severe problems in securing health for the public at large. Most technologies or systems that have been developed suffer from a lack of reach, a limitation that also characterizes BCSS, regardless of how accessible or expensive they are. In general, web-based interventions seem to miss the public at large. This selective reach is unintended and has, in many cases, prompted developers to try to increase reach through personalization and system design. Personalizing the content of a BCSS offers opportunities to attract different target groups, as sketched below.
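A minimal sketch of what personalizing BCSS content for different target groups could look like follows, assuming a simple mapping from a user profile to a message variant; the profiles, messages, and function names are invented for illustration and are not taken from the cited studies.

    from dataclasses import dataclass

    @dataclass
    class UserProfile:
        age_group: str         # e.g., "young_adult", "senior"
        prefers_visuals: bool  # comprehensibility factor (see below)

    # Hypothetical message variants tailored to different target groups.
    MESSAGES = {
        "young_adult": "Quick 10-minute mood check-in - anonymous and on your phone.",
        "senior": "Take a short guided exercise; your progress stays private.",
    }

    def personalize(profile: UserProfile) -> str:
        """Select intervention content for a target group; fall back to a
        generic message so no group is left without support."""
        text = MESSAGES.get(profile.age_group,
                            "Try today's brief wellness exercise.")
        if profile.prefers_visuals:
            text += " (Includes an illustrated walkthrough.)"
        return text

    if __name__ == "__main__":
        print(personalize(UserProfile("young_adult", prefers_visuals=True)))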

The University of Twente and the Centre for Infectious Disease conducted a study with the aim of identifying "the persuasive features and design factors that contribute to the use and uptake of existing and new health technologies" (Kulyk). In this study, four focus groups of young adults met for about 70 to 75 minutes to discuss and express their opinions on alternative concepts for technologies supporting health behavior and lifestyle change. Each focus group session was audio recorded and analyzed to determine the influence of various persuasive features (Figure 3).

Figure 3. Persuasive features derived from the focus groups, categorized according to the coding scheme used for data analysis.


Based on the analysis of the focus group data and on experimentation with BCSS as a technique, five general design factors were formulated: anonymity, interactivity, portability, source, and comprehensibility (Kulyk). Implementing each of these factors in a system is expected to increase the effectiveness of online interventions. Anonymity was agreed to be a crucial factor for health support because of the stigma associated with seeking aid for mental health. Several participants mentioned that interactivity was a missing feature or one that was not fully available, and it was agreed that the presence of interactivity and the level of interaction from participants are crucial to facilitating better engagement. Portability of the platform, such as having a mobile application, is preferred because it ensures privacy and effortless use. The source of the information provided by the system is important in determining the level of trust that users will place in the application; participants indicated that they would trust the health information more if the providing organization was familiar and clearly stated. By providing valid health information, a system makes users more willing to participate effectively in the intervention. Lastly, comprehensibility is important because users appreciate visually aided information, which is enticing and makes participation in the intervention more appealing (Kulyk).
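One way to picture how these five factors could inform a design review is the checklist sketch below. It treats the Kulyk factors as a simple rubric; the yes/no encoding and every field name are my own assumptions, not part of the study.

    from dataclasses import dataclass, fields

    @dataclass
    class BCSSDesign:
        """The five general design factors from the focus-group analysis,
        recorded as yes/no properties of a candidate system."""
        anonymity: bool          # can users seek help without being identified?
        interactivity: bool      # does the system engage users in dialogue?
        portability: bool        # e.g., available as a mobile application
        trusted_source: bool     # providing organization familiar and clearly stated
        comprehensibility: bool  # visually aided, easy-to-grasp information

    def review(design: BCSSDesign) -> list[str]:
        """Return the factors a candidate design still fails to satisfy."""
        return [f.name for f in fields(design) if not getattr(design, f.name)]

    if __name__ == "__main__":
        draft = BCSSDesign(anonymity=True, interactivity=False,
                           portability=True, trusted_source=True,
                           comprehensibility=False)
        missing = review(draft)
        print("Missing factors:", ", ".join(missing) or "none")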

Part III: Leveraging User Experience as a Means for Design

Persuasive technologies often carry a negative connotation because they convince users to perform certain actions. By taking into account the ethics of technological design and the effect it has on human behavior, we can change this perspective. Persuasive technology can be of great value, as long as it is applied correctly. According to Fogg, persuasive technology can "change people's lives in three related ways based on the functional role or how user respond to the computer system, which are; computer as a tool, as a media and as an actor" (Fogg, 2009). As a tool, computers and technology increase people's ability to perform a target behavior by facilitating the process; health monitoring devices, for example, help users monitor their health and improve it overall. As media, persuasive technologies can provide interactivity and narrative that create persuasive experiences for users, ultimately allowing users to rehearse behaviors and explore the causes and effects of their actions. As social actors, persuasive technologies can persuade by giving social cues that trigger responses (Dolhalit, 2016). Examining these strategies for developing successful persuasive systems is crucial for designers and developers to ensure that users will be persuaded. Figure 4 shows a table of results from a review of the strategic ways persuasive technology can be used, as reported by 25 different researchers.

Figure 4. Results of a review of 25 studies, used to determine the most effective persuasive system strategies.

The table in Figure 4 makes clear which strategies are used most often. Of all the strategies reviewed, "attractiveness" (96%) is the most widely used strategy in persuasive technology. The principle of attractiveness states that "a more attractive technology (interface or hardware) will have greater persuasive power than the unattractive" (Dolhalit, 2016). This correlates with the "liking" persuasive feature seen in the table in Figure 3, and it suggests that when something is more attractive or appealing, users tend to engage with it more (Kulyk). Overall, technology as a social actor appears to see the most usage in persuasive technology.
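To illustrate how a strategy ranking like the one in Figure 4 could be computed from such a review, consider the small sketch below. The sample data are invented placeholders rather than the actual counts from the review, except that the attractiveness share is set to 24 of 25 studies to match the reported 96%.

    from collections import Counter

    # Hypothetical mapping: study id -> persuasive strategies it employed.
    # Only 'attractiveness' reflects the reported 96% (24 of 25 studies);
    # the other counts are placeholders for illustration.
    studies = {i: ["attractiveness"] for i in range(1, 25)}
    studies[25] = ["similarity"]
    for i in (1, 2, 3):
        studies[i].append("praise")

    def strategy_usage(studies: dict[int, list[str]]) -> list[tuple[str, float]]:
        """Rank strategies by the share of studies that used them."""
        counts = Counter(s for used in studies.values() for s in set(used))
        total = len(studies)
        return sorted(((s, 100 * c / total) for s, c in counts.items()),
                      key=lambda pair: pair[1], reverse=True)

    if __name__ == "__main__":
        for strategy, pct in strategy_usage(studies):
            print(f"{strategy}: {pct:.0f}%")  # attractiveness: 96%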

Furthermore, when examining these different design strategies for the development of persuasive technology, it is important to keep in mind the ethics behind each strategy. Exploring each method thoroughly before design is another means of establishing intent and assessing ethics. Some methods or strategies are more clearly unethical than others: whether employed by a human being or a computer system, some methods of changing attitudes or behavior, such as deception and coercion, are always unethical ("The Ethics of Persuasive Technology"). This relates back to the persuasive features outlined in Figure 3, a majority of which recognized system credibility as a valuable category in user persuasion. Users generally feel more comfortable knowing the origin of the information in a technology and having it verified (Kulyk). Deception and coercion stand in direct contrast to these features and can hinder results through intentional bias.

Conclusion

The emerging use of persuasive technology offers an opportunity to assess how a focus on persuasion and persuasive design can be effective in bringing about behavioral change. Although the effects of persuasive technology can be largely beneficial to users, the main focus of these systems lies not with their outcomes but with their development. Designing persuasive technology can be far more difficult than coming up with a new product idea, because the strategies used in design can be the determining factor in its success. Incorporating effective persuasive techniques while maintaining ethical practice can be hard to balance.

Ultimately, persuasive technology is deeply user focused and user oriented. It is true that design and ethics contribute significantly to the success of persuasive systems, but true success lies in the reaction of users. Designers must evaluate persuasion strategies to ensure that target users will be persuaded and can make use of the benefits of the technology. If users are unable to engage with the technology, they cannot be persuaded and fail to receive the advantages of persuasive technology.

In order to ensure the success of persuasive technology, it is important for both technology users and designers to become educated about the ethical issues of technology. Designers will then be in a better position to create and actively sell their products, and technology users will be better able to recognize when a technology employs unethical or ethically questionable methods of persuasion.


References

Berdichevsky, D., & Neuenschwander, E. (1999). Toward an ethics of persuasive technology. Communications of the ACM, 42(5), 51–58. doi: 10.1145/301353.301410

DeMarzo, P. M., Vayanos, D., & Zwiebel, J. (2003). Persuasion bias, social influence, and unidimensional opinions. The Quarterly Journal of Economics, 118(3), 909–968. doi: 10.1162/00335530360698469

Dolhalit, M. L. B., Salam, S. N. B. A., & Mutalib, A. B. A. (2016). A review on persuasive technology (PT) strategy in awareness study. Indian Journal of Science and Technology, 9(34). doi: 10.17485/ijst/2016/v9i34/100826

Fogg, B. J. (2009, April 1). Creating persuasive technologies: An eight-step design process. Retrieved from https://dl.acm.org/doi/10.1145/1541948.1542005

Kulyk, O., Daas, C. den, David, S., & Gemert-Pijnen, L. van. (n.d.). How persuasive are serious games, social media and mHealth technologies for vulnerable young adults?

Ludden, G. D. S., & Offringa, M. (n.d.). Trigger in the environment.

MacPherson, W. (2018). Revival: The psychology of persuasion (1920). London: Routledge.

Pannucci, C. J., & Wilkins, E. G. (2010). Identifying and avoiding bias in research. Plastic and Reconstructive Surgery, 126(2), 619–625. doi: 10.1097/prs.0b013e3181de24bc

Simons, H. W. (2011). Persuasion in society.

The ethics of persuasive technology. (n.d.). Retrieved from https://flylib.com/books/en/2.438.1/the_ethics_of_persuasive_technology.html#fastmenu_12

User Behavior and Technology Development. (2006). doi: 10.1007/978-1-4020-5196-8

Verbeek, P.-P. (2006). Acting artifacts. In User behavior and technology development (pp. 53–60). doi: 10.1007/978-1-4020-5196-8_6

Verbeek, P.-P. (2006). Materializing morality: Design ethics and technological mediation. Science, Technology, & Human Values, 31(3), 361–380. doi: 10.1177/0162243905285847

Wiafe, I. (n.d.). The role of U-FADE in selecting persuasive system features. In Encyclopedia of information science and technology (4th ed., pp. 7785–7795). doi: 10.4018/978-1-5225-2255-3.ch677


Yan, Z., Gaspar, R., & Zhu, T. (2019). Editorial: Emerging technologies, human behavior, and human behavior and emerging technologies. Human Behavior and Emerging Technologies, 1(1). doi: 10.1002/hbe2.111
