Persuasive Design Techniques in the Attention Economy
Recommended publications
Reconnect Discussion Guide
Reconnect Discussion Guide: Spiritual Restoration in Beloved Community. 4 Sessions. Ed Cyzewski.

CONTENTS
Ideas for Using This Discussion Guide
Session 1: Do We Know Paradise Is All around Us? (Introduction)
Session 2: How Digital Technology Shapes Us (1. The Goal of Digital Formation; 2. How Digital Formation Changes Us; 3. How Digital Formation Hinders Spirituality)
Session 3: How to Make Space for Spiritual Restoration (4. Where Two or Three Are Texting in My Name; 5. The Goal of Spiritual Formation; 6. Detoxing from Distractions)
Session 4: Practices for Spiritual Restoration (7. Get a Habit ... Like Monks and Nuns; 8. Reconnect with the Good and Beautiful)

IDEAS FOR USING THIS DISCUSSION GUIDE: The Reconnect Discussion Guide has been designed with groups and individuals in mind who want to spend some time further discussing or pondering the ideas in Reconnect. This discussion guide is broken up into four separate sessions for group discussion over a four-week period; just substitute the word "session" with "week," and you'll be ready to go. The chapters from Reconnect: Spiritual Restoration from Digital Distraction show up in the same order as the book. Each session in the study guide begins with a simple introduction to the ideas discussed in the corresponding chapters of Reconnect. For a single-session study, consider selecting one quote and two discussion questions from each session. I have pulled many of the longer quotes from the book and added more context from the original source when helpful. Finally, I'll wrap up with a few questions for discussion and next steps to consider putting into practice.
Technology and Behaviour Change, for Good and Evil
Technology and Behaviour Change, for Good and Evil. Adam N. Joinson & Lukasz Piwek, Behavioural Research Lab, Bristol Social Marketing Centre, Bristol Business School, University of the West of England (UWE). DRAFT: WORKING PAPER FOR SUBMISSION TO SOCIAL TRENDS INSTITUTE MEETING 'Communication Technologies and Lifestyles', MAY 2013, Barcelona. Not for distribution beyond STI workshop participants without permission of authors. For the latest version, email: [email protected]

Since the first basic stone and bone tools were used by Plio-Pleistocene hominids over three million years ago (McPherron et al., 2010), the ability of humans to fashion tools has not only extended our capacity to conduct tasks, but may also have had a transformative impact on our own selves. For instance, tool development may have been a key facilitator in the development of a large, energy-hungry brain in humans (Gibbons, 1998). In more recent times, technologies and inventions, ranging from tally marks as a precursor to numbers, the number zero, writing, the printing press, the stirrup, and the computer, have transformed not only individual human abilities through an extension of physical capabilities (McLuhan, 1964), but also society through both the intended and unintended consequences of widespread adoption and use. In the present position paper we summarise the key ways in which technology influences behaviour, and propose ways in which the same technology can be utilized in order to achieve a social good. We then look at two technologies, smart technologies for self-monitoring and social media, and discuss potential ways in which our approach to understanding the use and impact of tools can help shape their application in changing behaviour.
Chapter 4: INFORMAL FALLACIES I
Essential Logic, Ronald C. Pine. Chapter 4: INFORMAL FALLACIES I

"All effective propaganda must be confined to a few bare necessities and then must be expressed in a few stereotyped formulas." (Adolf Hitler)

"Until the habit of thinking is well formed, facing the situation to discover the facts requires an effort. For the mind tends to dislike what is unpleasant and so to sheer off from an adequate notice of that which is especially annoying." (John Dewey, How We Think)

Introduction: In everyday speech you may have heard someone refer to a commonly accepted belief as a fallacy. What is usually meant is that the belief is false, although widely accepted. In logic, a fallacy refers to a logically weak argument appeal (not a belief or statement) that is widely used and successful. Here is our definition: a logical fallacy is an argument that is usually psychologically persuasive but logically weak. By this definition we mean that fallacious arguments work in getting many people to accept conclusions; they make bad arguments appear good even though a little commonsense reflection will reveal that people ought not to accept the conclusions of these arguments as strongly supported. Although logicians distinguish between formal and informal fallacies, our focus in this chapter and the next one will be on traditional informal fallacies.¹ For our purposes, we can think of these fallacies as "informal" because they are most often found in the everyday exchanges of ideas, such as newspaper editorials, letters to the editor, political speeches, advertisements, conversational disagreements between people in social networking sites and Internet discussion boards, and so on.
Logical Fallacies and Distraction Techniques
Sample Activity: Learning Critical Thinking Through Astronomy: Logical Fallacies and Distraction Techniques. Joe Heafner, [email protected]. Version 2017-09-13. STUDENT NOTE: PLEASE DO NOT DISTRIBUTE THIS DOCUMENT.

Contents: Questions; Materials Needed; Points To Remember; 1 Fallacies and Distractions:
1.1 Lying
1.2 Shifting The Burden
1.3 Appeal To Emotion
1.4 Appeal To The Past
1.5 Appeal To Novelty
1.6 Appeal To The People (Appeal To The Masses, Appeal To Popularity)
1.7 Appeal To Logic
1.8 Appeal To Ignorance
1.9 Argument By Repetition
1.10 Attacking The Person
1.11 Confirmation Bias
1.12 Strawman Argument or Changing The Subject
1.13 False Premise
1.14 Hasty Generalization
1.15 Loaded Question
1.16 Feigning Offense
1.17 False Dilemma
1.18 Appeal To Authority
The Psychology of Silicon Valley
The Psychology of Silicon Valley: Ethical Threats and Emotional Unintelligence in the Tech Industry. Katy Cook, Centre for Technology Awareness, London, UK. ISBN 978-3-030-27363-7; ISBN 978-3-030-27364-4 (eBook). https://doi.org/10.1007/978-3-030-27364-4. © The Editor(s) (if applicable) and The Author(s) 2020. This book is an open access publication.

"As someone who has studied the impact of technology since the early 1980s I am appalled at how psychological principles are being used as part of the business model of many tech companies. More and more often I see behaviorism at work in attempting to lure brains to a site or app and to keep them coming back day after day. This book exposes these practices and offers readers a glimpse behind the 'emotional scenes' as tech companies come out psychologically firing at their consumers. Unless these practices are exposed and made public, tech companies will continue to shape our brains and not in a good way." (Larry D. Rosen, Professor Emeritus of Psychology, author of 7 books including The Distracted Mind: Ancient Brains in a High Tech World)

"The Psychology of Silicon Valley is a remarkable story of an industry's shift from idealism to narcissism and even sociopathy. But deep cracks are showing in the Valley's mantra of 'we know better than you.' Katy Cook's engaging read has a message that needs to be heard now." (Richard Freed, author of Wired Child)

"A welcome journey through the mind of the world's most influential industry at a time when understanding Silicon Valley's motivations, myths, and ethics are vitally important." (Scott Galloway, Professor of Marketing, NYU and author of The Algebra of Happiness and The Four)
The Ethics of Persuasion in Technology
The Ethics of Persuasion in Technology. Nathaniel Zinda. February 21, 2019.

Abstract: This essay is concerned with the issue of persuasion in our information and communication technologies. More specifically, it is a rebuke of the common practice in Silicon Valley of designing products with the intention of maximizing user "engagement", in other words, the time that users spend with the product. This article's normative contention is that this practice (i.e., persuasive technology) degrades the user's capacity for freedom and autonomy. The argument is divided into three parts: 1) the causes, 2) the characteristics, and 3) the ethical effects of persuasive technology. Part 1 argues that Silicon Valley's widespread adoption of the ad-based business model has produced a competitive marketplace for users' attention, whereby companies compete to extract as much "time spent" as possible. Part 2 outlines the two primary methods by which they do this, differentiating between persuasive design (the behavioral method) and algorithmic persuasion (the technological method). Part 3 contends that these methods are responsible for producing both functional and existential distraction amongst users, both of which undermine their sense of well-being and threaten their capacity for freedom and autonomy.

Introduction: It is becoming increasingly clear that the information and communication technologies that have revolutionized society are not on our side. This cynicism is well-deserved: from the rampant abuses of user data and privacy, to the abject failure of social networking sites to keep disinformation off their platforms, to the effect that these oligopolies have had on the economics of journalism and other industries, this past decade has been marked by an impressive set of failures, abuses, and consequences that have duly marred the public's optimism and trust.
Social Media Impacts on Social & Political Goods: A Peacebuilding Perspective
Policy Brief #22, October 2018. Social Media Impacts on Social & Political Goods: A Peacebuilding Perspective. Lisa Schirch.

The Toda Peace Institute and the Alliance for Peacebuilding are hosting a series of policy briefs discussing social media impacts on social and political goods. Over the next several months, top experts and thought leaders will provide insight into social media's threats and opportunities. This first briefing provides a conceptual summary, and a set of policy recommendations to address the significant threats to social and political goods. The Full Report provides a more in-depth literature review and explanation of key themes.

ABSTRACT: Social media is both an asset and a threat to social and political goods. This first in a series of policy briefs on social media aims to build the capacity of civil society to understand the economic and psychological appeal of social media, identify the range of opportunities and challenges related to social media, and promote discussion on potential solutions to these challenges. Currently, few in civil society or government understand how social media works. Technology experts and investor-oriented social media platforms cannot address the opportunities and challenges brought by social media capabilities. Together, government, corporations, and civil society need to find ways to respond to the growing crisis of social media ethics and impacts.

EXECUTIVE SUMMARY: All forms of media hold the potential for both solving and amplifying social and political problems. Media is, as its Latin root suggests, "in the middle"; it is a channel for communication between people.
The Ethics of Persuasion: Persuasive Technology and the Power Behind Design
The Ethics of Persuasion: Persuasive Technology and the Power Behind Design. A Research Paper submitted to the Department of Engineering and Society, presented to the Faculty of the School of Engineering and Applied Science, University of Virginia, Charlottesville, Virginia, in partial fulfillment of the requirements for the degree Bachelor of Science, School of Engineering. Danielle Newman, Spring 2020. On my honor as a University Student, I have neither given nor received unauthorized aid on this assignment as defined by the Honor Guidelines for Thesis-Related Assignments. Advisor: Kathryn A. Neeley, Associate Professor of STS, Department of Engineering and Society.

Introduction: Throughout the last decade, there has been a significant increase in technology usage. The development of technology and different technological systems has allowed the general public to access more information, increase task production, and engage in more opportunities. Specifically, technology has opened up new opportunities to develop new information systems for influencing users. This has led to the development of Behavior Change Support Systems (BCSS), "a socio-technical information system with psychological and behavioral outcomes designed to form, alter or reinforce attitudes, and behaviors or an act of complying using coercion or deception" (Kulyk, p. 2), as a key construct for research in persuasive technology. Persuasive technology is a research field that studies how people are persuaded while interacting with computer technology. Although effective in their ability to treat patients, persuasive systems and technology can become biased and cause ethical issues depending on the developer of the system. The introduction of design biases can potentially hinder patient results because people are susceptible to decision bias, which often makes it difficult for them to make self-beneficial choices.
Coercion and Deception in Persuasive Technologies
Coercion and deception in persuasive technologies. Timotheus Kampik, Juan Carlos Nieves, Helena Lindgren. [email protected], [email protected], [email protected] (corresponding author). Department of Computing Science, Umeå University, Sweden.

Abstract: Technologies that shape human behavior are of high societal relevance, both when considering their current impact and their future potential. In information systems research and in behavioral psychology, such technologies are typically referred to as persuasive technologies. Traditional definitions like the ones created by Fogg, and Harjumaa and Oinas-Kukkonen, respectively, limit the scope of persuasive technology to non-coercive, non-deceptive technologies that are explicitly designed for persuasion. In this paper we analyze existing technologies that blur the line between persuasion, deception, and coercion. Based on the insights of the analysis, we lay down an updated definition of persuasive technologies that includes coercive and deceptive forms of persuasion. Our definition also accounts for persuasive functionality that was not designed by the technology developers. We argue that this definition will help highlight ethical and societal challenges related to technologies that shape human behavior and encourage research that solves problems with technology-driven persuasion. Finally, we suggest multidisciplinary research that can help address the challenges our definition implies. The suggestions we provide range from empirical studies to multi-agent system theory.

1 Introduction: In scientific literature, the term persuasive technologies appeared first in 1998 in an abstract (special interest group announcement) by the behavioral psychologist Brian J. Fogg in the proceedings of the CHI Conference on Human Factors in Computing Systems.
Kevin S. Masters: Vita
VITA: Kevin S. Masters

Office: Department of Psychology, University of Colorado Denver, Denver, CO 80217-3364. Telephone: Office: (303) 315-7062; Fax: (303) 556-3520. E-mail: [email protected]

Major Position: Professor; Program Director in Clinical Health Psychology, Department of Psychology, University of Colorado Denver, Denver, CO.

Secondary Position: Director of Behavioral Research, Anschutz Health and Wellness Center, University of Colorado Denver, Denver, CO.

Education:
Ph.D., Clinical Psychology, Brigham Young University, Provo, Utah, 1989. APA accredited program.
Predoctoral Clinical Internship, Duke University Medical Center, Durham, North Carolina, 1988-89. APA accredited internship.
M.A., Clinical Psychology, University of Dayton, Dayton, Ohio, 1982.
B.A. (Summa Cum Laude), Psychology, Cedarville College, Cedarville, Ohio, 1980.

Teaching Experience (Courses Taught):
Physiological Processes and Health Psychology (Graduate course)
Cardiovascular Health Psychology (Graduate course)
Health Psychology (Graduate course)
Readings in Health Psychology (Graduate course)
Ethical and Professional Issues in Clinical Psychology (Graduate course)
Introduction to Psychotherapy (Graduate course)
Existential and Spiritual Approaches to Psychotherapy (Graduate course)
Existential and Spiritual Issues in Health Psychology (Graduate course)
Intellectual Assessment (Graduate course)
Advanced Personality Assessment (Graduate course)
Empirically Validated Treatments (Graduate course)
Practicum in Psychotherapy (Graduate course)
The Role of Reasoning in Constructing a Persuasive Argument
The Role of Reasoning in Constructing a Persuasive Argument
<http://www.orsinger.com/PDFFiles/constructing-a-persuasive-argument.pdf> [The pdf version of this document is web-enabled with linking endnotes]
Richard R. Orsinger, [email protected], http://www.orsinger.com
McCurley, Orsinger, McCurley, Nelson & Downing, L.L.P.
San Antonio Office: 1717 Tower Life Building, San Antonio, Texas 78205, (210) 225-5567, http://www.orsinger.com
Dallas Office: 5950 Sherry Lane, Suite 800, Dallas, Texas 75225, (214) 273-2400, http://www.momnd.com
State Bar of Texas, 37th ANNUAL ADVANCED FAMILY LAW COURSE, August 1-4, 2011, San Antonio, Chapter 11. © 2011 Richard R. Orsinger, All Rights Reserved.

Table of Contents
I. THE IMPORTANCE OF PERSUASION
II. PERSUASION IN ARGUMENTATION
III. BACKGROUND
IV. USER'S GUIDE FOR THIS ARTICLE
V. ARISTOTLE'S THREE COMPONENTS OF A PERSUASIVE SPEECH
  A. ETHOS
  B. PATHOS
  C. LOGOS
    1. Syllogism
    2. Implication
    3. Enthymeme
      (a) Advantages and Disadvantages of Commonplaces
      (b) Selection of Commonplaces
VI. ARGUMENT MODELS (OVERVIEW)
  A. LOGIC-BASED ARGUMENTS
    1. Deductive Logic
    2. Inductive Logic
    3. Reasoning by Analogy
  B. DEFEASIBLE ARGUMENTS
  C. THE TOULMIN ARGUMENTATION MODEL
  D. FALLACIOUS ARGUMENTS
  E. ARGUMENTATION SCHEMES
VII. LOGICAL REASONING (DETAILED ANALYSIS)
  A. DEDUCTIVE REASONING
    1. The Categorical Syllogism
      a. Graphically Depicting the Simple Categorical Syllogism
      b. A Legal Dispute as a Simple Syllogism
      c.
Persuasive Technology: Development and Implementation of Personalized Technologies to Change Attitudes and Behaviours
Peter W. de Vries, Thomas Van Rompay (Eds.). Persuasive Technology: Development and implementation of personalized technologies to change attitudes and behaviours. 12th International Conference, PERSUASIVE 2017, Amsterdam, The Netherlands, April 4-6, 2017. Adjunct Proceedings. Second edition: April 2017.

© Copyright of the complete adjunct proceedings is held by the Centre for eHealth & Wellbeing Research, Department of Psychology, Health and Technology, University of Twente, The Netherlands. Copyright of individual contributions is held by the author(s). Contact: Centre for eHealth & Wellbeing Research, Department of Psychology, Health and Technology, University of Twente, PO Box 217, 7500 AE, Enschede, The Netherlands. https://www.utwente.nl/igs/ehealth/

Preface: Persuasive Technology (PT) is an emerging, interdisciplinary research field focusing on the design, development, and evaluation of technologies aimed at creating awareness and inducing behavior change, with the ultimate goal of increasing wellbeing and quality of life. Academic researchers, designers, and practitioners from the social sciences and technological disciplines, as well as from the fields of health, healthcare, safety, sustainability, and ICT, have developed this field in the preceding years, giving rise to a community which aims to 'persuade' people into adopting healthier lifestyles, behaving more safely, and reducing consumption of renewable resources, to name a few examples.
The ‘technology’ component in PT reflects usage of, amongst others, big data analytics, sensor technology for monitoring, personalized feedback and coaching, mHealth, data visualization techniques, serious gaming, and social media.