"Discrimination, Artificial Intelligence and Algorithmic Decision-Making"
Total pages: 16
File type: PDF, size: 1,020 KB
Recommended publications
Addressing Cognitive Biases in Augmented Business Decision Systems: Human Performance Metrics for Generic AI-Assisted Decision Making
Thomas Baudel (IBM France Lab, Orsay, France); Manon Verbockhaven (IBM France Lab & ENSAE); Victoire Cousergue (IBM France Lab, Université Paris-Dauphine & Mines ParisTech); Guillaume Roy (IBM France Lab & ENSAI); Rida Laarach (IBM France Lab, Telecom ParisTech & HEC)

How do algorithmic decision aids introduced in business decision processes affect task performance? In a first experiment, we study effective collaboration. Faced with a decision, subjects alone have a success rate of 72%; aided by a recommender that has a 75% success rate, their success rate reaches 76%. The human-system collaboration thus had a greater success rate than either taken alone. However, we noted a complacency/authority bias that degraded the quality of decisions by 5% when the recommender was wrong. This suggests that any lingering algorithmic bias may be amplified by decision aids. In a second experiment, we evaluated the effectiveness of 5 presentation variants in reducing complacency bias. We found that optional presentation increases subjects' resistance to wrong recommendations. We conclude by arguing that our metrics, in real usage scenarios where decision aids are embedded as system-wide features in Business Process Management software, can lead to enhanced benefits.

CCS Concepts: Cross-computing tools and techniques (empirical studies); Information systems (enterprise information systems, decision support systems, Business Process Management); Human-centered computing (human-computer interaction (HCI), visualization); Machine learning; Automation

Additional Keywords and Phrases: Business decision systems, decision theory, cognitive biases
1 INTRODUCTION

For the past 20 years, Business Process Management (BPM) [29] and related technologies such as Business Rules [10, 52] and Robotic Process Automation [36] have streamlined processes and operational decision-making in large enterprises, transforming work organization.
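The collaboration and complacency figures reported in the abstract can be made concrete with a small sketch. This is not the authors' code; the per-trial encoding and metric names are assumptions made for illustration.

```python
# Illustrative sketch of the paper's metrics (our own encoding, not the
# authors' code). Each trial records whether the unaided human, the
# recommender, and the aided human answered correctly.

def collaboration_metrics(trials):
    """trials: list of (human_alone_ok, recommender_ok, aided_ok) booleans."""
    n = len(trials)
    rates = {
        "human_alone": sum(t[0] for t in trials) / n,
        "recommender": sum(t[1] for t in trials) / n,
        "aided": sum(t[2] for t in trials) / n,
    }
    # Complacency/authority bias: how much aided accuracy drops below
    # unaided accuracy on the subset of trials where the recommender errs.
    wrong = [t for t in trials if not t[1]]
    if wrong:
        rates["complacency_penalty"] = (
            sum(t[0] for t in wrong) - sum(t[2] for t in wrong)
        ) / len(wrong)
    return rates

# Toy example: 4 trials, recommender wrong on two of them.
m = collaboration_metrics([
    (True, True, True),
    (True, False, False),   # subject follows a wrong recommendation
    (False, True, True),
    (True, False, True),
])
print(m)
```

In the paper's terms, a positive `complacency_penalty` corresponds to the 5% degradation observed when the recommender was wrong.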
A Task-Based Taxonomy of Cognitive Biases for Information Visualization
Evanthia Dimara, Steven Franconeri, Catherine Plaisant, Anastasia Bezerianos, and Pierre Dragicevic

[Slide-deck excerpt] Three kinds of limitations constrain visualization: the computer, the display, and the human. On the human side, both vision and reasoning have limitations. Perceptual biases include errors in magnitude estimation and color perception. Cognitive biases are behaviors in which humans consistently act irrationally; Pohl's criteria, distilled: they are predictable and consistent, people are unaware they are exhibiting them, and they are not misunderstandings.

Partial alphabetical list of biases from the deck: Ambiguity effect, Anchoring or focalism, Anthropocentric thinking, Anthropomorphism or personification, Attentional bias, Attribute substitution, Automation bias, Availability heuristic, Availability cascade, Backfire effect, Bandwagon effect, Base rate fallacy or Base rate neglect, Belief bias, Ben Franklin effect, Berkson's paradox, Bias blind spot, Choice-supportive bias, Clustering illusion, Compassion fade, Confirmation bias, Congruence bias, Conjunction fallacy, Conservatism (belief revision), Continued influence effect, Contrast effect, Courtesy bias, Curse of knowledge, Declinism, Decoy effect, Default effect, Denomination effect, Disposition effect, Distinction bias, Dread aversion, Dunning–Kruger effect, Duration neglect, Empathy gap, End-of-history illusion, Endowment effect, Exaggerated expectation, Experimenter's or expectation bias, …
Bias and Fairness in NLP
Margaret Mitchell (Google Brain), Kai-Wei Chang (UCLA), Vicente Ordóñez Román (University of Virginia), Vinodkumar Prabhakaran (Google Brain)

Tutorial outline:
● Part 1: Cognitive biases / data biases / bias laundering
● Part 2: Bias in NLP and mitigation approaches
● Part 3: Building fair and robust representations for vision and language
● Part 4: Conclusion and discussion

"Bias Laundering": Cognitive Biases, Data Biases, and ML — presented by Vinodkumar Prabhakaran and Margaret Mitchell (Google Brain), with contributions from Andrew Zaldivar, Emily Denton, Simone Wu, Parker Barnes, Lucy Vasserman, Ben Hutchinson, Elena Spitzer, Deb Raji, Timnit Gebru, Adrian Benton, Brian Zhang, Dirk Hovy, Josh Lovejoy, Alex Beutel, Blake Lemoine, Hee Jung Ryu, Hartwig Adam, and Blaise Agüera y Arcas.

What's in this tutorial:
● Motivation for fairness research in NLP
● How and why NLP models may be unfair
● Various types of NLP fairness issues and mitigation approaches
● What can/should we do?

What's NOT in this tutorial:
● Definitive answers to fairness/ethical questions
● Prescriptive solutions to fix ML/NLP (un)fairness

"What do you see?" (progressive slide build, collapsed): bananas; stickers; Dole bananas; bananas at a store; bananas on shelves; bunches of bananas; …
Working Memory, Cognitive Miserliness and Logic As Predictors of Performance on the Cognitive Reflection Test
Edward J. N. Stupple, Maggie Gale, Christopher R. Richmond — Centre for Psychological Research, University of Derby, Kedleston Road, Derby DE22 1GB

Abstract: The Cognitive Reflection Test (CRT) was devised to measure the inhibition of heuristic responses in favour of analytic ones. Toplak, West and Stanovich (2011) demonstrated that the CRT was a powerful predictor of heuristics-and-biases task performance, proposing it as a metric of the cognitive miserliness central to dual-process theories of thinking. This thesis was examined using reasoning response times, normative responses from two reasoning tasks, and working memory capacity (WMC) to predict individual differences in performance on the CRT. These data offered limited support for the view of miserliness as the primary factor in the CRT.

Most participants respond that the answer is 10 cents; however, a slower and more analytic approach to the problem reveals the correct answer to be 5 cents. The CRT has been a spectacular success, attracting more than 100 citations in 2012 alone (Scopus). This may be in part due to its ease of administration: with only three items and no requirement for expensive equipment, the practical advantages are considerable. There have, moreover, been numerous correlates of the CRT demonstrated, from a wide range of tasks in the heuristics and biases literature (Toplak et al., 2011) to risk aversion and SAT scores (Frederick, …)
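The "10 cents" response discussed in this excerpt refers to the CRT's well-known bat-and-ball item: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. A quick arithmetic check (our own illustration, not from the paper) shows why the intuitive answer fails:

```python
# Bat-and-ball problem, worked in cents to avoid floating-point noise.
TOTAL = 110        # bat + ball = $1.10
DIFFERENCE = 100   # bat costs $1.00 more than the ball

def constraints_hold(ball_cents):
    """Check whether a candidate ball price satisfies both constraints."""
    bat = ball_cents + DIFFERENCE
    return bat + ball_cents == TOTAL

print(constraints_hold(10))  # intuitive heuristic answer -> False
print(constraints_hold(5))   # slower, analytic answer -> True
```

With a 10-cent ball the bat would cost $1.10 and the pair $1.20, violating the total; only 5 cents satisfies both constraints.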
GDPR Is Here: What to Expect Now?
Brett Lockwood, Smith, Gambrell & Russell, LLP — June 19, 2018

Agenda: principal obligations under GDPR; related developments and what's ahead; E.U./U.S. compliance comparison and best practices.

GDPR Obligations: Summary
• The E.U. General Data Protection Regulation (GDPR) was effective May 25, 2018.
• Replaces the 1995 E.U. Privacy Directive, with many new provisions: enhanced personal rights, affirmative consent, breach notice and DPO requirements.
• Very broad and process oriented. Essentially: if you process personal data of a person in the E.U., or processing is in the E.U., or you are a controller or processor in the E.U., then GDPR applies.
• Penalties: the greater of €20 million or up to 4% of worldwide revenue.

GDPR Obligations: Key Definitions (Art. 4)
• Data Subject: an identified or identifiable natural person
• Personal Data: any information relating to a Data Subject
• Processing: any operation which is performed on personal data
• Controller: one who determines the purposes and means for the processing of personal data
• Processor: one who processes personal data on behalf of a controller

GDPR Obligations: Major Requirements
• Governing principles for processing personal data (Arts. 5, 24 & 25):
  ► Processing must be lawful, fair and transparent
  ► Processed for a specified, explicit and legitimate purpose
  ► Adequate, relevant and limited to the processing purpose
  ► Must be accurate and kept updated without delay
  ► Maintained only as long as necessary for the processing purpose
  ► Must ensure appropriate security
• Affirmative consent from the data subject (or guardian), or another lawful basis (e.g., legitimate interest or contract fulfillment), is needed to process data (Arts. …)
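The "greater of" penalty ceiling in the slides above is simple enough to state as code. A minimal sketch, assuming the upper-tier fine described in the deck (the function name is ours, not from the presentation):

```python
# Upper-tier GDPR fine ceiling as described in the slides: the greater of
# EUR 20 million or 4% of total worldwide annual revenue.
def gdpr_fine_ceiling(worldwide_revenue_eur: float) -> float:
    return max(20_000_000.0, 0.04 * worldwide_revenue_eur)

print(gdpr_fine_ceiling(100_000_000))    # small firm: the EUR 20M floor applies
print(gdpr_fine_ceiling(5_000_000_000))  # large firm: 4% of revenue applies
```

For a firm with €100 million in revenue, 4% is only €4 million, so the €20 million floor binds; at €5 billion, the 4% branch (€200 million) dominates.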
Table of Contents
Foreword – v
Preface – vii
Table of Contents – xi
Table of Cases – xxi
Table of Legislation – lvii

Chapter 1: Introduction
Introduction – 1
Core Principles that apply to Internet Law – 5
  Importance of the internet – 5
  User-generated content – 9
  Internet v traditional modes of communication – 11
  Liability of intermediaries – 22
  The E-Commerce Directive – 28
  Privacy and anonymity – …
Surveillance by Intelligence Services: Fundamental Rights Safeguards and Remedies in the EU – Volume II: Field Perspectives and Legal Update
FRA (European Union Agency for Fundamental Rights), "Freedoms" series. This report addresses matters related to the respect for private and family life (Article 7), the protection of personal data (Article 8) and the right to an effective remedy and a fair trial (Article 47), falling under Titles II 'Freedoms' and VI 'Justice' of the Charter of Fundamental Rights of the European Union.

Luxembourg: Publications Office of the European Union, 2017. Print: ISBN 978-92-9491-766-9, doi:10.2811/15232, TK-04-17-696-EN-C. Web: ISBN 978-92-9491-765-2, doi:10.2811/792946, TK-04-17-696-EN-N. © European Union Agency for Fundamental Rights, 2017. Reproduction is authorised provided the source is acknowledged.
Annual Report 2017
EDRi – European Digital Rights
Annual Report 2017: January 2017 – December 2017
Defending rights and freedoms online
Civil Society Organizations and General Data Protection Regulation Compliance
Challenges, Opportunities, and Best Practices

About the Organizations

The Open Society Foundations, founded by George Soros, are the world's largest private funder of independent groups working for justice, democratic governance, and human rights. Open Society funded this report hoping to make a contribution towards stronger data governance in civil society. Data Protection Support & Management is a niche data protection consultancy specializing in assisting humanitarian organizations, nonprofits, research institutes, and organizations dealing with vulnerable data subjects and complex processing environments. These efforts aim to help these groups and organizations innovate responsibly and comply with their data protection obligations.

© 2020 Open Society Foundations. This publication is available as a PDF on the Open Society Foundations website under a Creative Commons license that allows copying and distributing the publication, only in its entirety, as long as it is attributed to the Open Society Foundations and used for noncommercial educational or public policy purposes. Photographs may not be used separately from the publication.

About the Authors

Vera Franz is deputy director of the Open Society Information Program, where she oversees the global portfolios responding to threats to open society created by information technology. For the past decade, Franz has worked to develop and implement strategies to confront corporate data exploitation and government surveillance. Lucy Hannah is a data protection consultant with Data Protection Support & Management, a certified data protection officer, and an Australian-qualified lawyer working across Europe and the United Kingdom. Hannah has extensive experience in privacy, data protection, and regulatory compliance.
The Case for Legislating Toward a Privacy Right in India
PRESERVING CONSTITUTIVE VALUES IN THE MODERN PANOPTICON: THE CASE FOR LEGISLATING TOWARD A PRIVACY RIGHT IN INDIA

Ujwala Uppaluri & Varsha Shivanagowda*

As on date, the only meaningful, if arguably broad, affirmation of a right to privacy has been in the context of the Supreme Court's treatment of Art. 21 of the Constitution, which embodies the guarantee of a right to life and personal liberty. No substantial legislative measures granting and detailing a broad and general right of privacy presently exist in the Indian context, although some measures are scattered across context-specific legislation. Recent events have brought to light the need to operationalise these judicial observations through a legislative statement of the right, fleshing out the field within which the sanctity of the private domain will be recognised and upheld. This paper seeks to explore the contours of the notion of a general right to privacy. It confronts the critiques of such a right and discusses the predominant working models in other major jurisdictions. In the result, it asserts the need for an umbrella legislation addressing the varied areas in which the right of the individual to privacy, against governmental incursion into private spaces as well as against other forms of intrusion by the media and other citizens, must accrue.

I. INTRODUCTION

Recent concerns with privacy and autonomy issues in India have arisen with regard to the State's role in collecting and aggregating private or personal information in the context of the work of the Unique Identification Authority of India (UIDAI)1 and the National Intelligence Grid (NATGRID).2

* 3rd and 2nd year students respectively, the W.B. …
The Eprivacy Regulation (Epr) and Its Impact on Banks
www.pwc.ch/

Is ePrivacy defining the future standard of data protection for the banking industry? The ePrivacy Regulation (ePR) and its impact on banks

"The question of the right to privacy must be one of the defining issues of our time." – S. Shetty

Table of Contents
The ePrivacy Regulation (ePR) and the impact on banks – 3
  The ePrivacy Regulation in a nutshell – 3
  Legal background – 3
  Key requirements of the ePrivacy Regulation – 4
  The ePrivacy Regulation and the GDPR – 5
How does the ePrivacy Regulation affect you? – 7
  Key challenges of ePR for banks – 7
What is our suggested approach – 8
  Prioritisation is key – 8
How can PwC help? – 9
Contacts – 9

The European Commission is finalising the ePrivacy Regulation (ePR), which may become effective, together with the EU GDPR, from May 2018. The ePR, which protects the right to respect for private life and communications, is one of the key pillars of the EU's Digital Single Market Strategy. This new regulation is designed to be 'future-proof': all existing and future communication technologies are and will be subject to it.

The cornerstones of the proposed rules on Privacy and Electronic Communications are:
• All electronic communications must be confidential. Listening to, tapping, intercepting, scanning and storing of, for example, text messages, e-mails or voice calls will not be allowed without the consent of the user. The newly introduced principle of confidentiality of electronic communications will …
All or Nothing: This is the Question? The Application of Article 3(2) Data Protection Directive 95/46/EC to the Internet
The John Marshall Journal of Information Technology & Privacy Law, Volume 25, Issue 2 (Spring 2008), Article 2.

Rebecca Wong & Joseph Savirimuthu, All or Nothing: This is the Question? The Application of Article 3(2) Data Protection Directive 95/46/EC to the Internet, 25 J. Marshall J. Computer & Info. L. 241 (2008). Available at: https://repository.law.uic.edu/jitpl/vol25/iss2/2

I. INTRODUCTION

The exponential growth of social networking Web sites, online personal journals and the use of multimedia by individuals raises important questions about the compatibility of Article 3(2) of the Data Protection Directive 95/46/EC ("DPD") as applied to the internet.