Understanding Algorithmic Decision-Making: Opportunities and Challenges


STUDY
Panel for the Future of Science and Technology
EPRS | European Parliamentary Research Service, Scientific Foresight Unit (STOA)
PE 624.261 – March 2019

Abstract

While algorithms are hardly a recent invention, they are increasingly involved in systems used to support decision-making. These systems, known as algorithmic decision systems (ADS), often rely on the analysis of large amounts of personal data to infer correlations or, more generally, to derive information deemed useful for making decisions. The degree of human intervention in the decision-making may vary; humans may even be completely out of the loop in entirely automated systems. In many situations, the impact of the decision on people can be significant: access to credit, employment, medical treatment or judicial sentences, among other things. Entrusting ADS to make or influence such decisions raises a variety of ethical, political, legal and technical issues, and great care must be taken to analyse and address them correctly. If these issues are neglected, the expected benefits of ADS may be negated by a variety of risks for individuals (discrimination, unfair practices, loss of autonomy, etc.), the economy (unfair practices, limited access to markets, etc.) and society as a whole (manipulation, threats to democracy, etc.). This study reviews the opportunities and risks related to the use of ADS. It presents policy options to reduce the risks, explains their limitations, and sketches some ways to overcome these limitations in order to benefit from the tremendous possibilities of ADS while limiting the risks related to their use.
Beyond providing an up-to-date and systematic review of the situation, the study gives precise definitions of a number of key terms and an analysis of their differences to help clarify the debate. The main focus of the study is the technical aspects of ADS; however, to broaden the discussion, legal, ethical and social dimensions are also considered.

AUTHORS
This study has been written by Claude Castelluccia and Daniel Le Métayer (Institut national de recherche en informatique et en automatique – Inria) at the request of the Panel for the Future of Science and Technology (STOA) and managed by the Scientific Foresight Unit within the Directorate-General for Parliamentary Research Services (DG EPRS) of the Secretariat of the European Parliament.

ADMINISTRATOR RESPONSIBLE
Mihalis Kritikos, Scientific Foresight Unit (STOA). To contact the publisher, please e-mail [email protected]

ACKNOWLEDGMENTS
The authors would like to thank all who have helped in any way with this study, in particular Irene Maxwell and Clément Hénin for their careful reading of, and useful comments on, an earlier draft of this report.

LINGUISTIC VERSION
Original: EN. Manuscript completed in March 2019.

DISCLAIMER AND COPYRIGHT
This document is prepared for, and addressed to, the Members and staff of the European Parliament as background material to assist them in their parliamentary work. The content of the document is the sole responsibility of its authors, and any opinions expressed herein should not be taken to represent an official position of the Parliament. Reproduction and translation for non-commercial purposes are authorised, provided the source is acknowledged and the European Parliament is given prior notice and sent a copy. Brussels © European Union, 2019.
PE 624.261
ISBN: 978-92-846-3506-1
doi: 10.2861/536131
QA-06-18-337-EN-N
http://www.europarl.europa.eu/stoa (STOA website)
http://www.eprs.ep.parl.union.eu (intranet)
http://www.europarl.europa.eu/thinktank (internet)
http://epthinktank.eu (blog)

Executive Summary

Scope of the study: While algorithms are hardly a recent invention, they are increasingly involved in systems used to support decision-making. Algorithmic decision systems (ADS) often rely on the analysis of large amounts of personal data to infer correlations or, more generally, to derive information deemed useful for making decisions. The degree of human intervention may vary; humans may even be completely out of the loop in entirely automated systems. In many situations, the impact of the decision on people can be significant, such as access to credit, employment, medical treatment or judicial sentences, among other things. Entrusting ADS to make or influence such decisions raises a variety of ethical, political, legal and technical issues, and great care must be taken to analyse and address them correctly. If these issues are neglected, the expected benefits of ADS may be negated by a variety of risks for individuals (discrimination, unfair practices, loss of autonomy, etc.), the economy (unfair practices, limited access to markets, etc.) and society as a whole (manipulation, threats to democracy, etc.). This study reviews the opportunities and risks related to the use of ADS, presents existing options to reduce these risks and explains their limitations, and sketches some ways to benefit from the tremendous possibilities of ADS while limiting the risks related to their use. Beyond providing an up-to-date and systematic review of the situation, the study gives precise definitions of a number of key terms and an analysis of their differences to help clarify the debate.
The main focus of the study is the technical aspects of ADS; however, to broaden the discussion, legal, ethical and social dimensions are also considered.

ADS opportunities and risks: The study discusses the benefits and risks related to the use of ADS for three categories of stakeholders: individuals, the private sector and the public sector. Risks may be intentional (e.g. to optimise the interests of the operator of the ADS), accidental (side-effects of the purpose of the ADS, with no such intent from the designer), or the consequence of ADS errors or inaccuracies (e.g. people wrongly included in blacklists or 'no fly' lists because of homonyms or inaccurate inferences).

Opportunities and risks of ADS for individuals: ADS may undermine the fundamental principles of equality, privacy, dignity, autonomy and free will, and may also pose risks related to health, quality of life and physical integrity. That ADS can lead to discrimination has been extensively documented in many areas, such as the judicial system, credit scoring, targeted advertising and employment. Discrimination may result from different types of bias arising from the training data, from technical constraints, or from societal or individual biases. However, the risk of discrimination related to the use of ADS should be compared with the risk of discrimination without the use of ADS: humans have their own sources of bias that can affect their decisions and, in some cases, these could be detected or avoided using ADS. The deployment of ADS may also threaten privacy and data protection in many different ways. The first is the massive collection of personal data required to train the algorithms. Even when no external attack has been carried out, the mere suspicion that one's personal data is being collected and possibly analysed can have a detrimental impact on people; for example, several studies have provided evidence of the chilling effect resulting from fear of online surveillance.
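The discrimination risks described above are often assessed in practice with simple statistical fairness checks. As an illustrative sketch (not drawn from the study; groups and decisions are hypothetical), the following Python snippet computes the disparate-impact ratio of a binary decision system across two groups; values below 0.8 are commonly flagged under the so-called 'four-fifths rule':

```python
# Illustrative sketch: measuring disparate impact of a binary
# decision system across two hypothetical groups.

def selection_rate(decisions):
    """Fraction of positive (favourable) decisions in a group."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of selection rates between two groups; values below 0.8
    are commonly flagged under the 'four-fifths rule'."""
    return selection_rate(group_a) / selection_rate(group_b)

# Hypothetical decisions (1 = granted, 0 = denied) for two groups.
group_a = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]  # 20% selection rate
group_b = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]  # 50% selection rate

print(disparate_impact_ratio(group_a, group_b))  # 0.4 -> flagged
```

Such aggregate checks only detect one narrow notion of unfairness; the study's broader point is that bias can also enter through the training data and technical constraints, which a single ratio cannot reveal.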
Altogether, large-scale surveillance and scoring could narrow the range of possibilities and choices available to individuals, affecting their capacity for self-development and fulfilment. Scoring also raises the fear that humans are increasingly treated as numbers and reduced to their digital profiles; reducing the complexity of a human personality to a number can be seen as a form of alienation and an offence to human dignity.

The opacity, or lack of transparency, of ADS is another primary source of risk for individuals. It opens the door to all kinds of manipulation and makes it difficult to challenge a decision based on the result of an ADS. This contradicts the rights of the defence and the principle of adversarial proceedings, with the right to examine all evidence and observations, recognised in most legal systems. The use of ADS in court also raises far-reaching questions about reliance on predictive scores to make legal decisions, in particular for sentencing.

Opportunities and risks for the public sector: ADS are currently used by states and public agencies to provide new services or improve existing ones in areas such as energy, education, healthcare, transportation, the judicial system and security. Examples of applications of ADS in this context are predictive policing, smart metering, video protection and school enrolment. ADS can also contribute to improving the quality of healthcare, education and job-skill training, and are increasingly used in cyber-defence to protect infrastructures and to support soldiers on the battlefield. ADS, and smart technologies in general, such as mobility management tools or water and energy management systems, can improve the efficiency of city management. They can also help make administrative decisions more efficient, transparent and accountable, provided, however, that they are themselves transparent and accountable.
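One technical response to the opacity problem discussed above is to favour, or accompany, ADS with models whose decisions can be decomposed feature by feature, so that a decision can be challenged on concrete grounds. The snippet below is a minimal illustration (all weights, features and the threshold are hypothetical, not taken from the study) of a transparent linear scoring model with a per-feature explanation:

```python
# Minimal illustration of a transparent (linear) scoring model.
# Weights, features and threshold are hypothetical, for exposition only.
WEIGHTS = {"income": 0.5, "debt": -0.8, "seniority": 0.3}
THRESHOLD = 1.0  # scores above this threshold yield a favourable decision

def score(applicant):
    """Overall score: a weighted sum of the applicant's features."""
    return sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)

def explain(applicant):
    """Per-feature contributions to the score, so the decision
    can be contested on specific grounds."""
    return {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}

applicant = {"income": 3.0, "debt": 1.5, "seniority": 2.0}
print(score(applicant))              # 0.5*3.0 - 0.8*1.5 + 0.3*2.0, i.e. about 0.9
print(score(applicant) > THRESHOLD)  # rejected: score below threshold
print(explain(applicant))            # shows the debt term (-1.2) drove the rejection
```

By contrast, a black-box model offers no such decomposition, which is precisely what makes decisions based on its output hard to contest in adversarial proceedings.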
ADS also create new 'security vulnerabilities' that can be exploited by people with […]