A Broken Utopia: Big Data Bias and the Need for a New Ethics of Faultless Responsibility


In 1970, Salvador Allende, a Marxist, was elected President of Chile on the promise of implementing la vía chilena al socialismo – the Chilean Path to Socialism. Allende wanted to precisely coordinate Chile's economy to maximise fairness: "our objective," he said, "is total, scientific, Marxist socialism."1 Soon after Allende's election the government hired English 'cybernetician' Stafford Beer, the world's leading pioneer in the use of computers to organize production. Beer was to apply his methods on a national scale and computerize the whole of Chile's economy. The physical aspect of Project Cybersyn, as it came to be known, was the Operations Room: a hexagonal room fitted with white fibreglass chairs and orange cushions, in which real-time information from Chile's factories would appear on wall-mounted screens, and from where Chilean economists could oversee, coordinate and model the nation's economy.2 Project Cybersyn was "a dispatch from the future"; it was a prototype of today's big data society.3,4

Project Cybersyn was meant to perfect socialism; instead, capitalism has since perfected Project Cybersyn. While Chile had to make do with the technology of the '70s, today faster processors and ubiquitous sensors5 have made it simple for businesses to collect and process huge datasets, with massive gains in efficiency, accuracy, and profits. 'Big data' is the catch-all term used to refer to these "increased capabilities to amass and store data and the analytical models applied to them for yielding knowledge".6 But even as the technology has changed, big data has maintained the air of the utopic. Big data promises efficiency and fairness. This essay does not dispute that there are many, many positive applications of big data – but I hope to show that in other cases, the best intentions can lead to morally repugnant outcomes, even when all humans involved have acted ethically.

1 Régis Debray and Salvador Allende, Conversations with Allende: Socialism in Chile (N.L.B., 1971).
2 Eden Medina, Cybernetic Revolutionaries: Technology and Politics in Allende's Chile (MIT Press, 2011).
3 Evgeny Morozov, 'The Planning Machine: Project Cybersyn and the Origins of the Big Data Nation', The New Yorker (New York), 13 October, 119.
4 Project Cybersyn was never fully realised. A military coup broke over the head of Chile's government, and in September 1973 Allende died in the Presidential Palace, defending himself with an AK-47 given to him by Fidel Castro. During the coup, a military officer entered the Operations Room, took out a knife, and stabbed the screens. Stafford Beer survived, but was no longer the rich Rolls-Royce-driving industrialist the Allende government had first contacted: Beer spent the final years of his life in a cottage in Toronto, writing poetry and giving private yoga lessons in exchange for incense and flowers ("Money gets in the way of everything," he said). All of this is in Medina's wonderful book, which, as far as I can see, is the only book published on the subject.
5 In July of this year Gizmodo reported that the automated Roomba vacuum has been surreptitiously mapping homes for the past few years: Rhett Jones, 'Roomba's Next Big Step Is Selling Maps Of Your Home To The Highest Bidder', Gizmodo (online), 25 July 2017 <https://www.gizmodo.com.au/2017/07/roombas-next-big-step-is-selling-maps-of-your-home-to-the-highest-bidder/>. If you need further proof that sensors are everywhere, look no further than Snowden's leaks on the NSA.
This essay will focus on situations where big data has accidentally amplified bias and prejudice, as a way of illustrating the need for a new ethics to answer the question of who should be held responsible if big data inadvertently leads to a morally bad outcome.7 This essay argues for the adoption of Luciano Floridi's 2016 concept of faultless responsibility as the compass by which to chart society's use of big data. The complex and distributed nature of big data makes the attribution of responsibility a difficult task: but without the attribution of responsibility, the harms of big data will go unchecked, and the breathy utopianism of big data will continue to be undercut.

1. Toy versions of the world: the basics of big data

Big data is the world translated into numbers. It is big data that measures workplace productivity,8 or ranks universities,9 or tells us that Nabokov's favourite word was "mauve".10 Big data uses numbers to build baseball teams and personalised Spotify playlists, or to tell Facebook which ads we're most likely to click. It's big data that collects our online activity to figure out if we need a new car, or a loan,11 or even to tell if we're pregnant.12 Big data is not any one thing but, rather, the name given when mathematics is heavily involved in guiding human activity.

There are no limits to the potential uses of big data, and, seemingly, no limits to how willing we are to let big data into our lives. Many uses are innocuous (the alert on our phone telling us to take an umbrella), but even when the use is more serious (say, calculating our insurance premiums based on our driving record, or scoring our creditworthiness) we generally still accept the role of big data, because we trust the calculation will be fair.

6 Effy Vayena and John Tasioulas, 'The dynamics of big data and human rights: the case of scientific research' (2016) 374(2083) Philosophical Transactions of the Royal Society A, 2.
7 Because the topic of big data is so vast, I won't be able to cover the other attention-worthy issues that arise from its use. For an introduction to the interaction between big data and privacy see Edith Ramirez, 'The Privacy Challenges of Big Data: A View From The Lifeguard's Chair' (Speech delivered at the Technology Policy Institute Aspen Forum, Aspen, Colorado, 19 August 2013) <https://www.ftc.gov/sites/default/files/documents/public_statements/privacy-challenges-big-data-view-lifeguard%E2%80%99s-chair/130819bigdataaspen.pdf>; and for a brilliant overview of the challenges big data poses to the media and the democratic process, read Katherine Viner, 'How technology disrupted the truth', The Guardian (online), 12 July 2016 <https://www.theguardian.com/media/2016/jul/12/how-technology-disrupted-the-truth>.
8 Joshua Rothman, 'Big Data Comes to the Office', The New Yorker (online), 3 June 2014 <http://www.newyorker.com/books/joshua-rothman/big-data-comes-to-the-office>.
9 Robert Morse, 'The Birth of the College Rankings', U.S. News (online), 16 May 2008 <https://www.usnews.com/news/national/articles/2008/05/16/the-birth-of-college-rankings>.
10 Dan Piepenbring, 'The Heretical Things Statistics Tell Us About Fiction', The New Yorker (online), 27 July 2017 <http://www.newyorker.com/books/page-turner/the-surprising-things-statistics-tell-us-about-fiction>.
Data, after all, is math, and math is objective: two plus two equals four no matter if you're David Duke or Gandhi. Big data implies the possibility of turning human organisation from an art into a science.

But the idea that big data is free from human bias is simply incorrect. The calculations in the examples above are powered by algorithms, and these algorithms are written by humans. Far from treating it as a science, the data scientists who write these algorithms refer to the process as "the 'art' of data mining".13 The process begins by selecting a 'target variable' that the algorithm is trying to calculate. The target variable is rarely simple, which means that often the data scientist will be attempting to express an amorphous, real-world problem – for example, will this person commit a violent crime? – as a maths question. They must select the numbers that correlate with the target variable – for crime: low income, number of past crimes, and so on – and build these 'proxies' into a mathematical model of a violent criminal. When applied to an individual, the model spits out a number telling us how statistically similar that individual is to previous violent offenders, and, therefore, how likely they are to commit violent crime.

These mathematical models are, by definition, simplifications; "no model can include all of the real world's complexity."14 Besides, nobody can give a universal explanation of why people do commit violent crimes: but big data is not interested in causation, only results.15 These 'results' are essentially just an elaborate form of sorting, and when data scientists build what have been called mathematical "toy versions" of the world, they make subjective choices about how best to divide the world into categories.16 Big data algorithms necessarily engage in statistical discrimination: the sorting of people into groups with others who are statistically similar.17 The danger of this statistical discrimination crossing moral boundaries is always present.

11 Emily Steel and Julia Angwin, 'On the Web's Cutting Edge, Anonymity in Name Only', The Wall Street Journal (online), 4 August 2010 <https://www.wsj.com/news/articles/SB10001424052748703294904575385532109190198>.
12 Charles Duhigg, 'How Companies Learn Your Secrets', The New York Times Magazine (online), 16 February 2012 <http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html?_r=2&pagewanted=all>.
13 Solon Barocas and Andrew D. Selbst, 'Big Data's Disparate Impact' (2016) 104 California Law Review 671, 678.
14 Cathy O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (Penguin Random House, 2016), Loc 289.
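The target-variable-and-proxies process described above can be sketched in a few lines of code. The sketch below is purely illustrative, not any real system's method: the proxies (income, past offences), the toy figures, and the similarity measure are all invented here. What it shows is the essay's point that the "score" is just relative closeness to one group of past cases, and that the subjective choices – which proxies to use, which past cases to compare against – are baked in before any individual is ever assessed.

```python
# A toy proxy-based similarity model (hypothetical data and proxies throughout).
# Each past case is a list of proxy values: [income in $10k, past offences].

def centroid(rows):
    """Mean of each proxy column across a group of past cases."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def similarity_score(person, offender_centroid, non_offender_centroid):
    """Score in (0, 1): closer to 1 means 'statistically more like past offenders'."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    d_off = dist(person, offender_centroid)
    d_non = dist(person, non_offender_centroid)
    return d_non / (d_off + d_non)  # relative closeness to the offender group

# Hypothetical training data chosen by the data scientist.
past_offenders = [[2, 3], [1, 4], [3, 2]]
past_non_offenders = [[6, 0], [8, 1], [7, 0]]

off_c = centroid(past_offenders)       # [2.0, 3.0]
non_c = centroid(past_non_offenders)   # [7.0, 0.33...]

# A low-income person with one past offence is sorted toward the offender
# group, with no account of why any of these people actually offended.
score = similarity_score([2, 1], off_c, non_c)
print(round(score, 2))
```

Note that the model never asks a causal question: swapping in different proxies, or a different set of past cases, would silently change who gets a high score, which is exactly the subjective "toy version" of the world the essay describes.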