Institutional and Technological Design Development Through Use of Case Based Discussion: Regulatory Interventions for Emerging Economies Governing the Use of Artificial Intelligence in Public Functions

Arindrajit Basu, Elonnai Hickok and Amber Sinha

Introduction

Background and Scope

The use of artificial intelligence (AI) driven decision making in public functions has been touted around the world as a means of augmenting human capacities, removing bureaucratic fetters, and benefiting society. Yet, with concerns over bias, fairness, and a lack of algorithmic accountability, it is increasingly being recognized that algorithms have the potential to exacerbate entrenched structural inequality and threaten core constitutional values. While these concerns are applicable to both the private and the public sector, this paper focuses on recommendations for public sector use, as standards of comparative constitutional law dictate that the state must abide by the full scope of fundamental rights articulated both in municipal and international law. For example, as per Article 13 of the Indian Constitution, whenever the government is exercising a "public function", it is bound by the entire range of fundamental rights articulated in Part III of the Constitution. However, the definition and scope of "public function" is yet to be clearly defined in any jurisdiction, and certainly has no uniformity across countries. This poses a unique challenge to the regulation of AI projects in emerging economies. Due to a lack of government capacity to implement these projects in their entirety, many private sector organizations are involved in functions which were traditionally identified in India as public functions, such as policing, education, and banking. The extent of their role in any public sector project poses a set of important regulatory questions: to what extent can the state delegate the implementation of AI in public functions to the private sector, and to what extent and how can both state and private sector actors be held accountable in such cases?

AI-driven solutions are never "one-size-fits-all" and exist in symbiosis with the socio-economic context in which they are devised and implemented. As such, it is difficult to create a single overarching regulatory framework for the development and use of AI in any country, especially in countries with diverse socio-economic demographics like India. Configuring the appropriate regulatory framework for AI correctly is important. Heavy-handed regulation or regulatory uncertainty might act as a disincentive for innovation due to compliance fatigue or fear of liability. Similarly, regulatory laxity or forbearance might result in the dilution of safeguards and, in turn, a violation of constitutional rights and human dignity. Therefore, we have sought to conceptualize optimal regulatory interventions based on key constitutional values and human rights that the state should seek to protect when creating a regulatory framework for AI. To devise these interventions, we identify a decision-making framework consisting of a set of core questions that can be used to determine the extent of regulatory intervention required to protect these values and rights.

We have arrived at the framework by identifying key values and rights, and analyzing AI use cases to understand how different uses and configurations of AI can challenge these values and rights. Specifically, the paper examines:

1. Use of AI in predictive policing by law enforcement;
2. Use of AI in credit rating by means of the establishment of the Public Credit Registry (PCR) in India; and
3. Use of AI in improving crop yields for farmers.

This paper is divided into three sections. In the first section, we look at various models of regulation. In the second section, we expand on the use cases we chose to study in detail and the policy. In the third section, we identify core constitutional values that any regulatory framework on AI in the public sector should look to protect. In this section, we also highlight key regulatory interventions that need to be made to protect these values by developing a set of guiding questions.

We chose to work on the Indian ecosystem for three substantive reasons, apart from the convenience of geographic proximity, which allowed us to conduct our primary research. First, in terms of public policy advancement, we feel that working in India is important, as the technology and its governance frameworks are both in their nascent stages, and the potential for the use of these technologies and their impact on the populace, especially those emerging technologically, is immense. Second, the constitutional framework in India on key issues such as privacy, discrimination, and exclusion has both a legacy of jurisprudence and is at a critical juncture, as these evolve and adapt with respect to emerging technologies. Finally, we believe that focusing on India allows us to make a unique contribution to the existing literature, as it charts out a potential regulatory model for other similarly placed emerging economies.

Our framework limits itself to decision making by a regulator when designing or deploying the AI solution. It does not delve into the adaptive regulatory strategy that needs to be devised as the AI project is implemented. It is also not an exhaustive framework, as many context-specific questions will alter its application. The objective is limited to framing broad questions that can guide specific regulatory interventions as decision makers choose to adopt AI.

Methodology

From the outset, we realize that the term "artificial intelligence" is used in multiple ways, and its definition is often contested. For the purposes of this paper, we define AI as a dynamic learning system where a certain level of decision-making power is being delegated to the machine (Basu & Hickok, 2018). In doing so, we distinguish AI from automation, where a machine is being made to perform a repetitive task.
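To make that distinction concrete, the following is a minimal illustrative sketch, not drawn from the paper: the automation function applies a fixed, human-authored rule, while the learning system derives its own decision boundary from data, so part of the decision-making power is delegated to the machine. All names, features, and thresholds here are hypothetical.

```python
# Illustrative sketch only (hypothetical names and thresholds, not the paper's):
# automation applies a fixed, human-written rule; an AI system in the sense used
# above learns its own rule from data, shifting some decision-making power to
# the machine.

from sklearn.linear_model import LogisticRegression


def automated_eligibility_check(annual_income: float) -> bool:
    """Automation: a repetitive task governed by a rule fixed in advance by a human."""
    return annual_income >= 300_000  # the human, not the machine, set this threshold


class LearnedEligibilityScreen:
    """AI as defined above: the decision boundary is learned from historical data."""

    def __init__(self) -> None:
        self.model = LogisticRegression()

    def fit(self, applicant_features, past_outcomes) -> None:
        # The machine derives its own rule from the data it is given.
        self.model.fit(applicant_features, past_outcomes)

    def decide(self, applicant_features) -> bool:
        # The delegated decision: the learned model, not a human-authored rule, decides.
        return bool(self.model.predict([applicant_features])[0])
```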
The first stage of our research involved studying three applications of AI in public functions. Through primary interviews and desk research, we sought to understand:

• How the decision was arrived at to devise an AI-based solution
• Relevant policy or political enablers or detractors
• What preparatory research or field work was done before implementing the solution
• How the data was gathered and collected
• Impact assessment frameworks or evaluation metrics used to determine the success of the project by the developers and implementers
• External assessments of the impact

Using what was learnt from these case studies, we created a decision-making framework that relied on key threshold questions, as well as possible regulatory tools that could be applied.
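One way to picture this kind of framework is as an explicit mapping from threshold questions to candidate regulatory tools. The sketch below is purely illustrative and is not the framework developed in the paper; the questions, the scoring, and the tool list are hypothetical stand-ins.

```python
# Illustrative sketch only: a hypothetical encoding of a threshold-question
# framework that maps a regulator's answers to candidate regulatory tools.
# The questions, scoring, and tools are stand-ins, not the paper's framework.

from dataclasses import dataclass


@dataclass
class ThresholdQuestion:
    text: str
    answer_yes: bool = False  # filled in by the regulator for a given use case


QUESTIONS = [
    ThresholdQuestion("Is a public function being performed, wholly or partly, by a private actor?"),
    ThresholdQuestion("Does the system make or materially shape decisions affecting fundamental rights?"),
    ThresholdQuestion("Is the decision difficult for affected individuals to understand or contest?"),
    ThresholdQuestion("Is personal or sensitive data processed at scale?"),
]

# Hypothetical mapping from the number of triggered questions to possible tools.
TOOLS_BY_LEVEL = {
    0: ["self-regulation and voluntary standards"],
    1: ["transparency and disclosure requirements"],
    2: ["mandatory impact assessment", "independent audit"],
    3: ["licensing or prior approval", "statutory limits on delegation"],
}


def suggest_tools(questions: list[ThresholdQuestion]) -> list[str]:
    """Count the triggered threshold questions and return candidate regulatory tools."""
    triggered = sum(q.answer_yes for q in questions)
    level = min(triggered, max(TOOLS_BY_LEVEL))
    return TOOLS_BY_LEVEL[level]
```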
Section I: Regulatory Models for AI

Privatizing Public Functions

Across the world, activities traditionally undertaken by the state, including running prisons, policing, solving disputes, and providing housing and health services, are increasingly being delegated to private actors, often either private firms operating transnationally (Palmer, 2008) or quasi-governmental actors (Scott, 2017a). It is not only a shift in the extent of legislative discretion but the creation of formal and rule-based arrangements that were not needed in the welfare state model, where the state delivered all services directly (Scott, 2017a). Braithwaite's conception of a regulatory state combines state oversight with the commodification of service provision, where the citizen is treated as a consumer (Braithwaite, 2000). Businesses must deliver services with state oversight, but the extent of oversight and the modes of regulation must be determined contextually (Scott, 2017b).

The increasing privatization of public functions throws up two key constitutional questions. First, to what extent can public functions be delegated to a private actor? Little jurisprudence exists on this, as there have been very few challenges to privatization across jurisdictions. The Indian Supreme Court in Nandini Sundar and Ors vs State of Chattisgarh (2011), which banned the state-designated private police organization, Salwa Judum, held that "modern constitutionalism posits that no wielder of power should be allowed to claim the right to perpetuate state's violence… unchecked by law, and notions of innate human dignity of every individual" (Sundar and Ors v State of Chattisgarh, 2011). The Court went on to criticize the state of Chattisgarh's "policy of privatization" that was the cause of income disparity and non-allocation of adequate financial resources in the region, which in turn was responsible for the Maoist/Naxalite insurgency. However, there was no clarification on what services are "governmental" and…

…human dignity instead of undermine it. This is a valid concern for emerging economies, as there are various circumstances, including AI deployment, where private actors can deliver services more efficiently than an overstretched state. However, given the implications for human rights and dignity, it is conceptually difficult to draw an objective line on delegation. The Court "assumed there is no constitutional impediment to…