“Scored” Society: Are We Headed for a Social-Ratings-Filled Black Mirror Future?
Corporate governance, AI, and the “scored” society: Are we headed for a social-ratings-filled Black Mirror future?
Jason R. Baron, Of Counsel, Faegre Drinker LLP
Richard P. Kessler, Director, KPMG LLP

Agenda
• Black Mirror: “Nosedive”
• China’s Social Credit System
• Emerging “Social Credit”/Scoring in the U.S.
• Legal Frameworks
• Data Protection Laws
• Data and Information Governance Models
• Corporate Governance & Transparency

Black Mirror – “Nosedive”
• Black Mirror explores the murky relationship between humans and technology.
• The episode is set in a dystopian world where everyone in society is rated from one to five by the people they interact with.
• Personal ratings are directly correlated with a character’s societal value, which in turn depends on whom they interact with. Business data carry similar weight: they drive data-driven decision-making based on how we are valued.

News Headlines
• “China’s New Social Credit System is a Dystopian Nightmare”
• “China’s ‘Social Credit System’ Has Already Stopped Millions From Traveling”
• “Laolai Map within WeChat – A System That Displays Blacklisted People, Companies, and Other Organizations Within a Given Area”
• “China Will Implement Its Corporate Social Credit System in 2020. Foreign Companies Better Get Ready”
• “Uh-Oh: Silicon Valley is Building a Chinese-Style Social Credit System”

China’s Social Credit System: your score = your social value
• Rewards volunteer activity and repaying debts promptly.
• Rewards “filial piety”: devotion to one’s parents, grandparents, and perhaps other relatives.
• Punishes people for court judgments, court records, academic dishonesty, jaywalking, moving violations, and failing to pay transit fares.
• Punishes posting political opinions without prior permission, posting “unreliable” information, or engaging in “negative interactions” online (whatever that means).
• Punishes having a child without the necessary administrative permission.
• Peer scoring could be employed: when a person’s score drops, the system could also lower the scores of their family and friends (see the illustrative sketch below).
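The peer-scoring bullet above describes a propagation rule rather than a single lookup: one person’s misstep lowers the scores of the people connected to them. As a minimal illustrative sketch, assuming a hypothetical point scale, spillover factor, and contact list (none of which are documented features of the actual system), the mechanic might look like this:

```python
# Purely illustrative sketch of peer-score propagation; the names, starting
# score, penalty, and spillover factor are all hypothetical assumptions.

from dataclasses import dataclass, field

@dataclass
class Citizen:
    name: str
    score: float = 1000.0                          # hypothetical starting score
    contacts: list = field(default_factory=list)   # family and friends

def apply_penalty(person: Citizen, penalty: float, spillover: float = 0.25) -> None:
    """Deduct a penalty and pass a fraction of it on to each direct contact."""
    person.score -= penalty
    for contact in person.contacts:
        contact.score -= penalty * spillover       # peers are punished for the association

# Usage: a 50-point penalty for jaywalking also costs each contact 12.5 points.
alice, bob = Citizen("Alice"), Citizen("Bob")
alice.contacts.append(bob)
apply_penalty(alice, 50.0)
print(alice.score, bob.score)                      # 950.0 987.5
```

The design point worth noticing is that penalties become contagious: once spillover exists, merely associating with a low-scoring person is costly, which is what gives the “family and friends” provision its coercive force.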
China’s Social Credit System: Potential Punishments & Rewards
Punishments
• Restrictions on government subsidies, business licenses, and social welfare services
• Restrictions on travel, or on which hotels can be booked
• Limitations on school choices (for oneself and one’s children)
• Prohibition on using the Internet
Rewards
• Preferential treatment across the same range of services

China’s Social Credit System: Total Algorithmic Governance
Hypothetical: A company pollutes a river, and regulators uncover the problem.
• 20th-century model: fines and criminal penalties imposed through a regulatory, administrative, or judicial process.
• 21st-century model: the company’s CEO is punished in all of the ways discussed above.
Hypothetical: A bank sends false information about a customer to prevent the customer from seeking better terms at another bank. There may be no way to ever find out about the “defamation.”

“NOSEDIVE” & The Scored Society in the U.S.
• Airbnb, Uber, and Lyft all employ 360-degree ratings (owner/driver + renter/passenger).
• eBay, Amazon, and similar marketplaces use reputational scores that affect purchase decisions.
• Credit analytics companies are testing social media profiles to validate eligibility for loans.
• Social media postings and vilification can make individuals unemployable (“online footprints”).
• Job candidates are ranked by what their online activities say about creativity and leadership.
• Software engineers are assessed on their contributions to open source projects, with points awarded when others use their code.
• Sentencing decisions are based on an algorithmic score of the likelihood of recidivism. State v. Loomis, 881 N.W.2d 749 (Wis. 2016), cert. denied sub nom. Loomis v. Wisconsin, 137 S. Ct. 2290 (2017).
• Algorithmic predictions about health risks, based on what individuals share on mobile apps about caloric intake, may soon result in higher insurance premiums.
• Data on “bad drivers” can be aggregated (either from external reporting or from in-vehicle recording devices), leading insurance companies to adjust risk ratings and premiums.

Legal issues
• Algorithms embed the hidden biases of their programmers through the choices made for input variables.
• Algorithms depend on data that may be inaccurate, incomplete, or biased.
• Algorithms are not transparent to civil society.
• Algorithmic scoring is proceeding without expert oversight.
• Scoring can have a disparate impact on individuals in particular groups or categories.
• Scoring individuals may become a self-fulfilling prophecy.

Legal frameworks for confronting our Black Mirror future
• Regulatory oversight: e.g., FTC jurisdiction under Section 5 of the Federal Trade Commission Act, 15 U.S.C. § 45, to review scoring systems and other deceptive practices that may constitute unfair trade practices harming consumers.
• Fair Information Practice Principles (FIPPs) governing the responsible collection, use, and management of data (also foundational to EU privacy regulation).
• GDPR Article 22(1): “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
• Technological due process: standards and procedures to ensure that predictive algorithms and other near-future AI practices are reviewed and revised with humans in the loop (see the sketch below).
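The “humans in the loop” item above is an architectural requirement, and a short sketch can make it concrete. The following is a minimal hypothetical illustration, not an implementation of GDPR Article 22 or of any vendor’s product; the Decision class, the significant_effect flag, the 0.5 threshold, and the review queue are all assumptions made for the example. The idea is simply that a model’s output is acted on automatically only when it carries no legally significant effect, and is otherwise routed to a human reviewer.

```python
# Hypothetical sketch of a human-in-the-loop gate for automated decisions,
# loosely in the spirit of GDPR Article 22(1). The classes, threshold, and
# review queue are assumptions made for this example.

from dataclasses import dataclass
from queue import Queue

@dataclass
class Decision:
    subject_id: str
    score: float              # output of some predictive model
    significant_effect: bool  # would acting on it significantly affect the subject?

human_review_queue: Queue = Queue()

def route(decision: Decision) -> str:
    """Act automatically only when there is no significant effect; otherwise escalate."""
    if decision.significant_effect:
        human_review_queue.put(decision)  # a person must confirm, revise, or override
        return "pending_human_review"
    return "auto_approved" if decision.score >= 0.5 else "auto_declined"

# Usage: a loan denial significantly affects the applicant, so it is escalated.
print(route(Decision("applicant-42", score=0.31, significant_effect=True)))
# -> pending_human_review
```

The gate sits between the model and its effect; logging who reviewed what, and why, is what turns “human in the loop” from a slogan into an auditable procedure.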
Data Protection Laws: CCPA and GDPR

California Consumer Privacy Act (CCPA): Overview
CCPA overview
• On June 28, 2018, California passed a data privacy law known as the California Consumer Privacy Act (CCPA), after unanimous approval by its State Assembly and Senate.
• The CCPA gives consumers more control over, and protection of, their personal data.
• Many of its consumer-protection provisions are similar to those of the GDPR, the sweeping EU privacy regulation that took effect in May 2018 and affected multinational organizations doing business in and with EU countries.
• Day 1 compliance: January 1, 2020
• Scope: all companies around the globe processing the personal data of California citizens

California Consumer Privacy Act (CCPA): Overview
Key consumer impacts
• Right to know all data collected about you by a business
• Right to know the categories of sources from which your data was acquired
• Right to know the business or commercial purpose of collecting your information
• Right to know the categories of third parties with whom your data is shared
• Right to be informed of what categories of data will be collected about you prior to collection, and to be informed of any changes to this collection
• Right to say no to the sale of your information
• Right to delete your data
• Private right of action when companies breach your data, to make sure these companies keep your information safe
• Mandated opt-in before the sale of children’s information (under the age of 16)
• Enforcement by the Attorney General of the State of California
• Day 1 compliance: January 1, 2020
• Scope: all companies around the globe processing the personal data of California citizens
Source: “About the Initiative: California Consumer Privacy Act,” www.caprivacy.org/about.

General Data Protection Regulation (GDPR): Overview
GDPR overview
• The General Data Protection Regulation (GDPR), a comprehensive European data protection regulation, came into effect on May 25, 2018.
• It clearly sets out the ways in which the privacy rights of every EU citizen must be protected and the ways in which a person’s “personal data” can, and cannot, be used.
• It places the onus on any person or entity involved in the processing of a person’s information (data controller/data processor) to comply with the legislation and to demonstrate compliance.
• It carries significant penalties for noncompliance.
GDPR principles of data protection
• Lawfulness, Fairness, Transparency (legal and clear)
• Purpose Limitation (use only for one or more specified purposes)
• Data Minimisation (collect only the amount of data required for the specified purpose)
• Accuracy (ensure data is kept up to date, accurate, and complete)
• Storage Limitation (kept for no longer than necessary for the specified purpose; see the sketch below)
• Integrity and Confidentiality (ensuring appropriate security of data)
• Accountability (essential not only to be compliant, but to be able to demonstrate compliance)
• Day 1 compliance: May 25, 2018
• Scope: all companies around the globe processing the personal data of EU citizens
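Several of the principles above, most directly purpose limitation, data minimisation, and storage limitation, translate naturally into record-level checks. The sketch below is a hypothetical illustration, assuming a record schema and a retention schedule that the GDPR itself does not prescribe, of how an organization might sweep stored data against a documented purpose and retention period:

```python
# Hypothetical sketch of a storage-limitation check. The record fields,
# purposes, and retention periods are assumptions for illustration only;
# the GDPR does not prescribe specific periods or schemas.

from datetime import datetime, timedelta

# Assumed retention schedule, keyed by the documented purpose of collection.
RETENTION_BY_PURPOSE = {
    "billing": timedelta(days=7 * 365),            # e.g., a statutory bookkeeping period
    "marketing_consent": timedelta(days=2 * 365),
}

def is_expired(record: dict, now: datetime) -> bool:
    """A record is expired if it has been held longer than its purpose allows."""
    limit = RETENTION_BY_PURPOSE.get(record["purpose"])
    if limit is None:
        return True  # no documented purpose means retention cannot be justified
    return now - record["collected_at"] > limit

# Usage: sweep a (hypothetical) store and keep only what can still be justified.
records = [
    {"id": 1, "purpose": "marketing_consent", "collected_at": datetime(2016, 1, 10)},
    {"id": 2, "purpose": "billing", "collected_at": datetime(2019, 6, 1)},
]
kept = [r for r in records if not is_expired(r, datetime(2020, 1, 1))]
print([r["id"] for r in kept])  # -> [2]
```

Encoding the schedule this way makes “how long, and for what purpose” an auditable artifact rather than tribal knowledge, which is also what the accountability principle asks for.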
Data and Information Governance Models

Data governance vs. Information governance: legacy view
Data governance: “The exercise of authority and control over the management of data assets to define, approve and communicate data strategies, policies, and standards; to track and enforce regulatory compliance and conformance to data…” etc. – Data Management Association (DAMA)
Information governance: “The specification of decision rights and an accountability framework to ensure appropriate behavior in the valuation, creation, storage, use, archiving and deletion of information. It includes the processes, roles and policies, standards and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals.” – Gartner

Data governance vs. Information governance (continued)
Data governance: “The function that defines and implements the standards, …”
Information governance: “Activities and technologies that …”