
Corporate governance, AI, and the “scored” society: Are we headed for a social-ratings-filled future?

Jason R. Baron, Of Counsel, Faegre Drinker LLP

Richard P. Kessler, Director, KPMG LLP

Agenda

• Black Mirror: “Nosedive”
• China’s “Social Credit” System
• Emerging Social Credit/Scoring in the U.S.
• Legal Frameworks
• Data Protection Laws
• Data and Information Governance Models
• Corporate Governance & Transparency

Black Mirror – “Nosedive”

• Black Mirror considers the murky relationship between humans and technology.
• This episode is set in a dystopian world where everyone in society is rated one to five by the people they interact with.
• Personal ratings directly determine each character’s societal value, based on whom they interact with.

Business data carry a similar weight: they drive data-driven decision-making based on how our value is evaluated.

News Headlines

• “China’s New Social Credit System is a Dystopian Nightmare”
• “China’s ‘Social Credit System’ Has Already Stopped Millions From Traveling”
• “Laolai Map within WeChat – A System That Displays Blacklisted People, Companies, and Other Organizations Within a Given Area”
• “China Will Implement Its Corporate Social Credit System in 2020. Foreign Companies Better Get Ready”
• “Uh-Oh: Silicon Valley is Building a Chinese-Style Social Credit System”

China’s Social Credit System: your score = your social value

• Rewards volunteer activity and repaying debts promptly.
• Rewards “filial piety”: devotion to one’s parents, grandparents, and perhaps other relatives.
• Punishes people for court judgments, court records, academic dishonesty, jaywalking, moving violations, and failing to pay transit fares.
• Punishes posting political opinions without prior permission or “unreliable” information, or engaging in “negative interactions” online (whatever that means).
• Punishes having a child without the necessary administrative permission.
• Peer scoring could be employed: when a person’s score drops, the system could also lower the scores of family and friends (see the sketch below).
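To make the peer-scoring mechanic concrete, here is a minimal Python sketch of how such score propagation might work. The spillover factor, names, and data structures are invented for illustration and are not based on any published implementation.

```python
# Hypothetical sketch of peer scoring: when one person's score drops,
# an attenuated penalty propagates to their social circle.
# Illustrative only; not based on any published implementation.

SPILLOVER = 0.2  # assumed fraction of a penalty passed on to peers

def apply_penalty(scores: dict, social_graph: dict, person: str, penalty: float) -> None:
    """Lower `person`'s score and spill a fraction of the penalty onto their peers."""
    scores[person] = max(0.0, scores[person] - penalty)
    for peer in social_graph.get(person, []):
        scores[peer] = max(0.0, scores[peer] - SPILLOVER * penalty)

scores = {"alice": 800.0, "bob": 750.0, "carol": 900.0}
social_graph = {"alice": ["bob", "carol"]}  # alice's family and friends

apply_penalty(scores, social_graph, "alice", 50.0)  # e.g., a court judgment
print(scores)  # {'alice': 750.0, 'bob': 740.0, 'carol': 890.0}
```

Even in this toy version, the design choice is visible: individuals are penalized for conduct that is not their own, which is precisely the due-process concern raised later in this deck.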

China’s Social Credit System: Potential Punishments & Rewards

Punishments
• Restrictions on government subsidies, business licenses, and social welfare services
• Restrictions on travel or which hotels can be booked
• Limitations on school choices (for oneself and one’s children)
• Prohibition on using the Internet

Rewards
• Preferential treatment across the above range of services

China’s Social Credit System: Total Algorithmic Governance

Hypothetical: A company pollutes a river, and regulators uncover the problem.
• 20th-century model: fines and criminal penalties imposed through regulatory, administrative, or judicial process.
• 21st-century model: the company’s CEO is punished in all the ways discussed above.

Hypothetical: A bank sends false information about a customer to prevent the customer from seeking better terms at another bank. There may be no way to ever find out about the “defamation.”

“NOSEDIVE” & The Scored Society in the U.S.

• Airbnb, Lyft, and similar platforms employ 360-degree ratings (owner/driver + renter/passenger).
• eBay, Amazon, etc. use reputational scores that affect purchase decisions.
• Credit analytics companies are testing social media profiles to validate eligibility for loans.
• Social media postings and vilification can make individuals unemployable (“online footprints”).
• Job candidates are ranked by what their online activities say about creativity and leadership.
• Software engineers are assessed for contributions to open source projects, with points awarded when others use their code.
• Sentencing decisions are based on algorithmic scores of the likelihood of recidivism. State v. Loomis, 881 N.W.2d 749 (Wis. 2016), cert. denied sub nom. Loomis v. Wisconsin, 137 S. Ct. 2290 (2017).
• Algorithmic predictions about health risks, based on what individuals share on mobile apps about caloric intake, may soon result in higher insurance premiums.
• Data on “bad drivers” can be aggregated (either from external reporting or from in-vehicle recording devices), leading insurance companies to adjust risk and premiums.

Legal issues

Algorithms can embed the hidden biases of their programmers through choices of input variables.

Algorithms depend on data that may be inaccurate, incomplete, or biased.

Algorithms are not transparent to civil society.

Algorithmic scoring is proceeding without expert oversight.

Algorithmic scoring can have a disparate impact on individuals in particular groups or categories (see the sketch after this list).

Scoring individuals may become a self-fulfilling prophecy.
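One way to make the disparate-impact concern measurable is the “four-fifths rule” drawn from U.S. employment-selection guidance: compare each group’s selection rate to the most-favored group’s rate and flag ratios below 0.8. The sketch below uses invented numbers purely for illustration.

```python
# A minimal sketch of one common disparate-impact check: the
# "four-fifths rule." An algorithm's selection rate for each group is
# compared to the rate for the most-favored group; a ratio under 0.8
# is a red flag. All figures below are invented for illustration.

def selection_rate(selected: int, total: int) -> float:
    return selected / total

def disparate_impact_ratios(rates: dict) -> dict:
    """Ratio of each group's selection rate to the highest group's rate."""
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()}

rates = {
    "group_a": selection_rate(selected=90, total=200),  # 0.45
    "group_b": selection_rate(selected=50, total=200),  # 0.25
}
for group, ratio in disparate_impact_ratios(rates).items():
    flag = "  <-- below 0.8 threshold" if ratio < 0.8 else ""
    print(f"{group}: impact ratio {ratio:.2f}{flag}")
```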

Legal frameworks for confronting our Black Mirror future

Regulatory oversight: e.g., FTC jurisdiction under Section 5 of the Federal Trade Commission Act, 15 U.S.C. § 45, to review scoring systems and other practices that may constitute unfair or deceptive trade practices that harm consumers.

Fair Information Practice Principles (FIPPs) governing responsible collection, use and management of data (also foundational to EU privacy regulation)

GDPR Article 22(1): “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

Technological due process: Standards and procedures to ensure that predictive algorithms and other near-future AI practices are reviewed and revised with humans in the loop.
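A minimal sketch of what such a human-in-the-loop gate could look like in code, in the spirit of GDPR Article 22: decisions that would produce legal or similarly significant effects are routed to a human reviewer rather than taking effect automatically. The `Decision` type, routing rules, and confidence threshold are all hypothetical.

```python
# Hypothetical human-in-the-loop routing for automated decisions.
# Decisions with legal or similarly significant effects never take
# effect automatically (an Article 22-style safeguard); low-confidence
# decisions are also escalated. Threshold and types are assumptions.

from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    outcome: str              # e.g., "deny_loan"
    significant_effect: bool  # legal or similarly significant effect?
    model_confidence: float

def route(decision: Decision) -> str:
    """Return 'auto' only for low-stakes, high-confidence decisions."""
    if decision.significant_effect:
        return "human_review"            # never fully automated
    if decision.model_confidence < 0.9:  # assumed confidence threshold
        return "human_review"
    return "auto"

print(route(Decision("subj-1", "deny_loan", True, 0.97)))    # human_review
print(route(Decision("subj-2", "show_banner", False, 0.95))) # auto
```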

Data Protection Laws: CCPA and GDPR

California Consumer Privacy Act (CCPA): Overview

CCPA overview
• On June 28, 2018, California passed a data privacy law known as the California Consumer Privacy Act (CCPA), after unanimous approval by its State Assembly and Senate.
• The CCPA gives consumers more control over, and protection of, their personal data.
• Many of these consumer protection provisions are similar to the GDPR, the sweeping EU privacy regulation that took effect in May 2018 and impacted multinational organizations doing business in and with EU countries.
• Day 1 compliance: January 1, 2020
• Scope: all companies around the globe processing personal data of California residents


Key consumer impacts
• Right to know all data collected by a business
• Right to know the categories of sources of information from whom your data was acquired
• Right to know the business or commercial purpose of collecting your information
• Right to know the categories of third parties with whom your data is shared
• Right to be informed of what categories of data will be collected about you prior to its collection, and to be informed of any changes to this collection
• Right to delete your data
• Right to say no to the sale of your information
• Mandated opt-in before sale of children’s information (under the age of 16)
• Private right of action when companies breach your data, to make sure these companies keep your information safe
• Enforcement by the Attorney General of the State of California
• Day 1 compliance: January 1, 2020
• Scope: all companies around the globe processing personal data of California residents

Source: “About the Initiative,” California Consumer Privacy Act, www.caprivacy.org/about.

General Data Protection Regulation (GDPR): Overview

GDPR overview
• The General Data Protection Regulation (GDPR), a comprehensive European data protection regulation, came into effect on May 25, 2018.
• It clearly sets out the ways in which the privacy rights of every individual in the EU must be protected and the ways in which a person’s “personal data” can, and cannot, be used.
• It places the onus on any person or entity involved in the processing of a person’s information (data controller/data processor) to comply with the legislation and to demonstrate compliance.
• It carries significant penalties for noncompliance.

GDPR principles of data protection

• Lawfulness, Fairness, Transparency (legal and clear)
• Purpose Limitation (use only for one or more specified purposes)
• Data Minimisation (collect only the amount of data required for the specified purpose)
• Accuracy (ensure data is kept up to date, accurate and complete)
• Storage Limitation (kept for no longer than necessary for the specified purpose)
• Integrity and Confidentiality (ensuring appropriate security of data)
• Accountability (essential not only to be compliant, but to be able to demonstrate compliance)

• Day 1 compliance: May 25, 2018
• Scope: all companies around the globe processing personal data of individuals in the EU

Data and Information Governance Models

Data governance vs. information governance: the legacy view

Data governance
“The exercise of authority and control over the management of data assets to define, approve and communicate data strategies, policies, and standards; to track and enforce regulatory compliance and conformance to data…”
– Data Management Association (DAMA)

Information governance
“The specification of decision rights and an accountability framework to ensure appropriate behavior in the valuation, creation, storage, use, archiving and deletion of information. It includes the processes, roles and policies, standards and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals.”
– Gartner

Data governance
“The function that defines and implements the standards, controls and best practices of the data management program in alignment with strategy.”
– Enterprise Data Management Council (EDM)

Information governance
“Activities and technologies that organizations employ to maximize the value of their information while minimizing associated risks and costs.”
– Information Governance Initiative

What is unified data and information governance?

What is UDIG?
KPMG LLP’s (KPMG) approach for governing data and information enterprise-wide to bridge silos and govern data by leveraging collective knowledge and expertise across various disciplines. Data value and re-use (e.g., monetization) is maximized as risks and costs are mitigated and minimized.

What is the vision of UDIG?
• Simplify governance: creating a common language for data and information, fostering collaboration and decisions that are both data-driven and expedient
• Data is governed across data and information disciplines concurrently, facilitating visibility, collaboration, new insights, and accelerated change management
• Organizations evolve to be more value-driven, agile, protected, and compliant by design
• Recognizes data as both an asset and as a risk

[Diagram: a 14-segment UDIG wheel spanning value creation (data as an asset) to value enablement (data as a risk): Business Growth, Intelligent Automation, Analytics & Insights, Experience, Privacy, Risk Management, Transformation, Compliance, Operational Resilience, Operational Excellence, Data Management, Security, Investigations, Lifecycle.]

Privacy challenges are UDIG challenges

• Data Inventory and Mapping: the ability to know what, where, why, and for how long an organization uses, processes, stores, shares, or sells personal data.
• Privacy Risk Management: the ability to assess privacy and data protection risks and influence how data is used, processed, stored, shared, or sold.
• Consumer Request and Fulfillment: the ability to allow individuals to request access, changes or deletion of their personal information and fulfill these requests within the time limits defined by applicable laws and regulations (see the sketch after this list).
• Breach Response: the ability to detect and respond to a data breach within the time limits defined by applicable laws and regulations.
• Privacy Governance: the ability to define organizational privacy strategy, expectations, and structure to monitor program effectiveness.
• Data Protection: the ability to apply and maintain security controls to protect personal data.
• Privacy Notice and Consent: the ability to notify individuals about how their personal data is used, processed, stored, shared, or sold while accurately obtaining and tracking their granular consent.
• Organizational Change: the ability to promote awareness and understanding of privacy risks, rules and safeguards.
• Privacy Assurance: the ability to provide internal or external independent assurance (e.g., internal lines of defense) of the effectiveness of the privacy program’s governance, risk management and systems of internal controls.
• Third Party Risk: the ability to understand and mitigate the liabilities associated with transferring data to and from third parties.
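As one concrete slice of the Consumer Request and Fulfillment capability, the sketch below tracks statutory response deadlines. The windows are simplified (the CCPA allows 45 days; the GDPR allows one month, approximated here as 30 days), and a real program must also handle extensions and identity verification.

```python
# A minimal sketch of deadline tracking for consumer/data-subject
# requests. Statutory windows are simplified: CCPA = 45 days;
# GDPR = one month, approximated as 30 days. Real programs must also
# handle extensions and verification steps.

from datetime import date, timedelta

RESPONSE_WINDOWS = {"ccpa": timedelta(days=45), "gdpr": timedelta(days=30)}

def due_date(received: date, regime: str) -> date:
    return received + RESPONSE_WINDOWS[regime]

def overdue(received: date, regime: str, today: date) -> bool:
    return today > due_date(received, regime)

req = date(2020, 1, 2)
print(due_date(req, "ccpa"))                         # 2020-02-16
print(overdue(req, "gdpr", today=date(2020, 2, 5)))  # True
```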

Near future issues and additional governance considerations

[Diagram: the UDIG wheel repeated across a data value spectrum (from deep insights to data quality), surrounded by near-future considerations:]
• Cultural, ethical and moral implications
• Long-term societal impact
• Mental health, quality of life and overall well-being
• Macro-economic outcomes of disruptive technologies

Data value model: using a value and risk scorecard to improve transparency and decisions

Value of data as an asset
• What data do we have, where is it located, and how are we using it?
• How can we get insight into data value across the organization?
• Is the treatment of the data proportionate to its value?

Risk of data as a liability
• What are our aggregate information risks?
• How do we mitigate them?
• Is our residual risk controlled within our appetite?

Bridging both: considering the value of and the intended use of a particular data set, do we have the proper controls applied to trust and protect it?
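A toy sketch of how such a value and risk scorecard might be computed per data set. The fields, weights, and thresholds are invented for illustration; this is not KPMG’s actual model.

```python
# Toy data value/risk scorecard: score each data set on value (as an
# asset) and risk (as a liability), then flag data sets whose controls
# are not proportionate to the risk. All fields, weights, and
# thresholds are invented; this is not KPMG's actual model.

DATASETS = [
    # value drivers (0-5), risk drivers (0-5), control maturity (0-5)
    {"name": "customer_profiles", "reuse": 5, "insight": 4,
     "sensitivity": 5, "exposure": 4, "controls": 2},
    {"name": "public_web_logs", "reuse": 2, "insight": 2,
     "sensitivity": 1, "exposure": 1, "controls": 3},
]

def value_score(d: dict) -> float:   # value of data as an asset
    return (d["reuse"] + d["insight"]) / 2

def risk_score(d: dict) -> float:    # risk of data as a liability
    return (d["sensitivity"] + d["exposure"]) / 2

for d in DATASETS:
    v, r = value_score(d), risk_score(d)
    # Flag when controls lag the risk: residual risk outside appetite.
    flag = "REVIEW" if r > d["controls"] else "ok"
    print(f"{d['name']}: value={v:.1f} risk={r:.1f} "
          f"controls={d['controls']} -> {flag}")
```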

Corporate Governance & Transparency

Corporate Ethics in an Age of Social Credits, Ratings, Monitoring & Surveillance


Data Ethics Needs a C-Suite Champion

A Governance Strategy in an Age of Social Credits, Ratings, Monitoring & Surveillance

(1) Development of an IG council with an IG champion
(2) Include engineering and R&D centers
(3) Building in notions of “fairness by design,” “nondiscrimination by design,” “transparency by design,” and “due process by design”
(4) Continuous review of analytics programs for ethical and legal concerns
(5) Develop policies to govern algorithmic use
(6) Attain executive-level support (C-suite and Board)
(7) Consider outside audits and oversight

Corporate Ethical Review Boards?

• To evaluate hidden algorithms affecting employees, potential hires, consumers/clients

• To provide greater transparency in decision making

• To build trust and confidence in corporate strategies and tactics

• Modeled on institutional review boards (IRBs), an ERB could be housed under the Chief IG Officer, Chief Data Officer, or CIO

Black Mirror Future

The greater “records and information management” community (including lawyers and IT professionals) has a role in helping to frame and control our “Black Mirror” future.

Recommendations and call to action

Protection of Personal Data: personal data protection must be taken extremely seriously by practitioners, attorneys, our firms, and by our society to protect individuals. Protect the data = protect the individual.

Regulatory Response: new laws and regulations may be required to avoid unintended and potentially dangerous consequences of market- and profit-driven new technologies.

Holistic Approach: more holistic compliance and governance programs can help ensure that legal, regulatory, societal, and ethical consequences are always considered.

Technical Capabilities: understand the technology, data, and capabilities of what you use in your business and personal life, and take steps to identify and mitigate potential exposures.

Jason R. Baron [email protected]

Richard P. Kessler [email protected]

Thank you

The information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. Although we endeavor to provide accurate and timely information, there can be no guarantee that such information is accurate as of the date it is received or that it will continue to be accurate in the future. No one should act upon such information without appropriate professional advice after a thorough examination of the particular situation.

Further Reading: Social Scoring

• Frank Pasquale, “Quantifying Love,” Boston Review (April 4, 2019), http://bostonreview.net/print-issues-politics/frank-pasquale-quantifying-love
• Frank Pasquale, “Data Nationalization in the Shadow of Social Credit Systems,” Law and Political Economy (June 18, 2018), https://lpeblog.org/2018/06/18/data-nationalization-in-the-shadow-of-social-credit-systems/
• European University Institute, “The Chinese social credit system: A model for other countries?” (2019), https://cadmus.eui.eu/bitstream/handle/1814/60424/LAW_2019_01.pdf?sequence=1&isAllowed=y
• Comments of EPIC to the FTC re Consumer Welfare Implications Associated with the Use of Algorithmic Decision Tools, AI and Predictive Analytics (Aug. 20, 2018), https://www.ftc.gov/system/files/documents/public_comments/2018/08/ftc-2018-0056-d-0024-155150.pdf

Suggested Further Reading (Algorithmic Accountability)

• Article 29 Data Protection Working Party, “Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679” (last revised and adopted Feb. 6, 2018), https://iapp.org/media/pdf/resource_center/W29-auto-decision_profiling_02-2018.pdf
• Solon Barocas & Andrew D. Selbst, “Big Data’s Disparate Impact,” 104 California Law Review 671 (2016), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2477899
• Maja Brkan, “Do Algorithms Rule the World? Algorithmic Decision-Making in the Framework of the GDPR and Beyond” (Feb. 22, 2018), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3124901
• Lilian Edwards & Michael Veale, “Enslaving the Algorithm: From a ‘Right to an Explanation’ to a ‘Right to Better Decisions,’” IEEE Security & Privacy 16:3 (2018), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3052831
• Lilian Edwards & Michael Veale, “Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For,” 16 Duke Law & Technology Review 17 (2017)
• Kathryn Hume, “When Is It Important for an Algorithm to Explain Itself?,” Harvard Business Review (July 6, 2018), https://hbr.org/2018/07/when-is-it-important-for-an-algorithm-to-explain-itself
• Margot E. Kaminski, “The Right to Explanation, Explained,” U of Colorado Law Legal Studies Research Paper No. 18-24 (2018), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3196985
• Joshua Kroll et al., “Accountable Algorithms,” 165 University of Pennsylvania Law Review 633 (2017), https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?referer=&httpsredir=1&article=9570&context=penn_law_review
• Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (Harvard University Press 2015)
• Andrew D. Selbst & Julia Powles, “Meaningful Information and the Right to an Explanation,” 7(4) International Data Privacy Law 233 (2017), https://academic.oup.com/idpl/article/7/4/233/4762325
• Sandra Wachter et al., “Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR,” Harvard Journal of Law & Technology 31:2 (2018), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3063289