
EFiled: Aug 14 2018 04:33PM EDT
Transaction ID 62347665
Case No. 2018-0307-JRS

IN THE COURT OF CHANCERY OF THE STATE OF DELAWARE

KAREN SBRIGLIO and FIREMEN’S RETIREMENT SYSTEM OF ST. LOUIS, derivatively on behalf of Nominal Defendant FACEBOOK, INC.,

          Plaintiffs,

     v.

MARK ZUCKERBERG, SHERYL SANDBERG, MARC ANDREESSEN, ERSKINE B. BOWLES, SUSAN DESMOND-HELLMANN, REED HASTINGS, JAN KOUM, PETER A. THIEL, and PRICEWATERHOUSECOOPERS, LLP,

          Defendants,

     and

FACEBOOK, INC.,

          Nominal Defendant.

C. A. No. 2018-0307-JRS

Filed August 7, 2018
PUBLIC VERSION PURSUANT TO CT. CH. R. 5.1
Public Version Filed August 14, 2018

FIRST AMENDED VERIFIED STOCKHOLDER DERIVATIVE COMPLAINT

This document contains information produced by Nominal Defendant Facebook, Inc. pursuant to Section 220 of the Delaware General Corporation Law (“DGCL”) under a confidentiality agreement.

120294587_1

TABLE OF CONTENTS

Page(s)

TABLE OF EXHIBITS ...... iv

I. PRELIMINARY STATEMENT ...... 1

II. INTRODUCTION...... 3

III. JURISDICTION AND VENUE ...... 24

IV. PARTIES ...... 25

V. RELEVANT NON-PARTIES ...... 31

VI. THE SPECIFIC DUTIES OF FACEBOOK’S BOARD ...... 32

VII. FACTUAL BACKGROUND ...... 38

A. Facebook’s Aggressive Business Model and Open Platform ...... 38

B. Facebook’s Illusory User Agreements, Data Policies and Privacy Settings ...... 42

C. The Extensive Regulation of User Privacy and Facebook’s Business ...... 49

D. Individual Defendants Knew, or Were Reckless in Not Knowing, That Facebook Did Not Comply with Its Legal Obligations ...... 53

1. Facebook has a Culture of Indifference and Narrow Sense of Responsibility ...... 53

2. Facebook was Forced to Explain a Stream of Early Privacy Abuses ...... 57

3. Following Frequent, Ongoing Revelations About Facebook’s Data Privacy Practice, the FTC Takes Enforcement Action ...... 62


4. Following Entry of the FTC Consent Order, Facebook Continued to Violate Its Privacy Obligations ...... 64

5. Facebook Operations Manager and Whistleblower Sandy Parakilas Warned Senior Executives in 2012 of the Dangers of Lax Controls Over App Developers ...... 65

E. GSR and Cambridge Analytica Use Facebook Data For Political Purposes ...... 71

F. Individual Defendants Illegally Concealed the Cambridge Analytica Scandal ...... 76

1. Reports on Cambridge Analytica’s Activities in December 2015...... 76

2. After The Guardian Published the December 2015 Article, Facebook Continued to Pursue Revenue Generated Through Election Activities, Ignoring the Warnings of the Media, Investigators, Zuckerberg’s Mentor Roger McNamee, and Facebook’s Chief Security Officer Alex Stamos...... 82

3. Zuckerberg Refuses to Testify Before the British Parliament...... 88

4. Individual Defendants Continue to Resist Admitting to the Public that they Concealed the Cambridge Analytica Scandal in Hearings Before the U.S. Congress...... 95

G. Following the Revelations of Cambridge Analytica, Zuckerberg Continued to Mislead The Public ...... 98

H. Individual Defendants Knowingly Made False and Misleading Public Statements Regarding Facebook’s Privacy Practices ...... 108

I. Defendants Koum, Zuckerberg, and Sandberg Sold Facebook Shares Based on Their Knowledge of Material, Non-Public Information ...... 127

J. Individual Defendants Consciously Disregarded Their Duties ...... 133


K. Defendant PwC Aided and Abetted Individual Defendants’ Breaches of Fiduciary Duties ...... 137

L. Defendants’ Misconduct Has Harmed Facebook ...... 147

VIII. DERIVATIVE ALLEGATIONS ...... 151

A. Demand on Facebook’s Board Would Have Been Futile ...... 152

B. The Board Lacks Independence ...... 152

C. The Majority of the Board is Subject to Substantial Risk of Personal Liability ...... 159

IX. LEGAL COUNTS ...... 164

COUNT I – BREACH OF FIDUCIARY DUTY (AGAINST DEFENDANT ZUCKERBERG)...... 164

COUNT II – BREACH OF FIDUCIARY DUTY (AGAINST ALL INDIVIDUAL DEFENDANTS OTHER THAN DEFENDANT ZUCKERBERG)...... 165

COUNT III – BREACH OF FIDUCIARY DUTY (AGAINST INDIVIDUAL DEFENDANTS ZUCKERBERG AND SANDBERG) ...... 166

COUNT IV – AIDING AND ABETTING ...... 168

COUNT V – BREACH OF FIDUCIARY DUTY OF DISCLOSURE (AGAINST INDIVIDUAL DEFENDANTS) ...... 170

COUNT VI – INSIDER TRADING (AGAINST DEFENDANTS ZUCKERBERG, SANDBERG, AND KOUM) ...... 171

COUNT VII - CONTRIBUTION OR INDEMNIFICATION (AGAINST ALL INDIVIDUAL DEFENDANTS) ...... 172

PRAYER FOR RELIEF ...... 173


TABLE OF EXHIBITS

Ex. A: FTC Consent Order, dated 7.27.2012 (pp. 6, 52, 63)
Ex. B: “Ted Cruz campaign using firm that harvested data on millions of unwitting Facebook users,” The Guardian, 12.11.2015 (p. 6)
Ex. C: Letter from Senator Blumenthal to FTC, 4.19.2018 (pp. 8, 66)
Ex. D: GSR and Facebook Confidential Settlement Agreement & Mutual Release and Certifications (p. 9)
Ex. E: The Guardian and New York Times articles, March 17, 2018 (pp. 10, 73)
Ex. F: Zuckerberg Opening Statement to Senate Committee, April 10, 2018 (pp. 11, 38)
Ex. G: Transcript of Mark Zuckerberg Senate hearing, 4.10.2018 (pp. 11, 42)
Ex. H: Transcript of Zuckerberg appearance before House committee, 4.11.2018 (pp. 11, 80, 101)
Ex. I: “Utterly horrifying: ex-Facebook insider says covert data harvesting was routine,” The Guardian (pp. 17, 19, 67, 91)
Ex. J: 2015-2017 PwC Report on Facebook's Privacy Program (p. 18)
Ex. K: Copy of Andrew Bosworth Memo, dated 6.18.2016 (p. 56)
Ex. L: Facebook Data Policy, as of Dec. 11, 2015 (p. 47)
Ex. M: Facebook Data Policy, April 19, 2018 (p. 48)
Ex. N: Facebook's 2018 Proxy Statement (pp. 123, 170)
Ex. O: Zuckerberg Insider Sales Table (p. 130)
Ex. P: Sandberg Insider Sales Table (p. 132)
Ex. Q: Koum Insider Sales Table (p. 132)
Ex. R: 2012-2013 PwC Report on Facebook's Privacy Program (p. 141)
Ex. S: 2013-2015 PwC Report on Facebook's Privacy Program (p. 141)
Ex. T: Facebook -- 2018 Proxy -- SUPP (pp. 121, 170)


I. PRELIMINARY STATEMENT

Plaintiffs Firemen’s Retirement System of St. Louis (“FRS”) and Karen

Sbriglio (collectively, the “Plaintiffs”) derivatively on behalf of nominal defendant

Facebook, Inc. (“Facebook” or the “Company”), bring the following Amended

Verified Derivative Complaint against Facebook’s Chief Executive Officer

(“CEO”) and Chairman of its board Mark Zuckerberg (“Zuckerberg”), Chief

Operating Officer (“COO”) and director Sheryl Sandberg (“Sandberg”), and directors Marc Andreessen (“Andreesen”), Erskine B. Bowles (“Bowles”), Susan Desmond-Hellmann (“Desmond”), Reed Hastings (“Hastings”), Jan Koum (“Koum”), and Peter A. Thiel (“Thiel”) (collectively, “Individual Defendants”), for breaching fiduciary duties that they owed to the Company and to its shareholders, and also bring claims against PricewaterhouseCoopers, LLP (“PwC”) for aiding and abetting in several of these breaches of fiduciary duty.

Except for the allegations specifically pertaining to Plaintiffs and Plaintiffs’ own acts, the allegations in this Complaint are based on information and belief.

Shareholder Plaintiffs, by and through their attorneys, derivatively on behalf of

Facebook, allege upon personal knowledge as to themselves and their own acts, and upon information and belief as to all other matters, based upon, inter alia, the investigation conducted by and through their attorneys, which included, among other things:


• Exposé newspaper articles;

• Transcripts of testimony, written statements, and documents submitted by over 50 different witnesses in connection with Britain’s House of Commons Digital, Culture, Media and Sport Committee investigation of “fake news;”

• Written statements and testimony by Cambridge Analytica whistleblower Christopher Wylie (“Wylie”);

• Written statements and testimony of former Facebook operations manager and whistleblower Sandy Parakilas (“Parakilas”);

• Transcripts of testimony given by Facebook CEO Mark Zuckerberg before the U.S. Senate’s Judiciary and Commerce Committees and the U.S. House of Representatives Energy and Commerce Committee, and Facebook’s submissions in response to over 2,000 questions posed by members of Congress, submitted on June 8, 2018 and June 29, 2018;

• Documents secured through discovery in this action pursuant to Section 220 of the DGCL, including meeting minutes of Facebook’s Board and the Board’s Audit & Risk Oversight Committee;

• Facebook’s policies, statements, terms of service, and other Facebook documents provided on Facebook’s website and prior versions of data policies, terms of use, application developer policies, and related documents secured from www.waybackmachine.org;

• The Company’s public filings with the Securities and Exchange Commission (“SEC”);

• The United States Federal Trade Commission’s (“FTC”) repository of documents, available at https://www.ftc.gov/about-ftc/foia/frequently-requested-records/facebook; and


• Additional media reports, press releases, consent decrees, agreements, court filings, and other publicly available documents regarding Facebook, Cambridge Analytica, and other related parties.1

II. INTRODUCTION

1. This is a derivative suit brought on behalf of Facebook to address the most serious crisis in corporate governance currently facing a public Delaware corporation. More than 2.2 billion people trust their private data to Facebook— one of the ten largest companies on Earth—which is run by its 34-year-old founder, CEO and Chairman Mark Zuckerberg and his hand-picked Directors and

Officers. When the full extent of the governance crisis was revealed on July 25,

2018, Facebook shareholders lost more than $119 billion—the largest single-day loss of shareholder value in history.

2. With the benefit of investigative reporting, hearings before the United States House of Representatives and the United States Senate, hearings before the British Parliament, corporate records produced pursuant to

Section 220 of the DGCL, and Facebook’s now-public written responses to more than 2,000 questions posed by members of Congress, it is clear that the trust that existed between 2.2 billion users and the Company has been harmed. The

Defendants have repeatedly violated their most basic fiduciary duties to one of the

1 All exhibits and documents set forth in the Amended Complaint are incorporated herein by reference.


most important Delaware corporations in the country, and Court intervention is required to prevent further harm to the company and to remedy past harms.

3. Nominal Defendant Facebook is entitled to substantial contribution and indemnification from the Defendants for their historic breaches. Three defendants (Zuckerberg, Sandberg, and Koum) must also make restitution to the

Company for all gains realized by the improper sales of Facebook stock based on

Material Non-Public Information they obtained in their roles as Directors since

2015. And finally, the shareholder plaintiffs ask this Court for an extraordinary injunction ordering the following emergency measures:

a. First, that the Company be ordered to adopt a policy, and amend the bylaws as necessary, to require the Chair of the Board to be an independent member of the Board and the roles of Chair and CEO be split;

b. Second, that the Company be ordered to take all practicable steps in its control toward initiating and adopting a recapitalization plan for all outstanding stock to have one vote per share. This would include efforts at the earliest practicable time toward encouragement and negotiation with Class B shareholders to request that they relinquish, for the common good of all shareholders, any preexisting disproportionate rights; and


c. Third, that the Company be ordered to immediately establish an independent Risk Oversight Board Committee to be selected by a majority of the Class A shareholders.

4. The requested equitable remedies above (all based on recent shareholder proposals garnering broad past support from Class A shareholders) are extraordinary remedies, but appropriate for this extraordinary case. In the alternative, Plaintiffs ask this Court to order the Company to hold a shareholder vote at the earliest opportunity to allow Class A shareholders to vote on the three measures above and to abide by the majority vote of such shareholders (a

“majority of the minority”).

5. Facebook controls and is legally obligated to protect the private data of over 2.2 billion people worldwide. Facebook’s legal obligation to protect users’ private data arises in part from (a) consumer and privacy laws in the United States, the European Union, and other countries around the world; (b) its certification to comply with the Privacy Shield Frameworks (defined below); (c) its Facebook user agreements; (d) its policies on data security and user privacy; and (e) a consent decree that Facebook entered with the U.S. Federal Trade Commission (“FTC”) in

November 2011 that became a final order on July 26, 2012 (the “Consent Order” or “Consent Decree”). Zuckerberg, Sandberg, Andreesen, Bowles, Hastings, and


Thiel received a copy of the Consent Order by September 2012.

6. The Consent Order requires, among other obligations, that Facebook monitor third parties’ access and use of Facebook users’ data, obtain users’ express consent prior to sharing their data, and notify the FTC of any violations of the

Consent Order.2

7. On December 11, 2015, The Guardian reported that political consultant Cambridge Analytica harvested the personal data of tens of millions of

Facebook users to construct psychological profiles using an application (“app”) created by developer Global Science Research Ltd. (“GSR”).3 The psychological profiles were then used to target and manipulate users for political gain.

8. There is also evidence that Facebook engineers assisted Cambridge

Analytica and GSR with the data transfer at some point between June and August

2014 when the massiveness of the transfer caused Facebook’s platform to throttle the app, meaning that it slowed down the data transfer to a crawl, effectively

2 See Ex. A, FTC Consent Order with Facebook, filed on July 26, 2012.

3 See Ex. B, Harry Davies, “Ted Cruz using firm that harvested data on millions of unwitting Facebook users,” The Guardian, available at https://www.theguardian.com/us-news/2015/dec/11/senator-ted-cruz-president-campaign-facebook-user-data, dated 12/11/2015.


disabling the transfer.4 This required GSR to reach out to Facebook for assistance to un-throttle the data transfer. In addition, in November 2015, Facebook hired

GSR co-founder, Joseph Chancellor, and likely learned about his work with GSR and Cambridge Analytica through the interview process.

9. Despite this, Zuckerberg testified that he had only learned of GSR, the data collection, and the misuse of the data on December 11, 2015 through The

Guardian article. At that point, outside counsel was retained to commence an investigation.5

10. By early 2016 (if not much earlier), Zuckerberg and the other

Individual Defendants knew or should have known that:

a) the private information of 87 million users, or 40% of the U.S. voting population, had been collected;

b) GSR had created an app called thisismydigitallife.com (“TIMDL”) which purported to provide users a personality quiz;

4 Testimony of Christopher Wylie before the British Parliament’s House of Commons, Digital, Culture, Media and Sport Committee on March 27, 2018, at Q1336, available at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/fake-news/oral/81022.pdf [hereinafter “Wylie Tr.”].

5 Testimony of Facebook’s Chief Technology Officer Michael Schroepfer before the British Parliament’s House of Commons, Digital, Culture, Media and Sport Committee, at Q2158, available at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/fake-news/oral/82114.pdf [hereinafter, “Schroepfer Tr.”].


c) 270,000 users downloaded TIMDL and consented to having their information collected by GSR for the purpose of “provid[ing] people an opportunity to see their predicted personalities based on their Facebook information” and “understand[ing] how people’s Facebook data can predict different aspects of their lives;”6

d) from June through August 2014, Cambridge Analytica—aided by GSR and Facebook—collected the information of those 270,000 users and all of their friends;

e) the information collected included “the name, demographics, status updates and Facebook likes of your profile and of your network;”

f) the information collected was not being used in the manner presented, but rather used for commercial purposes, campaigning, and manipulation of individuals for political gain; and

g) multiple entities and individuals had access to and copies of the data and/or profited from its collection, including: Dr. Aleksander Kogan (“Kogan”), Alexander Nix (“Nix”), SCL Elections Ltd. (“SCL”), AggregateIQ (“AIQ”), Eunoia Technologies, Inc., and Christopher Wylie (“Wylie”), who would eventually become the whistle-blower who revealed all of the details above.

11. After conducting its investigation, Facebook did not:

a) report these “bad actors” to the police, the FBI, state Attorneys General, or the entities that regulate data privacy, such as the FTC, which was monitoring Facebook at the time pursuant to the Consent Order;

b) notify the 87 million Facebook users whose private information was unlawfully obtained for an undisclosed purpose without their consent;

c) sue these six individuals and entities in a civil action to obtain an immediate injunction; or

6 See Ex. C, Letter from Senator Richard Blumenthal to FTC Chairperson Maureen Ohlhausen, dated 4/19/2018, at Attachment A: GSR and TIYDL Terms And Conditions.


d) demand a forensic audit of their systems to determine exactly what had been taken and whether that data was further distributed to or accessed by others.

12. Rather, Zuckerberg ordered his team of lawyers to secure silence and the six entities’ and individuals’ promises to destroy the information they had taken. Many (but not all) parties involved eventually signed a brief certification claiming that they had destroyed the user data.7

13. Facebook entered into a non-disclosure agreement with GSR and

Kogan and in exchange released and waived all claims related to the misappropriated data.8 Zuckerberg and the rest of the Board “considered it a closed case. In retrospect, that was clearly a mistake.”9

14. Individual Defendants had every incentive to conceal the data breach of 87 million users: (a) Facebook was being closely watched by the FTC for privacy abuses involving app developers like GSR and, pursuant to the Consent

Order, Facebook was subject to steep penalties of $40,000 per violation per day;

7 See Ex. D, Certifications obtained by Facebook, unsigned by Facebook, and produced by Facebook to the British Parliament in the course of its investigation.

8 Id.

9 See Nicholas Confessore, “Audit Approved of Facebook Policies, Even After Cambridge Analytica Leak,” The New York Times, dated 4.18.2018, available at https://www.nytimes.com/2018/04/19/technology/facebook-audit-cambridge-analytica.html.


and (b) Facebook was starting to rake in political advertisement revenue and did not want to scare off these advertisers.

15. Between 2016 and 2018, Cambridge Analytica had become the focus of several investigations and had been routinely covered by the press with respect to its ties to the Trump 2016 Presidential campaign.

16. On February 8, 2018, in an effort to further conceal the Cambridge

Analytica scandal, senior Facebook executive Simon Milner misled the British

Parliament when he affirmatively rejected the notion that Cambridge Analytica had accessed Facebook’s data.10 Nix also testified falsely that he had neither used nor possessed Facebook data.11

17. On March 17, 2018, The Guardian and The New York Times revealed the sordid details of the illegal data harvesting that had taken place through false pretenses, relying on specific facts and evidence supplied by whistleblower

Christopher Wylie, who confirmed that the data had not been destroyed. In the

10 Testimony of Simon Milner before the British Parliament’s House of Commons, Digital, Culture, Media and Sport Committee, on February 8, 2018, at Q447-449, http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/fake-news/oral/78195.pdf [hereinafter “Milner Tr.”].

11 See infra Section F.3.


week leading up to the release of these articles, Facebook threatened to sue the whistleblower, the media, and the reporter for making these revelations.12

18. In the aftermath of the Cambridge Analytica scandal, Zuckerberg was commanded to appear before the United States Congress and the British

Parliament. While Zuckerberg refused several times to appear before Parliament,

he appeared before Congress on April 10 and 11, 2018.

19. In Zuckerberg’s opening remarks to Congress, he testified that in

2007 Facebook’s platform was available to third party application (“app”) developers who were given “access to a large amount of information before we locked down our platform in 2014.”13 Zuckerberg repeatedly testified that

12 Ex. E, Carole Cadwalladr and Emma Graham-Harrison, “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach,” The Guardian, 3/17/2018; Matthew Rosenberg, Nicholas Confessore and Carole Cadwalladr, “How Trump Consultants Exploited the Facebook Data of Millions,” The New York Times, 3/17/2018; Wylie’s Written Statement to Senate Judiciary Committee, at 11-14, available at https://www.judiciary.senate.gov/imo/media/doc/05-16-18%20Wylie%20Testimony.pdf.

13 Ex. F, Zuckerberg Written Statement to Senate Committee, 4/10/2018. See also Ex. G, Testimony of Mark Zuckerberg before the Senate Judiciary and Commerce Committees on 4/10/2018 [hereinafter “Zuckerberg 4/10/2018 Tr.”], available at https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/?utm_term=.912215a7ecc0; Ex. H, Testimony of Mark Zuckerberg before the House Committee on 4/11/2018, at Opening Statement [hereinafter “Zuckerberg 4/11/2018 Tr.”], available at https://www.washingtonpost.com/news/the-switch/wp/2018/04/11/transcript-of-zuckerbergs-appearance-before-house-committee/?utm_term=.cb018e00f88b.


restrictions on the disclosure of users’ personal information to outsiders were implemented by 2014, stating that Facebook had “dramatically limit[ed] the

Facebook information [that] apps could access.”14 Zuckerberg then claimed that the amendments made in 2014 would have “prevent[ed] this specific instance with

Cambridge Analytica from happening again today.” He claimed that Facebook would conduct a full investigation into every app that had access to a large amount of information “before we locked down platform to prevent developers from accessing this information around 2014.”15

20. In supplemental responses to Congressional questions, filed by

Zuckerberg on June 8, 2018, Facebook admitted thisismydigitallife was not the only third party application misappropriating user data. Zuckerberg testified as follows:

We are in the process of investigating every app that had access to a large amount of information before we changed our Platform in 2014. The investigation process is in full swing, and it has two phases. First, a comprehensive review to identify every app that had access to this amount of Facebook data and to focus on apps that present reason for deeper investigation. And second, where we have concerns, we will conduct interviews, make requests for information (RFI)—which ask a series of detailed questions about the app and the data it has access to—and perform audits using expert firms that may include on-site inspections. We have large teams of internal and external experts working hard to investigate these apps as quickly as possible. To date

14 See Exs. G and H.

15 See Exs. G and H.


thousands of apps have been investigated and around 200 apps have been suspended—pending a thorough investigation into whether they did in fact misuse any data. Where we find evidence that these or other apps did misuse data, we will ban them and let people know.

* * *

Additionally, we have suspended an additional 14 apps, which were installed by around one thousand people. They were all created after 2014, after we made changes to more tightly restrict our platform APIs to prevent abuse. However, these apps appear to be linked to AIQ, which was affiliated with Cambridge Analytica. So, we have suspended them while we investigate further. Any app that refuses to take part in or fails our audit will be banned. We will commit to briefing your staff on future developments.16

21. Shareholders and the public soon learned, however, that the misappropriation of user information was not an isolated incident and that it had not stopped by 2014.

22. To understand the full magnitude and scope of the problem, one must start from when Sandberg joined Facebook in 2008. At that time, Zuckerberg and

Sandberg put in place a highly-aggressive growth plan. They decided to give app

16 Letter from Facebook to Chairman John Thune and Ranking Member Bill Nelson, dated 6/8/2018, at 97-98, available at https://www.judiciary.senate.gov/imo/media/doc/Zuckerberg%20Responses%20to%20Judiciary%20Committee%20QFRs.pdf [hereinafter, “Facebook 6/8/2018 Senate Response”]. See also Olivia Solon and Julia Wong, “Facebook Suspends Another Analytics Firm Amid Questions Over ,” The Guardian, dated 7/20/2018, available at https://www.theguardian.com/technology/2018/jul/20/facebook-crimson-hexagon-analytics-data-surveillance (Facebook suspended Crimson Hexagon pending further investigation) [hereinafter “Guardian 7/20/2018 Article”].


developers open access to Facebook’s platform and Application Programming

Interface (“API”), which is the intermediary that allows for the transfer of information among different applications.17 This represented a dramatic shift in business practice as Facebook had previously been the sole developer of its applications.

23. By 2010, Facebook had attracted a million app developers, all of whom had the ability to collect private information, including personally identifiable information (“PII”), posted by Facebook users on their password- protected Facebook account. As Facebook’s user-base grew from millions to hundreds of millions to billions, its business practice of offering an open platform remained unchanged despite the exponential growth in user information.

17 Testimony of Sandy Parakilas before the British Parliament’s House of Commons, Digital, Culture, Media and Sport Committee, on 3/21/2018, at Q1208, available at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/fake-news/oral/80809.pdf [hereinafter, “Parakilas Tr.”] (“Most of the goals of the company were around growth in the number of people who use the service. At the time on the platform team it was about growth in apps, growth in developers. It was very focused on either building things very quickly, getting lots of people to use the service, or getting developers to build lots of applications that would then get lots of people to use the service.”).


24. None of the apps had been vetted by Facebook, and according to Chief

Technology Officer (“CTO”) Michael Schroepfer, Facebook did not generally review the apps’ terms of use to ensure they complied with Facebook’s policies.18

25. By 2011, Facebook’s operations manager Sandy Parakilas (and other

Facebook employees) objected to the lax monitoring and enforcement of apps’ handling of user data and provided a detailed monitoring presentation to “senior executives in charge of and people in charge of privacy,”19 only to be ignored.

26. Zuckerberg testified to Congress that measures taken by Facebook in

2014 would have prevented the Cambridge Analytica breach, only for the public to learn on June 29, 2018 that those measures had not been fully implemented until

January 1, 2016, and that in addition to apps, Facebook gave global device makers open access to users’ information without disclosure or consent and disregarded user privacy settings.

27. This complete and utter failure of leadership and governance left

Facebook subject to public scrutiny, over $120 billion in market losses, millions of dollars in foreseeable fines and costs, and inquiries by government regulators worldwide, including a multiagency investigation led by the Department

18 Schroepfer Tr. at Q2141-2144.

19 Parakilas Tr. at Q1194.


of Justice (“DOJ”), Federal Bureau of Investigation (“FBI”), FTC, and SEC. The multiagency investigation is specifically focused on “what Facebook knew three years ago and why the company didn’t reveal it at the time to its users or investors, as well as any discrepancies in more recent accounts, among other issues.”20

28. As explained herein and as will be shown at trial, the misconduct warranting the unprecedented level of Congressional and regulatory investigation is not the result of a rogue third party outside the Board’s purview. Individual Defendants knew the details of the Cambridge Analytica breach and concealed it from shareholders and Facebook users.

29. Individual Defendants also misrepresented their data collection, sharing, and security practices to shareholders and Facebook users in violation of federal securities laws and the fiduciary duty of candor.

30. Finally, despite the Consent Order and being under FTC supervision,

Individual Defendants knowingly disregarded their obligations to monitor third parties with access to private user information. Individual Defendants knew that

Facebook had lax enforcement policies and acted only when forced to do so by

20 See Timberg, Dwoskin, Zapotosky and Barrett, “Facebook’s Disclosures Under Scrutiny As Federal Agencies Join Probe of Tech Giant’s Role in Sharing Data With Cambridge Analytica,” The Washington Post, dated 7/2/2018, available at https://www.washingtonpost.com/technology/2018/07/02/federal-investigators-broaden-focus-facebooks-role-sharing-data-with-cambridge-analytica-examining-statements-tech-giant/?utm_term=.64c8149dd19a.


outsiders. This intentional inaction and conscious disregard of the monitoring and enforcement of third parties with access to users’ information is a violation of their duties of care and loyalty. As Parakilas testified:

[T]he concern I had was that they had built this platform that would allow people to get all of this data on people who had not really explicitly authorised – and it was personally identifiable data – it had your name in some cases and your email address in cases, and in some cases it could include your private messages. It was really personal data, and they basically allowed that to leave Facebook’s servers intentionally, and then there were not any controls once the data had left to ensure that it was being used in an appropriate way. That was my concern.21

31. During his 16 months at Facebook, Parakilas stated that he did not recall a single physical audit and that lawsuits and bans were “quite rare.”22 When

Parakilas presented an enforcement plan to the CTO and other senior executives recommending a deeper audit of third party developers’ use of Facebook data, one executive asked, “Do you really want to see what you’ll find?”23 The senior leadership declined to implement Parakilas’ plan, or take any action beyond the minimal enforcement efforts being made.

21 Parakilas Tr. at Q1206.

22 Parakilas Tr. at Q1188.

23 See Parakilas Tr. at Q1194-1195 and Ex. I, Paul Lewis, “Utterly Horrifying: Ex-Facebook Insider Says Covert Data Harvesting Was Routine,” The Guardian, dated 3/20/2018, https://www.theguardian.com/news/2018/mar/20/facebook-data-cambridge-analytica-sandy-parakilas [hereinafter, “Guardian Article, 3/20/2018”].


32. Pursuant to the FTC Consent Order, Facebook created a privacy program and retained PricewaterhouseCoopers LLP (“PwC”) to audit the program biennially. The audit reports show that PwC omitted an integral component of the mandated audit24 and by doing so was able to falsely conclude that for the periods ending February 2013, 2015, and 2017, “Facebook’s privacy controls were operating with sufficient effectiveness to provide reasonable assurance to protect the privacy of covered information and that the controls have so operated throughout the Reporting Period.”

PwC, instead, rubber-stamped Facebook’s privacy program based on unverified management assertions, even in the face of an obvious need to investigate further.

33. Individual Defendants, as Facebook’s officers and/or directors, breached their fiduciary duties by concealing the Cambridge Analytica breach and misrepresenting Facebook’s collection, sharing, and monitoring of users’ private information. They consciously disregarded the known and systemic weaknesses in their infrastructure and systems with a business practice that was designed to fail, coupled with nearly nonexistent enforcement efforts. Facebook’s data security systems were so weak that Parakilas described the “main enforcement mechanism” employed with respect to vendors and developers who accessed user data as to “call them and yell at them,” and that was only when someone complained.25

24 Ex. J, PwC’s Report on Facebook’s Privacy Program for the period February 12, 2015 to February 11, 2017 (redacted version) [hereinafter “PwC 2017 Audit Report”].

34. More specifically, the Individual Defendants failed to implement a reasonable system to monitor third parties’ use of user information and to ensure that the third parties were complying with Facebook users’ expectations and privacy settings. This was the entire basis of the FTC administrative action and Consent Order, and yet the same misconduct and lax enforcement persisted before, during, and after the Consent Order was put in place.

35. The Board has long overlooked and consciously disregarded its duties and responsibilities with respect to privacy, despite a more than ten-year history of thousands of abuses, complaints, investigations, public outcry, civil suits, fines, the FTC Consent Order, and dozens of apologies from Zuckerberg and Sandberg.26

36. Individual Defendants knew, through multiple privacy and data incidents, well-documented internal and external reports, and the results of an FTC investigation, that: (a) third parties illegally accessed more data than what was consented to or allowed contractually; (b) Facebook had no enforcement mechanism in place to monitor the use of that data; and (c) Facebook users did not give informed consent to such use because they did not appreciate the number of developers accessing their data, the scope of the data they were accessing (i.e., their friends’ data), or how their data would be used. Facebook users were also misled into believing that privacy and data security were priorities at Facebook, which was patently false.

25 See Parakilas Tr. at Q1188 and Ex. I, Guardian Article, 3/20/2018. 26 See, infra, Sections D.2 and D.4.

37. Facebook did not have the infrastructure in place to monitor users’ private information once it was in the hands of over a million apps, and consciously disregarded the known risk that apps were misusing Facebook users’ information. Individual Defendants knew, through multiple privacy and data incidents, well-documented internal and external reports, and an FTC investigation, that: (a) third parties illegally accessed more data than what was consented to or allowed contractually; (b) Facebook had no enforcement mechanism in place to monitor the use of that data; and (c) Facebook users did not, and could not, give informed consent to such use because they did not, and could not, appreciate the number of developers accessing their data, the scope of the data they were accessing (i.e., their friends’ data), or how their data would be used. Facebook users were also misled into believing that privacy and data security were priorities at Facebook, which was patently false. According to one Facebook insider, Facebook executives proactively discouraged “audit[ing] developers directly and see[ing] what’s going on with the data” and believed that “Facebook was in a stronger legal position if it didn’t know about the abuse that was happening.”

38. Meanwhile, Individual Defendants, in SEC filings and on Facebook’s website, misrepresented to users, shareholders, and lawmakers that Facebook had a comprehensive privacy program in place, that it notified users when their information had been compromised, and that it required app developers to adhere to strict confidentiality provisions.

39. Now that the details of the Cambridge Analytica data breach are public and Facebook’s indifference is known, the Board is contemplating enforcement protocols—which it admits might take as many as 3 years to fully implement—as well as strategies for retrieving illegally harvested data from Cambridge Analytica and investigating and auditing thousands of apps suspected of potential wrongdoing.27

40. Only now, with the world watching, have Individual Defendants informed the impacted Facebook users of the Cambridge Analytica breach. Indeed, Zuckerberg testified before Congress that “we didn’t focus enough on preventing abuse and thinking through how people could use these tools to do harm as well . . . . We didn’t take a broad enough view of what our responsibility is, and that was a huge mistake.” Zuckerberg also acknowledged after testifying before Congress that he still “didn’t understand all the details [on things like] how we were using external data on our ad system,” and that only after his testimony did Zuckerberg feel compelled to “sit down with this team and learn exactly all this stuff that [he] didn’t know.”28

27 Facebook 6/8/2018 Senate Response, at 97-98.

41. By July 2018, Facebook had already identified 200 potential apps that need to be investigated and audited further, including Crimson Hexagon, which, like Cambridge Analytica, collected and misused Facebook users’ data for political gain.29

42. Individual Defendants are under a substantial threat of personal liability on the breach of fiduciary duty claims raised herein given (a) the gravity, frequency, and scope of the repeated data breaches (Cambridge Analytica and many others); (b) the number of users impacted (likely all individuals who have had a Facebook account at any time since 2007); (c) the lack of sufficient infrastructure to monitor and enforce Facebook’s policies and the terms of its agreements with third-party developers; (d) the failure to sufficiently monitor and enforce Facebook’s compliance with its legal obligations for a period of ten years; (e) numerous instances where Facebook failed to comply with laws and its obligations to protect data and control how it was used; and (f) the failure to provide users meaningful disclosures so that they could provide informed consent. In addition, while in possession of this material non-public information, Individual Defendants Zuckerberg, Sandberg, and Koum sold billions of dollars of Facebook stock while knowingly concealing the Cambridge Analytica privacy breach.

28 Steven Levy, “Mark Zuckerberg Says It Will Take 3 Years To Fix Facebook,” Wired, dated 5/1/18, available at https://www.wired.com/story/mark-zuckerberg-says-it-will-take-3-years-to-fix-facebook/. 29 See Guardian 7/20/2018 Article (Facebook announced suspension of analytics firm Crimson Hexagon for misusing Facebook and user information for surveillance services).

43. Moreover, demand is excused because Facebook is controlled and dominated by Zuckerberg, who, as of the end of 2017, controlled 60% of the shareholder vote, though he owned only 16% of outstanding shares. Zuckerberg personally selects and may remove any director on Facebook’s Board. He sets the business strategy and makes all key business decisions, including his philosophy over the last ten years to “Move fast and break things.” Additionally, Zuckerberg has reportedly provided Board members access to lucrative investment opportunities.

44. Members of the Board had detailed knowledge through the December 11, 2015 Guardian article and the ensuing investigation into Cambridge Analytica, but concealed it from the public until forced to acknowledge the breach. Indeed, when asked repeatedly between 2015 and 2018 whether Facebook knew of any wrongdoing by Cambridge Analytica, the answer was always no.

45. This has severely damaged the Company’s reputation and imposed significant costs, including those arising from the massive regulatory interest, inquiries, and investigations commenced in the wake of the Cambridge Analytica scandal. In addition, the Company has suffered a loss of user trust, harm to its core advertising business, and other damages associated with its exposure to litigation, regulation, fines, and other penalties. If Facebook is found to have violated the FTC Consent Decree, the Company could face billions more in fines and penalties.

III. JURISDICTION AND VENUE

46. This Court has jurisdiction over this action pursuant to 10 Del. C. § 341 and 8 Del. C. § 111.

47. As directors of a Delaware corporation, Individual Defendants have consented to the jurisdiction of this Court pursuant to 10 Del. C. § 3114.

48. This Court has jurisdiction over Facebook pursuant to 10 Del. C. § 3111.



49. The proper venue for this action is in the Court of Chancery pursuant to Article IX of Facebook’s Restated Certificate of Incorporation, which states in relevant part as follows:

Unless the corporation consents in writing to the selection of an alternative forum, the Court of Chancery of the State of Delaware shall, to the fullest extent permitted by law, be the sole and exclusive forum for (1) any derivative action or proceeding brought on behalf of the corporation, (2) any action asserting a claim of breach of a fiduciary duty owed by, or other wrongdoing by, any director, officer, employee or agent of the corporation to the corporation or the corporation’s stockholders, ….30

IV. PARTIES

50. Plaintiff Karen Sbriglio (“Sbriglio”) owns and has owned shares of Facebook, Inc. common stock during the entire period of wrongdoing alleged herein.

51. Plaintiff Firemen’s Retirement System of St. Louis (“FRS”) owns and has owned shares of Facebook common stock during the entire period of wrongdoing alleged herein (FRS, together with Sbriglio, are referred to herein as the “Plaintiffs”).

52. Nominal Defendant Facebook, Inc. is a corporation organized and existing under the laws of the State of Delaware, with its principal place of business at 1601 Willow Road, Menlo Park, California 94025, and its registered agent, pursuant to 8 Del. C. § 131 et seq., at Corporation Service Company, located at 251 Little Falls Drive, Wilmington, DE 19808. Facebook’s securities trade on the NASDAQ under the ticker symbol “FB.”

30 Facebook’s Restated Certificate of Incorporation, Article IX, at 12.

53. Defendant Zuckerberg is the founder of the Company and has served as the Company’s CEO and as a member of the Board since July 2004, and as Chairman of the Board since January 2012. Zuckerberg is responsible for Facebook’s day-to-day operations, as well as the overall direction and product strategy of the Company. He is also the Company’s controlling stockholder, with ownership of stock and proxies for stock representing more than 53.3% of Facebook’s voting power as of April 13, 2018, though he owns only 16% of Facebook’s total equity.

54. Defendant Sheryl Sandberg (“Sandberg”) has been the Company’s Chief Operating Officer (“COO”) since March 2008 and a member of the Board since June 2012.

55. Defendant Marc Andreessen (“Andreessen”) has been a member of the Board since June 2008 and during that time served on the Audit & Risk Oversight Committee and, until May 2018, served on Facebook’s Compensation & Governance Committee. Andreessen is a general partner of firm Andreessen Horowitz, which he co-founded in July 2009. Andreessen also co-founded, and was chairman of the board of, software company , Inc. (formerly known as Loudcloud Inc.). He served as CTO of America Online, Inc., and he co-founded Communications Corporation and served as its CTO and Executive Vice President of Products.

56. Andreessen was also an early investor and financial backer of Palantir Technologies Inc. (“Palantir”), a private data-technology company co-founded by Defendant Thiel. Palantir’s central technology is software capable of compiling vast amounts of data onto a unified platform and analyzing the data for various purposes, including identifying idiosyncrasies, patterns, behaviors, or conduct, and building models. As set forth below, Palantir was linked to Cambridge Analytica by whistleblower Wylie. Wylie produced documentation showing that senior Palantir employees aided in the construction of Cambridge Analytica’s psychological profile models using illegally obtained Facebook data. Palantir also is no stranger to abuse of privacy. JP Morgan terminated its relationship with Palantir in 2018 when the bank discovered that Palantir, JP Morgan’s consultant, was spying on the bank’s top executives without authorization. Through these various positions and investments and their connection to Palantir, Andreessen and Thiel are uniquely and acutely familiar with the shortcomings of Facebook’s controls and oversight over its data once that data is in the hands of outsiders.



57. Defendant Erskine B. Bowles (“Bowles”) has been a member of the Board since September 2011 and chairs the Audit & Risk Oversight Committee. Bowles is a politician who, among other appointments, served as President Bill Clinton’s Chief of Staff from 1996 to 1998. In addition to Facebook’s Board, Bowles currently serves on the board of Norfolk Southern Corporation, a position he has held since February 2011. Bowles also served on the boards of General Motors Company from June 2005 to April 2009; Cousins Properties Incorporated from August 2003 to May 2012; Belk, Inc. from May 2011 to November 2015; and Morgan Stanley from December 2005 to February 2018. Bowles has also been associated with a series of venture capital firms. He has been a Senior Advisor and non-executive vice chairman of BDT Capital Partners, LLC, a private investment firm, since January 2012.

58. Defendant Susan Desmond-Hellmann (“Desmond-Hellmann”) has been a member of the Board since March 2013 and is the Lead Independent Director of the Board. She is a member of Facebook’s Compensation & Governance Committee, and was a member of the Audit & Risk Oversight Committee until May 2018.

59. Defendant Reed Hastings (“Hastings”) has been a member of the Board since June 2011 and is the Chair of Facebook’s Compensation & Governance Committee. Hastings has served as CEO and Chairman of , Inc., a provider of an Internet subscription service for movies and television shows, since 1999. Hastings previously served on the board of Corporation.

60. Defendant Jan Koum (“Koum”) was a member of the Board from October 2014 through April 2018. Koum was the co-founder and CEO of WhatsApp Inc. (“WhatsApp”), a cross-platform mobile messaging application company and Facebook’s wholly-owned subsidiary. Facebook acquired WhatsApp in 2014 for billions of dollars. According to Facebook’s website, defendant Koum was “responsible for the design and interface of WhatsApp’s service and the development of its core technology and infrastructure.”31 On or around April 30, 2018, Koum unexpectedly resigned from the Board over disagreements about privacy and encryption. According to a Washington Post article, dated April 30, 2018, individuals familiar with the matter reported tensions with Facebook over WhatsApp’s end-to-end encryption, which ensures that messages will not be intercepted and read by anyone outside of the conversation, including by WhatsApp or Facebook.32 With over 1.5 billion WhatsApp users, Koum apparently was concerned with this shift away from WhatsApp’s core values and positions on privacy, and resigned from Facebook’s Board.

31 See Facebook website, Investor Relations, at http://facebook2016ir.q4preview.com/corporate-governance/?section=governance documents. 32 Elizabeth Dwoskin, WhatsApp founder plans to leave after broad clashes with parent Facebook, The Washington Post, dated April 30, 2018, https://www.washingtonpost.com/business/economy/whatsapp-founder-plans-to-leave-after-broad-clashes-with-parent-facebook/2018/04/30/49448dd2-4ca9-11e8-84a0-458a1aa9ac0a_story.html?utm_term=.881033a949ad.

61. Defendant Peter A. Thiel (“Thiel”) has been a member of the Board since April 2005 and serves on Facebook’s Compensation & Governance Committee. Thiel co-founded and is a financial backer of Palantir. He has served as President of his investment firm since 2011, has been a partner of venture capital firm Founders Fund since 2005, and has served as President of Clarium Capital Management, a global macro investment manager, since 2002. Thiel was one of Facebook’s, and Zuckerberg’s, early venture capital backers.

62. As directors and/or officers of the Company, Individual Defendants Zuckerberg, Sandberg, Andreessen, Bowles, Desmond-Hellmann, Hastings, Koum, and Thiel (collectively, “Individual Defendants”) are in a fiduciary relationship with the Company, Plaintiffs, and the public stockholders of Facebook, and owe the highest obligations of due care, loyalty, and good faith and fair dealing.

63. Defendant PricewaterhouseCoopers, LLP (“PwC”) is a Delaware limited liability partnership with its principal place of business at 300 Madison Avenue in New York, NY, and its registered agent, pursuant to 6 Del. C. § 15-1001 et seq. and § 15-111, at The Corporation Trust Company, Corporation Trust Center, 1209 Orange Street, Wilmington, DE 19801. PwC has audited Facebook’s privacy program from August 16, 2012 through the present, making repeated false representations to the FTC and the public about the effectiveness of Facebook’s privacy and data controls.

V. RELEVANT NON-PARTIES

64. Non-party Kenneth I. Chenault (“Chenault”) joined Facebook’s Board in February 2018 and is not a party to this action.

65. Non-party Jeffrey Zients (“Zients”) joined Facebook’s Board in May 2018 and is not a party to this action.

66. Non-party Cambridge Analytica Ltd. (“Cambridge Analytica”) was a British political consulting firm which combined , data brokerage, and data analysis with strategic communication to influence voter behavior. It was incorporated in Canary Wharf, London as “SCL USA Limited” in January 2015, as a subsidiary of its American parent company SCL Group (“SCL”). In April 2016, it changed its name to Cambridge Analytica (UK) Limited. Prior to seeking liquidation pursuant to the laws of the United States and the , Cambridge Analytica operated in London, , and Washington, D.C. Cambridge Analytica and SCL filed for bankruptcy on May 1, 2018.



VI. THE SPECIFIC DUTIES OF FACEBOOK’S BOARD

67. By reason of their positions as officers and directors of Facebook and because of their ability to control the business, corporate, and financial affairs of the Company, Individual Defendants owed Facebook and its shareholders the duty to act with loyalty, good faith, and diligence in the management and administration of the affairs of the Company and in the use and preservation of its property and assets, and owed the duty of full and candid disclosure of all material facts related thereto. In addition, Facebook’s foundational corporate documents (such as the Board committee charters and the Code of Conduct) also expressly detail the requirements of the Board’s duties, requiring, inter alia, that the Board actively identify and root out unlawful and/or unethical business practices at the Company, report and prevent such misconduct, and disclose any deviation from the strict performance of these obligations.

68. Facebook and its shareholders depend and rely on the Board to carry out its fiduciary duties.

69. The members of the Audit & Risk Oversight Committee “shall oversee certain of the Company’s major risk exposures set forth below, provided that the Board may, in its discretion, exercise direct oversight with respect to any such matters.



Financial and Enterprise Risk. The Committee will review with management, at least annually, the Company’s major financial risk and enterprise exposures and the steps management has taken to monitor or mitigate such exposures, including the Company’s procedures and any related policies with respect to risk assessment and risk management.

Legal and Regulatory Compliance. The Committee will review with management, at least annually, (a) the Company’s program for promoting and monitoring compliance with applicable legal and regulatory requirements, and (b) the Company’s major legal and regulatory compliance risk exposures and the steps management has taken to monitor or mitigate such exposures, including the Company’s procedures and any related policies with respect to risk assessment and risk management.

Privacy and Data Use. The Committee will review with management, at least annually, (a) the Company’s privacy program, (b) the Company’s compliance with its Consent Order with the U.S. Federal Trade Commission, as well as the General Data Protection Regulation and other applicable privacy and data use laws, and (c) the Company’s major privacy and data use risk exposures and the steps the Company has taken to monitor or mitigate such exposures, including the Company’s procedures and any related policies with respect to risk assessment and risk management.”33

33 Facebook Investor Relations, III. Responsibilities and Duties, “Audit Risk Oversight Committee Charter,” http://investor.fb.com/corporate-governance/audit-committe-charter/default.aspx.

70. The Audit & Risk Oversight Committee “provide[s] reports to the full board of directors regarding” the above matters.34

71. The Company also maintains Corporate Governance Guidelines, articulated by the Compensation & Governance Committee of the Board, which provide that the “Board acts as the management team’s adviser and monitors management’s performance.”35 In executing their duties:

Directors are encouraged to speak directly to any member of management regarding any questions or concerns the directors may have. In addition, the Board encourages members of management to be invited to attend Board meetings where they may share relevant information or insight related to business discussed at the meeting.36

* * *

The Board and each of its committees have the authority, at the company’s expense, to retain and terminate independent advisers as the Board and any such committee deems necessary.37

34 Facebook’s Schedule 14A Information, Proxy Statement, Executive Officers, Directors, and Corporate Governance, “Board Role in Risk Oversight”, dated 4/13/2018, https://www.sec.gov/Archives/edgar/data/1326801/000132680118000022/faceboo k2018definitiveprox.htm. 35 Facebook, Inc. Corporate Governance Guidelines (Amended as of May 31, 2018), available at https://investor.fb.com/corporate-governance/default.aspx. 36 Id., Corporate Governance Guidelines, at Section XV. 37 Id., Corporate Governance Guidelines, at Section XXII. 34


72. In addition to the Corporate Governance Guidelines, Board members and officers of Facebook must adhere to the Code of Conduct, which provides that “[c]onduct that violates the law or company policies is grounds for prompt disciplinary or remedial action.”38 In addition, the “failure to report a known violation of law or company policy by someone else may result in disciplinary action for employees and/or termination of employment/your relationship with Facebook.”39

73. Pursuant to Facebook’s own corporate policies, Individual Defendants were obligated to take appropriate, affirmative measures to halt practices that they knew were in violation of the Consent Order that Facebook entered into with the FTC in 2011, filed July 27, 2012, and to ensure that all Facebook policies and practices complied with applicable federal and state laws, rules, and regulations.

74. To discharge these duties, Individual Defendants were required to exercise reasonable and prudent supervision over the management, policies, practices, controls, and financial and corporate affairs of Facebook. By virtue of this obligation, Individual Defendants were required, among other things, to:

38 Facebook Policy, Code of Conduct, at Section II, available at https://investor.fb.com/corporate-governance/default.aspx. 39 Id.


(a) manage, conduct, supervise, and direct the employees, businesses, and affairs of Facebook in accordance with laws, rules, and regulations, as well as the charter and bylaws of Facebook;

(b) ensure that Facebook did not engage in imprudent or unlawful practices and that the Company complied with all applicable laws and regulations;

(c) neither violate nor knowingly or recklessly permit any officer, director, or employee of Facebook to violate applicable laws, rules and regulations, and to exercise reasonable control and supervision over such officers and employees;

(d) ensure the prudence, honesty, and soundness of policies and practices undertaken or proposed to be undertaken by Facebook;

(e) exercise appropriate control and supervision over public statements to the securities markets by the officers and employees of Facebook (including supervising the preparation, filing and/or dissemination of any SEC filing, press releases, audits, reports, or other information disseminated by Facebook) and examining and evaluating any reports of examinations or investigations concerning the practices, products, or conduct of officers of Facebook, and making full and accurate disclosure of all material facts, concerning, inter alia, each of the subjects and duties set forth above;

(f) remain informed as to how Facebook was, in fact, operating, and upon receiving notice or information of unsafe, imprudent, or unsound privacy and data sharing practices, to make reasonable investigation in connection therewith and to take reasonable corrective and preventative actions, including maintaining and implementing adequate operational controls over data; and

(g) preserve and enhance Facebook’s reputation as befits a public corporation and to maintain public trust and confidence in Facebook as a prudently managed institution fully capable of meeting its duties and obligations.

75. Individual Defendants Zuckerberg and Sandberg, in their executive positions at Facebook, bore direct responsibility for the supervision and oversight of Facebook’s day-to-day operations, including Facebook’s agreements to share information with outsiders that were the source of the massive privacy breach to which Facebook has now admitted.

76. Because of their positions of control and authority as directors and officers of Facebook, Individual Defendants were able to and did, directly and indirectly, exercise control over the content of the various public statements issued by Facebook that were false and misleading (as described below) and failed to prevent the misconduct and enforcement shortcomings that resulted in the largest privacy breach in history.

77. The Board was obligated to monitor and oversee the legal sharing of Facebook users’ nonpublic information with third parties by ensuring that it had a proper system in place that was reasonably calculated to: (1) give users proper disclosures and an opportunity to provide informed consent to the collection and use of their data and that of their user-friends before sharing the data with third parties; and (2) ensure that application developers were complying with Facebook’s policies and not unlawfully transferring data to unknown third parties for unlawful uses. The Board was also obligated to modify those high-risk business practices at Facebook that, by design, were certain to lead to violations of the Company’s legal duties.



VII. FACTUAL BACKGROUND

A. Facebook’s Aggressive Business Model and Open Platform 78. Since 2004, Facebook has operated http://www.facebook.com, a social networking website that enables a consumer who uses the site (the “user”) to create an online profile and communicate with other users.

79. Among other things, a user’s information consists of the user’s name, “profile picture,” email address, home address, birth date, telephone number, an Internet Protocol (“IP”) address, interest groups they join, a “Friend List” of other users who are the user’s “friends” on the site, photo albums and videos they upload, messages from one user to another, and comments posted on “timelines.”

80. In 2007, Facebook launched its platform “with the vision that more apps should be social.”40 Zuckerberg’s expressed belief was that “[y]our calendar should be able to show your friends’ birthdays, your maps should show where your friends live, and your address book should show their pictures.”41 Users could download and use third-party apps through the website to, for example, play games, take quizzes, track their physical fitness routines for comparison to their friends’ routines, or receive discount offers or calendar reminders. Facebook “enabled people to log into apps and share who their friends were and some information about them.”42

40 Ex. F, Zuckerberg Written Statement to Senate Committee, dated 4/10/2018. 41 Id.

81. However, users were not actually enabling or controlling the information shared with the apps for several reasons: (1) Zuckerberg and other executives at the Company set users’ privacy settings to the maximum permissions allowable; (2) users were unaware of these default settings and were misled into believing that the information collected about them by third-party app developers was limited to public information; and (3) privacy settings were illusory.

82. In reality, apps were accessing private information in users’ posts and messages, users’ PII, and private information about users’ friends. Even in cases where users changed their privacy settings to limit apps’ access to information, apps could still access users’ private information if one of the users’ friends had downloaded an app and not changed their default privacy settings. Thus, the entire Facebook ecosystem was a sieve, providing third-party apps free and open access to Facebook users’ data with only illusory user controls over data and privacy.

83. By 2010, Facebook had attracted an estimated one million app developers.43 The apps generated on Facebook led to more activity and features that, in turn, increased the number of Facebook users from millions to billions. By August 2011, Facebook had 750 million users, and by March 2018 that number had grown to 2.2 billion users.

42 Id.

84. Throughout the relevant time frame, the nature of the data and information collected by Facebook also expanded significantly, making Facebook the largest repository of individuals’ personal data in the world, collecting over 29,000 data points on every user, including their likes and dislikes and where they live, shop, eat, and pray, to name a few examples.

85. As one of the largest repositories of personal data in the world and a daily destination for millions of users, Facebook derives its primary revenue from advertising dollars.

86. Facebook users obtain Facebook’s services for free in exchange for providing Facebook personal information, which has economic value because that information is used to sell content and services. For example, Facebook’s advertising can target specific demographics, social connections, interests, and habits, and is sophisticated, varied, and highly data-intensive.

43 Zuckerberg interview on privacy at the D8 Conference. See https://www.wsj.com/video/d8-video-facebook-ceo-mark-zuckerberg-on-privacy/68578040-D4B5-4002-A679-130E9D833813.html.

87. Facebook generates “substantially all of its revenue from selling advertising placements to third parties.”44

88. As reflected in SEC filings and in its response submitted on June 8, 2018, to written questions from members of the United States Senate, the Company’s “total revenue and the percentage of which comes from third-party ads” is as follows:

2017: $40,653,000,000 (98% from third party ads)
2016: $26,638,000,000 (97% from third party ads)
2015: $17,928,000,000 (95% from third party ads)
2014: $12,466,000,000 (92% from third party ads)
2013: $7,872,000,000 (89% from third party ads)
2012: $5,089,000,000 (84% from third party ads)
2011: $3,711,000,000 (85% from third party ads)
2010: $1,974,000,000 (95% from third party ads)
2009: $777,000,000
2008: $272,000,00045

89. The infrastructure and business of Facebook have hinged, and continue to hinge, on Facebook’s ability to collect data and convert it into advertising revenue. If users do not trust that their data is secure, protected, and used in a proper manner, then users will stop providing Facebook with information. Without its ability to collect and aggregate massive amounts of personal data and information on billions of users worldwide, Facebook’s existence could be threatened.

44 Facebook 6/8/2018 Senate Response, at 100-01.
45 Facebook 6/8/2018 Senate Response, at 97-98.

B. Facebook’s Illusory User Agreements, Data Policies and Privacy Settings

90. The importance of user trust to the existence and operations of Facebook was a concept highlighted throughout Facebook’s website, company policies, and user agreement.

91. Facebook’s user agreement and associated privacy policies are set forth in the “Terms of Service” available on the Company’s website. This document explains the Company’s business model and represents the user’s relationship with Facebook. The Terms of Service was meant to inform users about Facebook’s intentions with their data and act as the mechanism that gave the Company permission to proceed with its data gathering and data sharing practices.

92. The Terms of Service provided that “privacy is very important” to Facebook, and that users “own all of the content and information … post[ed] on Facebook, and [] can control how it is shared through [their] privacy and application settings.”

93. According to Zuckerberg, “the first line of our Terms of Service says that you control and own the information and content that you put on Facebook. . . you own [your data] in the sense that you chose to put it there, you could take it down anytime, and you completely control the terms under which it’s used.”46

94. Moreover, Facebook’s Terms of Service available on its website and in effect from December 1, 2008, prohibited the “harvest[ing] or collect[ing] [of] email addresses or other contact information of other users from the Service or Site by electronic or other means for the purposes of sending unsolicited emails or other unsolicited communications.”

95. Facebook conceptualized privacy in terms of users’ control over how their data was collected, used, and shared. The idea was that if a user had options with respect to their personal data, then the Company provided sufficient controls to users to protect their own privacy. In reality, this practice was exactly the vehicle by which Facebook turned people into data spigots.

96. Facebook emphasized that users always had the option to “allow” Facebook to collect and share their information. But Facebook’s business depended upon users selecting the most permissive option available and, as a result, its incentive was to use every possible strategy to engineer user consent. The notion of privacy as control benefitted Facebook, at the expense of its users, by allowing the Company to leverage an illusion of agency via terms and settings to keep the data engine humming.

46 Ex. G, Zuckerberg 4/10/2018 Tr. at 13.

97. For example, Facebook’s Privacy Policy provided that users “have control over who sees what you share on Facebook” and set forth the following privacy tenets:

(a) We give you control of your privacy.
(b) We help people understand how their data is used.
(c) We design privacy into our products from the outset.
(d) We work hard to keep your information secure.
(e) We work around the clock to help protect people’s accounts, and we build security into every Facebook product.
(f) You own and can delete your information.
(g) Improvement is constant.
(h) We are accountable.47

98. Facebook’s Data Use Policy in effect from 2013 through April 2018 stated:

Your trust is important to us, which is why we don’t share information we receive about you with others unless we have received your permission; given you notice, such as by telling you about it in this policy; or removed your name and any other personally identifying information from it.48

47 “Facebook’s Privacy Principles,” www.facebook.com/about/basics/privacy-principles.
48 See Facebook’s website, https://www.facebook.com/full_data_use_policy rev’d 2013 (abbreviation_full_data.use_policy rev’d 2013).

Importantly, however, the above policies did not accurately characterize the extent to which Facebook collected and shared user information, using its default privacy settings as implicit permission.

99. First, prior to April 18, 2018, Facebook’s privacy policies stated that data and information shared with Facebook’s advertisers, measurement, and analytics partners did not include users’ PII. On April 18, 2018, Facebook removed that language because Facebook was in fact sharing PII with these partners, and had previously misrepresented otherwise.

100. Second, Facebook’s privacy settings falsely purported to give users control over their data and, in particular, information collected and shared with third-party developers of Facebook apps. However, that promise was illusory because any app downloaded by any user’s Facebook “friend” could access that user’s personal information, despite the most rigorous of privacy settings chosen by either the user or their “friend.”

101. Since the average Facebook user had 200 Facebook “friends,” the average user’s data could be collected by every app downloaded by each of their 200 friends. This translated into thousands of apps accessing an average user’s data, including users who never downloaded a single third-party app. Thus, for over a decade, no privacy setting available to any Facebook user could have prevented the harvesting and use of users’ private information by third-party app developers.

102. On June 29, 2018, users discovered that Facebook had shared their data with third-party device makers. Facebook had entered into contractual partnerships with device makers so that the device makers could recreate the “Facebook experience” on their respective portable devices. These partnerships—with their unfettered access to personal user information—however, were never disclosed to users, and there was no privacy setting or control in place to specifically limit the sharing of information with these third-party device makers.

103. Neither Facebook’s Terms of Service, data policies, nor privacy settings disclosed these partnerships with dozens of device maker companies, each of whom had unencumbered access to user data and private information. The New York Times also reported on experiments through which it found that Facebook users’ privacy settings were being overridden by the devices used to download and access Facebook.

104. In addition, prior to April 18, 2018, Facebook was not properly advising its users about the scope of information Facebook was collecting about each user from his or her devices.

105. Facebook told users that it collected:

information from or about the computers, phones, or other devices where you install or access our Services, depending on the permissions you’ve granted. We may associate the information we collect from your different devices, which helps us provide consistent Services across your devices. Here are some examples of the information we collect:

• Attributes such as the operating system, hardware version, device settings, file and software names and types, battery and signal strength, and device identifiers.
• Device locations, including specific geographic locations, such as through GPS, Bluetooth, or WiFi signals.
• Connection information such as the name of your mobile operator or ISP, browser type, language and time zone, mobile phone number and IP address.49

106. In reality, Facebook was collecting the following far broader categories:

Device Information:

As described below, we collect information from and about the computers, phones, connected TVs and other web-connected devices you use that integrate with our Products, and we combine this information across different devices you use. For example, we use information collected about your use of our Products on your phone to better personalize the content (including ads) or features you see when you use our Products on another device, such as your laptop or tablet, or to measure whether you took an action in response to an ad we showed you on your phone on a different device.

Information we obtain from these devices includes:

• Device attributes: information such as the operating system, hardware and software versions, battery level, signal strength, available storage space, browser type, app and file names and types, and plugins.
• Device operations: information about operations and behaviors performed on the device, such as whether a window is foregrounded or backgrounded, or mouse movements (which can help distinguish humans from bots).
• Identifiers: unique identifiers, device IDs, and other identifiers, such as from games, apps or accounts you use, and Family Device IDs (or other identifiers unique to Facebook Company Products associated with the same device or account).
• Device signals: Bluetooth signals, and information about nearby Wi-Fi access points, beacons, and cell towers.
• Data from device settings: information you allow us to receive through device settings you turn on, such as access to your GPS location, camera or photos.
• Network and connections: information such as the name of your mobile operator or ISP, language, time zone, mobile phone number, IP address, connection speed and, in some cases, information about other devices that are nearby or on your network, so we can do things like help you stream a video from your phone to your TV.
• Cookie data: data from cookies stored on your device, including cookie IDs and settings. Learn more about how we use cookies in the Facebook Cookies Policy and Instagram Cookies Policy.50

107. Facebook’s terms of use, data policies, and privacy settings have not fully disclosed, and still do not fully disclose, to users the scope of the private information Facebook collects from user devices.

49 See Ex. L, Facebook’s Data Policy, dated 1/30/2015, which was in effect on December 11, 2015 when Zuckerberg first claimed he learned about the Cambridge Analytica breach.
50 See Ex. M, Facebook Data Policy, dated 4/18/2018, available at https://facebook.com/full_data_use_policy.

C. The Extensive Regulation of User Privacy and Facebook’s Business

108. Facebook’s users are located all over the world, and, as such, Facebook operates in every country and must comply or, at a minimum, attempt to comply with the consumer and privacy laws and regulations around the globe.

109. According to Facebook’s Form 10-K for the period ending December 31, 2017, filed with the SEC on February 1, 2018 (“2017 Form 10-K”), Facebook is “subject to a number of U.S. federal and state and foreign laws and regulations that affect companies conducting business on the Internet… In particular, we are subject to federal, state, and foreign laws regarding privacy and protection of people’s data.”51 (Emphasis added.)

110. Facebook has made numerous privacy and data security commitments regarding the collection and processing of personal data in its certification to the EU-U.S. Privacy Shield Framework and the Swiss-U.S. Privacy Shield Framework (collectively, “Privacy Shield Frameworks”), filed with the U.S. Department of Commerce. For example, the Privacy Shield Principles, which were updated by the European Union in April 2016, provide in part:

51 Facebook Form 10-K, “Government Regulation,” dated 2/1/2018, https://www.sec.gov/Archives/edgar/data/1326801/000132680118000009/fb-12312017x10k.htm.


Informed consent means that you must be given information about the processing of your personal data, including at least: the identity of the organization processing data; the purposes for which the data is being processed; the type of data that will be processed; the possibility to withdraw consent (for example by sending an email to withdraw consent); where applicable, the fact that the data will be used solely for automated-based decision-making, including profiling; . . . .52

111. According to the 2017 Form 10-K, Facebook “is liable for any processing of personal data by such third parties that is inconsistent with the Privacy Shield Principles unless Facebook was not responsible for the event giving rise to any alleged damage.”53

112. In addition, the General Data Protection Regulation (GDPR) (EU) 2016/679 is a European regulation on data protection and privacy for individuals within the European Union that aims “to give control to citizens and residents over their personal data,” in particular “personally identifiable information,” or PII. Businesses, like Facebook, which handle personal data are required to protect and store this data using the highest-possible privacy settings by default, so that the data is not available publicly without explicit, informed consent, and cannot be used to identify a subject without additional information stored separately. The GDPR was adopted on April 14, 2016 and became enforceable on May 25, 2018.

52 See https://ec.europa.eu/info/law/law-topic/data-protection/reform/rights-citizens/how-my-personal-data-protected/how-should-my-consent-be-requested_en#references.
53 “Facebook Inc. and the EU-U.S. and Swiss-U.S. Privacy Shield,” https://www.facebook.com/about/privacyshield.

113. Facebook is also regulated and monitored by the FTC and, under the Federal Trade Commission Act, cannot engage in unfair or deceptive practices with respect to the collection and use of individuals’ personal information.

114. On November 29, 2011, following an investigation by the FTC, the FTC and Facebook reached a proposed settlement agreement, which was made a Final Order and Decision (the “Consent Order”) on July 27, 2012 and served on August 15, 2012.

115. The Consent Order:

(a) bars Facebook from making misrepresentations about privacy or security of users’ information, including: the extent to which third parties collect and access this information, steps taken by Facebook to verify privacy or security protections of third parties, and the extent to which information is available after users delete their accounts;

(b) requires Facebook to obtain a user’s affirmative express consent before sharing private data and enacting changes that override privacy settings;


(c) requires Facebook to prevent anyone from accessing a user’s material more than 30 days after the user has deleted his or her account;

(d) requires Facebook to establish and maintain a comprehensive privacy program designed to address privacy risks associated with the development and management of new and existing products and services and protect the privacy and confidentiality of consumers’ information;

(e) requires Facebook to design and implement reasonable controls and procedures to address the risks identified through a privacy risk assessment, and regular testing or monitoring of the effectiveness of those controls and procedures; and

(f) requires Facebook, every two years for 20 years after entry of the Consent Order, to obtain independent, third-party audits certifying that it has a privacy program in place that meets or exceeds the requirements of the Consent Order, and to ensure that the privacy of users’ information is protected.54

116. Under the terms of the Consent Order, Facebook had numerous legal obligations to: (a) protect users’ privacy; (b) explain what information was being shared and with whom; and (c) secure users’ informed consent before sharing PII and nonpublic information with third parties. Each of Facebook’s directors, as signatory to the 2017 Form 10-K and other public disclosures required by law, is fully aware that Facebook is subject to the above laws and regulations and that violation “could subject us [Facebook] to substantial monetary fines and other penalties that could negatively affect our financial condition and results of operations.”55

54 Ex. A, Consent Order.

D. Individual Defendants Knew, or Were Reckless in Not Knowing, That Facebook Did Not Comply with Its Legal Obligations

1. Facebook has a Culture of Indifference and Narrow Sense of Responsibility.

117. Notwithstanding Individual Defendants’ repeated promises about the importance of privacy and maintaining trust, Individual Defendants made a series of calculated business decisions to “move fast and break things,” including users’ expectations of privacy.

118. In 2010, it was famously reported that Zuckerberg, while still developing “The Facebook” at Harvard, bragged to a friend in an online chat session how easy it was to convince strangers to give him personal data, pictures and even social security numbers – he typed: “they ‘trust me’… dumb f*cks.” This callous indifference towards the protection of users’ information continues to echo in the Boardroom and down the chain of command to this day.56

55 See Facebook’s Form 10-K, dated 2/1/2018, at 7, https://www.sec.gov/Archives/edgar/data/1326801/000132680118000009/fb-12312017x10k.htm.

120.

121. The culture of indifference to privacy and accountability is well illustrated by CTO Schroepfer, who freely admits that there is no way to determine what data has been transferred and shared once it crosses Facebook’s platform. He explained that: “[t]he problem is we can’t observe the actual data transfer that happens there. I don’t actually even know physically how the data went from one to the other. There isn’t a channel that we have some sort of control over.”57 Schroepfer then blamed users, stating: “as a consumer you’re ultimately trusting a third party with your data. Whatever data you brought from Facebook, whatever data, you’re taking these personality quizzes and you’re inputting new data in there.”58 However, Facebook enables these app developers to reach Facebook users by providing them access to its platform and Application Programming Interface (“API”), Facebook’s software intermediary that allows for the transfer of information among different apps. For Schroepfer to claim that the onus is on users to be cautious further highlights the Company’s narrow view of its responsibility to its users.

56 Facebook Founder Called Trusting Users Dumb F*cks, The Register, May 14, 2010, available at https://www.theregister.co.uk/2010/05/14/facebook_trust_dumb/.

122. During the questioning of Facebook executive Simon Milner, British Member of Parliament Damian Collins raised the following analogy about Facebook’s attitude:

It is extraordinary: if Facebook were a bank, and somebody was laundering money through it, the response to that would not be, “Well, that is a matter for the person who is laundering the money and for the authorities to stop them doing it. It is nothing to do with us. We are just a mere platform through which the laundering took place.” That bank would be closed down and people would face prosecution. What you are describing here is the same attitude…59

57 See Transcript of Facebook COO Sheryl Sandberg and CTO Mike Schroepfer at Code 2018, available at https://www.recode.net/2018/5/30/17397126/facebook-sheryl-sandberg-mike-schroepfer-transcript-code-2018.
58 Id.

123. Under the Zuckerberg-Sandberg leadership, the attitude of indifference is paired with a relentless pursuit of growth. For example, Zuckerberg’s lieutenant, Vice President Andrew “Boz” Bosworth, grotesquely explained in an employee memorandum:

We connect people… Maybe it costs someone a life by exposing someone to bullies. … Maybe someone dies in a terrorist attack coordinated on our tools. … We connect people. Period. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends.…60

59 Milner Tr. at Q421.
60 Ex. K, Andrew Bosworth, Facebook, Internal Memorandum dated June 18, 2016. When asked by the U.S. House about the “disturbing” Bosworth memorandum and Zuckerberg’s “failed obligation to enforce ethics,” Facebook responded “We recognize that we have made mistakes, and we are committed to learning from this experience to secure our platform further and make our community safer for everyone going forward.” See Letter from Facebook to Chairman Greg Walden and Ranking Member Frank Pallone, U.S. House Energy and Commerce Committee, dated 6/29/2018, available at https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf [hereinafter, “Facebook 6/29/2018 House Response”].


2. Facebook was Forced to Explain a Stream of Early Privacy Abuses.

124. As a result of that culture, throughout its history, Facebook has been embroiled in controversy over its failure to protect user information. Individual Defendants were aware of numerous red flags—including lawsuits, internal and external reports, and their own admissions through repeated apologies—that Facebook was not securing users’ private information. The misappropriation and exploitation by third parties of user data entrusted to Facebook was met with negligible response from Individual Defendants, despite its constant presence in public and private conversations.

125. As early as 2006, users protested that Facebook’s “News Feed” feature was publicly disclosing information that users had intended to keep private, exposing updates to friends in one central place. In response, Zuckerberg stated:

We really messed this one up. . . We did a bad job of explaining what the new features were and an even worse job of giving you control of them.61

126. In December 2007, after launching Beacon, which automatically enrolled all users into sharing with advertisers what they were doing on outside websites and apps, Zuckerberg stated:

We simply did a bad job with this release, and I apologize for it. . . People need to be able to explicitly choose what they share.62

61 An Open Letter from Mark Zuckerberg, dated 9/8/2006, available at https://www.facebook.com/notes/facebook/an-open-letter-from-mark-zuckerberg/2208562130/.

127. In December 2009, Facebook again changed its website so that information that users may have designated as private—such as their Friends List—was made public without notice or consent. Facebook changed the default settings for users’ posts from private to public. The incident triggered anger and confusion from Facebook’s users, and an investigation that ultimately led to the FTC Consent Order.

128. In 2009, the Canadian Internet Policy and Public Interest Clinic (“CIPPIC”) filed a complaint against Facebook alleging twelve distinct data privacy issues relating to default privacy settings, collection and use of users’ personal information for advertising purposes, disclosure of users’ personal information to third-party application developers, and collection and use of non-users’ personal information.

129. In August 2009, the Office of the Privacy Commissioner of Canada, which adjudicated CIPPIC’s complaint, reached a resolution with Facebook, stating:

62 Facebook Thoughts on Beacon, dated 12/5/2007, available at https://www.facebook.com/notes/facebook/thoughts-on-beacon/7584397130/.


I am pleased that Facebook reconsidered my recommendations with respect to improving consent and safeguards around third-party application developers’ access to users’ personal information. I was concerned about open access by developers to users’ personal information and recommended that Facebook introduce technical measures to limit access.

Facebook has agreed to adopt such measures and will be implementing significant changes to its site (namely, retrofitting its API) in order to give its users granular control over what personal information developers may access and for what purposes. Facebook plans to introduce a permissions-based model whereby the user can choose what information she wants to share with that particular application. There will also be a link to a statement by the developer explaining how it will use the data. Currently, other than a user choosing to opt out of the Facebook API altogether, there is no way a user can choose what information is shared with all applications.

As for friends’ data, a user can now choose if they want to share their friends’ data with a particular application. The application will only be able to access the information the friend is already sharing with the user. Friends can limit the information they share with their friends, de-friend someone, block all applications, block specific applications or block certain information through their application privacy settings. Facebook has also agreed to add information to explain the new permissions model so that users will know what happens when their friends add applications and can take steps to limit their data should they wish to.

* * *

Facebook has committed to using its best efforts to roll out the permissions model by September 1, 2010. In the meantime, Facebook will oversee the application developers’ compliance with contractual obligations.63

130. Despite its commitment to adopt a new permissions-based model in 2010, Facebook did not update its platform until 2014 and waited until January 1, 2016 to fully implement the change.

131. In March 2010, Facebook settled a class action in a California federal court for $9.5 million to resolve claims regarding its “Beacon” feature, which tracked what users buy online and shared the information with their friends. Zuckerberg said that he regretted making Beacon an “opt-out system instead of opt-in. . . if someone forgot to decline to share something, Beacon went ahead and still shared it with their friends.”64

132. Also in 2010, The Wall Street Journal reported that online tracking firm RapLeaf Inc. used Facebook data to build databases of personal user information and sold the data to political advertisers and other commercial entities, in violation of Facebook’s policies and users’ privacy settings.65

63 https://www.priv.gc.ca/en/opc-news/news-and-announcements/2009/let_090827/.
64 Id.
65 Emily Steel and Geoffrey A. Fowler, Facebook in Privacy Breach, The Wall Street Journal, 10/18/2010, available at https://www.wsj.com/articles/SB10001424052702304772804575558484075236968.


133. In May 2010, after reporters found a privacy loophole allowing advertisers to access user identification, Zuckerberg stated: “Sometimes we move too fast.… We will add privacy controls that are much simpler to use. We will also give you an easy way to turn off all third-party services.”66

134. In 2011, Facebook users complained to the Company that some of their old profile data was inexplicably posted for anyone to view on a site called Profile Engine. The developer of that site had illegally collected 420 million user profiles, which prompted a lawsuit against the Company.

135. In 2011, a social-media startup, Klout Inc., reportedly created profiles for minors using Facebook data, without the minors’ knowledge.

136. In 2011, Max Schrems, an Austrian lawyer and privacy activist, filed a complaint with Ireland’s data protection authority over loopholes in Facebook’s policy that allowed apps to “harvest” data about users’ friends without consent.

137. Individual Defendants knew that Facebook’s practices created risk for the Company but allowed the ongoing abuses and lax enforcement to persist.

66 Geoffrey A. Fowler and Chiqui Esteban, 14 Years of Zuckerberg Saying Sorry, Not Sorry, The Washington Post, 4/9/2018, available at https://www.washingtonpost.com/graphics/2018/business/facebook-zuckerberg-apologies/?utm_term=.db1e361d79fd [hereinafter “Washington Post Article 4/9/2018”].


3. Following Frequent, Ongoing Revelations About Facebook’s Data Privacy Practice, the FTC Takes Enforcement Action.

138. By 2011, the FTC had filed an administrative action against Facebook alleging that Facebook had engaged in unfair and deceptive practices over the collection and use of user data.

139. The FTC alleged that Facebook misrepresented to users that apps installed by users would have access only to user information that they needed to operate when “the apps could access nearly all of users’ personal data,” including data the apps did not need. The eight-count complaint detailed various privacy abuses, including that:

(a) Facebook was misrepresenting to users that they could restrict sharing of data to limited audiences – for example, selecting “Friends Only” on a post did not prevent users’ information from being shared with third-party apps that their “friends” downloaded;

(b) Facebook was falsely claiming that it had a “Verified Apps” program and that Facebook certified the security of participating apps when it did not;

(c) Facebook was promising users that it would not share their personal information with advertisers when, according to the FTC, it did;

(d) Facebook was misrepresenting that when users deactivated or deleted their accounts, their photos and videos would be inaccessible. Facebook continued to have access to the content, even after users deactivated or deleted their accounts; and

(e) Facebook was not complying with the U.S.-EU Safe Harbor Framework that governs data transfer between the U.S. and the European Union.

As set forth above, the FTC’s action led to the Consent Order, attached as Ex. A.

140. In a November 29, 2011 press release announcing the settlement between the FTC and Facebook, then-FTC Chairman Jon Leibowitz stated that “Facebook is obligated to keep the promises about privacy that it makes to its hundreds of millions of users,” and “Facebook’s innovation does not have to come at the expense of consumer privacy.”67

141. Zuckerberg apologized for the misconduct, stating: “I’m the first to admit that we’ve made a bunch of mistakes. . . . Facebook has always been committed to being transparent about the information you have stored with us. . . .”68 Zuckerberg’s statement rang hollow as the lack of transparency was the reason the FTC took action against the Company to begin with.

142. All of the officers and/or directors of Facebook were aware of the Consent Order and Facebook’s obligations thereunder as it was delivered to Individual Defendants Zuckerberg, Sandberg, Andreessen, Bowles, Hastings, and Thiel in 2012,

67 FTC Press Release, Facebook Settles FTC Charges That It Deceived Consumers By Failing To Keep Privacy Promises, dated 11/29/2011, available at https://www.ftc.gov/news-events/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep.
68 Zuckerberg’s statement on FTC agreement, dated 11/29/2011, copy available at https://www.facebook.com/notes/facebook/our-commitment-to-the-facebook-community/10150378701937131/.

4. Following Entry of the FTC Consent Order, Facebook Continued to Violate Its Privacy Obligations.

143. The entry of the Consent Order did not halt the stream of public revelations regarding Facebook’s failure to protect users’ private information:69

a. In December 2012, Facebook was sued and settled a class action for $20 million over claims that in 2011 it used subscribers’ names without their permission to advertise products in its “Sponsored Stories.”

b. In July 2014, after an academic paper exposed that Facebook conducted psychological tests on nearly 700,000 users without their knowledge, Defendant Sandberg stated: “It was poorly communicated. . . And for that communication we apologize. We never meant to upset you.”

c. In February 2016, a German court fined Facebook €100,000 for failing to comply with a 2012 order related to its policies on data usage.

d. In May 2017, France’s privacy regulator fined Facebook €150,000 for misusing user data for targeted advertising and illegally tracking what users do on and off the site via cookies.

69 See Washington Post Article 4/9/2018.


e. In May 2017, the EU’s antitrust regulator fined Facebook €110,000,000 after it changed its privacy policy in contradiction to the pledge it made to segregate WhatsApp data from other Facebook platforms to secure approval of the merger of WhatsApp and Facebook in 2014.

f. In September 2017, the same EU regulator fined Facebook €1.2 million for failing to obtain proper consent to collect and store sensitive personal data, including information on gender, religion and internet use.

g. In March 2018, Spain’s data protection watchdog fined Facebook and WhatsApp a total of €600,000 for processing user data without people’s consent.70

5. Facebook Operations Manager and Whistleblower Sandy Parakilas Warned Senior Executives in 2012 of the Dangers of Lax Controls Over App Developers

144. The repeated violations of Facebook users’ privacy rights were especially egregious in light of the revelation that a Facebook manager warned Company senior executives of the very problems that led to the swirl of investigations and fines the Company now faces.

70 See https://www.telecompaper.com/news/spain-watchdog-fines-whatsapp-facebook-eur-600000-for-data-swap--1236383 and https://biglawbusiness.com/facebook-whatsapp-fined-by-spain-for-failure-to-obtain-consent/.

145. According to Sandy Parakilas, a former Facebook operations manager for a sixteen-month period spanning 2011 and 2012: “[a]nyone can create a Facebook app – there is no background check.”71 In other words, Facebook never vetted the estimated one million app developers, including private companies and hobbyists, working off of Facebook’s API as of 2010.

146. Parakilas (and other Facebook employees) expressed concerns that allowing unknown, unvetted apps to scrape Facebook users’ nonpublic information, including PII, without their express consent violated users’ reasonable expectation of privacy over the information they chose to share through their password-protected Facebook accounts.72

147. According to Defendant and CTO Schroepfer, Facebook did not review the apps’ terms and conditions, which presumably set forth the scope and purpose of data collection by the app.73

71 Parakilas Tr. at Q1213. 72 Parakilas Tr. at Q1191-1194; Ex. C, Letter from Senator Richard Blumenthal to FTC Chairperson Maureen Ohlhausen, dated 4/19/2018, at Attachment B: Letter from Sandy Parakilas to Senator Blumenthal, undated. 73 See Schroepfer Tr., Parliament, at Q2141-2144 (Schroepfer: We require that people have terms and conditions and we have an automated check to make sure they are there. At the time, this was in 2014 or maybe earlier, we didn’t read all of the terms and conditions. Stevens: You didn’t read their terms and conditions? Schroepfer: No. Stevens: Prior to you putting those measures in place, you did not ever read terms and conditions of any developer’s apps that were put on Facebook; is that right? Schroepfer: I can’t say we never read them. I think it was not a requirement that we read them).


148. Indeed, Parakilas (and other employees) expressed these concerns as early as 2011, just as Facebook was entering into a settlement agreement with the FTC over this exact misconduct.

149. Parakilas warned senior executives, including Facebook’s former CTO —who was then responsible for Facebook’s platform—that Facebook’s practices violated Facebook’s obligations to protect users’ privacy. Still, Facebook maintained its practice of allowing third-party app developers to scrape information of users and users’ Facebook “friends.”

150. Parakilas told senior managers that developers collected and exploited the private information of hundreds of millions of Facebook users, and warned senior executives at the Company that the lax approach to data protection risked a major breach.

151. Parakilas testified that there were no controls over the data accessed by apps (“Zero. Absolutely none”) and that “[o]nce the data left Facebook servers there was not any control, and there was no insight into what was going on.”74

74 Ex. I, Guardian Article, 3/20/2018.

152. Parakilas was concerned that, once data migrated from Facebook’s servers to third-party developers, Facebook could no longer monitor the data, “so [Facebook] had no idea what developers were doing with the data.”75 The Company seldom used enforcement mechanisms, such as audits of third-party developers and their apps, to ensure data was not being misused.

153. Wylie corroborated Parakilas’s testimony: “You had all kinds of people having access to the data. Staff at Palantir had access to the data; all kinds of people had access to the data.”76

154. Parakilas created a PowerPoint presentation warning about the areas where the Company was exposed and user data was at risk, and shared the presentation “with a number of people in the company.”77 He outlined his concerns with Facebook’s platform stating that “I made a map of the various data vulnerabilities of the Facebook platform,”78 “I included lists of bad actors and potential bad actors,”79 and advised on “some of the things these people could be doing and here’s what’s at risk.”80

75 Id. 76 Wylie Tr. at Q1341. 77 Parakilas Tr. at Q1192. 78 Id. 79 Id. 80 Id.

155. Parakilas relayed how he had discovered a social games app using Facebook data to automatically generate profiles of children without their consent, and another app asking permission to gain access to users’ Facebook messages and posted photos. Facebook’s senior management ignored these concerns.

156. Parakilas testified that he shared the presentation “with a number of people in the company at the time”81 including “senior executives in charge of Facebook Platform and people in charge of privacy,”82 among them executives who are still working at the Company.

157. Parakilas also warned senior executives at Facebook of the risk that its data protection policies could be breached given the Company’s minimal or nonexistent procedures for auditing and enforcing those policies.

158. Parakilas explained that his “concerns were that all of the data that left Facebook servers to developers could not be monitored by Facebook.”83 According to Parakilas, Facebook did not conduct regular audits, and although his primary responsibilities “were over policy and compliance for Facebook apps and data protection,”84 Parakilas said that “during my 16 months in that role at Facebook, I do not remember a single physical audit of a developer’s storage.”85

81 Id. 82 Id. at Q1194. 83 Id. at Q1194. 84 Id. 85 Id.


159. Parakilas “asked for more audits of developers and a more aggressive enforcement regime”86 but he did not get a specific response. “Essentially, they did not want to do that.”87 According to Parakilas, “the company felt that it would be in a worse legal position if it investigated and understood the extent of abuse, and it did not.”88

160. According to Parakilas, “Facebook had very few ways of either discovering abuse once data had been passed or enforcing on abuse once it was discovered.”89

161. In an Op-Ed in The New York Times, Parakilas stated that when he reported these incidents to senior executives, they did not care:

At a company that was deeply concerned about protecting its users, this situation would have been met with a robust effort to cut off developers who were making questionable use of data. But when I was at Facebook, the typical reaction I recall looked like this: try to put any negative press coverage to bed as quickly as possible, with no sincere efforts to put safeguards in place or to identify and stop abusive developers. When I proposed a deeper audit of developers’ use of Facebook’s data, one executive asked me, “Do you really want to see what you’ll find?”90

86 Id. 87 Id. 88 Id. 89 Parakilas Tr. at Q1188.

162. Facebook’s “trust model” was rife with security vulnerabilities and a near total abnegation of its responsibility to audit its own rules limiting use of Facebook data by third parties. In Parakilas’s own words, “[Facebook] felt that it was better not to know,” which he found to be “utterly shocking and horrifying.”91

163. Parakilas stated that “it was known and understood. . . that there was risk with respect to the way that Facebook Platform was handling data”92 but “it was a risk that they were willing to take.”93

164. At his hearing, a member of Parliament commented that “it sounds like they turned a blind eye because they did not want to find out that truth.” Parakilas agreed, stating, “That was my impression, yes.”94

90 We Can’t Trust Facebook to Regulate Itself, Op-Ed Contributor, Sandy Parakilas, New York Times, dated 11/19/2017. 91 Ex. I, Guardian Article, 3/20/2018. 92 Parakilas Tr. at Q1196. 93 Id. at Q1215. 94 Id. at Q1229.

E. GSR and Cambridge Analytica Use Facebook Data For Political Purposes

165. In 2014, Christopher Wylie (“Wylie”), a Canadian data analytics expert, began working with Cambridge Analytica, Cambridge University professor Dr. Aleksandr Kogan (“Kogan”), and Kogan’s company, Global Science Research Ltd. (“GSR”), to develop a tool that would be used to influence elections.

166. GSR and Cambridge Analytica harvested millions of Facebook users’ profiles and personal information to build a system that could profile U.S. voters and target them with personalized political advertisements.

167. According to Wylie: “[we] built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.”

168. GSR collected user data through an app called “thisismydigitallife” (“TIMDL”) designed by Kogan. GSR convinced Facebook users to download TIMDL and take a personality test under the guise that GSR was collecting the data for academic purposes. TIMDL’s disclosures described it as a “research app used by psychologists,” designed by a Cambridge academic, that would give users a better understanding of their own personalities.

169. Based on these disclosures, an estimated 270,000 Facebook users downloaded the TIMDL app using their Facebook login credentials. Facebook’s developer platform in turn allowed Kogan and Cambridge Analytica backdoor access to nonpublic information about users who downloaded the TIMDL app as well as their Facebook “friends”—including those who did not download the TIMDL app and never read the app’s disclosures nor agreed to its terms and conditions.

170. From June through August 2014—notably after the introduction of Facebook’s Graph API 2.0 and supposed system upgrades—GSR and Cambridge Analytica harvested the personal information of 87 million users, equivalent to 40% of the U.S. voting population and 27% of the entire U.S. population.

171. Wylie testified that when Facebook transferred the data in 2014, Facebook’s servers flagged the transmission due to its size and throttled the app’s transfer of data from Facebook’s server. Kogan requested assistance from Facebook’s engineers, who helped GSR successfully secure these 87 million records without notice or consent.95

95 Wylie Tr. at Q1335 (“Facebook would have known from that moment about the project because he had a conversation with Facebook’s engineers — or at least that’s what he told me… Facebook’s account of it is that they had no idea until the Guardian first reported it at the end of 2015 — and then they decided to send out letters. They sent letters to me in August 2016 asking do you know where this data might be, or was it deleted?. . . It’s interesting that… the date of the letter is the same month that Cambridge Analytica officially joined the Trump campaign. So I’m not sure if Facebook was genuinely concerned about the data or just the optics of y’know now this firm is not just some random firm in Britain, it’s now working for a presidential campaign”). See also Ex. E, The Guardian 3/18/2018 Article (“‘Facebook could see it was happening,’ says Wylie. ‘Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic uses,’ Wylie said. ‘So they were like: ‘Fine.’”).


172. Wylie had also worked with senior employees from Palantir, a company controlled by Defendant Thiel and funded by Andreessen, stating:

[w]e had several meetings with Palantir while I was there, … There were senior Palantir employees who were also working on the Facebook data. That was not an official contract between Palantir and Cambridge Analytica, but there were Palantir staff who would come into the office and work on the data, and we would go and meet with Palantir staff at Palantir. Just to clarify, Palantir did not officially contract with Cambridge Analytica, but there were Palantir staff who helped build the models that we were working on.96

173. In total, Cambridge Analytica had collected the private information of 87 million Facebook users. According to Wylie, his access to this volume of data was “the saving grace” that allowed his team to develop statistical models to predict and influence political views and behavior. Facebook “contained enough information, including places of residence, that [Cambridge Analytica] could match users to other records and build psychographic profiles.” Wylie stated that “[w]ith their profiles, likes, even private messages, [Cambridge Analytica] could build a personality profile on each person and know how best to target them with messages.”

174. Wylie produced receipts, invoices, emails, legal letters, and records proving that, between June and August 2014, GSR and Cambridge Analytica scraped the profiles of 87 million Facebook users even though “[o]nly about 270,000 users—those who participated in the survey—had consented” to sharing their data. None of the Facebook users had consented to their personal information being used for political purposes.

96 Wylie Tr. at Q1324.

175. Despite the lack of adequate disclosures or consent, Cambridge Analytica used this data, obtained under false pretenses, to provide strategic advice to 44 political campaigns in 2014, including Senator Ted Cruz’s presidential campaign in 2015, ’s presidential campaign in 2016, and the 2016 Leave.EU campaign (known as “”) related to a referendum on the United Kingdom’s withdrawal from the European Union.

176. With over one million app developers using Facebook’s platform by 2010, Facebook had no way to enforce its policies with respect to the use and sharing of user data. In addition, despite statements to the contrary, Facebook gave third-party app developers access to nonpublic user information that greatly exceeded users’ privacy settings. This practice left the Company with no practical means of monitoring third-party app developers’ use of Facebook data. According to Wylie, “[a]ll kinds of people [had] access to the data… It was everywhere.”

177. Notwithstanding their obligations as members of the Board and/or corporate officers, and (for some of the Individual Defendants) as members of committees charged with overseeing Facebook’s risk exposure, corporate governance, and other critical aspects of the Company’s business and operations, the Individual Defendants maintained policies that allowed Kogan and other third-party app developers to obtain massive amounts of Facebook user information without verification as to the nature of its use. Upon learning that Cambridge Analytica obtained and misappropriated more than 50 million users’ personal information, Facebook failed to notify users or disclose anything about the incident, or its significant impact on the Company, publicly or to investors.

F. Individual Defendants Illegally Concealed the Cambridge Analytica Scandal

1. The Guardian Reports on Cambridge Analytica’s Activities in December 2015.

178. By November 2015, elections around the world were generating increasing revenues for Facebook. On November 4, 2015, Sandberg stated during Facebook’s Third Quarter 2015 earnings call that:

On the elections and political activity and political advertising, we’re excited about the elections because we think we give politicians and people a really compelling way to interact. If you wanted to feel like you were interacting with someone running for office before you had to go to a town hall meeting. And increasingly that’s happening on Facebook. So between January 1 and October 7 of this year over 68 million people on Facebook in the US made over 1 billion interactions about the campaign alone and every candidate and every member of Congress is on Facebook now.

179. By the Third Quarter of 2015, advertising revenue—the vast majority of Facebook’s sales—had jumped 45% from the prior year to $4.3 billion. Mobile ad sales accounted for 78 percent of that, up from 66 percent in the prior year.

180. In November 2015, aware of the ground-breaking work of GSR, Facebook recruited and hired quantitative social psychologist Joseph Chancellor (“Chancellor”), the founding director of GSR.97

181. Weeks later, on December 11, 2015, The Guardian published an article (the “2015 Guardian Article”) by investigative journalist Harry Davies (“Davies”) that threatened both Facebook’s political revenue stream and Chancellor’s research on psychological profiling.

182. The 2015 Guardian Article reported that Ted Cruz’s presidential campaign was “using psychological data based on research spanning tens of millions of Facebook users, harvested largely without their permission….”98

97 Testimony of Alexander Nix before the British Parliament’s House of Commons, Digital, Culture, Media and Sport Committee, on February 27, 2018, at Q1894-1901, available at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/fake-news/oral/84838 [hereinafter, “Nix Tr.”]. 98 Harry Davies, Ted Cruz using firm that harvested data on millions of unwitting Facebook users, dated 12/11/2015, The Guardian, available at https://www.theguardian.com/us-news/2015/dec/11/senator-ted-cruz-president-campaign-facebook-user-data.

183. The 2015 Guardian Article further reported that the Facebook data had been used to create “detailed psychological profiles about the US electorate using a massive pool of mainly unwitting US Facebook users built with an online survey,” data that Cambridge Analytica had illegally harvested from Facebook.99

184. On the same day, a Facebook representative contacted the reporter for information. That evening, a senior employee at Facebook emailed the reporter, stating that: “[w]e are carefully investigating this situation. . . misleading people or misusing their information is a direct violation of our policies and we will take swift action against companies that do.”100

185. Facebook contacted “Cambridge Analytica to investigate the allegations reflected in the reporting.”101 According to Facebook, Cambridge Analytica told Facebook that “if it had obtained any Facebook data, it had not been deliberate. . . disputed the factual accuracy of The Guardian report and assured Facebook in writing, on 18 January 2016, that it had deleted the data it received from Dr. Kogan/GSR. . . .”102

99 Id. 100 Harry Davies, Facebook told me it would act swiftly on data misuse – in 2015, dated Mar. 26, 2018, The Guardian, https://www.theguardian.com/commentisfree/2018/mar/26/facebook-data-misuse-cambridge-analytica. 101 Letter from Rebecca Stimson, Facebook, to Chair Damian Collins, dated June 8, 2018, at Additional Questions to Response #25 [hereinafter “Facebook 6/8/2018 Parliament Response”].

186. On April 5, 2016, Fast Company reported that Ted Cruz’s campaign had stopped using Cambridge Analytica in light of The Guardian’s December 2015 reports. Notably, Fast Company reached out to Facebook regarding Cambridge Analytica’s “scrap[ing] [of] millions of Facebook Profiles for ‘likes’ pointing to personality traits of Facebook users,” to which Facebook responded that “it had stopped the practice soon after learning that Cambridge was doing it.”103

187. On March 30, 2017, The Intercept reported that “[Facebook] believes that Kogan and SCL complied with the request, [to delete the data] which was made during the Republican primary, before Cambridge Analytica switched over from Ted Cruz’s campaign to Donald Trump’s. It remains unclear what was ultimately done with the Facebook data, or whether any models or algorithms derived from it wound up being used by the Trump campaign.”104 The Intercept further reported that “[i]n public, Facebook continues to maintain that whatever happened during the run-up to the election was business as usual. ‘Our investigation to date has not uncovered anything that suggests wrongdoing,’ a Facebook spokesperson told The Intercept.”105

102 Facebook 6/8/2018 Parliament Response. 103 https://www.fastcompany.com/3058639/cruz-campaign-abandons-cutting-edge-behavioral-voter-targeting-tech-say-sources. 104 https://theintercept.com/2017/03/30/facebook-failed-to-protect-30-million-users-from-having-their-data-harvested-by-trump-campaign-affiliate/.

188. Zuckerberg and the other members of the Board claim they learned about the Cambridge Analytica scandal through the 2015 Guardian Article in December 2015.106 According to records obtained by the British Parliament, there was “a lot of evidence that would show that Facebook was aware of the data harvesting by GSR prior to December 2015.”107

105 Id. 106 Ex. H, Zuckerberg 4/11/2018 Tr. at 19-20 (ESHOO: When did Facebook learn that Cambridge Analytica’s research project was actually for targeted psychographic work? ZUCKERBERG: Congresswoman, it might be useful to clarify what actually happened here. A developer does research. . . (CROSSTALK) ESHOO: Well, no. I — I don’t have time for a long answer, though. When did Facebook learn that? And, when you learned it, did you contact their CEO immediately? And, if not, why not? ZUCKERBERG: Congresswoman, yes. When we learned in 2015 that a Cambridge University researcher associated with the academic institution that built an app that people chose to share their data with. . . ESHOO: We know what happened with them. But I’m asking you. ZUCKERBERG: Yes. I’m answering your question. ESHOO: Yes. All right. ZUCKERBERG: When — when we learned about that, we . . . ESHOO: So, in 2015, you learned about it? ZUCKERBERG: Yes.) 107 Id.

189. For example, GSR provided the TIMDL app terms and conditions to Facebook in 2014, or earlier, advising of the collection of data for psychological profiling purposes. Despite warnings from Parakilas and others about the dangers of privacy breaches through data sharing with third-party app developers, Facebook’s management did not implement a policy or practice of reviewing the terms and conditions of apps such as TIMDL.108 Also in 2014, Facebook engineers aided Kogan with the transfer of the 87 million users’ data, and Palantir employees were meeting with and assisting GSR with Facebook’s data.

190. The Guardian journalist Davies started asking Facebook questions about the improprieties around data flows outside of Facebook and the use of that data for political purposes by Cambridge Analytica in early 2015.

191. After learning of the misappropriation of Facebook users’ data by Cambridge Analytica no later than 2015, Facebook never publicly confirmed the results of any investigation into the 2015 Guardian Article. Facebook never informed the FBI, DOJ, FTC, or ICO. Neither Zuckerberg, Sandberg, nor any of the other Defendants took any action to ensure that impacted Facebook users would be notified that their personal information had been compromised in accordance with applicable notification and disclosure laws. According to CTO Schroepfer, he did not know why that decision was made, and the decision was not his.109

108 Schroepfer Tr. at Q2141-2144. 109 Schroepfer Tr. at Q2175-2177.


2. After The Guardian Published the December 2015 Article, Facebook Continued to Pursue Revenue Generated Through Election Activities, Ignoring the Warnings of the Media, Investigators, Zuckerberg’s Mentor Roger McNamee, and Facebook’s Chief Security Officer Alex Stamos.

192. In January 2016, Sandberg told Facebook’s investors that the 2016 election was “a big deal in terms of ad spend.” The ability to target voters was key: “Using Facebook and Instagram ads you can target by congressional district, you can target by interest, you can target by demographics or any combination of those. . . And we’re seeing politicians at all levels really take advantage of that targeting.”

193. During the Second Quarter 2016 earnings release, Sandberg stated:

That said we are pleased by what’s happened on Facebook for the election cycle. Not just on the paid side but actually on the organic side as well. We really see Facebook being embraced by politicians all over the world to get in touch with their constituents and we’re pleased with that.

194. Facebook repeatedly encouraged political operatives to take full advantage of the personal information its users shared on their Facebook accounts, and even stationed Facebook employees at political campaign headquarters to serve as consultants.

195. The impact on revenue of political campaign spending continued to grow throughout 2016. By the end of 2016, revenues totaled $27,638,000,000 (97% from third-party ads) as compared to $17,928,000,000 (95% from third-party ads) at the end of 2015, a nearly $10 billion increase in revenue.

196. The advertising spending on Facebook prompted Zac Moffat, the former digital director for ’s presidential campaign and co-founder of the political consultancy Targeted Victory, to state: “Everybody thought 2008 was the Facebook election, but I’d argue 2016 is the Facebook election. . . Facebook’s real value is in its size and scale. . . It’s that you can hit three out of four Americans on one platform.”110

197. In 2016, one of Zuckerberg’s earliest investors and mentors, Roger McNamee (“McNamee”), noticed a problem with Facebook’s campaign news feed becoming increasingly negative and believed that it was being manipulated by “bad actors.” The problem seemed to be “systemic – the algorithms themselves made the site vulnerable because they were coded to prioritize attention, and attention is best gained by messages that elicit fear, outrage, and hate-sharing.”111

198. On October 30, 2016, McNamee reached out to Zuckerberg and Sandberg about the issue:

[t]hey each responded the next day. The gist of their messages was the same: We appreciate you reaching out; we think you’re misinterpreting the news; we’re doing great things that you can’t see. Then they connected me to Dan Rose, a longtime Facebook executive with whom I had an excellent relationship. Dan is a great listener and a patient man, but he was unwilling to accept that there might be a systemic issue. Instead, he asserted that Facebook was not a media company, and therefore was not responsible for the actions of third parties.112

110 https://www.wsj.com/articles/how-facebook-is-dominating-the-2016-election-1475429365. 111 See Roger McNamee, How to Fix Facebook—Before It Fixes Us, Washington Monthly, Jan.-Mar. 2018, https://washingtonmonthly.com/magazine/january-february-march-2018/how-to-fix-facebook-before-it-fixes-us/.

199. Meanwhile, by early 2016 (if not earlier), Zuckerberg and other executives at Facebook knew that the personal data of at least 30 million Facebook users was being used to manipulate voters and that this data was in the possession of, at a minimum, Kogan, GSR, SCL, Cambridge Analytica CEO Nix, and Eunoia—a company owned by Wylie. These individuals and entities had obtained and were misappropriating Facebook users’ personal data for political purposes (and financial gain) as the 2016 elections were getting underway.

200. Facebook contacted Kogan and GSR and asked them to “explain what data they collected, how they used it, and to whom they disclosed it.” Facebook was less concerned about the misuse and more concerned about protecting its political advertising revenue stream.

112 Id.

201. For this reason, Facebook sought to keep the Cambridge Analytica situation reported by The Guardian quiet by securing from Kogan and GSR a non-disclosure agreement (“NDA”) about their collection of data. In June 2016, Kogan and GSR promised to delete the 30 million Facebook users’ data and not to disclose the manner in which they obtained and used the data. In exchange, Facebook waived and released any and all claims against Kogan/GSR concerning Facebook’s data.113

202. In August 2016, Facebook sent Wylie a letter asking: “Do you know where this data might be or was it deleted?” Wylie believed that the outreach was made after Zuckerberg and others realized that “Cambridge Analytica had officially joined the Trump campaign, so I am not sure if Facebook was genuinely concerned about the data or just the optics of, ‘Okay, now this firm is not just some random firm in Britain. It is now working for a presidential campaign.’”114 Also by that time, Facebook sales employees were working onsite aiding the Trump presidential campaign and its political consultant Cambridge Analytica.

203. According to Facebook, “[o]n September 6, 2016, counsel for SCL informed counsel for Facebook that SCL had permanently deleted all Facebook data and derivative data received from GSR and that this data had not been transferred or sold to any other entity.”

113 See Ex. D, Confidentiality Agreement between Facebook, Kogan and GSR and related Certifications produced by Facebook to Parliament on May 16, 2018. 114 Wylie Tr. at Q1336.


204. Nix and SCL, however, did not sign a certification that “Facebook user data and Facebook user friend data and data derived from such Facebook user data and Facebook user friend data” collected through the app “thisisyourdigitallife” had been destroyed until April 3, 2017.

205. By early 2017, Facebook’s Chief Security Officer (“CSO”) Alex Stamos (“Stamos”) had become a vocal critic of Facebook’s policies and lack of action.

206. He co-authored a “White Paper” titled “Information Operations and Facebook,” which set forth that Facebook’s lax data security practices were pervasive and supported by management. The “White Paper” confirmed that the Individual Defendants’ public statements about Facebook’s business practices, infrastructure, and systems were false and misleading, and it misrepresented that Facebook had “no evidence of any Facebook accounts being compromised” in connection with the 2016 election as of the date it was published, April 27, 2017.

207. Stamos later said that he had initially provided a written report to Facebook executives concerning the circumstances that led to the Cambridge Analytica leak, but instead of taking appropriate action and disclosing the breach, Facebook rewrote the report and presented it as a hypothetical scenario in the whitewashed “White Paper,” which further suppressed and concealed the wrongdoing at the Company. As a result of Stamos’s outspokenness, his relationship with Zuckerberg and Sandberg began to deteriorate.

208. He reportedly told senior management “that we have the threat profile of a Northrop Grumman or a Raytheon or another defense contractor, but we run our corporate network, for example, like a college campus, almost.”115

209. On September 6, 2017, Stamos published “An Update on Information Operations On Facebook” in the Facebook newsroom, addressing concerns raised in the media about Russian interference with the U.S. presidential election via Facebook’s platform, but Individual Defendants brushed those concerns aside as frivolous.

210. On September 6, 2017, The Wall Street Journal reported that Facebook provided information to investigators at the U.S. Senate and House of Representatives that it had uncovered 500 “inauthentic” accounts linked to Russia that purchased $100,000 worth of ads between June 2015 and May 2017.116

115 http://www.zdnet.com/article/leaked-audio-facebook-security-boss-says-network-is-like-a-college-campus.

211. On October 22, 2017, The Guardian reported that Facebook had handed to the special counsel and congressional investigators thousands of advertisements related to possible Russian interference in the elections.

212. Stamos repeatedly communicated his concerns over Facebook’s privacy vulnerabilities to Zuckerberg and Sandberg, and by December 2017, senior management, infuriated by his stance on platform security, had relieved Stamos of the majority of his duties as CSO.

3. Zuckerberg Refuses to Testify Before the British Parliament.

213. On or around January 30, 2017, the Digital, Culture, Media and Sport Committee of Parliament launched an inquiry into the spread of “fake news” through social media, collecting written evidence from 79 different sources, including Facebook.117 Parliament reconvened its inquiry on September 15, 2017, and received oral testimony from over 50 witnesses and experts in 18 full days of hearings. To date, Parliament has collected additional written evidence and statements from another 76 witnesses, obtained 21 letters from companies involved, including Facebook, and 14 pieces of witness evidence.118

116 https://www.wsj.com/articles/facebook-identifies-100-000-in-ad-spending-by-fake-accounts-with-suspected-ties-to-russia-1504730852?mod=article_inline. 117 https://www.parliament.uk/business/committees/committees-a-z/commons-select/culture-media-and-sport-committee/inquiries/parliament-2015/inquiry2/publications/.

214. On February 8, 2018, members of Parliament questioned Facebook executives—Simon Milner (“Milner”), Policy Director UK, Middle East and Africa, and Monika Bickert (“Bickert”), Head of Global Policy Management—about Cambridge Analytica’s gathering of data from users on Facebook.119

215. Although it was obvious by then that Zuckerberg and other senior executives knew of Cambridge Analytica’s misuse of 30 million users’ data as early as 2014, Milner testified that Cambridge Analytica did not have Facebook users’ data. His testimony was as follows:

Member of Parliament (“MP”) Christian Matheson: Have you ever passed any user information over to Cambridge Analytica or any of its associated companies?

FB Simon Milner: No.

MP Matheson: But they do hold a large chunk of Facebook’s user data, don’t they?

FB Simon Milner: No. They may have lots of data, but it will not be Facebook user data. It may be data about people who are on Facebook that they have gathered themselves, but it is not data that we have provided.

118 See The British House of Commons, Fake News Investigation, available at https://www.parliament.uk/business/committees/committees-a-z/commons-select/digital-culture-media-and-sport-committee/inquiries/parliament-2017/fake-news-17-19/publications/. 119 Milner Tr. at Q447-449.


MP Matheson: How will they have gathered that data from users on Facebook?

FB Simon Milner: There can be all kinds of things that these organizations do. I think what data they have would be a good question to ask them, rather than us. We have no insight on that.120

216. In a letter dated February 23, 2018, Cambridge Analytica CEO Nix informed Parliament that “Cambridge Analytica does not gather such data.” Facebook never sought to correct or clarify this false statement.121

217. During a hearing on February 27, 2018, MP Rebecca Pow asked Nix: “Does any of the data come from Facebook?” Nix replied: “We do not work with Facebook data and we do not have Facebook data.”122 Facebook never sought to correct or clarify this false statement.

218. Nix further claimed that “Cambridge Analytica has always been clear that it did not use any personality modelling or ‘’ in the election, and that it has no access to Facebook likes.” Facebook never sought to correct or clarify this false statement.

219. Thereafter, on March 17, 2018, an article in The Guardian by Carole Cadwallader gave a detailed account of the Cambridge Analytica scandal. Cadwallader based her reporting on interviews of Wylie and other anonymous employees, and on a review of documents describing personal information collected from Facebook users and the nefarious misuse of that data.

120 Id. (emphasis added). 121 Letter from Nix to House of Commons, dated 2.23.2018, http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/fake-news/written/79053.pdf. 122 Nix Tr. at Q447-449.

220. Facebook threatened legal action against The Guardian before the release of the newspaper’s March 17, 2018 exposé “blowing the lid off Facebook & Cambridge Analytica.”123

221. According to Wylie, “[a] week before publication they threatened to sue The Guardian over the use of the word ‘breach’ and then they tried to spike the story for The Guardian and The New York Times by creating this press release and giving it to some other newspapers to try to get ahead of the story. They threatened The Guardian with a lawsuit over revealing it.”

222. In a tweet, reporter Cadwallader confirmed that Facebook had threatened her and The Guardian with lawsuits.

123 Schroepfer falsely testified before Parliament that “we did not threaten to sue.” Schroepfer Tr. at Q2150-2153.


223. Following the revelations in The Guardian on March 17, 2018, and unable to kill the story, Zuckerberg went into hiding as news of the profound abuses of privacy and misuse of Facebook’s data took center stage around the world. News of the unprecedented privacy breach led to losses of over $100 billion in Facebook’s market value. In the months that followed, the stock reclaimed its value, only to decline by a further $120 billion in a single day on news arising from the Cambridge Analytica privacy breach.

224. On March 20, 2018, MP Collins wrote to Mark Zuckerberg requesting that he testify before Parliament following the news articles in The Guardian and The New York Times, stating:

The Committee has repeatedly asked Facebook about how companies acquire and hold on to user data from their site, and in particular about whether data had been taken without their consent. Your officials’ answers have consistently understated this risk, and have been misleading to the Committee. It is now time to hear from a senior Facebook executive with the sufficient authority to give an accurate account of this catastrophic failure of process.124

225. On March 26, 2018, Facebook’s Head of Public Relations in the United Kingdom, Rebecca Stimson (“Stimson”), responded by offering the testimony of CTO Schroepfer in lieu of Zuckerberg.

226. On March 28, 2018, Parliament reiterated the seriousness of the issues and again requested that Zuckerberg appear before Parliament.

227. On May 14, 2018, Stimson reiterated Zuckerberg’s refusal to appear before Parliament, stating, “Zuckerberg has no plans to meet with the committee or travel to the UK.”125

228. On May 21, 2018, Parliament offered to take Zuckerberg’s testimony by video conference.126 By letter to Facebook, Parliament pressed again about whether Facebook “really ha[s] no records of developer violations for the time period before 2014,” and whether having no such records constitutes a “serious omission.”

124 Letter from the Chair Damian Collins to Mark Zuckerberg, Facebook, March 20, 2018, available at https://www.parliament.uk/documents/commons-committees/culture-media-and-sport/180320%20Chair%20to%20Mark%20Zuckerberg%20re%20oral%20evidence.pdf. 125 Letter from Rebecca Stimson, Facebook, to the Chair Damian Collins dated May 14, 2018, https://www.parliament.uk/documents/commons-committees/culture-media-and-sport/180514-Rebecca-Stimson-Facebook-to-Ctte-Chair-re-oral-ev-follow-up.pdf. 126 Letter from the Chair Damian Collins to Rebecca Stimson, Facebook, May 21, 2018, https://www.parliament.uk/documents/commons-committees/culture-media-and-sport/180521-Chair-to-Rebecca-Stimson-Facebook.pdf.

229. On June 6, 2018, Parliament demanded that Nix, who by that time had been suspended as CEO of now-defunct Cambridge Analytica, reappear to testify about the misinformation he had previously provided to Parliament on February 27, 2018. Chairman Collins commented:

We had asked Facebook the same questions about Cambridge Analytica having access to Facebook data when we took evidence to them. We also feel that Facebook was not open and honest with the Committee in the answers it gave to us that day, and we have challenged it about that, too. I think there is fault on both sides, but what we asked you was pretty clear and you chose not to talk about any of this.127

230. After claiming that it had no records of app abuses due to a record-system change, Facebook stated on June 8, 2018:

At the time of our response, we understood that we did not have records that reliably established the number of apps we terminated for developer violations prior to 2014. However as our investigation continues to progress, we have this week determined that we have records dating back to 2010 that we are currently analyzing. We will update the Commission in due course.128

127 Nix Tr. at Q3323. 128 Letter from Rebecca Stimson, Facebook, to Chair Damian Collins, dated 6/8/2018.


231. On May 22, 2018, Zuckerberg testified before the European Parliament. During the 90-minute hearing, members directed questions to Zuckerberg regarding Facebook’s privacy policies and how the Company planned to address data protection, fake accounts, the sharing of false news, and other privacy concerns. Members of the European Parliament complained afterwards that Zuckerberg’s answers “lack[ed] substance,” represented a “missed opportunity for proper scrutiny on many crucial questions,” and “blatantly dodged” questions he chose not to answer.

4. Individual Defendants Continue to Resist Admitting to the Public that they Concealed the Cambridge Analytica Scandal in Hearings Before the U.S. Congress

232. Individual Defendants learned of Cambridge Analytica’s misuse of Facebook data for political purposes by December 11, 2015 at the latest. At least several Facebook employees knew about the scandal earlier.

233. When asked “who was the person at Facebook responsible for the decision not to tell users affected in 2015?” Facebook responded, in part, that “Zuckerberg has stated that ultimately he is responsible for what happens with Facebook.”129

234. Zuckerberg made the decision and caused the Company not to notify investors, the public, and impacted Facebook users. And, in fact, Parakilas testified that even in the handful of cases where Facebook took action against developers for breaching Facebook’s terms for using data, Facebook did not notify the users of the fact that their private information had been inappropriately accessed, used and exploited by the developers.130

235. Individual Defendants’ efforts to conceal the Cambridge Analytica scandal had failed.

236. In addition to the British Parliament, the United States Congress held hearings on the Cambridge Analytica privacy breach. During Congressional hearings, on April 9, 2018, Senator Maria Cantwell asked Zuckerberg whether the Board discussed issues relating to data privacy and elections, and Zuckerberg responded: “Yes, Senator. So data privacy and foreign interference in elections are certainly topics we have discussed at the board meeting. This are some of the biggest issues that the company has faced….”131

129 Letter from Rebecca Stimson, Facebook, to Chair Damian Collins, dated 6/8/2018, available at https://www.parliament.uk/documents/commons-committees/culture-media-and-sport/180608-Rebecca-Stimson-Facebook-to-Chair-re-oral-ev-follow-up.pdf. 130 Parakilas Tr. at Q1200-1201 (“Chair: … Also, you said that when you were there you were involved in a few cases of action being taken against developers for breaching Facebook’s terms for using data. In those cases were Facebook users notified that their data had been inappropriately accessed, used and exploited by a developer?” Parakilas: “No, not to my knowledge.” Chair: “There was no policy of informing Facebook users that their data had been breached in some way?” Parakilas: “Not to my knowledge. I do not believe that there was.”).

237. Congress asked, “[i]sn’t Facebook’s Board complicit after years of transgressions and apologies by management?” to which Facebook responded, in part, “[w]e recognize that we have made mistakes, and we are committed to learning from this experience to secure our platform further and make our community safer for everyone going forward.”132

238. Congress asked “[d]oes your board want you to resign? Not addressing security is armature [sic] behavior?” to which Facebook responded, in part, “[w]e recognize that we have made mistakes, and we are committed to learning from this experience to secure our platform further and make our community safer for everyone going forward.”133

239. Parakilas testified:

I was very disappointed to see that they allowed the Cambridge Analytica situation to play out for more than two years after they had evidently received certification from Cambridge Analytica and Kogan that the data had been deleted despite multiple media reports saying that Cambridge Analytica was using Facebook data, which they did not follow up. That is a really big concern to me.134

131 Testimony of Facebook CEO and Chairman Mark Zuckerberg before the U.S. Senate’s Commerce and Judiciary Committees on April 10, 2018, at pg. 21 of 107. 132 Responses from Facebook directed to Chairman Greg Walden, Ranking Member Frank Pallone, Energy and Commerce Committee, U.S. House of Representatives, dated June 29, 2018, at Question 398, https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf. 133 Response from Facebook directed to Chairman Greg Walden, Ranking Member Frank Pallone, Energy and Commerce Committee, U.S. House of Representatives, dated June 29, 2018, at Question 519, https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf.

G. Following the Revelations of Cambridge Analytica, Zuckerberg Continued to Mislead The Public

240. Zuckerberg reemerged on April 10 and 11, 2018 to testify before the U.S. Senate and the House of Representatives about the Cambridge Analytica debacle.

241. From his opening remarks, Zuckerberg told a revisionist account of history, seeking to portray the problem as one that had been rectified in 2014 and was the fault of one bad actor. He testified:

In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the Facebook information apps could access. Most importantly, apps like Kogan’s could no longer ask for information about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from Facebook before they could request any data beyond a user’s public profile, friend list, and email address. These actions would prevent any app like Kogan’s from being able to access as much Facebook data today.

134 Parakilas Tr. at Q1212.


CTO Schroepfer told a similar corroborating story to Parliament.135

242. Zuckerberg and Schroepfer also claimed that the reasons for the 2014 upgrades to Facebook’s platform were “to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. … [Apps] could no longer ask for data about a person’s friends unless their friends had also authorized the app.”

243. Based on Parakilas’s account, however, the Company was not concerned about app misconduct in 2014. During the Facebook Developer Conference (“F8 Conference”) held on April 30, 2014, Facebook released the new version of its platform, Graph API 2.0. Facebook announced that it made the changes because “people want more control over sharing their personal information with apps” and people are “often surprised when a friend shares their information with an app. . . So we’ve updated Facebook Login so that each person decides what information they want to share about themselves, including their friend list.”136

244. Facebook did not claim at the F8 Conference—or at any time in 2014—that the changes to its platform were made to prevent or address “bad actors” who were scraping and misusing massive amounts of data from Facebook users and their friends. None of the public statements by Zuckerberg or other Facebook employees in 2014 mentioned “abusive apps.”

135 Schroepfer Tr. at Q2143. 136 https://developers.facebook.com/blog/post/2014/04/30/the-new-facebook-login.

245. When asked about evidence of pre-2014 abuses, Schroepfer told Parliament that “[d]ue to system changes, we do not have records for the time period before 2014.”137 According to Parakilas, Facebook did not monitor the one million apps accessing Facebook users’ information and, therefore, Facebook likely has nominal (at best) records of app misconduct.

246. Kogan’s statements on the subject confirm Parakilas’s account. Specifically, Kogan stated that Facebook “created these great tools for developers to collect the data. And they made it very easy. I mean, this was not a hack. This was, ‘Here’s the door. It’s open. We’re giving away the groceries. Please collect them.’”138 Moreover, while Kogan admits he violated Facebook’s stated policy, which prohibits developers from disseminating, transferring, or selling gathered user data, Kogan did so openly, and in the direct view of Facebook—if it had bothered to look. Kogan’s user terms of service were publicly available and stated that users who clicked “OKAY” permitted him to “disseminate. . . transfer. . . or. . . sell. . . your. . . data.”139 Yet, Facebook “never cared. I mean, it never enforced this agreement.”140 Kogan had his terms of service “up there for a year and a half that said [he] could transfer and sell the data. Never a word.”141

137 Letter from Facebook to Damian Collins, dated 5/14/2018, available at https://www.parliament.uk/documents/commons-committees/culture-media-and-sport/180514-Rebecca-Stimson-Facebook-to-Ctte-Chair-re-oral-ev-follow-up.pdf. 138 Lesley Stahl, Interview with Aleksandr Kogan, dated 4/22/2018, available at https://www.cbsnews.com/news/aleksandr-kogan-the-link-between-cambridge-analytica-and-facebook/.

247. Zuckerberg testified before the U.S. Senate on April 10, 2018 that a Cambridge Analytica-type scandal could not have occurred after 2015, when Facebook changed its policy to allow its users to restrict the data that their Facebook “friends” could share with third parties: “The good news here is that we already made big changes to our platform in 2014 that would have prevented this specific situation with Cambridge Analytica from occurring again today.”142 Before the House Energy and Commerce Committee, Zuckerberg stated: “There are tens of thousands of apps that had access to a large amount of people’s information before we locked down the platform in 2014.”143

248. On June 29, 2018, more than two months after Zuckerberg testified before Congress, Facebook submitted a 747-page document to the House Energy and Commerce Committee, responding to more than 2,000 follow-up questions from members of Congress.144 The written responses revealed that Zuckerberg’s claim that the Company severed the unfettered access of third-party app developers to its users’ data in response to its discovery of the Cambridge Analytica incident in 2015 was false.

139 Id. 140 Id. 141 Id. 142 Ex. F, Zuckerberg Written Statement to Senate Committee, dated 4/10/2018. 143 Ex. H, Zuckerberg Tr. at 4/11/2018, at 39.

249. The list of third parties with whom Facebook shared data continues to grow. Moreover, the narrative that Facebook put forth that it controlled the use of its users’ data through contractual terms and policies has unraveled entirely.

250. Corroborating recent media investigations and reporting, Facebook’s disclosure to the House Committee reveals that Facebook entered into data-sharing partnerships—referred to by Facebook as “integration partnerships”—with 52 companies, primarily device makers and operating systems, including Apple, AT&T, Microsoft, Samsung, BlackBerry, and Chinese telecommunications company Huawei.

251. Facebook wrote to the Energy and Commerce Committee that it entered into “integration partnerships” with these companies so they could create versions of Facebook for their devices, send Facebook notifications to users, and allow users to sync their version of Facebook with photos and contacts on their devices. Facebook stated that the third-party companies built these features with Facebook’s approval, and that Facebook provided user data for these purposes.

144 Letter from Facebook to Senator Maggie Hassan, U.S. Senate Committee on Commerce, Science, and Transportation, dated 6/29/2018, available at https://www.judiciary.senate.gov/imo/media/doc/Zuckerberg%20Responses%20to%20Hassan.pdf.

252. In its written response to the Energy and Commerce Committee, Facebook differentiated these “integration partnerships” from its data-sharing agreements with third-party app developers because the former “typically were defined by specially-negotiated agreements that provide limited rights to use APIs to create specific integrations approved by Facebook, not independent purposes determined by the partner.” Facebook failed to mention that the types of data shared through these integration partnerships are the same types shared with other third parties, including the personal data of Facebook users’ “friends” who did not consent to having their data disclosed.

253. Facebook disclosed that as of June 29, 2018, it had severed its data-sharing partnerships with only 38 of these 52 companies. Seven companies would continue to have access until July 2018, and three companies would continue to have access to user data beyond October 2018 based on their current data-sharing agreements with Facebook.

254. By way of illustration, The New York Times conducted an experiment in which a reporter downloaded the Facebook app on to a 2013 BlackBerry device and monitored what data the BlackBerry requested and received. When the reporter connected the BlackBerry to his Facebook account, the device requested his user ID, name, photograph, “about” information, location, email, and mobile number. Once the reporter entered that information, the BlackBerry device retrieved his private Facebook messages and friends’ responses, along with the friends’ names and Facebook user IDs. The BlackBerry device transferred the Facebook data to a BlackBerry app called “Hub,” which in turn requested and received Facebook users’ data. Ultimately, through a Facebook account for a user with 550 friends, the BlackBerry device was able to retrieve identifying information for nearly 295,000 Facebook users, including more than 50 types of information about Facebook “friends” of the reporter, and their “friends.”

255. In his testimony before Congress, Zuckerberg was evasive and failed to mention the existence of these “integration partnerships” with device makers when asked about cross-device tracking:

BLUNT: But do you collect user data through cross-device tracking?

ZUCKERBERG: Senator, I believe we do link people’s accounts between devices in order to make sure that their Facebook and Instagram and their other experiences can be synced between their devices.

BLUNT: And that would also include offline data, data that’s tracking that’s not necessarily linked to Facebook, but linked to one — some device they went through Facebook on, is that right?

ZUCKERBERG: Senator, I want to make sure we get this right. So I want to have my team follow up with you on that afterwards.


BLUNT: Well, now, that doesn’t seem that complicated to me. Now, you — you understand this better than I do, but maybe — maybe you can explain to me why that’s that — why that’s complicated.

Do you track devices that an individual who uses Facebook has that is connected to the device that they use for their Facebook connection, but not necessarily connected to Facebook?

ZUCKERBERG: I’m not — I’m not sure of the answer to that question.

BLUNT: Really?

ZUCKERBERG: Yes. There — there may be some data that is necessary to provide the service that we do. But I don’t — I don’t have that on — sitting here today. So that’s something that I would want to follow up on.

BLUNT: Now, the FTC, last year, flagged cross-device tracking as one of their concerns — generally, that people are tracking devices that the users of something like Facebook don’t know they’re being tracked.

How do you disclose your collected — collection methods? Is that all in this document that I would see and agree to before I entered into Facebook?

ZUCKERBERG: Yes, senator. So there are — there are two ways that we do this. One is we try to be exhaustive in the legal documents, or on the terms of service and privacy policies. But, more importantly, we try to provide in-line controls so that — that are in plain English, that people can understand.

They can either go to settings, or we can show them at the top of the app, periodically, so that people understand all the controls and settings they have and can — can configure their experience the way that they want.

BLUNT: So do people — do people now give you permission to track specific devices in their contract? And, if they do, is that a relatively new addition to what you do?

ZUCKERBERG: Senator, I’m sorry. I don’t have that.


BLUNT: Am I able to — am I able to opt out? Am I able to say, “It’s okay for you to track what I’m saying on Facebook, but I don’t want you to track what I’m texting to somebody else, off Facebook, on an Android phone.”?

ZUCKERBERG: Okay. Yes, senator. In — in general, Facebook is not collecting data from other apps that you use. There may be some specific things about the device that you’re using that Facebook needs to understand in order to offer the service.

But, if you’re using or you’re using some texting app, unless you specifically opt in that you want to share the texting app information, Facebook wouldn’t see that.

BLUNT: Has it always been that way? Or is that a recent addition to how you deal with those other ways that I might communicate?

ZUCKERBERG: Senator, my understanding is that that is how the mobile operating systems are architected.

BLUNT: The — so do you — you don’t have bundled permissions for how I can agree to what devices I may use, that you may have contact with? Do you — do you bundle that permission? Or am I able to, one at a — individually say what I’m willing for you to — to watch, and what I don’t want you to watch?

And I think we might have to take that for the record, based on everybody else’s time.145

256. When Zuckerberg told members of Congress that access to user data was “locked down” in 2014, he failed to mention that Facebook exempted device makers who produced mobile phones, tablets, and other hardware on which Facebook’s users access the social media site (mobile devices account for the majority of Facebook’s advertising revenues).

145 Zuckerberg Tr. 4/10/2018, at 36.


257. Zuckerberg also failed to mention in his testimony before the U.S. Senate that when Facebook changed its data-sharing policies in 2014, companies would have a year to transition before they were subject to Facebook’s more restricted API and new review and approval protocols.

258. What was not public, and not disclosed by Facebook until it submitted its responses to the House Energy and Commerce Committee, is that what Facebook calls a “small set of companies” were given an extension of several months beyond 2015.

259. Facebook continued to share users’ personal information with these companies, which it called “whitelist” companies, including dating apps Hinge and Coffee Meets Bagel, Royal Bank of Canada, and Nissan Motor Co. The information shared with these companies included “friend links” that allowed third parties to determine the degree of closeness between Facebook users and others in their networks.

260. Facebook claims these contracts are “entirely consistent with Facebook’s FTC consent decree.” Jessica Rich, a former FTC official, stated: “Under Facebook’s interpretation, the exception swallows the rule. They could argue that any sharing of data with third parties is part of the Facebook experience. And this is not at all how the public interpreted their 2014 announcement that they would limit third-party app access to friend data.”

261. Facebook’s disclosure of user data to Chinese telecommunications company Huawei has come under particular scrutiny. Huawei is the world’s third-largest maker by volume. Facebook’s data-sharing agreement with Huawei dates back to 2010. A congressional report published in 2012 noted the “close relationships between the Chinese Communist Party and equipment makers like Huawei.” The company’s footprint in the U.S. has been limited due to security concerns of government agencies, including the FBI, CIA, NSA, FCC, and the House of Representatives Intelligence Committee.

262. In February 2018, FBI Director Christopher Wray testified before the Senate Intelligence Committee that the FBI was “deeply concerned” about Huawei and another Chinese device maker. Disregarding publicly available and easily accessible information, Facebook continued to share its users’ data with Huawei until as recently as June 2018.

H. Individual Defendants Knowingly Made False and Misleading Public Statements Regarding Facebook’s Privacy Practices

263. Facebook’s core assets are the public’s trust and the commitment to protect one of the largest troves of confidential personal data on the planet. Facebook has repeatedly disclosed this fact, and the Director Individual Defendants’ highest duty to the Company was to protect those assets. Countless times, the Individual Defendants caused the Company to inform shareholders that the Board was fulfilling its duty. Facebook does not share your private data with third parties, the public was told. And besides, the FTC enforced this promise through the 2011 Consent Order. Zuckerberg has testified that “[p]rivacy is at the core of everything we do, and our approach to privacy starts with our commitment to transparency and control.”146 The Company’s “approach to control is based on the belief that people should be able to choose who can see what they share and how their data shapes their experience on Facebook.”147 The Company is also committed to its “threefold approach to transparency,” which includes: “whenever possible, providing information on the data we collect and use and how people can control it in context and in our products;” “provid[ing] information about how we collect and use data in our user agreements and related educational materials;” and “enabl[ing] people to learn more about the specific data we have about them. . . .”148

146 Facebook 6/8/2018 Senate Response, at 189.
147 Response from Facebook directed to U.S. House of Representatives Energy and Commerce Committee, Question #3, dated June 29, 2018, available at https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411-SD003.pdf.
148 Id.

264. It is now known that Facebook lied in spectacular fashion. Facebook was in fact sharing the confidential personal data of at least 87 million users with Cambridge Analytica without the users’ express consent.

265. Then, in Congressional hearings and testimony before the European Parliament, Facebook revealed for the first time that data was being shared with sixty additional third-parties, including several manufacturers with close ties to the Chinese government. Most notably, data was being shared with Huawei, a company recently accused by the U.S. government of spying on U.S. citizens. When the Company finally revealed the full impact of the Directors’ breaches on earnings on July 25, 2018, Facebook shareholders lost $119 billion in a single day—the largest single-day destruction of shareholder value in history.149 To put this reckoning in context, it exceeded the entire market capitalization of General Electric, Goldman Sachs or IBM. The $119 billion shareholder loss was larger than the gross domestic product of 153 nations.

266. Facebook has always emphasized the importance of user trust to its business model. For example, Facebook’s 2014 Form 10-K stated that:

Trust is a cornerstone of our business. We dedicate significant resources to the goal of building user trust through developing and implementing programs designed to protect user privacy, promote a safe environment, and assure the security of user data. The resources we dedicate to this goal include engineers, analysts, lawyers, policy experts, and operations specialists, as well as hardware and software from leading vendors and solutions we have designed and built.

149 See Therese Poletti, Facebook Pays for All Its Mistakes at Once, And It Is a Big Bill, MarketWatch.com, dated 7/28/2018.

• Privacy and Sharing. People come to Facebook to connect and share with different audiences. Protecting user privacy is an important part of our product development process. Our objective is to give users choice over what they share and with whom they share it. This effort is fundamental to our business and focuses on control, transparency, and accountability.

• Control. We believe that by providing our users with clear and easy-to-use controls, we will continue to promote trust in our products. For example, when a user posts a status update or uploads a photo to Facebook, our in-line controls allow the user to select his or her audience at the same time that he or she is publishing the post. In addition, we provide other data management tools. “Activity Log” is a unified tool that people can use to review and manage the content they have posted and the actions they have taken on Facebook. When using the Activity Log, a user can view his or her activity with a particular application, delete a specific post, change who can see a photo, or remove an application completely. Additionally, our “Download Your Information” tool enables users to access and store their personal information off Facebook.

• Transparency. Our Data Use Policy describes in plain language our data use practices and how privacy works on Facebook. We also offer a number of tools and features that provide users with transparency about their information on Facebook. Our application settings feature enables users to view each of the applications they have chosen to use, the information needed by each application, and the audience with whom the user has chosen to share his or her interactions with each application. We believe that this transparency enables people to make more informed decisions about their activities on Facebook.


• Accountability. We continue to build new procedural safeguards as part of our comprehensive privacy program. These include a dedicated team of privacy professionals who are involved in new product and feature development from design through launch; ongoing review and monitoring of the way data is handled by existing features and applications; and rigorous data security practices. We regularly work with online privacy and safety experts and regulators around the world. In August 2012, the Federal Trade Commission formally approved a 20-year settlement agreement requiring us to enhance our privacy program and to complete biennial third-party assessments. We also have undergone two audits by the Office of the Irish Data Protection Commissioner. The audits comprehensively reviewed our compliance with Irish data protection law, which is grounded in European data protection principles. As part of the audit process, we agreed to enhance various data protection and privacy practices to ensure compliance with the law and adherence to industry best practices.

267. Facebook has also admitted that failing to respect privacy would present a clear risk to the Company. For example, the 2014 10-K states:

[Facebook is] subject to a number of U.S. federal and state, and foreign laws and regulations that affect companies conducting business on the Internet. Many of these laws and regulations are still evolving and being tested in courts, and could be interpreted in ways that could harm our business. These may involve user privacy, rights of publicity, data protection, content, intellectual property, distribution, electronic contracts and other communications, competition, protection of minors, consumer protection, taxation and online payment services. In particular, we are subject to federal, state, and foreign laws regarding privacy and protection of user data. Foreign data protection, privacy, and other laws and regulations can be more restrictive than those in the United States. U.S. federal and state and foreign laws and regulations are constantly evolving and can be subject to significant change. In addition, the application and interpretation of these laws and regulations are often uncertain, particularly in the new and rapidly-evolving industry in which we operate, and may be interpreted and applied inconsistently from country to country and inconsistently with our current policies and practices . . . the European Commission is currently considering a data protection regulation that may include operational requirements for companies that receive personal data that are different than those currently in place in the European Union, and that may also include significant penalties for non-compliance.150

268. Similarly, the 2017 10-K repeatedly emphasized the risk to the Company of failing to respect privacy (and similar language appeared in the 2015 and 2016 10-K’s). Specifically, the size of Facebook’s user base may be impacted if “we adopt terms, policies, or procedures related to areas such as sharing or user data that are perceived negatively by our users or the general public”, and if “developers whose products are integrated with our products, or other companies in our industry are the subject of adverse media reports or other negative publicity.”

269. Another potential risk factor was “decisions by marketers to reduce their advertising as a result of adverse media reports or other negative publicity involving us, content on our products, developers with mobile and web applications that are integrated with our products, or other companies in our industry.”

150 Facebook’s 2015, 2016, 2017, and 2018 Form 10-K restated many of these same obligations with substantially similar language.

270. Several other related risks included:

If we are not able to maintain and enhance our brands, or if events occur that damage our reputation and brands, our ability to expand our base of users, marketers, and developers may be impaired, and our business and financial results may be harmed.

We believe that our brands have significantly contributed to the success of our business. We also believe that maintaining and enhancing our brands is critical to expanding our base of users, marketers, and developers. Many of our new users are referred by existing users. Maintaining and enhancing our brands will depend largely on our ability to continue to provide useful, reliable, trustworthy, and innovative products, which we may not do successfully. We may introduce new products or terms of service or policies that users do not like, which may negatively affect our brands. Additionally, the actions of our developers or advertisers may affect our brands if users do not have a positive experience using third-party mobile and web applications integrated with our products or interacting with parties that advertise through our products. We will also continue to experience media, legislative, or regulatory scrutiny of our decisions regarding user privacy, content, advertising, and other issues, which may adversely affect our reputation and brands. For example, we previously announced our discovery of certain ads and other content previously displayed on our products that may be relevant to government investigations relating to Russian interference in the 2016 U.S. presidential election. We also may fail to respond expeditiously to the sharing of objectionable content on our services or objectionable practices by advertisers, or to otherwise address user concerns, which could erode confidence in our brands. 
Our brands may also be negatively affected by the actions of users that are deemed to be hostile or inappropriate to other users, by the actions of users acting under false or inauthentic identities, by the use of our products or services to disseminate information that is deemed to be misleading (or intended to manipulate opinions), by perceived or actual efforts by governments to obtain access to user information for security-related purposes or to censor certain content on our platform, or by the use of our products or services for illicit, objectionable, or illegal ends. Maintaining and enhancing our brands may require us to make substantial investments and these investments may not be successful. Certain of our past actions have eroded confidence in our brands, and if we fail to successfully promote and maintain our brands or if we incur excessive expenses in this effort, our business and financial results may be adversely affected.

Security breaches and improper access to or disclosure of our data or user data, or other hacking and phishing attacks on our systems, could harm our reputation and adversely affect our business.

Our industry is prone to cyber-attacks by third parties seeking unauthorized access to our data or users’ data or to disrupt our ability to provide service. Any failure to prevent or mitigate security breaches and improper access to or disclosure of our data or user data, including personal information, content or payment information from users, could result in the loss or misuse of such data, which could harm our business and reputation and diminish our competitive position. In addition, computer malware, viruses, social engineering (predominantly spear phishing attacks), and general hacking have become more prevalent in our industry, have occurred on our systems in the past, and will occur on our systems in the future. We also regularly encounter attempts to create false or undesirable user accounts, purchase ads, or take other actions on our platform for purposes such as spamming, spreading misinformation, or other objectionable ends. As a result of our prominence, the size of our user base, and the types and volume of personal data on our systems, we believe that we are a particularly attractive target for such breaches and attacks. Such attacks may cause interruptions to the services we provide, degrade the user experience, cause users to lose confidence and trust in our products, impair our internal systems, or result in financial harm to us. Our efforts to protect our company data or the information we receive may also be unsuccessful due to software bugs or other technical malfunctions; employee, contractor, or vendor error or malfeasance; government surveillance; or other threats that evolve. In addition, third parties may attempt to fraudulently induce employees or users to disclose information in order to gain access to our data or our users’ data. Cyber-attacks continue to evolve in sophistication and volume, and inherently may be difficult to detect for long periods of time. Although we have developed systems and processes that are designed to protect our data and user data, to prevent data loss, to disable undesirable accounts and activities on our platform, and to prevent or detect security breaches, we cannot assure you that such measures will provide absolute security, and we may incur significant costs in protecting against or remediating cyber-attacks.

In addition, some of our developers or other partners, such as those that help us measure the effectiveness of ads, may receive or store information provided by us or by our users through mobile or web applications integrated with Facebook. We provide limited information to such third parties based on the scope of services provided to us. However, if these third parties or developers fail to adopt or adhere to adequate data security practices, or in the event of a breach of their networks, our data or our users’ data may be improperly accessed, used, or disclosed.

Affected users or government authorities could initiate legal or regulatory actions against us in connection with any security breaches or improper disclosure of data, which could cause us to incur significant expense and liability or result in orders or consent decrees forcing us to modify our business practices. Such incidents may also result in a decline in our active user base or engagement levels. Any of these events could have a material and adverse effect on our business, reputation, or financial results.

Unfavorable media coverage could negatively affect our business.

We receive a high degree of media coverage around the world. Unfavorable publicity regarding, for example, our privacy practices, terms of service, product changes, product quality, litigation or regulatory activity, government surveillance, the actions of our advertisers, the actions of our developers whose products are integrated with our products, the use of our products or services for illicit, objectionable, or illegal ends, the actions of our users, the quality and integrity of content shared on our platform, or the actions of other companies that provide similar services to us, has in the past, and could in the future, adversely affect our reputation. Such negative publicity also could have an adverse effect on the size, engagement, and loyalty of our user base and result in decreased revenue, which could adversely affect our business and financial results.151

271. In public comments, the Individual Defendants continued to falsely assure investors that the Company was protecting user privacy while also admitting that failure to do so would harm the Company. For example, on November 1, 2017, Facebook held its Q3 Earnings Call, which among other things, discussed a renewed “focus” on “protecting the security and integrity of our platform and the safety of our community.” According to Zuckerberg,

[The threat] goes beyond elections and it means strengthening all of our systems to prevent abuse and harmful content. . . We already have about 10,000 people working on safety and security, and we’re planning to double that to 20,000 in the next year to better enforce our Community Standards and review ads. In many places, we’re doubling or more our engineering efforts focused on security. . . I’m dead serious about this, and the reason I’m talking about this on our earnings call is that I’ve directed our teams to invest so much in security-- on top of the other investments we’re making-- that it will significantly impact our profitability going forward, and I wanted our investors to hear that directly from me. I believe this will make our society stronger and in doing so will be good for all of us over the long term. But I want to be clear about what our priority is: protecting our community is more important than maximizing our profits.

151 Virtually every single annual and quarterly filing since at least 2016 has included substantially similar statements concerning the potential impact of improper access or disclosure of private data.

* * *

We’re also working with other tech companies to help identify and respond to new threats, because as we’ve now seen, if there’s a national security threat involving the internet, it will affect many of the major tech companies. And we’ve announced a number of steps to help keep this kind of interference off our platform.152

272. On the dates of each of these disclosures, the Individual Defendants were aware that Facebook was in fact unlawfully sharing private user data. Yet, none of the Company’s SEC filings prior to March 2018 mentioned Cambridge Analytica, the data sharing agreements with manufacturers, or any other adverse facts alleged herein. Because Facebook disclosed the importance of user trust to the business model and the risks associated with the failure to protect privacy and losing user trust, the Company was required to disclose material facts related to third-party data sharing of private user information. It was a breach of the Individual Defendants’ highest fiduciary duties to mislead shareholders.

273. Indeed, Defendant Zuckerberg continued to mislead the public about the Cambridge Analytica scandal through February of 2018—causing the Company to incur a monetary penalty of £500,000.

152 https://s21.q4cdn.com/399680738/files/doc_financials/2017/Q3/Earnings-call-prepared-remarks.pdf.

274. On July 29, 2018, the Parliament of the United Kingdom released an Interim Report of “ and ‘Fake News.’” The Report, which has a section devoted to Facebook and Cambridge Analytica, notes that:

Facebook claimed that Dr [sic] Kogan had violated his agreement to use the data solely for academic purposes. On Friday 16 March 2018, Facebook suspended Kogan from the platform, issued a statement saying that he ‘lied’ to the company, and characterised [sic] his activities as ‘a scam—and a fraud’. Facebook also suspended Christopher Wylie at the same time. On Wednesday 21 March 2018, Mark Zuckerberg called Dr Kogan’s actions a ‘breach of trust’. However, when Facebook gave evidence to us in February 2018, they failed to disclose the existence of this ‘breach of trust’ and its implications.

In its commitment to update our Committee on its ongoing investigation, the ICO decided to publish a Notice of Intent to issue a monetary penalty to Facebook of £500,000, ‘for lack of transparency and security issues relating to the harvesting of data constituting breaches of the first and seventh data protection principles’, under the Data Protection Act 1998. It should be noted that, if the new Data Protection Act 2018 had been in place when the ICO started its investigation into Facebook, the ICO’s Notice of Intent to impose 4% of its annual turnover of $7.87 billion, which would have totalled [sic] £315 million.

275. Defendant Zuckerberg also continued to mislead the public during his Congressional testimony in April 2018. As The Wall Street Journal noted on July 2, 2018, “[i]n testimony before lawmakers in April [2018]. . . Facebook’s Chief Executive Mark Zuckerberg said Facebook moved to eliminate broad access to information about users’ friends in 2014, and developers had until May 2015 to comply with the rules. Zuckerberg at the time didn’t mention that Facebook struck customized data-sharing deals that gave select companies access to user records after the point in 2015 when it said it walled off that information.”153 It was only in the 747-page follow-up written responses to Congress that Facebook finally disclosed that Facebook continued to allow data sharing after May 2015 through 2018.

276. The Individual Defendants simultaneously filed or caused to be filed materially false and misleading Proxy Statements with the SEC, which similarly failed to mention the known Cambridge Analytica user and data privacy breach.

277. On April 14, 2017, Defendants drafted, approved, reviewed, signed, and/or filed with the SEC a Proxy Statement in connection with the Company’s Annual Meeting. The 2017 Proxy Statement sought stockholder approval for the re-election of all eight members of the Board of Directors to serve a one-year term.

278. The 2017 Proxy Statement also contained a shareholder proposal “regarding an independent chair” requesting:

the Board of Directors to adopt a policy, and amend the bylaws as necessary, to require the Chair of the Board to be an independent member of the Board. This independence policy shall apply prospectively so as not to violate any contractual obligation. The policy should provide that (i) if the Board determines that a Chair who was independent when selected is no longer independent, the Board shall select a new Chair who satisfies the policy within 60 days of that determination; and (ii) compliance with this policy is waived if no independent director is available and willing to serve as Chair.154

153 https://www.wsj.com/articles/sec-fbi-question-facebook-over-user-data-1530575905?mod=searchresults&page=1&pos=1&mod=article_inline.

279. The Supporting Statement accompanying the proposal started:

Facebook CEO Mark Zuckerberg has also been Facebook’s board Chair since 2012. We believe the combination of these two roles in a single person weakens a corporation’s governance, which can harm shareholder value. As Intel’s former chair Andrew Grove stated, “The separation of the two jobs goes to the heart of the conception of a corporation. Is a company a sandbox for the CEO, or is the CEO an employee? If he’s an employee, he needs a boss, and that boss is the board. The chairman runs the board. How can the CEO be his own boss?”

280. Another proposal, brought by Arjuna Capital and Baldwin Brothers, called on the Company to publish a report examining the public policy implications around “fake news.” Arjuna Capital believed Facebook needed to take clear action on the issue or risk alienating users. Another proposal sought to take away the special class of stock that gives Zuckerberg majority control of the company.

154 Ex. T, Facebook’s 2017 Proxy Statement, filed on Schedule DEF14A, dated 4/14/2017.


281. The 2017 Proxy contained materially false and misleading information because the stockholders did not know Defendants were withholding information related to the Cambridge Analytica breach. The 2017 Proxy Statement also failed to disclose that the Company was sharing private user data with sixty additional third-parties that were on Facebook’s “whitelist”. Instead, Individual Defendants reiterated to investors the opposite: that the Company had changed its developer policy in 2014 to restrict (limit) the information to which developers had access.

282. All shareholder proposals were considered (based on the false information in the Proxy) and of course were rejected by a majority vote, because Defendant Zuckerberg had a majority of the vote. All Directors were reelected and compensated, at least partly, based on their performance.

283. However, one of the proposals (the proposal to strip Zuckerberg of control) was approved by 79 percent of shares not controlled by Zuckerberg. And the proposal to separate the company’s chairman and CEO roles won significant minority support (31 percent of the shares not controlled by Zuckerberg). Both percentages would have been higher had Individual Defendants properly disclosed all material facts and information related to the then known sharing of confidential personal data of 87 million Facebook users.


284. Similarly, on April 13, 2018, Defendants drafted, approved, reviewed, signed, and/or filed with the SEC a Proxy Statement in connection with the Company’s Annual Meeting. Like the 2017 Proxy Statement, the 2018 Proxy Statement sought stockholder approval for the re-election of all eight members of the Board of Directors to serve a one-year term155 and failed to disclose pertinent, known facts and information related to the sharing (or misappropriation) of user data. Indeed, Cambridge Analytica is never even mentioned in the 2018 Proxy.

285. Despite these material omissions, shareholders forced votes on proposals that, among other things, sought: (a) changes to the Company’s voting structure,156 (b) the creation of a risk oversight committee,157 and (c) an obligation to report how it enforces its terms of service to prevent election interference, fake news, hate speech, and sexual harassment from being posted to its platform.

286. In the proposal to change the voting structure to a “one share, one vote” system, the Supporting Statement noted:

By allowing certain stock to have more voting power than others, our company takes our public shareholder money but does not let us have an equal voice in our company’s management. Facebook founder Mark Zuckerberg personally controls the firm with over 51% of the vote, though he owns only 14% of the economic value of the firm. . . .

155 Ex. N, Facebook’s 2018 Proxy Statement, filed on Form DEF14A, dated 4/13/2018, as updated on 5/8/2018. On May 8, 2018, the Company updated the 2018 Proxy to reflect Jan Koum’s decision to leave the Company and not to stand for election.
156 See id. at 49 (Stockholder Proposal Three).
157 See id. at 51 (Stockholder Proposal Four).

Independent analysts appear to agree with our concerns. Facebook, Inc.’s ISS Governance QualityScore as of December 1, 2017 is 10 (its highest risk category), including a pillar score of 10 for Board and 9 for Shareholder Rights indicating a relatively higher governance risk.

Our company’s own 10-K describes the risk of the current share system: “Mark Zuckerberg . . . is able to exercise voting rights with respect to a majority of the voting power of our outstanding capital stock and therefore has the ability to control the outcome of matters submitted to our stockholders for approval . . . this concentrated control could result in the consummation of such a transaction that our other stockholders do not support.”

287. Large pension funds that supported these proposals included the $352.8 billion California Public Employees’ Retirement System; the C$356.1 billion ($274.4 billion) Canada Pension Plan Investment Board; the $224.8 billion California State Teachers’ Retirement System; the $206.9 billion New York State Common Retirement Fund; the $204.9 billion Florida State Board of Administration; and the $146 billion Texas Teacher Retirement System, Austin.

288. In light of the Cambridge Analytica scandal and Zuckerberg’s testimony before Congress, the 2018 shareholder meeting became one of the highest-profile shareholder revolts in history. A majority of the minority shareholders (including all of the shares held by the pension funds above) voted against the re-election of Facebook CEO and Chairman Mark Zuckerberg and chief operating officer Sheryl Sandberg.

289. Because Facebook is a controlled company, all of the pension funds’ shareholder proposals were voted down and all directors were reelected, prompting shareholder activist James McRitchie to famously declare that Facebook is run like “a corporate dictatorship,” and to call on “Mr. Zuckerberg [to] take a page from history: Emulate George Washington, not Vladimir Putin.” Meanwhile, Christine Jantz of Northstar Asset Management called the vote an “egregious example of when a board is formed by a CEO to meet his needs” rather than those of the investors. Another unidentified investor stood up and loudly proclaimed that “[s]hareholder democracy is already lacking at Facebook.”

290. Patrick Doherty, the director of corporate governance for the New York City comptroller, stated: “The idea that there should be an autocrat in charge of a gigantic public company, which has billions of dollars of shareholder money invested in it, is an anachronism. It harks back to the 19th century when you had these robber barons who were autocrats and dictators.” Similarly, Scott Stringer, the Comptroller of the City of New York, noted that “We have concerns about the structure of the board that the company doesn’t seem ready to address, which can lead to risks – reputational, regulatory, and otherwise.”


291. The shareholder outrage would have been significantly greater had Facebook properly disclosed all material facts and information related to the then known sharing of the confidential personal data of tens of millions of users. The Individual Defendants’ breach of fiduciary duties stemming from the false statements and omissions in the 2017 and 2018 Proxy Statements materially injured the Company and the shareholder voting franchise.

292. Delaware law does not permit corporations to dispense with shareholder votes – even when a firm is “controlled.” In fact, even Individual Defendants recognize the fundamental importance of shareholder votes. In Section VI of the Company’s “Corporate Governance Guidelines,” when nominating members for election to the Board or filling vacancies on the Board, the Directors are required to “consider advice and recommendations from its stockholders.” Furthermore, the Corporate Governance Guidelines are designed to “encourage effective policy and decision making at both the Board and management level, with a view to enhancing long-term value for Facebook stockholders.” Thus, by definition, a false and misleading Proxy harmed the Company—it made an informed advisory vote of shareholders impossible.


I. Defendants Koum, Zuckerberg, and Sandberg Sold Facebook Shares Based on Their Knowledge of Material, Non-Public Information

293. Between January 1, 2015 and July 25, 2018, three of the Individual Defendants – Koum, Zuckerberg, and Sandberg (collectively, the “Insider Trading Individual Defendants”) – sold more than 95 million shares, netting them more than $13.7 billion:

Defendant     Trading Period            Shares Sold     Proceeds ($)
Koum          2/11/2015 - 5/16/2018     60,458,555      7,928,741,370
Sandberg      1/15/2015 - 7/19/2018      4,668,228        495,152,657
Zuckerberg    5/11/2015 - 7/25/2018     30,763,148      5,410,803,701
Total:                                  95,889,933     13,734,697,738

294. The Insider Trading Individual Defendants sold shares of Facebook stock based on their knowledge of material, non-public information (“MNPI”) obtained in their roles as Directors of the Company, thereby breaching their fiduciary duties to the Company.

295. In particular, many analysts noted the irregularity of insider selling and linked it to the record July 25, 2018 sell-off. For example, Neil Wilson, chief market analyst at Markets.com in London, said “I think we were all caught off guard by the extent of the move. However, investors should have really seen something like this coming as insiders have been selling shares heavily in recent months.”158

296. On December 1, 2015, Defendant Zuckerberg announced that he would “sell or gift no more than $1 billion of Facebook stock for the next three years.” But this announcement was made several months after Zuckerberg knew that the Company was violating its own policy against sharing user data with third parties. Zuckerberg then improperly sold 7.674 million shares in 2016 (netting approximately $331 million) and another 5.75 million shares in 2017 (netting approximately $648 million).

297. On September 22, 2017, after Congress announced an investigation of

Cambridge Analytica, and after Zuckerberg knew the full extent of the Company’s privacy problems and the extent to which the problems were not being disclosed to shareholders, rather than suspend his previous plan to sell shares, Zuckerberg announced an acceleration—this time to exceed 30 million shares.

298. Zuckerberg commenced his accelerated sales on February 12, 2018

(several weeks before the full revelation of the Cambridge Analytica scandal), and continued the improper sales through the July 2018 sell-off. In this brief window

158 Roger Aitken, Facebook’s 20% Stock Implosion Signaled by Insider Selling, But Is It a Buy Now?, dated 7/28/2018, available at www.forbes.com.

alone (February 12, 2018 to July 25, 2018), Zuckerberg sold approximately 23.93 million shares (netting approximately $4.43 billion):

[Chart: Mark Zuckerberg Insider Sales, 1/1/2015 - 7/26/2018 (number of shares sold over time), showing three clusters of sales: 2,674,913 shares (value $331,815,570.76); 4,165,553 shares (value $648,893,663.80); and 23,922,682 shares (value $4,430,094,466.21).]

299. A full chart of Zuckerberg’s sales of Facebook shares is attached as

Exhibit O to this Amended Complaint.

300. Defendant Sandberg sold Facebook shares on a routine basis in 2015.

But as can be seen in the chart below, her sales accelerated dramatically after the first reports of the Cambridge Analytica scandal broke on December 11, 2015.

Despite Facebook’s assurance that it had a policy against granting access to user data to third parties, Sandberg possessed MNPI that the representation was false.

Based on this information, she increased both the amount and frequency of her sales starting in February 2016.

301. After suspending sales between June 2017 and early 2018, Sandberg again sold shares starting on February 14, 2018—two days after Zuckerberg accelerated his sales and several weeks before the public learned the full extent of the Cambridge Analytica scandal. In total, Sandberg improperly sold more than

4.66 million shares (netting more than $495 million in proceeds):


[Chart: Sheryl Sandberg Insider Sales, 1/1/2015 - 7/26/2018 (number of shares sold over time), showing three clusters of sales: 107,766 shares (value $9,379,692.68); 548,100 shares (value $100,173,598.53); and 4,012,362 shares (value $385,599,365.76).]

302. A full chart of Sandberg’s sales of Facebook shares is attached as

Exhibit P to this Amended Complaint.

303. Defendant Koum sold more than 60.4 million shares of Facebook based on MNPI he obtained while a Director of the Company—almost twice as many as Defendant Zuckerberg—netting him more than $7.9 billion:


[Chart: Jan Koum Insider Sales, 1/1/2015 - 7/26/2018 (number of shares sold over time).]

304. A full chart of Koum’s sales of Facebook shares is attached as Exhibit

Q to this Amended Complaint.

305. At the time of these stock transactions, each of the Insider Trading

Individual Defendants knew about or recklessly disregarded MNPI concerning (1) the Cambridge Analytica breach, (2) Facebook’s misleading data sharing practices

(including with alleged Chinese state actors), (3) violations of user privacy and data security laws, and (4) other damages to Facebook caused by Individual


Defendants’ actions (or conscious inaction) in connection with the practices described above.

306. Each Insider Trading Individual Defendant failed to inform users and the general public of this information, as well as other materially relevant information related to Facebook’s data protection deficiencies, prior to selling these shares of

Company stock.

307. All Insider Trading Individual Defendants knew or recklessly disregarded that certain relevant facts were necessary to make the Company’s statements truthful and not misleading but were not disclosed by the Company.

While material facts were concealed from Facebook shareholders and the public, the Insider Trading Individual Defendants sold or otherwise disposed of Company stock on the basis of that information, thereby breaching their fiduciary duties.

J. Individual Defendants Consciously Disregarded Their Duties

308. As set forth above, Individual Defendants had at all relevant times duties and obligations as members of the Board and/or officers of Facebook to oversee Facebook’s legal compliance, risk exposure, corporate governance, and other critical aspects of the Company’s business and operations. In addition,

Facebook had legal obligations to protect user privacy and secure its data.


309. Individual Defendants knew from the incidents described above, the FTC regulatory investigation, and well-documented internal and external reports that:

(a) third parties had illegally accessed more data than what was consented to or allowed contractually; (b) Facebook had no enforcement mechanism in place to monitor the collection or use of user data; and (c) Facebook users did not understand or appreciate the number of developers accessing their data, the scope of the data being collected (i.e., their friends’ data and private messages between users), or how their data would be used.

310. Individual Defendants consciously opted out of effective monitoring of data mining activities of third-party app developers and applied lax monitoring policies. They also pursued broad, unclear user permissions that failed to give users adequate notice of the scope and use of the data being collected.

311. Individual Defendants adopted a culture of “move fast and break things.” Indeed, Defendant Desmond-Hellmann acknowledged this as Facebook’s decade-old motto in speeches she delivered in Seattle and at Cambridge University in the United Kingdom in June 2017.

312. Put differently, Zuckerberg made a strategic choice to not prioritize user privacy because it would have impeded Facebook’s growth. The other

Individual Defendants knew about, and acquiesced to, Zuckerberg’s strategy. The


unfettered mining of Facebook’s user data by third-party developers was

Facebook’s actual business model, in conflict with its representations to its users, the public, and governments around the world.

313. Individual Defendants believed that implementing a meaningful infrastructure with checks, controls, and oversight of third parties’ access to and use of Facebook user data would be an impediment to Zuckerberg’s vision for the

Company.

314. Instead, Individual Defendants maintained business practices that allowed third-party app developers to obtain massive amounts of Facebook user information without verifying the nature and extent of its use. Third parties paid for access to this data. While Facebook had written policies governing that access, it did not enforce the Company’s policies or otherwise monitor third parties’ compliance with those policies.

315. Individual Defendants knew and consciously disregarded the fact that

Facebook was violating various laws and consent decrees (including the FTC

Consent Decree), as well as its own policies and the Facebook user agreements by giving app developers nearly unfettered access coupled with no monitoring or oversight.


316. Even though Individual Defendants knew of this type of illegal data collection and data misuse by third party app developers, like Cambridge

Analytica, as early as 2015, Individual Defendants took minimal action to strengthen Facebook’s data security.

317. Individual Defendants instead enforced Company policy with respect to third party developers, like Cambridge Analytica, on an ad hoc basis. This is to say, the Company only sought to enforce its policies where it was notified of a breach by an outside party. Individual Defendants took no proactive measures to ensure that third parties’ access to and use of user data complied with Facebook’s terms.

318. For this reason, after learning that 50 million user accounts had been misappropriated, Individual Defendants engaged in an active cover-up and consciously decided not to notify any of the users whose private data had been improperly harvested (despite Facebook’s Policies described herein), or notify the

FTC (despite Facebook’s audit and reporting obligations).

319. Now that the Cambridge Analytica breach is public, Individual

Defendants are contemplating enforcement protocols as well as strategies for retrieving illegally harvested data from Cambridge Analytica and the thousands of other developers who had open access for over a decade. Only now, with the world


watching, will Individual Defendants identify and inform impacted Facebook users and audit third party developers.

320. Individual Defendants should be found personally liable for breaching their fiduciary duties given: (a) the gravity, frequency, and scope of the repeated data breaches (Cambridge Analytica and many others); (b) the number of users impacted (likely all individuals who have had a Facebook account at any time since 2007); (c) the lack of sufficient infrastructure to monitor and enforce

Facebook’s policies and terms of its agreements with third-party developers; (d) the failure to sufficiently monitor and enforce Facebook’s compliance with its legal obligations for a period of ten years; (e) the numerous instances where Facebook failed to comply with laws and its obligations to protect data and how it was used; and (f) the failure to provide users meaningful disclosures so that they could provide informed consent.

K. Defendant PwC Aided and Abetted Individual Defendants’ Breaches of Fiduciary Duties

321. In 2011, a coalition of consumer organizations (led by the Electronic Privacy Information Center, or “EPIC”) filed an FTC complaint under Section 5 of the FTC Act. The complaint alleged that Facebook had changed user privacy settings and disclosed the personal data of users to third parties without user consent. Facebook was caught overriding the users’ privacy settings to reveal


personal information and to disclose, for commercial benefit, user data—and the personal data of friends and family members—to third parties without their knowledge or affirmative consent.

322. The FTC confirmed these allegations, which led to the Consent

Order discussed in more detail above.

323. Under the 2011 Consent Order, the FTC stated that Facebook is:

• “barred from making misrepresentations about the privacy or security of consumers’ personal information;”

• “required to obtain consumers’ affirmative express consent before enacting changes that override their privacy preferences;”

• “required to prevent anyone from accessing a user’s material more than 30 days after the user has deleted his or her account;”

• “required to establish and maintain a comprehensive privacy program designed to address privacy risks associated with the development and management of new and existing products and services, and to protect the privacy and confidentiality of consumers’ information;” and

• “required, within 180 days, and every two years after that for the next 20 years, to obtain independent, third-party audits certifying that it has a privacy program in place that meets or exceeds the requirements of the FTC order, and to ensure that the privacy of consumers’ information is protected”

324. The reporting requirements described in the last bullet point above are set out in more detail in the text of the Consent Order:

[The] Respondent [Facebook] shall, no later than the date of service of this order, establish and implement, and thereafter maintain, a


comprehensive privacy program that is reasonably designed to (1) address privacy risks related to the development and management of new and existing products and services for consumers, and (2) protect the privacy and confidentiality of covered information. Such program, the content and implementation of which must be documented in writing, shall contain controls and procedures appropriate to Respondent’s size and complexity, the nature and scope of Respondent’s activities, and the sensitivity of the covered information, including:

A. the designation of an employee or employees to coordinate and be responsible for the privacy program.

B. the identification of reasonably foreseeable, material risks, both internal and external, that could result in Respondent’s unauthorized collection, use, or disclosure of covered information and an assessment of the sufficiency of any safeguards in place to control these risks. At a minimum, this privacy risk assessment should include consideration of risks in each area of relevant operation, including, but not limited to: (1) employee training and management, including training on the requirements of this order, and (2) product design, development, and research.

C. the design and implementation of reasonable controls and procedures to address the risks identified through the privacy risk assessment, and regular testing or monitoring of the effectiveness of those controls and procedures.

D. the development and use of reasonable steps to select and retain service providers capable of appropriately protecting the privacy of covered information they receive from Respondent and requiring service providers, by contract, to implement and maintain appropriate privacy protections for such covered information.

E. the evaluation and adjustment of Respondent’s privacy program in light of the results of the testing and monitoring required by subpart C, any material changes to Respondent’s operations or business arrangements, or any other circumstances that Respondent knows or has


reason to know may have a material impact on the effectiveness of its privacy program.

325. Moreover, the Consent Order required Facebook to “obtain initial and biennial assessments and reports (“Assessments”) from a qualified, objective, independent third-party professional, who uses procedures and standards generally accepted in the profession.” Each Assessment is required to:

set forth the specific privacy controls that Respondent has implemented and maintained during the reporting period;

explain how such privacy controls are appropriate to Respondent’s size and complexity, the nature and scope of Respondent’s activities, and the sensitivity of the covered information;

explain how the privacy controls that have been implemented meet or exceed the protections required by Part IV of this order; and

certify that the privacy controls are operating with sufficient effectiveness to provide reasonable assurance to protect the privacy of covered information and that the controls have so operated throughout the reporting period.

326. The Assessment was required to be a rigorous audit, with the FTC’s

Associate Director, Enforcement Division, James A. Kohm saying, for example:

“The auditor needs to be someone who is objective and independent. We don’t want someone who is going to just rubber stamp their procedures.”

327. Facebook retained Defendant PwC as this “independent” auditor to conduct the required privacy audits under the Consent Decree. To date, PwC has


produced three reports: the Initial Assessment Report for the period August 15,

2012 to February 11, 2013, attached as Exhibit R (the “2013 Report”); the Biennial

Report for the period February 12, 2013 to February 11, 2015, attached as Exhibit

S (the “2015 Report”); and the Biennial Report for the period February 12, 2015 to

February 11, 2017, attached as Exhibit J (the “2017 Report”).

328. In each report, PwC acknowledged the high standards to which it would be held:

“Part V of the Order requires that the Assessments be performed by a qualified, objective, independent third-party professional, who uses procedures and standards generally accepted in the profession. This report was issued by PwC under professional standards which meet these requirements.

“As a public accounting firm, PwC must comply with the public accounting profession’s technical and ethical standards, which are enforced through various mechanisms created by the American Institute of Certified Public Accountants (“AICPA”). Membership in the AICPA requires adherence to the Institute’s Code of Professional Conduct. The AICPA’s Code of Professional Conduct and its enforcement are designed to ensure that CPAs who are members of the AICPA accept and achieve a high level of responsibility to the public, clients, and colleagues. The AICPA Professional Standards provide the discipline and rigor required to ensure engagements performed by CPAs consistently follow specific General Standards, Standards of Fieldwork, and Standards of Reporting (“Standards”).

“In order to accept and perform this FTC assessment (“engagement”), the Standards state that PwC, as a practitioner, must meet specific requirements, such as the following:


General Standards:

Have reason to believe that the subject matter is capable of evaluation against criteria that are suitable and available to users. Suitable criteria must be free from bias (objective), permit reasonably consistent measurements, qualitative or quantitative, of subject matter (measurable), be sufficiently complete so that those relevant factors that would alter a conclusion about subject matter are not omitted (complete), and be relevant to the subject matter;

Have adequate technical training and proficiency to perform the engagement;

Have adequate knowledge of the subject matter; and

Exercise due professional care in planning and performance of the engagement and the preparation of the report.

Standards of Fieldwork:

Adequately plan the work and properly supervise any assistants; and Obtain sufficient evidence to provide a reasonable basis for the conclusion that is expressed in the report.

Standards of Reporting:

Identify the assertion being reported on in the report; and State the practitioner’s conclusion about the assertion in relation to the criteria.

In performing this assessment, PwC complied with all of these Standards.”

329. PwC did not, however, follow the AICPA Standards. In privacy audits, AICPA requires at a minimum the use of “Generally Accepted Privacy Principles” (“GAPP”),159 but GAPP was not followed in the Facebook privacy audits. As Facebook management admitted, Facebook merely “leveraged the

Generally Accepted Privacy Principles (“GAPP”) framework, set forth by the

American Institute of Certified Public Accountants (“AICPA”) and Canadian

Institute of Chartered Accountants (“CICA”), to define company-specific criteria for the foundation of the Facebook Privacy Program.”

330. In short, Facebook designed its own privacy program that was merely “inspired” by GAPP. PwC knew that Facebook’s program was not fully GAPP-compliant because PwC included Facebook’s language, quoted above, in the Reports.

331. More importantly, even if Facebook’s privacy program was sufficiently robust by GAPP standards in design, PwC knowingly failed to independently test the program for compliance, despite the plain language of the

2011 Consent Order and the expectations of the FTC that the privacy audit would be rigorous and independent.

332. Although not explicitly stated in the Reports, AICPA requires privacy audits to comply with the rules for a “Service Organization Controls” (or “SOC”)

Report, and to produce a SOC 2 Report. While a SOC 2 audit can be done by

“attestation” (as was done here), there are two types of SOC 2 audits—a Type I

159 Not to be confused with GAAP or Generally Accepted Accounting Principles.

audit and a Type II audit, the latter being a more rigorous and independent evaluation. The 2011 Consent Order used the term “Assessment,” but it was clearly understood that the Assessment here required a full Type II SOC 2 audit— and indeed, PwC quotes verbatim from the rules governing a Type II audit. It then in fact conducted a Type II audit (but with defects as noted below), and is held to the standards governing a Type II audit.

333. A Type I audit is merely an expression of opinion about whether the control principles have been effectively designed to meet the privacy requirements at issue. A Type II audit, on the other hand, not only certifies that the controls are designed appropriately, but also tests “operating effectiveness.” PwC claimed in all three reports that it tested operating effectiveness—but it knowingly omitted the independent testing required by AICPA.

334. Type II audits involve four techniques: (a) inquiry; (b) observation;

(c) inspection; and (d) re-performance. The first three of these techniques do not independently verify compliance—they simply attest that the policies are appropriate in design; to test compliance, they often rely instead on conclusory hearsay, formally known as “management assertions.” These assessments employ circular logic, forming conclusions such as “Management


asserts it has a reasonable privacy program. Based on management’s assertion, we certify that the company has a reasonable privacy program.”

335. The fourth and final technique (“re-performance”), however, is the truly important one: it is the final step, where an auditor independently recreates a process to verify that it is operating effectively. The PwC Reports identify the first three techniques and quote extensively from AICPA materials to describe each technique as used in a Type II SOC 2 Report. But the PwC Reports omitted any description of the fourth technique—not only did PwC fail to conduct any “re-performance” testing, it deleted the technique from the list of test procedures reported to the FTC.

336. PwC’s decision not to employ independent re-performance testing was a clear breach of the standards PwC admits it is bound by. The Consent Order required the auditor to “certify that the privacy controls are operating with sufficient effectiveness to provide reasonable assurance to protect the privacy of covered information and that the controls have so operated throughout the reporting period.”

337. PwC’s failure to employ re-performance testing is particularly shocking in the 2017 Report because PwC had actual knowledge that Facebook’s controls had failed during the reporting period (user data was in fact unlawfully


shared with third parties in violation of the privacy controls). Initial news of the

Cambridge Analytica scandal broke publicly on December 11, 2015, mid-way through the February 2015 to February 2017 reporting period. Defendant

Zuckerberg admitted that he knew about the data exfiltration even prior to the public report. And yet PwC assessed Facebook’s privacy program and somehow found the Company’s internal controls were effective to detect and prevent similar wrongdoing. The 2017 Report never even mentions Cambridge Analytica.160

338. Defendant Sandberg admitted in a May 30, 2018 interview with Recode Media that Facebook had not audited Cambridge Analytica to ensure it had actually deleted the data. “Looking back, we definitely wish we had put more controls in place. We got legal certification that Cambridge Analytica didn’t have the data, we didn’t audit them,” she said.

339. PwC’s failure to implement a full Type II audit is especially indefensible in light of the genesis of the Consent Order—Facebook had a robust privacy policy in place, but secretly had been overriding the user controls and sharing data without consent anyway. Independent testing should have been conducted.

160 See generally Tony Romm, “Facebook’s Handpicked Watchdogs Gave It High Marks for Privacy Even as the Tech Giant Lost Control of Users’ Data,” The Washington Post, dated 4/24/2018.

340. Had PwC actually undertaken its role as “independent” auditor, it would have learned that third party app developers had access to Facebook’s user data throughout 2015, 2016, 2017 and even into 2018, in violation of the very policies that PwC certified.

L. Defendants’ Misconduct Has Harmed Facebook

341. The recent revelations regarding Facebook’s actual practices with respect to user privacy and data security have severely damaged the Company’s reputation and imposed significant costs on it, including government and regulatory scrutiny, loss of user engagement, and exposure to consumer and shareholder class litigation. Facebook lost an estimated $100 billion in market value following news reports about Cambridge Analytica and the material weaknesses in Facebook’s data security systems and controls.

342. In the days after the breach was publicly revealed, the US, UK, EU, and Canada initiated investigations into Facebook’s conduct. On March 26, 2018, Pennsylvania Attorney General Josh Shapiro (“A.G. Shapiro”), backed by a coalition of 37 other State Attorneys General, sent a letter to Defendant Zuckerberg demanding answers about the Company’s business practices and privacy


protections.161 Pennsylvania A.G. Shapiro stated: “[b]usinesses like Facebook must comply with the law when it comes to how they use their customers’ personal data. . . . State Attorneys General have an important role to play in holding them accountable. . . .”162

343. On March 20, 2018, a committee in the British Parliament sent a letter to Zuckerberg asking him to appear before the panel to answer questions related to

Cambridge Analytica. The President of the EU Parliament also requested an appearance by Zuckerberg. “The committee has repeatedly asked Facebook about how companies acquire and hold on to user data from their site, and in particular about whether data had been taken without their consent,” wrote Damian Collins, chairman of the British committee. “Your officials’ answers have consistently understated this risk, and have been misleading to the committee.”

344. On March 19, 2018, Bloomberg reported that the FTC is investigating

Facebook’s use of personal user data, specifically “whether Facebook violated terms of a 2011 consent decree of its handling of user data that was transferred to

Cambridge Analytica without [user] knowledge.”

161 https://www.cbsnews.com/news/cambridge-analytica-state-attorneys-general-send-letter-to-facebook-ceo-mark-zuckerberg/.
162 Id.

345. Jessica Rich, who served as director of the FTC’s Bureau of Consumer Protection, said, “[d]epending on how all the facts shake out, Facebook’s actions could violate any or all of these provisions, to the tune of many millions of dollars in penalties. They could also constitute violations of both U.S. and EU laws,”163 adding, “Facebook can look forward to multiple investigations and potentially a whole lot of liability here.”164 Evidently, Facebook failed to adhere to a key term in the Consent Decree—that it “get user consent for certain changes to privacy settings as part of a settlement of federal charges that it deceived consumers and forced them to share more personal information than they intended.” If the FTC finds that Facebook violated terms of the Consent Decree, the FTC can fine Facebook more than “$40,000 a day per violation,” or trillions of dollars in fines given the number of user accounts impacted.
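The order of magnitude of the “trillions of dollars” figure follows from simple arithmetic. The sketch below is illustrative, not an FTC finding: it assumes the fine accrues at $40,000 per day per violation and, drawing on the roughly 50 million misappropriated accounts alleged in paragraph 318, treats each affected account as a separate violation:

```python
# Illustrative penalty arithmetic for paragraph 345. Assumptions (labeled,
# not findings): $40,000 per day per violation, and each of the ~50 million
# misappropriated accounts (paragraph 318) counted as a separate violation.
FINE_PER_DAY_PER_VIOLATION = 40_000   # dollars, per the quoted FTC figure
AFFECTED_ACCOUNTS = 50_000_000        # assumption drawn from paragraph 318

daily_exposure = FINE_PER_DAY_PER_VIOLATION * AFFECTED_ACCOUNTS
# 40,000 * 50,000,000 = 2,000,000,000,000, i.e. roughly $2 trillion per day
assert daily_exposure == 2_000_000_000_000
```

On this illustrative reading, a single day of continuing violations would already run to trillions of dollars, which is the sense in which the complaint’s figure should be understood.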

346. An FTC spokeswoman said in a statement on March 20, 2018: “We are aware of the issues that have been raised but cannot comment on whether we are investigating. We take any allegations of violations of our consent decrees very seriously.”

163 “Facebook may have violated FTC privacy deal, say former federal officials, triggering risk of massive fines,” by Craig Timberg and Tony Romm, The Washington Post, March 2018; https://www.washingtonpost.com/news/the-switch/up/2018/03/18/facebook-may-have-violated-the-privacydeal-say-former-federal-officials-triggering-risk-of-massive-fines/?tutm_term=.abaaacb4b9c9.
164 Id.


347. In addition to government investigations and regulatory problems,

Individual Defendants’ misconduct has harmed Facebook’s reputation and caused it to lose advertisers and users worldwide.

348. The illegal practices and Individual Defendants’ gross failures to timely address, remedy, or disclose them severely damaged Facebook’s reputation within the business community and in the capital markets, as evidenced by, for example, the more than $120 billion loss in market capitalization after the

Cambridge Analytica incident and Individual Defendants’ related conduct were revealed.

349. Defendants’ conduct has harmed Facebook by impairing its ability to attract customers and investors. Users are more likely to delete their Facebook accounts or less likely to sign up for an account knowing their personal data will be compromised. Investors likely will be skeptical of Facebook due to its lack of internal controls and failure to timely disclose material information.

350. As a direct and proximate result of Individual Defendants’ actions,

Facebook has expended and will continue to expend significant funds, including costs incurred in defending against, and the potential settlement of, civil and criminal legal proceedings brought against the Company related to the unauthorized sharing and use of users’ personal information; and costs incurred


from the substantial compensation and benefits paid to Individual Defendants, who are responsible for the scheme.

351. On July 10, 2018, the U.K. Information Commissioner’s Office

(“ICO”) fined Facebook the highest allowable fine of £500,000 over breaches of the U.K. Data Protection Act in connection with the Cambridge Analytica scandal.

ICO found that Facebook contravened the law by “failing to safeguard people’s information” and “failed to be transparent about how people’s data was harvested by others.”

VIII. DERIVATIVE ALLEGATIONS

352. Plaintiffs bring this action derivatively in the right and for the benefit of Facebook to redress the breaches of fiduciary duty and other violations of law by Individual Defendants as alleged herein. Plaintiffs are current stockholders of the Company and were stockholders during the misconduct alleged herein, holding Facebook stock continuously. Plaintiffs have committed to and will continue to hold Facebook stock through the resolution of this action.

353. Plaintiffs will adequately and fairly represent the interests of

Facebook and its stockholders in enforcing and prosecuting the Company’s rights, and Plaintiffs have retained counsel experienced in prosecuting this type of derivative action.


A. Demand on Facebook’s Board Would Have Been Futile

354. At the time this action was originally filed on April 25, 2018,

Facebook’s Board consisted of nine members: Zuckerberg, Sandberg, Andreessen,

Thiel, Hastings, Bowles, Koum, Desmond-Hellmann, and Chenault. Today, the

Board still has nine members, but Koum has been replaced by Zients.

355. Plaintiff Sbriglio did not make a demand on the Board to institute this action because such demand would have been futile.

B. The Board Lacks Independence

356. This Board is incapable of making an independent and disinterested decision to institute and vigorously prosecute this action. With the exception of Chenault and Zients, the members of the Board are Individual Defendants in this and other actions and have disabling personal conflicts of interest that render them incapable of exercising independent judgment.

357. By late 2017, Zuckerberg held only 16% of the Company’s total outstanding shares, yet he controlled 53% of the shareholder vote. As a result, Zuckerberg exercises absolute control over the Company, its directors and officers, and all key decisions. As such, the Company concedes in its public filings that it is a “Controlled Company”165 under the corporate governance rules of the NASDAQ Stock Market.166

358. According to the 2017 Proxy:

Because Mr. Zuckerberg controls a majority of our outstanding voting power, we are a “controlled company” under the corporate governance rules of the NASDAQ Stock Market LLC (NASDAQ). Therefore, we are not required to have a majority of our board of directors be independent, nor are we required to have a compensation committee or an independent nominating function. In light of our status as a controlled company, our board of directors has determined not to have an independent nominating function and to have the full board of directors be directly responsible for nominating members of our board.

359. As a result of his voting power, Zuckerberg has the power and control to unseat any Defendant from the Board.

360. Zuckerberg’s control over the Board was also made apparent from the infamous deal to purchase Instagram for $1 billion—unilaterally reached by Zuckerberg without the input of the other Board members. Indeed, the deal had already been largely finalized when it went to Facebook’s Board for approval.

165 Defined as “any company of which more than 50% of the voting power for the election of directors is held by an individual. . . .” See http://nasdaq.cchwallstreet.com/nasdaq/main/nasdaq-equityrules/chp_1_1/chp_1_1_4/chp_1_1_4_3/chp_1_1_4_3_8/default.asp.
166 See Facebook’s Definitive Proxy Statement, dated April 14, 2017 (“2017 Proxy”), attached as Ex. T.


361. Andreessen, Thiel, Hastings, Bowles, Koum, and Desmond-Hellmann have each proven their loyalty to Zuckerberg. On April 27, 2016, Facebook announced that the Board had approved the reclassification of Facebook shares, creating a non-voting class of Facebook stock, the purpose of which was to give Zuckerberg the ability to give away a substantial portion of his Facebook shares while continuing to maintain control over the Company by eliminating other shareholders’ voting rights.

362. While serving on a special committee appointed to review the reclassification of Facebook stock, Andreessen coached Zuckerberg through the process, pushing the proposed plan through the special committee and eventually through the rest of the Board. As a member of the special committee, Andreessen was supposed to evaluate the proposal independently but utterly failed in his duties to the Company and its shareholders.

363. Andreessen, Thiel, Hastings, Bowles, Koum, and Desmond-Hellmann’s willingness to abdicate their duties and approve a plan that would have ensured Zuckerberg’s perpetual control over the Company while stripping away shareholder voting rights strongly demonstrates that the Board was loyal to Zuckerberg and not to the Company and its shareholders.


364. Andreessen has also been described as Zuckerberg’s best friend and was instrumental in convincing Facebook’s special committee to issue no-vote shares that would have allowed Zuckerberg to sell his shares while retaining voting control over Facebook.

365. Moreover, Andreessen, Thiel, Hastings, and Bowles lacked independence because they were beholden to Zuckerberg for the prestige of being associated with Facebook and because each has been substantially enriched by their business relationship with Zuckerberg through investment opportunities with and outside of Facebook.

366. Individual Defendants Andreessen and Thiel have also demonstrated a personal bias in favor of keeping founders in control of the companies they fund.

367. Andreessen is one of the founders and principals of Andreessen Horowitz, a venture capital firm that provides seed, venture, and growth-stage funding to the “best new technology companies.” One of Andreessen Horowitz’s philosophies is to “enable founders to run their own companies” without interference from financial backers.

368. Andreessen also lacks independence from Zuckerberg as a result of several highly lucrative deals that Andreessen Horowitz has made with Zuckerberg in the past few years, including Facebook’s purchase of two of its portfolio companies, Instagram and Oculus VR. Andreessen turned his firm’s $250,000 investment in Instagram into $78 million when the $1 billion acquisition by Facebook closed.

369. Andreessen would not have even been able to invest in Oculus VR without Zuckerberg. Andreessen had declined to invest in the company previously, but desperately wanted to invest by the fall of 2013, according to an October 2015 Vanity Fair article. When Oculus VR’s CEO seemed reluctant to allow the investment, Andreessen reportedly had Zuckerberg talk to the CEO about Andreessen. Andreessen Horowitz got the deal, and Andreessen became one of four board members. Not long after, Zuckerberg offered $2 billion for Facebook to acquire Oculus VR.

370. Andreessen Horowitz’s access to the most promising and lucrative investments relies heavily on Andreessen’s relationship with Zuckerberg and Facebook. In a May 18, 2015 New Yorker article titled “Tomorrow’s Advance Man,” Andreessen reportedly explained: “Deal flow is everything. If you’re in a second-tier firm, you never get a chance at that great company.” Andreessen Horowitz saw its biggest successes after “logo shopping” to add Facebook to the firm’s portfolio in 2010. Within two years of that investment, “Andreessen Horowitz was the talk of the town.”


371. Defendant Thiel was one of the early investors in Facebook. He co-founded PayPal, Inc., and in 2005 he founded and has since been a partner of The Founders Fund—a venture capital firm that strives to keep founders in control of the companies they have created. The Founders Fund is marketed on the principle that company founders should have long-term control of the companies they create. In fact, The Founders Fund’s website touts Facebook as a primary example of that maxim, stating that “we have often tried to ensure that founders can continue to run their businesses through voting control mechanisms, as Peter Thiel did with Mark Zuckerberg and Facebook.”

372. Thiel, like Andreessen, has greatly benefited from his relationship with Zuckerberg and his seat on the Facebook Board. The Founders Fund gets “good deal flow” from this high-profile association.

373. Moreover, both Thiel and Andreessen, through their close ties to Palantir, which reportedly aided Cambridge Analytica’s exploitation of Facebook users’ data, have a direct conflict of interest in the instant litigation.

374. Defendant Bowles has a history of bowing to the financial interests of CEOs. Over the last decade, Bowles has collected tens of millions of dollars in director compensation for his services on Facebook and other high-profile corporate boards. During his tenures on these various boards, the companies underperformed against the market while he consistently approved lavish (and excessive) payouts to CEOs. For example, in 2012, he and the other board members increased the Norfolk CEO’s compensation by 16% while the stock fell below the S&P 500 average. During his time on the Cousins board, the company underperformed, and yet Bowles and the other directors increased the CEO’s pay by 73% in 2011 and 276% in 2012. Similarly, during Bowles’ tenure on Morgan Stanley’s board, the CEO’s pay went from less than $1.3 million annually in 2008 and 2009 to $38.8 million in 2010 through 2012, though Morgan Stanley’s stock price was in decline. Bowles personally received over $3.8 million from Morgan Stanley for his services as a director from 2007 to 2017. Bowles was also a member of the General Motors board from June 2005 until April 2009, when the auto giant filed for bankruptcy, and served on the board of the embattled doughnut maker Krispy Kreme. Ironically, during his tenure on many of these boards, he served as the Co-Chair of the National Commission on Fiscal Responsibility and Reform and preached against reckless government spending. Yet he has consistently voted in favor of excessive compensation increases for CEOs despite poor performance as measured by the market.

375. Defendant Hastings also lacks independence from Zuckerberg. Defendant Hastings is a co-founder of Netflix and currently serves as its CEO and Chairman of its Board of Directors. In addition to being sympathetic to Zuckerberg’s desire to maintain founder’s control due to his own founder role at Netflix, Hastings has every incentive to cater to Zuckerberg’s desires at Facebook due to Facebook’s business relationship with Netflix.

376. Through the “Friends and Community” initiative launched in March 2013, Netflix enjoyed valuable word-of-mouth marketing because the initiative allowed Facebook users to share data about their Netflix viewing habits with their Facebook friends. Hastings would not want to risk losing this relationship, as the initiative’s launch caused Netflix’s share price to climb 6%.

377. Defendant Desmond-Hellmann lacks independence from Defendant Zuckerberg due to their close business and personal relationships. As Lead Independent Director, Desmond-Hellmann serves as a liaison between Zuckerberg and the Board’s independent directors.

C. The Majority of the Board is Subject to Substantial Risk of Personal Liability

378. Demand is also excused because Individual Defendants face a substantial likelihood of liability for the claims alleged against them in this Complaint.

379. The facts detailed in this Complaint demonstrate that they: (1) affirmatively adopted, implemented, and condoned a business strategy based on deliberate and widespread violations of policies, applicable law, and the Consent Decree, which is not a legally protected business decision and cannot be considered a valid exercise of business judgment; and/or (2) consciously disregarded numerous red flags of misconduct throughout the relevant period, subjecting them to a substantial likelihood of liability as to Plaintiffs’ claims against them in this action. Accordingly, demand on the Board is excused.

380. Individual Defendants were aware of the weaknesses of Facebook’s user privacy and data security controls and failed to address and repair them. They also disregarded their affirmative obligations to oversee Facebook’s compliance with the Consent Decree.

381. Section VII of the Consent Decree states that “[Facebook] shall deliver a copy of this order to all current and future principals, officers, directors, and managers; . . . .” Every Individual Defendant was delivered a copy of the Consent Order on September 12, 2012. Thus, each of the Individual Defendants had knowledge of and understood the issues addressed therein and the Company’s affirmative obligations under the Decree. Yet, Individual Defendants consciously disregarded their duties set forth in the Consent Decree and concealed the fact that Facebook had not complied with it.


382. The Board’s duty was heightened by the fact that the FTC imposed affirmative obligations with respect to the Company’s user privacy practices in the Consent Order.

383. The Board consciously disregarded the Consent Decree by permitting Zuckerberg, Sandberg, and other Facebook executives to grant special dispensation to numerous preferred app developers and 52 of the largest companies in the world. These developers and companies were granted full access to nonpublic user information without (1) a clear and prominent disclosure to the users that was separate and apart from any privacy policy, data use policy, or statement of rights and responsibilities; and (2) affirmative express consent. The Board knew that Facebook had entered into the various partnership agreements and caused Facebook to violate its elevated legal duties under the Consent Decree. Disclosures in policies and statements that sought to vaguely capture broad categories of permissions were inconsistent with the Consent Decree, which rejected the use of implicit consent.

384. At a minimum, Zuckerberg, Sandberg, Andreessen, and Thiel (if not the entire Board) knew that Cambridge Analytica possessed and misused nonpublic user information as early as 2015, when The Guardian exposed the beginnings of the scandal. Andreessen and Thiel likely knew much earlier given their connection to Palantir, which reportedly aided Cambridge Analytica by working with Facebook user data to build the psychological models. But for the millions of users’ data from Facebook, Cambridge Analytica could not have built its models. By 2014, Facebook engineers were assisting Cambridge Analytica, according to Wylie.

385. Meanwhile, Zuckerberg testified that he did not know the scope of the Cambridge Analytica privacy breach until 2018, but that was because he and the other Individual Defendants disregarded their duty to investigate the issue. The message from the top was “the less you know, the better.”

386. The Board consciously disregarded its responsibilities under the Consent Decree to proactively investigate this specific privacy breach and to inform the FTC and impacted users of the breach. This intentional violation of their legal obligations rises to the level of bad faith and constitutes the type of violation of the duty of loyalty owed to Facebook and its shareholders that can be neither exculpated nor subject to any safe harbor by Individual Defendants.


387. Significantly, Facebook’s business practices stood in direct contradiction to its privacy obligations because, as proclaimed by Vice President Boz, “questionable” conduct is justified in the name of “growth.”

388. The Board failed to fulfill its duties, and this failure was even more egregious in light of the many warnings and actions by insiders who, during the relevant period, told Facebook executives that Facebook’s systems were not sufficient to address the misconduct at issue in this Complaint.

389. With respect to the failings of the Company, Parakilas testified that:

I guess my concern is mostly around the enforcement front and the auditing front. If they are allowing data to be passed to third parties - sensitive, personally identifiable data - they must have much stronger enforcement to ensure that developers do not use that data. I think it must be much more proactive about suing developers if they are going to continue to allow this to happen and it must be much more proactive about auditing and using procedures to actually go in to check and inspect what developers are doing with that data.167

390. Given the Board’s awareness and deliberate concealment of the Cambridge Analytica breach from the public, and Facebook’s failure to notify affected users in accordance with applicable statutes—wrongful actions that resulted in the retention and unauthorized use of millions of users’ personal information for several years—it is clear the Board either deliberately or recklessly failed to take remedial action to stop the practices that allowed the illicit scheme to continue.

167 Parakilas Tr. at Q1212.

391. For these reasons, the Board is incapable or unwilling to take the actions required to seek the relief requested in this Complaint.

392. Because a majority of the Board faces a substantial risk of liability, demand is futile.

IX. LEGAL COUNTS

COUNT I – BREACH OF FIDUCIARY DUTY

(AGAINST DEFENDANT ZUCKERBERG)

393. Plaintiffs incorporate by reference each and every allegation set forth above, as though fully set forth herein.

394. Defendant Zuckerberg is the Company’s controlling shareholder and, as such, owed and continues to owe the Company the highest obligation of due care, loyalty, and good faith.

395. Defendant Zuckerberg breached his fiduciary duties and abused his control over the Company when he allowed the Company to disregard the law and its legal obligations.

396. Zuckerberg failed to monitor and enforce Facebook’s compliance with the FTC Consent Decree.


397. Zuckerberg knew that Facebook lacked the infrastructure to monitor and enforce Facebook’s policies and terms of its agreements with developers, advertisers, and other third parties.

398. Zuckerberg knew that Facebook was not providing users meaningful disclosures so that they could provide informed consent on who was accessing their data and for what purpose.

399. Zuckerberg unlawfully concealed the Cambridge Analytica breach from impacted users and shareholders.

400. As a direct and proximate cause of Zuckerberg’s conscious inaction and failure to perform his fiduciary duties, the Company has sustained, and will continue to sustain, significant damages, both financially and to its reputation.

COUNT II – BREACH OF FIDUCIARY DUTY

(AGAINST ALL INDIVIDUAL DEFENDANTS OTHER THAN DEFENDANT ZUCKERBERG)

401. Plaintiffs incorporate by reference each and every allegation set forth above, as though fully set forth herein.

402. Individual Defendants owed and continue to owe fiduciary duties to the Company and its shareholders. By reason of their fiduciary relationships, Individual Defendants specifically owed and owe Plaintiffs and the Company the highest obligation of good faith and loyalty in the administration of the affairs of the Company, including, without limitation, the oversight of the Company’s compliance with laws, regulations, and the FTC Consent Decree.

403. Individual Defendants consciously breached their fiduciary duties and violated their corporate responsibilities by willfully abdicating their roles as fiduciaries. Individual Defendants failed to monitor the Company’s compliance with the law and the FTC Consent Decree, failed to monitor the Company’s compliance with its own terms of service, and failed to monitor the Company’s compliance with applicable laws regarding data privacy.

404. As a direct and proximate cause of Individual Defendants’ conscious failure to perform their fiduciary duties, the Company has sustained, and will continue to sustain, significant damages, both financially and to its corporate image and goodwill.

COUNT III – BREACH OF FIDUCIARY DUTY

(AGAINST INDIVIDUAL DEFENDANTS ZUCKERBERG AND SANDBERG)

405. Plaintiffs incorporate by reference each and every allegation set forth above, as though fully set forth herein.

406. Individual Defendants Zuckerberg and Sandberg owed the Company and its shareholders fiduciary duties by virtue of their positions as CEO and COO, respectively.


407. By reason of their fiduciary relationships, Zuckerberg and Sandberg specifically owed and continue to owe Plaintiffs and the Company the highest obligation of good faith and loyalty in the administration of the affairs of the Company, including, without limitation, the oversight of the Company’s compliance with laws, regulations, and the FTC Consent Decree.

408. Individual Defendants Zuckerberg and Sandberg consciously breached their fiduciary duties and violated their corporate responsibilities by willfully abdicating their roles as fiduciaries by: (1) misrepresenting the extent of the Company’s ability to protect nonpublic user information (and that of users’ friends) once that information was in the hands of third parties; (2) concealing the Cambridge Analytica privacy breach impacting 87 million users for over two years and consciously deciding not to identify impacted users or provide them notification of the privacy breach; (3) knowingly causing Facebook to violate parts of the FTC Consent Decree in bad faith; (4) intentionally disregarding Facebook executives who identified and sought to correct the weaknesses in the practices and systems that led to massive privacy breaches; (5) knowingly adopting business practices that disregarded user privacy rights and that were certain to violate the Company’s terms of service, the Consent Decree, and applicable laws regarding data privacy; (6) consciously deciding not to monitor or take necessary actions (such as regular audits) to ensure that outsiders to whom Individual Defendants granted significant access to nonpublic information were obtaining users’ express authorized consent and confirming that the use was proper and consistent with the consent granted; and (7) concealing and/or intentionally omitting discussion of a known material event, the Cambridge Analytica privacy breach, in Facebook’s public filings while knowing that this threat of a material privacy breach had actually materialized.

409. As a direct and proximate cause of Individual Defendants Zuckerberg and Sandberg’s breaches of their fiduciary duties, the Company has sustained, and will continue to sustain, significant damages, both financially and to its corporate reputation and goodwill.

COUNT IV – AIDING AND ABETTING

(AGAINST DEFENDANT PWC)

410. Plaintiffs reallege each allegation above, as though fully set forth herein.

411. Defendant PwC was aware of the Director Individual Defendants’ fiduciary duties to Facebook not to put self-interest and personal or other considerations ahead of the interests of its stockholders.


412. As stated more particularly herein, the knowing failure of Defendant PwC to properly employ appropriate privacy auditing standards violated its duties under the FTC Consent Decree and made Defendant PwC a knowing participant in, and aider and abettor of, the aforesaid numerous breaches of the fiduciary duties owed by the Directors.

413. The above breaches of the Directors’ fiduciary duties to shareholders were a substantial factor in causing the unprecedented loss in shareholder value and other injuries to the Company.

414. By aiding and abetting the Directors’ breaches of fiduciary duties, the aforesaid acts and omissions of Defendant PwC were a substantial factor in causing the unprecedented loss in shareholder value and other injuries to the Company.

415. Defendant PwC had knowledge of, provided substantial assistance in, and knowingly participated in these breaches of fiduciary duties.

416. As a direct and proximate result of the Director Individual Defendants’ and Defendant PwC’s acts and omissions, the Company has sustained damages, as alleged herein.


COUNT V – BREACH OF FIDUCIARY DUTY OF DISCLOSURE

(AGAINST INDIVIDUAL DEFENDANTS)

417. Plaintiffs reallege each allegation above, as though fully set forth herein.

418. The Individual Defendants signed (either directly or through a power of attorney) the 2016 10-K, 2017 10-K, 2018 10-K, 2017 Proxy, and 2018 Proxy.

419. Delaware law recognizes that Directors are subject to a fiduciary duty to disclose fully and fairly all material information within their control when seeking shareholder action through Proxy Materials.

420. The Proxy Materials include the 2017 Proxy (Ex. T) and the 2018 Proxy (Ex. N); the Proxy Materials and the relevant Form 10-Ks are incorporated herein by reference.

421. The Proxy Materials contained materially false statements and omitted to disclose material information that a reasonable investor would have viewed as having significantly altered the “total mix” of information available. Without materially truthful and complete Proxy Materials, shareholders were denied appropriate information upon which to make an informed choice when casting votes for Directors and the shareholder proposals discussed above.


422. As a direct and proximate cause of Individual Defendants’ breaches of their fiduciary duties, the Company has sustained, and will continue to sustain, significant damages, both financially and to its corporate reputation and goodwill.

COUNT VI – INSIDER TRADING

(AGAINST DEFENDANTS ZUCKERBERG, SANDBERG, AND KOUM)

423. Plaintiffs incorporate by reference and reallege each of the foregoing allegations as though fully set forth in this paragraph.

424. At the time of the stock sales set forth above, Insider Trading Individual Defendants Zuckerberg, Sandberg, and Koum were in possession of material, adverse, non-public information as alleged herein, and sold Facebook stock on the basis of such information.

425. The information described above was proprietary non-public information material to the Company’s financial condition and future business prospects. It was a proprietary asset belonging to the Company, which the Insider Trading Individual Defendants used for their own benefit when they sold Facebook stock.

426. At the time of their stock sales, Insider Trading Individual Defendants possessed information concerning the Company’s true third-party data sharing practices and the impact of such practices on the Company’s prospects. Insider Trading Individual Defendants’ sales of Facebook stock while in possession of this materially adverse, non-public information were a breach of their fiduciary duties of loyalty and good faith.

427. As the use of the Company’s proprietary information for their own gain constitutes a breach of Insider Trading Individual Defendants’ fiduciary duties, the Company is entitled to disgorgement of, and the imposition of a constructive trust upon, any illegal profits obtained thereby.

428. Plaintiffs have no adequate remedy at law.

COUNT VII - CONTRIBUTION OR INDEMNIFICATION

(AGAINST ALL INDIVIDUAL DEFENDANTS)

429. Plaintiffs incorporate by reference each and every allegation set forth above, as though fully set forth herein.

430. This claim is brought derivatively on behalf of the Company against Individual Defendants for contribution or indemnification.

431. Facebook is being investigated by the FTC to determine the extent to which Facebook violated its Consent Decree and is a named defendant in various putative consumer class actions centralized by the Judicial Panel on Multidistrict Litigation in the Northern District of California, as well as other related litigation throughout the world. The focus of the FTC investigation and the private consumer class actions is Facebook’s violations of its legal obligations related to user privacy, data security, and data use.

432. If (and when) the Company is found liable for violating the FTC’s Consent Decree and/or consumer laws, the Company’s liability will arise in whole or in part as a result of the intentional, knowing, or reckless acts or omissions of some or all of Individual Defendants as alleged herein.

433. The Company, therefore, is entitled to receive contribution and/or indemnification from Individual Defendants in connection with liabilities that will stem from the FTC investigation and the consumer class actions pending against the Company.

434. Accordingly, the Company is entitled to all appropriate contribution and/or indemnification from Individual Defendants.

PRAYER FOR RELIEF

WHEREFORE, Plaintiffs demand judgment as follows:

(a) Determining that this action is a proper derivative action maintainable under the law and demand was excused;

(b) Finding that Individual Defendants breached their fiduciary duties;

(c) Finding that Mark Zuckerberg breached his fiduciary duty as controlling shareholder;


(d) Finding that Mark Zuckerberg and Sheryl Sandberg breached their fiduciary duties as officers;

(e) Finding that PwC aided and abetted the Individual Defendants’ breaches of their fiduciary duties;

(f) Against all Individual Defendants and in favor of the Company for extraordinary equitable and injunctive relief as permitted by law and/or equity as follows:

(i) To order the Company to adopt a policy, and amend the bylaws as necessary, to require the Chair of the Board to be an independent member of the Board and the roles of Chair and CEO to be split (or in the alternative to require a shareholder vote of Class A shareholders on the split of Chairman and CEO roles);

(ii) To order the Company to take all practicable steps in its control toward initiating and adopting a recapitalization plan for all outstanding stock to have one vote per share, including efforts at the earliest practicable time toward encouragement and negotiation with Class B shareholders to request that they relinquish, for the common good of all shareholders, any preexisting disproportionate rights (or in the alternative to require a shareholder vote of Class A shareholders on the question of whether the Company should adopt a recapitalization plan); and

(iii) To order the Company to immediately establish an independent Risk Oversight Board Committee to be selected by a majority of the Class A shareholders.

(g) Directing the Company to take all necessary actions to reform and improve its corporate governance, internal controls, and Board oversight of user privacy, data security, and use of data, and its compliance with various legal obligations, including under the Facebook user agreement, the FTC Consent Decree, and any other court or administrative orders;

(h) Against all Individual Defendants and in favor of the Company for the amount of any and all damages sustained by the Company as a result of Individual Defendants’ breaches of their fiduciary duties;

(i) Requiring Individual Defendants to contribute towards any third-party liability or judgment against Facebook and/or indemnify Facebook for any losses, fines, and legal expenses that resulted, in whole or in part, from Individual Defendants’ misconduct;


(j) Requiring Defendants Koum, Zuckerberg, and Sandberg to disgorge all gains realized in the sales of Facebook securities based on material, non-public information obtained in their roles as Directors and/or Officers of the Company, and to make restitution to the Company;

(k) Against Defendant PwC, requiring it to contribute towards any award of contribution or indemnity to the Company for its role in aiding and abetting the Individual Defendants’ breaches of their fiduciary duties;

(l) Awarding Plaintiffs the costs and disbursements of this action, including reasonable attorneys’ fees and experts’ fees; and

(m) Granting such other relief as the Court deems just and proper.

Respectfully submitted:

Dated: August 7, 2018

/s/ Thaddeus J. Weaver
Thaddeus J. Weaver (Id. No. 2790)
DILWORTH PAXSON LLP
One Customs House
704 King Street
P.O. Box 1031
Wilmington, DE 19899-1031
(302) 571-8867 (telephone)
[email protected]

Counsel for Plaintiffs Karen Sbriglio and the Firemen’s Retirement System of St. Louis


Of Counsel:

Catherine Pratsinakis (Id. No. 4820)
Bryn M. McDonough (Admitted PHV)
DILWORTH PAXSON LLP
1500 Market Street, Suite 3500E
Philadelphia, PA 19102
(215) 575-7013 (telephone)
[email protected]

Frederic S. Fox (pro hac to be filed)
David A. Straite (Id. No. 5428)
Aaron Schwartz (pro hac to be filed)
KAPLAN FOX & KILSHEIMER LLP
850 Third Avenue
New York, NY 10022
(212) 687-1980
[email protected]
[email protected]
[email protected]

Counsel for Plaintiffs Karen Sbriglio and the Firemen’s Retirement System of St. Louis


CERTIFICATE OF SERVICE

I hereby certify that a true, correct and complete copy of the “First Amended Verified Stockholder Derivative Complaint” with Exhibits A through T, Verifications of Plaintiffs Karen Sbriglio and Firemen’s Retirement System of St. Louis, and a “Blacklined Version” pursuant to Ct. Ch. R. 15(aa) showing amendments, were served on August 7, 2018, on all Delaware Counsel of Record as identified below, via File & ServeXpress:

David E. Ross, Esquire (Id. No. 5228)
R. Garrett Rice, Esquire (Id. No. 6242)
Ross Aronstam & Moritz LLP
100 S. West Street, Suite 400
Wilmington, DE 19801

DILWORTH PAXSON LLP

By: /s/ Thaddeus J. Weaver
Thaddeus J. Weaver (Id. No. 2790)
One Customs House
704 King Street, Suite 500
P.O. Box 1031
Wilmington, DE 19899-1031
(302) 571-8867 (telephone)
(302) 655-1480 (facsimile)
[email protected]

Counsel for Plaintiffs Karen Sbriglio and Firemen’s Retirement System of St. Louis
