STANFORD INTERNET OBSERVATORY 2019-2021

The first two years.
Internet Observatory Cyber Policy Center

io.stanford.edu

CONTENTS

Two Year Review June 6, 2019 — June 6, 2021

LETTER FROM THE DIRECTORS

SPOTLIGHT: SIO BY THE NUMBERS

RESEARCH

SPOTLIGHT: POTEMKIN PAGES AND PERSONAS

SPOTLIGHT: PLATFORM TAKEDOWNS

TEACHING

SPOTLIGHT: TELLING CHINA'S STORY

POLICY

SPOTLIGHT: THE ELECTION INTEGRITY PARTNERSHIP

ACKNOWLEDGEMENTS

LETTER FROM THE DIRECTORS

Two years ago, we launched the Stanford Internet Observatory as a cross-disciplinary laboratory for the study of abuse in current information technologies, with a focus on the misuse of social media. The Observatory was created to learn about these abuses in real time and to translate our research discoveries into education for the next generation of engineers and entrepreneurs and into policy innovations for the public good. The term “Observatory” was not an accident: for centuries, physicists and astronomers have coordinated resources to build the massive technological infrastructure necessary to research the universe. The internet is similarly an ecosystem constantly in flux as new apps, emerging technologies, and new communities of users transform the space; researchers need innovative capabilities to research this new information frontier.

When we launched, we knew our work would be important because of the extent to which online activity increasingly shapes public perception of our society’s most important issues. We did not anticipate some of the specific forms this activity would take. The global pandemic moved human interaction from substantively online to near-completely online. As our team adapted to working from home, the spread of online information intensified: an organized marketing campaign to launch a conspiratorial video; the use of livestreams to push fear during the Black Lives Matter protests; global superpowers using health diplomacy as concerted soft power moves in the global south; and the 2020 US election, culminating in the unprecedented—although perhaps not unanticipated—Capitol insurrection on January 6, 2021.

We launched on June 6, 2019, with an initial team of three and have since grown to a full-time team of 10, working with 76 student research assistants over the past two years. SIO’s success relies on the tireless efforts of the students and staff whose work is highlighted in this report.

As we embark on our third year, we reflect deeply on our research and refine our path forward as a research center. In addition to highlighting the output of our team, this report details our focus areas and goals for the coming year.

We would like to extend our gratitude to our faculty leads Nate Persily and Dan Boneh at the Stanford Cyber Policy Center; Michael McFaul, the director of the Freeman Spogli Institute; and our generous supporters, including Craig Newmark Philanthropies, the William and Flora Hewlett Foundation, the Omidyar Network, the Charles Koch Foundation, and Felicis Ventures.

Alex Stamos, Director
Elena Cryst, Associate Director


building the infrastructure to study the internet

STANFORD INTERNET OBSERVATORY BY THE NUMBERS

76 STUDENT RESEARCH ASSISTANTS

479 STUDENTS ENROLLED IN SIO COURSES

9 COURSES TAUGHT:
TRUST & SAFETY ENGINEERING x3
ONLINE OPEN-SOURCE INVESTIGATIONS x3
HACK LAB x2
CURRENT TOPICS IN TECHNOLOGY PLATFORM POLICY x1

3 COLLABORATIVE RESEARCH INITIATIVES:
ATTRIBUTION.NEWS
ELECTION INTEGRITY PARTNERSHIP
VIRALITY PROJECT

5 END-TO-END ENCRYPTION WORKSHOPS
200+ WORKSHOP PARTICIPANTS


PUBLICATIONS

BLOG POSTS 58

REPORTS 33

ARTICLES AND OP-EDS 20

WORKING PAPERS UNDER REVIEW 2

PUBLISHED PEER-REVIEWED JOURNAL ARTICLES 1

RESEARCH

The Stanford Internet Observatory (SIO) started with intentionally broad research interests—including trust and safety and encryption policy—to see where the inquiry areas would grow. Our focus was on creating short-term policy briefs, blog posts, and media engagement that would allow our research to inform decision makers in real time. That flexibility enabled us to adapt our research and embrace new projects. Our technical team built tools to ingest, process, and analyze data from a dozen different social media platforms. These shared research tools have greatly reduced the manual effort necessary to understand and analyze online conversation trends, and have empowered individual social science researchers to pursue their own lines of inquiry without the need for specialized data science skills.

Our researchers—from students to postdocs, young professionals to established experts—leveraged this capacity to understand such topics as social media manipulation around the 2020 Taiwan election, the influence of Russian cyber mercenaries in Libya, and the dynamics and behavior of Parler’s first 13 million users. In addition, we spearheaded two inter-institutional collaborations to study impact around specific domestic US events: the Election Integrity Partnership and the Virality Project, both detailed in this report’s Research Spotlights, which also highlight our takedown analysis and three of our pivotal reports published in the past two years.

The pathologies of information ecosystems extend beyond mis- and disinformation, so we continue to reevaluate the scope of our investigation. In looking at our work from the past two years—over 100 blog posts, op-eds, reports, and papers—we have refined our research into four distinct but interrelated areas: Trust and Safety, Platform Policy, Information Interference, and Emerging Technology.

Trust and Safety

At the core of the Stanford Internet Observatory’s vision is the desire for the future of the consumer internet to be informed by the mistakes of its past. The Trust and Safety research area focuses on the ways consumer internet services are abused to cause real human harm and on the potential operational, design, and engineering responses. Our research will continue to investigate technical weaknesses on platforms such as Clubhouse, share innovative open-source research techniques such as our published work about Wikipedia, and examine the accessibility of self-harm prevention content on platforms across the internet. Our key outputs in the next year will be our open-access textbook on Trust and Safety Engineering, and a new cross-disciplinary journal incentivizing research collaboration in this space.

Platform Policy

How well do technology companies’ published policies address user and content concerns? Is government regulation grounded in technical reality? An unexpected outcome of our Election Integrity Partnership was the power of our comprehensive platform policy analysis for creating a rubric to assess policy. Since publishing that analysis, we have continued pressure-testing platform policies in light of the ever-evolving geopolitical climate. This research area works in tandem with the Trust and Safety track but focuses strictly on policy as a way to strengthen trust and safety. Projects examine ways to limit abuse on end-to-end encrypted platforms, the impact of platform policy on access to self-harm content, the role of consent in online intimate imagery, and others.

Information Interference

Information manipulation and disinformation research to date has primarily focused on finding and moderating away networks of inauthentic actors on social media platforms. We believe this scopes the problem too narrowly. Campaigns that deliberately spread false and misleading information no longer rely on inauthentic actors; they originate from domestic as well as foreign actors, who blend traditional as well as social media into their strategic efforts. In our assessments of campaigns we increasingly examine full-spectrum activity: both overt/attributable and covert/inauthentic activity spanning broadcast, print, and social media. Information interference, more so than mis- and disinformation, encompasses the variety of operations to manipulate public consensus within the information environment writ large. This area of our work increasingly intersects with the others, as cryptocurrencies, emerging technologies, and new platforms are incorporated into manipulation efforts. At times, these efforts incorporate hacked and leaked material as well, which serves as an intersection point with more traditional cybersecurity research.

Emerging Technology

We identified our fourth research area with an eye to the future; it will include new generative machine learning technologies such as deepfakes or the Generative Pre-trained Transformer (GPT) autoregressive language models capable of emulating near-perfect human writing. Our research will look at both the potential and the harm inherent in these technologies, and how they will shape our online world. We also will explore the ways new machine learning techniques can be deployed in protective situations, and look at policy frameworks that can appropriately regulate AI advancements. Our technical infrastructure will play an integral role in creating toolkits for researchers interested in further exploring this new technology.

RESEARCH SPOTLIGHT

POTEMKIN PAGES AND PERSONAS: ASSESSING GRU ONLINE OPERATIONS, 2014-2019
Renée DiResta and Shelby Grossman • November 12, 2019

The Stanford Internet Observatory published its first detailed whitepaper in 2019 at the request of the Senate Select Committee on Intelligence (SSCI). The report was based on a dataset of social media posts and Pages that Facebook provided to the Committee and attributed to the Main Directorate of the General Staff of the Armed Forces of the Russian Federation, known by its prior acronym GRU. A substantial amount of this content had not previously been publicly attributed to the GRU. Although the initial leads were provided by the Facebook dataset, many of these Pages had ties to material that remained accessible on the broader internet, and our report aggregated and archived that broader expanse of data for public viewing and in service of further academic research.


Key takeaways

Traditional narrative laundering operations have been updated for the internet age. Narrative laundering is a well-known “active-measures” tactic with a long history. Updating its tactics for the social media era, the GRU created think tanks and media outlets to serve as initial content drops, and fabricated personas to serve as authors. A network of accounts distributed the content to platforms such as Twitter and Reddit. In this way, GRU-created content could make its way from a GRU media property to a real, ideologically aligned independent media website—a process designed to reduce skepticism about the original little-known blog.

The GRU disseminates the results of its hack-and-leak operations through different media entities. One of the salient characteristics of the GRU’s well-known hack-and-leak tactic is the need for a second party (such as WikiLeaks) to create an audience for the operation. While the GRU’s attempts to leak through its own social media accounts were generally ineffective, it did have success in generating media attention, which in turn led to wider coverage of the results of these operations. A GRU group’s Facebook posts about its hack-and-leak attack on the World Anti-Doping Agency, for example, received relatively little engagement, but write-ups in Wired and other outlets ensured that its operations got wider attention.

Operators use a two-pronged approach of narrative and memetic propaganda. The GRU fed its narratives into the wider mass-media ecosystem with the help of think tanks, affiliated websites, and fake personas. This strategy is distinct from that of the Internet Research Agency, which invested primarily in a social-first, meme-based approach to maximize online engagement. Although the GRU conducted operations on Facebook, it either did not view maximizing social audience engagement as a priority or did not have the wherewithal to do so. To the contrary, it appears to have designed its operation to achieve influence in other ways.

This whitepaper is available online at io.stanford.edu.

INDEPENDENT INVESTIGATIONS OF PLATFORM TAKEDOWNS

OCTOBER 2020: RALLY FORGE

An astroturfing operation involving fake accounts left thousands of comments on Facebook, Twitter, and Instagram. The accounts, linked to the Rally Forge marketing agency, posted comments that appeared grassroots but were in fact paid commentary, much of it from people who did not exist.

SEPTEMBER 2020: US FIRM TARGETS BOLIVIAN ELECTIONS

Facebook assets linked to a US strategy firm engaged in coordinated inauthentic behavior targeting people in Bolivia. This is the first documented case of a platform suspending assets linked to a US firm targeting a foreign country.

OCTOBER 2019: RUSSIANS TARGET AFRICA

A Russian businessman targeted people in Libya, Sudan, the Central African Republic, Madagascar, Mozambique, and the Democratic Republic of the Congo, creating credible social media presences that furthered their interests on the ground. The operation appeared to have involved — wittingly or unwittingly — Sudanese individuals.

SEPTEMBER 2020: MASS REPORTING FROM PAKISTAN

A network of Pakistan-based Facebook assets engaged in mass reporting to silence critics of Islam in Pakistan. The network used a Chrome extension to expedite reporting.

DECEMBER 2020: ASSETS TARGET LIBYA, SUDAN, SYRIA

A large network cultivated over 5 million followers on Facebook. The operation involved participation by Syrians, and possibly Libyan and Sudanese individuals, living in Russia. The Libyan assets had developed media brands and supported the insurgent Libyan National Army.

Platforms periodically ask the Stanford Internet Observatory to independently analyze networks of assets identified and suspended through internal investigations on social media platforms. Additionally, we occasionally identify operations in the course of our own analysis. To date, we have analyzed and shared our findings on 29 networks originating from or targeting 32 countries around the world. The map above highlights a selection of our investigations. The full archive of reports is accessible on our website, io.stanford.edu.

TEACHING

Over the last two years, SIO has developed and taught four new courses at Stanford, reaching students in programs in international policy, computer science, business, law, political science, and more. Outside of the classroom, SIO has been incredibly fortunate to continue our long-term engagement with students through our research assistant and research analyst positions. In just the past two years, we have trained 76 student research analysts to join our technical and research projects, and students have coauthored 67% of our publications. In January 2021, we hosted the first Trust and Safety Job Fair at Stanford, which was attended by 11 companies and 121 students.

Looking Forward

SIO will continue to expand our teaching impact through new courses and open-source training material, via five initiatives.

An update of the Hack Lab course material to incorporate new developments in relevant case law, as well as to integrate recent significant cybersecurity incidents, such as the SolarWinds and Microsoft Exchange hacks.

A new training handbook and feedback process to improve the orientation and professional development of SIO research assistants.

Publication of the Trust and Safety Engineering textbook based on the course lectures from the Trust and Safety Engineering course and ongoing research by the SIO team.

New MOOCs adapted from the Hack Lab and Trust and Safety course lectures.

A new social science course to be taught in parallel with Trust and Safety Engineering. This course will explore political science research on topics such as the ways foreign and domestic actors promote disinformation, the use of the internet for extremist recruiting, and the use of chat apps to incite violence, and will examine how online platforms currently respond to these threats. Students will gain an understanding of the most pressing challenges in global communication platforms and a strong foundation for future research and work on mitigating these harms. For the final project, students will collaborate with those in the Trust and Safety course to design policy guidelines that respond to harmful content.

Riana Pfefferkorn and Alex Stamos delivering a lecture in Hack Lab.

HACK LAB

Hack Lab, cotaught by SIO director Alex Stamos and research scholar Riana Pfefferkorn, aims to give students a solid understanding of the most common types of attacks used in cybercrime and cyberwarfare. Lectures cover the basics of an area of technology, how that technology has been misused in the past, and the legal and policy implications of those hacks. In lab sections, students apply this background to attack vulnerable, simulated targets using techniques and tools seen in the field. Students going into leadership positions will understand how events like ransomware attacks occur, how they as leaders can protect cyber assets, and what legal responsibilities they have. In two years, 210 students have completed the course.

TRUST AND SAFETY ENGINEERING

Trust and Safety Engineering, also taught by Alex Stamos, is designed to expose computer science undergraduates to the ways consumer internet services are abused to cause real human harm, along with potential operational, product, and engineering responses. Abuses covered in the course include spam, fraud, account takeovers, the use of social media by terrorists, misinformation, child exploitation, and harassment. Students explore both the technical and sociological roots of these harms and the ways various online providers have responded; the students then complete a practical group project addressing a real-world example of harmful behavior. This course is designed to give future entrepreneurs and product developers an understanding of how technology can be unintentionally misused and to expose them to the trade-offs and skills necessary to proactively mitigate future abuses. The course was piloted in the fall of 2019 with a dozen selected students, and has since enrolled 180 students.

ONLINE OPEN-SOURCE INVESTIGATIONS

Online Open-Source Investigations was developed by SIO research scholar Shelby Grossman as a practical introduction to internet research using free and publicly available information. Dr. Grossman’s syllabus blends best-in-practice open-source intelligence (OSINT) tools developed by organizations such as Bellingcat with SIO’s own research practices. The course prepares students for online open-source research in jobs in the public sector, with technology companies and human rights organizations, and with other research and advocacy groups. SIO has also adapted the course into onboarding training for our research analysts. The small seminar has been taught three times and has enrolled an average of 15 students per quarter over the past two years.

CURRENT TOPICS IN TECH PLATFORM POLICY

Current Topics in Technology Platform Policy was created by SIO associate director Elena Cryst in the spring of 2021 to expose students to a broader set of outside experts and practitioners on the frontiers of emerging technology issues. Speakers included Chris Krebs, former director of the Cybersecurity and Infrastructure Security Agency; Julie Owono, executive director of Internet Sans Frontières and a member of the Oversight Board; Olga Belogolova, Facebook’s policy lead for countering influence operations; and Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. In its first quarter, the course had an enrollment of 32 students.

TELLING CHINA’S STORY: THE CHINESE COMMUNIST PARTY’S CAMPAIGN TO SHAPE GLOBAL NARRATIVES
Renée DiResta, Carly Miller, Vanessa Molter, John Pomfret, and Glenn Tiffert • July 20, 2020

Much of the attention to state-sponsored influence practices in recent years has focused on social media activity, particularly as companies such as Facebook and Twitter announced takedowns of accounts linked to state-backed operations. However, state-sponsored operations are broader than social media. This report looks at China’s demonstrated ability to operate a full-spectrum capability set that spans both traditional and social media ecosystems.

The Chinese Communist Party (CCP) relies on an extensive influence apparatus that spans a range of print and broadcast media to advance both its domestic monopoly on power and its claims to global leadership. This apparatus draws on nearly a century of experience running information operations. What is the scope and nature of China’s overt and covert capabilities, and how do they complement one another? China’s longstanding commitment to managing narratives means that it will likely continue to learn, iterate, and adapt. Our report analyzes its capabilities, how these capabilities are leveraged, and how they continue to evolve.

17

Key takeaways

China has a robust overt propaganda apparatus, and managing both inward- and outward-facing messaging remains a top priority for the CCP. Under President Xi Jinping, its work has taken on a new urgency, with the CCP tightening control of the media and elevating its propaganda machine to the top tier of party organs.

China is growing its social media and broadcast influence abroad. Internationally, overt infrastructure includes regionalized and language-specific traditional media channels, social media pages and accounts, and prominent-figure influencer accounts with millions of followers on social media platforms.

China has less-attributable communication options that it uses to influence opinions. These include content farms and fabricated accounts and personas on social media channels. Perhaps the most famous of China’s more covert domestic influence capabilities is the digital commenter brigade known as Wumao, or the “50 Cent Party.” In August 2019, clusters of fake accounts and content were concretely attributed to the CCP by several technology companies, including Facebook, Twitter, and YouTube.

This report was prepared with colleagues at the Hoover Institution's China Global Sharp Power Project. The full report is available on our website, io.stanford.edu.

POLICY

Over the last two years, Stanford Internet Observatory scholars and experts have pushed forward efforts to keep technology policy grounded in technical reality. We have hosted events with policy makers, given expert congressional testimony, held private briefings for government staff, organized a public event at the European Parliament, released policy briefs, and provided feedback to online platforms on their policies and practices.

Our launch two years ago coincided with the release of the Freeman Spogli Institute’s report, “Securing American Elections: Prescriptions for Enhancing the Integrity and Independence of the 2020 U.S. Presidential Election and Beyond.” This report, coauthored by Alex Stamos, laid out concrete steps that election authorities could take to improve the security of election infrastructure, combat state-sponsored disinformation campaigns, and regulate online advertising around the elections.

In March 2021, after six months of monitoring, research, and analysis, SIO and the Election Integrity Partnership published our report on misinformation in the 2020 presidential election, “The Long Fuse.” Those findings are detailed in the research spotlight on pages 19-20 of this report. We found that both government and platforms took many appropriate steps to mitigate the risks seen in the 2016 election, but were still unprepared for the depth of grassroots and domestic misinformation spread in the leadup to the 2020 election and fed by the global pandemic and the unprecedented rhetoric of the sitting president.

Looking Forward

Our team has been prolific during our year of sheltering in place. As we return to the office and to bicoastal travel this summer, we will bring over 100 publications to ongoing global policy debates. Strategically, we have identified several goals that will accelerate our policy impact in the coming years.

OPEN A NEW OFFICE IN WASHINGTON, DC.
The Stanford Internet Observatory plans to establish a persistent presence in DC as we locate several full-time staff positions in the nation’s capital. This will facilitate engagement with all branches of the federal government, and with our colleagues and counterparts at think tanks in Washington.

INFORM THE DEBATE AROUND SECTION 230 REFORM AND END-TO-END ENCRYPTION REGULATION.
The regulation debate persists both in government and within the technology companies. We will continue to publish commentary on proposals to reform Section 230 of the Communications Decency Act. At the same time, we will use the convening power of our popular end-to-end encryption workshops to further the discussion on how to improve user safety while preserving privacy on encrypted networks.

TAILOR RESEARCH OUTPUTS TOWARD POLICY RECOMMENDATIONS.
Last year we piloted a research approach that combines short-term, policy-relevant research with longer-term research for peer-reviewed publications. For example, we published an evaluation of internet platform self-harm policies in order to have an immediate effect on those policies. We then used that whitepaper as a jumping-off point for a larger project evaluating the implementation of these policies, which we will submit to a peer-reviewed journal. This model enables us to publish findings and recommendations when they are still relevant and able to have an impact.

EXPAND OUR INTERNATIONAL ENGAGEMENT.
Although we started 2020 with a visit to the European Commission and Parliament, over the past year we concentrated most of our resources on two major domestic research projects: the 2020 election and the US COVID-19 vaccine rollout. In the coming year we will reinvigorate our international research and seek opportunities to expand engagement overseas, with a renewed focus on developing democracies.

19

Comparative Policy Analysis

SIO publishes tables comparing social media platforms’ policies on a specific topic, based on a published rubric. This table shows our final comparison of platforms’ election-related policies as of October 28, 2020.

Platform | Procedural Interference | Participation Interference | Fraud | Delegitimization of Election Results
Facebook | Comprehensive | Comprehensive | Comprehensive | Comprehensive
Twitter | Comprehensive | Comprehensive | Non-Comprehensive | Comprehensive
YouTube | Comprehensive | Comprehensive | Non-Comprehensive | Non-Comprehensive
Pinterest | Comprehensive | Comprehensive | Comprehensive | Comprehensive
Nextdoor | Non-Comprehensive | Non-Comprehensive | Non-Comprehensive | Non-Comprehensive
TikTok | Non-Comprehensive | Non-Comprehensive | Non-Comprehensive | Non-Comprehensive
Snapchat | Non-Comprehensive | Non-Comprehensive | Non-Comprehensive | Non-Comprehensive
Parler, Gab, Discord, WhatsApp, Telegram, Reddit, Twitch: no election-related policies

POLICY OUTPUTS 2019–2021

March 3, 2021: The Long Fuse: Misinformation and the 2020 Election, published report

January 19, 2021: Launch of the Virality Project weekly policy briefs

October 13, 2020: Launch of Election Integrity Partnership weekly briefing calls

May 11, 2020: “Coronavirus And Homeland Security Part Seven: Flattening The Misinformation Curve,” House Committee on Homeland Security virtual forum

March 5, 2020: SIO report “Evidence of Russia-Linked Influence Operations in Africa” cited in Senate Hearing 116-275

February 18, 2020: “Safety, Privacy and Internet Policy in Europe,” event at the European Parliament at the invitation of Polish MEP Radosław Sikorski

February 17, 2020: “The Dilemma of Disinformation: How should democracies respond?” lecture by Alex Stamos at the European Parliament Research Service

June 25, 2019: “Artificial Intelligence and Counterterrorism: Possibilities and Limitations,” testimony by Alex Stamos at a hearing before the Subcommittee on Intelligence and Counterterrorism of the Committee on Homeland Security, House of Representatives

June 6, 2019: Securing American Elections: Prescriptions for Enhancing the Integrity and Independence of the 2020 U.S. Presidential Election and Beyond, published report

THE LONG FUSE: MISINFORMATION AND THE 2020 ELECTION
The Election Integrity Partnership • March 3, 2021

On January 6, 2021, an armed mob stormed the US Capitol to prevent the certification of what they claimed was a “fraudulent election.” The insurrection was the culmination of months of online mis- and disinformation directed toward eroding American faith in the 2020 election.

US elections are decentralized: almost 10,000 state and local election offices are primarily responsible for the operation of elections. Dozens of federal agencies support this effort. However, none of these federal agencies has a focus on, or authority regarding, election misinformation originating from domestic sources. This limited federal role reveals a critical gap for non-governmental entities to fill. Increasingly pervasive mis- and disinformation, both foreign and domestic, creates an urgent need for collaboration across government, civil society, media, and social media platforms.

The Election Integrity Partnership, a coalition of four organizations that specialize in understanding those information dynamics, aimed to create a model for whole-of-society collaboration and facilitate cooperation among partners dedicated to a free and fair election. With the narrow aim of defending the 2020 election against voting-related mis- and disinformation, it bridged the gap between government and civil society, helped to strengthen platform standards for combating election-related misinformation, and shared its findings with its stakeholders, media, and the American public. The report details our process and findings, and provides recommendations for future actions.

Key takeaways

Misleading and false claims and narratives coalesced into the metanarrative of a “stolen election,” which later propelled the January 6 insurrection.

The production and spread of misinformation was multidirectional and participatory.

Narrative spread was cross-platform: repeat spreaders leveraged the specific features of each platform for maximum amplification.

The primary repeat spreaders of false and misleading narratives were verified, blue-check accounts belonging to partisan media outlets, social media influencers, and political figures, including President Trump and his family.

Many platforms expanded their election-related policies during the 2020 election cycle. However, application of moderation policies was inconsistent or unclear.

The 2020 election demonstrated that actors—both foreign and domestic—remain committed to weaponizing viral false and misleading narratives to undermine confidence in the US electoral system and erode Americans’ faith in our democracy. While the Partnership was intended to meet an immediate need, the conditions that necessitated its creation have not abated. Academia, platforms, civil society, and all levels of government must be committed to predicting and pre-bunking false narratives, detecting mis- and disinformation as it occurs, and countering it whenever appropriate.

The Election Integrity Partnership was conceived of and initiated by the Stanford Internet Observatory in partnership with the Atlantic Council’s Digital Forensic Research Lab, Graphika, and the University of Washington’s Center for an Informed Public. Report available at eipartnership.net.

ACKNOWLEDGEMENTS


thank you...

... to our staff: Samantha Bradshaw, Daniel Bush, Elena Cryst, Renée DiResta, Isabella Garcia-Camargo, Josh Goldstein, Shelby Grossman, Matt Masterson, Carly Miller, Riana Pfefferkorn, Alex Stamos, and David Thiel.

... to our donors: Craig Newmark Philanthropies, the Hewlett Foundation, the Omidyar Network, the Charles Koch Foundation, and Felicis Ventures.

... and to over a hundred students, colleagues, and friends who have been part of our team in our first two years.


Internet Observatory Cyber Policy Center

The Stanford Internet Observatory is a cross-disciplinary program of research, teaching and policy engagement for the study of abuse in current information technologies, with a focus on social media. The Stanford Internet Observatory was founded in 2019 to research the misuse of the internet to cause harm, formulate technical and policy responses, and teach the next generation how to avoid the mistakes of the past.

io.stanford.edu • @stanfordio • [email protected]

Photo Credits: p.2 – Stanford Dish/Stanford News Service; p.4 – Alex Stamos and Elena Cryst/Rod Searcy; p.10 – Potemkin Village, Carson City, Vårgårda, Sweden, 2016/Greg Sailer; p.14 – Screenshot from Hack Lab Course Recording/Stanford Center for Professional Development; p.16 – NASA Earth Observatory/Wikimedia Commons; p.20 – Original illustration for the Election Integrity Partnership/Alex Atkins Design, Inc. This page spread L-R: SIO election night warroom; SIO E2EE Workshop; Carly Miller presenting at Cyber Center Seminar; Hack Lab TA with lab material; Alex Stamos delivering congressional testimony; students in a classroom; Renée DiResta presenting; SIO team members collaborating. Editing provided by Eden Beck.