INFORMATION INTERVENTION AND THE NEED FOR A SOCIAL CYBERSECURITY PERSPECTIVE: THE POWER STRUGGLE BETWEEN DIGITAL DIPLOMACY AND COMPUTATIONAL PROPAGANDA

By

PHILLIP C. ARCENEAUX

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2019

© 2019 Phillip C. Arceneaux

This dissertation is dedicated to my two grandfathers, Harry Arceneaux and Richard “Dick” Hanrahan, both of whom never had a college degree and sadly were never able to see me become Dr. Arceneaux. You were both loved so incredibly much and are missed every single day.

ACKNOWLEDGMENTS

I would first like to thank the members of my dissertation committee, Dr. David Ostroff,

Dr. Spiro Kiousis, Dr. Jasmine McNealy, Dr. Laura Sjoberg, and Dr. Aida Hozic. As someone who is always interested in stressing interdisciplinary links and relationships in research, this dissertation is truly unique in its perspective, approach, and execution; this was only possible because of the incredible open-mindedness of each one of my committee members and the trust they had in me to go off and do some kind of justice to this research. They truly let me take this dissertation where I wanted to go rather than keeping me in the bumper lanes of what constitutes a “normal” dissertation in mass communication, and for that I am ever grateful.

I would also like to thank all of my teachers and mentors in the University of Florida’s

College of Journalism and Communication who helped and inspired me throughout my Ph.D. program. This list includes Dr. Debbie Treise, Dr. Norm Lewis, Dr. Wayne Wanta, Dr. Frank

Waddell, Dr. Kelly Chernin, Dr. Moon Lee, Dr. Sylvia Chan-Olmsted, Dr. Myiah Hutchens, Dr.

John Wright, Dr. Cynthia Morton Padovano, Dr. Michael Leslie, Dr. Sriram (Sri) Kalyanaraman,

Dr. Yu-Hao Lee, Dr. Ron Rodgers, Dr. Carla Fisher, Dr. Marcia DiStaso, Mindy McAdams, and

Pat Ford. I could also not forget my amazing statistics professor, Dr. Michael Marsiske of the

Department of Clinical and Health Psychology.

Next, an incredible acknowledgement and thanks to the University of Southern

California’s Center for Public Diplomacy (CPD). At every point in my Ph.D. program, the CPD worked tirelessly to support graduate students around the globe, just like myself, who are interested in public diplomacy. In May of 2017, the CPD paired with the Digital Diplomacy

Research Group to host myself and other graduate students for a one-day conference at the

University of Oxford. In May of 2018, the CPD paired with the International Communication

Association’s Public Diplomacy Interest Group to host a pre-conference for doctoral students in the field. And in September of 2018, the CPD awarded me their 2018-2019 Doctoral Dissertation

Grant to help fund this dissertation and offset the costs of an extensive array of in-depth expert interviews and supplementary interview transcriptions. Words cannot express my gratitude and appreciation for the potential the CPD saw in me and its willingness to help me pursue my research.

I would also like to especially mention Dr. R. S. Zaharna of American University. While the CPD paired with the International Communication Association’s Public Diplomacy Interest

Group to host a pre-conference for doctoral students in Czechia, it was Dr. Zaharna who personally funded a grant that covered the cost of the post-conference for all of the student presenters. I had never met Dr. Zaharna before travelling to Prague, and upon meeting her I instantly fell in love with her amazing personality and her passion to see students grow and excel. Having received my fair share of criticism from academe about being too much of an intellectual rather than a scholar, I found that Dr. Zaharna offered no criticism, simply unconditional support to follow where my heart takes me. Regardless of where life and a career may take me, I will never forget such an amazing and loving person.

Beyond the CPD and Dr. Zaharna, I would like to acknowledge and thank all of the scholars who helped me through my studies and research agenda: Dr. Corneliu Bjola and Ilan

Manor of the University of Oxford, Dr. James Pamment of Lund University, Steven Pike of

Syracuse University, Dr. Jay Wang and Dr. Nicholas Cull of the University of Southern

California, Dr. Alina Dolea of Bournemouth University, Dr. Pawel Surowiec of the University of

Sheffield, Dr. Emily Metzgar of Indiana University, Dr. Shawn Powers and Dr. Amelia

Arsenault of Georgia State University, and Dr. Eytan Gilboa of Bar-Ilan University. I would also like to make a special acknowledgement of thanks to Dr. Guy Golan of the Center for Media and Public Opinion. With no dedicated public diplomacy scholar at the University of Florida, Guy took me under his wing while he was at the University of South Florida in Tampa and really pushed my development in the realm of public diplomacy research. He helped situate me in the literature, network me among public diplomacy scholars, and develop a significant publication record during my Ph.D. program.

Lastly, I must also acknowledge just a handful of the many others who helped me to complete my Ph.D. I would like to thank my parents, Bryan and Debbie Arceneaux, for their endless love and support; my grandparents, Harry Arceneaux and his wife of 72 years, Verna Mae, for loving and supporting me especially when my studies took me far away from them; B.C., Cassie, and Neil Thibeaux (no relation to Tim Tebow, for any curious UF Gators); and the faculty and staff at Our Lady of Fatima Catholic School, St. Thomas More Catholic High School, Louisiana State University, and the University of Louisiana at Lafayette.


TABLE OF CONTENTS

page

ACKNOWLEDGMENTS ...... 4

LIST OF ABBREVIATIONS ...... 11

ABSTRACT ...... 13

CHAPTER

1 INTRODUCTION ...... 15

Mass Communication and Politics ...... 15
Politics and the Law ...... 18
Information Intervention ...... 22
Digital Diplomacy as Information Politics ...... 24
Computational Propaganda as Information Operations ...... 25
Purpose ...... 28
Conclusion ...... 31

2 LITERATURE REVIEW ...... 33

Strategic Communication ...... 33
Political Public Relations ...... 34
Strategic Narrative ...... 35
The Power of Public Opinion ...... 37
Power ...... 37
Public Opinion ...... 38
Soft and Sharp Power ...... 45
Theoretical Framework ...... 48
Research Questions ...... 50
Demarcating Propaganda and Public Diplomacy ...... 52
International Law and Digital Broadcasting ...... 56
Computational Propaganda and Cybersecurity ...... 60

3 METHODOLOGY ...... 64

Socio-Legal Research ...... 64
Research Design ...... 67
Expert Interviews in Grounded Theory ...... 68
Textual Analysis in Grounded Theory ...... 75
Weaknesses of the Methodology ...... 78

4 RESULTS ...... 82

Research Question 1 ...... 82


Demarcating Strategies of Information Intervention ...... 82
    The actors ...... 83
    The manner ...... 85
    The target audience ...... 86
    The method for content creation ...... 88
    The model of communication flow ...... 91
    The end-goal intent ...... 92
    The policy outcome ...... 96
Emergent Themes Within the Data ...... 97
    Attribution ...... 98
    Emotion ...... 100
    Accuracy ...... 102
    Perspective ...... 103
Research Question 2 ...... 106
Capacity Building: Sybil Detection and Bot Identification ...... 109
Mutual Assistance and Spontaneous Information: Bot Tracking Across Borders ...... 113
Research Question 3 ...... 116
Privacy by Design ...... 117
    Institutional controllers and differential privacy ...... 117
    Data subjects and synthetic data ...... 119
Emergent Themes in the Data ...... 120
    Algorithms ...... 120
    Economic Business Models ...... 122
    Political Partisanship ...... 125

5 DISCUSSION ...... 127

Premise of the Study ...... 127
Primary Findings ...... 132
    Demarcating Strategies of Information Intervention ...... 132
    Existing Frameworks for Cybersecurity Governance ...... 134
    Newer Frameworks for Cybersecurity Governance ...... 136
Policy Recommendations ...... 138
Multi-Stakeholder Driven Cooperation ...... 138
    Governments ...... 139
    International organizations ...... 141
    Technology economic sector ...... 142
    NGOs & non-profits ...... 143
    Think tanks and academia ...... 144
Multi-lateral Driven Data Regulation ...... 145
    Regional regulation ...... 146
    International regulation ...... 146
Increased Algorithm Oversight ...... 147
    Multi-lateral transparency and evaluation ...... 149
    International transparency and evaluation ...... 149
Computational Propaganda Detection Network ...... 150
Functional Cooperation Programs ...... 153


Public Opinion Management ...... 156
Limitations and Suggestions for Future Research ...... 159
Conclusion ...... 166

APPENDIX

A INTERVIEW PROTOCOL ...... 169

B QUANTITATIVE OVERVIEW OF IN-DEPTH INTERVIEWS ...... 176

C QUANTITATIVE OVERVIEW OF THE GDPR BY THEMATIC BREAKDOWN ...... 177

LIST OF REFERENCES ...... 178

BIOGRAPHICAL SKETCH ...... 202


LIST OF TABLES

Table page

3-1 List of experts interviewed in the current research...... 72

3-2 List of codes derived from themes identified in the dominant literature on the fields of public diplomacy, propaganda, public affairs, and psychological operations...... 74

3-3 Legal documents analyzed via textual analysis...... 76

4-1 Typology of Information Intervention Stratagem...... 105


LIST OF ABBREVIATIONS

BBC British Broadcasting Corporation

BBG Broadcasting Board of Governors

CIA Central Intelligence Agency

CCTV China Central Television

CGTN China Global Television Network

EDDE European Digital Diplomacy Exchange

EuroDig European Dialogue on Internet Governance

FVEY Five Eyes; UKUSA Agreement

DoD Department of Defense

GDPR General Data Protection Regulation

GEC Global Engagement Center

GGE Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security

ICT Information Communication Technology

IRA Internet Research Agency

MAS Methodological Anxiety Syndrome

MISO Military Information Support Operations

PSYOP Psychological Operations

RFA Radio Free Asia

RFE/RL Radio Free Europe/Radio Liberty

RT Russia Today

USAGM United States Agency for Global Media

USIA United States Information Agency

VOA Voice of America


WCIT-12 World Conference on International Telecommunications, 2012

WRC World Radiocommunication Conference


Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

INFORMATION INTERVENTION AND THE NEED FOR A SOCIAL CYBERSECURITY PERSPECTIVE: THE POWER STRUGGLE BETWEEN DIGITAL DIPLOMACY AND COMPUTATIONAL PROPAGANDA

By

Phillip C. Arceneaux

May 2019

Chair: David H. Ostroff
Major: Mass Communication

Where political communication scholarship of the 20th century was plagued by debates surrounding propaganda and public diplomacy, the 21st century faces similar debates surrounding computational propaganda and digital diplomacy. However, the digital information ecology is vastly different from that of the 20th century, increasingly calling into question parallel practices of strategic information dissemination, such as psychological operations and public affairs. In a charged international political arena where public opinion is a global force that, at times, constrains the behavior of state actors, information has become a political tool that rivals the impact and value of traditional military force. Recognizing the growing extent to which digital state-sponsored disinformation campaigns are being used to influence political institutions and processes around the globe, this study uses a two-part qualitative methodology to demarcate digital diplomacy from computational propaganda, as well as to explore legal avenues for Internet governance moving forward. The findings contribute to recommendations for policy formulation in the area of telecommunication regulation and social cybersecurity. They also contribute to the development of a public relations strategy through which public diplomacy practitioners can position digital diplomacy as the primary tool with which to combat computational propaganda.


CHAPTER 1
INTRODUCTION

The resurgence of rules and procedures in the service of an organized international order is the legacy of all wars, hot or cold. - A. M. Slaughter, 1993

Mass Communication and Politics

In his 1964 work, Understanding Media: The Extensions of Man, communication theorist and philosopher Marshall McLuhan ultimately posed a simple question: do changes in communication technologies lead to changes in society, or vice versa? The question contributed a unique communication-based perspective to larger causal primacy debates, joining other discussions surrounding the effects on society of biology and genetics (Martinez, 2014), the environment (Harrison, 2013), and linguistics (Whorf, 1956), and clearly positioned the phenomenon of mass communication as a prime variable in studying evolutionary changes in society.

As an interdisciplinary field of study, mass communication scholarship is intertwined with a variety of disciplines. Notably, two of the core foundations of civilization in which mass communication is highly implicated are politics (Hall & Kenski, 2017) and the law (Trager, Ross, & Reynolds, 2018). An overview of the evolution of mass communication would seem to support the argument of a link between information communication technologies (ICTs) and both political and legal institutions and processes. Without going so far as to assert a causal argument, even a basic understanding of political history suggests a relationship between evolutions in ICTs and power distributions across political institutions. Martin Luther gave rise to the Protestant Reformation by publicly challenging papal infallibility, a movement achieved by using the printing press to circulate record numbers of his Ninety-five Theses, as well as to produce and distribute German translations of the Bible.


The telegraph further modernized politics, allowing for greater trans-national interconnectivity. Political institutions evolved again with the mass adoption of radio, namely in the extent to which information could reach mass audiences directly from the source, first-hand, offering the appearance, or perhaps illusion, of an intimate conversation. World leaders and politicians were humanized to the common person, as demonstrated through King George V’s Christmas Day broadcasts or Franklin Roosevelt’s fireside chats. Indeed, the common radio owner had access to a significantly widened range of content and information, from Orson Welles’ famous 1938 production of The War of the Worlds (Lowery & DeFleur, 1988) to reports such as journalist Edward R. Murrow’s famed live broadcasts of the 1940 bombing of London (Edwards, 2004).

The mass adoption of television continued this ICT evolution alongside a process of political change, making people more and more party to the process of politics. From the live broadcast of Queen Elizabeth II’s coronation in 1953 and the 1960 live debates between U.S. presidential candidates John Kennedy and Richard Nixon, to live coverage of British prime ministers answering questions (PMQs) from the 1990s on and round-the-clock congressional coverage on C-SPAN, institutional politics now sit under the close gaze of mass public opinion. Long-distance news rose equally alongside domestic news coverage, with color video of Neil Armstrong walking on the moon, the fall of Saigon, the election of Pope John Paul II, and the famed protests in Tiananmen Square captivating world viewers from the comfort of their own homes.

In contemporary times, the Internet has continued this process. Instantaneous global access to copious amounts of information has called for considerable adaptations from political institutions, some more constructive than others. In participatory political models, email put every constituent within virtual arm’s reach of his or her representative. The U.S. presidential race went digital for the first time in the 1996 General Election, with websites set up for the Dole/Kemp and Clinton/Gore campaigns (Shields, 2016), which were still active domains as of 2018 (www.dolekemp96.org); further, presidential races have continued such trends with the institutional adoption and use of social media by the Obama (2008/2012) and Trump (2016) campaigns. Conversely, Internet access has also been party to less positive events in more totalitarian political spheres, playing a role in the Arab Spring as well as in growing concerns over privacy invasion in countries like China, Russia, and Iran.

While the Internet has no doubt contributed to changes in the institutions and processes of the political sphere, with only around 20 years of mass diffusion we have yet to fully comprehend the repercussions this medium has had, and will continue to have, on civilization.

We are still learning about the psychological effects (Dunbar, Proeve, & Roberts, 2018; Kraut &

Burke, 2015), the economic effects (Hendrix, 2016; Rivlin & Litan, 2001), and the power redistribution effects the Internet has on society (Vanian, 2017; Weiss, Seyle, & Coolidge, 2013).

One of the arguments surrounding the redistribution of socio-political power concerns the growing dominance of public opinion, increasingly empowered by global Internet access. The status of public opinion as a world superpower was first posed by a New York Times journalist in the early 2000s (Tyler, 2003), and since then the debate over public opinion as a second superpower alongside the United States has taken on a life of its own (Chomsky, 2016; Khan, 2016; Adams, 2014; Krieg, 2014; Seppälä, 2012; Smith, 2009; Richter, 2007).

Given this historical overview, it is within the realm of plausibility to suggest that evolutions in ICTs have contributed to changes in the political realm through the mediated empowerment of public opinion. Indeed, the value of public opinion in the political order was addressed in the 19th century by Carl von Clausewitz. In debating the political determinism inherent between state policy and war in early 1800s Europe, von Clausewitz acknowledged the power of domestic public opinion and the societal support role it plays in an internal political context (Echevarria, 2007). Thus, as ICTs increasingly mediate the relationship between public opinion and political elites, it should be clear that the fates of the two are inextricably linked.

Politics and the Law

What cannot be ignored, however, is the equally vital relationship between politics and the law (Whittington, Kelemen, & Caldeira, 2011; Cerar, 2009). Though governments are composed of political entities, it is governments, and not these political entities, which create and enforce the laws that, in turn, regulate political institutions and processes.

Law and politics are deeply intertwined. Law is an essential tool of government action, an instrument with which government tries to influence society. Law is also the means by which government itself is structured, regulated and controlled. It is no surprise, then, that law is an important prize in the political struggle and that law shapes how politics is conducted. (Whittington, 2012, para. 2)

If, as Whittington argues, law is a tool by which governments attempt to influence societies, then such a claim presumes a causal effect upon society from the adoption and enforcement of new laws, thus introducing a primacy debate between the political and legal spheres. Accepting for the moment the assumption of an existing relationship between ICTs (A) and politics (B), if an indeterminable causal relationship between politics (B) and the law (C) exists, how then does the law (C) relate to ICTs (A), as mediated through their relationship with politics (B)? To answer this question, a clear conceptual understanding of legal scholarship is necessary.

Most scholarly interpretations of law through the 19th century were characterized primarily as classical, or doctrinal, legal theory (Chynoweth, 2008; McCrudden, 2006; Schiff, 1976). Such thinking viewed law as derived from doctrine, an internalized perspective that championed advancements in law through its inherently sound logic (Alberstein, 2017). Moving into the 20th century, legal scholarship began to shift from this internalist-only discipline to conceive more broadly of the relationship the law had with, and on, society. A proponent of the shifting perspective, U.S. Supreme Court Justice Oliver Wendell Holmes, said, “the life of the law has not been logic; it has been experience” (2005, p. 1). Indeed, as Mather put it, “law is not autonomous, standing outside of the social world, but is deeply embedded within society” (2011, para. 1). What came to be known in the United States as the Law and Society perspective placed the law in an ultimately circular relationship with societal events (Savelsberg & Cleveland, 2017), primarily through the vehicle of politics (Whittington et al., 2011).

Further scholarship in legal studies later introduced the concept of sociological jurisprudence (Pound, 1912), which came to represent a research paradigm arguing that law continually lags behind the advancements of society. To put such a cross-lagged relationship in the framework of a primacy debate is to recognize the potential for the mutual relationship between ICTs (A) and politics (B) to impact society in ways to which the law (C) must then react.

“Change occurs when there is sufficient demand for change to shift or expand the perimeter of the mass of surveyed claims … But in the harmonizing of claims, it must ever be by the ideals of the civilization of the time and place” (Gardner, 1961, p. 18). Indeed, law is ultimately a reflection of the social and political values of a civilization at a given time.

Looking to an example of such a phenomenon, the social issue surrounding privacy and big data offers a model for application. Where ICTs have long been involved in the political process, the Internet, including social media, has offered something unique to those seeking to influence the political process (Hanson, 2008). The U.S. Director of National Intelligence, as advised by the U.S. intelligence community, concluded that 13 Russian citizens working for the

Internet Research Agency (IRA) in St. Petersburg purchased advertisements on Facebook with the intent to influence the course of the 2016 U.S. presidential election, in tandem with other more traditional cyber operations (National Intelligence Council, 2017). “Our task was to set

Americans against their own government: to provoke unrest and discontent” (Committee on

Foreign Relations, 2018, p. 45). Further investigations by the U.S. Senate Committee on Foreign

Relations (2018) determined that similar disinformation campaigns were likely used to influence the 2016 British E.U. membership referendum, the 2017 French presidential and German federal elections, as well as other political processes across Italy, Spain, the Netherlands, Scandinavia, and the Baltics.

Indeed, this growing form of digital information operations, designed to use information-based persuasion as a tool of political manipulation and disruption, has been termed computational propaganda. The phenomenon consists of the exploitation of mass quantities of private user data, via algorithmic micro-targeting, to tailor digital content on a person-to-person basis in ways most likely to resonate with, and influence, the political cognition, attitudes, and behavior of content recipients.

The manipulation of public opinion over social media platforms has emerged as a critical threat to public life. Around the world, a range of government agencies and political parties are exploiting social media platforms to spread junk news and disinformation, exercise censorship and control, and undermine trust in the media, public institutions, and science. At a time when news consumption is increasingly digital, artificial intelligence, big data analytics, and ‘black-box’ algorithms are being leveraged to challenge truth and trust: the cornerstones of our democratic society. (Bradshaw & Howard, 2018, p. 3)

For example, a resident of Great Britain identified as right-leaning, based on an inferential analysis of the user’s personal data, would likely see pro-Brexit social media advertisements outlining the threat refugees pose to British national security and how the E.U. has taken away the rights of the British people to control their own security. To the contrary, a similar resident of

Great Britain identified as left-leaning would likely see anti-Brexit advertisements outlining the humanitarian nature of taking in refugees, the contributions immigrants have made to British society, and the increased security of British borders through membership in the E.U. and intelligence sharing between E.U. member states. Scholarly research on computational propaganda is a fast-growing area of academic study, headed primarily by Dr. Philip Howard of the Computational Propaganda Project at the Oxford Internet Institute, University of Oxford, as well as Dr. Mark Hansen of the Columbia University Journalism School.

The mass perception in the United States of such political interference has led to varying degrees of public anxiety (Brattberg & Maurer, 2018; Shotter, 2018; Stelzenmüller, 2017), a fear only exacerbated by growing public awareness of how data aggregators and the current business models surrounding data usage support comparatively open-door policies for computational propaganda efforts. Most notably, Cambridge Analytica made use of private data from an estimated 50-100 million Facebook users to strategically target politically oriented advertisements during the 2016 U.S. presidential election, as well as at other times in the European and African theaters (Auchard & Ingram, 2018; Ingram & Henderson, 2018; Ingram, 2018). Social outrage over the way these mass communication tools were used for strategic political purposes has begun to harmonize international concerns in ways that have sparked reactive measures from the legal system. The E.U.’s General Data Protection Regulation (GDPR) took effect in May of 2018, regulating the collection, storage, transfer, and use of user data (GDPR Key Changes, 2018), while Facebook CEO Mark Zuckerberg testified before the U.S. Senate and House of Representatives (Romm, 2018) and the E.U. Parliament (Chappell, 2018), and received requests and an impending summons to testify before the U.K.’s House of Commons (Kottasová, 2018). Clearly, a sequence of events can be recognized: a new ICT enabled unprecedented political behavior that, when viewed pejoratively in the public eye, has in turn engaged the reactive nature of lawmakers in democratic political systems.


Information Intervention

Where the previous case serves to articulate an intermingled relationship between ICTs, politics, society, and the law, it does so in a context based on the misuses of these technologies by non-state actors, namely individual persons and corporate entities. What it fails to do is acknowledge the predisposition of national governments, i.e., states themselves, to engage in such behavior.

Indeed, an established area in which both governmental and non-governmental entities are using

ICTs to engage with public opinion at a global scale is public diplomacy. A little over 50 years old, the concept of public diplomacy has had a vibrant history; due to its foundations in soft power (Nye, 2004), it was heralded as a meaningful alternative to traditional hard-power options of coercion, such as military engagement or economic sanctions, stressing instead dialogue, cooperation, and cross-cultural understanding (Gilboa, 2015).

However, public diplomacy’s history has seen a plethora of accusations framing it as not significantly different from its predecessor, propaganda (Bejesky, 2012; Hopkins, 2015; Mehnaz, 2015; Misyuk, 2013; Osgood, 2017; Zaharna, 2010). Further, there have been numerous accusations that traditional public diplomacy tools have been used to inappropriately influence political structures in foreign countries (Bischof & Jurgens, 2015; Snow & Taylor, 2006; Cone, 1998). Much of this confusion and debate has arisen from the extent to which both propaganda and public diplomacy make use of similar, if not identical, tools to achieve marginally different ends. Understanding that both socially constructive and destructive ends can be achieved by identical means argues for the ultimate neutrality of the technologies. American

David Sarnoff, a reserve brigadier general in the U.S. Army Signal Corps, was quoted as saying,

“We are too prone to make technological instruments the scapegoats for the sins of those who wield them. The products of modern science are not in themselves good or bad; it is the way they are used that determines their value” (McLuhan, 1964, p. 3).


Where computational propaganda is the evolution of traditional propaganda tactics as facilitated by the capabilities of the Internet, digital diplomacy is the evolution of traditional public diplomacy tactics as facilitated by identical Internet capabilities. In July of 2018, The New

York Times published an article detailing how Jennifer Grygiel, a professor of communication at the Syracuse University Newhouse School, uncovered evidence showing that RFE/RL purchased

Facebook advertisements that were algorithmically micro-targeted at strategic populations

(Roose, 2018). Setting aside The New York Times’ primary concern, that the targeting of such content at residents inside the United States violated the Smith-Mundt Act of 1948, such an outlook fails to address the extent to which one of the U.S.’s most prominent public diplomacy tools engaged in behavior identical to the methods of the Russian IRA’s Facebook advertisements in the

2016 presidential election.

Such behavior on the part of a prominent public diplomacy institution poses a major public relations crisis for the larger approach to state-sponsored broadcasting in the era of the

Internet. Whether RFE/RL was legally right or wrong in the eyes of U.S. domestic law, and whether the broadcaster had constructive intentions or not, the case rearticulates the extent to which public diplomacy and propaganda tools make use of identical strategies to achieve seemingly parallel political ends. Making the public aware of how a U.S.-funded broadcaster used taxpayer money to purchase social media content designed to influence political perceptions could not come at a worse time, given the growing pervasiveness of computational propaganda. Where the histories of propaganda and public diplomacy are inextricably linked, the practice of state-sponsored broadcasting in the 21st century and beyond must employ careful public relations strategies in order to distinguish between engaging in digital diplomacy and computational propaganda, particularly in so far as public opinion is concerned.


Digital Diplomacy as Information Politics

Indeed, in the extent to which traditional inter-state conflict is transitioning increasingly to less kinetic strategies and tactics, information activities such as digital diplomacy have a profound role to play, though one that must have a sound public relations strategy for remaining on the positive side of public opinion. Public diplomacy tactics were used for over 40 years by both the United States and the Soviet Union to combat the growth of communism and democracy, respectively. Though public diplomacy does not spill blood or take human life, these tools can, and have, been used in ways that give rise to significant moral and ethical concerns.

Indeed, public diplomacy has often been referred to as the “War of Ideas” (Glassman, 2010; Krause & van Evera, 2009; Reilly, 2007; Satloff, 2004), and from a realist theoretical perspective, it is without question that public diplomacy tools can be used as strategic avenues for non-traditional political warfare.

In the early days of the Cold War, a group of Floridians known as the Orlando Committee battled J. William Fulbright over a congressional bill to establish a “Freedom Academy” aimed at educating Americans on subversive Communist tactics. In a 1963 concession letter, the committee wrote, “Someday this nation will recognize that global non-military conflict must be pursued with the same intensity and preparation as global military conflicts” (Armstrong, 2017a, para. 38; Armstrong, 2017b, para. 23). Whether intentional or not, public diplomacy, a term coined a mere two years after the Orlando Committee’s letter (Gilboa, 2015; Golan & Yang, 2015), can be a means through which states engage in intensive non-military political conflict around the world.

Likening modern state-sponsored international broadcasting to a continuation of the Cold War, former U.S. Broadcasting Board of Governors (BBG) Chair Walter Isaacson was quoted as saying, “In this new struggle, just like in the old one, one of the most important arrows in our quiver will be the power of a free press in promoting democracy and freedom” (2010, p. 3). The exportation and propagation of democratic thought around the globe has often been cited in the United States as a core reason behind its international broadcasting policies (Powers & Samuel-Azran, 2015). Where Isaacson spoke of international broadcasting as a weapon metaphorically, another BBG board member, Edward Kaufman, openly supported international broadcasting as a tool for engaging in “modern media war” (2002, p. 115). Based on the tactical spread of carefully crafted information, public diplomacy, when mediated through ICTs, can very much be about the power-based domination of one way of thinking through a mechanism for information intervention, i.e. information politics (Bruce, Peltu, & Dutton, 1999; Davenport, Eccles, & Prusak, 1992; Jordan, 2015; Pruce & Budabin, 2016). However, a clear distinction must be maintained as to what separates public diplomacy from propaganda, and digital diplomacy from computational propaganda.

Computational Propaganda as Information Operations

While Powers and Samuel-Azran (2015) concede the use of international broadcasting as a form of intervention, titling their chapter “Conceptualizing international broadcasting as information intervention,” they argue that the information warfare perspective is ultimately inaccurate due to the destructive nature of war; as they see it, information intervention, via international broadcasting, only serves constructive ends. Ultimately, however, the terms “constructive” and “destructive” are matters of perspective, and to argue that international broadcasting serves only constructive purposes is to endorse a normative perspective of international broadcasting. The uses of computational propaganda since 2016 alone suggest that information intervention can play a significant social role that is disruptive and deconstructive. As Strömbäck and Kiousis (2011) argue, understanding the true nature of political public relations is a significant challenge due to the descriptive, prescriptive, and normative characteristics embedded in its conceptual framework by a host of scholars and practitioners. Such subjective interpretations are equally present within the conceptual frameworks of both international broadcasting and public diplomacy (Macnamara, 2015).

A further look through history shows instances in which the use of international broadcasting for information intervention served interests in non-constructive ways. Most famously, through what has now been pejoratively coined in the U.S. context as propaganda, Nazi Germany used both radio broadcasting and cinematic production to expand Nazi ideologies (Doherty, 2000). Beyond the rather straightforward broadcasting of state policy, key broadcasting actors used the radio to encourage defection and surrender among Allied forces. Mildred Gillars and Rita Zucca, known collectively as Axis Sally, targeted English-language radio content at U.S. forces in Europe for pro-German and pro-Italian interests, respectively (Dear & Foot, 2001; Lucas, 2010; Richard, 2010). Equally, William Joyce, better known as Lord Haw-Haw, directed similar radio tactics toward the British Isles (Kenny, 2008; Farndale, 2005). While these efforts may have been constructive for the expansion of the Third Reich and its allies, such information intervention could hardly be considered constructive for Europe’s Jewish population or the countries under the control of Nazi Germany, including Poland, Czechoslovakia, Austria, and occupied France, much less the countries still confronting Nazi expansion, such as the United Kingdom of Great Britain and Northern Ireland.

Behaviors such as traditional and computational propaganda are therefore conceived of as information operations, a mechanism for information intervention that sits counter to information politics. Namely, such efforts inject content into social settings in order to disrupt, discredit, or discontinue methods for orderly political discourse and debate. Where the term information warfare is popular in colloquial usage, the concept injects the notion of combative military warfare, which necessitates a combative military response. States, however, have engaged in information operations for as long as civilization has existed. Therefore, within the notion of international relations realism, a space for state intervention that sits below the threshold for military response must be made and defended.

A modern example of such an information operation can be outlined in the struggle to control the narrative of Russia’s annexation of Crimea. In February of 2014, Russian special forces entered the Crimean Peninsula of Ukraine in support of pro-Russian demonstrations following the ouster of the country’s president, Viktor Yanukovych. Through a highly contested referendum, the Crimean state seceded from Ukraine and eventually signed a treaty with Russia to become an annexed member of the Russian Federation. From this point, an armed conflict ensued in the Donbas region of Ukraine between the Ukrainian state and pro-Russian groups backed by Crimea, now a member of the Russian Federation (Sasse, 2017). Ukraine’s new president, Petro Poroshenko, supported pro-West relations, for which the Russian state accused his government of being a beneficiary of Western intervention (Walker, 2016).

Using Russia Today (RT), the Russian state-backed international broadcaster, an information operation was waged to influence public opinion in both Crimea and Donbas in favor of pro-Russian sentiments. Such broadcasts included tales of how Ukrainian troops had killed children via crucifixion (Bazov, 2014), and how children were being taught to kill birds that were the same colors as the Russian flag (Euromaidan, 2015). Further, looking beyond Ukraine itself, the broadcasts echoed other Russian communication policies in how Russia “problematizes American or Western ‘hypocrisy’ and ‘interference’; blames these traits for global instability; and advocates a ‘multipolar’ world as the optimal solution, in which non-Western states such as Russia would balance American power” (Szostek, 2017, p. 382).


Such an example illustrates the extent to which RT, one of Russia’s most prominent public diplomacy tools, engaged in intensive non-military conflict tactics to further the interests of the Russian state through the strategic influence, disruption, and attempted manipulation of foreign public opinion. While this case may be Russian-centric, similar moral and ethical questions have been raised regarding the United States’ Voice of America (Uttaro, 1982) and Radio Free Europe/Radio Liberty (Bischof & Jurgens, 2015; Cone, 1998), China’s CCTV, which has been rebranded as CGTN (Walker, 2016), and Great Britain’s BBC (Rawnsley, 1996). Where public diplomacy and propaganda began as products of radio and migrated to television broadcasting, the digital age has offered these phenomena a new medium through which to reach, and uniquely target, foreign publics.

Purpose

While public diplomacy is a broad, encompassing term, it is a hotbed for debate among both scholars and practitioners. With such a large scope of application, the central meaning of public diplomacy is often abstract and ultimately determined in the eyes of the beholder (Pamment, 2014). With much confusion over public diplomacy’s relationship to other areas of strategic communication and political public relations, such as propaganda, it becomes necessary to tease out the conceptual and practical similarities and differences in the terminology. As such, this research offers a typology of political public relations information intervention through which ICTs are used to influence public opinion, i.e. public diplomacy, propaganda, public affairs, and psychological operations. In doing so, this work seeks to present a framework of telecommunication broadcasting through which public and digital diplomacy can be contextualized as activities of information politics, and traditional and computational propaganda as activities of information operations.


It is important to stress that the moral reality of war is not fixed by the actual activities of soldiers but by the opinions of mankind. That means, in part, that it is fixed by the activity of philosophers, lawyers, and publicists of all sorts. But these people don’t work in isolation from the experience of combat, and their views have value only insofar as they give shape and structure to that experience in ways that are plausible to the rest of us. (Walzer, 2015, p. 15)

With insights from the political public relations literature, a socio-legal investigation of legal frameworks surrounding cybersecurity will contribute to the larger interdisciplinary understanding of Internet-based information politics and operations by positioning ICTs as tools through which states engage in non-combative power struggles of ideological domination, i.e. information intervention. Insofar as this research is concerned, the investigation solely targets unique applications of digital information intervention, that is, behavior which sits adjacent to the often hard-to-locate line demarcating digital diplomacy from computational propaganda. It is behavior along this socially constructed line that raises moral and ethical concerns about the use of international digital broadcasting in the 21st century. In trying to locate and clarify this line of demarcation, this research contributes to a corner of knowledge in the study of public diplomacy, propaganda, political public relations, and strategic communications.

Such an investigation serves to probe the manner in which the Internet, i.e. the newest evolution in ICTs, can be used as a tool of the political system to impact societies at a global level. Consequently, it also shows the extent to which existing laws do not adequately account for such uses of the Internet but may yet offer useful elements for the creation of policy frameworks aimed at curtailing the abuses of computational propaganda. To do this, however, a thorough understanding of technology policy and its relation to the law is necessary to map such a framework onto the field of information intervention.

Most codified law surrounding ICTs, primarily broadcasting, exists at the national level. While this has led to a diverse pool of national broadcasting models across the globe, what such national communication laws fail to do is sufficiently address the international transmission of information, already a reality given the adoption of the Internet and continued globalization. To understand how ICTs operate in the international political arena, their impacts on a global society, and frameworks for confronting such uses, a larger legal jurisdiction is necessary. Thus, one of the unique contributions of this research will be to further the literature by studying a mass communication-based issue in the frameworks of international and supra-regional law, as opposed to the traditionally more explored areas of national legislation.

Interestingly, the scholarly relationship between ICTs and international and supra-regional law is largely nonexistent in the literature. Despite being situated in a field filled with an extensive breadth of international communication scholars, political communication scholars, media law scholars, and technology policy experts, seemingly little scholarship has been interested in the relationship between the use of ICTs for information intervention and legal frameworks existing above the state. In an area of research where both international broadcasting and state behavior are highly regulated by international and supra-regional law, there are too few academics actively pursuing such research, from both the social science and legal studies perspectives.

In a digital age where governmental, nongovernmental, and corporate entities have access to user data from potentially billions of people, and the technological means to use such data to strategically micro-target political messaging transnationally, there is an imperative need to better understand the relationship between 1) the capabilities of the technology, 2) the political uses of such technology, 3) the policy implications of such uses, and 4) the implications such uses will have on global public opinion and public relations response strategies. Therefore, the central focus of this research is to ask, and begin to answer, the question: what does realistic cyber and technology policy look like that seeks to curtail computational propaganda in international and supra-national law?

Conclusion

To discover and better explain the existent relationship between information intervention and laws above the state, this research probes key areas of inquiry. First, the research seeks to explore and functionally demarcate the differences between leading governmental strategies for information intervention, i.e. public diplomacy, propaganda, public affairs, and PSYOP. However, understanding how they differ is of little use if there are not realistic measures for combatting the digital use of information operations. Therefore, this research also explores existing frameworks of international and supra-national law for methods of confronting computational propaganda. Understanding that information intervention is now a well-established behavior taking place on the Internet, this research branches into elements of technology policy to understand how such behavior can be regulated. Understanding the extent to which concerns surrounding information politics are based on the uses and misuses of personal data, a cybersecurity lens is used to determine if existing laws at the international and supra-regional levels apply to the use of strategically targeted information operation campaigns on the Internet.

The following chapters expand on the issues raised in this introduction. Chapter two provides a scholarly context for the basis of this research, exploring such topics as strategic communication, political public relations, strategic narrative, the power of public opinion, and the relationship between soft and sharp power as they emulate the differences between digital diplomacy and computational propaganda. Chapter three expands on the methodological framework through which this research is approached, exploring the basis for socio-legal research and the grounded theory means through which such research can be approached, namely the qualitative methods of in-depth expert interviews and a textual analysis of legal documents. Chapter four reports the findings of such areas of inquiry, while chapter five offers a synthesis of these findings, contextualizing their place and value in the scholarly literature, as well as offering insight into how such findings can make tangible contributions to policy circles interested in the use of the Internet as a tool for engaging in information intervention.


CHAPTER 2 LITERATURE REVIEW

Strategic Communication

It should first and foremost be understood that information intervention is strategic communication. As defined by Hallahan, Holtzhausen, van Ruler, Verčič, and Sriramesh (2007), strategic communication consists of any organizational entity using deliberate and purposeful communication with the intent to achieve predetermined outcomes, i.e. a mission. Further, they argue that key elements of engaging in strategic communication “include, but are not limited to, audience analysis, goal setting, message strategy, channel choice, and program assessment” (Hallahan et al., 2007, p. 5). This list of elements incorporated into the functioning of strategic communication is important because it suggests proactive intent behind organizational behavior. It is only through the purposeful, direct, and sustained expenditure of resources that an organization can 1) develop end-goal outcomes, 2) determine, target, and observe key audiences, 3) develop an effective message strategy to be delivered through an appropriate medium, 4) deploy the packaged content, and subsequently, 5) analyze audience responses to determine the effectiveness of the overall strategy. This multi-step process requires a substantial expenditure of financial resources and manpower; thus, it can be concluded that the use of strategic communication is the product of purposive intent aimed at procuring some kind of benefit or interest for the organization, the perceived value of which is determined to be equal to or greater than the combined value of the resources spent.

As established in the scholarship, strategic communication is primarily representative of six independent communication disciplines: management communication, marketing communication, public relations, technical communication, political communication, and information/social marketing campaigns (Hallahan, 2004; Hallahan et al., 2007). Set in the context of strategic communication, political communication is defined as building “political consensus or consent on important issues involving the exercise of political power and the allocation of resources in society. This includes efforts to influence voting in elections as well as public policy decisions by lawmakers or administrators” (Hallahan et al., 2007, p. 6). Interestingly, their definition of public relations concerns the construction and use of relationships with key publics, both by and with government officials. Thus, Hallahan et al. (2007) indirectly suggest a mutual ground between the areas of public relations and political communication, i.e. a sub-discipline specialized around the area of political public relations.

Political Public Relations

While the term political public relations may not be commonly used in the literature, it is the central focus of Strömbäck and Kiousis’ (2011) edited book, Political Public Relations: Principles and Applications. As they define it,

Political public relations is the management process by which an organization or individual actor for political purposes, through purposeful communication and action, seeks to influence and to establish, build, and maintain beneficial relationships and reputations with its key publics to help support its mission and achieve its goals. (Strömbäck & Kiousis, 2011, p. 8)

This understanding of political public relations very succinctly merges the definitional elements of strategic communication, public relations, and political communication into a single organization-driving stratagem. Thus, political public relations is the process whereby an institutional entity engages with key publics in order to diffuse tactically calculated messaging packages, with the aim of influencing public sentiment in ways that build or deter favorability toward key policy initiatives or election outcomes, which, in the end, serve as political gains or victories for the institutional entity.

While such a conceptual framework as political public relations may serve as a long-term strategy for organizational entities, it is larger in scope and better fitted to winning wars for public perspective, i.e. public opinion, than to more situational narrative battles. What is needed for the more microscopic approach to political public relations, especially at the international level, is a tactic-driven formula for engaging in more frequently occurring struggles or skirmishes to set the narrative. Where mass communication research suggests that media effects are positively correlated with the frequency and consistency of messaging (Chaffee & Roser, 1986; Okadar, 2017; McQuail, 2010; Weaver, 1980), political public relations is the strategy by which public opinion is cultivated over time, and strategic narratives are the tactics by which this is achieved on a frequent and consistent basis.

Strategic Narrative

Miskimmon, O’Loughlin, and Roselle (2013) coined the concept in the field of international relations to explain a significant communication-based portion of inter-state politics in the 21st century.

Strategic narratives are a means for political actors to construct a shared meaning of the past, present, and future of international politics to shape the behavior of domestic and international actors. Strategic narratives are a tool for political actors to extend their influence, manage expectations, and change the discursive environment in which they operate. (Miskimmon et al., 2013, p. 2)

Most importantly, however, Miskimmon et al. (2013) acknowledge that the core function of strategic narratives is to influence the behavior of those who receive them. As Freedman argues, “Narratives are designed or nurtured with the intention of structuring the responses of others to developing events” (2006, p. 22). Thus, narratives are the linguistic means by which people process and understand reality, and to diffuse strategically constructed narratives to a mass public is to intentionally try to influence, as a means to control, public perception of key events and, subsequently, public opinion. The vested interest in strategically influencing key publics so as to bring about desired political outcomes very clearly situates strategic narratives as a prime tactic in the field of political public relations.


In essence, a strategic narrative is nothing more than a specific framed account of events. As Entman (1993) argues, framing is about the selection and salience of key pieces of information in the overall construction and projection of a narrative. While frames are often studied in scientific and experimental vacuums, Chong and Druckman (2007), as well as Borah (2011), suggest that frames never exist in isolation. Rather, frames operate in a social information ecology, with multiple accounts of events competing for diffusion and eventual acceptance as the dominant interpretation. Therefore, strategic narratives fit within the scope of strategic communications not only for their maximized capability of influencing target audiences, but also for their maximized ability to outperform their competitors.

This notion of frame competition, that narratives must not only influence their target audience but also compete against rival narratives, is also a universal trait of political public relations. At its most fundamental level, politics is always divided along a power divide: those in power and those not in power. To varying degrees of complexity, political ideology, party affiliation, and ethnic/national identity only serve to add to the number of groups with vested interests in the political realm. As argued by the United Kingdom Ministry of Defence,

In the global information environment, it is very easy for competing narratives to also be heard. Some may be deliberately combative—our adversaries for example, or perhaps hostile media. Where our narrative meets the competing narratives is referred to as the battle of the narratives. (2012, p. 2-11)

Beyond the United Kingdom, similar thoughts on strategic narrative and strategic communication have been echoed by other international actors, such as Japan (Aoi, 2017), Russia (Fridman, 2017), Mali (de Orellana, 2017), and Yemen (Farwell, 2017), and even non-state actors such as NATO (Schindler, 2016) and the Islamic State (Verdon, 2016). In a time when global public opinion tends to look negatively on armed conflict, states must find proxy mechanisms to challenge their opponents, and an avenue such as information intervention is a very much realistic, and regularly utilized, method to influence and manipulate the power of public opinion.

The Power of Public Opinion

Power

Understanding power is critical to understanding politics. As E. H. Carr (2001) originally asserted in 1939, politics is fundamentally a power struggle, i.e. a conflict of power between a minimum of two entities. But what is power? Nye (2004) suggests that power is the capability to achieve or secure the outcomes intended by the entity holding the power. In the same light, Dahl argues that power is the ability to impose one’s will on another: “A has power over B to the extent that [A] can get B to do something that B would not otherwise do” (1957, pp. 202-3).

Conversely, Mearsheimer (2014) rejects such a definition, favoring rather a perspective of power based on the quantifiable nature of material resources. Such a perspective of power is, however, ultimately flawed. This is addressed in a second definition of power offered by Nye, which combines elements of his first definition with thinking similar to what Mearsheimer penned some ten years later. Power is “the possession of capabilities or resources that can influence outcomes” (Nye, 2004, p. 3). Such a definition concedes that power is the ability to impose will on an ultimately less powerful entity, such that outcomes favorable to the more powerful entity are achieved. This process, however, is based on, if not driven by, the cumulative resources and assets available to the more powerful entity.

While Mearsheimer’s (2014) perspective on power does contribute to a functional definition, his understanding of power is narrow. The perspective that power is a product of military assets and economic resources alone is ultimately flawed. While autonomous military power is vital to a state’s ability to thrive, as is a sizable economy capable of supporting a competitive military force, such a perspective lacks the explanatory mechanism to account for the historical reach of the Catholic Church, namely its ability to consistently raise non-Italian armies across Europe, to eventually raise funds to support the extravagance of the Holy See, and to export Christendom to the New World. In a more modern context, Mearsheimer’s perspective fails to explain the recent international influence of Qatar. Both the Catholic Church and Qatar maintained another kind of power, beyond military and economic power alone, something that is significantly more difficult to quantify: the means to influence public opinion.

Public Opinion

Lippman (1957) proposed that human beings are significantly limited creatures; among our physical limitations is a lack of ability to be everywhere and observe everything that transpires. Thus, what humans do to understand the world is create imaginative portraits of events in their minds. While humans have an ability to travel certain distances, events within that relative distance are prone to be reflected more accurately in the human mind due to personal observance or being informed by acquaintances who observed the event. However, events taking place over the span of great distances require more of a person’s imagination to be cognitively portrayed, processed, and understood. This is what Lippman refers to as “the world outside and the pictures in our heads” (1957, p. 3). The way in which individuals cognitively process these images and then construct attitudinal perspectives is what constitutes private opinion.

Conversely, “those pictures which are acted upon by groups of people, or by individuals acting in the name of groups, are Public Opinion” (Lippman, 1957, p. 7). Thus, public opinion is the culmination of communication effects, i.e. a process whereby the transmission of information brings about cognitive, attitudinal, and/or behavioral modification. The extent to which this communication process is mediated by ICTs then becomes the definition of mass media effects, again bringing about some kind of cognitive, attitudinal, and/or behavioral modification (Potter, 2012).


The value of public opinion to the fields of mass communication and political science cannot be overstated (Glynn, Herbst, Lindeman, O’Keefe, & Shapiro, 2015). In a democracy where the public partakes directly in governance, or in a republic where the public elects officials to serve the public interest through governance, it would be counterproductive not to be aware of the interests and sentiments of public opinion (Clawson & Oxley, 2017). Equally, as public opinion is formed through influence, a process carried out through both interpersonal and mediated relationships between the public and an organizational entity, the public opinion process falls precisely at the intersection of public relations and political science, i.e. political public relations as conceived of by Strömbäck and Kiousis (2011). In Westphalian democratic models, it becomes of particular interest to understand the depth to which influencing public opinion relates to power.

E. H. Carr’s (2001) classic book The Twenty Years’ Crisis, 1919-1939 serves as one of the founding texts of modern international relations theory. In it, Carr argues that states expend great energy and resources to influence and mold public opinion to ultimately be an asset to the state. States do this in the realist light of building a strong domestic front from which the state can then exert greater influence and power abroad. He ultimately argues that public opinion is not separate from, or independent of, the state.

Just as democratic governments have been compelled to control and organize economic life in their territories in order to compete with totalitarian states, so they find themselves at a disadvantage in dealing with these states if they are not in a position to control and organise opinion. (Carr, 2001, p. 142)

Thus, as a state recognizes public opinion as a significant source of power, realist thought guides such states to engage in measures that make public opinion a resource to its interests rather than a barrier.


To that end, Carr does acknowledge that no state has complete control over public opinion. Where Hitler asserted that the only limitation to propaganda’s effectiveness was a finite amount of fiscal resources, Carr disagrees. He suggests that two key elements undermine strategic overtures to public opinion: 1) a lack of consistency between words and actions, and 2) the fact that long-term control of public opinion creates an environment of suppression against which human nature will eventually revolt (Carr, 2001). These caveats suggest not that states should ignore the limits of their power over public opinion, but rather that state influence over public opinion is a continuous process of meeting and engaging with the public to keep them in support of the state’s interests as much as possible.

It is only through understanding Carr that one can truly view information politics and information operations as tools of information intervention. Where Carl von Clausewitz places public opinion as a central variable in the political functioning of a society (Howard & Paret, 1976), Carr (2001) suggests that a state will engage in whatever means are available to consistently conform domestic public opinion to state policy. Information intervention, therefore, is a means by which a foreign state attempts to insert its own ideological and/or cultural perspectives into another state’s influence over its domestic public opinion. By influencing public opinion in ways that are contrary to the people’s own governing state, information intervention attempts to weaken, if not break, a state’s influence over its own people, thus disrupting that country’s internal functioning in ways that weaken the state’s ability to project power and influence abroad. Information politics achieves this by injecting content that sparks political discussion and debate, whereas information operations injects content to destabilize political discussion and debate already in progress.


To put this in terms more equivalent with political science and military studies, information intervention, namely its broadcasting element, consists of a wide range of ICT assets that, as tangible and quantifiable resources, facilitate the ability of one state or actor to inject, and attempt to impose, its political and cultural ideology on foreign civilian populations, i.e. non-combatants, without invitation. The goal of such efforts is to influence the sentiment of foreign non-combatants so as to propagate directed sentiment among them towards their own government (positive or negative, depending on the engaging actor’s relationship with the foreign state). From that point, the hope is that the foreign non-combatants become non-combative quasi-insurgents who interfere in the normal functioning of that state, in turn weakening the political power of the state. Thus, information intervention, in this light, can realistically be viewed as an attempt at power domination by one actor, through strategic behavior designed to compel a foreign state to submit to the will of the engaging actor.

The use of an ICT as a weapon in a non-combative power struggle can be articulated through the United States’ use of RFE/RL during the Cold War. Ignoring, for the purposes of this argument, the proxy wars and struggles that took place around the world between pro-democratic forces backed by the United States and pro-communist forces backed by the Soviet Union, the Cold War is characteristically defined as cold because the United States and the Soviet Union did not directly engage in combative warfare against each other. This is not to suggest, however, that an ideological struggle did not take place; under Carl von Clausewitz’s concept of warfare (Howard & Paret, 1976), both the United States and the Soviet Union engaged in various measures to try and force each other to submit to the will of the other.

Uniquely, the United States projected Radio Free Europe into eastern Soviet bloc states while Radio Liberty projected into the Soviet Union itself. The goal of such efforts was to broadcast information packages that elicited pro-United States sentiment and, conversely, turned Russians and other citizens behind the Iron Curtain against the policies set out by Moscow and the Communist Party (Bischof & Jurgens, 2015; Cone, 1998). Indeed, U.S. information intervention has been touted as one of the key tools with which the Cold War was waged, and is often cited as one of the leading factors in the downfall of the Soviet Union (Critchlow, 2006; Cull, 2008; Dale & Lord, 2007).

Interestingly, it should be noted that the Radio Free Europe and Radio Liberty broadcasts were conjointly developed and employed by a wide variety of institutional and individual actors in the United States. Though information intervention strategy via international broadcasting was overseen by the USIA during this time, primary funding for the radio operations was provided by the CIA (Prados, 2009). Further, civilian resources were utilized in the planning and execution of the broadcast strategy and content, including inputs from Joseph Grew, a former U.S. ambassador; DeWitt Wallace, then-owner of Reader’s Digest; DeWitt Clinton Poole, founder of the academic journal Public Opinion Quarterly; and Frank Altschul, a prominent New York banker (Holt, 1958; Puddington, 2003; Weiner, 2008).

While similar levels of societal mobilization were employed to support earlier war efforts, namely the manner in which Joseph Goebbels used propaganda in the service of the Third Reich, this concept is not unique to World Wars I and II alone. In the mid-2000s, the concept of the Gerasimov Doctrine began to grow in scholarly literature. The concept, named after the Chief of the General Staff of the Armed Forces of Russia, General Valery Gerasimov, reflects the idea of full societal mobilization to support the needs of a state: “It’s all one war machine. Military, technological, information, diplomatic, economic, cultural, criminal and other tools are all controlled by the state and deployed toward one set of strategic objectives” (McKew, 2017, para. 9).

This manner of engaging in a mixture of both combative and non-combative warfare has frequently been referred to as Russia’s hybrid war (Banasik, 2017). But, as mentioned, this concept was prominent in military strategy stretching back to 19th and 20th century Europe. What’s more, the so-called creator of the doctrine, Mark Galeotti of the Centre for European Security, admits that he coined the name “Gerasimov Doctrine” based on a speech General Gerasimov gave in 2013, and that, to the best of his knowledge, no such doctrine actually exists in Russian policy (Galeotti, 2018). Despite such a claim, Gerasimov’s thoughts presented in the speech offer a lens for the study of information intervention and its broadcasting element in the 21st century. Concerning the “use of propaganda and subversion” (Galeotti, 2018, para. 3) in the “actual tasks of military science” (Gerasimov, 2013, para. 2), Gerasimov outlined how the events of the Arab Spring illustrated the disastrous effects of information intervention and ICTs on regional, national, and civil stability.

In the 21st century, there is a tendency to erase the differences between the state of war and peace. Wars are no longer declared, and when they begin, they do not follow the pattern that we are accustomed to. The experience of military conflicts, including those related to the so-called color revolutions in North Africa and the Middle East, confirms that a completely prosperous state in a matter of months or even days can turn into an arena of fierce armed struggle, fall prey to foreign intervention, plunge into the abyss of chaos, humanitarian catastrophe and civil war … The very ‘rules of war’ have changed. The role of nonmilitary means of achieving political and strategic goals has grown, and, in many cases, they have exceeded the power of force of weapons in their effectiveness … The focus of applied methods of conflict has altered in the direction of the broad use of political, economic, informational, humanitarian, and other nonmilitary measures — applied in coordination with the protest potential of the population. (Gerasimov, 2013, para 3-6)


This interest in non-combative information intervention is also explored by Chekinov and Bogdanov (2010). Their conclusions suggest that such non-traditional warfare is the growing norm of the 21st century:

asymmetric operations are characterized by qualitative differences in employing new (nontraditional) means of armed struggle and forms and methods of waging it, yet are close in content to the strategy of indirect operations. Asymmetric measures include implementing measures to induce apprehension in an opponent’s most vulnerable military assets … The strategy of indirect operations is characterized by the multiplicity of forms and methods of operations, including the conduct of information and remote (noncontact) confrontations. (Thomas, 2018)

Understanding that Russia Today (RT) is the state-sponsored international broadcaster of Russia, how then should its involvement in the Crimea and Donbas regions of Ukraine be interpreted? As happenstance broadcasts incidentally propagating pro-Russian sentiment? Or as strategic collaboration across a host of Russian government, military, and civilian sectors to support the foreign policy interests of the Russian Federation in a direct power conflict with both the Ukrainian government and its Western supporters? If the latter, then information intervention, via information operations, and indirect military conflict indeed seem to be linked.

The use of information as a means to exert national power abroad is well appreciated and accepted in defense circles. Often a DIME model is used to conceptualize a framework of national power, with diplomacy, information, the military, and economics as the drivers of such power (Hillson, 2009; Howard, 2012; Kozloski, 2009). The U.S. DoD prefers the PMESII model, which places political, military, economic, social, informational, and infrastructural elements as the drivers of national power (Hartley, 2015; Joint Publication 5-0, 2017; McDonnell, 2009). Hartley (2015) refers to this notion of using information as a means to exert one’s power as unconventional conflict. Via either model, the use of information as a means of exerting power in unconventional conflict thus constitutes information intervention.

However, it is important to note that this research distinguishes between two types of information intervention: information politics and information operations.

Information politics is conceptualized to be the extent to which information is used to induce key outcomes in a political setting through the instigation of political speech and discourse (Bruce et al., 1999; Davenport et al., 1992; Jordan, 2015; Pruce & Budabin, 2016).

Pfeffer (1992) defines politics as the process through which one uses power to exert influence and control over others. Information politics is thus conceived as the manner in which information is used as a source of power to bend and/or conform the will of others to the desired outcomes of the entity that wields such power, through the fostering of strategic political debate.

To the contrary, information operations is conceived to be the extent to which ICTs and other digital assets are used to “influence, disrupt, corrupt, or usurp the decision making of adversaries and potential adversaries” (Joint Publication 3-13, 2012, p. ix). Where von Clausewitz equated war with the continuation of politics by other means (Howard & Paret, 1976), this research posits that information intervention can be a continuation of conflict by other means, engaged in through mechanisms of either information politics or information operations.

Understanding that information intervention attempts to exert power in non-combative, i.e. less hard, ways ultimately leads back to the foundational concept of soft power, as well as to a relatively new member of the power typology, sharp power.

Soft and Sharp Power

Joseph Nye (1990) famously introduced the concept of soft power as a means to articulate power types in a post-Cold War world. His initial work set out to articulate the differences between hard and soft power. For Nye (2004), hard power consists of the means to exert one’s will by acquiring affinity through fear and threats (military or economic sticks) or through purchase or inducement (military or economic carrots). To the contrary, soft power consists of the means to exert one’s will through an affinity or attraction willingly given out of admiration for, and possible emulation of, key characteristics or traits.

Indeed, as Nye suggests, “Soft power rests on the ability to shape the preferences of others” (2004, p. 5). But, as he goes on to say, persuasion and influence are mere pieces of soft power and do not rest at its core. Autonomous acceptance, which is willingly given rather than sought, is the true nature of soft power. Influence which is sought and desired is thus a perversion or distortion of the true nature of soft power.

Only through intimate understandings of hard and soft power capabilities can smart power be achieved. Smart power is the strategy of balancing the cons of one type of power with the pros of the other. Initial criticism of soft power held that in an international system defined by anarchy, i.e. with no supreme authority to govern the system, soft power is meaningless in the face of a state that chooses to employ hard power tactics. This is indeed true. Soft power is a normatively liberal concept, and “winning hearts and minds” (Farwell, 2012, p. 49) does little to effectively defend against military force. However, hard power also has its limits. Dictatorships and autocratic regimes are seldom popular and, over extended periods of governance, often face rebellion and revolution. Thus, blending the advantages of both hard and soft power creates an array of options for engaging with fellow states and/or their civilian populations that most optimally employs power resources (Nye, 2009). Smart power is not unlike good cop/bad cop tactics, blending elements of fear and attraction to produce the most desired results.

But again, the concept of smart power is typically viewed in a utopian light. Rooted in the ideal nature of soft power, smart power is typically conceived of as serving constructive ends. What must be asked, however, is to what extent soft and/or smart power can serve foreign policy interests that are less than constructive. Such a question has begun to be explored, and the answer rests in the concept of sharp power. First introduced by Walker and Ludwig (2017), sharp power mirrors soft power in that its tactics make use of resources outside the scope of military and economic means, but the overall strategy is significantly different, making use of psychological coercion, or manipulation, to achieve the desired results.

Influence efforts are ‘sharp’ in the sense that they pierce, penetrate, or perforate the information environments in the targeted countries … These regimes are not necessarily seeking to ‘win hearts and minds,’ the common frame of reference for ‘soft power’ efforts, but they are surely seeking to manage their target audiences by manipulating or poisoning the information that reaches them. (Walker & Ludwig, 2017, p. 13)

Indeed, Johnson adds that sharp power, “mixes in coercion, bribes and deception to suppress discussion of unwanted topics, or to discredit foreign leaders and to subvert the population of an adversary power” (2017, para. 5).

In his own grappling with the conceptual rise of sharp power, Nye defines the concept as “the deceptive use of information for hostile purposes” (2018, para. 7). Interestingly, he went further, openly acknowledging the extent to which the United States has employed sharp power, primarily during the Cold War. Most importantly, Nye notes that such sharp power behavior is hardly new; it is characteristic of an anarchic international political arena. What he does suggest is new, however, is the speed at which information can be transmitted and the way such information can be diffused via social networks. In his eyes, it is the Internet, and applications such as social media, that drastically differentiate the sharp power tactics of the 21st century from their predecessors.

What is still necessary, however, is to differentiate between soft power and sharp power in practice. Where the framework of information politics in this research is conceived as being founded on the principles of soft power, information operations is conversely viewed as being founded in the perspective of sharp power. However, linking these conceptual frameworks to practice is tediously difficult, namely because of social misunderstanding and confusion. Shortly after the 9/11 terrorist attacks in the United States, Richard Holbrooke, writing in The Washington Post, articulated this lack of conceptual clarity precisely, showing the public’s inability to perceive the nexus of strategic communication and political public relations.

Call it public diplomacy, or public affairs, or psychological warfare, or -- if you really want to be blunt -- propaganda. But whatever it is called, defining what this war is really about in the minds of the 1 billion Muslims in the world will be of decisive and historic importance. (2001, para. 1)

While Holbrooke asserts that there are little more than semantic differences across the communication strategies of information intervention, this assertion does not reflect the unique characteristics that distinguish information politics from information operations. What is necessary is a research approach that teases out the practical differences between methods of information politics, i.e. public and digital diplomacy, and similar methods of information operations, i.e. traditional and computational propaganda. Further, beyond a clarified understanding of such differences, methods for countering the effects of computational propaganda, as a tactic of sharp power, must be explored.

Theoretical Framework

As with any sound research, a theoretical framework is necessary to benefit the quality of the research and its ability to contribute to future research and thus, ideally, to help move knowledge forward, to speak in a positivist sense. Shoemaker, Tankard, and Lasorsa suggest that theory “is simply one’s understanding of how something works” (2003, p. 6). To move beyond such a parsimonious definition, McQuail defines theory as “a general proposition, itself based on observation and logical argument, that states the relationship between observed phenomena and seeks to either explain or predict the relation” (2010, p. 5). Where the fields of mass communication, political science, and legal studies offer vibrant depths of theoretical background, they do so within the conceptual boundaries of their respective fields. Where this research is informed by the backgrounds of public relations, international relations, and law, no one theory from any field offers a wide enough scope for the convergence of ideas and questions posed in this research. Therefore, a larger and more exploratory framework is needed, such as the grounded theory approach. The following section outlines the theoretical contributions of grounded theory to this study.

Grounded theory is a qualitative approach to scientific research which has dominated qualitative approaches in the social sciences for a number of decades (Morse, 2009; Wiesche, Jursich, Yetton, & Krcmar, 2017). One of the driving benefits of grounded theory is the extent to which the approach relies on inductive observations of the real world, i.e. insights grounded in direct accounts of how the social world realistically operates; thus, grounded theory is an ideal approach to the study of social phenomena that are underdiscussed in the literature and understudied in existing research (Fernandez, 2004; Lehmann, 2010; Seidel & Urquhart, 2013; Wiesche et al., 2017). With an emphasis on allowing the observable world to direct the scope of the research, grounded theory’s central focus is to understand the social world and attempt to offer explanatory accounts for how and why the social world operates as it does (Bowen, 2006).

While grounded theory research has no driving theoretical framework to guide the scope of the investigation, it attempts rather to offer scientific rigor through a systematic and cumulative study of relevant artifacts and processes (Bowen, 2006). Therefore, grounded theory involves “the discovery of theory from data systematically obtained from social research” (Glaser & Strauss, 1967, p. 1). It is through this systematic and reflective understanding of the social world that theory then emerges, i.e. emergent theory as derived from a grounded theoretical approach (Glaser & Strauss, 1967).

It is important to note that grounded theory is not without its shortcomings. As an initial and exploratory area of research, the direction from which to begin investigating is decided by the informed, yet ultimately subjective, background of the researcher: “The researcher chooses any groups that will help generate, to the fullest extent, as many properties of the categories as possible, and that they will help relate categories to each other and to their properties” (Glaser & Strauss, 1967, p. 49). Further, as Glaser and Strauss note, “The evidence may not necessarily be accurate beyond a doubt … but the concept is undoubtedly a relevant theoretical abstraction about what is going on in the area studied” (1967, p. 23).

Where grounded theory is pertinent for areas of research that are underaddressed and underinvestigated, it offers a particular contribution to areas that most crucially and/or most fundamentally affect the social world and those who live in it. As such, grounded theory “is a methodology that seeks to construct theory about issues of importance in peoples’ lives through data collection often described as inductive in nature where the researcher has no preconceived ideas to prove or disprove” (Mills, Bonner, & Francis, 2006, p. 2-3). The study of public diplomacy behavior on the Internet, and its relation to IL and cybersecurity issues, is a topic that directly affects any person connected to the Internet, i.e. billions of people. Therefore, to understand how any movement towards regulation and protection must proceed, an understanding of how both information intervention and broadcasting currently stand in the international legal system is necessary.

Research Questions

As information intervention is a behavior engaged in by states and non-state actors in the international arena, it would be understandable to assume that such acts would fall under the jurisdiction of IL. However, the relationship between information politics (i.e. public and digital diplomacy), information operations (i.e. traditional and computational propaganda), and IL is largely non-existent in scholarly literature. As Carr (2001) outlines, international law addressing propaganda began in the late 1800s and early 1900s in the form of bilateral treaties of non-aggression between states. Kearney’s (2007) work further explores the use of tribunals following World War II to establish judicial precedent regarding acts of propaganda for war. However, this effectively leaves all other forms of propaganda not meant specifically for war in a grey middle ground.

The stance of public diplomacy in the eyes of IL is just as ambiguous, if not more so. In Gilboa’s (2008) landmark article Searching for a Theory of Public Diplomacy, he outlined substantial gaps in the public diplomacy literature as they existed at the time, including the areas of “international exchanges and nation branding; and NGOs, corporate, cyber, and Diaspora public diplomacy and use of international law” (2008, p. 73). One decade after the publication of Gilboa’s (2008) article, substantial levels of public diplomacy scholarship have been published on international exchanges (Hartig, 2011; Hayden, 2009; Iwabuchi, 2015; Trilokekar, 2009; Yun & Vibber, 2012), nation branding (Fan, 2010; Pamment, 2014; Pamment, Olofsson, & Hjorth-Jenssen, 2017; Potter, 2009; 2018; Rasmussen & Merkelsen, 2012; Szondi, 2010), NGO diplomacy (La Porte, 2012; Pamment, 2013; Saari, 2014; Sharma, 2010; Yang, Taylor, & Yang, 2014), corporate diplomacy (Henisz, 2017; Jackson & Dawson, 2017; Kochar, 2018; Kochar & Molleda, 2015; Ordeix-Rigo & Duarte, 2009), cyber and digital diplomacy (Bjola, 2015; Bjola & Jiang, 2015; Hallams, 2010; Kampf, Manor, & Segav, 2015; Zaharna & Uysal, 2016), and diaspora public diplomacy (Akçapar & Bayraktar Aksel, 2017; Ho & McConnell, 2017; Manor, 2016; Murti & Zaharna, 2014; Thunø, 2017). Yet, no literature could be located that discusses public diplomacy in the framework of, or even in relation to, IL. In a keyword search for the terms public diplomacy and international law, the databases Fastcase, LexisNexis Academic, Academic Search Premier, JSTOR, Google Scholar, and ProQuest’s Social Science Database produced zero published works addressing the topic of public diplomacy from an international legal perspective, or vice versa.

Overall, it must be asked why the mechanisms for information intervention, i.e. information politics and information operations, have been so ignored in the literature, specifically as they relate to IL. The larger scope of mass communication scholarship offers an enormous array of expert scholars in both domestic and international political communication, communication law, and technology policy; surely the field possesses the necessary assets, resources, and expertise to recognize the value in understanding the use of information intervention in IL. This research aims to be a meaningful attempt to clarify the demarcation between information politics and information operations, and to place behavior such as computational propaganda in a legal context from which policy options can be formulated.

Demarcating Propaganda and Public Diplomacy

While the core intent of this research is to explore mechanisms for tackling computational propaganda, such work cannot be achieved without understanding how and where computational propaganda differs from digital diplomacy. Interestingly, as described earlier in this literature review, the use of computational propaganda in the 2016 U.S. presidential election disseminated what has been viewed as malicious content through the same framework that RFE/RL used to promote pro-U.S. sentiment. While an objective perspective would conclude that such behaviors were indeed distinct from each other, the nature of how social media platforms function, i.e. their telecommunication infrastructure, only serves to further cloud the differences between computational propaganda and digital diplomacy. To understand how these mechanisms for information intervention differ in the digital world, a better understanding of their predecessors is necessary. Indeed, the behaviors of digital diplomacy and computational propaganda are not new; they are merely old tactics set to new technologies with new capabilities for dissemination. Therefore, to demarcate digital diplomacy from computational propaganda, it is critical to understand what demarcates public diplomacy from propaganda.

The term propaganda first appeared in applied use in the Catholic Church (Carr, 2001; Kearney, 2007). In 1622, Pope Gregory XV ordered the convening of the Sacra Congregatio de Propaganda Fide, translated into English as The Sacred Congregation for the Propagation of the Faith. The purpose of this council was to devise improved tactics for the evangelization of the Catholic faith, namely in response to the Holy See’s counter-offensive against the Protestant Reformation, as well as the growing exploration and colonization of the New World (Handy, 2016; Bernays & Miller, 2005). Pratkanis and Aronson define propaganda as,

mass ‘suggestion’ or ‘influence’ through the manipulation of symbols and the psychology of the individual. Propaganda involves the dexterous use of images, slogans, and symbols that play on our prejudices and emotions; it is the communication of a point of view with the ultimate goal of having the recipient of the appeal come to ‘voluntarily’ accept this position as if it were his or her own. (2001, p. 11)

Jowett and O’Donnell (2012) note the manner in which the activity is organized and methodical. They view propaganda as “the deliberate and systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve the desired intent of the propagandist” (Jowett & O’Donnell, 2012, p. 7). This notion of the systematic persuasive transmission of information places propaganda well within the scope of strategic communication. They also note, however, that it is deliberate.

This falls in line with what Taylor (1990) surmised: that intent is a key element behind the methodical strategy of propaganda. Seagren and Henderson (2017) likewise define the concept, but go further in noting the method through which persuasion is sought.

We begin with a definition of propaganda as ‘the deliberate attempt to persuade people to think and behave in a desired way’ (Taylor 1995, 6, emphasis in original), to which we add, ‘using means that involve either selective information or outright deception.’ (Seagren & Henderson, 2017, p. 69)

It was not until the 19th century that propaganda began to take on a pejorative connotation in certain parts of the world. Following World War II, the term was so tainted in the eyes of U.S. public opinion that in 1965 former ambassador Edmund Gullion took a move out of the public relations playbook and attempted to rebrand the act of propaganda under a new label that would look significantly more inviting for Congressional funding; thus was born public diplomacy (Cull, 2006; 2009b).

Gullion defined public diplomacy as “the means by which governments, private groups and individuals influence the attitudes and opinions of other peoples and governments in such a way as to exercise influence on their foreign policy decisions” (Edward R. Murrow Center of Public Diplomacy, 2018, para. 3). Such a definition denotes the possibility that public diplomacy behavior can be enacted by a wide range of institutional and/or individual actors. Where traditional diplomacy was the behavior of states alone, public diplomacy offers the reins of shaping and influencing a state’s public image to a wide host of actors. Professor of Diplomatic History Alan Henrikson defined public diplomacy as “the conduct of international relations by governments through public communications media and through dealings with a wide range of nongovernmental entities … for the purpose of influencing the politics and actions of other governments” (Edward R. Murrow Center of Public Diplomacy, 2018, para. 3). Such a definition contributes to public diplomacy by asserting that the targets of such tactics, i.e. the receivers of the messages, are not government entities. Where the end goal of public diplomacy is to effect some kind of change in a foreign government, the means through which such change is achieved is the influence, and potential manipulation, of public opinion in said country.

To understand public diplomacy as the extent to which one entity (organizational institution or an individual) manages self-imagery in the lens of public opinion is to recognize public diplomacy as a function of public relations. Indeed, a host of scholars have explored the relationship existing between public relations scholarship and public diplomacy (Dodd & Collins, 2017; Fitzpatrick, 2007; Gilboa, 2008; Kochhar & Molleda, 2015; Lee & Lin, 2017; L’Etang, 2009; Manheim, 1994; Mogensen, 2017; Molleda, 2011; Özdora & Molleda, 2014; Signitzer & Coombs, 1992; Signitzer & Wasmer, 2006; Tam, Kim, & Kim, 2018; Wang & Chang, 2004; Vanc & Fitzpatrick, 2016; Yun, 2006; Yun & Toth, 2009). Specifically, the management role of relations with the public is one of the key distinctions between public diplomacy and traditional diplomacy.

Hartig suggests that traditionally, “diplomacy is concerned with advising, shaping and implementing foreign policy by conducting negotiations and maintaining relations between the various actors in the international arena” (2016a, p. 33). However, this range of “actors in the international arena” is relegated to state actors and diplomats alone. “Traditional diplomacy is secret, formal and interpersonal” (Gilboa, 2015, p. 1). Public diplomacy is unique in that it intentionally shines a light on the institutionalized power of diplomats with elite pedigrees, conversing in the private back rooms of foreign ministries, and invites the public to take part in the process. Whether such an offer is realistic and genuine or merely a formality to acknowledge the power of public opinion is still up for debate.

However, no matter how many definitions of propaganda and public diplomacy may be provided, such definitions tend to be conceptually normative. That is to say, they often describe the most socially recognized forms of the behaviors in ways that conform to the world view of those doing the defining. What such definitions fail to do is offer practical characteristics of each behavior that can stand as criteria for the evaluation of a persuasive message, i.e. is this propaganda or is this public diplomacy? Having a definitive ability to distinguish content as propaganda or public diplomacy, i.e. as a mechanism of information operations or of information politics, is key to being able to distinguish computational propaganda from digital diplomacy.

Therefore, this study probes the following question:

RQ1: How is public diplomacy functionally different from propaganda?

International Law and Digital Broadcasting

This research asks certain questions as to the place of computational propaganda tactics as one of the many mechanisms for information intervention, and such research would be significantly limited without an understanding of the socio-legal backdrop of IL that makes such behavior possible. Therefore, the following section will address the literature and derive directions of inquiry based on existing legal knowledge surrounding telecommunication technologies.

Where social conversations about IL tend to gravitate toward grandiose issues such as nuclear proliferation, war crimes, and the like, the majority of IL is actually directed toward the micro-functioning of civilization that allows multiple societies around the globe to live and function in relation to each other, telecommunication regulation being one such area (Slaughter, 1993). Originally pioneered as the International Telegraph Union, the now International Telecommunication Union (ITU) serves as the leading organ of the U.N. dedicated to international regulation of broadcasting, including radio, television, and increasingly, the Internet.


As an organ of the U.N., the ITU is capable of crafting IL that is binding on members and treaty signatories; one of the prime examples of this is the 2012 World Conference on International Telecommunications (WCIT-12). The conference’s revised language on the International Telecommunication Regulations attempted to propose elements of how IL could begin to address regulation of the Internet; however, such work proved highly contested, and the conference ultimately concluded with no substantial direction approved on how to move forward in addressing Internet governance. WCIT-12 placed IL in the middle of what Clark, Sollins, Wroclawski, and Braden call a tussle in cyberspace, or quarrels surrounding the functioning of the Internet “that arise among the various parties with divergent interest” (2002, p. 348).

The WCIT-12’s tussle became divided along two general lines of thinking surrounding Internet governance. The first group, led primarily by the United States, favored a decentralized model of Internet governance, stressing non-governmental and private industry collaborations to establish viable options for addressing regulation of the Internet. The second group, led primarily by Russia, favored a governmental model that placed states and supra-national bodies in a more centralized role. The United States and other Western countries argued that such state-centric governance would put censorship abilities in the hands of governments, effectively curtailing freedom of expression (Fidler, 2013).

The WCIT-12’s lack of consensus suggests that positions have become entrenched, making meaningful compromise unlikely. In this context, concerns about a digital cold war and the political balkanization of the Internet and cyberspace are emerging. (Fidler, 2013, para. 21)

This digital cold war is divided between what Fidler (2016) refers to as the Internet Freedom and Internet Sovereignty camps, led primarily by the United States and Russia respectively. Indeed, human rights, and the element of free expression specifically, have become the overarching theme surrounding Internet governance and regulation (Fidler, 2013; 2016; Land, 2016). Where the WCIT-12 attempted to begin addressing international governance and regulation of the Internet, it included in Articles 5A and 5B special provisions to begin securing the status of the Internet. Such verbiage received extensive backlash from the United States, which viewed the articles as potential avenues for censorship and other violations of free speech (Fidler, 2013). Given this highly politicized environment surrounding regulation of the Internet, IL has struggled to find a direction in which to move forward.

Where the WCIT-12 failed to mold any kind of consensus around how IL might regulate the Internet, this does not mean other precedents do not exist. As Land argues, the foundations of an “International Law of the Internet” (2016, p. 394) can be found in the International Covenant on Civil and Political Rights (1966) (ICCPR), though here again the matter falls within the scope of free speech. Article 19 of the ICCPR establishes an explicit right to mediated expression and the transmission of information. “Article 19 explicitly protects the technologies of connection and access to information, and it limits states’ ability to burden content originating abroad” (Land, 2016, p. 394). This provision protects the free exchange of knowledge via mediated platforms such as the Internet, effectively supporting the interests of the Internet Freedom coalition by serving to reduce the censorship abilities of the Internet Sovereignty coalition.

Additionally, in 2004, the U.N. created the Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security (GGE), which was tasked with outlining Internet and cybersecurity policy within the scope of IL. Since its creation, the GGE has met only five times, the last occurring in 2017. It was at the GGE’s 2017 meeting that significant discord arose between group members, again falling along the Internet Freedom and Internet Sovereignty lines, surrounding rights of self-defense in cyberspace. Arguments for increasing cybersecurity measures were viewed in the Internet Freedom camp as threats to freedom of speech. The issue at hand was a directive from the U.N. General Assembly to investigate “how international law applies to the use of information and communications technologies by states” (Korzak, 2017, para. 3). In the end, the conference concluded with no consensus between members on how to move forward (Soesanto & D’Incau, 2017).

With the repeated failure of international meetings like WCIT-12 and the U.N.’s GGE, it can be plausibly argued that content regulation on the Internet holds little to no merit as a way to develop policy or laws for combatting computational propaganda. Regardless of the rise of fake news and disinformation, targeting content is a non-starter for a host of Western and democratic state actors. History has shown all too well that the relationship between democracies and free speech is too critical, and thus it is a frontier on which Westphalian states are not willing to compromise.

With the U.S. State Department embracing ‘Internet freedom’ as an element of American foreign policy and countries like China and Iran ramping up both cyber-security and Internet censorship capabilities in response, geo-politics once again permeate and complicate global communications policy, much as they did for the telegraph, radio, and satellite communications. (Penney, 2012, p. 6)

If content, therefore, is not the way forward, then perhaps the spread and diffusion of such content may hold more promise. Looking to the mechanisms that facilitate the spread of fake news and disinformation, the framework of telecommunications infrastructure may hold the key. Given that security concerns in technology policy are increasingly being directed toward the technical infrastructure that facilitates computational propaganda, the field of cybersecurity holds significant merit.


Computational Propaganda and Cybersecurity

One of the leading international legal documents on self-defense in cyberspace, the Tallinn Manual on the International Law Applicable to Cyber Warfare, defines a cyberattack as “a cyber operation, whether offensive or defensive, that is reasonably expected to cause injury or death to persons or damage or destruction to objects” (Schmitt, 2013, p. 106). But as Soesanto and D’Incau point out, such verbiage does not reflect the nature of aggressive cyber behavior that does not involve infrastructure damage or loss of life, such as “the Distributed Denial-of-Service (DDoS) attack against Estonia in 2007; the 2015 hack of the U.S. Office of Personnel Management (OPM), which stole personal information belonging to 21.5 million current, former, and prospective U.S. government employees; and the hack into the Democratic National Committee (DNC) to influence the 2016 U.S. Presidential Election” (2017, para. 8), or, needless to say, information operations.

Soesanto and D’Incau (2017) suggest that such legal diction significantly limits the scope of cybersecurity issues in the eyes of IL; they further suggest that foundational notions in humanitarian law also fail in the scope of cyberspace.

In the real world, the distinction between combatants and civilians are core elements of International Humanitarian Law (IHL), which exists to govern conventional warfare and balance military necessity with humanitarian consideration. But in cyberspace this distinction is extremely difficult, because cyberattacks can have serious impacts on national security without either the perpetrator or the target being a conventional military actor. (Soesanto & D’Incau, 2017, para. 10)

Thus, the traditional scope of IL in cybersecurity, on both self-defense and human rights, fails to meet the new and ultimately unique context of cyberspace. What avenues, then, are available for moving IL and policy on cybersecurity forward, insofar as they relate to computational propaganda?


The 2001 Budapest Convention on Cybercrime stands today as the only binding source of IL to address crimes in cyberspace (Shalini, 2016). Pioneered by the Council of Europe, the convention acquired international signatories such as the United States, Japan, Israel, Canada, Australia, and South Africa (Treaty Office, 2018), and has served as a model for cybersecurity legislation for over a decade (Seger, 2016). Specifically, the convention “attempts to cover crimes of illegal access, interference and interception of data and system networks, and the criminal misuse of devices” (Seger, 2016, para. 1). Alexander Seger is head of the Council of Europe’s Cybercrime Division and Cybercrime Programme Office as well as the Executive Secretary of the Cybercrime Convention Committee.

The key to the convention’s success, however great that success may or may not be, has been attributed to the extent to which the convention established cooperation through capacity building (Seger, 2016; Shalini, 2016). “The treaty functions on a mutual information sharing and formal assistance model in order to facilitate better law enforcement and lays down procedure to seek and receive such assistance” (Shalini, 2016, para. 3). Despite some acclaimed success, the convention eventually drew criticism for the overly complex and slow pace at which capacity building took place, as well as for its narrow scope of application owing to the restrictive legal diction used in drafting the convention. With such mixed perspectives on the novelty, but also the limitations, of the Budapest Convention on Cybercrime, this research asks:

RQ2: How do existing frameworks for cybersecurity governance apply to curtailing strategies of computational propaganda?

A recent and interesting development in the cybersecurity conversation involves the growing perspective that data are a prime commodity in cyberspace, and thus data protection is, and should be, a prime focus of cybersecurity goals (Dua & Du, 2016). Indeed, scholarly and industry conversations in this area include such topics as “the exploiting of social media in the gray zone, the characterizing of information warfare in cyberspace, the protecting of domestic information systems, the countering of gray zone cyber threats, technology and warfare, and privacy implications of military cyberspace operations” (Adams & Reiss, 2018, para. 1).

In a unilateral effort to address such “gray” cyber altercations surrounding data mining and data misuse, the E.U. has again led the charge, passing in 2016 the General Data Protection Regulation (GDPR), which took effect in May of 2018. One of the key driving factors of the GDPR is the extent to which it revises the legal definition of personal data. “Aside from the basics of name, phone number and email address, information like a postal code, driver’s license, passport, credit card, bank account, IP address, workplace, union membership, social factors, genetics and biometrics also need to be taken into account” (Eichorn, 2018, para. 6).

Based on this expanded definition of personal data, the GDPR seeks to impose stronger regulations on the collection, storage, and use of Europeans’ personal data, both inside and outside the geographical boundaries of the European continent, under threat of heightened financial penalties.

The European law, known as the General Data Protection Regulation, or GDPR, requires companies that collect data on E.U. citizens to use simple language to explain how they handle it. Companies must get explicit consent from consumers before doing anything with their information and allow them to request copies of their data or delete it entirely. The law also mandates that companies report data breaches on strict timelines. Fines for violations could cost them 4 percent of their global profits. (Hawkins, 2018, para. 4)
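The practical import of the expanded definition can be shown in a purely illustrative sketch. The field names and the flag_personal_data helper below are hypothetical (they appear neither in the GDPR text nor in this study); the categories themselves are drawn from the Eichorn (2018) passage quoted above. The sketch shows how a data controller might flag which fields of a stored record fall under the expanded definition of personal data, and thus under the regulation’s rules on collection, storage, and use:

```python
# Illustrative only: categories from the expanded GDPR definition of
# personal data as summarized by Eichorn (2018). Field names and the
# helper function are hypothetical, not drawn from the regulation itself.
PERSONAL_DATA_FIELDS = {
    "name", "phone_number", "email_address",       # "the basics"
    "postal_code", "drivers_license", "passport",  # identity documents
    "credit_card", "bank_account",                 # financial identifiers
    "ip_address", "workplace", "union_membership",
    "social_factors", "genetics", "biometrics",    # expanded categories
}

def flag_personal_data(record: dict) -> set:
    """Return the subset of a record's fields that count as personal
    data under the expanded definition."""
    return {field for field in record if field in PERSONAL_DATA_FIELDS}

record = {"name": "A. User", "ip_address": "203.0.113.7", "favorite_color": "blue"}
print(sorted(flag_personal_data(record)))  # ['ip_address', 'name']
```

Note that under the pre-GDPR, narrower understanding of personal data, a field such as an IP address would not have been flagged at all; the breadth of the set above is precisely what gives the regulation its reach.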

Given that the Internet allows computational propaganda to be tactically micro-targeted and delivered with pin-point precision to key foreign demographics on an individualized basis, it is critical to explore how the protective measures enumerated in the GDPR can and/or may pertain to efforts to curtail computational propaganda. Thus, this study asks:

RQ3: How do newer frameworks for cybersecurity governance apply to strategies of computational propaganda?


CHAPTER 3
METHODOLOGY

Socio-Legal Research

Where the scholarly investigation of the law and the social sciences were, for a great deal of time, entirely distinct from each other, scholars in the early 20th century began to question such a demarcation. Indeed, where both lawyers and scholars of the law traditionally approached legal studies and research in a pure and doctrinal manner (Chynoweth, 2008; McCrudden, 2006; Schiff, 1976), some began to suggest that a more sociological perspective was necessary to understand the social pressures that existed on, and in, the legal system (Opalek, 1984). Indeed, codified law is nothing more than a social construction (Banakar, 2000), i.e. a social agreement between mass groups of people that is ultimately created and enforced by social institutions (McCrudden, 2006).

Cairns (1941) was one of the first to introduce and establish a socio-legal approach to legal research. As he suggests, jurisprudence consists of the study of the law centered on human behavior in a social environment of disorder. Interestingly, Cairns (1941) views disorder in the legal system in a manner similar to the way international relations theory views anarchy in the international political arena. Where anarchy in international politics is the byproduct of the lack of a unifying authority, disorder in legal studies is the byproduct of the inability of law’s unifying authority to establish conformity within the legal system. As he suggests, the study of a social science jurisprudence is the effort to better understand the nature of social impacts on the system so that greater efficiency within the legal system can be achieved (Cairns, 1941). Such thinking is mirrored in more recent literature, which views politics as a primary crux of, or barrier to, achieving efficiency in the legal system. “Modern law is considered only an instrument of the state’s political power and its efficiency is determined by the concentration of political forces, independently from the support of other systems of social regulation” (Šlapkauskas, 2010, p. 167). Therefore, this research uses socio-legal research to understand the political environment surrounding international digital broadcasting as a means to discern a way forward for media and technology policy amid growing cybersecurity concerns, or what Chynoweth (2008) calls a socio-legal approach to law reform.

Approaching such an area of study can only be achieved through a methodology that integrates the backgrounds of social science and legal studies scholarship. Where law is a social construct, it entails a variety of social institutions that create, appoint, fund, and regulate it. The majority of such institutions are political in nature, and thus to remove the political and other social contexts of the law is to understand it only partially (Cairns, 1941; Schiff, 1976; Šlapkauskas, 2010). However, both political and legal institutions function through communication. The social value of laws only has force when their meaning is communicated to the public, and political processes only function when philosophy and ideology are communicated through interpersonal, and more commonly, mass mediated channels (Dawson, 1992). Therefore, understanding the interconnected nature of law, politics, and communication is vital to the approach of socio-legal research.

While its importance in scholarship is growing, the place of socio-legal research is still contested. This is due, primarily, to the markedly different institutions of training for legal scholars and social scientists (Halliday & Schmidt, 2009). Legal research approached from the social sciences is often chastised, unfairly critiqued, and ultimately excluded from mainstream academic circles due to the different background, training, and perspective of social scientists; the same is true of social research approached from legal studies (Banakar, 2000; Dawson, 1992). This entails what Dawson (1992) refers to as the epistemic gap between legal studies and the social sciences, or what Banakar (2000) calls the locus of struggles between the two fields.

Compounding these institutional pressures, Halliday and Schmidt (2009) suggest that methodological anxiety syndrome, or MAS, is a leading cause stymying the advancement of socio-legal research. Where legal scholars and social scientists may have insightful, even ground-breaking, questions and agendas for socio-legal research, too often both sides feel inadequate and unprepared to use the research methods and perspectives of the other.

Ultimately, Halliday and Schmidt (2009) suggest that the advancement of knowledge can only be achieved by asking questions and making attempts, however flawed, to answer them.

They promote the idea of a naïve design or naïve fieldwork, in which researchers submerge themselves in areas of scholarship with which they are not overly familiar, and essentially learn the nuances and peculiarities of social science and/or legal studies as they move forward. This fits succinctly within the perspective of grounded theory, where a researcher is encouraged to begin the process of scientific investigation at a given point and then engage a fluid methodological framework that allows for growth and knowledge attainment as the research proceeds.

In suggesting the role of naïve designs by average researchers, Halliday and Schmidt’s (2009) perspective fits within a Kuhnian philosophy of science, in which average people can partake in the process of knowledge creation simply by solving everyday puzzles (Kuhn, 2012). Such a perspective of socio-legal grounded theory research contradicts Popper’s view that specialized experts exploring the world through bold hypothesizing and falsification are the only avenue for scientific discovery (Popper, 1935). This research endorses the Kuhnian perspective of scientific advancement: that ordinary scholars can ask meaningful questions and make meaningful attempts to solve such puzzles, based simply on everyday observations of the natural and social worlds around us.

Research Design

In order to probe the under-studied nexus of computational propaganda and international law (IL) specifically, this research makes use of grounded theory, a qualitative methodological approach. As Glaser and Strauss (1967) suggest, a grounded methodological approach is ideal in instances where existing literature and theory fail, or are insufficient, to offer tangible avenues for empirical research. An in-depth review of the literature suggests that current published scholarly work on the relationship between computational propaganda and IL is largely non-existent, and therefore hypothetico-deductive research offers no substantial means to move the literature forward. Therefore, this research attempts to make inductive observations of how computational propaganda and IL are approached, and used, in the international theater, upon which emergent theory will be articulated to express and explain such a relationship.

Both Blandy (2014) and Webley (2010) suggest that grounded theory methods are particularly useful in the approach to socio-legal research. Where the social sciences tend to stress a research approach built on both deductive and inductive principles, legal research has traditionally been a field much more dedicated to hypothetico-deductive reasoning. As such, its ability to link, or be linked, to the social world has been limited. Grounded theory methodology thus serves to supplement the highly deductive nature of legal research by situating it in a method that is highly inductive, functionally serving to ground the rational approach of the law in contexts relevant to the social workings of reality.

The methodology of grounded theory research offers a range of inductive approaches from which scientists can make observations of the natural and social world, including ethnographic observations, interviews, and the textual analysis of relevant documents (Charmaz, 2006). Where the scopes of computational propaganda and IL are international in nature, they include thousands, if not millions, of participants (i.e. message creators, senders, and receivers) across a global scale of both geography and diversity. As such, ethnographic observation is not practical, or entirely relevant, to this line of research. However, intensive interviews and textual analysis do hold promise for the goals of this research. In making use of two qualitative methods, this research classifies as a non-mixed methods approach (Tariq & Woodman, 2013).

The foundational basis of grounded theory research is its reliance on continuous and repetitive comparison. Referred to as the Constant Comparative Method (CCM) (Belgrave & Seide, 2018; Boeije, 2002; Bryman, 2012; Glaser, 1965; Glaser & Strauss, 1967; Knotten, Hansen, Svalestuen, & Laedre, 2017; Kolb, 2012), the CCM stresses the continuous need for internal data comparison. Every time an interview is conducted or a text is analyzed in a grounded theory approach, the insights obtained must be compared and contextualized with the researcher’s existing knowledge. This way, new knowledge can inform the research over the course of the study. Indeed, the CCM of grounded theory research stresses the inductive nature of the approach, constantly checking the data in relation to reality. Based on the CCM, grounded theory methods allow for fluidity within the process of research, i.e. methods can grow and change over the course of the research process, unlike more rigid quantitative approaches (Bryman, 2012; Charmaz, 2006).
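As a purely conceptual illustration (not software used in this study), the CCM’s comparison loop can be sketched in code: each new unit of data is compared against the categories built so far and either merged into an existing category or used to open a new one. The constant_compare function and the similar() judgment below are hypothetical stand-ins for the researcher’s own conceptual comparison:

```python
# Conceptual sketch of the Constant Comparative Method (CCM) loop.
# `similar(a, b)` stands in for the researcher's judgment of conceptual fit.
def constant_compare(insights, similar):
    """Group insights into emergent categories by constant comparison."""
    categories = []  # each category is a list of conceptually related insights
    for insight in insights:
        for category in categories:
            # Compare the new insight against existing knowledge.
            if any(similar(insight, prior) for prior in category):
                category.append(insight)  # fits an existing category
                break
        else:
            categories.append([insight])  # new concept: open a new category
    return categories

# Toy run: "similarity" here is merely sharing a first keyword.
insights = ["content regulation", "content takedowns", "data protection"]
cats = constant_compare(insights, lambda a, b: a.split()[0] == b.split()[0])
print(len(cats))  # 2: content-related vs. data-related
```

The sketch captures only the mechanical shell of the method; in practice the comparison is an interpretive act, and the categories themselves are revised as coding proceeds.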

Expert Interviews in Grounded Theory

As Charmaz suggests, “intensive interviewing permits an in-depth exploration of a particular topic or experience and, thus, is a useful method for interpretive inquiry” (2006, p. 25). Where this study offers a typology of information politics stratagems based on an extensive review of the literature, such literature spans almost a century and offers a variety of time-relative and conceptually specific contexts in covering the four concepts of PD, PA, PSYOP, and propaganda. In order to ground such an understanding of information politics in the context of how it is pursued in a 21st century digital information ecology, a method such as intensive, or in-depth, interviews allows the proposed typology to be more closely aligned with the reality of contemporary international politics. Where such interviews offer relevant insight, the use of experts specifically adds to the relevance of the grounded theory contributions.

In-depth expert interviews constitute a specific sub-area of qualitative interview methods, one that serves particularly well as a beginning point for research.

Firstly, in relative terms, talking to experts in the exploratory phase of a project is a more efficient and concentrated method of gathering data than, for instance, participatory observation or systematic quantitative surveys. Conducting expert interviews can serve to shorten time-consuming data gathering processes, particularly if the experts are seen as “crystallization points” for practical insider knowledge and are interviewed as surrogates for a wider circle of players. (Bogner, Littig, & Menz, 2009, p. 2)

Because experts hold their status based on their place, familiarity, and/or participation in an area of industry or practice that is of interest to the research, they offer contributions that more meaningfully inform the research and therefore more substantially ground the resulting conclusions in reality (Bogner et al., 2009). This is not unlike the benefits inherent in the participatory action research (PAR) method.

In PAR, the research participants also participate in the design and collection of research evidence. Participants are seen as collaborators with the knowledge and agency to contribute to an understanding of the research and its results. (Fletcher & Marchildon, 2014, p. 13)

Using experts in interview-based research, as informed by the PAR method, serves to make the research process more participatory across the field. It reduces the burden on the researcher by providing him or her with a wide range of assets and resources that supplement the research.

The whole notion of an elite implies a group of individuals, who hold, or have held, a privileged position in society and, as such, as far as a political scientist is concerned, are likely to have had more influence on political outcomes than general members of the public. (Richards, 1996, p. 199)


Where the term expert can be used descriptively to denote the level of influence and/or experience an interviewee has in the area of interest, it does not, however, denote a limit on the interviewee’s professional diversity. Indeed, for a topic in the social sciences, a wide range of experts can, and should, be interviewed to most optimally and effectively ground the resulting conclusions in reality.

The target group for expert interviews is wide. Examples in literature include top level managers from politics, business, the judiciary, associations and the sciences, along with teachers, social workers or staff representatives. (Meuser & Nagel, 2005, p. 73, translated from German)

The use of a wide range of experts to synthesize a singular and unified perspective or opinion on the role of information politics stratagems in the digital environment of the Internet resembles the approach of the Delphi method, which serves to “obtain the most reliable consensus of opinion of a group of experts” facilitated through “a series of intensive questionnaires interspersed with controlled opinion feedback” (Dalkey & Helmer, 1963, p. 458). Indeed, the manner in which the CCM of grounded theory seeks to grow knowledge and understanding fluidly from interview to interview is also reflected in the evolutionary manner in which knowledge is attained in the Delphi method (Fletcher & Marchildon, 2014; Ziglio, 1995). Based on such literature, this research makes use of in-depth expert interviews that follow a semi-structured format. An interview protocol was developed and received institutional review board approval (IRB-201801196); this declaration of IRB approval outlines the compliance of this study with existing procedures and regulations designed to protect subjects in the course of interactive research; it does not, however, offer any insight into the validity or reliability of the method. The interview protocol can be viewed in Appendix A.


Interviewees were identified based on their academic and/or industry work experience, body of published literature, and general prominence in the fields of public diplomacy, international relations, international law, or cybersecurity. Interviewees were primarily recruited from the United States and Europe. While such geographic targeting could potentially limit the generalizability of the findings, the specificity was purposeful. In both the contexts of information intervention and IL, Westphalian precedents generally dominate. It was concluded that placing an emphasis on interviews with experts situated in U.S. and European communication, political, legal, and security circles would serve to most optimally ground the research in the most relevant areas of insight.

Video-based interviews were conducted via Zoom, a free video-conferencing platform whose security measures offer substantial protection guarantees for data storage and, subsequently, interviewee confidentiality. Further, Zoom is an ideal platform given the international nature of the interviews conducted, as its security measures satisfy the requirements for data storage and transfer stipulated in the E.U.’s GDPR. Interviews ranged from approximately 45 minutes to one hour in length. Transcripts were produced by Rev (www.rev.com), a professional transcription service, and then coded and analyzed using the NVivo 12 data analysis software package. Following each interview, transcripts were analyzed for the introduction of key concepts or frameworks that would prove valuable to the progression of the research. Such frequent analysis of the data while the research was in progress served the function of the CCM, stressing the inductive nature of the research, i.e. continuously taking stock of the research’s progress and using such opportunities to widen or reorient the research in the appropriate direction to most optimally keep it grounded in reality. Such continual analysis of the data also helps to verify both the validity and reliability of the findings from each respective interview (Bogner et al., 2009).

Table 3-1. List of experts interviewed in the current research.
Interviewee | Institution | Title | Expertise
Christopher Paul, Ph.D. | Pardee RAND Graduate School | Senior Social Scientist | Information Operations; Psychological Warfare
Paul Rosenzweig, J.D. | George Washington University | Professorial Lecturer in Law | International Law; Cybersecurity
Cayce Myers, Ph.D., LL.M., J.D. | | Assistant Professor of Communication | Public Relations; Media Law
R. S. Zaharna, Ed.D. | American University | Professor of Communication | Public Diplomacy
Pawel Surowiec, Ph.D. | University of Sheffield | Senior Lecturer | Public Diplomacy; Propaganda
Steven Pike | Syracuse University | Assistant Professor of Public Relations | Public Diplomacy; International Relations
Corneliu Bjola, Ph.D. | Oxford University | Associate Professor of Diplomatic Studies | Digital Diplomacy
Amelia Arsenault, Ph.D. | Georgia State University | Assistant Professor of Communication | Public Diplomacy
Eytan Gilboa, Ph.D. | Bar Ilan University | Director of the School of Communication | Public Diplomacy
Candace White, Ph.D. | University of Tennessee | Professor of Public Relations | Public Diplomacy
Monroe Price, J.D. | University of Pennsylvania | Adjunct Full Professor of Communication | Media Law & Policy
Guy Golan, Ph.D. | University of South Florida | Visiting Associate Professor of Communication | Public Diplomacy
Emma Briant, Ph.D. | George Washington University | Post-Doctoral Visiting Scholar | Propaganda; Information Warfare
Emily Metzgar, Ph.D. | Indiana University | Associate Professor of Communication | Public Diplomacy
Shaun Riordan | European Institute for International Studies | Director of the Chair for Diplomacy and Cyberspace | Diplomatic Studies; Public Diplomacy
James Pamment, Ph.D. | Lund University | Head of Department of Strategic Communication | Diplomatic Studies; Public Diplomacy
Samantha Bradshaw | University of Oxford, Oxford Internet Institute | Research Assistant, Computational Propaganda Project | Democracy and Internet Policy
John Doe, Ph.D. | Stanford University, Center for International Security and Cooperation | | Internet Governance; Cybersecurity
Vincent Bindschaedler, Ph.D. | University of Florida | Assistant Professor, Computer & Information Science & Engineering | Information Security & Data Science
Juan Gilbert, Ph.D. | University of Florida | Professor, Computer & Information Science & Engineering | Information Security
Brandon Valeriano, Ph.D. | Marine Corps University | Professor of Armed Politics | Cybersecurity
Gianluca Stringhini, Ph.D. | Boston University | Assistant Professor, Electrical and Computer Engineering | Computer Programming; Internet Security
Alexander Seger, Ph.D. | Council of Europe | Executive Secretary, Cybercrime Convention Committee | Cybercrime

At the conclusion of the interviews, the body of transcripts were further analyzed to determine the presence of thematic characteristics, as identified by predetermined codes. Such codes were determined by the typology of information politics developed in the literature. A full breakdown of the themes and codes can be viewed in Table 3-2.

Table 3-2. List of codes as determined by themes identified through the dominant literature on the fields of public diplomacy, propaganda, public affairs, and psychological operations.
Major Themes | Codes | Programmed Coding Options
Who engages in information politics? | Politicians; Military; Intelligence; Corporate; Civilians | 20
In what manner do they engage in information politics? | Overt; Covert; Mixed | 12
To what target audience do they engage in information politics? | Domestic; Foreign; Mixed | 12
Through what method is information communicated? | Framed Truth; Deceptive Lies; Mixed | 12
What model or flow does the communication resemble? | One-Way; Two-Way; Mixed | 12
What is the intent of such information politics? | Inform; Persuade; Disrupt | 12
What end does such information politics serve? | Public Policy; National Security; Foreign Policy; Economic Policy | 16

Following such coding and analysis, an analytical memo was written. In the purview of in-depth interviews, analytical memos are a means for researchers to synthesize a singular and unique perspective based on extensive analysis of, and saturation in, the data (Bogner et al., 2009; Bryman, 2012; Glaser & Strauss, 1967).

Memos catch your thoughts, capture the comparisons and connections you make, and crystallize questions and directions for you to pursue. Through conversing with yourself while memo-writing, new ideas and insights arise during the act of writing. (Charmaz, 2006, p. 72)

The analytical memo serves a crucial function, namely bringing to a head conceptual clarity behind the terminology of PD, PA, PSYOP, and propaganda so that the initial typology of information politics provided in the literature review can be either verified or revised based on in-depth expert insight which is firmly grounded in both the practice and study of these concepts.

Based on such conceptual clarity and concise definitions, a socio-legal analysis will then be pursued through the lens of textual analysis.

Textual Analysis in Grounded Theory

Where the analysis and coding of interview transcripts involves a first-hand textual analysis, the text itself, i.e. the transcript, would not have otherwise existed had the researcher not sought out and interviewed the expert. A formal textual analysis as a form of grounded theory research involves the in-depth and extensive analysis of textual documents that exist independently of the influence of the researcher, i.e. documents that were created and made public for reasons that do not include the goals of the researcher. Such texts are referred to as extant texts (Charmaz, 2006).

People construct texts for specific purposes and they do so within social, economic, historical, cultural, and situational contexts. Texts draw on particular discourses and provide accounts that record, explore, explain, justify, or foretell actions, whether the specific texts are elicited or extant … As a discourse, a text follows certain conventions and assumes embedded meanings. Researchers can compare the style, contents, direction, and presentation of material to a larger discourse of which the text is a part. (Charmaz, 2006, p. 35)

Official documents, i.e. those documents produced by a professional entity, serve as a key source of extant texts. Bryman (2012) notes that government documents are particularly useful as they consist of official accounts of public record. Scott (1990) proposed four key elements for the assessment of textual documents in light of validity and reliability concerns. As he notes, authenticity, credibility, representativeness, and meaning dictate the quality of the texts to be used for analysis. Governmental and legal documents fit well within Scott’s (1990) framework, as such official documents can generally be viewed as authentic and original, credible in that what they recount is generally free from error, representative in both time and space of the society that produced them, and clear enough that a comprehensive understanding can be drawn from them.

The merging of textual analyses with expert interviews serves as an ideal dual approach to grounded theory research. As Bogner et al. (2009) note, in-depth expert interviews are not without their limitations; the content of the interview is ultimately subjective based on the individual interpretations of each interviewee. Even when themes are pulled together across a wide and diverse range of expert interviews, the resulting insight is a product of intersubjectivity (Neuendorf, 2002) rather than objectivity itself. Further, based on time and space limitations, interviews offer a seemingly finite account that is contextually and temporally specific (Bogner et al., 2009). Textual analysis of extant texts is therefore a unique method to supplement the characteristic shortcomings of interview research. Where insight and direction can be derived from interview-based research, applying such insight and direction to the in-depth analysis of extant texts serves to dilute the inherent subjectivity of the interviews by placing them further in touch with the real world, i.e. grounding their contributions further in reality. A list of the extant texts analyzed can be reviewed in Table 3-3.

Table 3-3. Legal documents analyzed via textual analysis.
Document Source | Document Date | Document Title | Source
Council of Europe | 23 Nov, 2001 | European Treaty Series No. 185: Convention on Cybercrime | https://rm.coe.int/1680081561
Council of the European Union; European Parliament | 27 Apr, 2016 | General Data Protection Regulation 2016/679 | https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679

The analysis of the collected texts, facilitated via the NVivo 12 data analysis software package, followed a three-part process outlined by the CCM. The process first involved initial coding (Charmaz, 2006), or open coding (Boeije, 2002; Böhm, 2004; Bryman, 2012). This consisted of analyzing each text line by line, from beginning to end, to objectively determine what was said and/or reported (Boeije, 2002). Strauss and Corbin define this as, “the process of breaking down, examining, comparing, conceptualizing, and categorizing data” (1990, p. 61).

Once this was completed and a full saturation of the documents had been reached, a second series of coding was employed. This second phase, referred to as focused coding (Charmaz, 2006) or axial coding (Boeije, 2002; Böhm, 2004; Bryman, 2012), consists of taking the knowledge gained through text-by-text open coding and beginning to apply a larger scope of coding across documents. Strauss and Corbin define axial coding as, “a set of procedures whereby data are put back together in new ways after open coding, by making connections between categories. This is done by utilizing a coding paradigm involving conditions, context, action/interactional strategies and consequences” (1990, p. 96).

Lastly, a third round of coding was employed that weighted specific codes as determined by the researcher; this third phase is referred to as selective coding (Böhm, 2004; Bryman, 2012). This last phase picks key codes that are determined by the researcher to be of central value and places them as the nexus for a relational mapping of codes across the sample of analyzed extant texts. Strauss and Corbin define selective coding as, “the process of selecting the core category, systematically relating it to other categories, validating those relationships, and filling in categories that need further refinement and development” (1990, p. 116).

Weaknesses of the Methodology

While deemed to be the optimal approach for this research venture, grounded theory does present methodological weaknesses that must be addressed. As previously discussed, given that grounded theory is a prime avenue for exploring phenomena on which there is insufficient existing literature or knowledge, the place to begin such research is up to the informed, yet ultimately subjective, determination of the researcher. This inherently calls into question the objectivity, and hence quality and generalizability, of the findings.

What’s more, unique to this research topic is the highly interdisciplinary scope of the inquiry, particularly in the philosophic approaches to scientific research. The field of mass communication research is largely viewed as positivist in nature; that is, it employs a rigorous and systematic scientific approach to empirically study the natural and social worlds in order to discern the universally objective truths that govern their workings (Alakwe, 2017; Demers, 2000; Potter, Cooper, & Dupagne, 1993). The field of IR has had two great debates surrounding the role of empirical science in the field. Following World War II, behavioralism emanating from the U.S. academy pushed a positivist agenda on the global stage, spearheaded in particular by Waltz’s (1979) championing of rational choice in structural realism. Keohane (1988), however, reintroduced the debate in proposing a new paradigm in IR theory, a reflectivist approach that rejects the positivist approach to rational choice (Kurki, 2008). Nevertheless, based on the extent to which this study situates many of its perspectives in realism and the notion of rational choice, a positivist approach to IR theory is most appropriate.

Lastly, where classical legal studies have tended to pursue a positivist approach to the law, i.e. a linear progression through deductive reasoning and evolution through case law, the scope of socio-legal research is considerably less positivist (Schiff, 1976). It recognizes the force of unmeasurable variables on a system that is continuously pulsing with a diverse range of truths, but not necessarily moving forward in a linear fashion toward one universal truth. Indeed, Banakar (2000) directly suggests that sociological approaches to legal studies face methodological issues due to the post-positivist nature of their methodologies in a field typically identified as positivist.

Irrespective of what sociology chooses to study … the institutional characteristics of its subject-matter … affect how studies are actually conducted and the form of knowledge/truth which is produced. It also means that the methodological obstacles we are addressing here make their presence known more forcefully in those branches which focus on professions and social institutions. What makes them of particular consequence for the sociology of law is the aggregate form in which they manifest themselves within socio-legal research. The aggregation is in turn a result of the institutional properties of the law in relation to sociology. Underlying this argument is the assumption that the extent of methodological obstacles confronted by the sub-branches of sociology varies in direct proportion to the 'institutional strength' of the themes, problems, and so on which become the object of sociological investigation. (Banakar, 2000, p. 275)

Based on the inherent level of subjectivity embedded in the grounded theory methods of in-depth expert interviews and textual analysis of official documents, and an internal struggle between positivist and post-positivist principles in the socio-legal research of mass communication, international relations, and the law, this study’s method is not a tight-fitting glove for the current line of research. That is not to say, however, that it is not the best-fitting glove available, i.e. the best place to start such an area of study. Indeed, grounded theory research is meant precisely to be a more open methodological approach for beginning such areas of research.

The grounded theory methodology can be understood as offering a method of undertaking research as well as a systematic approach to qualitative analysis, a strategy for research rather than as method to generate more positivist findings. (Webley, 2010, p. 16)


Grounded theory, socio-legal research included, is a means to build understanding of the workings of the social world in the face of seemingly countless forces and confounding complexity. Grounded theory seeks to explain rather than to finitely and precisely predict (Bowen, 2006).

Understanding that grounded theory seeks to explain rather than predict places the concepts of validity, reliability, and generalizability in markedly different contexts than in quantitative research (Bryman, 2012). Indeed, some suggest that these concepts do not fit qualitative research at all (Lincoln & Guba, 1994); others suggest that a new adapted perspective of the concepts is necessary (Kirk & Miller, 1986; Le Compte & Goetz, 1982; Mason, 1996).

Where validity in quantitative research is often operationalized as accuracy of measurement, qualitative research has no use for objective measurement; thus, validity can be reconstructed to mean accuracy of observation, i.e. is the researcher accurately observing what he or she intends to be observing. Further, due to the social nature of socio-legal research, qualitative methods such as interviews are subjectively derived and executed as a single snapshot in time, and thus there is no guarantee that replications of such research at later times would produce identical, or even similar, results. Thus, while efforts are made to ensure the systematic and rigorous nature of this research, the traditional positivist assumptions of validity, reliability, and generalizability are not identically maintained in this post-positivist approach to socio-legal research.

Ultimately, the researcher fully acknowledges potential shortcomings in the methodological approach of this study. However, there is simply no better place to begin the study of PD and IL than in a grounded theory approach to such a socio-legal study. Readers of this research will have to decide for themselves, on an individual basis, the quality and merits of this research, and ultimately its utility, as a means to contextualize an understanding of the roles PD and IL play in and around each other in the 21st century.


CHAPTER 4 RESULTS

Following IRB approval, 99 experts were contacted, 37 of whom responded to the inquiry; in all, 24 interviews were completed. Interviews totaled 15.70 hours, with an average length of 40.96 minutes, and ranged in length from 24 minutes to one hour and 17 minutes.

Interviews were initially coded for specific frames as laid out in Table 3-2. Such predetermined themes led to the pre-programming of 96 unique coding options. Including the 33 unique themes that emerged from the interviews themselves, on top of the predetermined coding index, a total of 129 coding options were programmed into the NVivo 12 data analysis software. Interviews were conducted between July and November of 2018, while the analysis of the interview transcripts was conducted between September and November of 2018.

Research Question 1

The first area of inquiry in this study (RQ1) asks how public diplomacy is functionally different from propaganda. The analysis of the data seeks to explicate, distinguish, and demarcate key characteristics that differentiate public diplomacy and propaganda, so that relevant distinctions can be made between digital diplomacy and computational propaganda.

Demarcating Strategies of Information Intervention

In order to explore functional characteristics demarcating the strategies of information intervention, interviews were coded for specific frames as laid out in Table 3-2, namely 1) who engages in varying forms of information intervention (politicians, military, intelligence groups, the corporate sector, civilians), 2) in what manner they engage in information intervention (overt, covert), 3) to what target audience they engage in information intervention (domestic, foreign, mixed), 4) through what method content is constructed and presented (framed truth, deceptive lies, mixed), 5) what communication model information intervention resembles (one-way, two-way, multi-way), 6) what the end-goal intent of information intervention is (inform, persuade, disrupt), and 7) what policy end-goals information intervention supports (public policy, national security, foreign policy, economic policy). The following section explores each of these items in further detail.

The actors

Overall, 11 interviewees clearly stated, or suggested in some manner, that the practice of public diplomacy suggests, and possibly necessitates, some kind of involvement with political institutions within a government. G. Golan, the Director of the Center for Media and Public Opinion, provides a standard U.S.-centric definition: “Public diplomacy refers to the engagement of foreign publics by a government for the purpose of gaining support for its foreign policy objectives” (personal communication, August 23, 2018). This notion of government centrality is reinforced by other interviewees, like Dr. Jamie Fullerton of Oklahoma State University, who frames public diplomacy as a function of government speech.

When I talk about public diplomacy, I restrict that to government speech. That says it’s government sponsored communication aimed at foreign audiences … in order to call it public diplomacy in the strictest sense, then it needs to be the government. (J. Fullerton, personal communication, October 1, 2018)

A variety of the interviewees place public diplomacy as a form of governmental foreign affairs, i.e. in the purview of a foreign ministry such as the U.S. Department of State, the British Foreign and Commonwealth Office, etc.

Where the majority of interviewees identify public diplomacy as being state-centric, another six note the possibility for non-state actors such as corporations, interest groups, and individual persons to partake in the practice of public diplomacy.

I believe it includes non-state actors. I think as long as anyone … is purposefully trying to establish good will and understanding in the values of your country, then that’s public diplomacy. (C. White, personal communication, August 16, 2018)

While still a minority position, it is worth noting that such a perspective fits in line with the thinking of new public diplomacy (Fitzpatrick, 2007; Gilboa, 2008; Melissen, 2005; Pamment, 2013).

Public affairs is considerably more divisive in its placement across the interviews. Six interviewees stated that it is a core function of political institutions within the government. Dr. Pawel Surowiec, a senior lecturer at the University of Sheffield, describes:

Public affairs is enacted by inter-political actors, which takes place and happens at the intersections between policy makers, domestic policy issues, and sometimes nowadays, foreign policy issues. (P. Surowiec, personal communication, July 23, 2018)

Three suggest it is a role filled by military personnel, which is understandable given the prominence of the public affairs officer in most military hierarchies, and interestingly, two view it as a corporate role fulfilling governmental interests.

PSYOPs is clearly identified in ten interviews as a practice of the military. “Psychological operations I would consider to be the work conducted by military personnel in theater” (J. Pamment, personal communication, September 3, 2018). Indeed, S. Pike, a professor at Syracuse University and a former foreign service officer with the U.S. Department of State, adds that “It’s a term of art that the military invented for what they do” (personal communication, July 26, 2018). Where the tie between PSYOP and military institutions is clearly established, it was noted in two interviews that, as modern militaries in democratic systems typically fall under the purview of political oversight, elite political institutions and actors do become implicated in such behavior, given their awareness, knowledge, and often approval of military operations.

Perhaps most interesting, however, is the lack of identification made regarding the sources responsible for the work of propaganda. Interviewees almost universally referred to propaganda as a noun, i.e. they said “propaganda is” or “propaganda involves,” etc. To the contrary, with little deviation the interviewees referred to public diplomacy, public affairs, and PSYOPs in the context of active verbs, i.e. they said “foreign ministries do public diplomacy,” or “militaries do PSYOPs,” etc. While these distinctions are not mechanically precise in terms of English grammar, such conceptual references are representative of some kind of perceptual difference in how the majority of the interviewed experts view differing strategies of information intervention. The collected data, however, does not allow for definitive conclusions to articulate the meaning behind such perceptual differences. Overall, the source of who engages in propaganda was conspicuously not addressed in the majority of interviews. Indeed, only one interviewee ascribed a tangible source to the act of propaganda, linking it to the work of intelligence agencies. “It was called black propaganda, it headed off toward what the CIA does” (S. Pike, personal communication, July 26, 2018).

The manner

With regard to how transparently information intervention strategies are engaged in, public diplomacy was identified in five interviews as an overtly transparent communication activity. “Public diplomacy is by definition transparent through and through. There should be no subterfuge involved … whether mediated or relational” (E. Metzgar, personal communication, August 27, 2018). E. Metzgar is an Associate Professor in the Media School at Indiana University, Bloomington. That is to say, public diplomacy overtly seeks to engage with a public audience through means that are transparent about the source of the content and the desired end-goal of the communication. As S. Bradshaw adds, public diplomacy is “visible and in your face. It’s clear where the message is coming from and it’s clear what the goal of this message is” (personal communication, September 21, 2018).


To the contrary, three interviewees characterized propaganda as being covert, hidden, and secretive. In stark contrast to the notion of public diplomacy, propaganda was most often viewed as a covert activity where the source and/or desired end-goal of the message sender was deliberately kept secret or hidden from the message receivers. “Whereas propaganda, I tend to view it as being a little bit more hidden, especially in the digital age. You don’t necessarily know who the message is coming from” (S. Bradshaw, personal communication, September 21, 2018). S. Bradshaw is a researcher in the Computational Propaganda Research Project at Oxford University and a Senior Fellow at the Canadian International Council.

However, not all interviewees identified propaganda as necessarily being covert in all cases. Indeed, the notion of white propaganda was introduced to represent scenarios in which such messages are more transparent. “I don’t think it’s covert, I don’t think that it’s evil in any way, I don’t think that it spreads falsehoods. So maybe people talk about soft propaganda, or white propaganda, so I guess you could call it that” (C. White, personal communication, August 16, 2018). As P. Rosenzweig, a Professorial Lecturer in Law at George Washington University, adds, “if they’re [governmental communications] operated in a transparent and overt manner by governmental things, they’re at most propaganda and most likely at best public diplomacy” (personal communication, July 12, 2018).

The target audience

In addressing key target audiences, 11 interviewees suggested that public diplomacy seeks to communicate with foreign audiences exclusively. “Well, if we start with public diplomacy, the key concept here, it’s about engaging with foreign publics as a government, as a means of foreign affairs” (C. Bjola, personal communication, July 30, 2018). In his 2008 book, Cull identified five key pieces of public diplomacy: listening, advocacy, cultural diplomacy, exchange diplomacy, and international broadcasting. Where the very nomenclature of international broadcasting illustrates communication with public audiences abroad, R. S. Zaharna, a Full Professor and Director of the Global Media program in the School of Communication at American University, suggests that,

International broadcasting, or the use of electronic media by one society to shape the opinion of the people and leaders of another, includes the use of radio, television, and increasingly web-based broadcasting targeting a foreign, as opposed to a domestic, population. (personal communication, July 19, 2018)

Uniquely, public affairs was identified as an almost identical practice to public diplomacy, yet demarcated by the exclusive targeting of domestic populations. “Public affairs is when the U.S. is talking to its own citizens and a set of rules apply. Public diplomacy is when the government is talking to foreign citizens and the rules change” (S. Pike, personal communication, July 26, 2018). Certain interviewees even suggested some linkages to propaganda: “I’d move on to mention public affairs, which is generally targeted at home audiences, communication with the domestic press and is generally the truthful aspects of propaganda in western democratic societies” (E. Briant, personal communication, August 24, 2018). E. Briant is a Post-Doctoral Scholar in the School of Media and Public Affairs at George Washington University as well as a Senior Lecturer in Journalism at the University of Essex.

Where a number of interviewees referenced the target audience of propaganda communication, opinion was evenly divided around the inclusion or exclusion of domestic populations. In four interviews, propaganda was characterized as targeting uniquely foreign populations. “Propaganda refers more and more, particularly in the community of practice, to non-attributed efforts to influence foreign publics” (A. Arsenault, personal communication, August 3, 2018). A. Arsenault is an Assistant Professor of Communication at Georgia State University. To the contrary, in the other four interviews referencing the target audiences of propaganda campaigns, mention was made that such efforts can be directed to a multitude of audiences, both foreign and domestic. “Propaganda typically is associated and affiliated with persuasive communications, on a national or international scale” (P. Surowiec, personal communication, July 23, 2018). With such a balanced perspective that propaganda targets both foreign and domestic populations, the overall target audience for propaganda campaigns was identified as mixed.

Similar to the perspective of public diplomacy, PSYOPs was identified in four interviews as being targeted solely at foreign populations; “it’s [PSYOPs] exclusively focused on foreign audiences, that military information support operations [the U.S. name for PSYOPs] are never directed against U.S. persons” (C. Paul, personal communication, July 11, 2018). As PSYOPs is a military-centric behavior, and the position of public affairs officer is significant within military hierarchies, the relationship between PSYOPs officers (targeting foreign audiences) and public affairs officers (targeting domestic audiences) is murky, and is only becoming less and less clear.

Psychological operations is the area which is really targeting hostile actors and enemies. Now, depending on the country that might be domestic and it might be foreign. In America it’s quite controversial to target domestic audiences. And … there has become an increasing blurring between the practice of psychological operations, public affairs, and public diplomacy. (E. Briant, personal communication, August 24, 2018)

The method for content creation

Methods for content creation in information intervention strategies range from framing factual truths to using objectively unverifiable falsehoods, with the majority of strategies falling somewhere within that continuum. Four interviewees portrayed public diplomacy as a process of employing truths strategically framed to benefit the interests of the message sender.

I think it definitely has a particular tone to it, but it’s not making up facts. It’s not purposefully trying to mislead people. It’s definitely trying to convince people of a particular ideology but it’s not using and fabricating stories to convince them. (S. Bradshaw, personal communication, September 21, 2018)


And as S. Pike adds, referencing the U.S. perspective of public diplomacy, “When Americans say ‘We do public diplomacy, we only tell the truth,’ there’s shading there, that has to be dealt with” (personal communication, July 26, 2018). Therefore, where public diplomacy content is purposefully framed to serve key strategic interests of the message sender, by and large it operates predominantly on the side of factually verifiable truths.

Public affairs operates in much of the same way as public diplomacy, gravitating toward content that is strategically framed, yet that tends to resist the diffusion of blatant unverifiable content. “Public affairs statutorily is bound to the truth … Even though they [public affairs practitioners] themselves won’t perpetrate any falsehoods, you have to know what you know and what you don’t know in order to be able to give honest answers” (C. Paul, personal communication, July 11, 2018). Thus, both the strategies of public diplomacy and public affairs involve communication engagement with respective target audiences that makes use of predominantly truthful and honest information presented in such subjective ways that support the interests of the sender.

In contrast to the approaches of public diplomacy and public affairs, propaganda was framed in four interviews as making significantly more use of lies and falsehoods to achieve its end-goal objectives. “If it’s blatantly false, it’s wrong, it shouldn’t be done, it’s propaganda” (J. Fullerton, personal communication, October 1, 2018). However, it should equally be noted that propaganda does not exist solely at the polar end of lies and falsehoods. Six interviewees noted that propaganda actually exists somewhere in between the polar ends of objectively verified facts and blatant falsehoods. “Propaganda is the dissemination of truthful or untruthful information by a government for the purpose of influencing public opinion domestically or abroad” (G. Golan, personal communication, August 23, 2018). Indeed, as C. Paul, a Senior Social Scientist and Professor in the Pardee RAND Graduate School, adds,

It’s either involving falsehoods, but not whole truths, in order to convey a mistaken impression or get someone’s attitude or behavior to change where they wouldn’t if presented information more honestly or better aligned with their context or more truly. (personal communication, July 11, 2018)

Based on the extent to which expert interviews identified propaganda as making use of partial truths through full deceptive falsehoods and lies, the overall method of content creation for propaganda was identified as mixed.

Lastly, PSYOPs was classified by the interviewees in much the same way as propaganda. Where certain perspectives characterized PSYOPs as making use of both truths and falsehoods, the overall opinion was that PSYOPs exists much closer to the center of a sliding scale between objectively verifiable truths and blatantly deceptive falsehoods, perhaps even as a necessary evil. “I think a lot of people recognize that in wartime, governments can and do lie, especially to an enemy audience” (E. Briant, personal communication, August 24, 2018). As J. Fullerton adds,

Here’s a distinction that I make, it’s almost to me if psychological operations feels like a weapon strategy. And it’s okay … it can be false. It can be tricky and that’s all right because it serves an end in a war situation. (personal communication, October 1, 2018)

But again, where some, if not a majority, of PSYOPs content may make use of lies, not all such content is mandated to do so.

The secret of military information support operations [U.S. name for PSYOPs] is that even though they are not doctrinally confined to virtuous persuasion, the majority of their activities still fall into virtuous persuasion. They use true information, sometimes selectively conveyed, for an intended influence effect. (C. Paul, personal communication, July 11, 2018)


Therefore, based again on the extent to which expert interviews identified PSYOPs as making use of partial truths through full deceptive falsehoods and lies, the overall method of content creation for PSYOPs was identified as mixed.

The model of communication flow

Based in the interpersonal and mass communication literature, two unique models of communication flow dominated: a one-way communication flow and a two-way communication flow. Five interviewees characterized public diplomacy as a two-way communication flow, denoting a circular relationship between message senders and message receivers. “Good public diplomacy involves engagement. It involves listening, and it involves having a conversation” (J. Pamment, personal communication, September 3, 2018). Indeed, “in the case of public diplomacy, one of the key features is you try to reach out to foreign publics with a goal to build some bridges. You try to build the relationship” (C. Bjola, personal communication, July 30, 2018). Thus, as a two-way communication process, public diplomacy fundamentally seeks to use communication to engage with foreign audiences in dialogue and conversation, a process that ultimately seeks to build and/or maintain a relationship. J. Pamment is a Senior Lecturer and Department Head of Strategic Communication at Lund University, and C. Bjola is an Associate Professor of Diplomatic Studies at the University of Oxford.

To the contrary, propaganda was viewed as a one-way communication process. “I think propaganda is message selling. I think you have a message and you’re putting it across. You’re not engaging. You’re not listening. You’re selling your message” (S. Riordan, personal communication, August 28, 2018). J. Fullerton extends this in asserting that “propaganda is covert, it’s false, it’s intended to benefit the sender, that’s one of the definitions, it’s intended to benefit the propagandist with little thought of the receiver” (personal communication, October 1, 2018). Therefore, as a one-way communication process, propaganda fundamentally seeks to use communication to advance the position and/or interests of the propagandist at the cost of the message recipient. There is no dialogue between said propagandist and the target audience, nor is there a healthy, or possibly even existent, relationship between the two parties.

Interestingly, no mention of communication flow pertaining to public affairs and PSYOPs was made in any of the interviews. However, based on other characteristics, such as manner of engagement (overt/covert) and method for content creation (framed truths/blatant falsehoods), that link public diplomacy and public affairs together, while also similarly linking propaganda and PSYOPs, it can be suggested that public affairs likely engages in communication models that reflect two-way flows of information, while PSYOPs likely engages in models that reflect one-way flows. That is to say, public affairs most likely seeks to use communication to engage domestic audiences in dialogue and conversation, a process that ultimately seeks to build and/or maintain a relationship. To the contrary, PSYOPs most likely seeks to use communication to advance the position and/or interests of the military at the cost of the message recipients. There is little to no dialogue between said military personnel and the target audience, nor is there a healthy, or possibly even existent, relationship between the two parties.

The end-goal intent

Three foreseeable options exist as the end-goal purpose of information intervention: to inform a target audience, to persuade a target audience, or to disrupt a target audience. Eight interviewees characterized public diplomacy as a behavior designed to persuade foreign audiences.

We think foreign publics can influence the decisions their governments take. If we don’t think that foreign publics are going to influence the decisions of their governments, and change decisions of their governments in ways that favor us, why are we spending money and precious resources on public diplomacy. (S. Riordan, personal communication, August 28, 2018)

S. Riordan is the Director of the Chair for Diplomacy and Cyberspace at the European Institute for International Studies and a former diplomat with the British Foreign and Commonwealth Office. While the end goal of public diplomacy may indeed be to persuade a foreign audience of some social or political stance they do not naturally endorse, there is the potential for such a persuasion effort to disrupt the social and/or political order of a state. Public diplomacy seeks “to affect the composition of their markets for loyalties, to destabilize, to help mold opinion among their public and otherwise to assert ‘soft power’ for the purposes of achieving the national ends of the transmitting state” (R. S. Zaharna, personal communication, July 19, 2018).

In considering the intent behind public affairs, all interviewees who addressed the issue pegged public affairs as a behavior which seeks both to inform and, through possibly informing, persuade a domestic audience. “I have a very specific definition of public affairs. It is still an attempt to inform or persuade” (S. Pike, personal communication, July 26, 2018). Where the emphasis of public affairs may be on informing a domestic audience, this does not mean that such communication content does not have persuasive effects. “Now, there is a debate within and around the public affairs community when old public affairs can say that their mission is to inform but not influence, that suggests that they believe there’s such a thing as value-free information, that it is possible to inform without influencing” (C. Paul, personal communication, July 11, 2018). Hence, based on the possibility for information to both inform and persuade, the end-goal intent behind public affairs is classified as mixed.


Interestingly, propaganda was characterized in a similarly mixed relationship, this one being between persuasion and disruption. Where some interviewees characterized propaganda as a behavior of persuasion, others characterized it as a behavior of disruption.

I see propaganda as a mutual concept which is somewhat controversial, but I see it as a kind of collective term which encompasses activities that are political or strategically motivated in order to shape ideas, emotions, and behaviors or opinions of a target audience. (E. Briant, personal communication, August 24, 2018)

Thus, as expressed here, propaganda is centrally located around the end-goal intent of persuading a target audience in some fashion. To the contrary, however, other interviewees expressed the notion that persuasion may have been a side effect, but the end-goal intent itself was some form of social and/or political disruption.

The context [of propaganda] is false … This is not what public diplomacy is supposed to do. This is where you cross the line into propaganda … sharp power is simply a modern term applied to propaganda, and the only difference, this is propaganda done with digital media. (E. Gilboa, personal communication, August 15, 2018)

And as C. Bjola adds, “See, for instance, the case that happened in the Skripal case in the UK, in which you basically use disinformation to confuse … It’s so easy, nowadays, to flood the channels with so much information that is particularly offensive” (personal communication, July 30, 2018). Hence, based on the feedback provided in the interviews on the extent to which propaganda campaigns can seek effects of persuasion and/or societal disruption, the end-goal intent for propaganda is classified as mixed.

Lastly, PSYOPs was characterized as playing roles across the spectrum of inform, persuade, and disrupt. Two interviewees suggested that there are times when PSYOPs seeks to in fact neutrally inform a target audience. “I would consider that to be a range of communication activities, beginning with basic intercultural communication. So putting up signs that people understand, for example, that you need to stop at a checkpoint” (J. Pamment, personal communication, September 3, 2018). And as E. Briant further explains, “Mostly they [the military] tend to emphasize that they do kind of informational messaging a lot. So things that are, you know, ‘Get out of this area quickly! It’s about to be bombed’” (personal communication, August 24, 2018).

Despite efforts to neutrally inform, PSYOPs does engage significantly in efforts to persuade target audiences. Three interviewees agreed as much. “Psych ops is something that I think of as being a type of practice to change public opinion through sometimes deception, or various means” (C. Myers, personal communication, July 18, 2018). C. Myers is an Assistant Professor in the Department of Communication at Virginia Tech. As C. Paul adds, “It’s [PSYOPs] something about efforts to influence foreign actors, and so the key elements of the definition are one, that it acknowledges that you’re actually trying to conduct influence, that you want to get someone or someones to do or not do something” (personal communication, July 11, 2018). To that end, the large majority of PSYOPs, as an element of strategic communication, seeks some kind of persuasive end-goal.

That is not to say, however, that PSYOPs does not also, at times, seek an end goal of societal disruption. In three interviews, references were made to the extent to which PSYOP campaigns seek to initiate or perpetuate certain levels of social and/or political discontinuity.

You’re not trying to convince the other side, you’re trying to undermine the confidence of foreign publics in all narratives … trying to fragment political and social debate … what they’re really trying to do is just break up our societies … it creates a kind of chaos, which [the adversary] can take advantage of. (S. Riordan, personal communication, August 28, 2018)

In such scenarios where PSYOPs seeks an end goal of societal disruption, it functions to diffuse information that seeks to exacerbate, if not initiate, social and political divisions that interfere with the normal functioning of said society, so as to benefit the interests of some external and adversarial state.

The policy outcome

The range of policy options relevant to information intervention is conceived to include public policy broadly defined, which consists of the overarching point of view of an administration or federal government that controls and directs all other policies. Supplementing this are the areas of foreign policy, national security, and economic policy. Five interviewees identified public diplomacy as seeking to service and/or fulfill a foreign policy outcome.

You do diplomacy for a reason, because you try to achieve something. Public diplomacy is a part of diplomacy. It’s a subset of diplomacy. Diplomacy, in turn, is a subset of state craft. State craft is how we achieve our foreign policy objectives. (S. Riordan, personal communication, August 28, 2018)

As E. Gilboa, a Professor and Director of the Center for International Communication at Bar-Ilan University, adds, “Public diplomacy … is designed to engage foreign publics in order to influence foreign policy … for that public to influence the foreign policies of the relevant country they live in” (personal communication, August 15, 2018). Now while the end-goal outcome of public diplomacy may be the advancement of a country’s foreign policies, such foreign policies can be extensively linked with other policies, such as national security and even a state’s economic policy abroad.

As a domestically-centered behavior, three interviewees identified public affairs as addressing elements of domestic policy, two linked it to foreign policy, two to economic policy, and one to national security. Given that public affairs constitutes government communication with its own domestic population, it is conceivable to suggest that all ranges of state policy can, and would, be addressed under its purview. Therefore, the policy outcome for public affairs communication is classified as public policy broadly defined.


Interestingly, none of the interviews explicitly or implicitly linked propaganda or PSYOPs to specific policy outcomes. That being said, given that propaganda shares key target audience relationships with both public diplomacy and public affairs, it is conceivable to suggest that propaganda is a strategy that can be used broadly to affect an equally broad range of policy outcomes, including domestic, foreign, economic, and national security concerns. Therefore, propaganda is classified as fulfilling a public policy role, again broadly conceived. To the contrary, given PSYOPs’ almost exclusive linkage to military establishments, it is conceivable to suggest that PSYOPs is a strategy that first and foremost seeks to achieve national security outcomes. Understanding that the predominant perspective of national security is directed at defending a state from military attack, it naturally follows that such a military attack would come from another state, i.e. from abroad. Therefore, while the end-goal outcome of PSYOPs may be the advancement of a country’s national security, such national security concerns can easily be linked with foreign policy issues and positions.

Emergent Themes Within the Data

Where the above sections address themes that were purposefully sought and explored within the in-depth expert interviews, there were also noticeable themes that emerged from the collected data. Three of these themes were additional elements that serve as functional characteristics in demarcating public diplomacy from propaganda: attribution, accuracy, and emotional appeals. The fourth theme was significantly broader in scope, an element of caution that several interviewees raised regarding the effort to succinctly demarcate the strategies of information intervention: perspective.


Attribution

Credibility is one of the key criteria used psychologically in the evaluation of persuasive communication (Perloff, 2014). Computational propaganda, as seen between 2016 and 2018, has been unique in how it has used anonymity as a cover in the diffusion of its content (Bjola, 2018), thereby inhibiting the ability of message receivers to critically evaluate the utility of such content. Understanding that such anonymity has been a trademark of malicious online content, it is understandable that a majority of interviewees addressed the topic of attribution in relative detail in distinguishing public diplomacy from propaganda.

Attribution never hurts, right? … misinformation online has to do with secure attribution, a one-person one-voice and authenticated voice, so you know that you’re not having a conversation with one person and their hundred bots and you know where that person is from. (C. Paul, personal communication, July 11, 2018)

To revisit a comment from A. Arsenault, “propaganda refers more and more particularly in the community of practice, to non-attributed efforts to influence foreign publics” (personal communication, August 3, 2018). Indeed, framing the work of public diplomacy as the work of advertising, J. Fullerton sees public diplomacy as necessitating transparent attribution as opposed to propaganda,

The definition of advertising is sponsored content that is paid with an identified sponsor. I think that if the U.S. government is sending out messaging, it doesn’t have to be explicit, but if they feel the audience does not understand it’s coming from the U.S. government, that falls into propaganda. (personal communication, October 1, 2018)

Where such an attribution difference exists between public diplomacy and propaganda, such distinctions also exist within the physical environment of cyberspace and the Internet.

The question of attribution is becoming relevant in the case of information intervention. How serious, how accurate can we make these kinds of attribution claims? … there are intelligence and techniques to pull it out who it might be … from this point of view, the attribution issue is being sorted out. (C. Bjola, personal communication, July 30, 2018)


Therefore, an emergent concept in the data that offers a functional demarcating factor between the approaches of information politics and information operations is an overarching level of transparent source attribution. Further, transparency of attribution need not be limited to information source alone, but can also include transparent attribution of financial backing, i.e. what source is fiscally supporting the publication and diffusion of such content. “Then there are lots of other questions that came up around SCL and Cambridge Analytica around funding and how funding is being obfuscated” (J. Pamment, personal communication, September 3, 2018).

To the contrary, where 15 interviewees brought up issues of transparency, many of them cautioned about the potential abuse of adopting policies that compelled transparent attribution.

As A. Arsenault suggests, closed societies can attack not only their adversaries, but also proxies who can be attributed to adversarial states. In the case of NGO work in Egypt or Ethiopia, transparency on the part of American NGOs is viewed as hostile, and such organizations are censored if not banned. Further, where the United States may be hostile toward the content of Confucius Centers in the country, such behavior would likely be reciprocated toward attributable content from American centers abroad, or the United Kingdom’s British Council, France’s Alliance Française, or Germany’s Goethe-Institut. Such a model for compelled attribution, while seemingly in the interest of free speech, is likely instead to offer methods for censorship in more totalitarian states (personal communication, August 3, 2018). Indeed, anonymity has played a powerful role in the historical development of democratic political models, and to compel source and financial attribution is to potentially do as much damage to a democratic society as computational propaganda already seeks to achieve, at least from the perspectives of affected Westphalian states, including the United States, Great Britain, France, and Germany.


Where compulsory attribution poses certain threats to open societies, it is not entirely clear that such source attribution would even be effective. Digital tactics for integrating sponsored content, such as native digital advertising, camouflage such content to achieve a cognitive association between the content and the publication source, as opposed to between the content and the true source of the content. As G. Golan suggests, sponsored content is one of the most dangerous areas of native advertising because, even when disclosures of sponsorship are present, research suggests the majority of people are incapable of differentiating between editorial content and native advertising content (G. Golan, personal communication, August 23, 2018).

Where transparent attribution of source and financial support can serve as an evaluator to aid in determining differentiations between public diplomacy and propaganda, and between digital diplomacy and computational propaganda, such utility stops at that point. Seeking governmental oversight of content attribution in cyberspace does not hold practical use in such ways that would aid democratic societies from identifying, confronting, and countering computational propaganda campaigns.

Emotion

A second theme that arose in a handful of interviews was the use of emotional appeals in the creation of persuasive content. In only two interviews were appeal types discussed. Despite such a small percentage, the topic contributes significantly to a theoretically-driven analysis of online persuasive content from a psychological social science perspective. Particularly, content tactics for traditional and computational propaganda were identified as significantly stressing emotion-laden content.

A lot of propaganda tends to be really emotional. They tend to use emotionally driven language that gets people riled up and enraged about something … They touch on already emotional issues and use that kind of language to push people further and to get that kind of response out of them. (S. Bradshaw, personal communication, September 21, 2018)


Where such content stresses emotionally-laden appeals, it often appeals to more basic animalistic instincts of inclusion and exclusion, i.e. in-grouping versus out-grouping through stereotyping and dehumanization.

It [propaganda] now describes a certain type of efforted communication, characterized by pejorative messages, generally negative stereotypes. They can be racist, they can be ethnic, they can be nationalistic, but in a sense, you are painting someone in the worst possible light, trying to convince people that the other guy is very evil. (S. Pike, personal communication, July 26, 2018)

It can therefore be concluded that content tactics for information operations tend to stress more intensive emotional appeals than do content tactics for information politics.

This framework of emotional appeals in the confines of persuasive communication fits neatly within mass communication scholarship, particularly the sub-fields of media psychology and media effects. Specifically, the Elaboration Likelihood Model of persuasion (ELM) suggests that there are two core routes for imparting behavior modification via persuasive appeals. In the first, the central route, behavior modification is achieved through effortful cognitive processing, i.e. critical evaluation of merit through argument and debate. The second, the peripheral route, makes salient the positivity or negativity of characteristic cues that are less central to the decision for behavior modification at hand (Petty & Cacioppo, 1986). Thus, the central route makes use of more rational appeals while the peripheral route is indicative of more emotional appeals.

Therefore, in attempting to discern functional characteristics in the demarcation of public diplomacy from propaganda, it can be suggested that content tactics for public and digital diplomacy tend to frame truths in ways that gravitate toward rational appeals. To the contrary, content tactics for traditional and computational propaganda tend to use more inaccurate falsehoods in ways that gravitate toward emotional appeals. Interestingly, ELM suggests that the effects of emotional appeals via the peripheral route tend to dissipate significantly faster than those of rational appeals via the central route, leading to decreased likelihood of behavior modification (Petty & Cacioppo, 1986). To that end, the Internet poses a unique modifying variable in this equation, due to the speed and consistency with which content can be viewed. Media psychology and media effects scholarship suggests that a key factor in content impact is repetition, i.e. how intense the content of the message is and how often someone receives the message (Potter, 2012). The affordances of an ultra-high content medium operating 24 hours a day, such as the Internet, offer a route through which the long-term effectiveness of peripheral route emotional appeals can be greatly increased. Understanding this perspective of emotional appeals in persuasive content, as an emergent theme in the data, offers an additional cue that can serve as a functional demarcation point in the evaluation of persuasive messaging.

Accuracy

A third theme that emerged in the data, less emphatic than attribution but often in proximity to it, is the idea of information accuracy. Five interviewees made overt references to the accuracy, i.e. truthfulness, of information as a key qualifier demarcating public diplomacy from propaganda.

The main differences between propaganda and public diplomacy lie in truth … if the information processes and content include misleading public opinion or providing false information on purpose, then public diplomacy would then become very close to propaganda. So, it’s the question of what kind of information you provide. (E. Gilboa, personal communication, August 15, 2018)

Where the concept of information accuracy is linked to the probed area of method for content creation, i.e. truths versus lies, the notion of accuracy here adds the perspective of a sliding scale. That is, these approaches do not function in polarity; content is seldom uniquely one or the other, but rather exists in a continuum between the two. As C. Bjola adds,


It’s important to consider how much emphasis you put on truthfulness, to what extent the statements approximate the truth. It’s not about agenda setting, …. but to what extent the substance of the content of the agenda is actually truthful. The more you lose that, the more you move in to the dark side, what we call propaganda. (personal communication, July 30, 2018)

Therefore, the emergent concept of information accuracy adds to the manifest topic of method for content creation by broadening its conceptualization to be a continuum of accuracy, as opposed to polar points of truth versus falsehood.

Perspective

One final notion that emerged from the interview data was a cautionary note on the ultimate utility of demarcating strategies for information intervention, given the universal tendency for personal perspective to dominate. Thirteen interviewees noted some extent to which functionally demarcating strategies for information intervention offered minimal utility. P. Rosenzweig suggests the only utility in making definitions is to classify strategies on where they would fall under the law, and since there is no real domestic or international law in this area, the exercise would be theoretical at best (personal communication, July 12, 2018).

Some interviewees thought that strategies for information intervention were largely indistinguishable in the field. “So I tend to think of these things as being on the same spectrum with varying degrees of transparency, … when in practice they may in many ways be the same thing” (E. Metzgar, personal communication, August 27, 2018). To the contrary, others found there to indeed be differences, yet marginal in nature: “much public diplomacy we see done by governments is little more than propaganda plus marketing” (S. Riordan, personal communication, August 28, 2018).

Others, still, found there to be conceptual differences in strategies for information operations, yet ones that tend to disappear practically when applied in the field. As A. Arsenault suggests, classifying information intervention strategies may hold interest in an academic capacity, but once addressed in an applied capacity the abstract nature of the concepts would prove to be too much: “in many ways it’s like pornography. You know it when you see it, but many of the techniques that are used, there’s a gray area in between psychological operations, propaganda, and public diplomacy in terms of techniques” (personal communication, August 3, 2018).

Interestingly, similar to how A. Arsenault notes that demarcations between information strategies are predicated on contemporary circumstances, E. Briant notes how many of these conceptual differences bleed away in practice down a historical chronology.

Things that were much more rigorously policed as to their truthfulness are shifting in their definitions and organization, and [there is] much less separation between the practice of different disciplines than there used to be … becoming more acceptable that there had to be less of a clear line between them around 2007. (personal communication, August 24, 2018)

Such perspectives suggest that the definitional framing of information strategies is contingent upon a host of social factors, such as, for example, time in history.

Therefore, by and large, this emergent theme was conceived to represent the perspective through which an individual views such concepts, relevant to a number of time and place factors including century, location, nationality, ethnicity, political identification, religious identification, industry structure and hierarchy, etc. Therein, such definitions of public diplomacy, public affairs, propaganda, and psychological operations ultimately fall to the individual and the question of who says.

The logic to understanding the differences between them, is who says? Ultimately, public diplomacy and public affairs, are highly specialized and professionalized aspects of propaganda. The fact that those concepts have been introduced, they’re just a manifestation of the growing professionalization of communicative practices with foreign policy-making and public-policy making. (P. Surowiec, personal communication, July 23, 2018)


Table 4-1. Typology of Information Intervention Stratagem.
(Characteristics per minor type: in what manner; through what method; appeal type; communication type; with what intent.)

Propaganda (who engages: collaboration across political and civil society; target audience: all audiences; policy end: promote, support, and/or change public policy)
- White: overt; framing truth; rational appeal; one way; inform public opinion
- Gray: mixed manner; mixed method; mixed appeal; one way; persuade public opinion
- Black: covert; deception through falsehoods; emotional appeal; one way; disrupt public opinion

Psychological Operations (who engages: military and/or intelligence groups; target audience: foreign audiences; policy end: support national security and promote foreign policy)
- White: overt; framing truth; rational appeal; one way; inform public opinion
- Gray: mixed manner; mixed method; mixed appeal; one way; persuade public opinion
- Black: covert; deception through falsehoods; emotional appeal; one way; disrupt public opinion

Public Diplomacy (target audience: foreign audiences; policy end: support and promote foreign policy and national security)
- Traditional (political institutions/persons): overt; framing truth; rational appeal; mixed communication flow; persuade public opinion
- New (political or civil institutions/persons): overt; framing truth; rational appeal; two way; persuade public opinion

Public Affairs (target audience: domestic audiences; manner: overt; method: framing truth; appeal: rational appeal; communication flow: mixed; intent: inform/persuade public opinion; policy end: promote, support, and/or change public policy)
- Government: political institutions/persons
- Corporate: corporate institutions/persons
- Interest Groups: civil institutions/persons

105

Given this emergent theme, namely that individual perspective dominates the debate over defining strategies of information intervention, it would be redundant to devote extensive time and effort to teasing out the functional differences between such strategies. By and large, however, much of this existing work remains both valid and necessary, as it offers a host of uses. First, it offers a time-specific glance at the mass perspective on such strategies in the early 2000s. Second, it offers functional insight into the mechanical way in which such operations are carried out.

They’re all intended in some way to achieve the same effect, and they all use really quite indistinguishable means of activity. The only real distinctions between them are in the operational ways in which they’re implemented. (P. Rosenzweig, personal communication, July 12, 2018)

Therefore, the driving contribution of both the literature and the collected interview data is the formulation of a typology that offers eight functional characteristics in the operational makeup of public diplomacy, public affairs, propaganda, and psychological operations that can be used as means of demarcation in the study and evaluation of information intervention. The full typology of information intervention stratagem can be viewed in Table 4-1.

Research Question 2

The second area of inquiry in this study (RQ2) asks how existing frameworks for cybersecurity governance apply to curtailing strategies of computational propaganda. Therefore, the analysis of the data seeks to explore longstanding codified frameworks for confronting computational propaganda, so that policy-based recommendations can be formulated, evaluated, and presented for consideration. Namely, the existing framework analyzed is, to date, the only piece of codified international law addressing security on the Internet: the 2001 Budapest Convention on Cybercrime (Clough, 2014). Despite a minor addendum adopted in 2006, the convention exists in 2018 much as it did in 2001, presenting considerable challenges in a social and technological arena as fast-evolving as the Internet.


A textual analysis of the Budapest Convention showed that just over one-third of the text in the convention concerns the theme of international cooperation; approximately one-fifth refers to avenues and mechanisms for the collection and use of evidence to detail cybercrime behavior; 16 percent addresses the criminality of unwanted behavior online; and 16 percent details the procedural mechanisms of the convention.

One of the current limitations of the Budapest Convention is that it is driven by the contextual perspective of cybersecurity in 2001. J. Doe, a cybersecurity expert with the Center for International Security and Cooperation at Stanford University, suggests that one of the leading limitations of the convention lies in its purview of which actors engage in cybercrime. Reflecting the prevailing perspective of cybersecurity in the early 2000s, the convention views non-state actors, i.e. civilian hackers rather than agents of the state, as the predominant perpetrators of cybercrimes (personal communication, October 3, 2018).

The text of the convention suggests that the primary elements addressed as cybercrimes include such activities as: access to a computer without permission (Article II), the interception of computer data without permission (Article III), damaging, deleting, deteriorating, altering, or otherwise suppressing computer data without permission (Article IV), or similar hindrance to the operation of a computer system without permission (Article V). More broadly conceived, these articles refer to behavior akin to hacking, child pornography, and intellectual property violations (Electronic Privacy Information Center, 2005), i.e. behaviors that are conceived of as violations by non-state persons and/or groups. Where approximately one-third of the convention addresses avenues for international cooperation between state actors, the convention does not account for the possibility, and in fact modern reality, that states themselves are active, if not leading, actors in the areas of cyber theft, espionage, and attack.

Another limitation of the convention is its inability to garner sufficient signatories and ratifications outside of Europe and the British Commonwealth. South Africa is the only BRICS state to have ratified the convention, and outside of Japan, none of the major state actors in technology, including India, South Korea, China, and Russia, had signed or ratified the convention as of 2018 (Council of Europe, 2018). C. Paul acknowledges that the Budapest Convention, representative of international law in this area more broadly, is defined by its failure to secure key signatories such as Russia, China, and Ukraine. If the main perpetrators of cybercrime are not party to the convention, then it likely regulates only the actors most likely already in compliance, drastically minimizing the effectiveness of the convention's intended goal (personal communication, July 11, 2018).

Interestingly, it was also pointed out in the interview data that the Budapest Convention itself does not overtly establish the aforementioned behaviors as crimes under international law; it merely instructs state actors on how to modify their own national legal structures to do so, thereby harmonizing such a domestic perspective across a range of state actors.

Whether it’s a crime, you’re immediately defining it. Is it a crime in the U.S., or is it a crime in Russia … Is it a crime under international law? I’m not sure there’s a treaty that makes these activities crimes under international law … It [the Budapest Convention] in itself doesn’t make something a crime. (M. Price, personal communication, August 22, 2018)

M. Price is a Professor at the Annenberg School for Communication at the University of Pennsylvania. Where approximately 15 percent of the convention addresses the criminality of unwanted behavior online, it does so in the context of instructing state actors how to view and address such behavior.


Overall, the framework of the Budapest Convention as it exists is unlikely to be a leading solution for confronting computational propaganda. This is because computational propaganda, as witnessed between 2016 and 2018, has in fact failed to violate any serious laws at either the international or domestic level. The network systems were not attacked or broken into, nor was information stolen from them; the systems were used precisely in the manner they were designed to be used (J. Doe, personal communication, October 3, 2018). While computational propaganda may be used to supplement other forms of state- and non-state-sponsored cyber-attacks, such as the hacking of Hillary Clinton's or Emmanuel Macron's email accounts, or distributed denial of service (DDoS) attacks, computational propaganda as an independent behavior fails to constitute criminal behavior.

The question about election interference in the sense of using social media and someone using social, and you keep out the sort of interfering with systems, if you take that away, where is the crime? (A. Seger, personal communication, November 5, 2018)

Hence, where the strategies of information operations may violate latent norms of behavior, i.e. spreading false information to disrupt societies and manipulate political processes, there are no manifest or direct codified laws under the purview of which such behavior falls.

Capacity Building: Sybil Detection and Bot Identification

That is not to say, however, that the Budapest Convention is useless in the effort to combat cyber information operations. Quite to the contrary, it has made more progress in the international arena than any other body of work, and it holds frameworks very much worth carrying forward. Chapter III of the convention, devoted to establishing a functional framework for international cooperation, offers the most established template for promoting amicable and functional cooperation between states regarding cybercrimes and threats. One of the central tenets of the convention's framework of international cooperation is capacity building, the technical training and assistance programs designed to bring all members to the convention onto an equal technological footing. Such capacity building programs are operated through the Cybercrime Programme Office (C-PROC) in Bucharest, Romania.

Howard and Kollanyi (2016) found that, while not necessarily reflective of all bot-generated content, there are cases where bots generating computational propaganda migrate across issues. Indeed, they found that two bots that had been pro-Palestinian in content migrated to the pro-leave and pro-stay camps in the British referendum. As Howard (2017) explained further in his inaugural lecture at the Oxford Internet Institute, a handful of bot accounts had migrated from Israeli-Palestinian issues to the British referendum, then on to the U.S. presidential election, the Italian referendum, the German election, and the French election. This suggests that, while not indicative of all bot activity, it is possible to track bots as they migrate across issues that span international boundaries. The quantitative research methodology of social network analysis would be a prime way to track such bots, and also to potentially identify and mine significantly larger bot networks. Social network analysis, however, poses two potential issues: first, such bot accounts need to be identified to begin the process of tracking, and second, countries need the capacity to operate such environmental scanning (Dozier, 1986) efforts.

Where social network analysis enables the tracking of social media accounts and the identification of the networks within which they operate, the method does not itself allow for the identification of social media bots. What does allow for bot identification is Sybil detection.

The use of Sybils, identified as "spammers, fake users, and compromised normal users" (Wang, Jia, Zhang, & Zhenqiang, 2018, p. 1), constitutes a formidable tactic in cybercrime and cyber warfare, and is a well-studied area of computer science and computer-based engineering. While methods of Sybil detection are well known among computer scientists and engineers, the method is field-specific, with little pertinent knowledge diffused outside the STEM academic and industry communities.

Sybil detection works on the assumption that social media platforms are built on social networks; as bots do not have natural social connections in the real world, they are likely to have limited to no connections in the cyber world. It is this lack of social connection that can be tracked and mapped out in graph form (G. Stringhini, personal communication, October 24, 2018). G. Stringhini is an Assistant Professor of Electrical and Computer Engineering at Boston University. With specialized training from computer engineers familiar with existing methodologies for Sybil detection, an entire range of social scientists already familiar with social network analysis could employ such methods.
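The network-integration logic described above can be sketched in a few lines of Python. The follower graph, account names, and the 0.2 threshold below are invented for illustration; a real analysis would operate on a crawled follower graph, typically with a library such as NetworkX.

```python
def reciprocity(follows, account):
    """Share of an account's outgoing ties that are reciprocated."""
    out = follows.get(account, set())
    if not out:
        return 0.0
    mutual = sum(1 for other in out if account in follows.get(other, set()))
    return mutual / len(out)

def flag_sybil_candidates(follows, threshold=0.2):
    """Accounts whose ties are rarely reciprocated lack the organic
    embeddedness of genuine social connections."""
    return {a for a in follows if reciprocity(follows, a) < threshold}

# Hypothetical follower graph: account -> set of accounts it follows
follows = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "carol"},
    "carol": {"alice", "bob"},
    # a bot following many accounts with no reciprocation
    "bot_1": {"alice", "bob", "carol", "dave"},
    "dave": set(),
}
print(flag_sybil_candidates(follows))  # flags 'bot_1' (and isolated 'dave')
```

The design choice here, flagging accounts by reciprocity rather than raw follower counts, mirrors the intuition that genuine users accumulate mutual ties organically.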

Beyond network integration, methodologies for analyzing artificial inflation also hold significant merit in the area of Sybil detection. Because bots stick out in social network analyses due to their lack of integration, bots now attempt to inflate their connections in ways that are not natural in a socially networked environment. They will follow or friend large numbers of people with little reciprocation, or they buy followers and friends who, over time, upon realizing they do not actually know the bot account, terminate the connection. Such artificial and unnatural means of achieving integration are mechanisms for identifying accounts likely to be bots (G. Stringhini, personal communication, October 24, 2018). Tracking anomalous variations in followers within social networks holds significant promise for identifying bot accounts. Much like analyzing patterns of network integration, the analysis of big data sets for abnormal flows in the establishment and/or disestablishment of ties within social networks is a prime activity for the social network analysis approach.
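One way to operationalize the idea of anomalous follower variation is to flag days whose follower-count change deviates sharply from an account's typical day-to-day fluctuation. The daily counts and z-score cutoff below are invented for illustration:

```python
import statistics

def anomalous_days(daily_counts, z_cutoff=1.5):
    """Indices of days whose follower-count change deviates sharply
    from the account's typical day-to-day variation."""
    deltas = [b - a for a, b in zip(daily_counts, daily_counts[1:])]
    mean = statistics.mean(deltas)
    sd = statistics.pstdev(deltas) or 1.0  # guard against zero variance
    return [i + 1 for i, d in enumerate(deltas) if abs(d - mean) / sd > z_cutoff]

# A sudden spike (purchased followers) followed by a crash (terminated ties)
counts = [100, 103, 101, 104, 900, 880, 400, 402, 405]
print(anomalous_days(counts))  # [4, 6]: the buy day and the purge day
```

In practice this kind of scan would run over large longitudinal data sets, but the signal, abrupt inflation followed by disestablishment of ties, is the same one described above.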


There is also one further tactic for Sybil detection that holds merit within the methodological framework of social network analysis: identifying bot networks through the coordination of hashtag usage. Understanding that bot networks are operated by a central user, there tends to be synchronized activity across the bot network, i.e. the accounts tend to post at similar times, on similar topics, and using key hashtags. Over time and with large enough data sets, trends can be identified to denote which accounts, what content, and what hashtags are likely implicated in a bot network (G. Stringhini, personal communication, October 24, 2018). Thus, utilizing social network analysis, again over massive sets of big data, holds significant promise for identifying both individual bot accounts and, possibly, entire networks of bot accounts.
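The coordination signal can be sketched as pairwise co-posting counts: how often two accounts push the same hashtag within a narrow time window. The account names, timestamps, and 60-second window below are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

def coordination_scores(posts, window=60):
    """posts: iterable of (account, timestamp_seconds, hashtag) tuples.
    Returns a mapping of account pairs to the number of times they posted
    the same hashtag within `window` seconds of each other."""
    scores = defaultdict(int)
    by_tag = defaultdict(list)
    for account, ts, tag in posts:
        by_tag[tag].append((account, ts))
    for entries in by_tag.values():
        for (a1, t1), (a2, t2) in combinations(entries, 2):
            if a1 != a2 and abs(t1 - t2) <= window:
                scores[tuple(sorted((a1, a2)))] += 1
    return dict(scores)

posts = [
    ("bot_a", 0, "#tag1"), ("bot_b", 10, "#tag1"),
    ("bot_a", 500, "#tag2"), ("bot_b", 505, "#tag2"),
    ("human", 4000, "#tag1"),
]
print(coordination_scores(posts))  # {('bot_a', 'bot_b'): 2}
```

Over large enough data sets, pairs with persistently high scores would be candidates for a centrally operated bot network, while organic accounts (like "human" above) drop out.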

Overall, three primary tactics for Sybil detection, i.e. network integration, artificial inflation, and coordinated content, are prime candidates for the research methodology of social network analysis. Currently, such tactics reside in the academic and industrial toolbox of computer scientists and engineers. However, given the mass diffusion of the Internet, social scientists are moving further and further into computational research methodologies, as evidenced by their adoption of social network analysis and their development and use of platforms and software packages such as NodeXL, Jupyter, Python, Pandas, NetworkX, and Gephi. With a relative amount of training, the approach to Sybil detection can be expanded to include the world of social science as well as the STEM sciences.

While the academic and industrial bases in power states in North America, Europe, and parts of Asia may be highly suited to use such social network analysis methods for varying scopes of Sybil detection, it is unlikely that countries outside of these power states and tech leaders, what McPhail (2014) refers to as core countries in a world systems theory model, i.e. roughly the 30 most developed countries in the world, would be capable of keeping pace. Armed with sufficient data mining software, the necessary hardware and processing power to mine and export big data, and a sufficient number of experts to analyze the data and synthesize usable inferences, only these top 30 states at best have the technological resources and know-how to approach Sybil detection and social network analysis on the scale needed to confront computational propaganda.

Though bi- and multi-lateral treaties could be established to allow the remaining 165 or so countries to make use of the assets and resources of the top 30, doing so would make the former dependent on the latter. As the diffusion and use of the Internet, including the use of bot networks and the spread of misinformation, is likely only to grow for the foreseeable future, the more prudent way forward would be for autonomous states across the globe to develop such capabilities independently so as to retain their information sovereignty. Further, bot detection is akin to looking for a needle in a haystack, and the best means of finding individual bots and related bot networks is to drastically increase the number of eyes looking for said needle, colloquially speaking. Therein, the notion of cyber-based capacity building enumerated in the Budapest Convention, and carried out since through C-PROC, offers a viable framework for how to train and develop, at an international scale, teams comprised of both computer engineers and social scientists capable of such efforts.

Mutual Assistance and Spontaneous Information: Bot Tracking Across Borders

Capacity building is not, however, the only useful element of international cooperation established in the Budapest Convention. As the convention is a multi-lateral treaty interested in legislative and judicial responses to cybercrime, evidence is critical to any court-based effort to prosecute cybercriminals. Twenty-two percent of the convention refers to avenues and mechanisms for the collection and use of evidence in a court of law. In such light, international cooperation is critical to the collection and eventual transfer of such evidence across international jurisdictions. Should a person located in Brazil hack into the Facebook account of a company located in Romania, for example, prosecution would require extensive cooperation between the Brazilian and Romanian governments, and possibly even the Irish government, as Facebook's servers for the European theater are housed in Dublin. Such cooperation would surround the issues of evidence collection, analysis, and sharing. In applying such international cooperation to the fight against computational propaganda specifically, the convention outlines utility in the areas of mutual assistance and spontaneous information.

Article 25 of the Budapest Convention establishes a framework for mutual assistance. This framework for international cooperation around information sharing, enumerated in a variety of articles in the convention, offers considerable utility in tracking bot networks and computational propaganda campaigns across borders. Articles 25.1 and 25.3 outline a procedural framework through which one country can request mutual assistance, i.e. information and assistance broadly defined, from a co-signatory of the convention (Council of Europe, 2001). The articles stress not only the need for functional cooperation between state actors, but also for timely cooperation, to the extent that cybercrimes may involve time-sensitive situations.

Article 26 of the Budapest Convention adds to Article 25 in establishing a framework for spontaneous information, whereby state actors can volunteer the sharing of information not initially requested by a partner state (as outlined in Article 25). Article 26 outlines a procedural framework through which one country can independently and proactively provide mutual assistance, i.e. information and assistance broadly defined, to a fellow co-signatory of the convention without a formal request on that state's part for mutual assistance (Council of Europe, 2001). Where Article 25 establishes a reactive framework for international cooperation, Article 26 establishes a proactive one. Interestingly, the Budapest Convention also makes allowances for inter-state cooperation between states where no codified framework for cooperation already exists; Article 27 lays out such a framework through a 17-step process. Hence, whether a state of cooperation already exists or not, the convention establishes a viable process through which states can share information and assistance with each other in both proactive and reactive measures.

Research has shown that, while far from a majority, there have been cases where bots have migrated from issue to issue across geopolitical boundaries in real space (Howard, 2017; Howard & Kollanyi, 2016). As a product of effective capacity building around the areas of Sybil detection and social network analysis, a greater number of state actors would be able to identify and track bot network activity surrounding issues taking place within their own borders, again in real space. With bot accounts under such surveillance, state research units would be able to track bot activity and, should it occur, report on any bot content that migrates across international borders and, as enumerated in Article 26 of the Budapest Convention, proactively share such information with the relevant state to which the bots migrated.

Article 26 holds merit for states with existing bi-lateral treaties for cooperation: a country like Great Britain, had it identified the bots circulating computational propaganda, could have alerted its counterparts in the United States when the bots' issue migration took place. Equally, if not more, pertinent would be the utility of Article 27 in establishing international cooperation between states that have little to no working relationship with each other in this area. For example, should the Israeli state identify bots circulating computational propaganda on the two-state solution with Palestine, and notice that such bots migrate to Saudi Arabia to stress social issues around that state's involvement in Yemen, or the purported involvement of Crown Prince Salman in the death of Jamal Khashoggi, the Israelis could alert their counterparts in the Saudi government. While this example is purely theoretical, given the threat of Iran in the region and the growing knowledge of how Iran has utilized computational propaganda (Stubbs & Bing, 2018; Tabatabi, 2018), the utility of Article 27 in establishing mutual assistance-based cooperation between states lacking such bi-lateral agreements is paramount for regional, and even trans-continental, stability.

Research Question 3

The third area of inquiry in this study (RQ3) seeks to explore how newer frameworks for cybersecurity governance apply to strategies of computational propaganda. Therefore, the analysis of the data seeks to explore newly codified frameworks for confronting computational propaganda, so that policy-based recommendations can be formulated, evaluated, and presented for consideration. Namely, the new framework analyzed is the EU's General Data Protection Regulation (GDPR). Passed in 2016, the GDPR took effect in May of 2018, outlining a significant regulatory shift in the business models surrounding the personal data collected from online users within the EU as well as from EU citizens abroad. Where the emphasis of RQ2 surrounds direct confrontation of computational propaganda via methods for identifying and tracking bot accounts, the framework of RQ3, i.e. the GDPR, shifts the focus to corporate entities and the collection and usage of the personal data that facilitates the micro-targeting element of computational propaganda. Thus, the GDPR offers indirect policy options for confronting the computational spread of disinformation campaigns.

A textual analysis of the regulation reveals two primary sections. The first section, which outlines the scope of the regulation in language meant for the relatively casual reader, constitutes 44 percent of the text. The second section, which outlines in detail the functional breakdown and makeup of the regulation, constitutes 56 percent. These sections are explored in further detail in Appendix C.

Privacy by Design

One of the key elements of the GDPR that could play a potentially significant role in confronting computational propaganda is Article 25, i.e. data protection by design. Ultimately, the article establishes the responsibility of institutional controllers (i.e. data-collecting actors such as Facebook, Google, etc.) to alter the infrastructural framework of their operating systems to allow for greater data protection. Article 25.1 states that institutional controllers are responsible for adopting and implementing sufficient technical infrastructure as would allow for security measures and other safeguards to protect the security of a user's private information (European Parliament, 2016). Article 25, therefore, outlines a manner in which institutional controllers can, or should, implement technical measures in the design of their operating systems to facilitate increased data security and privacy.

Institutional controllers and differential privacy

Article 25 can be found in Chapter 4 of the GDPR, which is devoted to the topic of the institutional controller, i.e. the responsibilities placed upon data collectors such as Facebook and Google. Article 25 is framed from the perspective of what the institutional controller can do to increase the security and privacy of the data subject. One such mechanism, laid out in Article 5.1 of the GDPR and mentioned specifically in Article 25.1, is data minimization, the idea that data collectors should minimize the amount of data they collect from data subjects, limiting themselves to the information necessary for the institutional controller to provide its services to the customer, i.e. the data subject (Information Commissioner's Office, 2018). Data minimization, however, still involves certain extents of private data being collected and potentially shared with third parties. What should supplement such a framework is the use of differential privacy.
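The data-minimization principle can be sketched as a simple filter applied at the point of collection; the field names and required set below are invented for illustration:

```python
# What the controller actually needs to deliver its service (hypothetical)
REQUIRED_FIELDS = {"user_id", "email"}

def minimize(record):
    """Retain only the fields necessary to provide the service,
    dropping everything else before storage or sharing."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

raw = {"user_id": 7, "email": "a@example.com",
       "location": "Gainesville", "political_view": "undisclosed"}
print(minimize(raw))  # {'user_id': 7, 'email': 'a@example.com'}
```

The point of the sketch is architectural: minimization is enforced in the system's data path rather than left to policy documents, which is the sense in which Article 25 speaks of protection "by design."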

Pioneered by Dwork, McSherry, Nissim, and Smith (2006), differential privacy consists of a mathematical mechanism by which data aggregators can collect user-provided data but then provide such data to third parties in manners that retain the statistical properties of the authentic data while removing individually identifiable information about the data subjects within it. The method essentially adds noise in and around the data to cloud the identities of individual users. Interested parties can still make insightful inferences about the population at large, but cannot identify and target any individual data subject (V. Bindschaedler, personal communication, October 11, 2018). V. Bindschaedler is an Assistant Professor of Computer and Information Science and Engineering at the University of Florida.
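A minimal sketch of the noise-adding step, using the Laplace mechanism commonly associated with differential privacy (Dwork et al., 2006); the count, epsilon, and sensitivity values here are illustrative only:

```python
import math
import random

def dp_count(true_count, epsilon=1.0, sensitivity=1):
    """Release a count with Laplace(sensitivity/epsilon) noise so that
    any single subject's presence barely shifts the released statistic."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5          # uniform on [-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

random.seed(0)
print([round(dp_count(42), 2) for _ in range(3)])  # noisy values centred on 42
```

Aggregate queries over many subjects remain statistically useful, but the noise makes it impossible to infer whether any one individual contributed to the count, which is precisely the property that undercuts individual-level micro-targeting.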

Differential privacy, while far from a perfectly developed tactic, is still a viable option for providing usable metric-based data to third parties so that they can study consumer-driven market behavior and infer usable insights, albeit at a macro level only. This protects data subjects from being singled out at an individual level, based on their personal data, for the purposes of micro-targeting. Indeed, there is evidence to suggest that data aggregators such as Facebook and Google have already made some use of differential privacy in order to report publicly on user statistics (V. Bindschaedler, personal communication, October 11, 2018). Considering the framework for data minimization laid out in Article 5.1 and enumerated in Article 25.1 as a mechanism for achieving data minimization broadly conceived, marrying such a framework with differential privacy offers a considerably stronger approach not only to increasing the security of information collected from data subjects, but also to increasing the privacy of the data subject in the sharing and/or sale of such data to third parties.


Data subjects and synthetic data

Not all mechanisms for privacy by design, however, must begin and end with the institutional controller alone. In its overall scope, one of the driving goals of the GDPR is to give data subjects greater control over their own data and privacy. Therefore, enabling the data subject to partake in the data collection process more autonomously also falls under the purview of privacy by design. One tactic through which data subjects can receive such increased autonomy is the design and embedding of routes for data synthetization within system infrastructure.

Synthetic data is the product of a process of technical obfuscation, that is, an effort to camouflage the data characteristics of system users in order to protect their privacy (Bindschaedler & Shokri, 2016). This method of privacy preservation calls for infrastructural design, i.e. privacy by design, that allows data subjects to provide a surplus of information to institutional controllers, essentially clouding the true data points of the user and making it harder for the data analyzer to know which data points accurately reflect the user. Consider an example of location tracing with mobile device apps: looking for restaurants in a given city, say within ten minutes' walking distance of a given location, would require providing a user's location to the service provider. Using a system designed for privacy-preserving location traces, the user could provide a multitude of location points to the service provider and then filter out the unwanted results on the mobile device. Such a method effectively minimizes the utility of the data provided to the service provider without minimizing the quality of service provided to the data subject (Bindschaedler & Shokri, 2016).
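The location-tracing example can be sketched as follows. Coordinates, counts, and radii are invented, and a deployed system would need decoys statistically indistinguishable from real traces, which is the harder problem Bindschaedler and Shokri address; this sketch shows only the query-and-filter pattern:

```python
import random

def build_query(real_location, decoy_count=4, jitter=0.5):
    """Mix the true coordinates in with nearby decoys so the provider
    cannot tell which point reflects the user."""
    lat, lon = real_location
    points = [(lat + random.uniform(-jitter, jitter),
               lon + random.uniform(-jitter, jitter))
              for _ in range(decoy_count)] + [real_location]
    random.shuffle(points)
    return points

def filter_results(results, real_location, radius=0.05):
    """Client side: keep only results near the user's true position,
    discarding answers returned for the decoy points."""
    lat, lon = real_location
    return [r for r in results
            if abs(r["lat"] - lat) <= radius and abs(r["lon"] - lon) <= radius]

real = (29.65, -82.32)                  # hypothetical user location
query = build_query(real)               # five points sent to the provider
nearby = filter_results(
    [{"lat": 29.66, "lon": -82.33}, {"lat": 30.50, "lon": -82.00}], real)
print(len(query), nearby)
```

The provider sees five equally plausible locations and answers all of them; only the client, which knows the true point, discards the surplus, preserving service quality while degrading the provider's profile of the user.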

Utilizing such a framework from the perspective of Article 25 could aid in the effort to confront computational propaganda by minimizing the effectiveness of its micro-targeting capabilities. Allowing data subjects to autonomously provide a surplus of information on any topic of their choosing, i.e. consumer behavior, political affiliation, demographic characteristics, etc., would aid in camouflaging their true characteristics. To account for such an influx of data variability, computational propaganda efforts would need to be divided across a greater number of options. Then, again using the framework of privacy by design, modifying system infrastructure to allow the data subject to filter their results would aid in filtering out a greater extent of computational propaganda content. Whereas methods of data sanitization remove information from datasets in the possession of institutional controllers, synthetic data provides an abundance of information to achieve much the same end. Therefore, systems designed to allow for data synthesis on the part of the data subject facilitate a process that can directly minimize the effectiveness of micro-targeting efforts, and thus serve as a viable means of countering computational propaganda.

Emergent Themes in the Data

Where the previous sections outline manners in which the data pertain to RQ1, RQ2, and RQ3, such areas were purposefully addressed. Beyond them, two significant themes manifested themselves in the data: the place of algorithms as a function both of micro-targeting and of the viral spread of computational propaganda content, and the role existing business models have played in the development and use of algorithms and data as means of micro-targeting. Across all the interviews conducted, algorithms were independently addressed in 11 interviews, while economic business models were brought up in ten. Lastly, the theme of political realism was referenced in eight of the interviews.

Algorithms

Computational propaganda relies by and large on micro-targeting to strategically diffuse content across key target audiences. Micro-targeting is facilitated by two key components: first, personal user data that allows for the development of behavioral profiles, and second, algorithms both for the development of such behavioral profiles and for the distribution of appropriate content packages across key target audiences. Where algorithms are not addressed in the frameworks of the Council of Europe's Budapest Convention or the EU's GDPR, they were equally not proactively addressed in the interviews. However, to understand micro-targeting is to understand the equal part algorithms play in computational propaganda alongside personal data.

That is not to say, however, that algorithms are socially counter-productive. To the contrary, much of the framework of the Internet, and the capabilities we have today as a society, are based on algorithms. Importantly, algorithms allow for the analysis of data in quantities far exceeding what humans can accommodate. “The sheer amount of information that is posted online, and it's likely to, because of the increasing connectivity, the 5G and everything, this volume is likely to explode. You cannot do it, basically, with people. You need algorithms. That would take us to a new level” (C. Bjola, personal communication, July 30, 2018). Interestingly, algorithms have been useful and highly effective in serving governmental interests in identifying and censoring online terrorist content, at a purported rate of almost 90 percent of such content (C. Bjola, personal communication, July 30, 2018).

Understanding that governmental applications of algorithms have served interests in the areas of national security and foreign policy, it would be only natural to see a continual blending between those designing and adapting algorithms and such end-use target goals as relevant to state interests, for example, surveillance (E. Briant, personal communication, August 24, 2018).

Where C. Bjola outlines both the state and non-state use of algorithms for identifying online content relating to ISIS and other terrorist groups, it would be natural to understand that algorithms could equally serve to aid in the identification, tracking, and eventual confrontation of computational propaganda, whether it be from state actors like Russian, Iranian, or Burmese sources (Stubbs & Bing, 2018), or non-state groups like ISIS terrorist cells (DiResta, 2018).

Not all applications of algorithms, however, are seen as being constructive to governmental efforts. While digital channels facilitate communication with friendly audiences, they also allow governments to reach and engage stakeholders who are less friendly. Algorithms potentially harm this function of engaging undecided and/or hostile audiences by attempting to provide users with content that more appropriately aligns with their identified behavioral profile.

In essence, algorithms lend themselves to echo chambers and make it harder for governments to reach audiences that exist in the purview of other chambers of online discussion (S. Riordan, personal communication, August 28, 2018).

It is plausible to suggest that where computational propaganda thrives on creating echo chambers (Powers & Kounalakis, 2017), such echo chambers actually stymie the work of public diplomacy professionals. Because algorithms show stakeholders content they would be inclined to agree with, they also have great potential to perpetuate the cycle of digital misinformation (S. Riordan, personal communication, August 28, 2018). This links back to one of the emergent themes identified in RQ1, that much of this is perceptual, i.e. based on a person’s unique perspective and world view. As P. Surowiec (personal communication, July 23, 2018) suggests, it always comes back down to who is doing the saying. Where algorithms have a great capacity to help solve the social problems posed by computational propaganda, those same affordances are also what helps to facilitate computational propaganda.

Economic Business Models

Interestingly, the emergence of algorithms in the interview data intersected discussion of economic business models as facilitating the infrastructural framework that has given rise to computational propaganda. “Basically, these social media companies would not exist unless they could sell algorithms. That’s their business model. And so it’s not in their interest to protect my personal data” (C. White, personal communication, August 16, 2018). C. White is a Professor in the School of Advertising and Public Relations at the University of Tennessee, as well as a

Fellow in the university’s Global Security Program. Indeed, much of the function of micro-targeting was developed as a means to offer better products and customer service in the consumer market arena. For example, corporations like Amazon use algorithms to track purchasing behavior and formulate recommendations based on similar users, all for the effect of streamlining the purchasing process to make it easier and quicker for the consumer. Similarly, streaming services like Netflix use identical procedures to encourage behavior that keeps the consumer on the platform for longer periods of time (V. Bindschaedler, personal communication, October 11, 2018).
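The recommendation logic described here can be approximated by item-based collaborative filtering. The sketch below (hypothetical data and function names, not Amazon's or Netflix's actual systems) ranks other items by the cosine similarity of their user-rating patterns to a target item:

```python
import math

def cosine(u, v):
    """Cosine similarity between two item rating vectors (dicts of user -> rating)."""
    common = set(u) & set(v)
    num = sum(u[k] * v[k] for k in common)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def recommend(target_item, ratings):
    """Rank the other items by how similar their rating pattern is to target_item."""
    sims = {item: cosine(ratings[target_item], vec)
            for item, vec in ratings.items() if item != target_item}
    return sorted(sims, key=sims.get, reverse=True)

# ratings[item][user] = rating (toy data)
ratings = {
    "gadget":  {"ann": 5, "bob": 4, "eve": 1},
    "charger": {"ann": 5, "bob": 5, "eve": 1},
    "novel":   {"ann": 1, "eve": 5},
}
ranked = recommend("gadget", ratings)  # "charger" ranks first: same buyers liked it
```

The same pattern-matching machinery that surfaces a useful product recommendation is what, repurposed, matches propaganda content to receptive behavioral profiles.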

The development and use of algorithms that govern a variety of online functions, such as recommender systems, is critical to the overall business operations of major companies in the areas of consumer products and, increasingly, social networking platforms. Thus, the ability to strategically micro-target did not evolve out of some effort for international political persuasion, but to advance customer service so as to encourage purchasing behavior. “The problem is that the people who are facilitating this, Amazon, Google, Facebook, so long as it’s in their business interest to distinguish along different types of people, that will make the facilitation of micro-targeting very, very easy” (B. Valeriano, personal communication, October 11, 2018). B. Valeriano is the Chair of Armed Politics at the Marine Corps University.


The social issue in merging the capabilities of algorithms with consumer-driven interests arises when capitalist business models weight the value of the market as equal to, or greater than, the personal rights, i.e. privacy and security, of the data subjects. Algorithms are the lifeline behind companies in the technology sector, and making such proprietary information more public would potentially threaten the security and health of the company (J. Pamment, personal communication, September 3, 2018). Where corporate assets that collect, manage, and analyze the personal data of mass consumer populations would seem to necessitate significant levels of transparency, the proprietary nature of algorithms as protected intellectual property plays into the capitalist nature of democratic models wherein corporate interests tend to be weighted equal to, if not greater than, public interest. Pursuing a business-minded agenda that stresses algorithm transparency could help to further stymie the reach of computational propaganda.

Pushing such an agenda would, however, be far from easy. With data insights governing a multi-billion-dollar industry, protecting such data would not only cut into highly coveted sources of revenue, but it could drastically destabilize the business market which, at the current time, has no replacement for the current governing model (J. Doe, personal communication, October 3, 2018). Perhaps, then, the way to push an agenda of algorithm transparency is not to stress a narrative of consumer-product companies trying to provide better customer service, but rather to re-frame strategic stakeholders in this arena as sitting more at odds with public interest and the issues of data security and privacy. A. Arsenault suggests a strategy wherein tech leaders like Facebook and Google are rebranded in the eyes of the public from communication companies interested in facilitating peer-to-peer networking to data companies interested in the aggregation, sale, and exploitation of user data (personal communication, August 3, 2018).


While pushing algorithm transparency is a viable indirect route for attempting to confront computational propaganda, it should be clearly understood that such efforts would be futile without addressing the role the global market economy plays in the process. Tech giants are comparable in influence to state actors. Such companies now control enough of the tech and financial sectors as to exert extreme economic force on state actors, primarily through lobbying (J. Doe, personal communication, October 3, 2018). In order to move effectively toward some kind of effort for combatting computational propaganda, the business sector must have a seat at the table. Because algorithms are implicated in the micro-targeting process but remain protected proprietary assets under global intellectual property laws, the issue of algorithm transparency must be pursued from an industry regulatory perspective, ideally based in the intellectual property framework.

Political Partisanship

The final theme that emerged in the interview data was an air of political realism, primarily one that falls along lines of perspective, both in partisanship and in expertise.

Fundamentally, politics is viewed as a power struggle between a minimum of two parties. Therefore, it naturally follows that any topic or issue in the political eye is ground for the extension of political infighting. Where this can be seen in the passing of the 2018 Polish Holocaust law, and the weaponization of history as an extension of political power struggles, the realm of technology is no different. J. Gilbert, an endowed Professor and Department Chair of Computer and Information Science and Engineering at the University of Florida, and an experienced professional in information and database security in U.S. elections, suggests that in all cases of technical evaluation and insight, the political notions of “how does this person vote” and “what is this person’s leaning” will always arise, triggering infighting over any attempt to achieve simultaneous network security and true democracy (J. Gilbert, personal communication, October 11, 2018). The manifestation of political competitiveness poses one of the greatest barriers to combatting computational propaganda. At the international level particularly, it has plagued any form of international cooperation surrounding regulation of the Internet, and it continues to stymie efforts surrounding computational propaganda.

From J. Gilbert’s perspective, due to the extent of political partisanship permeating the issue of computational propaganda and, more colloquially, fake news, there is in fact no technical solution that can, on its own, resolve the problem. As he suggests, policy regulation, achieved as a form of human engagement, is the only way to move forward; autonomous technical infrastructure will never be capable of solving the issues of digital infrastructure in modern democracies (J. Gilbert, personal communication, October 11, 2018). Such insight would seem to suggest that merging viable technical mechanisms with policy-driven cooperation is the ideal way to move forward with both national and international strategies for combating computational propaganda.


CHAPTER 5
DISCUSSION

Premise of the Study

The twenty-teens will forever be known for many things in the United States, chief among them being the rise of fake news and Russian election meddling. Indeed, what society has deemed fake news (Kleinman, 2018), the academic community has branded information disorder (Wardle & Derakhshan, 2017). What’s more, political actors both in the United States and abroad have begun to revisit making strategic use of information disorder to achieve political, social, and military goals, i.e. information intervention. To that end, this study does not link the Russian state alone to contemporary strategies for digital information intervention, but rather more broadly acknowledges a centuries-old conflict domain in which political state and non-state actors engage each other in struggles of frame competition (Chong & Druckman, 2007). The unique context of 21st century information intervention is cyberspace, or the digital environment in which these strategic narrative competitions take place.

One of the first instances in which a state has mounted a formidable cyber counter-influence campaign involved the United States’ response to ISIS’ social media recruitment efforts. Orchestrated through the U.S. Cyber Command and executed via Joint Task Force ARES, Operation GLOWING SYMPHONY was one of the first public governmental attempts to directly tackle and disassemble an information intervention machine on social media platforms (Lamothe, 2017; Martelle, 2018; Sanger & Schmitt, 2017). Whether carried out by military or political organizations, accusations of similar digital information intervention efforts have been mounted against Iran (Tabatabai, 2018; Stubbs & Bing, 2018), Russia (Committee on Foreign Relations, 2018; National Intelligence Council, 2017) and Myanmar (Stubbs & Bing, 2018).

Accusations have also been raised against the U.K.’s Institute for Statecraft and a so-called Operation Integrity Initiative program (Dean, 2019; Gayle, 2018; Walker, 2018); however, due to a lack of credible sources verifying the nature of the program, this research neither endorses nor rejects the validity of the program and merely highlights the existence of the debate.

Regardless, sufficient evidence exists to conclude that political actors in the 21st century are using digital communication platforms to engage in competitive information framing with the end goal of influencing public opinion in ways that favor the interests of the state and disfavor the interests of that state’s adversaries. Such actions constitute behavior synonymous with political public relations (Strömbäck & Kiousis, 2011). That is not to say, however, that information intervention is one seamless activity. Rather, two unique subsets of the behavior are explored in this research, information politics and information operations.

Politics, broadly conceived, is considered to be behavior in which one entity uses power to impart influence over another (Pfeffer, 1992). Information politics, then, is the means through which information is strategically used as a political tool to convince and persuade a less powerful opponent through methods and tactics that embrace soft power (Nye, 2004) and the marketplace of ideas (Miller, 2017), i.e. encouraging political discussion and perspectives for rational debate. Such behavior could commonly be identified as public affairs, public diplomacy, and more recently, digital diplomacy (Bjola, 2015).

To the contrary, information operations are means through which information is strategically used as a political or military weapon to compel a less powerful opponent through methods that embrace sharp power tactics (Walker & Ludwig, 2017) and exacerbate conditions of information disorder (Wardle & Derakhshan, 2017). This occurs through the use of communication platforms to spread intentionally inaccurate and damaging information, i.e. mal-information (Wardle & Derakhshan, 2017), which stresses emotive and primal instincts of in-grouping versus adversarial out-grouping, i.e. cyber tribalism (Williams & Arreymbi, 2007). Such behavior could commonly be identified as propaganda, psychological operations, and more recently, computational propaganda.

Where the 20th century was replete with both theoretical and practical debates on distinguishing public diplomacy from propaganda, the continued application of politically-driven digital information disorder leads to similar debates between digital diplomacy and computational propaganda. In an effort to devise a public relations strategy through which U.S. public diplomacy practitioners can defend the integrity of their craft, this study sought to discover functional differences between the strategic approaches of information politics (public and digital diplomacy) and information operations (traditional and computational propaganda). To that end, considering contributions from the fields of strategic communication and political communication, the behaviors of public affairs, public diplomacy, propaganda, and psychological operations were explored and typologized.

While understanding the functional differences between digital diplomacy and computational propaganda may offer substance for developing public relations strategies for public opinion management, it does little to functionally combat the use and spread of computational propaganda. A secondary goal of this study, therefore, was to explore legal frameworks enumerated in international law on cybercrime and regional law on digital telecommunications with which to develop regulatory frameworks moving forward.

In the United States, 69% of Americans perceive unwanted meddling and influence by foreign actors in elections (Doherty, Kiley, & Johnson, 2018). Likewise, 59% of E.U. citizens feel similarly toward election interference in European elections. In a survey of over 27,000 E.U. citizens, 74% to 81% suggested a strong desire to see changes in the areas of data protection, transparency in online political advertising, cybersecurity, inter-state cooperation on the issue of election meddling, and appropriate sanctions against any violations of election meddling (European Commission, 2018). The dominance of these views in U.S. and E.U. public opinion suggests that democracies across Westphalia likely have similar interests in taking measures to curtail the extent to which computational propaganda and information intervention are being used to subvert their perceptions of free and democratic elections.

Further, a handful of state actors outside of Westphalia are exploring regulatory measures to tackle misinformation in election contexts. Brazil’s Supreme Electoral Court, as well as the country’s federal police, their version of the FBI, took preemptive steps to guard against fake news and the dissemination of misinformation in the run-up to the country’s 2018 presidential election (Alves, 2018; Greenwald, 2018). South Korea also joined such efforts, proposing over a dozen amendments to South Korean national laws in an effort to crack down not only on those creating misinformation, but also on telecommunication services whose networks are being used to disseminate such misinformation (Kajimoto, 2018). These added cases suggest that a host of democratic states, both Western and other, are interested in directly countering the use of misinformation to disrupt and destabilize political institutions and processes.

Despite uni-lateral action being taken at the national level, the Internet is a global phenomenon, and the full scope of the World Wide Web 2.0 cannot be regulated and reformed by a single actor alone while still maintaining a free and globally accessible Internet. To achieve this, the only viable option is regulation codified, enacted, and enforced through multi-lateral cooperation. Indeed, there have been a number of multi-lateral calls for international cooperation in this area of Internet governance. In 2004 the U.N. created the Governmental Group of Experts on Developments in the Field of Information and Telecommunications in the Context of International Security (GGE) (Korzak, 2017), the mid-twenty-teens saw a number of calls for a Digital Geneva Convention (Guay & Rudnick, 2017; Microsoft Policy Papers, 2017), and in 2018 French president Emmanuel Macron launched the “Paris call for trust and security in cyberspace” (Rose, 2018, para. 2), or the Paris Cyber Agreement, as a multi-lateral initiative for cutting-edge Internet regulation (Marchal, 2018). Where such efforts have largely proven ineffective to date, part of this ineffectiveness has centered on the scope of limiting free speech.

Despite democracies exploring limits to free speech at the national level, such as the Honest Ads Act in the United States (S. 1989, 2017), fake news laws in France (Fiorentino, 2018), hate speech laws in Germany (Shalal & Balmforth, 2018), and the aforementioned proposed legislation in Brazil (Alves, 2018; Greenwald, 2018) and South Korea (Kajimoto, 2018), curtailing free speech is kryptonite for democratic states and multi-lateral cooperation in the international arena. Any policy positions addressing limitations on free speech which are supported by traditional totalitarian states such as Russia, China, or Iran are immediately viewed as avenues for governmental censorship and are directly counter to the free speech that is critical to democracy. In seeking a multi-stakeholder framework for Internet governance, this study intentionally diverged from models of limiting free speech and rather emphasized avenues for infrastructural regulation. Specifically, the utility of the Council of Europe’s 2001 Budapest Convention on Cybercrime was explored as an existent and established source for regulatory insight, while the utility of the European Union’s General Data Protection Regulation (GDPR) was explored as a newer and less established source for regulatory insight.


Primary Findings

The study made use of a two-part qualitative methodological design. Specifically, this consisted of 24 in-depth interviews with experts in the areas of mass communication, political science, international law, computer science and engineering, and cybersecurity, as well as a socio-legal textual analysis of the Budapest Convention and the GDPR. The following sections outline the findings of the study.

Demarcating Strategies of Information Intervention

As determined through an analysis of the in-depth interview data, methods of information politics, i.e. public diplomacy and public affairs, primarily engage foreign audiences in publicly overt and attributable ways, with strategically framed content that appeals to rational cognitive processing. That is to say, utilizing the soft power paradigm, public diplomacy seeks to insert content into a social environment that is backed by foreign interests, but in a manner that engages in political debate and discussion. Approaching foreign stakeholders, whether friendly or hostile, in an overt manner allows for public accountability and attribution. With greater accountability and attribution comes increased credibility and, ideally, an increased capacity for the relevant content to be found credible and trustworthy by the receiving audience (Knobloch-Westerwick, Mothes, Johnson, Westerwick, & Donsbach, 2015).

The notion of also providing verifiably framed information via rational-based appeals further benefits the cognitive reasoning process. Not only will content be found to be credible with increased source attribution, but the manner of providing sound and logical arguments will tend to be associated with greater motivation in the consumption of content, leading to a more processed evaluation of the consumed information and, if in fact influenced by the content, stronger desired effects that last longer (Cyr, Head, Lim, & Stibe, 2018). Such elements of public diplomacy combine to constitute open political discussion that, while perhaps not always in the interest of the receiving stakeholders, does gravitate more toward socially and politically constructive ends.

To the contrary, methods of information operations, i.e. propaganda and psychological operations, primarily engage audiences in covert and non-attributable ways, with content that tends to be fabricated or factually unverifiable. That is to say, utilizing the sharp power paradigm, propaganda seeks to insert content into social environments that is backed by a variety of competing domestic and foreign interests, primarily in ways that seek to dilute, inflame, disrupt, or completely negate participatory political debate and discussion. Approaching target audiences with content that is not overtly attributable in either source, intent, or financial backing puts content receivers in a position of evaluating information in sub-optimal conditions. Should the content then be deemed credible, over time, as the lack of source and the substance of the content become increasingly disassociated, i.e. the sleeper effect (Gaffney, Tomory, & Gold, 2016; Lowery & DeFleur, 1988), the fabricated and potentially malicious information becomes increasingly credible, contributing to increased information disorder.

What’s more, supplementing the issues of credibility caused by the fabricated and un-attributable nature of the content are the effects of its emotive appeal strategy. Propaganda tends to embrace message strategies that reject rationality and instead endorse informational themes that are inflammatory and controversial. These elements of propaganda combine to constitute political discussion that is counterproductive, if not nonexistent altogether. While such foreign information is rarely, if ever, in the interest of the receiving stakeholders, it firmly gravitates toward information disorder that is socially and politically destabilizing.


Understanding the functional characteristics of public diplomacy and propaganda allows for transferable insight into the functional differences between their digital descendants, digital diplomacy and computational propaganda. That is to say, digital diplomacy utilizes the soft power paradigm by engaging foreign demographics in strategic political speech that is available to the mass public, overt in intent, attributable in source and financial backing, verifiable to some extent, and rational in nature. Digital diplomacy often favors political models that endorse free speech and an open and inclusive Internet, i.e. democracies. To the contrary, computational propaganda utilizes the sharp power paradigm by engaging key target demographics with strategic information packages that may or may not be available to a mass public (via micro-targeting), are covert in intent, offer no cues of attribution, cannot be verified via reliable external sources, and stress emotion-laden topics and controversial perspectives. Computational propaganda is thus frequently a tool for combatting adversaries in political models that endorse free speech and an open and free Internet, i.e. democracies.

Existing Frameworks for Cybersecurity Governance

As determined through a mixture of qualitative methods, specifically in-depth interviews and socio-legal textual analysis, the Council of Europe’s Budapest Convention on Cybercrime was found to have a number of tangible assets for developing a multi-lateral regulatory framework to combat computational propaganda. Overall, however, the scope of the convention would not directly address the issues of computational propaganda as it currently exists for a number of reasons. First, the Budapest Convention, inspired by the Western thinking of cybersecurity in 2001, views the primary perpetrators of cybercrimes as non-state actors. This is expressed in the thinking that states should, and could, work together to share information, track cyber criminals across borders, and eventually bring such criminals before state judicial systems. This does not account for the current state of geopolitical cyber espionage and aggression wherein states themselves engage in cybercrimes or use proxies to achieve similar ends.

Second, the convention views cybercrime from a traditional Western cybersecurity perspective, meaning that threats to cyber infrastructure include breaking into secure systems; the theft of information, intellectual property, or other monetary property secured within the system; and affecting the operation of cyber systems through interference or non-kinetic destruction. To that end, where the hacking of Hillary Clinton’s and Emmanuel Macron’s email servers constitutes cybercrime, the simultaneous computational propaganda efforts during the U.S. and French elections would not be considered outright illegal behavior, and therefore would not fall under the purview of the Budapest Convention.

There are, however, elements of the convention which would prove useful in future frameworks to combat computational propaganda, namely capacity building, mutual assistance, and spontaneous information. Sybil detection is a technical process for identifying digital accounts (email, social media, etc.) that spread spam or misinformation on a substantial scale, or are compromised accounts of genuine users (Wang et al., 2018). Where the method of Sybil detection is diffusing across leaders in tech, it is far from the purview of global application. A framework for capacity building would therefore serve as a multi-lateral cooperation program where experts in Sybil detection could train detection units in other countries. Expanding global assets and resources for detecting, tracking, and monitoring bot accounts poses significant benefit, as research has shown that, in certain instances, bot accounts migrate across issues that span physical geopolitical boundaries (Howard, 2017; Howard & Kollanyi, 2016).
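Production Sybil detection relies on graph structure and machine-learned classifiers (Wang et al., 2018); as a purely illustrative sketch of the kind of screening a capacity-building program might start from, a crude rate-and-repetition heuristic could look like the following (all names and thresholds hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    posts_per_day: float
    duplicate_ratio: float  # share of posts that are near-identical reposts

def flag_suspected_sybils(accounts, rate_cap=50.0, dup_cap=0.8):
    """Toy heuristic: accounts posting at implausibly high volume AND with
    highly repetitive content are flagged for human review. Real systems
    also use follower-graph structure and behavioral classifiers."""
    return [a.handle for a in accounts
            if a.posts_per_day > rate_cap and a.duplicate_ratio > dup_cap]

fleet = [Account("@citizen", 6, 0.05),       # ordinary user
         Account("@amplifier01", 400, 0.95)]  # bot-like amplifier
flagged = flag_suspected_sybils(fleet)
```

Even this toy rule shows why shared training matters: the thresholds and features that separate bots from genuine users are expertise that capacity building would transfer between detection units.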

In such a way, the frameworks for mutual assistance (Article 25) and spontaneous information (Article 26) add to a framework for multi-lateral capacity building in the sharing of information regarding bots that migrate issues trans-nationally. As Howard and Kollanyi (2016) report, they tracked a pro-Palestinian bot account as it migrated from Israeli/Palestinian issues to the Brexit debate. Had Israeli detection personnel identified the account and been monitoring its behavior, they would have noticed its issue migration and could have alerted their British counterparts directly, or provided the information via a mediator such as the United States.

Regardless, in an international theater filled with complex bi- and multi-lateral state relationships, the frameworks of capacity building, mutual assistance, and spontaneous information could prove vital to establishing bi- or ideally multi-lateral networks for detecting and tracking bot activity and sharing relevant information across a network of cooperating states.

Newer Frameworks for Cybersecurity Governance

As determined through a mixture of qualitative methods, i.e. in-depth interviews and socio-legal textual analysis, the European Union’s GDPR offers a number of tangible assets for developing a multi-lateral regulatory framework to combat computational propaganda, specifically its micro-targeting aspect. As a piece of regional legislation that came into force in May of 2018, the full scope and influence of the GDPR has yet to be fully determined. However, one of the driving ideals embedded in the legislation holds particular merit for countering computational propaganda. In spite of a myriad of service issues that arose during the initial implementation of the GDPR, the notion of Privacy by Design (Article 25) is a piece of the law which establishes a precedent under which technology companies and service providers must be willing to modify or redesign the infrastructural layout of their software to provide enumerated rights of authority and control to users.

The first application of privacy by design falls under the purview of the institutional controller, i.e. the technology company that provides the software and/or service. The act of data minimization (Article 5.1) necessitates that institutional controllers amend their system programming and information collection procedures to reduce the amount of identifiable information collected on data subjects, i.e. the customer or consumer. A method for satisfying this is differential privacy, a technical method through which data can be collected on consumers while a dissociation is made between the inferences determined through data analysis and the identity of the individual data subject. This model for data minimization would work well under the business models that constitute current data practices. This is because data aggregators could continue to collect user data and still provide such information to third parties; data provided to such third parties under the protection of differential privacy would still hold the statistically inferential properties for large-scale market trends and flows without providing individually identifiable information. Thus, personalized behavioral profiles could not be developed, which would aid in minimizing the effectiveness of computational propaganda through minimizing the effectiveness of micro-targeting.
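Differential privacy can be made concrete with the standard Laplace mechanism: a count query has sensitivity 1, so adding Laplace noise of scale 1/ε to the true count yields an ε-differentially private release. The sketch below is a minimal, self-contained illustration (hypothetical data; real deployments must also track privacy budgets across repeated queries):

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon=1.0):
    """Release a count query under epsilon-differential privacy.
    A count has sensitivity 1, so Laplace noise of scale 1/epsilon suffices."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 67, 71, 34, 80, 45]
noisy = dp_count(ages, lambda a: a >= 65, epsilon=1.0)  # true answer is 3
```

The released value remains accurate in expectation for aggregate trends, while no individual record's presence can be reliably inferred, which is precisely the dissociation between statistical inference and individual identity described above.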

The second application of privacy by design falls under the purview of the data subject, i.e. the technology companies' consumers and customers. Continuing to seek an avenue for data minimization (Article 5.1), the method of data synthesis offers great potential to combat the micro-targeting element of computational propaganda. Specifically, where institutional controllers would modify the infrastructure of their operating systems to allow the data subject to engage in data synthesis, i.e. privacy by design, users would have the capability to intentionally provide inaccurate information during the data aggregation process. This would entail data subjects submitting a plethora of inquiries and/or data points to a data aggregator, each of which would necessitate a response. The data subject would then have the technical ability, on their own device, to filter out the unwanted query responses so as not to receive a reduced level of service from the technology company. More broadly, this method would involve data subjects themselves intentionally manipulating the data collected on them, which would in turn reduce the accuracy of the statistical inferences made during the creation and maintenance of a data subject's psychological behavioral profile.
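One way to picture this client-side data synthesis is a chaff-query scheme. The sketch below is purely hypothetical (the function names, batching scheme, and data format are my own illustration, not an existing API): the user's device mixes the genuine query into a batch of decoys, and a local filter keeps only the genuine response, so the aggregator's profile of the user is polluted while the user's level of service is unchanged:

```python
import random

def build_query_batch(real_query, decoy_pool, n_decoys=4):
    """Mix the genuine query in with randomly chosen decoys so the
    aggregator cannot tell which expressed interest is real."""
    batch = random.sample(decoy_pool, n_decoys) + [real_query]
    random.shuffle(batch)
    return batch

def filter_responses(responses, real_query):
    """Client-side filter: keep only the answer to the genuine query,
    discarding the decoy responses before the user ever sees them."""
    return responses[real_query]
```

From the aggregator's perspective all five queries look equally genuine, which is exactly the degradation of profile accuracy the text describes.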

Overall, the GDPR enumerates two unique paths to developing regulatory frameworks that would curtail the ability of computational propaganda to micro-target strategic content to a myriad of key demographics. Differential privacy, on the part of the institutional controller, would retain much of the process that currently exists while removing individual-level statistical inferences from the data sets exported to third parties. Data synthesis, on the part of the data subject, would provide less accurate information in the data collection process to intentionally misinform a subject’s behavioral profile, thereby ideally reducing the psychological effectiveness of micro-targeting.

Policy Recommendations

While the primary findings of this study highlight just a few insights from an extensive data set of considerable depth, such findings alone do not provide clear and succinct ways to move forward in the growing fight against computational propaganda and disinformation. Rather, such insights must be reorganized into a succinct progression of proactive steps. Therefore, the following section offers policy recommendations that can be carried out, primarily from the perspective of the United States government, in uni-, bi-, or multi-lateral frameworks to combat computational propaganda.

Multi-Stakeholder Driven Cooperation

As knowledge of strategic digital information manipulation efforts is still comparably new, i.e. less than five years old, the full nature of computational propaganda and those implicated in such actions is still very much being debated. Where most efforts to date have been offered from singular organizational perspectives, i.e. government, private sector, academia, etc., few if any have made meaningful progress in establishing a multi-stakeholder front with proactively invested parties across a range of industry-specific actors. In light of multi-lateral cooperation, a multi-stakeholder task force is vital to any serious effort to confront and combat computational propaganda, not unlike what can be found in the European Dialogue on Internet Governance (EuroDIG). EuroDIG is a multi-stakeholder, pan-continental platform for dialogue and debate surrounding public policy regulation of the Internet. The organization is supported by a host of governmental and non-governmental entities and holds regular conferences, in the model of the Internet Governance Forum, through which a significantly larger and more diverse range of corporate, non-governmental, and scholarly participants can join the member institutions in furthering insight and debate around European Internet governance. Taking such a model, this research promotes multi-stakeholder cooperation that includes national governments, international organizations, non-governmental organizations and non-profits, the technology economic sector, academia, and think tanks.

Governments

As the primary targeted victims of computational propaganda to date, state actors like the United States, United Kingdom, France, and Germany have vested interests in tackling foreign digital information intervention within the confines of their own information sovereignty.

Naturally, both domestic and foreign state resources come to the forefront of combatting information security issues; such efforts involve resources and assets in the United States' law enforcement, intelligence, and defense sectors. However, as computational propaganda has notably been executed in trans-national contexts, this naturally implicates organizations that manage a state's position and relationships abroad, such as the Department of State. Further, as information intervention is a process facilitated by the communication of content both within and across state borders, federal entities responsible for communication regulations, such as the Federal Communications Commission, also have a vested interest in being included in the conversation.

Lastly, one of the more interesting entities to be included in this list are governmental bodies responsible for electoral oversight. The primary instances that have driven public notoriety of computational propaganda have been its involvement in efforts to interfere in democratic voting processes. In the United States, anywhere from 60% to 70% of Americans aged 30 to 65 and above feel that algorithmically delivered content on social media pertaining to political and electoral advertising is unacceptable (Smith, 2018). It therefore stands to reason that voting regulatory actors would be interested in having a seat at the table; this would naturally implicate the Federal Election Commission. Additionally, as Article 1, Section 4 of the U.S. Constitution places electoral oversight in the purview of individual state legislatures, the involvement of individual states will likely also be necessary.

Indeed, Damrosch (1989) was ahead of her time in envisioning regulatory means to combat unconventional foreign intervention tactics. In contemplating legal avenues to confront non-forcible foreign influence in internal domestic voting practices, Damrosch viewed campaign funding and economic sanctions as indirect ways in which state actors influence political outcomes in foreign countries. Further, as A. Seger (personal communication, November 5, 2018) points out, computational propaganda does not outright violate laws governing voting processes, yet through the information disorder it drives, the basic principles of free democratic elections are indeed being violated.


The diffusion of television and political advertisements in the United States in the late 20th century prompted the adoption of the candidate endorsement to verify the authenticity of televised content (J. Gilbert, personal communication, October 11, 2018). This is most evident in political advertisements that include the candidate statement: "I am [candidate name] and I approve this message." Likewise, a similar elections-based regulatory framework could hold utility in reducing some level of information disorder in political elections. Therefore, actors responsible for voting and elections enforcement would have a unique perspective to contribute on methods for addressing computational propaganda.

International organizations

While a variety of governmental stakeholders within the United States play an important role from both political and regulatory perspectives, they alone are unlikely to stymie the growing uses of computational propaganda, even in instances of inter-agency cooperation. The Internet is a unique phenomenon that exists nowhere physically, yet exists everywhere on the globe virtually. This means that most, if not all, U.S. action geared toward unilateral governance of a free and open Internet will be viewed as a violation of state sovereignty. International organizations that exist both above and around a state-centric international system are equally vital in facilitating and operating multi-lateral cooperation between states. For example, since its founding, the U.N. has encouraged international cooperation and provided a venue where state actors, including the United States and its representatives, can engage in dialogue at various stages of inter-state partnership.

Not only are international organization actors such as the U.N., the E.U., the African Union, the Union of South American Nations, and the Asia Cooperation Dialogue important in facilitating multi-lateral cooperation surrounding Internet governance, many of them are already making efforts that hold potential merit for addressing computational propaganda. The U.N. has pioneered an attempt at multi-state cooperation around Internet governance and security, the Group of Governmental Experts (GGE). While to date this effort has been lackluster at best, the U.N. is not the only international organization engaging in proactive behavior. The E.U. is significantly ahead of the United States with regard to privacy legislation, data security, and governmental involvement in Internet governance, and has recently moved even further forward in novel ways with the adoption of the GDPR, a regional legal mechanism that holds substantial force on a global scale.

Further, the North Atlantic Treaty Organization, for example, has made headway through a variety of joint research projects and other efforts facilitated by the NATO Strategic Communications Center of Excellence.

Political and military international organizations, however, are not the only avenues for international cooperation. Computational propaganda has been facilitated by a global business landscape and entrepreneurial perspective that places individual and political security second to the accumulation of fiscal wealth in communication- and data-driven industries. A possible way forward, therefore, is to revisit the economic, political, legal, and humanitarian ideologies that govern business models pertaining to data collection and exploitation. Pursuing cooperation in this area would likely be ineffective without international organizations like the World Trade Organization or the G20. Further, as algorithms were identified as an equal part of the micro-targeting behavior alongside data collection, other frameworks for intellectual property protection and enforcement could play a role, necessitating the involvement of international organizations such as the World Intellectual Property Organization.

Technology economic sector

Where both governmental- and international organization-based efforts to address computational propaganda hold some merit, the extent to which the Internet is an arena dominated by the private sector cannot be overstated. In order to maintain a free and open Internet, the majority of states and international organizations have, over the past 20 years, chosen a path that favors civil administration of the Internet, thus making private technology companies the great powers of cyberspace and the gatekeepers to reforming its networks. Such stakeholders include social networking companies like Facebook, Inc. and Twitter, Inc.; data aggregators like Google, Yahoo!, and Microsoft (via Bing); and other technology corporations that operate largely in cyberspace on the basis of data-driven insights and advertising, such as Amazon and Apple.

Again, efforts in this arena have already been made to promote a safer and more secure cyberspace. Most prominently, Microsoft has led the charge in pushing a Digital Geneva Convention aimed at establishing either international law or international norms that outline and seek to guarantee the protection of humanitarian interests in cyberspace (Microsoft Policy Papers, 2017). Such efforts are designed to constrain, if not openly interfere with, the ability of state and non-state actors to weaponize the Internet as a means to damage social, political, and economic interests via kinetic and non-kinetic cybercrimes and attacks. Despite such calls, minimal progress has been made toward making such a convention a reality.

NGOs & non-profits

Where governments, international organizations, and the technology economic sector have been making efforts to get the ball rolling on global cooperation, smaller entities have been working between these heavier-weight stakeholders and have covered substantial ground over the past two decades. Specifically, non-governmental organizations and non-profits are less tied to national interests and are subject to less bureaucracy. This makes them significantly more agile and capable of making smaller-scale progress in both domestic and international contexts. In the framework of multi-stakeholder cooperation, NGOs and non-profit organizations could serve as both useful and influential mediators between the more powerful stakeholders, i.e. governmental agencies, international organizations, and technology companies.

A few of the organizations at the forefront of international Internet governance and security include the Global Commission on the Stability of Cyberspace, the EastWest Institute, the Global Network Initiative, and the Internet Society. Many of these NGOs and non-profits are already engaged in multi-stakeholder cooperation. For example, the Global Commission on the Stability of Cyberspace already has functional relationships with the Dutch, French, and Estonian foreign ministries, the Republic of Singapore, Microsoft, and other NGOs and non-profits, including GLOBSEC, the EastWest Institute, The Hague Center for Strategic Studies, and the Internet Institute.

Think tanks and academia

Lastly, the United States cannot move forward without substantial research and insight into what computational propaganda is, how it is being executed, what threats it entails, and how Internet governance can address it. As a socio-political issue in a new cyber-driven era, novel ways of studying and thinking are necessary, and such ends can best be achieved by think tanks and academia. Again, not unlike the previously mentioned categories of stakeholders, there are a number of think tanks, academic institutions, and researchers who recognize the growing danger of computational propaganda and are making what efforts they can to research this new threat. Such think tanks include the Atlantic Council, New America, the Carnegie Endowment for International Peace, Data & Society, and Likewise. Such academic institutions include the University of Oxford's Internet Institute, Harvard's Berkman Klein Center and Belfer Center, Yale's Information Society Project, Stanford's Center for International Security and Cooperation, Freeman Spogli Institute, Global Digital Policy Incubator, and Center for Deliberative Democracy, and the University of Southern California's Center on Public Diplomacy, to name only a few.

Multi-lateral Driven Data Regulation

Not only is it important to bring all of the right stakeholders to the table, but it is equally vital that the United States take proactive action. The primary means of doing this could be either importing E.U. privacy and data security legislation to the United States or supporting the exportation of the GDPR's ideology, particularly the notion of privacy by design, to an international context or to a variety of supra-regional contexts. There is currently a social awakening taking place in the United States in which the public is becoming increasingly aware of the insight that can be gained from big data analysis, and also of the implications for potential influence and/or manipulation of cognitive, attitudinal, and even behavioral tendencies (Smith, 2018). While far from always malicious, it is micro-targeting that facilitates such behavior. Tackling big data collection and its exportation to third parties, therefore, is the ideal method for curtailing computational propaganda.

Using the privacy by design philosophy, the frameworks of differential privacy on the part of data collectors and user-driven data synthesis should be enumerated in any kind of sub-national, national, regional, or international reform moving forward. Alone, each measure offers some reduction in the applicability of private data to be mined and used for computational behavior modification. Used in conjunction with each other, these methods stand a significantly stronger chance of making meaningful progress in countering the exploitation of civil populations for social, political, and namely economic ends.

Regional regulation

In terms of scope, while such data regulation could be implemented on a national scale, with a number of leading data collectors and their third-party partners being multi-national actors, national legislation alone is unlikely to have a significant and lasting effect outside of the consumer markets that the global data sector finds attractive. Hence, utilizing an approach like the GDPR to achieve regional legislation that bundles a number of smaller consumer markets into a significantly larger one changes the game for data companies. Instead of facing potential consequences or banishment in a small number of countries totaling a few million consumers or less, regional legislation like the GDPR carries consequences across a continent-sized region with hundreds of millions of consumers. Based on such thinking, a model implicating the United States could involve primarily Canada, and possibly Mexico, in formulating a North American regulatory agreement that would stand between the technology companies and a market of potentially 480 million consumers. As a profit-driven industry, data companies would then have a vested interest in staying in the market and thus abiding by data protection regulations such as privacy by design, i.e. differential privacy and data synthesis.

International regulation

As the Internet is a global phenomenon, the more states bundle up to constitute larger and larger consumer markets, the more they compel data companies into compliance. International data legislation, hence, would be the most ideal scenario. This would only be effective, however, as long as the threat of a substantial enforcement mechanism existed. International law is not precisely known for its enforceability; thus, such international data regulation would need to proceed in a legal venue where signatories to some sort of international agreement would be bound into enforcement, such as a U.N. Security Council resolution. With the United States being a permanent member, along with two other victims of computational propaganda efforts (the United Kingdom and France), and sitting opposite two other permanent members believed to be implicated in the use of computational propaganda (Russia and China), a binding Security Council resolution would be an intense geo-political fight resting on the ability of either side to mount support among the non-permanent members of the council.

An internationally binding resolution would be harder to achieve but would be more universal and forceful in terms of compelling compliance; to the contrary, regional legislation would be easier to achieve but would have less effect and pose other challenges. A multitude of regionally unique regulatory systems would also place the burden of compliance on multi-national corporations to abide by distinct rules in each region in which they operate. Still, from the liberal and collectivist perspective, international data regulation is the most ideal way forward, and given the growing number of known computational propaganda efforts outside of Europe and the United States, namely in Mexico (Glowacki et al., 2018), Brazil (Arnaudo, 2017), and Japan (Schäfer, Evert, & Heinrich, 2017), it stands to reason that the three permanent members of the Security Council in favor of regulation would have an advantage in garnering support from Latin American and Caribbean, Eastern European, and Asia-Pacific non-permanent council members.

Increased Algorithm Oversight

Where data regulation targets one piece of the micro-targeting function, there must be a reciprocal policy stance that targets the other piece: algorithms. In current business models, algorithms constitute the pumping heart of many data and technology companies. Without their driving algorithms, companies like Google, Facebook, and Twitter would likely be uncompetitive in a global business market, if existent at all. Therefore, to protect their most valuable assets, such companies opt out of more public forms of intellectual property protection, including copyrighting or patenting their algorithms, in favor of trade secret protection under frameworks such as those of the U.S. Patent and Trademark Office or WIPO.

Broadly speaking, any confidential business information which provides an enterprise a competitive edge may be considered a trade secret … The unauthorized use of such information by persons other than the holder is regarded as an unfair practice and a violation of the trade secret. (WIPO, 2019)

In capitalist markets, such companies do have genuine vested interests in pursuing private protection, and such pursuit is not necessarily nefarious. As algorithms in particular are deemed proprietary information by technology companies, they are kept as secret as possible, and trade secrets are the legal means, both in the United States and internationally, of ensuring the protection of such exclusive information.

However, in a society where data companies underpin the very fabric of social, political, and legal processes through their control of user-driven telecommunication platforms, there needs to be increased pressure for public accountability. And indeed this is happening; states are now debating and looking to implement regulatory frameworks over such companies. The number of times alone that Facebook CEO Mark Zuckerberg and Google CEO Sundar Pichai have been called to testify before legislative bodies around the world reflects this growing sense of public accountability. As noted in the discussion of multi-stakeholder driven cooperation, however, the technology economic sector is a critical component of addressing the issues of data manipulation and computational propaganda. Approaching these companies with aggressive and even hostile methods will not be successful in the long run, and communication platforms like Google and Facebook will not be going away anytime soon. Hence, to ensure long-term cooperation, data and technology companies must be approached in an inclusive manner, primarily one that seeks to respect, if not protect, their competitive edge in a global business market.

Multi-lateral transparency and evaluation

Therefore, a policy stance must be taken to reform the business market as it pertains to algorithm protection, at least in the purview of data and technology companies. Where there is no need for such proprietary information to be made fully public, which would reduce the effectiveness of the algorithm, there is a need to expand the oversight of algorithms held under trade secret protection. Interestingly, noting the prominence of governmental foreign affairs departments, forms of multi-lateral oversight can be achieved through sub-departments centered on economic interests, business practices, and the protection of intellectual property. For example, the U.S. Department of State operates a Bureau of Economic and Business Affairs with the aim of expanding U.S. financial interests abroad. A sub-office within this bureau is the Office of Intellectual Property Enforcement. While not responsible for awarding intellectual property protection, the office does address the uses of, and abuses against, intellectual property protections. Retooling such an office for the added role of evaluating the potential abuses of algorithms under trade secret protection could provide an added level of transparency, evaluation, and, most importantly, accountability while still maintaining the general secrecy of the proprietary intellectual property.

International transparency and evaluation

While options exist on a bi- and multi-lateral level for state-based algorithm oversight, there are also possible avenues in the international arena. The most likely international actors that could facilitate such a process would be the World Trade Organization (WTO) or the World Intellectual Property Organization (WIPO). Namely, the creation of a double-blind review panel of data and technology experts could add a level of oversight for practical application, i.e. a group of experts to review potential implications and perform an impact analysis for the use of such algorithms. Such a group of experts could mirror the EuroDIG model, as previously discussed.

The prospect of an international entity exercising oversight over a unique sector of business and trade, such as algorithms used for data collection and analysis, could be viewed as an infringement on some states' rights. Despite such concerns, the United States pushing such an agenda in the international system could open a route toward multi-lateral uniformity through a singular process. Similar to the Paris Agreement on climate change, options exist for addressing economic and intellectual property regulation that get the majority of states on board in one succinct process that applies to all.

Key to this, however, is getting the primary players that drive the technology industry on board. This means countries like the United States, United Kingdom, Germany, Japan, South Korea, China, Russia, and Israel. Where these countries are already participating members of such international organizations, they already see the benefits of taking part in such a system. Therefore, to promote an oversight process for algorithm governance, i.e. increased coding transparency and impact evaluation and assessment, such changes must be framed in ways that appeal to the uni-lateral interests of the key players.

Computational Propaganda Detection Network

The primary method for detecting computational propaganda exists in social monitoring and social listening (Freberg, 2019). This is precisely where the public relations function behind public diplomacy arises as an adversary to computational propaganda. Public relations constitutes the behavior of one actor attempting to influence and mold opinion within a public audience, and to achieve such goals in a digital information environment, that actor must monitor online content and listen to online conversations to determine trends. As a primarily corporation-dominated industry, the public relations field already recognizes the critical value of social listening and has invested copious resources and fiscal assets into setting up a global network of social listening centers. Most prominent is the public relations firm Golin's Bridge network. Each of the firm's Bridge offices comprises a social media monitoring and listening command center for client interests in a specific region. With offices in North America, South America, Europe, Asia, and Australia, Golin staffs a global detection network that can pinpoint any social media content that falls within the interests of one of its clients (Boland, 2017; Scott, 2012).

This is precisely the kind of global social listening structure that the United States must engage in to detect and track computational propaganda.

In fact, this notion of global cyber monitoring is already well in practice in security circles. For example, in 2010 the U.S. Navy reactivated its Tenth Fleet as a force provider for the newly created U.S. Fleet Cyber Command. This command constantly monitors the information network security of global U.S. naval operations. That is, alongside similar structures across U.S. military branches and under the direction of U.S. Cyber Command and U.S. Strategic Command, the Tenth Fleet engages in in-depth monitoring of cyber activity from a myriad of points around the globe. In line with the Western ideology of cybersecurity, however, these military monitoring and deterrence centers address traditional cybercrimes, including network hacking, network tampering, and information theft. They do not address more social forms of information intervention.

Likewise, the U.S. Department of State has made efforts at advancing national and international cybersecurity, particularly in the realm of information intervention; however, the product of such efforts has been significantly dwarfed in size and funding by the military-driven approach of Cyber Command and the Tenth Fleet. In 2016, then-Secretary of State John Kerry reorganized the Center for Strategic Counterterrorism Communications into the Global Engagement Center (GEC). With the digital decline of terrorist groups like ISIS, the State Department shifted the efforts of the GEC to "lead, synchronize, and coordinate efforts of the Federal Government to recognize, understand, expose, and counter foreign state and non-state propaganda and disinformation efforts" (Loomis, Powers, & Rahimi, 2018, p. 131). With the very clear goal of addressing state-sponsored propaganda and computational propaganda, the GEC would be a logical established organizational structure through which a U.S. team of social listening practitioners and researchers could attempt to identify, track, and counter computational propaganda efforts.

Taking such a model into account, if computational propaganda is viewed as a legitimate threat, then establishing a social listening structure is critical for deterrence and counter-influence efforts. Using methods such as Sybil detection to supplement other methodologies like social network analysis and network visualization, a global computational propaganda detection network can be established. Whether such a U.S. network would be enveloped into existing global monitoring structures like U.S. Fleet Cyber Command, adopted by the GEC and deployed around the globe using the U.S. embassy and consulate networks abroad, or contracted out to private public relations firms like Golin, Edelman, Ogilvy, Weber Shandwick, or FleishmanHillard, for example, such a network will be vital in any effort moving forward to combat computational propaganda.
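To make the detection methods concrete, the sketch below illustrates one simple coordination signal of the kind a Sybil-detection or social network analysis pipeline might build on. It is an illustrative assumption, not drawn from any named system: it flags pairs of accounts that post identical messages within a short time window, a common low-level indicator of automated amplification.

```python
from collections import defaultdict
from itertools import combinations

def coordinated_pairs(posts, window_seconds=60):
    """posts: iterable of (account, message, unix_timestamp) tuples.
    Returns the set of account pairs that posted an identical message
    within window_seconds of each other -- one simple coordination signal."""
    by_message = defaultdict(list)
    for account, message, ts in posts:
        by_message[message].append((account, ts))
    flagged = set()
    for entries in by_message.values():
        # Compare every pair of accounts that shared the same message.
        for (a1, t1), (a2, t2) in combinations(entries, 2):
            if a1 != a2 and abs(t1 - t2) <= window_seconds:
                flagged.add(tuple(sorted((a1, a2))))
    return flagged
```

An operational detector would combine many such features (posting cadence, follower graphs, content similarity) before classifying an account network as inauthentic; this single signal simply shows the kind of computation a listening center would run at scale.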

The private sector, primarily technology corporations and public relations firms, has far surpassed any public governmental attempt at social monitoring and listening on a global scale. Therefore, any organized effort to identify, track, and counter information disorder as spread via computational propaganda must be inspired by what has already been achieved in private industry. To see farther into the world of computational propaganda, future efforts must stand on the shoulders of social listening giants like Golin's Bridge and Facebook's election war room.

Functional Cooperation Programs

While a myriad of stakeholders are interested in the topics of computational propaganda and Internet governance, to date most cooperation between these stakeholders has taken the form of discussion. While research, conversation, and debate on such issues are critical, they lack the proactive behavior needed to address the threats of computational propaganda. Therefore, multi-stakeholder cooperation must be advanced toward a framework of behavior that takes a more aggressive stance. This can be achieved through establishing a multi-stakeholder framework for capacity building, mutual assistance, and spontaneous information.

Where core states like the United States maintain certain capacities for Sybil detection and large-scale social network analysis, namely through their militaries, domestic private sectors, and academic institutions, the assets and resources of most other countries around the world, the semi-periphery and periphery states, do not facilitate similar capabilities. Establishing a capacity building program in which U.S. experts teach and train assets in other allied countries in the methodologies of Sybil detection and social network analysis would begin the process of arming an increasing number of friendly stakeholders with the means to identify and potentially counter computational propaganda.

Implementing frameworks for mutual assistance and spontaneous information, as laid out in the Budapest Convention on Cybercrime, would allow for increased cooperation between the stakeholders committed to capacity building programs. Where there may be some concern that strengthening states’ defensive capabilities around disinformation may lend to growing their offensive capabilities in this area, such concerns are warranted and merit attention and debate.

The presence, however, of a security dilemma perspective only paralyzes cooperation around this issue. Core states may find it in their interests to retain exclusive knowledge of Sybil detection and social network analysis, but no single state actor can monitor all of the Internet's content alone. Only through cooperation and a substantial division of labor can the best effort be made to identify, track, and counter the spread of malicious misinformation.

Interestingly, the E.U. may be ahead of the United States in building such a structure. In anticipation of disinformation campaigns during the May 2019 E.U. elections, the supra-national organization took preemptive measures to establish "a new warning system" (de Carbonnel, Macdonald, & Vey, 2019, para. 4). This system would facilitate an avenue through which participating E.U. states could share information on perceived disinformation campaigns, primarily those enacted by foreign state actors. Such an information security-sharing system enumerates precisely the kind of mutual assistance and spontaneous information cooperation discussed in this research.
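To make the idea of mutual assistance and spontaneous information concrete, a warning system of this sort ultimately reduces to partners exchanging structured threat records over a shared feed. The sketch below is purely hypothetical; the record fields are invented for illustration and are not drawn from any E.U. or treaty specification.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class DisinfoIndicator:
    """One hypothetical record in a partner-to-partner warning feed."""
    reporting_state: str    # member state that spotted the campaign
    campaign_hashtags: list # hashtags the suspected network amplifies
    suspected_origin: str   # attribution, however tentative
    confidence: float       # 0.0-1.0, the reporter's own assessment

def to_wire(indicator):
    """Serialize for exchange; JSON keeps the format vendor-neutral."""
    return json.dumps(asdict(indicator), sort_keys=True)

def from_wire(payload):
    """Reconstruct a record received from a partner."""
    return DisinfoIndicator(**json.loads(payload))
```

The design point is that spontaneous information sharing requires no shared infrastructure beyond an agreed record format: any participating state can emit `to_wire(...)` output and any other can consume it with `from_wire(...)`.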

While the United States may, or more likely may not, be privy to this threat intelligence network, there already exist a number of U.S. programs that could potentially facilitate, or contribute to, such a framework. For example, as laid out in the U.S. Department of State's annual report on public diplomacy efforts (Loomis et al., 2018), one such program is the European Digital Diplomacy Exchange (EDDE), a collaborative network program connecting U.S. digital diplomats with their counterparts across Europe. EDDE is "an intergovernmental network of digital diplomacy practitioners from partner governments committed to increasing the ability to execute effective digital engagement and communications. Participants work collaboratively and exchange best practices" (Loomis et al., 2018, p. 173). Therefore, adapting such an established inter-governmental program for the purposes of capacity building, mutual assistance, and spontaneous information would be both simple to do and would achieve a new level of multi-lateral cooperation in the fight against computational propaganda.

Further, while the GEC represents a large-scale shift in the State Department's methods for tackling information operations, the EUR/PD Strategic Communications Unit (StratComm) presents a smaller-scale perspective, similar to that of EDDE. Addressing Russian-sponsored disinformation specifically:

StratComm coordinated the work of the Russian Information Group, a group cochaired by EUR/PD and the U.S. European Command (EUCOM) that implements a whole-of-government response to disinformation, aligning interagency interests and resources against priority lines of effort … StratComm also assisted in refuting false Russian narratives, and building resilience among foreign audiences to recognize disinformation. (Loomis et al., 2018, pp. 172-173)

Thus, what is seen in public documents is a myriad of efforts across governmental agencies designed to tackle the use and effects of computational propaganda. Where such efforts may collaborate with multi-lateral contacts and assets, as in the case of EDDE, the scope of the cooperation is not effectively designed for the core purpose of establishing a computational propaganda detection network through capacity building, mutual assistance, and spontaneous information, i.e., building a multi-lateral social listening and information-sharing structure.


Another existing multi-lateral framework of which the United States is already a functioning member, and which could serve as a venue, if not template, for information sharing on disinformation campaigns, is the UKUSA Agreement. The agreement is an Anglophonic intelligence alliance addressing the collection and sharing of signals, defense, human, and geospatial intelligence. This threat intelligence cooperative, more often referred to as the Five Eyes (FVEY), includes the United States, United Kingdom, Canada, Australia, and New Zealand (NSA, 2019). What is more, the geopolitical makeup of the Five Eyes has been evolving to tackle contemporary threats, namely with increased information sharing with Germany and Japan as a means to address the rise of Chinese influence efforts (Barkin, 2018). The existence of such a program is important because it shows that states recognize not only a need for such multi-lateral cooperation, but that such cooperation is possible between friendly allied states. Many of the key organizations, actors, and assets already exist; there simply needs to be a shift in policy to reflect a cooperative stance toward computational propaganda information sharing.

Public Opinion Management

Lastly, not only do the previous policy recommendations hold promise in outlining a structure for combating computational propaganda, but to achieve such policy it must be presented to the U.S. public in ways that garner substantial support. One particular means for doing this lies in an argument made earlier in this research, namely that methods of content regulation may be passable in national governing systems but are highly ineffective in an inter-state driven cyber environment. Second, laws surrounding content regulation are unlikely to ever see much progress on the international stage due to the freedom of expression divide between democratic and authoritarian leaders in technology and international influence. Hence, not only must infrastructural regulation be framed in such an international light, the approach must also hold particular merit in a social environment representative of the late twenty-teens, where U.S. public sentiment for oversight and regulation is largely falling on the technology sector, i.e., Facebook, Twitter, and Google.

Moving forward with multi-lateral cooperation around telecommunication regulation would prove beneficial for two of the primary stakeholders outlined in the framework of multi-stakeholder cooperation: the U.S. federal government and the technology economic sector based in Silicon Valley. With such negative sentiment in the United States over the abuses of information security (Smith, 2018) and computational propaganda (Doherty, Kiley, & Johnson, 2018), advocating governmental policy positions that are tough on the technology sector will look favorable among constituents angered over the issues of data collection, micro-targeting, and political manipulation. Such a framework also works well for the technology economic sector, which faces a massive public relations crisis (Smith, 2018). Where U.S. public sentiment is highly critical of the administrative oversight of companies like Facebook and Google, public policies that stress governmental oversight provide a key crisis communication opportunity for members of the technology sector to openly and publicly work with governments to eradicate the issue, thereby ideally contributing to the reputation management and restoration efforts already underway by the sector's public relations practitioners and consultants.

Additionally, such public policy positions hold significant merit for the revitalization of public diplomacy, digital diplomacy, and its practitioners. When the U.S. State Department was implicated in the use of micro-targeting to push state-sponsored content on social media, public interest was present primarily among academic and news media institutions. While this first instance did not openly evoke a public crisis for the State Department, such behavior is unlikely to stop, and affixed beside the growing abuses of computational propaganda, a future clash in the realm of public opinion is inevitable. Therefore, it is critical for U.S. diplomatic personnel and institutions to publicly position themselves as the solution to the problem of computational propaganda rather than eventually being associated with similar behavior.

As G. Golan (personal communication, August 23, 2018) notes, public diplomacy is often viewed in the public eye as a set of soft power programs designed for feel-good effects. The public image of language programs, cultural exchanges, embassy reading rooms, study abroad opportunities, and social media-driven question and answer sessions can seem frilly and superfluous, particularly in the eyes of executive and legislative appropriation committees facing a growing number of national security concerns. To manage both U.S. governmental and public opinion, the U.S. foreign and civil service officers responsible for public diplomacy and digital diplomacy must change the tone of their mission and realign themselves to fit more centrally in current national security discussions.

Where public diplomacy was the tool the United States turned to in order to combat Soviet propaganda in the 20th century, the same approach and mentality must be taken to position digital diplomacy and its practitioners as the chosen tool to combat computational propaganda. While it is true that some of these steps have been taken, including the State Department's retooling of the GEC, the creation of EUR/PD StratComm and EDDE, the planning of a Persian-language digital network, the strengthening of the VOA Russian service, and increased digital media support in Tajikistan, such measures are largely unknown to the civil public.

While U.S. residents are generally aware that the federal government is working to deter, impede, and counter state-sponsored disinformation, few would be able to name the agencies involved in such work. While the aforementioned programs within the U.S. State Department are of public record, little can be found on them. Indeed, this research only learned of them via a U.S. Advisory Commission on Public Diplomacy report that was 215 pages in length (Loomis et al., 2017). Further, even when the programs are mentioned, less than a paragraph of content was provided detailing their work.

Gone are the days when the U.S. State Department was associated with commanding and daunting leaders like John Marshall and Henry Kissinger. In a time of tightening budgets, increasing governmental shutdowns, and reputation issues such as the 2012 Benghazi attack, the 2016 Clinton email controversy, and the Rex Tillerson firing, the U.S. State Department needs to revitalize its image and publicly position its public diplomacy resources and assets to wage 21st century information campaigns. Constructing a strategic narrative that harkens back to the Department's role in the Cold War is precisely a method of achieving such a reputation restoration.

Limitations and Suggestions for Future Research

This study is a unique piece of research, not for any form of novel methodological or theoretical contribution, but rather for the unorthodoxy of its interdisciplinary scope and its unconventional presentation as a piece of doctoral research. While it does attempt to advance unique perspectives on the current landscape of cyber-driven information intervention and solutions through which cooperative action can use information politics to address the growing abuses of information operations, it does present limitations. The section below outlines a mere handful of such limitations and offers insight for future scholars approaching similar areas of research.

First, flaws in the conceptual scope of this research and its policy recommendations must be addressed. The geopolitical nature of the international system is one situated in anarchy and governed by realist thinking and perspectives. That is to say, all state actors seek competitive advantages over both their opponents and their allies, and they make use of any and all means available to achieve a sense of security. In the end, offense makes for the best defense in international politics and security. To the contrary, however, this research promoted perspectives and policies that embrace neoliberal thinking, namely the need for states to cooperate and work together to tackle problems that are beyond the scope of any one actor.

The Internet is a globally accessible information communication technology (ICT), and as long as democracies like the United States push for one universally free and open Internet, they inadvertently mandate a parallel need for cooperation surrounding the governance of such a network. Three options exist for the United States. The first is forcing uni-lateral Internet governance as the primary national power over the technology economic sector, i.e., Silicon Valley. Such action is likely to result in increased animosity among the great powers and push state actors toward a China model in which states operate smaller, more national or regional versions of the Internet. Second, the United States could opt into a China model itself and encourage the continued stratification and segmentation of the World Wide Web 2.0 into smaller networks. Or third, the United States can acknowledge the fact that, based on the ideals of democracy and free expression, the Internet should remain public and freely accessible, thus mandating a renewed willingness to find ways to cooperate with other state actors.

The key to the potential success of using Internet governance to tackle computational propaganda and other forms of bot-based disinformation is that the notion of cooperation does not necessitate complete cooperation with all actors in the international system. The more countries that can be brought on board in terms of capacity building, mutual assistance, and spontaneous information, the better; to that end, such cooperation does not mandate some 200-odd countries partaking in this process, as illustrated by FVEY. The United States can be strategic about with whom it cooperates and how much intelligence it shares with those actors.

Whether it be close allies like the United Kingdom and Israel, a majority of Westphalian democracies, or a significant part of the liberal international order, there exist a number of avenues open to the United States.

In spite of the liberal, idealistic, and possibly utopian conceptual perspective that might limit the applicability of this research and its recommendations in a realist international political order, there are signs that it does in fact represent some realistic interests. Given the multitude of calls for international cooperation surrounding Internet governance (the U.N. GGE, the Digital Geneva Convention, and the Paris Cyber Agreement), the existence of FVEY as a threat intelligence alliance in which information is shared among allies, the restructuring of the U.S. Navy's 10th Fleet and the U.S. State Department's GEC, and the creation of an information warning system in the E.U. prior to the May 2019 elections, there is clearly interest and desire from a multitude of actors to see progress made in this area. What this research aims to contribute is a list of recommendations that should be considered, debated, and possibly adopted in future efforts to craft public policy on Internet governance.

Second, the methodology of this study employed a qualitative approach guided by an inductive theoretical framework, i.e., grounded theory. As outlined in Chapter Three, this presents a research perspective that is naturally subject to the background and worldview of the researcher. That is, irrespective of the researcher's effort to remain objective and impartial through the use of the constant comparative method, the collection, analysis, and interpretation of data were ultimately colored by the engaging researcher. A suggestion for future research in this area would be to find ways to employ a more mixed methods approach to better orient possible insights with the largely quantitative nature of mass communication and public diplomacy research.

Third, for the portion of the methodology devoted to in-depth expert interviews, a total of 24 interviews were conducted. While this satisfies the methodological rigor of a qualitative approach, namely reaching saturation above 20 interviews, the interdisciplinary nature of the study's approach limited the depth of expertise in specific areas. In all, referring to formal backgrounds and education, a total of four interviewees were in law, three were in international relations and diplomacy, three in cybersecurity, two in computer science and computer engineering, and twelve in mass communication. It is suggested that future research attempt to better balance the perspective of the in-depth interview data by expanding the breadth of expert insight in the fields of law, international relations and diplomacy, cybersecurity, and computer science.

Further, of the 24 experts interviewed, 22 were members of the academic community with advanced expertise in their areas and, at best, some first-hand experience in the field. Only two of the interviewees held primary employment outside of academia at the time of the interview. This imbalance between practice and the academy was not for a lack of effort in recruiting relevant experts in the field, but rather stemmed from the inability of the researcher to tap social and professional networks with contacts in the appropriate areas, as well as the inability of the researcher to secure interest on a cold-calling basis (via email).

In particular, the topic material of the research in the social context of the time period made for an uninviting community amongst practitioners. In reaching out to an unnamed practitioner with the U.S. Federal Communications Commission, the study's researcher received a response indicating that any discussion of cybersecurity matters relating to national security and election security would have to be cleared through an extensive network of supervisors, lawyers, consultants, and governmental ethics committees. After three months, a brief reply of "denied" was received without explanation or advice on how to move forward. Therefore, it is suggested that future research in this area, and on any topics dealing with national security, cybersecurity, and material generally requiring a security clearance, make supplementary efforts to better mine social and professional networks in order to reach and secure expert practitioners for the interview process.

In an overview of other insights drawn from the study and its pertinent data, it is suggested that the field of public diplomacy and its respective scholars move to consider the relation between public diplomacy and PSYOPs. As society increasingly sees a relationship between information politics and information warfare, particularly one being played out in cyberspace, the relationship between public diplomacy and PSYOPs has never been more manifest. Interestingly, where public diplomacy scholarship often studies the topic area in relation to propaganda and public affairs, it rarely, if ever, considers the relationship with PSYOPs (or MISO in the U.S. context). Indeed, outside of Pamment (2013), no public diplomacy scholarship was located that included the terms psychological operations or PSYOP in the text.

To the contrary, however, a host of literature fixed in combat and military perspectives addresses the two in relation to each other (Goldstein, 1996; Farwell, 2012; Freeman, 2006; Taylor, 2007; Wallin, 2015). In exploring the sub-disciplines of strategic political communication, Farwell (2012) addresses propaganda, public affairs, public diplomacy, and PSYOPs, viewing them as similar typologies under a single categorization. Goldstein, a former colonel in the U.S. Air Force, goes so far as to equate public diplomacy with overt political warfare, and adds that,


Psychological operations … do not have a separate compartment. They may at times be a part of public affairs, civic action, troop information, civil affairs, public diplomacy, humanitarian aid, or political action … everyone can and should aid in the psychological enhancement of all programs to gain and maintain the allegiance and support of the target audience: the population itself. (1996, p. 330)

This suggests that the perspective on the relationship between the two is not reciprocal across the respective fields of study. Further, where the growing notion of information warfare characterizes how scholars and practitioners in security circles view public diplomacy, this would suggest a significant gap in the way public diplomacy scholars view the concept, making such an area a prime topic for study in the growing age of state-sponsored disinformation and digital information campaigns.

Another insight gained through the progression of this research is the marked difference between Russian and Western views on cybersecurity. Namely, for years Russia has addressed information security where Western state actors like the United States have addressed concerns for cybersecurity, detailing the technical nature of online threats to online infrastructure. The Russian perspective, however, conceives of information security both in a technical sense, i.e., technical security and fortifications, as well as a social one, namely the use of user-generated content for strategic goals. Indeed, as the first sovereign state to ever establish a modern propaganda machine for the purposes of international public opinion management, via the Communist International (Comintern for short), Soviet Russia and now the Russian Federation are longstanding actors in content-driven information manipulation.

Understanding that Western security scholars and practitioners view cybersecurity as the extent to which online adversaries pose threats to cyber infrastructure and stored information, i.e., the use of viruses, worms, and Trojan horses to achieve behaviors such as wiretapping, port scanning, keystroke logging, screen scraping, denial of service attacks, spoofing, ARP poisoning, Smurf attacks, buffer overflows, and format string exploits, it can be recognized that such concerns hold no relevance to the use of platform content to achieve social repercussions. That is, the goal of disinformation campaigns is not to affect cyber infrastructure, but rather to use it to deliver content aimed at the psychological manipulation of human populations. Thus the areas of computational social science and social psychology are natural areas for collaboration in this new perspective of social cybersecurity. Dr. Sauvik Das has begun some work on the social implications of cybersecurity, though his research is geared toward encouraging good security behavior on the part of platform users and data subjects. This research rather urges the need for a conceptual shift, encouraging scholars and practitioners in the cybersecurity community to treat such content-driven, psychological threats as security threats in their own right.
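A minimal illustration of what a social, content-level detector looks for, here, identical messages pushed by several distinct accounts inside a short time window, a coordination pattern that no infrastructure-focused monitor would register as an attack. The thresholds and toy data below are invented for the example; real systems work over near-duplicate text and far richer behavioral features.

```python
from collections import defaultdict

def coordinated_messages(posts, min_accounts=3, window=60):
    """Flag texts pushed by many distinct accounts in a tight time
    window. posts: iterable of (timestamp_seconds, account, text)."""
    by_text = defaultdict(list)
    for ts, account, text in posts:
        by_text[text].append((ts, account))
    flagged = []
    for text, hits in by_text.items():
        hits.sort()  # order by timestamp
        for i in range(len(hits)):
            # distinct accounts posting within `window` of hits[i]
            inside = {acct for ts, acct in hits
                      if 0 <= ts - hits[i][0] <= window}
            if len(inside) >= min_accounts:
                flagged.append(text)
                break
    return flagged

# Invented toy feed: three accounts push the same slogan within 20s;
# one organic post and one much later repost are left alone.
toy_posts = [(0, "a", "vote X"), (10, "b", "vote X"), (20, "c", "vote X"),
             (0, "d", "lunch"), (5000, "e", "vote X")]
```

Here `coordinated_messages(toy_posts)` would flag only `"vote X"`. The code touches no packet, port, or server, which is exactly the point: the threat surface is the content layer, not the infrastructure layer.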

In his inaugural address at the Oxford Internet Institute at the University of Oxford, Dr. Philip Howard suggested the need for social scientists and computer scientists to engage in in-depth collaboration to address the growing field of computational social science and, specifically, the issue of computational propaganda. This research extends that call to suggest using such an interdisciplinary approach to pioneer the field and study of social cybersecurity, in which the intended effects of cyber-attacks are not infrastructural manipulation, but psychological manipulation through the transmission of information via cyber infrastructure. The existent model of cybersecurity that dominates the Westphalian order is incapable of conceiving, much less addressing, this newer form of psychological information cyberattack. It thus falls on interdisciplinary-driven scholars to advance the concept of social cybersecurity among a host of scholarly academies, industry practitioners, and the public.

Lastly, the premise of all research is founded on a host of basic assumptions. Some, ontologically speaking, can be investigated through an epistemology and found to be either valid or inaccurate. One such assumption at the core of this research is the belief that computational propaganda has in some form or fashion influenced political processes in democratic states. This would suggest a violation of the most basic democratic principles that govern free and fair elections. However, as B. Valeriano (personal communication, October 11, 2018) expresses, society still lacks the most basic forms of scientific evidence empirically suggesting that computational propaganda did in fact have an effect. Political and social institutions across the globe have been shaken by the notion that Russians used bots and social media platforms to spread malicious disinformation designed to circumvent political autonomy and sovereignty, and while evidence exists to suggest that such attempts were carried out, there is no tangible research documenting the presence and size of the effect this had on voting intention and behavior. Given the growing research on such security concerns, it is suggested that epistemologically sound methodologies be employed to empirically support or refute the notion that computational propaganda is linked to some kind of media effect, i.e., some form of cognitive, attitudinal, or even behavioral change related to democratic elections and voting processes.
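The shape of such an empirical test can be sketched simply: compare attitude change between users exposed to a campaign and a comparable unexposed group. The sketch below uses invented synthetic scores and a naive difference-in-means; a credible design would require randomization or matching, and nothing here constitutes evidence of an actual effect.

```python
from statistics import mean

def exposure_effect(exposed, control):
    """Naive estimate of a media effect: the difference in mean
    attitude shift (post minus pre) between an exposed group and an
    unexposed group. Each group is a list of (pre, post) scores."""
    def shift(group):
        return mean(post - pre for pre, post in group)
    return shift(exposed) - shift(control)

# Invented pre/post attitude scores on a 0-10 scale.
exposed = [(5, 4), (6, 4), (7, 6)]
control = [(5, 5), (6, 6), (7, 7)]
# exposure_effect(exposed, control) -> about -1.33
```

The point is not the arithmetic but the epistemology: an effect claim requires a counterfactual comparison of this shape, which is precisely what the literature on computational propaganda so far lacks.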

Conclusion

The 1990s and early 2000s saw the mass diffusion and adoption of digital ICTs such as the Internet and social media platforms. It is in the twenty-teens, however, that these communication media have reached enough saturation for society to begin to recognize the social and political consequences they are having. From Facebook depression, to cyber aggression, to dopamine-driven technology addiction, we as a collective society are just now beginning to truly recognize, acknowledge, and react to the negative impacts such technologies have had.

The U.S. presidential election and British referendum of 2016 will forever mark the formal introduction of computational propaganda and misinformation to the world stage. The time frame may, in fact, be far from the beginning of such cyber activities, but 2016 constitutes the first time public opinion was made aware of such efforts in practice. While this problem of information manipulation and information disorder is far from new, 2016 ushered in a new installment of a centuries-old information-persuasion paradigm, one driven by the facilities and capabilities of digital ICTs and their infrastructure. The implications of such capabilities are beyond current comprehension, much like the thought of data mining and micro-targeting was far beyond the comprehension of the general public in 2006.

With such a novel approach to the traditional practice of state-sponsored disinformation, all levels of society are struggling to figure out how to cope with, and defend against, computational propaganda. With the failure of self-regulation in the technology sector, state actors like the United States are now seeking more direct paths for public accountability and, ultimately, government oversight. This has spurred uni-lateral state legislation aimed at combating what the public ambiguously refers to as fake news, with notable efforts in the United States, United Kingdom, and France, and even early codification in Germany. With the unique international nature of the World Wide Web, however, uni-lateral behavior is destined to be insufficient as long as states continue to pursue a free and open Internet for all. Hence, bi- and primarily multi-lateral cooperation is necessary. Where this naturally juxtaposes the realist and anarchic nature of the international order, a collectivist approach is the only foreseeable way to move society forward.

Where some multi-lateral efforts have been made, such as the U.N.'s GGE, the Paris Cyber Agreement, and calls for a Digital Geneva Convention, no practical progress has been made to date. The impetus of this research is thus to present a framework for forward-moving progress in the United States based on other multi-lateral frameworks already in effect. Still, with a conceptual framework ready for application, a relevant outlet was necessary for widespread application. This suggested outlet is the realm of public diplomacy and its recent computational manifestation, digital diplomacy.

Where public diplomacy was the antithesis to propaganda in the 20th century, it is digital diplomacy that is ripe to be the antithesis to computational propaganda. The natural world operates in harmony and balance, and it is only appropriate to confront deconstructive information-persuasion efforts with a reciprocal force for constructive information-persuasion.

Indeed, U.S. public diplomacy efforts of the 21st century have faced a crisis of the soul of sorts, namely an inability to produce results as public and grandiose as those achieved in the Cold War. With an inability to produce results in a manner meeting the expectations of U.S. lawmakers and those responsible for appropriations, computational propaganda offers both a modern and public battlefield in the area of social cybersecurity where digital diplomacy and public diplomacy practitioners can once again present themselves to the United States public as the solution to pressing national security threats.


APPENDIX A
INTERVIEW PROTOCOL

Introduction: Hello [Insert Name],

Thank you for your participation in today’s interview. I am interested in better understanding the relationship between public diplomacy, public affairs, propaganda, and psychological operations in the framework of international law. This requires having a deep understanding of each concept and how it is carried out in practice. Due to your expertise in [Insert One: Public Diplomacy, Propaganda, Psychological Operations, International Relations, International Law], I am interested in your thoughts on this issue.

During this interview, please feel free to share any and all thoughts, ideas, and opinions you have. I am looking to gather insight from a variety of experts across multiple fields, so even if your ideas may be unconventional or unpopular in certain circles, please feel more than comfortable sharing them. Do you have any questions or concerns so far?

(The moderator will pause to determine if the interviewee responds in any way that would suggest he/she has a question or concern. Once these are addressed the moderator will continue.)

Our conversation today will be recorded; the recording and your responses will only be used for my current research and will not be shared with anyone else. If at any time you feel uncomfortable, or wish to end the interview, you are free to do so. For the record, please note that your coded identifier will be Interviewee [Insert: 1, 2, 3, 4, etc]. While you will receive full confidentiality in the scope of this interview and the data collected from it, you have the option to have your name, a pseudonym of your choosing, or an assigned pseudonym associated with your thoughts and opinions in the final manuscript of this study. Please clearly state which you would prefer.

(The moderator will pause to allow the interviewee to clearly express which option he/she would prefer. Once this has been determined, the moderator will continue.)

Thank you. We will now move to an array of questions surrounding the topic of state-sponsored persuasive broadcasting and international law.

1. Are you familiar with the term, Public Diplomacy?

a. If Yes: How would you most accurately define Public Diplomacy?

b. If No: I am going to provide you with the following definition: The Edward R. Murrow Center for Public Diplomacy defines Public Diplomacy as, “the influence of public attitudes on the formation and execution of foreign policies. It encompasses dimensions of international relations beyond traditional diplomacy; the cultivation by governments of public opinion in other countries; the


interaction of private groups and interests in one country with another; the reporting of foreign affairs and its impact on policy; communication between those whose job is communication, as diplomats and foreign correspondents; and the process of intercultural communications.” Further, Mediated Public Diplomacy is defined in the scope of this study as the extent to which such activities are carried out through telecommunication broadcast technologies such as radio, television, and the Internet.

i. Is there anything you would like to add or disagree with in the definition provided?

2. Are you familiar with the term, Public Affairs?

a. If Yes: How would you most accurately define Public Affairs?

b. If No: I am going to provide you with the following definition: The U.S. Public Affairs Council defines Public Affairs as, “the management function responsible for interpreting the corporation's noncommercial environment and managing the company's response to those factors” involving “the key tasks of intelligence gathering and analysis, internal communication, and external action programs directed at government, communities, and the general public.”

i. Is there anything you would like to add or disagree with in the definition provided?

3. Are you familiar with the term, Propaganda?

a. If Yes: How would you most accurately define Propaganda?

b. If No: I am going to provide you with the following definition: Whitton defines Propaganda as, “the communication of facts, fiction, argument, and suggestion, often with the purposeful suppression of inconsistent material, with the hope and intention of implanting in the minds of the ‘target’ audience certain prejudices, beliefs, or convictions aimed at persuading the latter to take some action serving the interest of the communicator.”

i. Is there anything you would like to add or disagree with in the definition provided?

4. Are you familiar with the term, Psychological Operations?

a. If Yes: How would you most accurately define Psychological Operations?


b. If No: I am going to provide you with the following definition. The U.S. Department of Defense defines Psychological Operations as: “planned operations to convey selected information and indicators to foreign audiences to influence their emotions, motives, objective reasoning, and ultimately the behavior of foreign governments, organizations, groups, and individuals. The purpose of psychological operations is to induce or reinforce foreign attitudes and behaviors favorable to the originator’s objectives.”

i. Is there anything you would like to add or disagree with in the definition provided?

5. Would you please explore any similarities and/or differences you see between mediated public diplomacy, public affairs, propaganda, and psychological operations?

6. Based on your current understanding of International Law, i.e. international norms for state behavior, do you believe any of these four concepts, public diplomacy, public affairs, propaganda, and psychological operations, violate international law?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

7. Based on your current knowledge, would you describe the United States’ Voice of America as public diplomacy, public affairs, propaganda, or psychological operations?

a. If necessary: Would you please elaborate further?

8. Based on their use of state-sponsored broadcasting systems to influence foreign audiences, do you believe the United States has engaged in foreign intervention?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

9. Based on your current knowledge, would you describe China’s CCTV as public diplomacy, public affairs, propaganda, or psychological operations?

a. If necessary: Would you please elaborate further?

10. Based on their use of state-sponsored broadcasting systems to influence foreign audiences, do you believe China has engaged in foreign intervention?

a. If Yes: Would you please elaborate?


b. If No: Would you please elaborate?

11. Based on your current knowledge, would you describe Russia’s RT as public diplomacy, public affairs, propaganda, or psychological operations?

a. If necessary: Would you please elaborate further?

12. Based on their use of state-sponsored broadcasting systems to influence foreign audiences, do you believe Russia has engaged in foreign intervention?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

13. In International Law, the concept of Self-Determination states that every country has a right to govern its own people free of external influence. Further, the concept of Non-Intervention states that governments have a responsibility not to interfere in the internal workings of other countries. Do you believe any of these four concepts [Public Diplomacy, Public Affairs, Propaganda, Psychological Operations] violate Self-Determination and/or Non-Intervention?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

14. The International Court of Justice established a precedent in Nicaragua v. USA (1986) and DRC v. Uganda (2005) for “Indirect Foreign Intervention.” This precedent suggests that countries can engage in foreign intervention without the direct use of military forces or resources. To the best of your knowledge, would you consider public diplomacy a form of indirect foreign intervention?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

15. The United States Congress is currently reviewing proposed revisions to the Foreign Agents Registration Act (FARA) that would force Confucius Institutes in the country to register as foreign agents, based on their financial backing by the Chinese Communist Party (Foreign Intervention Transparency Act). Not unlike how U.S. national laws compel social media endorsers to clearly identify their posts as advertisements, given their vested interest in the outcomes desired by the sponsor, do you feel that public diplomacy content mediated through the Internet should clearly identify its financial support system?


a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

16. There are many who believe public diplomacy in the 21st century is engaged in by governments, non-governmental organizations (NGOs), private corporations, and even private persons. To the extent that such efforts are mediated by a broadcast technology (radio, television, Internet), do you believe a state can be held responsible for the actions of corporations or private persons?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

17. The United States Intelligence Community has concluded that 13 Russian private citizens affiliated with the Internet Research Agency in St. Petersburg purchased advertisements on Facebook during the 2016 presidential election aimed at influencing the results of the election. Would you consider such behavior to be public diplomacy, public affairs, propaganda, or psychological operations?

a. If necessary: Would you please elaborate further?

18. Do you believe the Russian state shares any culpability or responsibility for the actions of its citizens?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

19. In 2009 the U.S. Department of State, under then-Secretary of State Hillary Clinton, made the digitization of foreign policy and public engagement a priority. Since that time, the majority of the global diplomatic corps has adopted digital strategies for public diplomacy. Do you feel that existing international law regulating radio and television broadcasting is capable of sufficiently regulating public diplomacy initiatives carried out on the Internet?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

20. Based on how Facebook advertisements were used in an attempt to influence the 2016 U.S. presidential election, and the following data privacy issues with Cambridge


Analytica, do you feel social media accounts operated by national governments can be used in such ways that violate international law?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

21. Considering that the Facebook advertisements were algorithmically targeted to key demographics, based on user data, do you believe the EU’s General Data Protection Regulation (GDPR) would serve as a viable option to help mitigate the targeting of such advertisements in the future?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

22. While much of diplomatic behavior is regulated by the 1961 Vienna Convention on Diplomatic Relations, this international treaty was adopted in a pre-digital age. The 2001 Budapest Convention on Cybercrime is the landmark international treaty regulating state cooperation around crimes on the Internet. To the best of your knowledge, do you consider the cybercrime convention a potential framework for regulating state-sponsored persuasive broadcasting on the Internet?

a. If Yes: Would you please elaborate?

b. If No: Would you please elaborate?

23. Now that you have answered 22 questions, you have some idea of the direction of my research. To the best of your knowledge, are there any areas of international, regional, or national legislation or case law you would recommend I investigate further?

a. If Yes: Would you please elaborate?

24. Based on the answers you have provided today, are there any additional thoughts, opinions, or recommendations you would like to offer on my current research?

25. Again, based on the answers you have provided today, are there any key experts, academic or professional, you would recommend that could benefit my current socio-legal research?


Conclusion: Thank you for your cooperation today. Before we conclude, do you have any questions or concerns?

(The moderator will pause to determine if the interviewee responds in any way that would suggest he/she has a question or concern. Once these are addressed the moderator will continue.)

Thank you again for your participation in this interview. Your insights and opinions are valuable to my research and will be extremely helpful in moving scholarship forward on the relationship between state-sponsored persuasive broadcasting and international law. Should you have any follow-up comments or queries, please feel more than welcome to reach out to me at [email protected] or to this study’s co-investigator, Dr. David Ostroff, at [email protected].


APPENDIX B
QUANTITATIVE OVERVIEW OF IN-DEPTH INTERVIEWS

Appendix B. Quantitative overview of the findings of the in-depth interviews.

Who Engages:
  Public Diplomacy: Political Institutions (n = 11); Non-state actors (n = 3)
  Public Affairs: Political Institutions (n = 6); Military Institutions (n = 2)
  Propaganda: Political Institutions (n = 6)
  Psychological Operations: Military Institutions (n = 10); Intelligence Institution (n = 1)

What Manner:
  Public Diplomacy: Overt (n = 5)
  Public Affairs: Not Addressed
  Propaganda: Overt (n = 3); Covert (n = 2)
  Psychological Operations: Covert (n = 2)

What Audience:
  Public Diplomacy: Foreign Audiences (n = 11); Mixed Audiences (n = 2)
  Public Affairs: Domestic Audiences (n = 9); Mixed Audiences (n = 4)
  Propaganda: Foreign Audiences (n = 4)
  Psychological Operations: Foreign Audiences (n = 4); Mixed Audiences (n = 1)

Method:
  Public Diplomacy: Framed Truths (n = 6); Mixed (n = 4)
  Public Affairs: Mixed (n = 2); Framed Truths (n = 2)
  Propaganda: Falsehoods (n = 4); Mixed (n = 1); Framed Truths (n = 1)
  Psychological Operations: Falsehoods (n = 1); Framed Truths (n = 1)

Flow Model:
  Public Diplomacy: Two-Way (n = 5)
  Public Affairs: Not Addressed
  Propaganda: One-Way (n = 5)
  Psychological Operations: Not Addressed

Intent:
  Public Diplomacy: Persuade (n = 8); Inform (n = 2)
  Public Affairs: Inform (n = 3); Persuade (n = 3)
  Propaganda: Persuade (n = 8); Disruption (n = 3)
  Psychological Operations: Persuade (n = 7); Disruption (n = 3)

End Served:
  Public Diplomacy: Foreign Policy (n = 5)
  Public Affairs: Domestic Policy (n = 3); Public Policy (n = 1); Economic Policy (n = 2)
  Propaganda: Not Addressed
  Psychological Operations: Foreign Policy (n = 2); National Security (n = 1)


APPENDIX C
QUANTITATIVE OVERVIEW OF THE GDPR BY THEMATIC BREAKDOWN

Appendix C. Quantitative overview of the GDPR by thematic breakdown. Each theme is listed with its overall salience via word count (N = 31,547) and its overall percent of N.

Institutional Controller (7,410 words; 23%): Emphasis on the responsibilities of the parent company that engages in data collection, i.e., Facebook, Google, etc.

International Cooperation (5,269 words; 17%): Emphasis is placed on language within the GDPR intended to facilitate collaboration between EU states.

Data Subject (4,294 words; 14%): Emphasis is placed on the rights of the user who provides their private information to institutional controllers.

Third Party (3,025 words; 10%): Emphasis on the responsibilities of the parent company when involving auxiliary entities in data collection, analysis, or other usage.

Internal Structure (2,950 words; 9%): Emphasis on the bodies and mechanisms to be created and/or used to facilitate the effectiveness of the GDPR’s goals.

Principles (2,589 words; 8%): Emphasis on the universal beliefs that necessitate a regulatory framework such as the GDPR.

Consequences of Violation (2,124 words; 7%): Emphasis on the repercussions institutional controllers will face for breaking any rules of the GDPR.

Logistics of the Document (1,859 words; 6%): Sets a framework for understanding the GDPR document, such as universal definitions.

Transparency (1,150 words; 4%): Sets a framework for how GDPR processes will operate and be held accountable.

Provisions of the Convention (537 words; 1%): Sets a framework for how GDPR processes will interact with other existing EU laws and regulations in the area of data collection.

Implementation of the Convention (340 words; 1%): Sets a framework for how the GDPR can and will be adopted and implemented across the EU.
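The arithmetic behind the table above can be verified directly: the eleven thematic word counts sum exactly to the GDPR's total word count (N = 31,547), and each theme's percentage is its word count as a share of N. The sketch below is purely illustrative; the figures come from the table, while the variable names are my own.

```python
# Check that the Appendix C thematic word counts partition the full
# GDPR text and reproduce the reported percentages (percent of N).
theme_word_counts = {
    "Institutional Controller": 7410,
    "International Cooperation": 5269,
    "Data Subject": 4294,
    "Third Party": 3025,
    "Internal Structure": 2950,
    "Principles": 2589,
    "Consequences of Violation": 2124,
    "Logistics of the Document": 1859,
    "Transparency": 1150,
    "Provisions of the Convention": 537,
    "Implementation of the Convention": 340,
}

N = 31_547
# The eleven themes account for every word of the corpus.
assert sum(theme_word_counts.values()) == N

for theme, count in theme_word_counts.items():
    print(f"{theme}: {count / N:.1%}")
```

Note that the smallest categories illustrate rounding choices: 537 / 31,547 is roughly 1.7%, which the table reports as 1%.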


LIST OF REFERENCES

Adams, M. J., & Reiss, M. (2018, March 4). International law and cyberspace: Evolving views. Lawfare. Retrieved on July 17, 2018, from https://www.lawfareblog.com/international- law-and-cyberspace-evolving-views.

Adams, T. (2014, November 30). Simon Anholt interview: ‘There is only one global superpower: public opinion’. The Guardian. Retrieved on June 20, 2018, from https://www.theguardian.com/politics/2014/nov/30/simon-anholt-good-country-party- global-superpower-public-opinion.

Akçapar, S. K., & Bayraktar Aksel, D. (2017). Public diplomacy through diaspora engagement: The case of Turkey. Perceptions, 22(4), 135-160.

Alakwe, K. W. (2017). Positivism and knowledge inquiry: From scientific method to media and communication research. Specialty Journal of Humanities and Cultural Science, 2(3), pp. 38-46.

Alberstein, M. (2017). Pragmatism and law: From philosophy to dispute resolution. New York, NY: Routledge.

Alves, L. (2018, June 29). Brazil prepping to fight fake news during October’s elections. The Rio Times. Retrieved on March 19, 2019, from https://riotimesonline.com/brazil-news/rio- politics/brazil-preparing-to-fight-fake-news-during-octobers-elections/.

Aoi, C. (2017). Japanese strategic communication: Its significance as a political tool. Defense Strategic Communication, 2(3), 71-101.

Arnaudo, D. (2017). Computational propaganda in Brazil: Social bots during elections. Computational Propaganda Research Project. Retrieved on January 14, 2019, from https://blogs.oii.ox.ac.uk/politicalbots/wp-content/uploads/sites/89/2017/06/Comprop- Brazil-1.pdf.

Armstrong, M. (2017a, January 19). The past, present, and future of the war for public opinion. War on the Rocks. Retrieved on June 20, 2018, from https://warontherocks.com/2017/ 01/the-past-present-and-future-of-the-war-for-public-opinion/.

Armstrong, M. (2017b, March 20). A strategic perspective on “information warfare” & “counter- propaganda”. Emerging Threats & Capabilities Subcommittee of the House Armed Services Committee. Retrieved on June 19, 2018, from https://warontherocks.com/2017/ 01/the-past-present-and-future-of-the-war-for-public-opinion/.

Auchard, E., & Ingram, D. (2018, March 20). Cambridge Analytica CEO claims influence on U.S. election, Facebook questioned. Reuters. Retrieved on June 18, 2018, from https://www.reuters.com/article/us-facebook-cambridge-analytica/cambridge-analytica- ceo-claims-influence-on-u-s-election-facebook-questioned-idUSKBN1GW1SG.


Banakar, R. (2000). Reflections on the methodological issues of the sociology of law. Journal of Law and Society, 27(2), pp. 273-295.

Banasik, M. (2017). Russia’s hybrid war in theory and practice. Journal on Baltic Security, 2(1), 157-182. doi: 10.1515/jobs-2016-0035.

Barkin, N. (2018, October 12). Exclusive: Five Eyes intelligence alliance builds coalition to counter China. Reuters World News. Retrieved on March 23, 2019, from https://www.reuters.com/article/us-china-fiveeyes/exclusive-five-eyes-intelligence- alliance-builds-coalition-to-counter-china-idUSKCN1MM0GH?fbclid=IwAR0_ 2YdNPKyh1l2mfEAahw8GxfIZ-nCAqVE8LRxnqKx8xvXpZwtwjB35S8w.

Bazov, G. (2014, July 10). Eyewitness account of atrocities by Ukrainian Nazi Banderovtsy in Slavyansk. Slavyandgrad. Retrieved on June 19, 2018, from https://slavyangrad.org/2014/07/10/atrocities-in-slavyansk/.

Bejesky, R. (2012). Public diplomacy or propaganda? Targeted messages and tardy correction to unverified reporting. Capital University Law Review, 40(1), 697-1052.

Belgrave, L. L., & Seide, K. (2018). Grounded theory methodology: Principles and practices. In P. Liamputtong (Ed.), Handbook of research methods in health social sciences, (pp. 1- 18). Singapore: Springer.

Bernays, E., & Miller, M. C. (2005). Propaganda. Brooklyn, NY: IG Publishing.

Bischof, A., & Jurgens, Z. (2015). Voices of freedom – western interference? 60 years of Radio Free Europe. Göttingen, Germany: Vandenhoeck & Ruprecht.

Bjola, C. (2015). Introduction: Making sense of digital diplomacy. In C. Bjola & M. Holmes (Eds.), Digital diplomacy: Theory and practice (pp. 1-13). New York, NY: Routledge.

Bjola, C., & Jiang, L. (2015). Social media and public diplomacy: A comparative analysis of the digital diplomatic strategies of the EU, US, and Japan in China. In C. Bjola & M. Holmes (Eds.), Digital diplomacy: Theory and practice (pp. 71-88). New York, NY: Routledge.

Blandy, S. (2014). Socio-legal approaches to property law research. Property Law Review, 3(3), pp. 166-175.

Boeije, H. (2002). A purposeful approach to the constant comparative method in the analysis of qualitative interviews. Quality and Quantity, 36(4), pp. 391-409.

Bogner, A., Littig, B., & Menz, W. (2009). Introduction: Expert interviews – an introduction to a new methodological debate. In A. Bogner, B. Littig, & W. Menz (Eds.), Interviewing Experts, (pp. 1-16). New York, NY: Palgrave Macmillan.


Böhm, A. (2004). Theoretical coding: Text analysis in grounded theory. In U. Flick, E. Kardorff, & I. Steinke (Eds.), A companion to qualitative research (pp. 270-275). London, England: Sage.

Boland, G. (2017, December 13). How Golin’s Bridge is connecting brands to the new future of PR. NewsWhip. Retrieved on January 15, 2019, from https://www.newswhip.com /2017/12/golin-bridge/.

Borah, P. (2011). Conceptual issues in framing theory: A systematic examination of a decade’s literature. Journal of Communication, 61(2), 246-263.

Bowen, C. (2006). Grounded theory and sensitizing concepts. International Journal of Qualitative Methods, 5(3), pp. 1-9.

Bradshaw, S., & Howard, P. N. (2018). Challenging truth and trust: A global inventory of organized social media manipulation. Computational Propaganda Research Project. Retrieved on August 22, 2018, from http://comprop.oii.ox.ac.uk/wpcontent/uploads/sites /93/2018/07/ct2018.pdf

Brattberg, E., & Maurer, T. (2018, May 23). Russian election interference: Europe’s counter to fake news and cyber attacks. Carnegie Endowment for International Peace. Retrieved on June 18, 2018, from https://carnegieendowment.org/2018/05/23/russian-election- interference-europe-s-counter-to-fake-news-and-cyber-attacks-pub-76435.

Bruce, M., Peltu, M., & Dutton, W. H. (1999). Society on the line: Information politics in the digital age. New York, NY: Oxford University Press.

Bryman, A. (2012). Social research methods (4th Ed.). Oxford, United Kingdom: Oxford University Press.


Cairns, H. (1941). The theory of legal science. Chapel Hill, NC: University of North Carolina Press.

Carr, E. H. (2001). The twenty years’ crisis, 1919-1939: An introduction to the study of international relations. New York, NY: Perennial.

Cerar, M. (2009). The relationship between law and politics. Annual Survey of International & Comparative Law, 15(1), 23.

Chaffee, S. H., & Roser, C. (1986). Involvement and the consistency of knowledge, attitudes, and behaviors. Communication Research, 13(3), pp. 373-399.


Chappell, B. (2018, May 22). ‘Are you telling the truth?’ European parliament questions mark Zuckerberg. National Public Radio. Retrieved on June 18, 2018, from https://www.npr.org/sections/thetwo-way/2018/05/22/613338380/watch-mark- zuckerberg-speaks-to-european-union-parliament.

Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. London, England: Sage.

Chekinov, S. G., & Bogdanov, S. A. (2010). Asymmetric actions in support of the military security of Russia. Military Thought, 16(3), 19-20.

Chomsky, N. (2016, May 9). Who rules the world? America is no longer the obvious answer. The Guardian. Retrieved on June 20, 2018, from https://www.theguardian.com/us- news/2016/may/09/noam-chomsky-who-rules-the-world-us-foregin-policy.

Cone, S. (1998). Presuming a right to deceive: Radio Free Europe, Radio Liberty, the CIA, and the news media. Journalism History, 24(4), 148-149.

Chong, D., & Druckman, J. (2007). A theory of framing and opinion formation in competitive elite environments. Journal of Communication, 57(1), pp. 99-118.

Committee on Foreign Relations. (2018). Putin’s asymmetric assault on democracy in Russia and Europe: Implications for U.S. national security. United States Senate. Retrieved on June 18, 2018, from https://www.foreign.senate.gov/imo/media/doc/FinalRR.pdf.

Clark, D. D., Sollins, K. R., Wroclawski, J., & Braden, R. (2002). Tussle in cyberspace: Defining tomorrow’s Internet. ACM SIGCOMM Computer Communication Review, 32(4), pp. 347-356.

Clawson, R. A., & Oxley, Z. M. (2017). Public opinion: Democratic ideals, democratic practice. Thousand Oaks, CA: Sage.

Critchlow, J. (2006). Public diplomacy during the Cold War: The record and its implications. Journal of Cold War Studies, 6(1), 75-89. doi: 10.1162/152039704772741597.

Cull, N. J. (2006). “Public diplomacy” before Gullion: The evolution of a phrase. USC Center for Public Diplomacy Blog. Retrieved on July 6, 2018, from https://uscpublicdiplomacy. org/blog/public-diplomacy-gullion-evolution-phrase.

Cull, N. J. (2008). The Cold War and the United States Information Agency. Cambridge, UK: University of Cambridge Press.

Cull, N. (2009b). Public diplomacy before Gullion: The evolution of a phrase. In N. Snow & P. M. Taylor (Eds.), Routledge handbook of public diplomacy (pp.19–23). New York, NY: Routledge.


Chynoweth, P. (2008). Legal research. In A. Knight & L. Ruddock (Eds.), Advanced research methods in the built environment (pp. 28-38). Chichester, UK: Wiley-Blackwell.

Dale, H., & Lord, C. (2007). Public diplomacy and the Cold War: Lessons learned. The Heritage Foundation. Retrieved on July 2, 2018, from https://www.heritage.org/defense/ report/public-diplomacy-and-the-cold-war-lessons-learned.

Dalkey, N., & Helmer, O. (1963). An experimental application of the Delphi method to the use of experts. Management Science, 9(3), pp. 458-467.

Dawson, T. B. (1992). Legal research in a social science setting: The problem of method. Dalhousie Law Journal, 14(3), pp. 445-472.

Davenport, T. H., Eccles, R. G., & Prusak, L. (1992). Information politics. Sloan Management Review. Retrieved on July 18, 2018, from http://www.sims.monash.edu.au/subjects /ims5042/stuff/readings/Davenport_Eccles_Prusak.pdf.

de Carbonnel, A., Macdonald, A., & Vey, J. B. (2019, March 20). EU leaders warn of cyber, fake news threats to May elections. Reuters World News. Retrieved on March 20, 2019, from https://www.reuters.com/article/us-eu-disinformation/eu-leaders-to-warn-of-cyber-fake- news-threat-to-may-elections-idUSKCN1R11QM?fbclid=IwAR2I6zbaqH8RfThuno KO8Bicudwvjbslrqe9qM3AGR8LXD9wJrPftIYUdTs.

de Orellana, P. (2017). ‘You can count on us’: When Malian diplomacy stratcommed Uncle Sam. Defense Strategic Communication, 2(3), 103-137.

Dear, I., & Foot, M. R. D. (2001). Axis Sally. Oxford Companion to World War II. Retrieved on June 24, 2018, from https://www.worldcat.org/title/oxford-companion-to-world-war- ii/oclc/47356289.

Demers, D. (2000). Communication theory in the 21st century: Differentiation and convergence. Mass Communication & Society, 3(1), pp. 1-2. doi: 10.1207/ S15327825MCS0301_01.

Dodd, M. D., & Collins, S. J. (2017). Public relations message strategies and public diplomacy 2.0: An empirical analysis using Central-Eastern European and western embassy Twitter accounts. Public Relations Review, 43(2), 417-425.

Doherty, M. (2000). Nazi wireless propaganda: Lord Haw-Haw and British public opinion. Edinburgh, Scotland: Edinburgh University Press.

Doherty, C., Kiley, J., & Johnson, B. (2018). Elections in America: Concerns over security, divisions over expanding access to voting. Pew Research Center. Retrieved on March 19, 2019 from http://www.people-press.org/2018/10/29/elections-in-america-concerns-over- security-divisions-over-expanding-access-to-voting/.


Dua, S., & Du, X. (2016). Data mining and machine learning in cybersecurity. Boca Raton, FL: Auerbach Publications.

Dunbar, D., Proeve, M., & Roberts, R. (2018). Problematic Internet usage self-control dilemmas: The opposite effects of commitment and progress framing cues on perceived value of Internet, academic, and social behaviors. Computers in Human Behavior, 82, pp. 16-33. doi: 10.1016/j.chb.2017.12.039

Dwork, C., McSherry, F., Nissim, K., & Smith, A. (2006). Calibrating noise to sensitivity in private data analysis. In Theory of Cryptography Conference (pp. 265-284). Springer, Berlin, Heidelberg.

Echevarria, A. J. II (2007). Clausewitz and contemporary war. Oxford, England: Oxford University Press.

Edward R. Murrow Center of Public Diplomacy. (2018). Definitions of public diplomacy. The Fletcher School. Retrieved on July 6, 2018, from https://web.archive.org/web/ 20100617004930/http://fletcher.tufts.edu/murrow/pd/definitions.html.

Edwards, B. (2004). Edward R. Murrow broadcast from London (September 21, 1940). Library of Congress. Retrieved on June 20, 2018, from https://www.loc.gov/programs/static /national-recording-preservation-board/documents/murrow.pdf.

Eichorn, J. (2018, April 25). Life under #GDPR and what it means for cybersecurity. InfoSecurity Group. Retrieved on July 17, 2018, from https://www.infosecurity- magazine.com/opinions/life-gdpr-cybersecurity/.

Entman, R. M. (1993). Framing: Toward clarification of a fractured paradigm. Journal of Communication, 43(4), pp. 51-58. doi: 10.1111/j.1460-2466.1993.tb01304.x

Euromaidan. (2015, August 25). Monument to Russian media fakes unveiled in Ukraine. Euromaidan Press. Retrieved on June 19, 2018, from http://euromaidanpress.com /2015/08/25/monument-to-russian-media-fakes-unveiled-in-ukraine/.

European Commission. (2018, November 26). European Commission survey shows citizens worry about interference ahead of European elections. European Commission Press Release Database. Retrieved on March 19, 2019, from http://europa.eu/rapid/press- release_IP-18-6522_en.htm.

European Parliament. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Official Journal of the European Union. Retrieved on December 10, 2018, from https://publications.europa.eu/en/publication-detail/-/publication/3e485e15- 11bd-11e6-ba9a-01aa75ed71a1/language-en.


Fan, Y. (2010). Branding the nation: Towards a better understanding. Place Branding and Public Diplomacy, 6(2), 97-103.

Farndale, N. (2005). Haw-Haw: The Tragedy of William and Margaret Joyce. London, England: Macmillan.

Farwell, J. P. (2012). Persuasion and power: The art of strategic communication. Washington, D.C.: Georgetown University Press.

Farwell, J. P. (2017). A closer look at Yemen. Defense Strategic Communications, 2(2), 191- 202.

Fernandez, W. (2004). The grounded theory method and case study data in IS research: Issues and designs. In D. Hart & S. Gregor (Eds.), Information systems foundations: Constructing and criticizing, (pp. 43-59). Canberra, Australia: ANU Press.

Fidler, D. P. (2013, February 7). Internet governance and international law: The controversy concerning revision of the International Telecommunication Regulations. American Society of International Law. Retrieved on July 16, 2018, from https://www.asil.org/ insights/volume/17/issue/6/internet-governance-and-international-law-controversy- concerning-revision.

Fidler, D. P. (2016). Cyberspace and human rights. In N. Tsagourias & R. Buchan (Eds.), Research handbook on international law and cyberspace, (pp. 94-117). Cheltenham, England: Edward Elgar.

Fitzpatrick, K. R. (2007). Advancing the new public diplomacy: A public relations perspective. The Hague Journal of Diplomacy, 2(3), 187-211.

Fletcher, A. J., & Marchildon, G. P. (2014). Using the Delphi method for qualitative, participatory action research in health leadership. International Journal of Qualitative Methods, 13(1), pp. 1-18.

Freberg, K. (2019). Social media for strategic communication. Los Angeles, CA: Sage.

Freedman, L. (2006). The transformation of strategic affairs. New York, NY: Routledge.

Freeman, B. R. (2006). The role of public diplomacy, public affairs, and psychological operations in strategic information operations (Master’s thesis). Retrieved from Calhoun: The Naval Postgraduate School Institutional Archive.

Fridman, O. (2017). The Russian perspective on information warfare: Conceptual roots and politicisation in Russian academic, political, and public discourse. Defense Strategic Communications, 2(2), 61-86.


Gaffney, A. M., Tomory, J. J., & Gold, G. J. (2016). The endorsement of commentator opinion: A case of manufactured consent. Psychology of Popular Media Culture, 5(2), 194-202.

Galeotti, M. (2018, March 5). I’m sorry for creating the ‘Gerasimov Doctrine’. Foreign Policy. Retrieved on July 3, 2018, from https://foreignpolicy.com/2018/03/05/im-sorry-for- creating-the-gerasimov-doctrine/.

Gardner, J. A. (1961). The sociological jurisprudence of Roscoe Pound. Villanova Law Review, 7(1), pp. 1-26.

GDPR Key Changes. (2018). GDPR FAQs. The European Union. Retrieved on January 10, 2019, from https://eugdpr.org/the-regulation/.

Gerasimov, V. (2013, February 26). The value of science is in foresight [Ценность науки в предвидении]. Military-Industrial Courier [Военно-промышленный курьер]. Retrieved on July 3, 2018, from https://www.vpk-news.ru/articles/14632.

Gilboa, E. (2008). Searching for a theory of public diplomacy. The Annals of the American Academy of Political and Social Science, 616(1), 55-77.

Gilboa, E. (2015). Public diplomacy. In G. Mazzoleni (Ed.) The International Encyclopedia of Political Communication (1st Ed.). John Wiley & Sons. doi: 10.1002/9781118541555.wbiepc232

Glaser, B. G. (1965). The constant comparative method for qualitative analysis. Social Problems, 12(4), pp. 436-445.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. New Brunswick: Aldine Transaction.

Glassman, J. K. (2010, March 10). How to win the war of ideas. Foreign Policy. Retrieved on June 19, 2018, from http://foreignpolicy.com/2010/03/10/how-to-win-the-war-of-ideas/.

Glowacki, M., Narayanan, V., Maynard, S., Hirsch, G., Kollanyi, B., Neudert, L., Howard, P., Lederer, T., & Barash, V. (2018). News and political information consumption in Mexico: Mapping the 2018 Mexican presidential election on Twitter and Facebook. Computational Propaganda Research Project. Retrieved on January 14, 2019, from http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2018/06/Mexico2018.pdf.

Glynn, C. J., Herbst, S., Lindeman, M., O’Keefe, G. J., & Shapiro, R. Y. (2015). Public opinion (3rd Ed.). New York, NY: Taylor & Francis.

Golan, G. J., & Yang, S. U. (2015). Introduction: The integrated public diplomacy perspective. In G.J. Golan, S.U. Yang, & D. Kinsey (Eds.), International public relations and public diplomacy: Communication and engagement (pp. 1-14). New York: Peter Lang.

Goldstein, F. L. (1996). Psychological operations: Principles and case studies. Montgomery, AL: Air University Press.

Greenwald, G. (2018, January 10). First France, now Brazil unveils plan to empower the government to censor the Internet in the name of stopping “fake news.” The Intercept. Retrieved on March 19, 2019, from https://theintercept.com/2018/01/10/first-france-now-brazil-unveils-plans-to-empower-the-government-to-censure-the-internet-in-the-name-of-stopping-fake-news/.

Guay, J., & Rudnick, L. (2017, June 25). What the Digital Geneva Convention means for the future of humanitarian action. United Nations High Commissioner for Refugees. Retrieved on January 8, 2019, from https://www.unhcr.org/innovation/digital-geneva-convention-mean-future-humanitarian-action/.

Hall, K. H., & Kenski, K. (2017). Political communication: Then, now and beyond. In K. Kenski & K. H. Jamieson (Eds.) The Oxford Handbook of Political Communication. doi: 10.1093/oxfordhb/9780199793471.001.0001.

Hallahan, K. (2004). Communication management. In R. L. Heath (Ed.), Encyclopedia of public relations. Thousand Oaks, CA: Sage.

Hallahan, K., Holtzhausen, D., van Ruler, B., Verčič, D., & Sriramesh, K. (2007). Defining strategic communication. International Journal of Strategic Communication, 1(1), p. 3- 35. doi: 10.1080/15531180701285244

Hallams, E. (2010). Digital diplomacy: The Internet, the battle for ideas and US foreign policy. CEU Political Science Journal, 5(4), pp. 538-574.

Halliday, S., & Schmidt, P. (2009). Beyond methods: Law and society in action. In S. Halliday & P. Schmidt (Eds.), Conducting law and society research: Reflections on methods and practices (pp. 1-13). Cambridge, UK: Cambridge University Press.

Handy, F. J. (2016). European Union in the age of misleading communications: Insights on disinformation and propaganda. Romanian Journal of Journalism & Communication, 11(4), 36-44.

Hanson, E. C. (2008). The information revolution and world politics. Lanham: Rowman & Littlefield.

Harrison, G. K. (2013). Environmental determinism. Salem Press Encyclopedia.

Hartig, F. (2011). Confucius Institutes and the rise of China. Journal of Chinese Political Science, 17(1), pp. 53-76.

Hartig, F. (2016a). Chinese public diplomacy: The rise of the Confucius Institute. New York, NY: Routledge.

Hartley, D. S. (2015). An ontology for unconventional conflict. Cham, Switzerland: Springer. doi: 10.1007/978-3-319-75337-9

Hawkins, D. (2018, May 25). The cybersecurity 202: Why a privacy law like GDPR would be a tough sell in the U.S. The Washington Post. Retrieved on July 17, 2018, from https://www.washingtonpost.com/news/powerpost/paloma/the-cybersecurity-202/2018/05/25/the-cybersecurity-202-why-a-privacy-law-like-gdpr-would-be-a-tough-sell-in-the-u-s/5b07038b1b326b492dd07e83/?utm_term=.21c23ffc96c1.

Hayden, C. (2009). Applied public diplomacy: A marketing communications exchange program in Saudi Arabia. American Behavioral Scientist, 53(4), pp. 533-548. doi: 10.1177/0002764209347629

Hendrix, M. (2016). Google’s ever-growing impact on the global economy. U.S. Chamber of Commerce Foundation. Retrieved on June 20, 2018, from https://www.uschamberfoundation.org/blog/post/googles-ever-growing-impact-global-economy.

Henisz, W. J. (2017). Corporate diplomacy: Building reputations and relationships with external stakeholders. London, England: Routledge.

Hillson, R. (2009). The DIME/PMESII model suite requirements project. U.S. Naval Research Laboratory Review. Retrieved on July 18, 2018, from https://www.nrl.navy.mil/content_images/09_Simulation_Hillson.pdf.

Ho, E. L. E., & McConnell, F. (2017). Conceptualizing ‘diaspora diplomacy’: Territory and population betwixt the domestic and foreign. Progress in Human Geography, doi: 10.1177/0309132517740217.

Holbrooke, R. (2001, October 28). Get the message out. The Washington Post. Retrieved on July 7, 2018, from http://www.washingtonpost.com/wpdyn/content/article/2010/12/13/AR2010121305410.html.

Holmes, O. W. (2005). The common law. Clark, NJ: Lawbook Exchange.

Holt, R. T. (1958). Radio Free Europe. Minneapolis, MN: University of Minnesota Press.

Hopkins, A. E. (2015). Government public relations: Public diplomacy or propaganda? Inquiries Journal, 7(3), Retrieved on August 10, 2018, from http://www.inquiriesjournal.com/articles/1012/government-public-relations-public-diplomacy-or-propaganda.

Howard, M., & Paret, P. (Eds.). (1976). Carl von Clausewitz: On war. Princeton, NJ: Princeton University Press.

Howard, N. (2012). Diplomatic, information, military, and economic power (DIME): An effects modeling system. National Conference on Computing and Communication Systems. Retrieved on July 18, 2018, from https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6413032.

Howard, P. N. (2017, June 15). Is social media killing democracy? Computational propaganda, algorithms, automation and public life. Oxford Internet Institute. Retrieved on December 3, 2018, from https://www.youtube.com/watch?v=J1kXdA61AQY.

Howard, P. N., & Kollanyi, B. (2016, June 21). Bots, #StrongerIn, and #Brexit: Computational propaganda during the UK-EU referendum. Social Science Research Network. doi: 10.2139/ssrn.2798311.

Ingram, D. (2018, March 19). Factbox: Who is Cambridge Analytica and what did it do? Reuters. Retrieved on June 18, 2018, from https://www.reuters.com/article/us-facebook-cambridge-analytica-factbox/factbox-who-is-cambridge-analytica-and-what-did-it-do-idUSKBN1GW07F.

Ingram, D., & Henderson, P. (2018, March 16). Trump consultants harvested data from 50 million Facebook users: report. Reuters. Retrieved on June 18, 2018, from https://www.reuters.com/article/us-facebook-cambridge-analytica/trump-consultants-harvested-data-from-50-million-facebook-users-reports-idUSKCN1GT02Y.

International Covenant on Civil and Political Rights. (1966). United Nations Human Rights Office of the High Commissioner. Retrieved on July 17, 2018, from https://www.ohchr.org/en/professionalinterest/pages/CCPR.aspx.

Isaacson, W. (2010, September 28). America's voice must be credible and must be heard. Celebrating 60 Years of RFE. Retrieved on June 23, 2018, from http://docs.rferl.org/en- US/2010/09/29/100928%20rferl-isaacson.pdf

Iwabuchi, K. (2015). Pop-culture diplomacy in Japan: Soft power, nation branding and the question of ‘international cultural exchange’. International Journal of Cultural Policy, 21(4), pp. 419-432. doi: 10.1080/10286632.2015.1042469.

Jackson, S. J., & Dawson, M. C. (2017). The IOC-State-Corporate Nexus: Corporate diplomacy and the Olympic coup d’état. South African Journal for Research in Sport, Physical Education and Recreation, 39(1), pp. 101-111.

Johnson, J. B. (2017, December). A new challenge for public diplomacy. Public Diplomacy Council. Retrieved on February 26, 2018, from http://www.publicdiplomacycouncil.org/2017/12/26/new-challenge-public-diplomacy/?utm_content=bufferb9a61&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

Joint Publication 3-13 (2012). Information operations. Chairman of the Joint Chiefs of Staff. Retrieved on October 22, 2018, from http://www.jcs.mil/Portals/36/Documents/Doctrine/pubs/jp3_13.pdf

Joint Publication 5-0. (2017). Joint planning. Chairman of the Joint Chiefs of Staff. Retrieved on July 18, 2018, from http://www.jcs.mil/Portals/36/Documents/Doctrine/pubs/jp5_0_20171606.pdf.

Jordan, T. (2015). Information politics: Liberation and exploitation in the digital age. London, England: Pluto Press.

Jowett, G. S., & O’Donnell, V. J. (2012). Propaganda and persuasion (5th Ed.). Thousand Oaks, CA: Sage.

Kajimoto, M. (2018, March 14). In east and southeast Asia, misinformation is a visible and growing concern. Poynter. Retrieved on March 19, 2019, from https://www.poynter.org/fact-checking/2018/in-east-and-southeast-asia-misinformation-is-a-visible-and-growing-concern/.

Kampf, R., Manor, I., & Segev, E. (2015). Digital diplomacy 2.0? A cross-national comparison of public engagement in Facebook and Twitter. The Hague Journal of Diplomacy, 10(4), pp. 331-362, doi: 10.1163/1871191X-12341318.

Kaufman, E. (2002). A broadcasting strategy to win media wars. The Washington Quarterly, 25(2), 115–127.

Kearney, M. G. (2007). The prohibition of propaganda for war in international law. Oxford, England: Oxford University Press.

Kenny, M. (2008). Germany calling: A personal biography of William Joyce Lord Haw-Haw. Dublin, Ireland: New Island Books.

Keohane, R. O. (1988). International institutions: Two optics. International Studies Quarterly, 32(4), pp. 379-396.

Khan, H. (2016, May 11). Is public opinion a bigger superpower than the US? The Express Tribune. Retrieved on June 20, 2018, from https://tribune.com.pk/story/1100498/is-public-opinion-a-bigger-superpower-than-the-us/.

Knotten, V., Hansen, G. K., Svalestuen, F., & Laedre, O. (2017). Learning across disciplines: Use of the constant comparative method. In M. Buser, G. Lindhal, & C. Raisanen (Eds.), Proceedings of the 9th Nordic Conference on Construction Economics and Organization, (pp. 273-284). Lyngby, Denmark: Polyteknisk, Boghandel & Forlag

Kochar, S. (2018). Corporate Diplomacy as an Engagement Strategy of the Nonmarket Business Environment. In K. A. Johnson & M. Taylor (Eds.), The Handbook of Communication Engagement, (pp. 347-356). Cambridge, England: Wiley.

Kochhar, S., & Molleda, J.C. (2015). The evolving links between international public relations and corporate diplomacy. In G.J. Golan, S.U. Yang, & D. Kinsey (Eds.), International public relations and public diplomacy: Communication and engagement (pp. 51-71). New York: Peter Lang.

Kolb, S. M. (2012). Grounded theory and the constant comparative method: Valid research strategies for educators. Journal of Emerging Trends in Educational Research and Policy Studies, 3(1), pp. 83-86.

Korzak, E. (2017, September 5). The outcome of the 2016/2017 UN GGE on information security: The end of an era? EastWest Institute. Retrieved on January 10, 2019, from https://www.eastwest.ngo/idea/outcome-20162017-un-gge-information-security-end-era.

Kottasová, I. (2018, May 1). Why Mark Zuckerberg may want to avoid the UK. CNN Tech. Retrieved on June 18, 2018, from http://money.cnn.com/2018/05/01/technology/facebook-zuckerberg-uk-parliament/index.html.

Kozloski, R. (2009). The information domain as an element of national power. Strategic Insights, 8(1).

Krause, P., & van Evera, S. (2009). Public diplomacy: Ideas for the war on ideas. Middle East Policy, 16(3).

Kraut, R. R., & Burke, M. M. (2015). Internet use and psychological well-being: Effects of activity and audience. Communications of the ACM, 58(12), pp. 94-100. doi: 10.1145/2739043

Krieg, A. (2014, December 31). Opinion: Is America’s reign as a superpower ending? CNN News. Retrieved on June 20, 2018, from https://www.cnn.com/2014/12/12/opinion/krieg-america-superpower/index.html.

Kurki, M. (2008). Causation in international relations. Cambridge, England: Cambridge University Press.

La Porte, T. (2012). The impact of ‘intermestic’ non-state actors on the conceptual framework of public diplomacy. The Hague Journal of Diplomacy, 7(4), pp. 441-458. doi: 10.1163/1871191X-12341241

Land, M. (2013). Toward an international law of the Internet. Harvard International Law Journal, 54(2), 393-458.

Lee, S. T., & Lin, J. (2017). An integrated approach to public diplomacy and public relations: A five-year analysis of the information subsidies of the United States, China, and Singapore. International Journal of Strategic Communication, 11(1), 1-17.

Lehmann, H. (2010). Grounded theory and information systems: Are we missing the point? In R. H. Sprague (Ed.), Proceedings of the 43rd Hawaii International Conference on System Sciences. Los Alamitos, CA: IEEE Computer Society Press.

L’Etang, J. (2009). Public relations and diplomacy in a globalized world: An issue of public communication. American Behavioral Scientist, 53(4), 607-626.

Lippmann, W. (1957). Public opinion. New York, NY: Macmillan.

Loomis, A., Powers, S., & Rahimi, J. (2018). 2018 Comprehensive annual report on public diplomacy and international broadcasting: Focus on the fiscal year 2017 budget data. U.S. Advisory Commission on Public Diplomacy. Retrieved on January 15, 2019, from https://www.state.gov/documents/organization/287682.pdf?fbclid=IwAR3_dU1InS47fepNCl_nQFY_Oo_-91s_cq-BmV8d80_QLiU5QD6zXXmBC7I.

Lowery, S. A., & DeFleur, M. L. (1988). Milestones in mass communication research: Media effects. White Plains, NY: Longman.

Lucas, R. (2010). Axis Sally: The American voice of Nazi Germany. Philadelphia, PA: Casemate.

National Intelligence Council. (2017). “Assessing Russian Activities and Intentions in Recent US Elections”: The Analytic Process and Cyber Incident Attribution. United States Director of National Intelligence. Retrieved on June 18, 2018, from https://www.dni.gov/files/documents/ICA_2017_01.pdf.

Neuendorf, K. A. (2002). The content analysis guidebook. Thousand Oaks, CA: Sage.

Nye, J. S. (1990). Soft power. Foreign Policy, 80(3), 153-171. doi: 10.2307/1148580.

Nye, J. S. (2004). Soft power: The means to success in world politics. New York, NY: Perseus Book Group.

Nye, J. S. (2009). Get smart: Combining hard and soft power. Foreign Affairs, 88(4).

NSA. (2019). UKUSA agreement release 1940-1956. National Security Agency. Retrieved on March 23, 2019, from https://www.nsa.gov/news-features/declassified-documents/ukusa/.

Macnamara, J. (2015). Book review: International public relations and public diplomacy: Communication and Engagement. Journalism and Mass Communication Quarterly, 92(4), p. 1005-1007.

Manheim, J. (1994). Strategic public relations and American foreign policy: The evolution of influence. New York, NY: Oxford University Press.

Manor, I. (2016). Are we there yet: have MFAs realized the potential of digital diplomacy? Diplomacy and Foreign Policy 1(2), pp. 1-110.

Marchal, N. (2018, November 20). Unpacking France’s “Mission Civilisatrice” to tame disinformation on Facebook. Council on Foreign Relations. Retrieved on January 8, 2019, from https://www.cfr.org/blog/unpacking-frances-mission-civilisatrice-tame-disinformation-facebook.

Martelle, M. (2018, August 13). Joint Task Force ARES and Operation GLOWING SYMPHONY: Cyber Command’s Internet war against ISIL. National Security Archive. Retrieved on January 7, 2019, from https://nsarchive.gwu.edu/briefing-book/cyber-vault/2018-08-13/joint-task-force-ares-operation-glowing-symphony-cyber-commands-internet-war-against-isil.

Martinez, L. A. (2014). Biological determinism. Salem Press Encyclopedia of Health.

Mather, L. (2011). Law and society. In R. E. Goodin (Ed.) The Oxford Handbook of Political Science. doi: 10.1093/oxfordhb/9780199604456.013.0015

McCrudden, C. (2006). Legal research and the social sciences. Law Quarterly Review, 122(1), pp. 632-650.

McDonnell, J. P. (2009). National strategic planning: Linking DIMEFIL/PMESII to a theory of victory. Joint Forces Staff College. Retrieved on July 18, 2018, from http://www.dtic.mil/dtic/tr/fulltext/u2/a530210.pdf

McKew, M. K. (2017, January). Putin’s real long game. Politico. Retrieved on July 3, 2018, from https://www.politico.eu/article/putin-trump-sanctions-news-hacking-analysis/.

McLuhan, M. (1964). Understanding media: The extensions of man. Cambridge, MA: MIT Press.

McQuail, D. (2010). McQuail’s mass communication theory (6th Ed.). Los Angeles, CA: Sage.

Mearsheimer, J. J. (2014). The tragedy of great power politics (2nd Ed.). New York, NY: Norton & Company

Mehnaz, G. (2015). Public diplomacy or propaganda: A case study of VOA Deewa Pashto Radio Service for the tribal region of Pakistan and Afghanistan (Doctoral dissertation). Retrieved from DigiNole: Florida State University Digital Repository.

Meuser, M., & Nagel, U. (2005). ExpertInneninterviews – vielfach erprobt, wenig bedacht [Expert interviews – frequently tested, seldom considered]. In A. Bogner, B. Littig, & W. Menz (Eds.), Das Experteninterview – Theorie, Methode, Anwendung [The expert interview – theory, method, application] (2nd Ed., pp. 71-93). Wiesbaden: Verlag für Sozialwissenschaften.

Microsoft Policy Papers. (2017). A Digital Geneva Convention to protect cyberspace. Microsoft. Retrieved on January 8, 2019, from https://www.microsoft.com/en-us/cybersecurity/content-hub/a-digital-geneva-convention-to-protect-cyberspace

Miller, B. (2017, December 4). There’s no need to compel speech. The marketplace of ideas is working. Forbes. Retrieved on January 7, 2019, from https://www.forbes.com/sites/briankmiller/2017/12/04/theres-no-need-to-compel-speech-the-marketplace-of-ideas-is-working/#67fef5a74e68.

Mills, J., Bonner, A., & Francis, K. (2006). The development of constructivist grounded theory. International Journal of Qualitative Methods, 5(1), pp. 1-10.

Ministry of Defence. (2012). Joint doctrine note 1/12. Strategic communication: The defence contribution. Retrieved on June 25, 2018, from https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/33710/20120126jdn112_Strategic_CommsU.pdf.

Miskimmon, A., O’Loughlin, B., & Roselle, L. (2013). Strategic narratives: Communication power and the new world order. New York, NY: Routledge.

Misyuk, I. (2013). Propaganda and public diplomacy: The problem of differentiation. Paper presented at the Humanities & Social Sciences conference in Lviv, Ukraine. Retrieved on August 10, 2018, from http://ena.lp.edu.ua/bitstream/ntb/24199/4/29-76-77.pdf

Mogensen, K. (2017). From public diplomacy to corporate public diplomacy. Public Relations Review, 43(3), 605-614.

Molleda, J.C. (2011). Global political public relations, public diplomacy, and corporate foreign policy. In S. Kiousis, & J. Strömbäck (Eds.), Political public relations: Principles and applications (pp. 274-292). New York, NY: Routledge.

Morse, J. M. (2009). Tussles, tensions, and resolutions. In J. M. Morse, P. Stern, J. Corbin, B. Bowers, & A. Clarke (Eds.), Developing grounded theory: The second generation, (pp. 13-22). Walnut Creek, CA: Left Coast.

Murti, B., & Zaharna, R. S. (2014). India’s diaspora diplomacy: Operationalizing collaborative public diplomacy for social media. Exchange: The Journal of Public Diplomacy, 5(1), pp. 3-29.

Okadar, G. (2017, July 31). How frequency of exposure can maximize the resonance of your digital campaigns. Nielsen. Retrieved on June 25, 2018, from http://www.nielsen.com/nz/en/insights/news/2017/how-frequency-of-exposure-can-maximise-the-resonance-of-your-digital-campaigns.html

Opalek, K. (1984). Integration between legal research and social science. In A. Peczenik, L. Lindahl, & B. V. Roermund (Eds.), Theory of legal science (pp. 531-550). Dordrecht, Netherlands: Reidel.

Ordeix-Rigo, E., & Duarte, J. (2009). From public diplomacy to corporate diplomacy: Increasing corporation’s legitimacy and influence. American Behavioral Scientist, 53(4), pp. 549-564.

Osgood, K. (2017). Propaganda and public diplomacy. In C. J. Pach (Ed.), A companion to Dwight D. Eisenhower, (pp. 370-393). Malden, MA: Wiley & Sons.

Özdora, E., & Molleda, J.C. (2014). Immigrant integration through public relations and public diplomacy: An analysis of the Turkish diaspora in the capital of the European Union. Turkish Studies 15(2), 220-241.

Pamment, J. (2013a). New public diplomacy in the 21st century: A comparative study of policy and practice. London, England: Routledge.

Pamment, J. (2014). Articulating influence: Toward a research agenda for interpreting the evaluation of soft power, public diplomacy and nation brands. Public Relations Review, 40(1), pp. 50-59. doi: 10.1016/j.pubrev.2013.11.019.

Pamment, J., Olofsson, A., & Hjorth-Jenssen, R. (2017). The response of Swedish and Norwegian public diplomacy and national branding actors to the refugee crisis. Journal of Communication Management, 21(4), pp. 326-341. doi: 10.1108/JCOM-03-2017-0040.

Penney, J. W. (2012). Communications disruption and censorship under international law: History lessons. 2nd USENIX Workshop on Free and Open Communications on the Internet. Retrieved on July 17, 2018, from https://www.usenix.org/conference/foci12/workshop-program/presentation/penney.

Pfeffer, J. (1992). Managing with power: Politics and influence in organizations. Boston, MA: Harvard Business School Press.

Potter, E. H. (2009). Branding Canada: Projecting Canada’s soft power through public diplomacy. London, England: McGill-Queen’s University Press.

Potter, E. H. (2018). The evolving complementarity of nation-branding and public diplomacy: Projecting the Canada brand through “weibo diplomacy” in China. Canadian Foreign Policy Journal, 24(1). doi: 10.1080/11926422.2018.1469523.

Potter, W. J. (2012). Media effects. Thousand Oaks, CA: Sage.

Potter, W. J., Cooper, R., & Dupagne, M. (1993). The three paradigms of mass media research in mainstream communication journals. Communication Theory, 3(4), pp. 317-355.

Powers, S., & Samuel-Azran, T. (2015). Conceptualizing international broadcasting as information intervention. In G. J. Golan, S. Yang, & D. F. Kinsey (Eds.), International public relations and public diplomacy: Communication and engagement (pp. 245-266). New York, NY: Peter Lang.

Prados, J. (2009). Safe for democracy: The secret wars of the CIA. Lanham, MD: Ivan R. Dee.

Pratkanis, A., & Aronson, E. (2001). Age of propaganda: The everyday use and abuse of persuasion (Revised Ed.). New York, NY: Holt Paperbacks.

Pruce, J. R., & Budabin, A. C. (2016). Beyond naming and shaming: New modalities of information politics in human rights. Journal of Human Rights, 15(3), pp. 408-425. doi: 10.1080/14754835.2016.1153412

Puddington, A. (2003). Broadcasting freedom: The Cold War triumph of Radio Free Europe and Radio Liberty. Lexington, KY: University Press of Kentucky.

Rasmussen, R. K., & Merkelsen, H. (2012). The new PR of states: How nation branding practices affect the security function of public diplomacy. Public Relations Review, 38(5), pp. 810-818. doi: 10.1016/j.pubrev.2012.06.007.

Rawnsley, G. D. (1996). Radio diplomacy and propaganda: The BBC and VOA in international politics, 1956-64. New York: Palgrave Macmillan.

Reilly, R. R. (2007). Winning the war of ideas. Claremont Institute. Retrieved on June 19, 2018, from http://www.claremont.org/crb/article/winning-the-war-of-ideas/.

Richard, L. (2010). With a sweet kiss from Sally: Fantasy and reality collided when Allied investigators hunted down the seductive Nazi broadcaster known to GIs as Axis Sally. World War II, 24(5), p. 48.

Richards, D. (1996). Elite interviewing: Approaches and pitfalls. Politics, 16(3), pp. 199-204.

Richter, S. (2007, March 3). The POP (public opinion poll) superpower. The Globalist. Retrieved on June 20, 2018, from https://www.theglobalist.com/pop-public-opinion-poll-superpower/.

Rivlin, A. M., & Litan, R. E. (2001). The economy and the Internet: What lies ahead? Brookings Institution. Retrieved on June 20, 2018, from https://www.brookings.edu/research/the-economy-and-the-internet-what-lies-ahead/.

Romm, T. (2018, April 11). Facebook’s Zuckerberg just survived 10 hours of questioning by Congress. The Washington Post. Retrieved on June 18, 2018, from https://www.washingtonpost.com/news/the-switch/wp/2018/04/11/zuckerberg-facebook-hearing-congress-house-testimony/?noredirect=on&utm_term=.5a0232d98eda.

Roose, K. (2018, July 19). U.S.-funded broadcaster directed ads to Americans. The New York Times. Retrieved on August 20, 2018, from https://www.nytimes.com/2018/07/19/technology/facebook-ads-propaganda.html.

Saari, S. (2014). Russia’s post-orange revolution strategies to increase its influence in former Soviet republics: Public diplomacy po russkii. Europe-Asia Studies, 66(1), pp. 50-66. doi: 10.1080/09668136.2013.864109.

Sasse, G. (2017, May 5). Revisiting the 2014 annexation of Crimea. Carnegie Europe. Retrieved on June 19, 2018, from https://carnegieeurope.eu/2017/03/15/revisiting-2014-annexation-of-crimea-pub-68423.

Satloff, R. (2004). The battle of ideas in the War on Terror: Essays on U.S. public diplomacy in the Middle East. Washington D.C.: Washington Institute for Near East Policy.

Savelsberg, J. L., & Cleveland, L. L. (2017). Law and society. Oxford Bibliographies. Retrieved on June 18, 2018, from http://www.oxfordbibliographies.com/view/document/obo-9780199756384/obo-9780199756384-0113.xml.

Schäfer, F., Evert, S., & Heinrich, P. (2017). Japan’s 2014 General Election: Political bots, right-wing Internet activism, and Prime Minister Shinzo Abe’s hidden nationalist agenda. Big Data, 5(4), 294-309. doi: 10.1089/big.2017.0049

Schiff, D. N. (1976). Socio-legal theory: Social structure and law. The Modern Law Review, 39(3), pp. 287-310.

Schindler, C. (2016). Proactively preserving the inward quiet: Public diplomacy and NATO. Defence Strategic Communications, 1(1), pp. 133-146.

Shoemaker, P. J., Tankard, J. W., & Lasorsa, D. L. (2003). How to build social science theories. London, England: Sage.

Scott, D. M. (2012, April 12). GolinHarris shows how an agency does real-time communications right. David Meerman Scott. Retrieved on January 15, 2019, from https://www.davidmeermanscott.com/blog/2012/04/golinharris-shows-how-an-agency-does-real-time-communications-right.html.

Scott, J. (1990). A matter of record. Cambridge, England: Polity.

Seger, A. (2016, December 7). The Budapest Convention on Cybercrime: A framework for capacity building. Global Forum on Cyber Expertise. Retrieved on July 17, 2018, from https://www.thegfce.com/news/news/2016/12/07/budapest-convention-on-cybercrime.

Seidel, S., & Urquhart, C. (2013). On emergence and forcing in information systems grounded theory studies: The case of Strauss and Corbin. Journal of Information Technology, 28(3), pp. 237-260.

Seppälä, T. (2012). Globalizing resistance against war: Theories of resistance and the new anti-war movement. New York, NY: Routledge.

Shalini, S. (2016, March 3). Budapest Convention on cybercrime: An overview. Center for Communication Governance. Retrieved on July 17, 2018, from https://ccgnludelhi.wordpress.com/2016/03/03/budapest-convention-on-cybercrime-an-overview/.

Sharma, S. (2010). Reviewing NGOs’ media strategies: Possibilities for NGO-media collaboration. International NGO Journal, 5(4), pp. 84-87.

Shields, M. (2016, February 18). An oral history of the first presidential campaign websites in 1996. The Wall Street Journal. Retrieved on June 20, 2018, from https://www.wsj.com/articles/an-oral-history-of-the-first-presidential-campaign-websites-in-1996-1455831487.

Shotter, J. (2018, January 8). Czechs fear Russian fake news in presidential election. Financial Times. Retrieved on June 18, 2018, from https://www.ft.com/content/c2b36cf0-e715-11e7-8b99-0191e45377ec.

Signitzer, B. H., & Coombs, T. (1992). Public relations and public diplomacy: Conceptual convergences. Public Relations Review, 18(2), 137-147.

Signitzer, B. H., & Wasmer, C. (2006). Public diplomacy: A specific government public relations function. In I. C. H. Botan & V. Hazelton (Eds.), Public relations theory II (pp. 435-464). Mahwah, NJ: Lawrence Erlbaum Associates.

Šlapkauskas, V. (2010). The significance of the sociological approach to law for the development of jurisprudence. Societal Studies, 4(8), pp. 167-181.

Slaughter, A.S. (1993). International law and international relations theory: A dual agenda. The American Journal of International Law, 87(2), 205-239.

Snow, N., & Taylor, P. M. (2006). The revival of the propaganda state: US propaganda at home and abroad since 9/11. International Communication Gazette, 68(1), 389-407. doi: 10.1177/1748048506068718.

Smith, A. (2018). Public attitudes toward computer algorithms. Pew Research Center. Retrieved on March 21, 2019, from https://www.pewinternet.org/2018/11/16/public-attitudes-toward-computer-algorithms/.

Smith, J. (2009). The rise of global public opinion. In G. Modelski and R. A. Denemark (Eds.) World System History: Encyclopedia of Life Support Systems. New York, NY: EOLSS Publications.

197

Soesanto, S., & D’Incau, F. (2017, August 15). The UN GGE is dead: Time to fall forward. European Council on Foreign Relations. Retrieved on July 17, 2018, from https://www.ecfr.eu/article/commentary_time_to_fall_forward_on_cyber_governance#_ftn1.

Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. London, England: Sage.

Stelzenmüller, C. (2017, June 28). The impact of Russian interference on Germany’s 2017 elections. Brookings Institution. Retrieved on June 18, 2018, from https://www.brookings.edu/testimonies/the-impact-of-russian-interference-on-germanys-2017-elections/.

Strömbäck, J., & Kiousis, S. (2011). Political public relations: Defining and mapping an emergent field. In J. Strömbäck and S. Kiousis (Eds.) Political public relations: Principles and applications (pp. 1-32). New York, NY: Routledge.

Szondi, G. (2010). From image management to relationship building: A public relations approach to nation branding. Place Branding and Public Diplomacy, 6(4), pp. 333-343.

Szostek, J. (2017). The power and limits of Russia’s strategic narrative in Ukraine: The role of linkage. Perspectives on Politics, 15(2), 379-395. doi: 10.1017/S153759271700007X.

Tam, L., Kim, J., & Kim, J. N. (2018). The origins of distant voicing: Examining relational dimensions in public diplomacy and their effects on megaphoning. Public Relations Review, 44(3), 407-418.

Tariq, S., & Woodman, J. (2013). Using mixed methods in health research. Journal of the Royal Society of Medicine Short Reports, 4(6). doi: 10.1177/2042533313479197.

Taylor, P. M. (2007). ‘Munitions of the mind’: A brief history of military psychological operations. Place Branding and Public Diplomacy, 3(3), 196-204.

Thomas, T. (2018). Russia’s forms and methods of military operations: The implementers of concepts. Military Review, 96(3), p. 30-37.

Thunø, M. (2017). China’s new global position: Changing policies toward the Chinese diaspora in the twenty-first century. In B. Wong & C. B. Tan (Eds.), China’s rise and the Chinese overseas (pp. 184-208). New York, NY: Routledge.

Trager, R., Ross, S. D., & Reynolds, A. (2018). The law of journalism and mass communication (6th Ed.). Washington, D.C.: C. Q. Press.

Treaty Office. (2018). Chart of signatures and ratifications of Treaty 185: Convention on cybercrime. Council of Europe. Retrieved on July 17, 2018, from https://www.coe.int/en/web/conventions/full-list/-/conventions/treaty/185/signatures.

Trilokekar, R. D. (2010). International education as soft power? The contributions and challenges of Canadian foreign policy to the internationalization of higher education. Higher Education, 59(2), pp. 131-147.

Tyler, P. E. (2003). Threats and responses: News analysis; a new power in the streets. The New York Times. Retrieved on June 20, 2018, from https://www.nytimes.com/2003/02/17/world/threats-and-responses-news-analysis-a-new-power-in-the-streets.html.

Uttaro, R. A. (1982). The voices of America in international radio propaganda. Law and Contemporary Problems, 45(4), p. 103-112.

Walker, C. (2016). The authoritarian threat: The hijacking of “soft power”. Journal of Democracy, 27(1), p. 49-63.

Walker, C., & Ludwig, J. (2017). From ‘soft power’ to ‘sharp power’: Rising authoritarian influence in the democratic world. In J. P. Cardenal, J. Kucharczyk, G. Meseznikov, & G. Pleschova, Sharp power: Rising authoritarian influence. International Forum for Democratic Studies. Retrieved on March 6, 2018, from https://www.ned.org/wp-content/uploads/2017/12/Sharp-Power-Rising-Authoritarian-Influence-Full-Report.pdf.

Walker, P. (2018, December 10). Foreign Office investigates reports that state-funded body targeted Corbyn. The Guardian. Retrieved on January 7, 2019, from https://www.theguardian.com/politics/2018/dec/10/foreign-office-investigates-report- state-funded-body-targeted-corbyn.

Wallin, M. (2015). Military public diplomacy: How the military influences foreign audiences. American Security Project. Retrieved on January 22, 2019, from https://www.americansecurityproject.org/wp-content/uploads/2015/02/Ref-0185- Military-Public-Diplomacy.pdf.

Walzer, M. (2015). Just and unjust wars: A moral argument with historical illustrations (5th ed.). New York, NY: Basic Books.

Wang, J., & Chang, T. K. (2004). Strategic public diplomacy and local press: How a high profile “head-of-state” visit was covered in America’s heartland. Public Relations Review, 30(1), 11-24.

Wardle, C., & Derakshshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe Report DGI(2017)09. Retrieved on July 5, 2018, from https://rm.coe.int/information-disorder-toward-an- interdisciplinary-framework-for-researc/168076277c.

Weaver, D. H. (1980). Audience need for orientation and media effects. Communication Research, 7(3), p. 361-376.

199

Webley, L. (2010). Qualitative approaches to empirical legal research. In P. Cane & H. Kritzer (Eds.), Oxford handbook of empirical legal research (pp. 926-950). Oxford, England: Oxford University Press.

Weiner, T. (2008). Legacy of ashes: The history of the CIA. New York, NY: Anchor.

Weiss, T. G., Seyle, D. C., & Coolidge, K. (2013). The rise of non-state actors in global governance: Opportunities and limitations. Broomfield, CO: One Earth Future.

Whittington, K.E. (2012). Law and politics: Critical concepts in political science. Princeton University. Retrieved on June 20, 2018, from http://www.princeton.edu/~kewhitt/ lawandpolitics.

Whittington, K. E., Kelemen, R. D., & Caldeira, G. A. (2011). Overview of law and politics: The study of law and politics. In R. E. Goodin (Ed.) The Oxford Handbook of Political Science. doi: 10.1093/oxfordhb/9780199604456.013.0015

Whorf, B. L. (1956). Language, thought, and reality: Selected writings of Benjamin Lee Whorf. Cambridge, MA: MIT Press.

Wiesche, M., Jurisch, M. C., Yetton, P. W., & Krcmar, H. (2017). Grounded theory methodology in information systems research. Management Information Systems Quarterly, 41(3), pp. 685-701.

WIPO. (2019). What is a trade secret? World Intellectual Property Organization. Retrieved on March 22, 2019, from https://www.wipo.int/sme/en/ip_business/trade_secrets/ trade_secrets.htm.

Vanc, A. M., & Fitzparick, K. R. (2016). Scope and status of public diplomacy research by public relations scholars, 1990-2014. Public Relations Review, 42(3), 432-440.

Vanian, J. (2017, February 7). Denmark to appoint a ‘Silicon Valley ambassador’ as if tech was its own country. Fortune. Retrieved on June 20, 2018, from http://fortune.com/ 2017/02/06/denmark-ambassador-apple-google/.

Verdon, T. (2016). The return of khilafah: The constitutive narratives of DAESH. Defence Strategic Communications, 1(1), p. 76-98.

Yang, A., Taylor, M., & Yang, A. (2014). Public diplomacy in a networked society: The Chinese government-NGO coalition network on acquired immune deficiency syndrome prevention. International Communication Gazette, 76(7), 575-593. doi: 10.1177/1748048514538929.

Yun, S. H. (2006). Toward public relations theory-based study of public diplomacy: Testing the applicability of the excellence study. Journal of Public Relations Research, 18(4), 287-312. doi: 10.1207/s1532754xjprr1804_1.


Yun, S. H., & Toth, E. (2009). Future sociological public diplomacy and the role of public relations: Evolutions of public diplomacy. American Behavioral Scientist, 53(4), 493-503.

Yun, S. H., & Vibber, K. (2012). The strategic values and communicative actions of Chinese students for sociological Korean public diplomacy. International Journal of Strategic Communication, 6(1), 77-92. doi: 10.1080/1553118X.2011.634864.

Zaharna, R. S. (2010). Battles to bridges: U.S. strategic communication and public diplomacy after 9/11. New York, NY: Palgrave Macmillan.

Zaharna, R. S., & Uysal, N. (2016). Going for the jugular in public diplomacy: How adversarial publics using social media are challenging state legitimacy. Public Relations Review, 42(1), 109-119. doi: 10.1016/j.pubrev.2015.07.006.

Ziglio, E. (1995). The Delphi method and its contribution to decision-making. In M. Adler & E. Ziglio (Eds.), Gazing into the oracle: The Delphi method and its application to social policy and public health (pp. 3-33). Bristol, PA: Jessica Kingsley Publishers.


BIOGRAPHICAL SKETCH

Phillip C. Arceneaux was born and raised in Lafayette, Louisiana, and attended Saint Thomas More Catholic High School. His parents are Bryan Neil Arceneaux and Debbie Michelle Hanrahan. He graduated cum laude from Louisiana State University with a Bachelor of Arts in communication studies and summa cum laude from the University of Louisiana at Lafayette with a Master of Science in communication. Following the completion of this dissertation, Phillip C. Arceneaux graduated summa cum laude from the College of Journalism and Communication at the University of Florida with a Ph.D. in mass communication. During his studies, Phillip worked with the University of Oregon, the United States Naval Academy, the Department of State, and the Central Intelligence Agency.
