
Manufacturing Dissent: The Subtle Ways International Propaganda Shapes Our Politics

by Aleksandr Fisher

B.A. in History and Political Science, May 2014, Temple University

A Dissertation submitted to

The Faculty of The Columbian College of Arts and Sciences of The George Washington University in partial fulfillment of the requirements for the degree of Doctor of Philosophy

May 17, 2020

Dissertation directed by Henry Hale, Professor of Political Science

The Columbian College of Arts and Sciences of The George Washington University certifies that Aleksandr Fisher has passed the Final Examination for the degree of Doctor of Philosophy as of December 19, 2019. This is the final and approved form of the dissertation.

Manufacturing Dissent: The Subtle Ways International Propaganda Shapes Our Politics

by Aleksandr Fisher

Dissertation Research Committee:

Henry Hale, Professor of Political Science, Dissertation Director

Danny Hayes, Associate Professor of Political Science, Committee Member

Michael Miller, Assistant Professor of Political Science, Committee Member

Abstract of Dissertation

Manufacturing Dissent: The Subtle Ways International Propaganda Shapes Our Politics

What impact does state-sponsored propaganda have on political behavior? How impervious is propaganda to strategies designed to combat it? I demonstrate that international propaganda has limited direct influence on mass public opinion, but the perception that propaganda is effective on “others” can have independent effects on democratic norms. I evaluate the conditions under which exposure to propaganda makes people more cynical and conspiratorial, exploring undertheorized opinion outcomes in empirical propaganda research. I show that many of our counter-propaganda initiatives, both defensive and offensive, fail to have their intended effect and may actually backfire.

Contents

1 Abstract of Dissertation

2 Introduction
   2.1 Goal of Dissertation
   2.2 Overview of Empirical Chapters
   2.3 Contribution

3 Propaganda in International Politics
   3.1 Why Study International Propaganda?
   3.2 Is There a Demand for Foreign Perspectives?
   3.3 Is Exposure to International Propaganda More Frequent Than We Think?

4 Is International Propaganda Effective?
   4.1 Reconceptualizing Effective Propaganda
   4.2 Undermining Rivals
   4.3 Promoting Conspiracies and Cynicism
   4.4 The Power of Perceptions
   4.5 Countering Propaganda
   4.6 Defensive Responses
   4.7 Offensive Responses
   4.8 Conclusion

5 Demonizing the Enemy
   5.1 Introduction
   5.2 Research Design
   5.3 Results
   5.4 Conclusion

6 The Conspiratorial and the Cynical
   6.1 Introduction
   6.2 Research Design
   6.3 Results
   6.4 Conclusion

7 Propaganda’s Presumed Influence
   7.1 Introduction
   7.2 Research Design
   7.3 Results
   7.4 Conclusion

8 How to Criticize an Autocrat
   8.1 Introduction
   8.2 Research Design
   8.3 Results
   8.4 Conclusion

9 Conclusion and Implications
   9.1 Limitations
   9.2 Future Research
   9.3 Final Thoughts

10 Appendix A: Demonizing the Enemy
   10.1 Measures
   10.2 Survey Sample Characteristics
   10.3 Balance Test
   10.4 Robustness Checks
   10.5 Bayesian Additive Regression Trees (BART)

11 Appendix B: The Cynical and the Conspiratorial
   11.1 Placebo Posts
   11.2 Survey Sample Characteristics
   11.3 Balance Across Treatments
   11.4 Robustness with Manipulation Checks
   11.5 Interaction Effects

12 Appendix C: Propaganda’s Presumed Influence
   12.1 Survey Questions (Dependent Variables)
   12.2 Balance Test
   12.3 Robustness Checks
   12.4 Sensitivity Analysis for Mediation Analysis
   12.5 Online Media Regulation
   12.6 Bayesian Additive Regression Trees (BART)

13 Appendix D: How to Criticize an Autocrat
   13.1 Summary Statistics
   13.2 Balance Table
   13.3 Robustness Checks
   13.4 Characteristics of Different Citizens
   13.5 Bayesian Additive Regression Trees (BART)

List of Figures

1 Participants were asked their levels of favorability toward Ukraine, Ukrainian foreign policy, Ukrainian President Petro Poroshenko, Russia, Russia’s foreign policy, and Russian President Vladimir Putin. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.
2 I obtain the pairwise differences of the mean of favorability toward Ukraine across the levels of the treatments and adjust the p-values and confidence intervals for multiple comparisons using Tukey’s Honest Significant Difference (HSD) test.
3 I obtain the pairwise differences of the mean of favorability toward Russia across the levels of the treatments and adjust the p-values and confidence intervals for multiple comparisons using Tukey’s Honest Significant Difference (HSD) test.
4 Participants were asked their levels of support for expanding sanctions on Russia and arming the Ukrainian government. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.
5 Participants were asked their levels of favorability toward Ukraine and Russia. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.

6 Participants were asked their levels of favorability toward Ukraine and Russia. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.
7 Participants were asked their levels of favorability toward Ukraine. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments by attitudes toward Russia.
8 Russian Propaganda Treatments.
9 The figure shows the percentage of respondents who agreed with each statement. Logistic regression. Dashed lines represent 95% confidence intervals.
10 Participants were asked to assess the accuracy of the three conspiracies. Logistic regression with controls. Sample includes individuals who passed reading checks. Figure plots the increase in probability of believing each statement. Dashed lines represent 95% confidence intervals.
11 Participants were asked to assess the accuracy of the three conspiracies. Logistic regression with controls. Sample includes individuals who passed reading checks. Figure plots the increase in probability of believing each statement. Dashed lines represent 95% confidence intervals.
12 Participants were asked to assess the accuracy of the three conspiracies. Logistic regression with controls. Sample includes individuals who passed reading checks. Figure plots the increase in probability of believing each statement. Dashed lines represent 95% confidence intervals.
13 Treatment effects on cynicism. OLS with robust standard errors and 95% confidence intervals. N=935.
14 Treatment article. Only the source and intention groups are shown the logo.

15 Treatment effects on cynicism. OLS with robust standard errors and 95% confidence intervals. N=1,000.
16 Influence of political cynicism on attitudes toward a strong leader. OLS regression with full controls and robust standard errors. Dashed lines represent 95% confidence intervals. N=489. Histogram at the bottom shows the percentage of participants with different levels of cynicism.
17 Effect of treatments on propaganda’s presumed effect on others. Controls included. Horizontal lines represent 95% confidence intervals for estimates.
18 Effect of treatments on propaganda’s presumed effect on others. All controls included. Results disaggregated by partisans (including leaners). Horizontal lines represent 95% confidence intervals for estimates.
19 Dependent variable is belief that the outcome of the 2016 election is illegitimate. Independent variable is belief about propaganda’s influence on others’ voting behavior. All controls included. Horizontal lines represent 95% confidence intervals for estimates.
20 Dependent variable is belief that the outcome of the 2016 election is illegitimate. Independent variable is belief about propaganda’s influence on others’ voting behavior. All controls included. Results disaggregated by partisans (including leaners). Horizontal lines represent 95% confidence intervals for estimates.
21 Dependent variable is support for banning Russian-funded networks. Independent variable is belief about propaganda’s influence on others’ voting behavior. All controls included. Horizontal lines represent 95% confidence intervals for estimates.

22 Dependent variable is support for banning Russian-funded networks. Independent variable is belief about propaganda’s influence on others’ voting behavior. All controls included. Results disaggregated by partisans (including leaners). Horizontal lines represent 95% confidence intervals for estimates.
23 Causal mediation plot. Treatment is the inoculation (compared to the control condition); the mediator is presumed propaganda effects on others. Outcome is belief that the 2016 election outcome was illegitimate. Horizontal lines represent 95% confidence intervals for estimates.
24 Causal mediation plot. Treatment is the inoculation (compared to the control condition); the mediator is presumed propaganda effects on others. Outcome is belief that Russian-funded networks should be banned. Horizontal lines represent 95% confidence intervals for estimates.
25 Treatment posts: Individuals in the control groups saw all the articles in rows two and three. Meanwhile, individuals in the criticism groups saw four (random) placebo posts in addition to the criticism of Vladimir Putin. Individuals in the balanced group saw three (random) placebo posts, the criticism of Vladimir Putin, and the criticism of U.S. corruption.
26 Effect of treatments on support for Vladimir Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 90% confidence intervals for estimates.
27 Effect of treatments on support for Vladimir Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

28 Effect of treatments on support for Vladimir Putin (0-1 scale). Comparing domestic vs. foreign one-sided/balanced treatments. Horizontal lines represent 95% confidence intervals for estimates.
29 Effect of treatments on trust toward Strela (1-3 point scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.
30 Causal mediation plot. Treatment is the foreign balanced criticism (compared to foreign one-sided criticism); the mediator is perceptions of source bias. Outcome is support for Vladimir Putin. Horizontal lines represent 95% confidence intervals for estimates.
31 Causal mediation plot. Treatment is the foreign balanced criticism (compared to foreign one-sided criticism); the mediator is the perception that corruption has increased during Vladimir Putin’s tenure as president. Outcome is support for Vladimir Putin. Horizontal lines represent 95% confidence intervals for estimates.
32 Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.
33 Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

34 Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.
35 Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.
36 Plots the effect of the foreign treatments (with respect to the control) on support for Vladimir Putin by education. Horizontal lines represent 95% confidence intervals for estimates.
37 Effect of treatments on support for Alexei Navalny (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.
38 Effect of treatments on the view that Russia is democratic (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.
39 BART Estimated Treatment Effects.
40 BART Estimated Treatment Effects.
41 Placebo Posts.
42 Placebo Posts.
43 Treatment Effects on Political Cynicism Scale.
44 Treatment Effects on Conspiracy Scale.
45 Age Interaction - Cynicism.

46 Ideology Interaction - Cynicism.
47 Education Interaction - Cynicism.
48 Age Interaction - Cynicism.
49 Ideology Interaction - Cynicism.
50 Education Interaction - Cynicism.
51 Balance Test for Sample.
52 Robustness Checks.
53 Sensitivity Analysis for Presumed Propaganda Effect on Others on Illegitimacy.
54 Sensitivity Analysis for Presumed Propaganda Effect on Others on Support for Censorship.
55 Dependent variable is support for regulating online media. Independent variable is belief about propaganda’s influence on others’ voting behavior. All controls included. Results disaggregated by partisans (including leaners). Horizontal lines represent 95% confidence intervals for estimates.
56 BART Estimated Treatment Effects.
57 Multinomial logistic regression. Horizontal lines represent 95% confidence intervals for estimates.
58 BART estimated treatment effects.
59 BART estimated treatment effects.

List of Tables

1 Research Design
2 Study Design
3 Study Design
4 Research Design
5 Summary Table
6 Balance Test
7 Regression Table for Figure 1
8 Regression Table for Figure 2
9 Full Sample Robustness Check
10 Full Sample Robustness Check
11 Treatment Effects with Controls
12 Treatment Effects with Controls
13 Study 2: Summary Table
14 Study 1: Summary Table
15 Balance Across Treatments
16 Balance Across Treatments
17 Summary Table
18 Balance Test
19 Domestic Criticism Effect on Support for Putin
20 Foreign Criticism Effect on Support for Putin
21 Domestic Criticism Effect on Support for Putin
22 Foreign Criticism Effect on Support for Putin
23 Domestic Criticism Effect on Support for Putin
24 Foreign Criticism Effect on Support for Putin

Introduction

“In May 2016, two groups of protestors faced each other in downtown Houston, Texas. One side was drawn there by a Facebook group called “Heart of Texas” to oppose the purported “Islamification of Texas.” The other side was recruited by a Facebook group called “United Muslims of America” and was there to rally for “saving Islamic knowledge.” The dueling protests in Houston led to confrontation and verbal attacks between the sides. What neither the protestors nor the authorities understood at the time was that both Facebook groups that spurred the protests were established and operated not by Houstonians, but by individuals posing as Americans from thousands of miles away. For relatively little cost, the Internet Research Agency (IRA), the now infamous troll farm in St. Petersburg, Russia, manipulated the most widely used social media platform to pit Americans in the United States’ fourth-largest city against one another. The goal may have been to incite violence between these opposing groups of protestors” (Fly, Rosenberger and Salvo 2018, 7).

The 2016 U.S. presidential election exposed how foreign entities can use inexpensive social media campaigns to spread disinformation and influence democratic politics in other countries. New information and communication technologies not only help autocrats maintain control over their own citizens (Gunitsky 2015), but they also provide them the tools to undermine the integrity of electoral processes and confidence in democratic institutions abroad (Fly, Rosenberger and Salvo 2018, 9). Since the election of Donald Trump, the U.S. intelligence community and independent researchers have shown that Russia tried to promote disinformation on a host of socially and politically divisive issues in order to sow discord and increase political polarization in Western democracies (Allcott and Gentzkow 2017; Badawy, Ferrara and Lerman 2018; Crilley 2017). Some even claim that foreign

strategic information campaigns shifted the outcome of the 2016 U.S. presidential election, raising doubts about the legitimacy of the sitting president. Clinton Watts, a researcher on Russian strategic operations, argues that “without the Russian influence effort...Trump would not have even been within striking distance of Clinton on Election Day” (Boot 2018). Former Director of National Intelligence James Clapper noted that fewer than 80,000 votes in Michigan, Pennsylvania and Wisconsin decided the contest. According to him, there is “no doubt that more votes than that were influenced by the massive effort by the Russians” (Hunt 2018). Others disagree with this perspective, arguing that “given the volume and tenor of mainstream and right-wing domestic coverage of Clinton, it seems unlikely that Russian propaganda made much of a difference” (Benkler 2018). Regardless of whether one believes that Russian propaganda helped shape the outcome of the 2016 election, international propaganda has become central to debates on election security, democratic decision making, and U.S. foreign policy (Landon-Murray, Mujkic and Nussbaum 2019; Levin 2016; Tenove et al. 2019; Tomz and Weeks 2019; Way and Casey 2019). According to Kathleen Hall Jamieson, author of the book Cyberwar: How Russian Hackers and Trolls Helped Elect a President – What We Don’t, Can’t, and Do Know: “It’s almost irrelevant in some ways whether the interference changed the election or not. What matters is the actions, the assault on the presidential election process” (Jamieson 2018). Schultz echoes this sentiment, stating that: “Not only did Russian interference in the 2016 election show that a foreign state could play on internal political divisions and mistrust in order to sow general confusion in the democratic process and to bolster its preferred candidate, but it did so in a way that has to date left the country divided over what happened and how to respond to an attack on its sovereignty” (Schultz 2017, 10). While scholars may continually debate the true influence of foreign interference on electoral outcomes, ordinary citizens have largely made up their minds on the issue.

In the United States, over 51% of individuals think that Russia interfered in the 2016 election. Notably, this split is heavily determined by one’s partisanship, with only 15% of Republicans thinking Russia interfered compared to 84% of Democrats and 53% of Independents.1 Moreover, 62% of Americans were concerned that Russia tried to interfere in the 2018 elections. Ironically, the mere perception that international propaganda is effective can shape public opinion and foster polarization - regardless of whether it actually had any direct influence (Fisher 2019b; Tomz and Weeks 2019). Furthermore, because international propaganda has received considerable media attention, millions of dollars are being invested at home and abroad into security measures to counter international disinformation campaigns (Farwell 2018). Yet, despite all of this attention to foreign disinformation campaigns, we do not have clear evidence on the influence of international propaganda on mass public opinion, nor do we know the impact our countermeasures are having in mitigating the effects of foreign disinformation.

Goal of Dissertation

Much of the existing research on international propaganda focuses on mapping networks of influence, cataloguing the spread of particular pieces of disinformation, and theorizing about the normative implications of new information and communication technologies for democratic governance. This work has been invaluable in demonstrating how sophisticated propaganda apparatuses are designed, funded, and maintained (Herpen 2015; Helmus et al. 2018; Shambaugh 2007). It has given us a novel understanding of how disinformation spreads across social networks (Badawy, Ferrara and Lerman 2018; Benkler, Faris and Roberts 2018; Farkas and Bastos 2018; Allcott, Gentzkow and Yu 2019; Llewellyn et al. 2018; Zannettou et al. 2019). It has also forced us to consider how propagandists structure their narratives to appeal to diverse audiences (Arif, Stewart and Starbird 2018; Etudo, Yoon and Yaraghi 2019; Flock 2018; Keating and Kaczmarska 2018; Nelson 2019). However, what is lacking is an in-depth analysis of the actual influence of these strategic information operations on public opinion (Allcott and Gentzkow 2017; Chapman and Gerber 2019; Gerber and Zavisca 2016; Lazer et al. 2018; Lanoszka 2019; Clayton et al. 2019; Peisakhin and Rozenas 2018; Pennycook and Rand 2018). Additionally, we need a fundamental re-evaluation of “effective propaganda,” since state and non-state actors have a myriad of goals they are trying to achieve through international propaganda campaigns.

My dissertation focuses on the influence of international propaganda as well as the efficacy of our counter-propaganda initiatives. This project is motivated by the following research questions: (i) What impact does state-sponsored propaganda have on political behavior? And (ii) how impervious is propaganda to strategies designed to combat it? I examine these questions in the context of Russian propaganda and U.S. counter-propaganda due to the attention this case has received in the media, policy circles, and academia. While policy officials have dedicated time and resources to countering international propaganda, they have not provided much evidence about the impact of foreign media on public opinion or offered new insights into which citizens are most susceptible to foreign messages (Lanoszka 2019; Levin 2016). More troubling, we do not know the effect of our own counter-messaging initiatives (Bjola 2018; Helmus et al. 2018). In the chapters below, I demonstrate that international propaganda has limited direct influence on mass public opinion - shifting attitudes on low-saliency foreign policy issues and increasing belief in conspiracy theories among individuals predisposed to accept these theories in the first place. However, the mere perception that propaganda is effective can have independent effects on democratic governance.

1 See: http://pollingreport.com/russia.htm.
By creating the impression that foreign actors have an extensive influence on political outcomes, states can undermine the legitimacy of democratic systems. Notably, our countermeasures against international propaganda often fail to have their intended effect, meaning we should be more skeptical about the efficacy of strategic information campaigns. While we often talk about propaganda using metaphors from epidemiology – claiming disinformation infects public opinion and inoculation messages immunize against harmful messages – these analogies can obfuscate propaganda’s true influence on our politics (Cunningham 2002; Ellul 1965; Martin 1971). As technological advances make it easier for hostile actors to manipulate public opinion, we should be careful not to overreact and use the threat of international propaganda to subvert democratic processes. As history has shown, political actors often rely on foreign threats to curtail citizens’ rights, aggrandize power, and promote their own political agenda (Bermeo 2003).

Overview of Empirical Chapters

In this dissertation, I test the influence of Russian propaganda on American audiences, analyze the effectiveness of warning messages in countering foreign propaganda, and evaluate the efficacy of U.S. international broadcasting on Russian audiences using a series of survey experiments – three in the United States and one in Russia (N ∼ 4,000). While existing studies rely on quasi-experimental designs to evaluate how the presence of international messages changes political behavior, they cannot reliably demonstrate the effect of direct exposure (Adena et al. 2015; Crabtree, Darmofal and Kern 2015; Crabtree, Kern and Pfaff 2018; Kern and Hainmueller 2009; Peisakhin and Rozenas 2018; Yanagizawa-Drott 2014). Consequently, those who study international disinformation argue that survey experiments are the most appropriate research design for testing the influence of propaganda (Lanoszka 2019, 16). This is the first project, to my knowledge, that directly measures the influence of international propaganda and tests whether inoculation strategies mitigate international propaganda’s effectiveness. I expand our conceptualization of “effective propaganda,” examining when international

propaganda has soft power, sharp power, and/or third-person power effects (Davison 1983; Nye 2004; Walker 2018). Specifically, I assess when foreign messages: (i) bolster public support for the communicating country and undermine the legitimacy of target governments; (ii) increase belief in conspiracy theories and heighten political cynicism; and (iii) change people’s perceptions about the effectiveness of propaganda on other citizens. By analyzing a more diverse set of political outcomes, I offer a more nuanced analysis of foreign propaganda’s influence on democratic politics. I break up my empirical analysis into four sections.

First, I test how exposure to Russian propaganda can improve Russia’s image and denigrate the Kremlin’s rivals. In my first experiment, I assess whether Russian propaganda on the Russo-Ukrainian conflict can shift American public opinion about Ukraine. I subject subgroups of Americans to an article from Russia Today (RT), a Russian international television network, criticizing the Ukrainian government. I vary whether audiences are aware of the message source and/or the intentions of the Russian-funded network. I show that exposure to information about Ukrainian human rights violations lowers Americans’ evaluations of Ukraine irrespective of source awareness – indicating that making people more aware of foreign propaganda does not attenuate its influence. However, exposure to messages from RT does not increase support for pro-Russian policies, highlighting the difference between sharp and soft power. In sum, this chapter demonstrates that foreign actors can undermine the legitimacy of target governments, especially when citizens have undeveloped opinions about foreign policy issues. Second, I assess how exposure to conspiracy theories from foreign state-sponsored networks shapes Americans’ belief in those conspiracies.
I find evidence that international propaganda can increase conspiratorial beliefs, but these effects are concentrated among strong conservatives (when the conspiracy aligns with a pro-conservative narrative) and young people with low levels of political awareness (when the conspiracy addresses U.S.

foreign policy). Troublingly, I find evidence of backlash effects to inoculation messages that warn people about the threat of foreign disinformation (Nyhan and Reifler 2010). I also assess whether the Kremlin’s use of strategic and populist news frames promotes greater distrust in democracy. I show Americans an article criticizing the U.S. electoral system from Russia Today (RT). I vary whether individuals know the criticism comes from Russia and whether they received a warning message about Russian propaganda prior to reading the article. Contrary to popular opinion, I find that Russian propaganda does not increase political cynicism – regardless of whether people are aware of the message source or whether they were exposed to an inoculation message. The findings have implications for research on propaganda effects and inoculation strategies, contributing to a growing literature on how to combat “fake news” and disinformation (Clayton et al. 2019; Lazer et al. 2018; Pennycook and Rand 2018). My third empirical chapter considers whether the true influence of international propaganda campaigns is in people’s perceptions of its effectiveness (Little 2017). This study is the first to take an experimental approach to analyze the “influence of presumed influence” in the context of international propaganda (Cohen et al. 1988; Golan and Lim 2016; Tal-Or et al. 2010). I test whether domestic politicization of foreign propaganda causes individuals to overestimate the impact of mass communication on public opinion – leading people to support more anti-democratic policies in order to protect ‘vulnerable’ voters (Davison 1983). I examine whether people tend to overestimate the effect that external messages have on the public’s attitudes, and whether this perception of propaganda’s effectiveness on others increases their support for media regulation and heightens the perception that the political system is illegitimate.
I provide evidence that those who believe that propaganda influences others are more likely to think the outcome of the 2016 presidential election was illegitimate and more likely to support censorship – even when accounting for partisanship and political awareness. While exposure to propaganda tends to have few direct effects, the perception that propaganda shapes other people’s behaviors can exacerbate political polarization and distrust in democratic processes. By assessing the downstream consequences of disinformation as well as our own counter-messaging efforts, this study emphasizes the importance of citizens’ beliefs about others (Ahler and Sood 2018; Hollyer, Rosendorff and Vreeland 2015; Kuran 1991; Levendusky and Malhotra 2015). People’s perceptions of reality can often exert more influence on attitudes and behaviors than reality itself.

Finally, I assess the efficacy of U.S. offensive counter-messaging in non-democratic regimes. The growing threat of international propaganda has caused some to advocate funding independent regional broadcasting networks in order to challenge autocratic regimes (Farwell 2018; Jones 2018). Yet, it is unclear whether criticism from foreign countries causes people to update their beliefs in ways that undermine autocratic leaders. To test the influence of U.S. offensive countermeasures in Russia, I rely on a novel experimental design. I subject subgroups of Russian citizens to social media posts from a fictional news agency on the topic of corruption. I vary whether the source of the information is domestic or foreign and whether participants receive one-sided or balanced criticism. I find that foreign criticism can lower evaluations of the Russian president, but only if that criticism is balanced (i.e., it also presents criticism of the United States). Paradoxically, the most ‘persuadable’ populations are the least likely to challenge the regime. I argue that foreign messages are most effective in changing views when aimed at specific groups of persuadables who are nominally apolitical.
This chapter has direct implications for international democratization, counter-propaganda, and information politics in authoritarian regimes (Chen and Yang 2018; Huang and Yeh 2017; Marinov 2018; Robertson 2017).

More broadly, my dissertation interrogates the complex interaction between domestic and international media, while emphasizing the indirect influence of propaganda. By analyzing the micro-level effects of international propaganda, the influence of propaganda's presumed influence, and the limitations of foreign digital media on democratization, I offer a more nuanced account of the effect of propaganda on public opinion. While many have written about the harmful effects of propaganda, our own counter-messaging initiatives may exert the most influence over public opinion. The perception that propaganda is effective in shaping public opinion has a much greater impact than propaganda itself, forcing us to reconsider the mechanisms by which international propaganda influences politics. Our initiatives to combat propaganda – both defensive and offensive – may have limited influence.

Contribution

This project sheds light on several important topics in political science. First, while the impact of propaganda aimed at domestic audiences is a thoroughly studied topic, the impact of media emanating from foreign countries is a less developed area of research (Adena et al. 2015; Bernays 1928; Ellul 1965; Lasswell 1927; Jowett and O'Donnell 2014; Yanagizawa-Drott 2014). This topic only becomes more important as globalization has integrated communication networks and made foreign disinformation a more palpable threat in the 21st century (Chapman and Gerber 2019; Crabtree, Darmofal and Kern 2015; DellaVigna et al. 2014; D'Hooghe 2014; Ettinger 1946; Gagliardone 2013; Herpen 2015; Kern and Hainmueller 2009; Youmans and Powers 2012). Computational propaganda research, which focuses on how actors use social media and big data to influence public opinion, provides novel insights into the new tools propagandists can use to reach mass audiences (Bolsover and Howard 2017, 273). We need to rethink how pervasive international propaganda is becoming in our information spaces, and re-conceptualize how we measure the influence of foreign messages.

Second, I shed light on some under-theorized and overlooked outcomes in propaganda studies. The preoccupation with measurable effects has caused scholars to focus on dependent variables like favorability towards actors or approval of policies (Lanoszka 2019), while forgetting about earlier scholarship on propaganda, which emphasizes the corrosive impact of disinformation and biased media on human cognition (Cunningham 2002). Since persuasion-based processes are complex and multifaceted, more attention should be placed on assessing the unintended and subtle impacts of political communication (Holbert, Garret and Gleason 2010, 17). As noted by others, "the proliferation of [disinformation] via social media exacerbates this undermining of the media's traditional role by reducing public confidence in the veracity of media reporting" (Baum and Potter 2019, 751). Assessing how propaganda changes people's trust in traditional democratic institutions is a necessary area of research (Pomerantsev 2014a).

Third, I examine the effectiveness of current inoculation and counter-propaganda strategies (Fund 2017; Hall 2017). Exposing foreign propagandists, educating citizens to be more intelligent consumers of political information, and helping promote transparency in non-democratic states are necessary pursuits (Farwell 2018; Fly, Rosenberger and Salvo 2018). However, we should be vigilant about the unintended and unwanted consequences of the domestic politicization of international propaganda and its ability to exacerbate political polarization, discredit domestic opposition movements, and worsen existing deficits in media trust (Bjola 2018; Mälksoo 2018). As argued by Josh Machleder, Vice President for Europe, Eurasia and Asia Programs at Internews, "fighting propaganda with counter-propaganda only breeds despair, cynicism, and confusion among the target populations" (Machleder 2015).
While this is not meant to discourage efforts to promote a freer media environment in less competitive regimes, nor to promote moral equivalence between the propagandistic actions of democratic and autocratic states, I believe that by assessing the impact of counter-propaganda we gain some insights into the unintended influence of mass communication (Constine 2018; Guess and Coppock 2018; Nestler and Egloff 2010; Szostek 2015; Wood and Porter 2018). This is crucial since revelations about foreign interference have raised questions about the role technology firms and governments have in protecting civilians from disinformation and what citizens can do to exert greater control over their political information (Susskind 2018).

Finally, this project complements existing research on the ways autocratic countries harness the power of new information and communication technologies to promote the stability of their regimes (Chen 2018; Gunitsky 2015; Huang 2015b; King, Pan and Roberts 2013, 2017; Qin, Strömberg and Wu 2017; Shambaugh 2013). While most of this work focuses on the institutional logic behind propaganda, offering a more nuanced understanding of censorship in non-democratic regimes and analyzing networks of influence (Behrouzian et al. 2016; Chen and Yang 2018; Roberts 2018), it rarely measures the impact of specific propagandistic messages, which could help provide a more holistic understanding of international propaganda both at home and abroad (Adena et al. 2015; Huang and Yeh 2017; Robertson 2017; Truex 2016; Yanagizawa-Drott 2014). Overall, I hope this project can speak to researchers working in political communication, autocratic information politics, and international democratization while contributing to our understanding of international propaganda's effect on democratic governance and national security.

The rest of the dissertation proceeds as follows. In Chapter 2, I explain why studying international propaganda is necessary. Chapter 3 presents a new theory on how propaganda can shape public opinion and outlines existing defensive and offensive responses to foreign propaganda. In Chapter 4, I analyze when Russian propaganda undermines the legitimate authority of a target government. In Chapter 5, I test if Russian propaganda can increase belief in conspiracies and bolster political cynicism. Both Chapters 4 and 5 also evaluate whether making people more aware of the message source mitigates propaganda's effectiveness.
Chapter 6 examines whether making individuals aware of the influence of foreign propaganda exacerbates third-person effects and alters views toward electoral legitimacy and censorship. Chapter 7 switches the focus from Russian state-sponsored propaganda to assess whether Western-sponsored Russian-language news agencies may inadvertently increase support for Putin's government. Chapter 8 summarizes the findings, discusses the normative implications of the studies, and outlines avenues for future research.

Propaganda in International Politics

“Over the course of my career, I’ve seen a number of challenges to our democ- racy. The Russian government’s effort to interfere in our election is among the most serious. As I said on May 29, this deserves the attention of every American.”

– Former Special Counsel Robert Mueller, testifying before the House Judiciary Committee about Russian interference in the 2016 election, July 24, 2019

Before discussing the threat of international propaganda and disinformation campaigns, it is useful to define these normatively loaded terms. What are the main differences between propaganda, public diplomacy, misinformation, disinformation, and fake news? Scholars who study state-driven efforts to communicate directly with international audiences often use the term public diplomacy, referring to "efforts by the government of one nation to influence public or elite opinion in a second nation for the purpose of turning the foreign policy of the target nation to its advantage" (Manheim 1994, 4). Public diplomacy can be primarily informational, where the motivation is to deliver a message to foreign audiences (Melissen 2005). Public diplomacy can also be relational, where states attempt to create networks with foreign audiences to increase the chance of effectively propagating future information (Zaharna, Arsenault and Fisher 2014). In practice, public diplomacy encompasses everything from student exchanges and foreign broadcasting to formal lobbying with civic organizations, making it a catch-all phrase without a clear theoretical agenda (Fan 2008). Media diplomacy is a more concrete term, focusing on the use of media to target foreign audiences (Gilboa 2008, 58). Yet, when non-democratic countries try to communicate with foreign audiences, these initiatives are labeled propaganda (Barghoorn 1964; Leighton 1991; Nagorski 1971). Propaganda, as defined by Jowett and O'Donnell, is "the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist" (Jowett and O'Donnell 2014, 7). How does foreign propaganda differ from public diplomacy?

First, one can consider the process of communication. Public diplomacy, according to some, is an interactive, two-way process in which the communicating state not only sends messages but also listens to its audience. Propaganda, on the other hand, transmits information directly to the target, with a unidirectional relationship between communicator and recipient (Melissen 2005). Some assume that non-democratic states are unresponsive to public opinion and do not have to listen to their citizens. Due to the lack of a two-way process of communication, any messages from non-democratic states can be considered propaganda (Gilboa 2008, 58). Yet, autocratic regimes do respond to changes in public opinion, and they care about the opinions of domestic and international audiences (Bell and Quek 2018). This focus on "one-way" versus "two-way" communicative processes is also problematic because it rarely relies on empirical evidence to categorize the mechanisms by which states communicate with foreign audiences. When Western democracies attempt to influence mass audiences abroad, their actions are seen as proactive initiatives that educate foreign audiences (Gilboa 2008). On the other hand, countries such as Russia, China, and Saudi Arabia are portrayed as maliciously trying to manipulate public opinion (Rawnsley 2015). However, many accuse U.S. international broadcasting of being equally manipulative and culturally oppressive (Douai 2014, 139). Citizens in the Middle East often find U.S. attempts at public diplomacy to be carefully disguised propaganda campaigns, despite U.S. efforts to frame their projects as two-way interactive processes (Brooks 2015; Rugh 2006).
Moreover, the opening of historical archives in Russia has revealed few differences in the tactics used by the Soviet Union and the United States during the Cold War, with the Soviet Union actually emulating many U.S. public diplomacy strategies (Rupprecht 2015).

Propagandists need to engage their audience to be successful. The archaic model of channeling blatant pro-regime information is ineffective (Martin 1971). Most modern propaganda campaigns account for the interests of their target audiences, making propaganda an interactive process between communicators and receivers. In fact, many are concerned that propagandists are listening too much to their audiences and are able to craft specifically tailored messages that maximize persuasion (Endres and Kelly 2018). If the difference between propaganda and public diplomacy does not lie in the process of communication, how do we differentiate the two concepts?

Edward Bernays, a pioneer in propaganda studies, contends that propaganda may carry unpleasant connotations for many, but argues that whether propaganda is good or bad depends upon the merit of the cause urged and the correctness of the information published (Bernays 1928, 20). This has led subsequent scholars to examine the source, intent, and bias of information to create typologies of mass propaganda.

Stanley Cunningham uses source transparency to distinguish types of political information. He discusses differences between white propaganda, which does not conceal its origin; black propaganda, which conceals or camouflages its sources; and grey propaganda, which uses sources of questionable origin (Cunningham 2002, 66-75). For instance, social media bots that pose as real people to spread fake news seem to fall clearly within the purview of black propaganda, since the communicator is disguising his or her identity to maximize the message's persuasiveness (Arif, Stewart and Starbird 2018; Farkas and Bastos 2018). Cloaked social media posts, where users pose as their rivals and write inflammatory posts, should also be considered propaganda under this framework (Farkas, Schou and Neumayer 2018).
Martin and Shapiro (2019) contend that foreign influence campaigns often produce content that looks like it has been "produced organically in the target state" (4). However, states employ a combination of these strategies to shape mass attitudes. For example, RAND describes how the Kremlin relies on a wide variety of tools with different levels of explicit connection to Russia to push its strategic narratives (Helmus et al. 2018, 11-13). Networks like Russia Today (RT) and Sputnik promote fairly normal news coverage (albeit with a clear pro-Russia bias) while remaining in the open. Other tools, like Russian bots posing as pro-Trump voters, more clearly fall under the category of black propaganda (Sanovich, Stukal and Tucker 2018). While black, white, and grey propaganda may be useful concepts because they allow scholars to categorize political information, obsessive classification may lead to few new theoretical insights about the influence of propaganda (Cunningham 2002, 42). This can degenerate into different political factions labeling media on the other side 'fake news' without offering clear definitions.

Empirical work often focuses on identifying objectively false information, relying on independent fact checkers to categorize certain news items as definitively false (Figueira and Oliveira 2017). By identifying narratives that mention false information or use doctored images, researchers try to classify certain news agencies as more or less propagandistic (Resnick, Ovadya and Gilchrist 2018). Focusing on explicitly false information and speculations about the propagandists' intentions allows scholars to form new typologies like: (1) mis-information, which refers to the sharing of false information that does not intend to do harm; (2) dis-information, which refers to the sharing of false information to cause harm; and (3) mal-information, which refers to true information that is intentionally shared to cause harm (Wardle and Derakhshan 2017, 5).

Does information have to be false to be classified as propaganda? Early work on propaganda explicitly made falsehoods central to its definitions of propaganda (Irion 1950).
Subsequent work argues that sophisticated propaganda utilizes a mixture of facts and falsehoods: "it exploits expectations and confusion; it overloads audiences with information; it relies upon murkier epistemic moves such as suggestion, innuendo, implication, and truncated modes of reasoning" (Cunningham 2002, 98). According to Ellul, "in propaganda, truth pays off" (Ellul 1965, 53).

While flat-out inaccurate information may be easy to spot, focusing only on demonstrably false information misses the nuances of propagandistic messages. Through a mixture of facts and falsehoods, even clearly biased networks can shape political attitudes (Little 2017; Stanley 2015; Truex 2016). It is the "regular coverage" of events, rather than blatant disinformation, that allows networks to gain credibility and makes it difficult to simply label them as pure disinformation outlets (Carter and Carter 2018). Networks funded by non-democratic states obviously exhibit a strong bias, but they often resemble hyper-biased partisan networks (Peisakhin and Rozenas 2018). Some measure the extent of bias in a network to differentiate "genuine" broadcasting from propaganda (Budak, Goel and Rao 2016; Martin and Yurukoglu 2017; Ribeiro et al. 2018). Using automated text analysis to detect fake news has had mixed results (Graves 2018; Hamborg, Donnay and Gipp 2018). In practice, assessing the extent of bias in an information source is complicated since bias can take many forms. For instance, "a media outlet can be selective in what issues it covers (issue bias), what aspects of the issues it includes or excludes (facts bias), how the facts are presented (framing bias), and how it is commented (ideological stand bias)" (Prat and Strömberg 2013, 33).

This has led to a reconceptualization of how propaganda works in both democratic and non-democratic states. For instance, research on Russian coverage of economic issues finds that the Kremlin does not censor unfavorable economic news but simply shifts blame to external actors while taking responsibility for any favorable outcomes (Rozenas and Stukal 2018). This ability to shift blame while reporting factual information has important consequences for democratic accountability (Bisgaard 2019).
Newspapers in autocratic states actually provide rather neutral coverage of events the majority of the time so they can keep viewers interested and credulous for when the government needs to ramp up pro-regime messages. Autocrats can concede policy failures in order to boost the credibility of pro-regime propaganda (Carter and Carter 2018).

This makes it more difficult to label a news network as a propaganda outlet without drawing accusations of bias from the other side. It becomes even more difficult when one considers satire, which often exaggerates the truth or relies on false stories to draw attention to political causes (Shao and Liu 2018). The debate over terms is contentious since political polarization allows people to classify any information that they find disagreeable as fake news, fostering a vicious cycle which engenders further political divisions and distrust in the veracity of competing narratives (Born 2017; Faris et al. 2017; Marwick and Lewis 2017). For some, "to distinguish exactly between propaganda and information is impossible" (Ellul 1965, 112).

Does this mean that there are no differences between networks like Voice of America (VOA), the British Broadcasting Corporation (BBC), and Russia Today (RT)? Dan Robinson, VOA's former chief White House correspondent, insists that the separation between journalism and government influence was sometimes more theory than reality.2 Margarita Simonyan, editor-in-chief of RT, argues that Russia is simply doing what the United States and Europe have been doing in other countries for years. She asserts that "those who are the first to accuse RT of shoddy journalism and propagating conspiracy theories are practicing exactly what they supposedly preach against."3

Russia's foreign ministry has called its approach innovative diplomacy, or a "tool of Russian foreign policy to exert influence on public opinion through the use of ICT (information communication technologies)" (Surowiec 2017, 23). According to the Kremlin, U.S. foreign media also relies on false information (Tamkin 2017). Journalists working for RT contend that their coverage of race relations in the United States and deficits in American democracy is not illegitimate just because the network occasionally promotes conspiracy theories. In fact, "the central premise of a lot of Russian propaganda is that all news is biased and partisan, and the search for objective truth is a fool's errand" (Boduszynski and Breeden 2017).

Yet, many agree that RT and Sputnik are fundamentally different from VOA or the BBC. Russian networks frequently promote conspiracy theories whose purpose is to undermine trust in traditional democratic institutions, spread rumors about rival countries, and equate problems in Western democracies with the democratic deficits in Russia (Yablokov 2015). According to Andrew Feinberg, an American journalist who formerly worked for Sputnik: "the untold is untold because it's not true, and the 'alternative perspective' is a way to push a hostile government's agenda by tearing down the reputation of other nations" (Feinberg 2017). The close editorial connection between RT and the Kremlin means that the basic purpose of the network is to wage information warfare (Nelson 2019, 137).

The extent of false coverage that appears in Russian networks is categorically different from that of other international broadcasters. In Ukraine, blatantly false stories about Ukrainian soldiers crucifying children, U.S. involvement in the shooting down of flight MH17, and the infiltration of Nazis in the new Ukrainian movement represent coverage that is absent from more reputable international broadcasters (Lichtenstein et al. 2018; Roman, Wanta and Buniak 2017). While nobody can deny that even the best-intentioned journalists are capable of unintentionally starting a false rumor, most would agree that using doctored photos during conflicts in Ukraine and Syria or propagating false narratives about mass shootings in the United States are qualitatively different phenomena (Kirchick 2017). The East StratCom Task Force's "Disinformation Review" and the Kyiv Mohyla Journalism School's "StopFake" have identified hundreds of false stories in Russian media targeting international audiences. Even if one considers news outlets like RT and Sputnik to be genuine news organizations, the spread of disinformation through the use of bots and cloaked social media accounts is a far cry from traditional journalism in search of objective truth. This has caused headaches for governments and large technology firms who are tasked with identifying and removing foreign disinformation.

Trying to objectively classify some news networks as propaganda outlets is likely to be unsuccessful. For instance, automated fact-checking has substantial limitations and "will require human supervision for the foreseeable future" (Graves 2018, 1), meaning there will always be accusations of human bias. That does not mean we should accept that all international broadcasters are morally equivalent. Rather, we need to recognize that propaganda is not simply false information and consider the full spectrum of propaganda strategies (Stanley 2015). Moreover, while debates over terminology and categorization are important, since how we categorize information often influences our views on its acceptability in society and the laws surrounding its dissemination, we also need to spend more time dissecting modern propaganda campaigns and assessing their influence on our societies. What is different about this era of international propaganda? How does it compare to the past?

2. https://www.aljazeera.com/programmes/listeningpost/2018/06/journalism-propaganda-state-sponsored-media-180602110533574.html
3. https://www.rt.com/op-edge/205667-uk-rt-conspiracy-theories/

Why Study International Propaganda?

States have always relied on propaganda to promote their own interests and influence foreign publics. In Ancient Greece, the Athenian general Themistocles strategically placed propaganda-engraved stones for the opposing Ionian forces to see before battle (Jowett and O'Donnell 2014, 51). Genghis Khan relied on agents to spread rumors about the size of the Mongol armies in enemy territory to inspire fear in his opponents (Linebarger 1954, 7-15). Napoleon was adept at promoting national policy by planting stories in foreign newspapers (Thomson 1999, 223). In WWII, Allied forces sent mixed messages to Nazi military officials and civilians to gain military advantages (Roetter 1974). Famously, Japanese soldiers dropped propaganda leaflets on American interracial service units on Iwo Jima to promote defection (Davison 1983, 5).

States recognize that maintaining lines of communication and speaking to whole populations, rather than just elites, can facilitate the accomplishment of one's political goals. With new information and communication technologies, state and non-state actors can present their side of a foreign conflict to international audiences (Sheafer and Gabay 2009, 273). For instance, during the Cold War, U.S. government officials were optimistic that by providing more information to citizens in the Soviet Union, these audiences would hold their government more accountable and push for democratic reforms (Prat and Strömberg 2013, 47). According to one comprehensive study of Western broadcasting to the countries behind the Iron Curtain, Western radio had between 25 and 50 million Soviet listeners from 1978 to 1990, and Radio Liberty reported a weekly listenership of over 15% (35 million people) after jamming ended in 1988 (Danielson 2004, 17). While these figures may look impressive, scholars still debate the actual influence of these initiatives (Manaev 1991; Puddington 2000; Urban 1997).

In the West, the United States feared that Soviet propaganda spread communist ideas in America and Western Europe. Famously, the USSR helped fan conspiracy theories about the origins of the AIDS epidemic (Qiu 2017). The Kremlin forged documents linking the West German regime to the Nazis to stir anti-German sentiments in Central European countries. The Soviets also sought to create divisions between the U.S. and Egyptian President Anwar Sadat, mobilize Western European public opinion against the neutron bomb, and promote anti-US narratives in third-world countries using front organizations like the World Peace Council (Kux 1985, 20-22). Now that the Cold War is over, the United States and its European allies worry more about the growth of China (Shambaugh 2013), the spread of radical Islam (Gerstel 2017), and disinformation emanating from non-democratic states (Lucas and Pomerantsev 2016).
Part of the newfound anxiety over foreign propaganda stems from the increasing prevalence and reach of international networks due to advances in information technologies, particularly the internet and social media. A cross-national survey by PwC UK found that over 66% of those surveyed use cross-border media, with the fastest-growing providers coming from non-Western countries (Tomlinson 2016). Especially in the last decade, there has been growing attention to the investments of non-democratic states in government-funded news networks. These include Russia's Russia Today (RT) and Sputnik, China's CCTV, and Qatar's Al-Jazeera (Lynch 2006; Seib 2008; Xie and Boyd-Barrett 2015).

In 2015, the Russian government-funded network RT was available to over 700 million people in more than 100 countries. According to an IPSOS survey in 38 countries, over 70 million individuals watch RT every week and 35 million watch the Russian network daily. RT also has a large YouTube presence, having been the first news network to reach one billion views (Nelson, Orttung and Livshen 2015). In November 2014, the Kremlin launched Sputnik International, a subsidiary of the state-owned Rossiya Segodnya, which is available in over 35 languages and is aimed at "disenfranchised" international audiences (Nimmo 2016). At its peak, Al-Jazeera could reportedly reach over 40 million citizens in the Arab world and a similar number of American households (Xie and Boyd-Barrett 2015, 74). China's CCTV has 10 channels broadcasting to global audiences in English, French, Spanish, Arabic, Japanese, and Russian. According to CCTV, the channel is available to over 65 million viewers in 140 countries. In 2018, China announced its plan to merge its overseas broadcasting networks into one giant media outlet called Voice of China (Yip 2018). Even setting aside government-sponsored news outlets, advances in computational propaganda allow foreign actors to more easily interfere in the politics of other states by reaching a large number of citizens outside their country's borders using social media (Bolsover and Howard 2017, 273).
There is a concern that cross-border media from non-democratic states can help fuel dissent within foreign countries, galvanize extremist groups, and contribute to political polarization (Youmans and Powers 2012). Some scholars see these initiatives as part of a larger trend of autocratic diffusion (Ambrosio 2010; Tolstrup 2015; Weyland 2017). By reaching foreign audiences, non-democratic states can increase their normative influence and promote their foreign policy agendas (Warren 2014). Advances in communication technology facilitate the creation of mass media infrastructures which dramatically lower the production cost of normative influence (Warren 2014, 113). In fact, "what we see unfolding right before our eyes is nothing less than Moore's Law applied to the distribution of misinformation: an exponential growth of available technology coupled with a rapid collapse of costs" (Filloux 2017). Yet, while it is true that foreign countries are able to produce more content at a faster pace due to people's increased reliance on the internet for political information, is anyone actually listening?

Is There a Demand for Foreign Perspectives?

Individuals use media to satisfy a wide variety of needs, ranging from obtaining accurate information (Tetlock 2002) and avoiding cognitive dissonance (Festinger 1957; Gerber and Green 1999) to signaling social identities (Wardle and Derakhshan 2017). In democratic countries, where direct censorship is rarely an issue, it is unclear why citizens would ever listen to or read foreign news given the proliferation of competing domestic sources. Occasionally, citizens may seek out online foreign news networks to "appraise attitude-consistent news stories" (Best, Chmielewski and Krueger 2005, 65). Yet in most cases, direct exposure to foreign media outlets is rare since individuals are more likely to listen to domestic rather than foreign media (Page, Shapiro and Dempsey 1987). While there is some evidence that foreign elite perspectives are more common than they were in the past (Althaus et al. 1996; Hayes and Guardino 2011; Murray 2014), there may be little space for foreign actors to shape public discourse due to their lack of credibility and familiarity (Bennett 1996; Entman 2004). Because of this, some assert that external propaganda against a democracy is generally ineffective (Ellul 1965, 296). Does the rise of online social networks and citizens' ability to opt out of all political discussions merely suggest an age of minimal effects, especially for foreign propaganda (Holbert, Garret and Gleason 2010)?

Some argue that when foreign media does reach its target audiences, it tends to be consumed by individuals who are predisposed to accept these narratives, and whose numbers generally make up a small proportion of the population (Nelson and Taneja 2018). Once we account for this selective exposure, persuasion may be limited and foreign messages may simply reinforce pre-existing beliefs (Arceneaux and Johnson 2013; Lazarsfeld, Berelson and Gaudet 1944; Lord, Ross and Lepper 1979; Ruggiero 2000). In the United States, those who study non-democratic foreign news networks argue that concern "is overstated insofar as the security of its domestic media markets is concerned" (Xie and Boyd-Barrett 2015, 79). Fletcher et al. (2018), who examine the influence of fake news in France, find that the most popular fake news outlet, Santé+ Magazine, reached only 3% of citizens, with outlets like RT and Sputnik reaching half of that (3). Writing about modern Russian foreign propaganda, policy analyst Leon Hadar asserts that:

You don’t have to be a marketing genius to figure out that in the age of the 24/7 media environment, foreign networks face prohibitive competition from American cable news networks like CNN, MSNBC, , social media, not to mention Netflix and yes, those online porno sites. Thus the chances that a foreign news organization would be able to attract large American audiences,

and have any serious impact on their political views, remain very low (Hadar 2017).

Yet, there is reason to believe that current research underestimates the reach of state-sponsored propaganda by focusing on direct exposure and overlooking indirect or inadvertent exposure to foreign media. When we reconsider how international propaganda is

reaching audiences, it is possible to arrive at a fundamentally different conclusion about its influence on society.

Is Exposure to International Propaganda More Frequent Than We Think?

Some studies on propaganda focus on counting the number of viewers of state-owned television networks or “likes”, “shares”, and “re-tweets” on social media posts (Badawy, Ferrara and Lerman 2018; Kim et al. 2018; Metzger and Siegel 2019; Orttung and Nelson 2019). In doing so, they assess how citizens are directly exposed to propaganda. In a study that measures the prevalence of selective exposure to misinformation in the United States using individual-level web traffic histories, the authors find that exposure to fake news is concentrated among “10% of people with the most conservative online information diets” (Guess, Nyhan and Reifler 2018, 1). Others confirm that direct exposure to disinformation is often concentrated amongst a small group of individuals (Badawy, Ferrara and Lerman 2018; Nelson and Taneja 2018; Zannettou et al. 2019). However, citizens may not be actively selecting into foreign sources or even be aware they are consuming information from foreign countries (Hegelich and Janetzko 2016; Mejias and Vokuev 2017; Shao et al. 2017). Most studies cannot capture incidental exposure on social media sites (Guess, Nyhan and Reifler 2018, 3), or exposure that is a consequence of the habits of family members who select into these networks (Szostek 2018a, 70). Individuals are exposed to international propaganda while using social media for non-political purposes, or when their preferred news organizations pick up stories spread by foreign networks (Budak, Agrawal and Abbadi 2011; Tucker et al. 2018). This means that current research may be understating the mechanisms by which foreign messages are reaching and influencing audiences (Youmans and Powers 2012). Analyzing the diffusion of fake and true news on Twitter, Vosoughi, Roy and Aral discover that fake news “spreads further, faster, deeper, and more broadly than the truth

in all categories of information, and the effects were more pronounced for false political news” (Vosoughi, Roy and Aral 2018, 1156). Misinformation spreads because people have difficulty judging the accuracy of information, and consumers derive utility from biased news that confirms their political priors (Allcott and Gentzkow 2017, 212). While many had hoped that the internet’s ability to provide quick and easy access to information would improve transparency and accountability, it is becoming clear that “excessive data do not enlighten the reader or the listener; they drown him” (Ellul 1965, 87). This “deluge of information, while it may prompt some awareness, does not really instruct, but simply overwhelms, ultimately disempowering the public” (Cunningham 2002, 105). While some foreign propaganda may appear absurd or unsophisticated (Madrigal 2018), others show that the behavioral targeting and identity resolution technologies associated with modern-day propaganda tools are highly sophisticated – with the ability to reach highly specialized populations (Albright 2017b). Propagandists rely on both traditional and social media to “create the appearance of multiple voices and points of view, masking a coordinated approach” (Nimmo 2018). Since individuals are more prone to accept information when it is repeated frequently and by multiple sources, propagandists create the illusion that a news story is coming from multiple channels (Centola and Macy 2007). Through this mechanism, propagandists can fake the “multiple source” heuristic that individuals use to judge the quality of information (Qiu et al. 2017, 2). Computational propaganda scholars have highlighted ways that propagandists can exploit social media and search algorithms to promote their political narratives without having audiences directly select into foreign networks.
In 2016, Russian propaganda reportedly reached over 126 million American citizens through Facebook alone, in addition to 131,000 messages on Twitter and over 1,000 videos on YouTube. The Internet Research Agency (IRA), a Russian company linked to the Kremlin, “had posted roughly 80,000 pieces of divisive content that was shown to about 29 million people between January 2015 and August 2017. Those posts were then liked, shared and followed by others, spreading the messages to tens of millions more people” (Isaac and Wakabayashi 2017). PropOrNot, an anonymous organization that tracks disinformation, identified over 200 websites that promoted Russian propaganda during the 2016 U.S. presidential election and reached over 15 million Americans more than 213 million times on Facebook (Spangher et al. 2018, 2). Facebook disclosed a list of 129 events promoted by Russian-sponsored accounts that drew the attention of over 340,000 people, indicating that foreign propagandists are interested not only in changing political attitudes but also in shaping political behavior (Timberg 2016). A report on Russian disinformation released in December 2018 found that on Facebook alone “IRA posts were shared by users just under 31 million times, liked almost 39 million times, reacted to with emojis almost 5.4 million times, and engaged sufficient users to generate almost 3.5 million comments” (Howard et al. 2018, 6). Analysis of Twitter accounts linked to the IRA found that “one common thread in the U.S.-focused news accounts is that they often posted regular local coverage and links to legitimate sources in order to build trustworthiness among followers” (Hanlon and Bennett 2018). Russian propagandists believed local news outlets to be more credible among American audiences (Roescher et al. 2018). The report found that “the accounts, some of which gathered nearly 20,000 followers, didn’t purposely spread false news and instead shared credible local news stories without any particular slant” (Mikelionis 2018). Analysis from both government agencies and private organizations shows that the intensity of the Russian-sponsored disinformation campaign was highest during the final stretch of the 2016 presidential campaign and concentrated in swing states (Kim et al. 2018).
Notably, this does not account for people who were indirectly exposed to Russian propaganda via platforms like Reddit, Tumblr, Instagram, and Pinterest (Orr 2018). Instagram was a major distributor and re-distributor of Russian propaganda, relying on strategic

meme engagement to reach minority segments of the population (Albright 2017a). Reports from Oxford University’s Computational Propaganda Project found that Russian propagandists used Instagram to target U.S. veterans and military personnel. These specialized social media accounts helped spread misinformation about national security issues and attracted more followers than real veterans organizations like Vietnam Veterans of America (Gallacher et al. 2017). Google has also been tied to Russian propaganda campaigns. After an international team of investigators blamed Russia for the downing of Malaysia Airlines Flight 17 (MH17), stories from RT and Sputnik tried to deflect responsibility by spreading conspiracy theories about the Ukrainian government, many of which appeared on Google’s front page (Hanlon 2018a). Ordinary individuals were also instrumental in spreading Russian propaganda to their fellow citizens by interacting with bots that spread disinformation (Yevgeniy, Hartmann and Adler-Nissen 2018). Overall, people’s growing reliance on social media for political news, in conjunction with countries’ active efforts to boost the popularity of particular posts with bots, makes exposure to foreign propaganda more likely than previously thought (Newman et al. 2016). This is important since individuals tend to forget the source they used when they discover news while browsing social media (Kalogeropoulos, Fletcher and Nielsen 2018). As argued by Tucker et al. (2017):

Attention-hacking techniques that authoritarian regimes have used, such as clickbait and manipulated search results, benefit immensely from rapid diffusion. This process may gain strength from users’ accidental (as opposed to selective) exposure to content shared via social media. Such content, even if it is out of line with users’ beliefs, will in at least some cases rouse their curiosity when otherwise they might never have looked into the topic (54).

In many cases, people do not know they are being exposed to information from state-funded networks. Investigative journalists have recently found that the Kremlin is adept at creating new online media companies to promote its preferred narratives to foreign audiences. According to Hanlon and Morley (2019), “Russia’s state-controlled propaganda network is targeting young, often left-leaning Western consumers with slickly-produced misinformation packaged as meme-able satire and no-nonsense takes on history, environmental issues, and sensitive global politics.” Notably, there is evidence that Russian trolls are still active and preparing to influence future elections (Im et al. 2019, 1). In addition to social media, traditional news outlets occasionally pick up narratives from biased networks, thereby increasing their reach beyond those who simply select into consuming foreign media. Kohei Watanabe finds that Russian narratives about the Ukraine conflict were published on popular online news sites such as Yahoo News and Huffington Post (Watanabe 2017). The Guardian uncovered that Russian propaganda was cited over 80 times by British press agencies in 2017 (Hern, Duncan and Creamer 2018). In Europe, far-right and far-left parties often promote pro-Russian narratives in their media outlets (Hegedűs 2016). For example, Hungary’s Fidesz party often quoted RT and Sputnik prior to the April 2018 parliamentary election (Bayer and Plucinska 2018). In Slovakia, former Prime Minister Jan Carnogursky promoted pro-Russian narratives and accused those who take an anti-Russian stance of being “Russophobes” (Rohac 2015). Tricking mainstream media organizations into amplifying their messages is a major goal of propagandists since this allows their strategic narratives to reach a broader audience (Wardle and Derakhshan 2017, 3).
Jonathan Albright, a data journalist who studies “micro-propaganda,” found that fringe propaganda sites rely on mainstream networks to propagate their narratives given those networks’ centrality in information ecosystems (Albright 2017a). In the United States, Matt Drudge’s link-aggregation site Drudge Report, which received over 1.4 billion views a month during the 2016 U.S. presidential campaign, had

multiple links to RT and Sputnik (Bump 2017). Russians also posted comments on popular U.S. news sites to polarize public opinion and promote misinformation (Anspach and Carlson 2018; Seddon 2018). The conspiracy site InfoWars, which has not only a sizable American audience but also the admiration of President Trump, posted over 1,000 articles from RT from 2014 to 2017 (Lytvynenko 2017). News outlets like BuzzFeed, the Washington Post, Vox, and the Miami Herald included tweets made by Russian trolls in their news coverage, inadvertently amplifying the Kremlin’s reach (Hanlon 2018b, 5). One notable example involved the Russian-controlled account under the name Jenna Abrams, which posted racist and controversial tweets that were seen by hundreds of thousands of individuals and were covered in the mainstream media (Hsu 2018). It is not surprising that most of the research has centered on the Kremlin’s strategic information campaigns, given Russia’s visible attempts to manipulate Western elections. However, Russia is not the only actor responsible for the growth of international propaganda. According to cybersecurity experts, “multiple foreign actors have demonstrated an ability and willingness to leverage these kinds of influence operations in pursuit of their geopolitical goals” (Timberg and Romm 2019). Citizen Lab at the University of Toronto discovered that the Iranian government published over 136 fake stories concerning Saudi Arabia, the United States, and Israel that were made to look like genuine news articles from outlets like the Guardian, Bloomberg, Al Jazeera, the Independent, the Atlantic, and Politico (Silverman 2019). This network of inauthentic websites and social media accounts practiced a strategy of ephemeral disinformation. Put simply, when content “achieves social media traction, it is deleted and the links are redirected to the domain being impersonated.
This technique creates an appearance of legitimacy, while obscuring the origin of the false narrative” (Lim et al. 2019). In sum, foreign actors use mainstream media sources to disseminate their propaganda and manipulate local elites into spreading disinformation. Many speculate that these highly

targeted and strategically timed international propaganda campaigns are subverting democratic processes (Jamieson 2018; Hunt 2018). Yet, we lack empirical evidence on international propaganda’s effect on individual political behavior. Do computational propaganda and targeted disinformation mark a new era of propaganda, or are we over-inflating the threat of foreign propaganda (Lanoszka 2019)? When actually confronted with foreign propaganda, do audiences change their attitudes?

Is International Propaganda Effective?

“Propaganda manifests itself not simply in the content of particular beliefs or skewed attitudes but more radically in the impairment of the mind...the public no longer cares to distinguish between reality and television-induced pseudo-reality” (Cunningham 2002, 108).

Despite a concerted effort to counter international propaganda, we lack evidence on when foreign messages influence mass attitudes (Allport and Simpson 1946; Barghoorn 1964; Lorimor and Dunn 1968; Martin 1971; Nagorski 1971; White 1952). This is troubling since the main reason most scholars study international propaganda is that they assume foreign states are manipulating information in other countries in ways that sway important political outcomes, harm the foreign policy interests of the state, and undermine the health of democratic politics. Yet, evidence of the influence of international propaganda on mass attitudes or political behavior is lacking (Badawy, Ferrara and Lerman 2018; Gerber and Zavisca 2016; Helmus et al. 2018; Herpen 2015; Peisakhin and Rozenas 2018). A growing body of research shows that foreign actors can significantly shape public opinion (DellaVigna et al. 2014; Dragojlovic 2015; Grieco et al. 2011; Guardino and Hayes 2017; Murray 2014). Yet, not all individuals are equally receptive to foreign cues. Citizens’ political priors, levels of political awareness, and personalities all shape the influence of transnational persuasion (Zaller 1992). For instance, Grieco et al. (2011) find that endorsements from the UN Security Council and NATO allies can increase support for the use of force among Americans with lower levels of support for the US president. Dragojlovic (2015) shows that Democrats are more receptive to French and British sources than to an American cue. Finally, Guardino and Hayes (2017) demonstrate that international actors can be more persuasive than domestic cues for Democrats. Others argue that foreign cues are ineffective or even counter-productive (Crabtree, Darmofal and Kern 2015; Kern and Hainmueller 2009). Jacques Ellul noted that international communication has little influence on foreign public opinion (Ellul 1965, 296). L. John Martin argues that “if our persuasive communication ends up with a new positive effect we must attribute it to luck, not science. The propagandist cannot control the direction or the intensity or impact of his message, if, indeed he reaches his target at all” (Martin 1971, 70). Noting an in-group bias to political persuasion (DeMarzo, Vayanos and Zwiebel 2003), some argue that foreign messages will be discounted by audiences or provoke backlash effects (Ashmore et al. 1979, 132; Page, Shapiro and Dempsey 1987, 32). For example, exploiting signal variation in Western media in East Germany, Kern and Hainmueller (2009) find that the availability of Western media helped increase satisfaction with the Communist regime. Relying on an online survey experiment in China, Huang and Yeh (2017) show that “reading relatively positive foreign media content about foreign countries can improve rather than worsen the domestic evaluations of citizens who self-select such content” – demonstrating that foreign content can have unforeseen effects based on how information aligns with individuals’ prior beliefs (1). These are not new insights. Decades ago, the CIA estimated that the Soviets spent over $4 billion per year on strategic information operations meant to undermine U.S. credibility and promote Soviet foreign policy interests. Despite Soviet efforts to mobilize the peace movement within the United States in the 1980s, and the proliferation of thousands of documented forgeries, the FBI uncovered little evidence that American policy makers were swayed by these efforts (O’Brien 1989, 40).
Generally, “Russian propaganda rarely had a substantial impact inside the United States because credible media outlets usually weeded out fake stories and false headlines, preempting them from reaching large audiences” (Deeks, McCubbin and Poplin 2017). Studies of Western media in the Soviet Union came to similar conclusions. Pro-Western networks were generally not viewed as decisive

in ending communism. Former KGB officer Oleg Kalugin argues that the “average Soviet listener was not affected by Western broadcasts” (Danielson 2004, 35). Put simply, a great deal of existing research suggests that the threat of international propaganda may be exaggerated and the influence of foreign messages may be overstated (Little 2018; Nyhan 2018).

Reconceptualizing Effective Propaganda

Why would countries continue to invest resources into these programs if they were ineffective? The potential for backlash from other governments – who feel that strategic information operations are an act of aggression – should force states to cease propaganda operations if such operations had no influence whatsoever (Tomz and Weeks 2019). Others argue that the true effects of international disinformation campaigns are domestic. Lanoszka (2019) argues that “domestic society – not an enemy society abroad – may be the true target audience of the disinforming state. Broadcasting preferred ideologies and counter-narratives abroad through various media agencies or state organs can be a sign that the state is engaging in resistance against others that it sees as dominating the international system” (14). Others focus on the subtle influence of propaganda on mass public opinion (Pratkanis and Aronson 2001). While it is difficult to definitively demonstrate the impact of any particular message, international propaganda campaigns may have hidden effects on politics (Davison 1983; Ellul 1965). Vladislav Bittman, a former intelligence officer specializing in disinformation for the Czech Intelligence Service, says that while the effects of Soviet propaganda may not be statistically measurable, it “contributed to anti-American psychosis around the world” (Bittman 1985, 55). Joseph Gordon, an expert on Soviet psychological operations, claimed that while “it is difficult to measure the effects of clandestine broadcasting in any of the target regimes, there appears to at least be some official concern about potential influence”

(Gordon 1988, 12). Anatoliy M. Golitsyn, a KGB defector and expert on disinformation tactics, suggested that during the Cold War, Soviet propaganda may have caused “Western adversaries to contribute unwittingly to the achievement of communist objectives” (Golitsyn 1990, 5). Reviews of U.S. intelligence assessments show that “while it is difficult to assess the effectiveness of active measures, U.S. intelligence agencies concluded that many campaigns were effective in influencing target audiences” (Jones 2018, 4). All of these statements allude to the subtle effects of international propaganda. I argue that existing work on international propaganda often has a limited perspective on what constitutes successful influence. This is because there is confusion over the proper dependent variables in studies on foreign broadcasting and public diplomacy. According to Gregory (2016), “definitions such as ‘winning hearts and minds’ or ‘a government’s engagement with people’ are problematic” while “low levels of abstraction can oversimplify” (7). Traditionally, most scholars focus on how soft power – the ability to attract and persuade – complements countries’ hard power capabilities during periods of conflict (Nye 2004). They study how public diplomacy or media diplomacy campaigns can boost favorable attitudes toward the communicating country, and how those soft power resources translate into tangible military advantages (Goldsmith and Horiuchi 2009; Schatz and Levine 2010). Even here, there are debates about whether attitudes toward the country itself, the country’s foreign policy, or the country’s leader matter most (Balmas 2018; Goldsmith and Horiuchi 2012). This soft power approach centers on whether messages from a communicating country have direct effects on attitudes toward that country, its leaders, and its foreign policy agenda. From this perspective it appears that Russia’s propaganda campaign is a failure.
Avgerinos (2009) argues that “the Kremlin’s inability to execute effective media campaigns further agitates anti-Russian prejudices in the West and hinders the country’s efforts to improve its

international reputation” (115). Public opinion polls consistently show drops in favorability toward Russia following 2014 – when the Kremlin annexed Crimea from Ukraine (Letterman 2018). There is some indication that certain groups, particularly supporters of far-right parties in Europe, have grown more pro-Putin over time (Taylor 2017). Yet, it is not clear whether this translates into more support for Russian foreign policy objectives, or how stable this boost amongst populist parties really is (Fisher 2020). Notably, perceptions of Russia’s importance in international politics have grown over time – demonstrating that simply looking at favorable attitudes toward a country’s leader or its foreign policy may be an insufficient measure of its international influence (Letterman 2018). While propaganda campaigns may not boost favorable attitudes toward their disseminator, they can shift public opinion in other, subtler ways.

Undermining Rivals

Rather than trying to improve an autocrat’s reputation, international propaganda may be more effective at fostering the perception of equivalence between democratic and autocratic states – highlighting rival states’ shortcomings to demonstrate that nominally democratic states look as dysfunctional as, if not more so than, less competitive regimes (Rawnsley 2015, 275). Countries constantly try to contest their rivals’ framing of political events to shift public opinion against their adversaries (Ayalon, Popovich and Yarchi 2016; Sheafer and Gabay 2009). Especially during periods of conflict, states try to portray adversaries as barbaric or destructive (Alexander, Levin and Henry 2005, 31). These images, in turn, shape policy preferences (Herrmann and Fischerkeller 1995), and help create permissive conditions for a country’s interventionist initiatives (Dutta-Bergman 2006, 112-113). Russia (RT), China (CGTN), Qatar (Al Jazeera), and Saudi Arabia (Al Arabiya) increasingly rely on satellite television news and social media platforms to challenge rival states’ broadcasting capabilities (Golan, Manor and Arceneaux 2019, 4). They do not just

try to improve their own image; they increasingly try to sow doubt and undermine the legitimacy of target governments (Walker 2018, 19). This can be an effective strategy since negative coverage of opponents can decrease affect for the target of the message (Lau, Sigelman and Rovner 2007, 1182). Prior research, based on observational data, contends that negative coverage of a foreign country can drastically influence public opinion about that nation (Wanta, Golan and Lee 2004; Zhang and Meadows 2012). Consequently, it may be inappropriate to evaluate countries’ international influence by looking only at favorable public opinion (Avgerinos 2009). While the focus on sympathetic media coverage as a prerequisite for achieving foreign policy goals has dominated public diplomacy studies, it is equally important to examine how states can use propaganda to vilify their rivals (Sheafer and Shenhav 2009, 275). Moreover, if citizens recognize that a conflict exists between foreign nations, denigrating the rival nation’s reputation may be an effective method to improve one’s own image (Brewer 2006). Especially in the realm of foreign policy, where individuals often do not have developed policy preferences, exposure to even small amounts of foreign propaganda can shift political attitudes (Entman 2004; Holsti 2009).

Promoting Conspiracies and Cynicism

Some argue that an “overreliance on the soft-power paradigm has bred analytical complacency regarding the growth of authoritarian influence” (Walker 2018, 18). Put differently, international propaganda is not simply about increasing favorability toward the communicating country or even undermining support for rival countries. In earlier studies, Ellul argues that propaganda is not solely used to create favorable impressions of a person, policy, or state, but that it can also “destroy the group, break it up – for example, by stimulating contradictions between feelings of justice and of loyalty, by destroying confidence in accustomed sources of information, by modifying standards of judgement,

by exaggerating crisis and conflict, or by setting groups against each other” (Ellul 1965, 190). Consequently, instead of soft power, we should focus on sharp power, a term used to describe actions taken by governments to “pierce, penetrate, or perforate the political and information environments of targeted countries” (Nye 2018). Moving beyond a framework where propaganda is used to manufacture consent (Herman and Chomsky 1988), we should also pay attention to how propagandists sow confusion, apathy, and cynicism in individuals (Huang 2018, 1034; Shao and Liu 2018, 15). These outcomes, which are frequently cited as central to propaganda, are often neglected by empirical political science research (Ellul 1965). In other words, propaganda may not improve the communicator’s reputation, but it may be able to increase belief in conspiracies and exacerbate political cynicism. Given the centrality of populist frames and conspiracy theories in Russian media, exposure to Russian state-sponsored propaganda can promote a more conspiratorial view of the world (Yablokov 2015). Oliver and Wood (2014) argue that “most people will only express conspiracist beliefs after they encounter a conspiratorial narrative” (955). Russian networks like RT and Sputnik amplify extremist views, portraying politics as inherently corrupt with elite conspiracies around every corner (Pomerantsev 2015). The simultaneous adoption of conspiracies from both “left- and right-wing critics of the US gives RT leeway to adapt its narratives in relation to different audiences, thereby expanding its global influence” (Yablokov 2015, 6). Close observers of Russian foreign propaganda also contend that the Kremlin tries to increase political cynicism in order to decrease support for liberal democracy. Soviet-born British journalist Peter Pomerantsev best explains this perspective: “The underlying goal of the Kremlin’s propaganda is to engender cynicism in the population.
Cynicism is useful to the state: When people stop trusting any institutions or having any firmly held values, they can easily accept a conspiratorial vision of the world” (Pomerantsev 2015, 42).

While Pomerantsev (2015) is writing about Russian domestic propaganda, the Kremlin is exporting its strategic information campaigns to influence Western audiences. Specifically, Howard et al. (2018) argue that Russian propaganda during the 2016 presidential campaign was aimed at fostering cynicism (32). Existing research presents mixed findings on when media can make citizens more cynical, and when cynicism influences democratic outcomes like voting and political participation (Boukes and Boomgaarden 2015; Cappella and Jamieson 1997; Dancey 2012; Elenbaas and Vreese 2008; Erber and Lau 1990; Hanson et al. 2010; Jackson 2011; Koch 2003; Rijkhoff 2015; Shao and Liu 2018; Shehata 2014; Valentino, Beckmann and Buhr 2001; Vreese and Semetko 2002; Vreese 2005). Scholars debating the influence of foreign propaganda on Western countries often do not articulate the relationship between international propaganda and cynicism (Benkler 2018; Jamieson 2018). Additionally, while existing studies demonstrate the prevalence of conspiracy theories in international propaganda networks, they do not analyze their influence on public opinion (Yablokov 2015). Given media attention to international propaganda campaigns, it is critical to analyze whether foreign messages can boost belief in conspiracy theories and increase political cynicism.

The Power of Perceptions

So far, I have discussed how propaganda directly affects public opinion – whether by creating less favorable attitudes toward adversaries, promoting conspiracy theories, or increasing political cynicism. Yet, propaganda does not have to be believed, or even consumed, in order to have social effects. Specifically, the simple presence of propaganda can have a third-person effect. In a seminal piece, W. Phillips Davison suggests that: (i) “people will tend to overestimate the influence that mass communications have on the attitudes and behavior of others,” and (ii) “the impact that they expect this communication to have on others may lead them to take some action” (Davison 1983, 3). International propaganda

may not persuade individuals, but it can cause people to think that “others” are persuaded – leading to changes in their attitudes and behaviors (Gunther 1991; Gunther and Storey 2003; Sun, Shen and Pan 2008; Wei, Chia and Lo 2011). For instance, some argue that while American broadcasting had minimal effects on Soviet public opinion, it ironically had the largest impact on Soviet elites, who realized that their system needed major change, triggering reforms that led to the dissolution of the Soviet Union (Danielson 2004, 35). Propagandists may be less concerned with changing people’s minds or behavior, and more concerned with creating the illusion that propaganda is effective on other people. Through this indirect mechanism, international propaganda can affect democratic legitimacy (Levin 2016; Tomz and Weeks 2019). While not all foreign propaganda is aimed at swaying electoral outcomes, strategic information campaigns conducted during an electoral cycle can cast doubt over the final outcome. If people think that their fellow citizens were influenced by foreign propaganda, they may consider the election rigged and therefore illegitimate – particularly if their preferred candidate did not win (Rojas 2010). If propagandists can create the perception that foreign influence played a large role in an electoral outcome, they can cast doubt on the legitimacy of the democratic process. Additionally, the perception that others are overly receptive to harmful foreign propaganda can lead to greater support for elite control over mass media to protect vulnerable groups (Rucinski and Salmon 1990, 345). Prior work finds that the presumed influence of foreign messages is associated with greater support for restrictive action (Golan and Lim 2016).
If exposure to propaganda can indirectly create pro-censorship attitudes – via increasing the perceived influence of propaganda on others – this highlights a mechanism by which propaganda can shift political outcomes without directly influencing a large group of people. The potential for international disinformation to shape our politics has led a flurry of government and civil-society organizations to dedicate resources to combating the threat of international disinformation.

Countering Propaganda

In January 2017, a report by the Office of the Director of National Intelligence (ODNI) and the Department of Homeland Security (DHS) stated that the Kremlin attempted to undermine the U.S. election through a series of active measures. A year later, the United States Congress released a report titled "Putin's Asymmetric Assault on Democracy in Russia and Europe: Implications for U.S. National Security," which details Russian attempts to sway domestic politics in Western countries. Government officials and private organizations are scrambling to counter the reach of digital propaganda meant to manipulate democratic processes. These proposals include producing counter-narratives, educating citizens about misinformation, and altering transparency standards for social media (Hellman and Wagnsson 2017). Others propose changing social media algorithms to screen out disinformation, creating new fact-checking organizations to promote thoughtful deliberation of information, revising the legal and market infrastructures that allow foreign governments to reach audiences on social media networks, and even promoting outright censorship (Kim et al. 2018; Marwick and Lewis 2017; Faris et al. 2017). I group existing initiatives into two categories: defensive responses and offensive responses.

Defensive Responses

Governments are primarily concerned with stopping foreign propaganda in their own countries (Bjola 2018). Experts contend that warning people about propaganda prior to exposure can decrease citizens' receptivity to disinformation (Paul and Matthews 2016, p. 9). They are dedicating extensive resources to exposing and shaming propagandistic networks in the hope that alerting citizens to the threat of foreign propaganda attenuates its influence (Kiesler and Kiesler 1964; McGuire and Papageorgis 1962). Educating citizens

to be more aware of the origin of their political information is a major point of emphasis (McGeehan 2018, 56). For instance, in Ukraine, from 2015-2016, IREX (International Research and Exchanges Board), an international development non-governmental organization, implemented a media literacy training program called Learn to Discern (L2D). This program trained Ukrainian citizens to critically assess news media messages and identify misinformation. Researchers found that L2D helped participants spot disinformation and bolstered their media literacy (Murrock et al. 2018, 53-54). In the United States, some argue that "national regulators certainly can do more to inform media audiences about the ownership of external media – for example, the fact that RT is owned by TV-Novosti and hence the Russian state" (Surowiec 2017, 26). In March 2018, Reps. Seth Moulton (D-Mass.) and Elise Stefanik (R-N.Y.) introduced a bipartisan bill doing just that. The Countering Foreign Propaganda Act would "guarantee press freedom while also guaranteeing that Americans are made aware of exactly who is funding the TV they're watching. It would do this by empowering the Federal Communications Commission to let Americans know when they are watching foreign propaganda" (Freeman 2018). In Europe, the European Parliament voted on a resolution to brand the Kremlin-funded foreign broadcasting networks Russia Today (RT) and Sputnik as "hostile propaganda." Labeling a network as a propaganda outlet is meant to dissuade consumption and provide the government a legal basis for censoring foreign-sponsored media. Since individuals often forget the source of the information when exposed to political information on social media (Kalogeropoulos, Fletcher and Nielsen 2018), experts are concerned individuals may be less motivated to discount propagandistic messages (Prior 2013, 108).
Consequently, governments and tech firms are pushing for information transparency, especially when it comes to foreign broadcasting networks on social media (Allen-Ebrahimian, Groll and Gramer 2016). One clear example of this is YouTube adding state-funding disclaimers to over a dozen Russian channels to alert viewers to the videos' source

(Dave and Bing 2019). Awareness campaigns rely on the assumption that information from foreign states will not be trusted (Avgerinos 2009, 126; Hayes and Guardino 2011, 832). Existing work has shown that "when participants were forewarned of the intent of the speakers persuasion was inhibited" (Pornpitakpan 2004, 246). Revealing the source can signal the communicating actors' intentions and make audiences less persuadable (Friestad and Wright 1994, 18). If this is true, countering foreign cues should be as simple as informing individuals that the information they are receiving comes from a hostile foreign actor. Yet, despite this push for making citizens more aware of the information they are consuming, there is little evidence that these initiatives make any difference (Farwell 2018). In fact, inoculation programs may also have unintended consequences. Priming individuals to think about propaganda can elicit feelings of distrust toward their own governments (Clayton et al. 2019). Alerting citizens to the threat of foreign propaganda can make people more likely to perceive politics as a game of elites trying to manipulate the people's will, generating more cynical attitudes (Cappella and Jamieson 1997). When Facebook tried to flag fake stories, "users who wanted to believe the false stories had their fevers ignited and they actually shared them more" (Constine 2018). Research on media coverage of terrorism has noted that excessive warnings about an international threat can lead to greater cynicism and a boomerang effect (Whitaker 2012, 70). People also may react unfavorably to actors' attempts to educate them about propaganda because they view such efforts as scaremongering (Roese and Sande 1993).
By responding to foreign propaganda, government officials may simply be drawing more attention to these foreign networks and inadvertently creating the impression that all media is propagandistic – feeding into existing distrust toward democratic institutions (Fisher 2019b). Consequently, it is clear that we need more research on when defensive actions against foreign propaganda are effective and when they draw unnecessary attention to foreign networks.

Offensive Responses

Some have suggested that defensive proposals do not address the problem at its root. Using external media to reach foreign audiences is not a new phenomenon. After World War II, Voice of America and Radio Free Europe broadcast behind the Iron Curtain to challenge Communist control (Uttaro 1982), and the U.S. Congress later created Radio Marti and TV Marti to target Fidel Castro's Cuba (Hall 2017, 8). More recently, the U.S. launched Alhurra to provide audiences in the Middle East and North Africa a US-friendly perspective on news and current events (Doaui 2014). All these initiatives were meant to promote more democratic outcomes in foreign countries by breaking through autocratic states' monopoly on political information (Risso 2013). When local journalists are unable to hold the government accountable for its actions, international actors promote alternative narratives in non-democratic states via traditional and digital media (Besley and Prat 2006, 720). Those who champion the internet as a liberation technology argue that access to competing information allows citizens to evaluate their government through an alternative lens (mirror-holding), while giving them a glimpse into how citizens live in more democratic countries (window-opening) (Bailard 2012, 333). Some argue new communication technologies make it easier to spread information to foreign audiences, which can lead to more democratic outcomes (Tucker et al. 2017, 48-50). By weaponizing social media against non-democratic states, Western governments believe they can counteract autocrats' control over political information, decrease satisfaction with the government's performance, and increase demand for democracy (Ausderan 2014; Huang 2015a; Kuran 1991). Policymakers argue that the "United States needs to adopt a proactive, offensive campaign to coerce Russia to curb its information warfare efforts, punish Moscow when further

incidents occur, and exploit Russian weaknesses and vulnerabilities" (Jones 2018, 1). Former Chairman of the House Foreign Affairs Committee Ed Royce questions why the United States does "not go on the offense to release information exposing corruption at the Kremlin" (Keeley 2018). Specifically, some think that "a 24/7 direct broadcast satellite news service could expose corruption, nepotism, and incompetence that Russians already suspect" (Farwell 2018, 45). The U.S. has already launched Current Time, an online Russian-language network whose purpose is to "serve as a reality check" for Russian audiences (CBS 2017). The European Union has dedicated tens of millions of euros to creating "free media" in post-Soviet countries (Szostek 2015). Yet, not everyone is optimistic about the effectiveness of these initiatives. There are reports that "anti-Russia messaging could backfire in the region, given local skepticism of Western propaganda" (Helmus et al. 2018, xii). Gerber echoes this sentiment, arguing that "fear of the menace that foreign powers pose to Russia is the aspect of Russian nationalism that is consistently linked to support for the Putin regime, its policies, and messages" (Gerber 2014, 128). External communication can help entrench support for non-democratic incumbents (Kern and Hainmueller 2009). Put simply, there is the possibility that outside guidance may be perceived as illegitimate and backfire against the communicating country (Allport and Simpson 1946; Ellul 1965). Historically, researchers have shown that foreign broadcasting is generally ineffective (Allport and Simpson 1946; Danielson 2004). There are several reasons to believe that individuals in non-competitive regimes will not be receptive to politically sensitive information (Hutchings 2018). Contrary to some accounts, Western media is no longer regarded as 'forbidden fruit' in non-democratic states (Nagorski 1971, 138).
Citizens in modern autocracies often have access to a substantial range of alternative perspectives (Geddes and Zaller 1989; Robertson 2017; Szostek 2017), meaning that individuals who support non-democratic regimes are not simply naive, brainwashed, or unable to consider alternative perspectives (Mikkonen 2010, 780). Access to information damaging to the regime does not mean that all citizens will access this information, or that they grow more anti-government after exposure (Chen and Yang 2018). Rather than assuming that individuals in autocratic states crave foreign perspectives or critical coverage of the regime, one should consider the factors that make individuals more or less receptive to external propaganda, when international criticism is viewed as legitimate, and the conditions under which foreign messages lead to unexpected outcomes (Dragojlovic 2015; Robertson 2017). Criticism directed toward a country needs to take into account a range of factors to be received in an open-minded fashion (Hornsey, Trembath and Gunthorpe 2004, 500).

Conclusion

In sum, I argue that there are three different ways that international propaganda can influence mass public opinion. The soft power effect of propaganda refers to how messages improve views of the communicating country and its leaders (Nye 2004). The sharp power perspective examines when propaganda undermines rival states, bolsters political cynicism, and increases belief in conspiracy theories (Walker 2018). Finally, the third-person power effect examines how people's perceptions of propaganda's effectiveness shape their political attitudes and behaviors, regardless of propaganda's "direct" influence (Gunther 1991). By broadening our conceptualization of "effective propaganda," we can obtain a more holistic understanding of the influence of propaganda on our society and begin to think about how and whether we should counter this threat. In the remaining chapters, I use a series of survey experiments to assess the influence of foreign propaganda, evaluate the effect of inoculation messages, and analyze the impact of offensive counter-propaganda programs in the United States. In Chapter 4, I test how exposure to Russian propaganda can improve Russia's image and denigrate the Kremlin's rivals. In Chapter 5, I assess whether the Kremlin's use of strategic and populist news frames increases belief in conspiracy theories and promotes political cynicism. In Chapter 6, I examine whether people tend to overestimate the effect of external messages, and whether the perception of propaganda's effectiveness on others increases their support for media regulation and heightens the perception that the political system is illegitimate. In Chapter 7, I assess the efficacy of U.S. offensive counter-messaging in Russia. These studies analyze a broader set of outcome measures, allowing me to assess the subtle ways international propaganda may be affecting our democracy. Additionally, I evaluate the effectiveness of several counter-propaganda initiatives, providing empirical evidence about when policymakers' attempts to combat propaganda are effective and when they backfire.

Demonizing the Enemy

“Even when enemy propaganda fails to nudge us in the wrong direction, our

ignorance nonetheless leads us away from the proper course.” - J.R. Nyquist

Introduction

This chapter assesses when states can influence public opinion about an adversary. In other words, can "State A" use propaganda in "State B" to ruin the reputation of "State C"? By generating hostility (or at the very least apathy) toward a rival country, the propaganda-sending state has more latitude when conducting its international relations. This is why propaganda-sending states spend time and resources spreading negative information about their rivals. Additionally, does warning people about the aims of the propaganda-sending country make these negative media campaigns ineffective? One might expect that governments and civil society organizations can simply invest more resources into ensuring people are aware of these propaganda campaigns. In theory, by forewarning one's population about foreign propaganda campaigns, governments can inoculate their citizenry against propaganda's influence. I analyze the influence of Russian international broadcasting on American audiences, focusing on the topic of Ukraine. I expose subgroups of Americans to an article from Russia Today (RT) that criticizes the Ukrainian government. I vary whether audiences are aware of the message source, and/or the intentions, of the Russian-funded network in order to assess how bringing attention to a foreign network's nationality influences its ability to persuade audiences. This is similar to the strategy adopted by technology firms like YouTube that have begun to add disclaimers to videos from state-sponsored media outlets (Dave and Bing 2019). I test whether making individuals aware that they are receiving information from a foreign government attenuates a message's influence – testing the effectiveness of

one current inoculation strategy against propaganda (Banas and Rains 2010; Kiesler and Kiesler 1964; McGuire and Papageorgis 1962). While foreign actors spend considerable resources trying to denigrate target states' reputations and legitimacy, we do not know what effect these initiatives have on public opinion. In order to counter foreign disinformation campaigns, we first need to know how people react when they are actually exposed to these types of messages and what influence our warning messages have on their receptivity.

Research Design

The news article that I use for my treatment is representative of a specific type of propaganda particularly relevant to public diplomacy and propaganda studies. State and non-state actors often try to portray their opponents as targeting innocent civilians due to long-standing norms against harming non-combatants (Honig and Reichard 2018, 298). Previous work finds that one prominent frame the Kremlin used during the Ukraine conflict was portraying themselves as "defending humanitarian interests and respecting state sovereignty" (Golan, Manor and Arceneaux 2019). Consequently, I use a real RT story about Ukrainian human rights violations because it is: (i) a critical narrative in the Kremlin's foreign propaganda campaign; and (ii) a good representation of Russian 'whataboutism' (Hutchings and Szostek 2015; Yablokov 2015; Zannettou et al. 2019). Whataboutism is a propaganda technique – frequently employed during the Cold War – "which occurs when officials implicated in wrongdoing whip out a counter-example of a similar abuse from the accusing country, with the goal of undermining the legitimacy of the criticism itself" (Khazan 2013). The vignette that I used in the experiment is included below:

Since February 2014, Russia and Ukraine have been engaged in a conflict in Eastern Ukraine.

According to Russia Today (RT), a Russian government-sponsored foreign

media network whose purpose is to spread a pro-Russian message to Western audiences,4 an investigation into the conflict in Ukraine has indicated that the pro-Ukrainian forces committed crimes against humanity in Eastern Ukraine.

Last year, a prominent foreign government released a major report on human rights violations, abuse of law, torture, inhuman treatment and other crimes perpetrated by the Ukrainian military against the civilian population in eastern Ukraine. The ministry has twice updated it since its initial release and says the facts described in this document have been confirmed by international rights groups, such as Human Rights Watch.

I recruited participants through TurkPrime, a popular crowd-sourcing website for enlisting participants to perform particular tasks on Amazon's Mechanical Turk (MTurk). 885 participants successfully completed the survey in October 2016. The sample was 55% female and had a mean age of 38. The sample is more liberal than the population, with 45% supporting the Democratic Party, 22% the Republican Party, and 33% identifying as Independents. The sample is also more educated, with over 55% having at least a college degree (Litman, Robinson and Abberbock 2017). Additional information about the sample can be found in Online Appendix A. Although MTurk samples are not representative of the general population, "MTurk respondents do not appear to differ fundamentally from population-based respondents in unmeasurable ways" (Levay, Freese and Druckman 2016, 1). Therefore, there is little reason to believe that the effect of the treatments would vary in a representative sample (Mullinix, Leeper and Druckman 2015). Moreover, while an online sample over-represents internet

4Italicized text is presented to the source group. Bold and italicized text is presented to the intentions group.

users, this is precisely the sample that is more likely to be exposed to foreign propaganda online. To ensure that participants were attentive and actually received the treatment, I included reading checks and removed participants who failed these tasks (Oppenheimer, Meyvis and Davidenko 2009).5 I randomly assigned participants to: (1) a Control group, where individuals only completed the post-treatment survey; (2) an Information group, where individuals read a short news post on Ukrainian human rights violations from Russia Today (RT) without revealing the source of the information; (3) a Source group, where individuals read the same news post and were told the message source; or (4) an Intentions group, where individuals read the same news post and were told more information about the intentions of the network (see Table 1). I test when exposure to negative information about an adversary can effectively lower foreign public opinion toward that adversary and how awareness of the message source moderates message persuasiveness.

Table 1: Research Design

Treatment     Description
Control       Completed post-treatment survey
Information   News post on Ukrainian human rights violations
Source        Source cue + news post on Ukrainian human rights violations
Intentions    Intention cue + news post on Ukrainian human rights violations
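The assignment into the four arms in Table 1 can be sketched as follows. This is a hedged illustration only: the participant IDs, the seed, and the use of simple (unblocked) randomization are my assumptions, not details reported in the study.

```python
import random

# The four experimental arms from Table 1.
ARMS = ["Control", "Information", "Source", "Intentions"]

def assign_arms(participant_ids, seed=2016):
    """Independently assign each participant to one of the four arms
    (simple randomization; arm sizes will vary slightly by chance)."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    return {pid: rng.choice(ARMS) for pid in participant_ids}

# 885 respondents completed the survey.
assignments = assign_arms(range(885))
print({arm: sum(1 for a in assignments.values() if a == arm) for arm in ARMS})
```

With simple randomization each arm receives roughly, but not exactly, a quarter of the sample; a shuffled block design would fix the arm sizes exactly.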

I examine the effects of foreign propaganda on several types of outcome measures. First, I assess how exposure to negative information – with and without the message source – influences favorability toward Ukraine and Russia, as well as attitudes toward the countries' respective foreign policies and leaders. While most studies on international influence measure favorability toward the country, Goldsmith and Horiuchi (2012) argue that it is citizens' foreign policy views that are "crucial for the country attempting to use soft power to

5Approximately 7% of participants (N=58) failed the reading check.

favorably affect policy outcomes" (556). Conversely, Balmas (2018) asserts that views on a country's leader are the most important factor in determining attitudes toward the country and its citizens. Consequently, it is important to assess how foreign messages shape attitudes across these different levels of analysis. These dependent variables range from "very unfavorable" (1) to "very favorable" (5). I also assess whether citizens draw pro-Russian policy conclusions following exposure to Russian propaganda, assessing their attitudes toward expanding sanctions on Russia and arming the Ukrainian government. These dependent variables range from "strongly oppose" (1) to "strongly support" (5). I am interested in whether exposure to Russian propaganda can lower positive affect toward Ukraine among U.S. audiences, and/or improve attitudes toward Russia. I also test whether knowing that the information comes from Russia mitigates these effects.
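Because the models in the next section are OLS with a categorical treatment and no controls, each treatment coefficient on these 1-5 scales reduces to a treatment-group mean minus the control-group mean. A minimal sketch with invented favorability scores (not the study's data), using a Welch-style standard error as a simple stand-in for robust standard errors:

```python
from statistics import mean

def treatment_effect(treated, control):
    """Difference in group means with a Welch-style standard error --
    numerically the OLS coefficient on a treatment dummy when the model
    has no other covariates."""
    def var_of_mean(xs):  # sampling variance of a group mean
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1) / len(xs)
    diff = mean(treated) - mean(control)
    se = (var_of_mean(treated) + var_of_mean(control)) ** 0.5
    return diff, se

# Invented favorability-toward-Ukraine scores (1 = very unfavorable, 5 = very favorable)
control_scores = [3, 4, 3, 4, 3, 4, 4, 3]
information_scores = [3, 2, 3, 3, 2, 3, 2, 3]

effect, se = treatment_effect(information_scores, control_scores)
print(f"effect = {effect:+.3f} (se = {se:.3f})")  # a negative shift on the 5-point scale
```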

Results

I use ordinary least squares with robust standard errors to estimate the effects of the treatments. First, I find that all the treatments significantly lower attitudes toward Ukraine and Ukraine's foreign policy (see Figure 1). Exposure to a news post about Ukrainian human rights violations decreases favorable views of Ukraine and the country's foreign policy by approximately 10 percent (a 0.5-point drop on a 5-point scale). Only individuals in the intentions group have less favorable attitudes toward Petro Poroshenko. While there are minor differences between outcome measures, the results are generally the same regardless of whether one asks about favorability toward the country, the country's foreign policy, or its leader.6 Overall, I show that countries can use foreign broadcasting to lower favorable affect toward rival countries, and these results are fairly consistent across different outcome measures.

6Many people have no opinion on countries' foreign policies and leaders. While only 3% of respondents did not express any opinion on Ukraine, over 17% had no opinion of Ukraine's foreign policy, and 30% said "don't know" when asked about Ukrainian President Petro Poroshenko.

Figure 1: Participants were asked their levels of favorability toward: Ukraine, Ukrainian foreign policy, Ukrainian President Petro Poroshenko, Russia, Russia's foreign policy, and Russian President Vladimir Putin. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.

Notably, I find no significant difference in the magnitude of the effect among the treatment groups, indicating that revealing the message source and providing information about the foreign network does not moderate its influence. To test differences between the treatment groups, I obtain the pairwise differences of the means of the dependent variables across the levels of the treatments and adjust the p-values and confidence intervals for multiple comparisons using Tukey's Honest Significant Difference test (see Figure 2).

Figure 2: I obtain the pairwise differences of the mean of favorability toward Ukraine across the levels of the treatments and adjust the p-values and confidence intervals for multiple comparisons using Tukey's Honest Significant Difference (HSD) test.
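The multiple-comparison procedure behind Figure 2 can be sketched as follows. The group scores are invented, and the studentized-range critical value `q_crit` is a placeholder (here roughly the tabled value for alpha = .05, k = 4 groups, and 16 error degrees of freedom); in practice q would come from a table or a statistics library:

```python
from itertools import combinations
from statistics import mean

# Hypothetical favorability-toward-Ukraine scores (1-5) by treatment arm.
groups = {
    "Control":     [3.4, 3.6, 3.1, 3.5, 3.3],
    "Information": [2.9, 3.0, 2.7, 3.1, 2.8],
    "Source":      [2.8, 3.1, 2.9, 2.7, 3.0],
    "Intentions":  [2.9, 2.8, 3.0, 2.6, 3.1],
}

k = len(groups)                       # number of arms
n = len(next(iter(groups.values())))  # per-arm n (balanced design assumed)
# Pooled within-group (error) mean square
mse = sum(sum((x - mean(xs)) ** 2 for x in xs) for xs in groups.values()) / (k * n - k)
q_crit = 4.05                         # placeholder studentized-range value
hsd = q_crit * (mse / n) ** 0.5       # the "honest significant difference"

for a, b in combinations(groups, 2):
    diff = mean(groups[a]) - mean(groups[b])
    print(f"{a:11s} vs {b:11s}: diff = {diff:+.2f}, significant = {abs(diff) > hsd}")
```

Each of the six pairwise mean differences is compared against the same HSD threshold, which is what keeps the family-wise error rate at the chosen alpha.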

People express less favorable attitudes toward Ukraine even when they know the information comes from a Russian-funded news network. While warning people about the intentions of the communicator is meant to mitigate persuasion (Kiesler and Kiesler 1964; McGuire and Papageorgis 1962; Petty and Cacioppo 1977), I show that self-serving sources can still shift political attitudes (Hass 1981; Hovland and Mandell 1952). Contrary to popular belief, alerting people to a source's persuasive intentions does not automatically mean that they will become more skeptical or hostile toward the communicator (Friestad and Wright 1994; Issac and Grayson 2017). These findings raise questions about the influence of policies that aim to reduce foreign sources' anonymity. While policy makers might assume that warning individuals about a foreign propaganda source will negate its effectiveness, many people may ignore information about the source's origin. This is not to say that more strongly worded warning

messages could not elicit a stronger effect (Banas and Rains 2010; Murrock et al. 2018). However, initiatives from technology companies like YouTube that warn individuals that they are watching state-sponsored propaganda may do little to reduce the influence of foreign propaganda networks.

Figure 3: I obtain the pairwise differences of the mean of favorability toward Russia across the levels of the treatments and adjust the p-values and confidence intervals for multiple comparisons using Tukey's Honest Significant Difference (HSD) test.

Next, I evaluate whether propaganda against an adversary has positive spillover effects on the communicating country. Put differently, can Russia improve its reputation by putting down a rival state? I show that none of the treatments have a significant effect on attitudes toward Russia, Russian foreign policy, or Vladimir Putin (see Figures 1 and 3). Put simply, people not party to the conflict do not appear to view it as a zero-sum game – meaning that criticizing one group in the conflict does not increase support for the other group (Brewer 1999; Fisher 2019a). While exposure to Russian propaganda can decrease favorable attitudes toward Russia's rivals, it does not seem to improve views of

Russia itself. Russia and Ukraine may be engaged in an international conflict, but individuals are not drawing strong connections between Russia and Ukraine, most likely due to the low issue salience of Russia-Ukraine relations in America. I also show that exposure to the treatments does not cause individuals to support increasing sanctions against Russia. However, individuals in the intentions group do tend to be less supportive of arming the Ukrainian government (see Figure 4). This highlights how negative affect toward a foreign country can have security implications (Goldsmith and Horiuchi 2012).


Figure 4: Participants were asked their levels of support for expanding sanctions on Russia and arming the Ukrainian government. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.

I consider whether people react in heterogeneous ways to Russian propaganda, since prior work has emphasized how political priors moderate the impact of foreign cues

(Dragojlovic 2015). Motivated skepticism would suggest that foreign media can increase political polarization, since individuals can use the same information to come to drastically different conclusions based on their political priors (Peisakhin and Rozenas 2018). The influence of foreign propaganda may be contingent on how new information aligns with individuals' political ideology and their political sophistication (Taber and Lodge 2006). I first test for significant interaction effects based on partisanship (see Figure 5). I find that Democrats and Independents are more likely to adopt negative views of Ukraine after exposure to propaganda. Democrats – in the inoculation group only – also experience a backlash effect toward Russia, highlighting how external communication can backfire (Peisakhin and Rozenas 2018).
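The partisan interaction test can be sketched as follows. With no other covariates, interacting treatment with party in OLS is equivalent to estimating the treatment effect separately within each partisan subgroup; the rows below are invented for illustration:

```python
from statistics import mean

# (party, arm, favorability toward Ukraine on the 1-5 scale) -- invented rows
data = [
    ("Democrat",   "Control",    4), ("Democrat",   "Control",    3),
    ("Democrat",   "Intentions", 2), ("Democrat",   "Intentions", 3),
    ("Republican", "Control",    3), ("Republican", "Control",    3),
    ("Republican", "Intentions", 3), ("Republican", "Intentions", 4),
]

def subgroup_effect(party, arm="Intentions", control="Control"):
    """Within-party treatment effect: mean(arm) - mean(control)."""
    treated = [y for p, a, y in data if p == party and a == arm]
    baseline = [y for p, a, y in data if p == party and a == control]
    return mean(treated) - mean(baseline)

# The interaction coefficient in a fully interacted OLS model equals the
# difference between the two subgroup effects.
interaction = subgroup_effect("Republican") - subgroup_effect("Democrat")
print(subgroup_effect("Democrat"), subgroup_effect("Republican"), interaction)
```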

[Figure 5 here: coefficient plot of treatment effects (Information, Source, Intention) on favorability toward Ukraine and Russia, estimated separately for Democrats, Republicans, and Independents; x-axis: treatment effect.]

Figure 5: Participants were asked their levels of favorability toward: Ukraine and Russia. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.

It is possible that knowledge that the Ukrainian government is violating human rights can lessen support for Ukraine among more liberal voters, since liberals tend to be more sensitive to the use of force in foreign policy than conservatives (Kertzer and Brutger 2016, 235). Prior work finds that Democrats tend to see foreign voices as more credible than Republicans and are therefore more receptive to information from foreign sources (Guardino and Hayes 2017, p. 3). Scholars emphasize that Republicans' greater levels of ethnocentrism and distrust of foreign nations make them less receptive to foreign cues (Brewer et al. 2004; Dragojlovic 2015; Kinder and Kam 2009). Individuals with strong national identities are more likely to discount information from abroad – a group that more closely represents Republicans in the United States (Herrmann 2017). Yet, I would expect partisanship to have varying moderating effects based on the topic of the foreign media message. As recent reports on Russian propaganda made clear, the Kremlin was adept at targeting both the far-left and the far-right with their strategic information campaigns in the United States (Howard et al. 2018).

[Figure 6 here: coefficient plot of treatment effects (Information, Source, Intention) on favorability toward Ukraine and Russia by self-reported knowledge of the conflict in Ukraine ("Nothing at all" to "A lot"); x-axis: treatment effect.]

Figure 6: Participants were asked their levels of favorability toward: Ukraine and Russia. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments on each dependent variable.

I also assess heterogeneous treatment effects based on people's self-reported knowledge of Ukraine-Russia relations (see Figure 6). They were asked how much, if anything, they had read or heard about tensions between Russia and Ukraine over territory in eastern Ukraine. Responses ranged from "Nothing at all" (1) to "A great deal" (5). Less than 20% of individuals said they knew a lot or a great deal about the conflict, indicating the issue's low salience. While there are methodological issues with self-reported measures of awareness, this measure still allows me to estimate how propaganda shapes the attitudes of individuals with different levels of self-professed knowledge (Guess 2015). I find some evidence that providing information about the intentions of RT mitigates the effect of propaganda among individuals with greater levels of political awareness. What this means in practice is that making the source of foreign information apparent is an important counter-propaganda strategy, but it may only be effective on a limited audience with sufficient prior knowledge about the topic. Individuals with low levels of prior knowledge about Ukraine are actually more likely to adopt less favorable attitudes toward Ukraine when presented with the article containing more information about the Russian network.

Influence of Russian Propaganda on Views toward Ukraine, by Attitudes toward Russia

Figure 7: Participants were asked their levels of favorability toward Ukraine. OLS with no controls. Sample includes individuals who passed reading checks. Figure plots the marginal effects of the treatments by prior attitudes toward Russia (favorable, neutral, not favorable).

Finally, I evaluate whether Russian government-sponsored media has different effects based on prior attitudes toward Russia (see Figure 7). The figure shows that inoculation messages can mitigate the effect of the negative coverage of Ukraine among individuals with pre-existing negative attitudes toward Russia. Individuals with prior unfavorable views of Russia who are warned about the intentions of the Russian government do not become more likely to adopt unfavorable views of Ukraine. However, giving more information about the Russian network actually backfires on individuals with favorable views of Russia – causing them to express more unfavorable attitudes toward the Ukrainian government. As it turns out, only one-fifth of Americans tend to hold favorable attitudes toward Russia, meaning these backfire effects should be rare (Letterman 2018). One issue with this analysis is that attitudes toward Russia were measured post-treatment, meaning that the estimated effect of the treatments on attitudes toward Ukraine can be slightly biased (Montgomery, Nyhan and Torres 2018). However, it still provides preliminary evidence on how priors toward an autocratic state moderate the effect of foreign media emanating from that state.7

Conclusion

This chapter contributes to an exciting literature on the conditions under which foreign cues change public opinion and when awareness of the messenger's nationality alters a network's persuasiveness (Chapman and Gerber 2019; Dragojlovic 2015; Guardino and Hayes 2017; Lorimor and Dunn 1968; Murray 2014). Since an increasing part of hybrid warfare involves denigrating international support for one's enemies, this is a critical area of research (Renz 2016; Schatz and Levine 2010). Citizens' attitudes toward foreign countries have the potential to constrain elites' policy options (Foyle 1997; Gilboa 2008).

In order to assess the full influence of countries' international broadcasting networks, we should not only focus on traditional soft power variables like favorability toward the communicating country and its leaders, but also examine whether messages can undermine views of adversarial states. Propagandists can achieve their foreign policy objectives by lowering favorable public opinion toward their enemies and creating the

7 I acknowledge that looking for heterogeneous treatment effects covariate-by-covariate can lead to false discoveries (Gelman and Loken 2014). Therefore, in Online Appendix A I use Bayesian Additive Regression Trees (BART) – a sum-of-trees model that detects treatment effect heterogeneity and predicts the conditional mean of the outcome variable while minimizing overfitting – to account for this issue (Guess and Coppock 2018, p. 13).

conditions for more permissive interventionist policies that can undermine international security and democratic processes (Paul 2005, 59).

I show that foreign messages that emphasize a state's transgressions can effectively lower favorability toward the adversary, although they do not improve the communicating country's image. Most notably, revealing the message source has little influence on audiences' receptivity to propaganda. Contrary to popular belief, drawing attention to propaganda networks has a limited effect on propaganda's effectiveness (Farwell 2018, 42). Only individuals with sufficient prior knowledge about foreign policy issues or negative views toward the communicating country are receptive to inoculation messages. These findings complement research on the limitations of source cues and raise questions about the best techniques for countering propaganda (Clayton et al. 2018, 2019; Metzger, Flanagin and Medders 2010).

The findings in this chapter are notable for two reasons. First, if a foreign actor's goal is simply to dissuade international action in favor of its enemy – rather than to promote policies in favor of its own country – negative coverage of the enemy may achieve a sufficient level of influence. Second, if making individuals more aware of foreign interference has little influence, we should re-evaluate how to counter propaganda. While policy experts care a lot about exposing foreign propaganda, it is not apparent that citizens are paying much attention to the message source (Kalogeropoulos, Fletcher and Nielsen 2018). Highlighting the manipulative intent of foreign propaganda may be a more effective inoculation strategy (Banas and Rains 2010). Moreover, propaganda may have more subtle effects than shifting attitudes toward adversarial states.

The Conspiratorial and the Cynical

“When people stopped trusting any institutions or having any values, they could easily be spun into a conspiratorial vision of the world. Thus the paradox: the gullible cynic.” – Pomerantsev (2014b)

Introduction

In the previous chapter, I demonstrated that exposure to Russian propaganda on the topic of Ukrainian human rights violations lowers people's favorable perceptions of Ukraine but does not shift attitudes toward Russia. Foreign propaganda targeting a rival state may not improve the communicating country's image, but it can ruin the rival's reputation. I also found that warning people about propaganda does not mitigate Russia's ability to denigrate its rivals in the eyes of global publics. Initiatives that draw people's attention to the source of political information may not be effective in countering sophisticated disinformation campaigns. However, the previous study focuses on a very particular type of propaganda – a news post that demonizes a rival country. This type of study has several limitations.

First, since it has been discovered that the Russian government relied on a variety of different narratives to target vulnerable populations, we need more research on whether propaganda on different topics is more effective. For example, while there has been a lot of attention to the Kremlin's desire to spread disinformation and conspiracy theories, there is no evidence that it was effective in doing so (Yablokov 2015).

Second, foreign propaganda's main effect could be simply confusing audiences and breeding political cynicism (Pomerantsev 2015). In other words, individuals may not become more likely to support pro-Russian positions, but they may come away with a general distrust of political institutions following exposure to propagandistic messages. This cynicism can influence their engagement with democratic politics (Citrin and Stoker 2018).

Finally, it is possible that more strongly worded warning messages might better prime people to become more skeptical toward information coming from foreign broadcasters. For instance, using terms like 'propaganda' and 'disinformation' may alert individuals to the fact that foreign entities are trying to manipulate them. These stronger warnings may more effectively inoculate individuals against believing foreign propaganda.

My next two experiments address these limitations by assessing whether foreign disinformation increases belief in conspiracy theories and breeds political cynicism. They also test how more strongly worded inoculations shape receptivity to disinformation. In doing so, they expand on the findings in the first study by analyzing whether exposure to foreign propaganda has more subtle effects on public opinion.

Research Design

I once again recruit participants through TurkPrime (Litman, Robinson and Abberbock 2017). 993 participants successfully completed the survey. The sample was 56% female and 76% white, with a mean age of 38; 56% of respondents identified with or leaned toward the Democratic Party, and over 51% had a bachelor's degree or higher. To ensure that participants who did not pay close attention to the survey did not bias the results, I excluded those who completed the survey in less than 3 minutes, leaving 940 respondents (Chen 2018, 7).

After answering some demographic questions, participants were randomly assigned to one of three groups. The Control group was exposed to eight non-political social media posts, with a mix of human interest stories and advertisements. The Propaganda group was exposed to four non-political social media posts and four articles from the Kremlin state-funded networks Russia Today (RT) and Sputnik. All are real stories taken from RT and Sputnik.

Figure 8: Russian Propaganda Treatments. Panels: (a) MH17 Probe; (b) Syria; (c) Putin Meddled in Election; (d) Seth Rich Conspiracy.

The Inoculation group was given a warning about the intentions of Russian propagandists before participants were exposed to the same set of articles as the Propaganda group (see Figure 8). The treatments were made to look like Facebook posts in order to increase the external validity of the study (Clayton et al. 2019).8

Inoculation: There has been increasing concern over false news and misin- formation online. Many experts claim that Russian government-funded inter- national news networks, Russia Today (RT) and Sputnik are simply tools of a sophisticated Russian propaganda machine, created by the Kremlin to push its foreign policy, defend its aggression in Ukraine and undermine confidence in democracy, NATO and the world as we have known it.

I explicitly test whether exposure to international propaganda has an effect on belief in specific conspiracy theories. I include Russian disinformation on: (i) the Seth Rich conspiracy – the narrative that Hillary Clinton was behind the murder of Democratic National Committee employee Seth Rich; (ii) the Syria conspiracy – the narrative that Assad's gas attack was a false flag; and (iii) the Ukraine conspiracy – the narrative that Ukraine was responsible for the downing of flight MH17. I also include one story directly on the topic of Putin's meddling in U.S. elections.

Table 2: Study Design

Treatment     Description
Control       8 non-political placebo posts
Propaganda    4 non-political placebo posts + 4 Russian propaganda posts
Inoculation   Inoculation + 4 non-political placebo posts + 4 Russian propaganda posts
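The exclusion rule and random assignment behind Table 2 can be sketched in a few lines. This is only an illustrative reconstruction: the respondent records, variable names, and completion times below are hypothetical, not the actual survey code or data.

```python
import random

random.seed(42)

CONDITIONS = ["Control", "Propaganda", "Inoculation"]

# Hypothetical respondent records: (respondent_id, completion_time_seconds)
respondents = [(i, random.randint(60, 900)) for i in range(993)]

# Exclude respondents who finished in under 3 minutes (180 seconds),
# mirroring the attention screen described in the text
kept = [(rid, t) for rid, t in respondents if t >= 180]

# Simple random assignment to one of the three conditions in Table 2
assignment = {rid: random.choice(CONDITIONS) for rid, _ in kept}

print(len(kept), sorted(set(assignment.values())))
```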

I then ask participants to evaluate the accuracy of several statements on a 1 to 4 scale with 1 indicating “not accurate at all” and 4 indicating “very accurate”. I condense this

8The placebo posts are available in Online Appendix B.

measure into a binary measure indicating belief in the claim if participants thought the claim was somewhat or very accurate. The exact text of the questions is as follows:

1. To the best of your knowledge, how accurate is the claim that Democratic National Committee (DNC) employee Seth Rich was assassinated after having given WikiLeaks thousands of DNC emails?

2. To the best of your knowledge, how accurate is the claim that the chemical weapons attack against Syrian civilians was a “false flag” operation designed to trigger American involvement in the country's civil war?

3. To the best of your knowledge, how accurate is the claim that Ukraine was behind the downing of flight MH17?
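The recoding described above – collapsing the four-point accuracy scale into a binary belief indicator – can be sketched as follows (the ratings are invented purely for illustration):

```python
# Hypothetical responses on the 1-4 accuracy scale for one conspiracy item
# (1 = "not accurate at all", 4 = "very accurate")
ratings = [1, 2, 3, 4, 4, 2, 3, 1]

# Code belief as 1 if the claim was rated somewhat (3) or very (4) accurate
believes = [1 if r >= 3 else 0 for r in ratings]

print(believes)                       # → [0, 0, 1, 1, 1, 0, 1, 0]
print(sum(believes) / len(believes))  # share coded as believers → 0.5
```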

Results

Since the outcome is binary, I use logistic regression to estimate the influence of the propaganda and the inoculation on the perceived accuracy of these three statements. I find that exposure to propaganda has little effect on belief in these theories (see Figure 9). Seeing Russian propaganda on Syria, Ukraine, and Seth Rich does not increase people's belief in these conspiracies. In fact, only participants in the Inoculation group update their attitudes. Specifically, individuals who were warned about Russian propaganda prior to being exposed to Russian disinformation were more likely to believe that Seth Rich was assassinated after having given WikiLeaks thousands of DNC emails. This is concerning, since it demonstrates that warning people about propaganda can make some citizens more likely to believe these claims.

Figure 9: The figure shows the percentage of respondents who agreed with each statement (Seth Rich conspiracy: Control 32%, Propaganda 33%, Inoculation 40%; Syria conspiracy: 32%, 34%, 31%; Ukraine conspiracy: 35%, 40%, 35%). Logistic regression. Dashed lines represent 95% confidence intervals.

I unpack these results further to identify who is most receptive to propaganda and why inoculation messages backfire. Receptivity to misinformation and belief in conspiracy theories has been associated with personality characteristics such as “receptivity” (Hart and Graether 2018), and with socio-political factors such as lower trust in government (Imhoff and Lamberty 2018). Unsurprisingly, people tend to believe misinformation that confirms their political priors (Kahne and Bowyer 2017; Schaffner and Roche 2016). Consequently, we might expect that when propaganda reaffirms people's pre-existing views, they will be more likely to consider this information credible (Taber and Lodge 2006). However, while susceptibility to disinformation is present across the ideological spectrum (Oliver and Wood 2014, 964), prior work shows that conservatives tended to be more likely to consume and share disinformation in the United States (Grinberg et al. 2019).

Although there is a strong sense that partisan motivated reasoning is the driving force behind the persuasiveness of misinformation, it is not fully predictive of what stories people believe. Clayton et al. (2018) find that individuals “do not blindly judge the content of articles based on news source, regardless of their own partisanship and ideology” (1). Others show that people's susceptibility to misinformation is often driven “more by lazy thinking than it is by partisan bias” (Pennycook and Rand 2017). Consequently, vulnerability to disinformation may be conditional on individuals' levels of political sophistication rather than their political priors. Put simply, individuals who are exposed to propaganda may be more likely to hold false perceptions if they have low political sophistication.

It is important to examine what types of audiences are more receptive to disinformation. If political priors are a strong predictor, counter-propaganda campaigns may only exacerbate political polarization (Hart and Nisbet 2011; Lewandowsky et al. 2012; Nyhan and Reifler 2010). Strong political identities are often immune to counter-propaganda (Pereira and Bavel 2018). However, if low levels of political awareness predict higher levels of receptivity, then it would make sense to make audiences more aware of the effect of Russian propaganda in their society to counter its influence (Ivanov et al. 2017; Pennycook and Rand 2017). Conversely, if other, less malleable traits like personality, cognitive reflection, or analytical thinking predict receptivity, then the influence of counter-messaging may be limited (Arceneaux and Wielen 2017; Bronstein et al. 2019; Keersmaecker and Roets 2017). Given the attention to age, political ideology, and political awareness in studies of disinformation, I test whether these factors moderate whether people are persuaded by foreign conspiracy theories.

I find some indication that older people are more likely to be receptive to the Seth Rich conspiracy (although these results are not statistically significant). However, I find the opposite pattern for the conspiracies on Syria and Ukraine. Young people tend to be more likely to believe that the gas attack in Syria was a false flag after being warned about Russian propaganda. However, warning people about Russian propaganda is effective in inoculating older adults, especially concerning conspiracy theories about foreign policy. In sum, while older people tend to be more likely to believe the Seth Rich conspiracy – which was prominent on Fox News – younger people tend to be more receptive to disinformation on U.S. foreign policy (see Figure 10). This begs the question of whether these findings are the consequence of political ideology.

Influence of Russian Propaganda on Belief in Conspiracy Theories

Figure 10: Participants were asked to assess the accuracy of the three conspiracies, plotted by age. Logistic regression with controls. Sample includes individuals who passed reading checks. Figure plots the increase in probability of believing each statement. Dashed lines represent 95% confidence intervals.

As expected, conservatives are more likely to believe the Seth Rich conspiracy (see Figure 11). This is because, “for a solid month in early 2017, Fox News, and host Sean Hannity in particular, obsessively covered the death of Seth Rich” (Darby 2019). Notably, conservatives are most likely to think that Hillary Clinton was involved in the murder of Seth Rich after they were warned about Russian propaganda. While research finds that concern over backfire effects is exaggerated, I find clear evidence that some individuals are more likely to believe in conspiracy theories when warned about them (Guess and Coppock 2018; Wood and Porter 2018).

Influence of Russian Propaganda on Belief in Conspiracy Theories

Figure 11: Participants were asked to assess the accuracy of the three conspiracies, plotted by political ideology. Logistic regression with controls. Sample includes individuals who passed reading checks. Figure plots the increase in probability of believing each statement. Dashed lines represent 95% confidence intervals.

One notable difference between previous research and this study is that others evaluate the impact of fact-checking while I analyze the effect of general inoculation messages

Conservative Very conservative (Banas and Rains 2010). Additionally, my study present the information in the form of social media posts – more accurately mimicking how citizens are exposed to foreign disin- formation. Finally, it is important not to overstate these backlash effects. While I do find that inoculations can backfire on politically sensitive conspiracies, there is no backlash to inoculation on the Ukraine or Syria conspiracies.

Influence of Russian Propaganda on Belief in Conspiracy Theories

Figure 12: Participants were asked to assess the accuracy of the three conspiracies, plotted by political awareness. Logistic regression with controls. Sample includes individuals who passed reading checks. Figure plots the increase in probability of believing each statement. Dashed lines represent 95% confidence intervals.

Finally, I assess how individuals' political awareness shapes their receptivity to conspiracy theories (see Figure 12). Individuals are asked five questions to assess how closely they follow and understand politics. I show that exposure to propaganda makes people with low political awareness more receptive to the Ukraine conspiracy and the Syria conspiracy (although the latter falls just short of statistical significance). While political ideology is

an important predictor of belief in conspiracy theories (Schaffner and Roche 2016), political awareness also strongly predicts whether people can be taken in by disinformation (Pennycook and Rand 2017).

Overall, I find evidence that both propaganda and inoculations have heterogeneous effects based on a person's age, political ideology, and political awareness. Notably, older conservatives are more receptive to the Seth Rich conspiracy when warned about Russian propaganda. However, young people with low political awareness tend to be more receptive to disinformation about U.S. foreign policy. It is evident that there is no single group 'vulnerable' to Russian propaganda (Grinberg et al. 2019). Rather, foreign actors can reach both young and old adults as well as liberals and conservatives (Howard et al. 2018). Troublingly, inoculation messages are just as likely to backfire as they are to successfully mitigate the influence of disinformation.

My next step is to analyze the influence of propaganda on political cynicism. Political cynicism is a concept that has taken on varied meanings across the political science literature. While some emphasize a disconnect from politics, others locate cynicism in distrust of politicians, low confidence in government, or a general belief that institutions are “hypocritical, soulless, or otherwise devoid of the beliefs that once animated them” (Mazella 2007, 6). Trust in government to do the right thing most of the time is a cornerstone of democratic legitimacy, making political cynicism an important topic of research, with implications for political participation, support for extremist parties, and civic engagement.

Despite all the claims that Russian propaganda has sowed discord, undermined democratic norms, and created more cynical citizens, there is neither strong empirical evidence nor a clear theoretical link between exposure to Russian propaganda and political cynicism (Howard et al. 2018).
Why would exposure to Russian propaganda cause individuals to distrust their government? What is it about the narratives present in Russian propaganda targeted at American audiences that would increase cynicism?

Scholars have noted that some political coverage may inadvertently be disillusioning individuals about politics and demobilizing citizens (Robinson 1976). The spiral of cynicism hypothesis states that when the media reports largely strategically about politics, it erodes civic engagement, increases political cynicism, and depresses political participation (Cappella and Jamieson 1997; de Vreese 2004; Jackson 2011). Individuals exposed to messages that emphasize the hypocritical nature of politics may not change their attitudes on specific issues or countries, but they can become more cynical (Pomerantsev 2015). Content analysis reveals that Russian state-funded media networks emphasize faults in democratic systems to intensify existing rifts between social and political groups (Farkas and Bastos 2018; Yablokov 2015). Their coverage of American politics does not just focus on racial tensions and social divisions; it also emphasizes the undemocratic aspects of America's two-party system. When individuals are confronted with conspiracy theories or information about democratic deficiencies in their political system, they can become more cynical. Instead of speculating about this relationship, I attempt to test it empirically.

I use principal component factor analysis to extract a single dimension based on agreement with six statements that gauge general political cynicism rather than trust in specific politicians or institutions (Jackson 2011, 78). Statements include: (1) “Elections give voters a real choice among candidates with different positions”; (2) “Politicians genuinely try to keep their campaign promises”; (3) “Politicians lose touch quickly with the public after they are elected”; (4) “I'm disgusted with politics”; (5) “The government is immoral”; and

(6) “I am cynical about government” (Cronbach α = 0.81). Responses range from “strongly disagree” (1) to “strongly agree” (7). Questions where higher scores indicate lower political cynicism are reverse coded.

Contrary to my expectations, Russian propaganda has no effect on political cynicism (see Figure 13). Individuals do not tend to become less trusting of their political system when exposed to conspiracy theories from Russian state-sponsored media networks.
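The construction of the index – reverse coding the two pro-system items and checking scale reliability – can be sketched with simulated data. The responses, item order, and helper names below are hypothetical, and the sketch computes Cronbach's α from its textbook formula rather than reproducing the principal component extraction:

```python
import random
from statistics import pvariance

random.seed(1)

# Simulated 1-7 responses to the six cynicism items for 300 respondents.
# Items (1) and (2) are pro-system statements, so in the raw data they run
# opposite to the respondent's underlying cynicism.
n = 300
latent = [random.randint(1, 7) for _ in range(n)]

def noisy(x):
    """Add a little response noise, clamped to the 1-7 scale."""
    return min(7, max(1, x + random.randint(-1, 1)))

items = [[noisy(8 - x) for x in latent],                   # (1) elections give real choice
         [noisy(8 - x) for x in latent],                   # (2) politicians keep promises
         *[[noisy(x) for x in latent] for _ in range(4)]]  # (3)-(6) cynical statements

# Reverse code the two items where higher scores mean LOWER cynicism
for j in (0, 1):
    items[j] = [8 - x for x in items[j]]

def cronbach_alpha(item_scores):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of item totals)."""
    k = len(item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]
    return k / (k - 1) * (1 - sum(pvariance(v) for v in item_scores) / pvariance(totals))

alpha = cronbach_alpha(items)
print(round(alpha, 2))  # high, since every (recoded) item tracks the same latent score
```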

Despite claims that exposure to conspiracy theories is making Americans more confused and distrusting, there is no evidence that Russian propaganda is making Americans more cynical (Yablokov 2015).

Russian Propaganda and Inoculation Effect on Cynicism

Figure 13: Treatment effects on cynicism. OLS with robust standard errors and 95% confidence intervals. N=935.

However, the findings may be sensitive to the topic of the propaganda in the first experiment. It is possible that propaganda that directly disparages American democracy may be more effective in increasing cynicism. I rely on another survey experiment to assess whether an article criticizing the U.S. electoral system bolsters distrust in American institutions. I also manipulate whether individuals know the criticism comes from Russia and whether they received a warning message about Russian propaganda prior to reading the article. In doing so, I am able to test whether Russian propaganda that explicitly draws attention to deficits in American democracy can increase political cynicism and how warning

messages shape propaganda effects. 1,000 participants completed this survey. The final sample was 52% female and had a mean age of 38; 54% supported the Democratic Party, 33% the Republican Party, and 12% identified as Independents with no preference. The sample is also relatively educated, with over 54% holding at least a college degree.

After completing some standard demographic questions, participants were randomly assigned to: (1) a Control group, where individuals simply completed the post-treatment survey; (2) an Information group, where individuals read an article with criticism of the two-party system from Russia Today (RT) without information on the source; (3) a Source group, where individuals are explicitly told the message source; or (4) an Intention group, where individuals were warned about the threat of RT before reading the same article with source information.9

Table 3: Study Design

Treatment     Description
Control       Completed post-treatment survey
Information   Article with Russian criticism
Source        Source cue + article with Russian criticism
Intention     Inoculation + article with Russian criticism

Participants in the treatment groups are presented with the following text: “Below is a short excerpt from a media interview with Annie Machon, a former intelligence officer for MI5, the security service in the United Kingdom. The topic is the threat Russia poses to American democracy.” The Source and Intention groups are told: “The interview is brought to you by the Russian media network, Russia Today (RT)” (see Figure 14). The

9The warning message is the same as the one in the previous chapter. The main difference between the intention group in this study and the inoculation group is the wording of the warning message. The inoculation group more explicitly refers to misinformation and propaganda to alert people to the threat of foreign disinformation campaigns.

Intention group is also presented with the following message:

According to several news sources, Russia Today (RT) and Sputnik are simply tools of a sophisticated Russian propaganda machine, created by the Kremlin to push its foreign policy, defend its aggression in Ukraine and undermine confidence in democracy, NATO and the world as we have known it.

The intention vignette explains how RT is a tool of the Russian propaganda machine, priming individuals to think about the threat of Russian interference. The article is in the form of an interview between an RT journalist and a British intelligence officer on the subject of Russia's relationship to the United States. The Kremlin often relies on Western hosts and guests to boost the credibility of its coverage among American audiences, making this a common format for Russian propaganda targeted at Western audiences (Xie and Boyd-Barrett 2015). The article centers on a prominent theme of Russian propaganda: the unrepresentative and corrupt nature of America's two-party system. It is a good example of the Kremlin's “whataboutism,” which is intended to change the conversation and misdirect the audience.

Figure 14: Treatment article. Only the Source and the Intention groups are shown the logo.

Despite what some scholars claim, there is little evidence to suggest that Russian propaganda is creating a more cynical citizenry (Flock 2018; Conor and Friedersdorf 2018). These results are robust to samples where individuals had to pass reading checks as well as to models with pre-treatment controls (see Figure 15). It is possible that repeated exposure to propaganda might cause people to become disillusioned with politics and skeptical of the media's ability to report events in an unbiased manner (Paul and Matthews 2016, 4). However, I show that individuals do not become more cynical even when directly exposed to propaganda on a topic directly addressing shortcomings in American democracy. This should make one question when foreign propaganda can shift political attitudes in the ways hypothesized by scholars studying international disinformation (Lucas and Nimmo 2015). Additionally, since foreign propaganda makes up a tiny fraction of audiences' media diets, heavy repeated exposure to foreign propaganda is unlikely in most contexts (Nelson and

Taneja 2018).

Russian Propaganda Effect on Political Cynicism

Figure 15: Treatment effects on cynicism. OLS with robust standard errors and 95% confidence intervals. N=1,000.

As a final step, I evaluate the relationship between political cynicism and anti-democratic viewpoints. While there is a lot of attention to the negative influence of cynicism on democratic politics, this relationship may be exaggerated or simply nonexistent. Many surveys may be capturing citizens who are “verbalizing a casual and ritual negativism rather than an enduring sense of estrangement that influences their beliefs and actions” (Eisinger 1999, 46). In fact, some go as far as arguing that “cynicism is perhaps little more than an indication of an interested and critical citizenry” (de Vreese 2005, 294). This raises questions concerning our measures of political cynicism, and whether cynics are simply critical and engaged citizens who are skeptical about centralized power (Eisinger 2000).

To evaluate the relationship between cynicism and support for democratic norms, I assess the relationship between holding cynical views and believing strong-man rule is good for the country. Responses range from 0 to 10, with higher values indicating support for strong-man rule. I also control for gender, age, race, education, ideology, and partisanship, which can be related to cynicism and support for strong leaders. I show that, contrary to popular belief, cynics tend to be less likely to hold anti-democratic viewpoints. In fact, cynical individuals tend to be more likely to defend democratic norms (see Figure 16).
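Stripped of the demographic controls, the relationship in Figure 16 is an OLS regression of strong-man support on the cynicism index. A minimal bivariate sketch follows; the data are invented solely to illustrate a negative slope of the kind reported in the text, not the survey estimates:

```python
from statistics import mean

# Invented illustrative data: cynicism index (0-1 scale) and support for
# strong-man rule (0-10); NOT the dissertation's actual survey data
cynicism  = [0.1, 0.3, 0.5, 0.7, 0.9]
strongman = [5.0, 4.5, 4.0, 3.5, 3.0]

# Bivariate OLS: slope = cov(x, y) / var(x); controls omitted for brevity
xbar, ybar = mean(cynicism), mean(strongman)
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(cynicism, strongman))
         / sum((x - xbar) ** 2 for x in cynicism))
intercept = ybar - slope * xbar

print(slope, intercept)  # → -2.5 5.25: more cynical respondents, less strong-man support
```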

Figure 16: Influence of political cynicism on attitudes toward having a strong leader who does not have to bother with parliament and elections. OLS regression with full controls and robust standard errors; the y-axis is support for strong-man rule (1–10) and the x-axis is cynicism (0–1 scale). Dashed lines represent 95% confidence intervals. N=489. The histogram at the bottom shows the percentage of participants at different levels of cynicism.

I contend that it is necessary to distinguish between healthy skepticism and distrust in government, on the one hand, and a cynical and disengaged citizenry, on the other (Eisinger 1999, 48). Given the attention to the rise of cynicism in democratic politics, we need to revisit what political cynicism means and its consequences for political participation, voting for extremist parties, and support for democratic norms (Citrin and Stoker 2018). While claiming that foreign propaganda is making Americans cynical is an effective way to sell newspapers (or get clicks on social media), we need to better theorize why foreign propaganda would shift citizens' deep-rooted attitudes about their political system.

Conclusion

In this chapter, I demonstrate that Russian propaganda and corresponding inoculation messages have differential effects on belief in conspiracy theories based on the topic of the disinformation as well as citizens' age, political priors, and levels of political awareness. Older conservatives are more likely to believe pro-conservative conspiracy theories when warned about the dangers of Russian propaganda. However, younger people with low levels of political awareness seem to be more susceptible to disinformation on U.S. foreign policy. Overall, while foreign actors can reach different audiences by using targeted narratives, the most persuadable groups tend to be the most politically polarized and the least politically aware.

Second, despite a lack of strong empirical evidence that international propaganda shapes public opinion, a prevalent narrative is that the Kremlin's propaganda is fostering cynicism (Jamieson 2018; Howard et al. 2018). I interrogate the logic behind this claim and show that even when directly exposed to a message critical of the U.S. two-party system, individuals do not become more cynical.

Notably, I question whether political cynicism, as it is currently measured, should be regarded as a negative influence on democracy. I demonstrate that more cynical individuals tend to be less willing to embrace nondemocratic modes of government. I contend that we need to reconsider the role of cynicism in democratic politics and better theorize the conditions under which cynical individuals challenge autocratic practices and when cynicism serves as a conservative societal mechanism (Brock 2018, 292). Simply put, while we often hear the claim that propaganda causes individuals to become politically cynical, the jury is still out on when cynicism is detrimental to democracy (Eisinger 2000; de Vreese 2005).

Propaganda's Presumed Influence

“Goldstein was delivering his usual venomous attack upon the doctrines of the Party – an attack so exaggerated and perverse that a child should have been able to see through it, and yet just plausible enough to fill one with an alarmed feeling that other people, less level-headed than oneself, might be taken in by it” (Orwell 1949, 12).

Introduction

The previous chapters analyze whether exposure to propaganda has direct effects on public opinion. I show that messages can influence attitudes toward rival states but do not improve views toward the communicating country. I also find that viewing disinformation can increase some citizens' beliefs in particular conspiracy theories but does not heighten political cynicism. Yet, we may still be ignoring propaganda's primary influence on public opinion.

Specifically, propaganda's main effect may be in shaping perceptions about the influence of disinformation on others. By funding international propaganda campaigns, states can make themselves appear more influential than they actually are, thereby fostering polarization and increasing distrust in democratic processes (Tomz and Weeks 2019). While people think that they themselves are immune to foreign propaganda, they might judge their fellow citizens to be more credulous. The concern is that when individuals think that their fellow citizens are easily duped by foreign disinformation, they can become more skeptical of democratic outcomes and begin to support a more restrictive media environment.

Theories unpacking the mechanism behind the “influence of presumed influence” generally fall within a cognitive or a motivational camp (Sun, Pan and Shen 2008, 282). According to the motivational account, individuals' desire to enhance their ego causes them to believe that others are more receptive to undesirable social influences (Gunther and Mundy 1993; Perloff 1999). This leads them to overestimate the influence of harmful media messages on groups they perceive as weaker or inferior in some way (McLeod, Detenber and Eveland 2001). Especially in an era of political polarization, people may be more likely to presume that members of the opposite party select into undesirable media outlets (Eveland et al. 1999; Müller 2013).

According to the cognitive account, most people believe they are too smart to be influenced by foreign propaganda, but assume that others are less resistant to threatening persuasive appeals (Scherr and Müller 2017). When evaluating media content, they think others are more gullible or less discerning (Gunther 1991). Alternatively, individuals believe that others are more likely to be influenced by foreign messages if they think that others are more likely to be exposed to international propaganda (Reid and Hogg 2005). In much empirical research on third-person effects, researchers focus on the source, method, sampling, respondent, country, desirability, medium, and message content to explain the overall gap between estimated media effects on the self and on others (Paul, Salwen and Dupagne 2000). Yet, we do not know how exposure to propaganda (or to warning messages) shapes perceptions of propaganda's influence in our society. When people are exposed to instances of foreign propaganda, do they tend to think it has a larger or smaller influence on others? This is an important, but overlooked, indirect effect of international propaganda. Exposure to disinformation may not shift most individuals' political attitudes, but it is possible that exposure to foreign propaganda can cause people to believe propaganda is pervasive and therefore influential (Eveland et al. 1999).
First, I rely on a survey experiment to test how exposure to Russian propaganda, and/or inoculation messages, shapes individuals' beliefs about the influence of foreign propaganda. Second, I use observational data to assess whether people's perceptions that “others” are influenced by propaganda shape views on electoral legitimacy and restrictive media policies. Third, I rely on causal mediation analysis to confirm that presumed propaganda effects on others are a mediating variable that leads to attitudinal responses regarding the legitimacy of elections and the suppression of media messages (Imai, Keele and Yamamoto 2010). Together, these analyses aim to emphasize the indirect mechanisms by which foreign propaganda can shape public opinion, even if few people are actually persuaded by specific messages.

Research Design

To evaluate the presumed influence of propaganda, I rely on the same survey as the previous chapter, which randomly exposed participants to disinformation on Seth Rich, Syria, and Ukraine. As a reminder, the Control group was exposed to eight non-political social media posts, with a mix of human interest stories and advertisements. The Propaganda group was exposed to four non-political social media posts and four articles from the Kremlin state-funded networks Russia Today (RT) and Sputnik. The Inoculation group was given a warning about the intentions of Russian propagandists before participants were exposed to the same set of articles as the Propaganda group (see Figure 8).

After being assigned to one of the three groups, participants were asked questions about the influence of propaganda. Specifically, they were asked: How effective do you think Russian propaganda was at influencing other citizens' voting during the 2016 U.S. presidential election? Responses ranged from “not at all effective” (1) to “very effective” (4). Respondents were also asked whether they agree with the following two statements: (i) The presence of Russian propaganda during the campaign makes the results of the 2016 Presidential election illegitimate; and (ii) Any Russian-backed television station should be banned. Responses range from “strongly disagree” (1) to “strongly agree” (7). I analyze whether seeing propaganda (or being warned about propaganda) causes people to overestimate the influence of propaganda on other people. Additionally, I assess whether the belief that others were influenced by propaganda is associated with the belief that democratic outcomes are less legitimate and censorship is necessary.
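The assignment and outcome coding described above can be sketched as follows. This is a minimal simulation with illustrative variable names and randomly generated responses, not the actual survey data; the 0–1 rescaling step is an assumption made here so that effects on differently scaled items are comparable:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Three-arm random assignment, mirroring the design described above.
arm = rng.choice(["control", "propaganda", "inoculation"], size=n)

# Presumed effect of propaganda on other citizens' voting,
# 1 ("not at all effective") to 4 ("very effective").
presumed_effect = rng.integers(1, 5, size=n)

# Agreement with the two statements, 1 ("strongly disagree")
# to 7 ("strongly agree").
election_illegitimate = rng.integers(1, 8, size=n)
ban_russian_networks = rng.integers(1, 8, size=n)

# Rescale an item to 0-1 so treatment effects are comparable
# across items with different response scales.
def rescale(x, lo, hi):
    return (x - lo) / (hi - lo)

presumed01 = rescale(presumed_effect, 1, 4)
```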

Results

In line with previous research, I find that individuals tend to believe that propaganda had a greater effect on others' attitudes and behaviors than on their own (Perloff 1993). While only 14% think that Russian propaganda had some or a great influence on their own voting behavior in the 2016 election, over 58% of respondents think it had some or a great influence on others' voting behavior. What influence does exposure to propaganda, and/or an inoculation message, have on this perception? I analyze the effect of the treatments on the perception that propaganda influenced other people's voting in the 2016 presidential election using ordinary least squares (OLS) regression with robust standard errors. Because the balance tests indicate slight imbalance in the sample with regard to political awareness and partisanship, I include a series of controls in my models. Specifically, I control for gender, age, race, education, partisan identity, political sophistication, political interest, and trust in media, which have all been linked to overestimating the influence of media messages on others (Cohen and Davis 1991; Eveland et al. 1999; Gunther 1991; Perloff 1993; Rucinski and Salmon 1990; Tiedge et al. 1991). The results are nearly identical across alternative operationalizations and without controls (see Appendix C).
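As a rough sketch of this estimation step, the following simulates a three-arm experiment and computes OLS treatment effects with HC1 heteroskedasticity-robust standard errors. The data, effect sizes, and variable names are invented for illustration (the actual analysis also includes the demographic controls listed above):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Simulated three-arm assignment and outcome; the -0.13 and -0.14
# coefficients are illustrative, chosen to echo the magnitudes in the text.
arm = rng.integers(0, 3, n)              # 0=control, 1=propaganda, 2=inoculation
prop = (arm == 1).astype(float)
inoc = (arm == 2).astype(float)
y = 2.5 - 0.13 * prop - 0.14 * inoc + rng.normal(0.0, 0.8, n)

# OLS via least squares, with an intercept column.
X = np.column_stack([np.ones(n), prop, inoc])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# HC1 robust covariance: (X'X)^-1 X' diag(e^2) X (X'X)^-1 * n/(n-k).
resid = y - X @ beta
bread = np.linalg.inv(X.T @ X)
filling = X.T @ (X * resid[:, None] ** 2)
cov = bread @ filling @ bread * n / (n - X.shape[1])
robust_se = np.sqrt(np.diag(cov))
```

Treatment effects are then `beta[1]` and `beta[2]`, with confidence intervals built from `robust_se`.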

Figure 17: Effect of the Propaganda and Inoculation treatments on propaganda's presumed effect on others. Controls included. Horizontal lines represent 95% confidence intervals for estimates.

I find that exposure to propaganda has a negative effect on individuals' perceptions of propaganda's effectiveness on others (β = −0.13, p = 0.04). The inoculation paired with the propaganda also decreases the belief that other citizens changed their votes as a result of Russian propaganda, by approximately the same magnitude (β = −0.14, p = 0.04). Put differently, exposure to sample foreign propaganda posts, or to a warning message about propaganda, reduced the belief that other people changed their voting behavior by approximately 6% – a relatively small but still notable effect (Figure 17).10 While some may predict that exposure to propaganda, and/or a warning message about propaganda, would prime individuals to think about the pervasive influence of foreign interference and overestimate its effect on public opinion, I find that exposure to these messages lowers propaganda's presumed influence on others.

In my survey, over 77% of Democrats think Russian propaganda affected voting, while only 28% of Republicans think the same – closely mirroring the findings of a March 2018 YouGov survey drawing on a sample more representative of the population as a whole. Notably, I find that Democrats and Independents become less likely to consider propaganda effective when exposed to the treatments. Democrats are approximately 8% less likely to think that propaganda influenced other citizens' voting behavior in both the propaganda and the inoculation treatment groups. Independents – in the inoculation group only – tend to be much less likely to believe propaganda changed others' voting behavior: approximately a 16% decrease. Republicans, on the other hand, are not influenced by either the propaganda posts or the inoculation message (Figure 18).11

What explains some of these unexpected findings? I suggest that if exposure to propaganda, and/or an inoculation message, causes people to believe that propaganda is less threatening than they previously thought, they may update their views on propaganda's influence on others (Smith 1970). Simply put, people may be expecting threatening and sophisticated propaganda posts – particularly after a warning message. But after being exposed to what appear to be relatively tame social media posts, they may no longer consider propaganda that threatening or effective. Messages that decrease the perceived threat of propaganda can lessen propaganda's presumed influence on others (Sun, Pan and Shen 2008, 282). Showing people that foreign propaganda is not as sophisticated or threatening as they previously believed can decrease its perceived influence over politics.

10 I also find that neither exposure to the propaganda posts nor the inoculation had an effect on the perception that propaganda affected respondents' own behavior.

11 The lack of an effect on Republicans may simply be due to a floor effect, since Republicans are already less likely to believe propaganda influenced others.

Figure 18: Effect of treatments on propaganda's presumed effect on others. All controls included. Results disaggregated by partisanship (including leaners). Horizontal lines represent 95% confidence intervals for estimates.

Next, I rely on non-experimental data to test how the perception that “others” are receptive to propaganda shapes people's perceptions of electoral legitimacy and their attitudes toward censorship. If these perceptions do not shape such attitudes, it is less consequential whether people tend to overestimate the effect of propaganda on others. Specifically, I regress people's views on the legitimacy of the 2016 presidential election, and their support for censorship, on their perceptions of Russian propaganda's influence on other citizens' voting behavior in the 2016 election. First, I find that the perception that propaganda shapes other voters' behavior has a large substantive effect on the view that the outcome of the election is illegitimate. Individuals who believe that propaganda influences other citizens are 120% more likely to think the outcome of the election was illegitimate – even when controlling for partisanship and political sophistication (Figure 19).

Figure 19: Dependent variable is belief that the outcome of the 2016 election is illegitimate. Independent variable is belief about propaganda's influence on others' voting behavior. All controls included. Horizontal lines represent 95% confidence intervals for estimates.

Some might wonder whether this relationship is simply a consequence of partisan motivated reasoning. Democrats, who tend to believe that Trump is illegitimate, may be justifying this belief by saying that propaganda shaped the voting behavior of other citizens in order to explain the loss of their preferred candidate. To account for this, I assess how the relationship between believing propaganda influences others and believing the outcome of the election is illegitimate varies across partisans. While I do find that propaganda's presumed influence on perceptions of electoral legitimacy is moderated by partisanship, it has a statistically significant relationship with all voters' views about the election's legitimacy (Figure 20). Even Republicans who believe that propaganda shaped other people's behavior are more likely to consider the outcome of the election less legitimate – although the substantive effect is smaller.

Figure 20: Dependent variable is belief that the outcome of the 2016 election is illegitimate. Independent variable is belief about propaganda's influence on others' voting behavior. All controls included. Results disaggregated by partisanship (including leaners). Horizontal lines represent 95% confidence intervals for estimates.

Second, I discover that the belief that others are affected by propaganda increases support for censorship. Individuals who think propaganda influenced others' voting behavior are approximately 34% more likely to support banning Russian networks (Figure 21). I find, however, that these significant effects are driven by Democratic and Independent voters. Republicans who believe that Russian propaganda shaped others' voting behavior are not more likely to support censorship (Figure 22). These findings raise further questions about the conditions under which presumed propaganda effects on others increase support for more restrictive media policies (Sun, Pan and Shen 2008).

Figure 21: Dependent variable is support for banning Russian-funded networks. Independent variable is belief about propaganda's influence on others' voting behavior. All controls included. Horizontal lines represent 95% confidence intervals for estimates.

Some research finds that individuals who think fake news has a larger influence on people's attitudes are less likely to support media regulation – raising questions about when presumed propaganda effects translate into greater support for censorship (Jang and Kim 2018, 299). These findings are important since support for banning foreign networks can be a slippery slope leading to greater elite control over the media. Elites can use the excuse of “foreign propaganda” to ban content they find disagreeable. While these concerns may still be minor in the United States, which has strong norms against censorship, we should be cognizant of the tradeoffs between protecting citizens from foreign manipulation and upholding freedom of speech (West 2017). In countries like Singapore and Russia, elites have already used concern about “fake news” and propaganda as an excuse to curtail media freedom and violate citizens' privacy (Daskal 2019).

Figure 22: Dependent variable is support for banning Russian-funded networks. Independent variable is belief about propaganda's influence on others' voting behavior. All controls included. Results disaggregated by partisanship (including leaners). Horizontal lines represent 95% confidence intervals for estimates.

Finally, I address some concerns regarding reverse causality. I argue that the perception that others are vulnerable to propaganda leads to the belief that elections are illegitimate and increases support for censorship. Yet, it is possible that individuals with these pre-existing attitudes justify their beliefs by claiming that others are susceptible to propaganda (Tsfati 2007). Moreover, it is possible that any relationship between the perceived effect of propaganda, support for censorship, and views on an election's legitimacy is due to some omitted factor (Tal-Or et al. 2010, 805). To overcome this concern about reverse causality, previous work has tried to exogenously induce higher or lower perceptions of propaganda effectiveness by varying the types of information presented to participants before they answer survey questions (Cohen et al. 1988).

My design follows a similar logic – randomly exposing participants to Russian propaganda or inoculation messages that are meant to prime them to think about foreign disinformation. This allows me to estimate the average causal mediation effect (ACME), or the indirect effect of the treatments on beliefs about the election's legitimacy and support for censorship that is mediated through presumed effects on others. I use Tingley et al.'s (2014) mediation package in R to estimate ACMEs. The results of this analysis are depicted in Figure 23 and Figure 24.
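The logic of this mediation decomposition can be illustrated with a simplified product-of-coefficients sketch (the analysis itself uses the parametric estimator in the R mediation package; the simulated data and coefficients below are purely illustrative and not estimates from the survey):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Simulated chain: treatment T lowers the mediator M (presumed effects
# on others), and M in turn raises the outcome Y (election seen as
# illegitimate). All coefficients are invented for illustration.
T = rng.integers(0, 2, n).astype(float)
M = 2.0 - 0.3 * T + rng.normal(0.0, 0.5, n)            # mediator model
Y = 1.0 + 0.8 * M - 0.1 * T + rng.normal(0.0, 0.5, n)  # outcome model

def ols(cols, y):
    """OLS coefficients, intercept prepended."""
    X = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([T], M)[1]             # treatment -> mediator
b, ade = ols([M, T], Y)[1:]    # mediator -> outcome, average direct effect

acme = a * b                   # indirect (mediated) effect
total = ade + acme             # total effect of the treatment
```

A negative `acme` here captures the pattern in the text: the treatment lowers presumed effects on others, which in turn lowers the belief that the election was illegitimate.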

Figure 23: Causal mediation plot showing the ACME, ADE, and total effect. Treatment is the Inoculation (compared to the control condition); mediator is presumed propaganda effects on others; outcome is belief that the 2016 election outcome was illegitimate. Horizontal lines represent 95% confidence intervals for estimates.

The ACME, or indirect effect of presumed propaganda effects on the belief that the outcome of the 2016 election is illegitimate, is negative and significant (p = .02). Approximately 52% of the estimated average decrease in the belief that the election was illegitimate arises as a result of presumed effects on others rather than ‘directly’ from the treatment. This means that when individuals come to believe that propaganda is less effective on others, they are less likely to consider the outcome of the election illegitimate.

Figure 24: Causal mediation plot showing the ACME, ADE, and total effect. Treatment is the Inoculation (compared to the control condition); mediator is presumed propaganda effects on others; outcome is belief that Russian-funded networks should be banned. Horizontal lines represent 95% confidence intervals for estimates.

The ACME of presumed propaganda effects on censorship is also negative and significant (p = .02). Around 25% of the estimated average decrease in support for censorship in the inoculation group comes as a result of presumed effects on others. However, while the direct and total effects are negative, their confidence intervals overlap with zero. While early work claimed that failing to find a direct effect of X on Y means that mediation analysis is unnecessary (Baron and Kenny 1986), more recent work suggests otherwise (Shrout and Bolger 2002). The lack of direct treatment effects on one of the outcome variables may indicate either the presence of heterogeneous mediators across subgroups or multiple opposing mediators (Mackinnon and Fairchild 2009, 17).

Since the potential for confounding mediators is always a possibility in experimental work that does not randomly manipulate the magnitude of mediating variables, future work should use parallel encouragement designs to determine to what extent attitudinal and behavioral responses regarding the suppression of a media message are a consequence of presumed media effects on others (Pirlott and Mackinnon 2016). This would supplement theories that link perceived media effects and restrictive behaviors (Feng and Guo 2012; Scherr and Müller 2017).

Conclusion

This chapter evaluates the influence of presumed influence in the context of international propaganda. I analyze how direct exposure to propaganda, and/or warning messages about propaganda, shapes people's perceptions of propaganda's impact on other citizens, and whether those perceptions influence views on electoral legitimacy and attitudes toward censorship. I present two major findings. First, exposure to propaganda and warning messages reduces the perception that others are affected by propaganda. I argue that actually seeing propaganda may help reassure audiences that international propaganda is not manipulating public opinion in ways that invalidate election outcomes or require the suppression of free speech. By providing citizens a glimpse into the Russian propaganda apparatus, they may find that it is not as sophisticated, threatening, or influential as they believed.

Second, the perception that others are affected by propaganda increases the view that elections are illegitimate and increases support for censorship (especially among Democrats and Independents). Perceptions of propaganda's effects on others are highly consequential, as people's beliefs (regardless of the underlying reality) have independent effects on their political attitudes and behaviors (Ahler and Sood 2018; Hollyer, Rosendorff and Vreeland 2015; Kuran 1991). These findings shed light on indirect mechanisms by which international propaganda can shape public opinion. It will be essential to continue to assess how propaganda and our counter-messaging efforts not only shape belief in misinformation or specific narratives, but also how exposure to this type of content affects people's perceptions about the effectiveness of propaganda in our society.

In the United States, social media sites have begun to remove sites linked to the Russian government and flag websites that promote misinformation (Helmus et al. 2018, 80). Governments have begun to pass legislation policing disinformation and foreign propaganda (Hellman and Wagnsson 2017; Khaldarova and Pantti 2016). Some groups find these actions to be politically biased, viewing efforts to police disinformation as an excuse to silence particular political viewpoints. Some even argue that policies limiting foreign voices “play into Putin's hand, and only serve to support RT's repeated mantra that the ‘West’ is hypocritical when it comes to freedom of speech” (Crilley 2018). One major concern is that legislation to protect individuals from “harmful content” can lead to broader censorship (Lundgren et al. 2018), raising concerns about the balance between security and democratic liberties (Mälksoo 2018, 379). While international propaganda may be reaching only a small and selective audience, perceptions about the effectiveness of propaganda in our society can have much greater consequences.

How to Criticize an Autocrat

“One of the strategic mistakes that BBG [Broadcasting Board of Governors] as an institution has made, especially since the Ukrainian conflict, has been an extremely critical tone towards everything in Russia. Critical and melodramatic. This is one of the biggest problems with American foreign broadcasts, and not only in Russia. The people who work primarily out of America overestimate the harshness of what happens in the country or countries where they broadcast.” (Schafer and Gatov 2017)

Introduction

So far, I have focused on the influence of foreign propaganda in the United States while evaluating the efficacy of defensive counter-propaganda strategies. The direct influence of Russian disinformation is minor – increasing unfavorable attitudes toward rival states and spreading belief in conspiracy theories among individuals with low political awareness and strong political attitudes. I also show that believing that other people are susceptible to propaganda can decrease perceived democratic legitimacy and increase support for censorship. Finally, I demonstrate that some of our defensive counter-propaganda strategies are either ineffective or even counter-productive.

Yet, as mentioned previously, some believe that the United States and other Western democracies should go on the offensive and fund alternative media in autocratic states to undermine support for non-democratic regimes (Farwell 2018). While most citizens will not be receptive to foreign information, some may be (White 1952, 540). Most notably, “opponents of the regime are more likely to be motivated to acquire information that is damaging to incumbent authoritarians, and are also more likely to believe information coming from critical or foreign sources” (Robertson 2017, 2). While existing studies consider how individual-level factors moderate transnational persuasion, we have less research on which communicator strategies are most effective in changing views.

If foreign sources are not perceived as credible, what can communicators do to overcome foreign audiences' cognitive defenses (Allport and Simpson 1946; Friestad and Wright 1994)? I argue that individuals' perceptions about the “criticalness” of a source can have independent effects on their view of the source's credibility and their receptivity to messages critical of their government (Pornpitakpan 2004, 253). I also assert that by balancing its coverage, a network can minimize the likelihood of backlash toward the communicator. Szostek argues that “repetitive and one-sided attributions of blame risk alienating the unconverted among the general population” (Szostek 2018b, 129). One-sided coverage can turn off readers, especially when people hold pre-existing negative evaluations of the communicator (Cappella and Jamieson 1996; Roese and Sande 1993). It reaffirms their perception that foreign networks are biased against their country, leading to a lower likelihood of persuasion (Vallone, Ross and Lepper 1985). One-sided messages may not only fail to promote pro-democratic outcomes; there is the possibility that they backfire and entrench support for non-democratic regimes (Martin 1971, 68). Some evidence suggests that attempts to provide voters with information in weak democracies are ineffective or even counter-productive (Chong et al. 2014). More often than not, foreign criticism comes off as non-credible and therefore unpersuasive (Hovland and Weiss 1951; Adelman and Dasgupta 2019). Social identity theory posits that when out-group members threaten one's social group, people reject or counter-argue the critical information, or derogate the communicator (Kunda 1990; Lord, Ross and Lepper 1979; Taber and Lodge 2006).
A number of studies demonstrate that information that threatens one's group can backfire (Nyhan and Reifler 2010; Redlawsk 2002). Outside messages in particular can “cause nationalistic backlash, since outside pronouncements may come across as inappropriate meddling in a country's domestic affairs” (Marinov 2018, 6).

For instance, one study conducted in Ukraine after the 2004 presidential election found that Ukrainians perceived American and Russian interventions as improper (Shulman and Bloom 2012). A study in Lebanon found evidence that educated and politically sophisticated voters had negative perceptions of foreign involvement in the electoral process if the outsiders supported a particular side (Marinov 2013, 1299). Pressure from a geopolitical rival, in particular, is likely to be perceived as threatening to the nation's standing (Gruffydd-Jones 2019, 588). People tend to defend their salient social group from criticism, adopting attitudes that make their group look better (and the outgroup look worse). In particular, “people who identify strongly with a group display defensive reactions to social identity threat” (Branscombe et al. 1999, 37). Since motivated reasoning is often driven by the desire to signal loyalty to relevant social groups, it can be prudent to avoid excessive criticism, which can activate ideologically motivated cognition (Kahan 2013, 407).

Balanced blame attribution, on the other hand, where the communicator admits its own faults to signal unbiased coverage, can improve a foreign broadcaster's credibility and make its arguments more persuasive (Allen 1991). The father of propaganda studies, Jacques Ellul, argued that “if an enemy can demonstrate that he has told the truth, a sudden turn in his favor will result” (Ellul 1965, 66). During World War II, the German-language, U.S.-funded newspaper Frontpost was able to gain credibility by honestly reporting on the Allies' losses during the Battle of the Bulge (Daugherty and Janowitz 1958, 559). The BBC gained a reputation for honesty when it reported that the British Royal Air Force (RAF) suffered more losses during an air raid than German media outlets had reported. These admissions of failure can increase source credibility and persuasiveness (White 1952, 542).
Prior research suggests that unexpected information (like negative coverage of one’s own side) can help signal unbiasedness (Chiang and Knight 2011, 817). When audiences come in expecting that all media from the other side is heavily biased, presenting some positive coverage of the enemy, and/or negative coverage of oneself, can improve the credibility (and the persuasiveness) of the source (Lumsdaine and Janis 1953; O’Keefe 1999). For instance, Smith (1970) found that Americans exposed to Radio Moscow adopted more pro-Russian views because “conditions in our own society had led the audience to hold unrealistic negative images which, upon actual exposure, were clearly refuted for many of the listeners” (550). In theory, by highlighting one’s own faults, it is possible to reach and persuade a broader audience.

The messenger’s nationality also plays a major role in persuasion (Hovland and Weiss 1951; Dragojlovic 2013; Pornpitakpan 2004). Audiences’ evaluations of source credibility depend heavily on whether the communicator is regarded as a member of the ingroup or the outgroup (Mackie and Queller 2000). Research in social psychology consistently shows that “people are more persuaded by criticism of their group when it comes from fellow ingroup members rather than outgroup members” (Adelman and Dasgupta 2019, 740). Individuals who support the regime might deem any information coming from foreign news outlets not credible and, as a result, unpersuasive. In fact, any message – regardless of its content – that is attributed to an actor perceived as hostile to one’s interests may fail to impact attitudes and may actually backfire (Ashmore et al. 1979, 132; Page, Shapiro and Dempsey 1987, 32). Some even claim that American propaganda aimed at external actors is often unsuccessful because “Americans have a ‘superiority complex’, which expresses itself in such publications and displeases foreigners” (Ellul 1965, 70). The greatest concern is that “criticism from outsiders might inhibit change as groups go into a state of denial about problems they would otherwise acknowledge” (Hornsey, Trembath and Gunthorpe 2004, 500).
Before we invest too many resources into funding “free media” for autocratic countries, we should consider how to frame content for skeptical audiences in order to overcome psychological barriers to persuasion. In doing so, we must consider how the source and content of foreign messages moderate the persuasive impact of transnational criticism. In my final study, I investigate how actors can better frame their coverage to convince skeptical foreign audiences.

Research Design

I focus on U.S. foreign broadcasting in Russia, since the Kremlin’s visible interference in foreign elections has led the U.S. to dedicate more resources to countering strategic influence operations (Farwell 2018). Russia is a good case for studying the impact of transnational persuasion because it has a substantial population that accesses foreign networks. Szostek argues that “digital generation Russian-speakers don’t lack access to information – in fact they are inundated with options. The problem is knowing which information to trust” (Szostek 2015). Ellen Mickiewicz likewise finds that young Russian students are likely to read a host of diverse sources – including Western ones (Mickiewicz 2014). Finally, unlike respondents in other autocratic regimes, Russians are relatively willing to answer sensitive questions about regime support (Frye et al. 2017). Consequently, it is critical to assess how U.S. broadcasting influences Russian public opinion and to evaluate what types of content make persuasion more or less likely.

To test the impact of foreign criticism in non-democratic states, I rely on an online survey experiment. While a number of studies assess the correlation between use of foreign media and political support for autocratic regimes, these estimates are vulnerable to reverse causality (Gainous, Wagner and Ziegler 2018; Xiang and Hmielowski 2017). A survey experiment that randomly exposes some individuals to different foreign/domestic messages better tests the micro-level effects of external propaganda on public opinion and political behavior (Huang and Yeh 2017; Robertson 2017). Specifically, I assess how criticism of Russia attributed to a fictional media outlet shapes political attitudes in Russia, and I analyze how foreign broadcasters can balance their coverage in order to bolster their credibility and maximize message persuasiveness.

The survey was conducted by the survey firm Ipsos Research in February 2019. A total of 956 participants successfully completed the survey. Invitations were stratified on age and gender based on internet penetration data. The sample is 52% female and has a mean age of 42. The survey oversampled urban and more educated individuals, but this group is also the most likely to encounter foreign news sources, and it plays a critical role in protest politics in autocratic regimes, making it the ideal target population for a study of online foreign criticism (Wallace 2013). Most importantly, while the sample may not be perfectly representative of the broader population, there is little reason to believe that the treatment effects would vary in a more representative sample (Barabas and Jerit 2010).

At the outset of the study, respondents are given a brief survey gauging their priors toward the regime, their media consumption behavior, and some standard demographic questions. I identify regime supporters by their reported vote in the March 2018 presidential election. Regime supporters are defined as those who voted for one of the regime-sponsored candidates – Vladimir Putin or Vladimir Zhirinovsky (58%).12 Regime opponents are those who voted for Sergey Baburin, Pavel Grudinin, Ksenia Sobchak, Maxim Suraykin, Boris Titov, Grigory Yavlinsky, or who said they voted for an “other” candidate (19%). I am also left with non-voters – people who either refused to answer, did not vote, were not eligible to vote, or spoiled their ballots (23%). Since there are questions about which candidates constitute legitimate opposition, I also use a different operationalization of regime support and find nearly identical results (March 2009). After completing the first part of the survey, participants are told to evaluate a new domestic or foreign news outlet.
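The three-way coding of 2018 vote choice described above can be sketched as follows. This is an illustrative reconstruction in Python, not the study’s actual analysis code; the candidate name strings are placeholders for however the survey recorded responses.

```python
# Candidates treated as regime-sponsored in the main specification.
REGIME_CANDIDATES = {"Vladimir Putin", "Vladimir Zhirinovsky"}

# Candidates treated as opposition, plus the "other" response.
OPPOSITION_CANDIDATES = {"Sergey Baburin", "Pavel Grudinin", "Ksenia Sobchak",
                         "Maxim Suraykin", "Boris Titov", "Grigory Yavlinsky",
                         "other"}

def code_regime_support(vote):
    """Collapse reported 2018 vote choice into the three analysis groups."""
    if vote in REGIME_CANDIDATES:
        return "regime supporter"
    if vote in OPPOSITION_CANDIDATES:
        return "opposition"
    # Refused to answer, did not vote, ineligible, or spoiled ballot.
    return "non-voter"
```

Footnote 12’s robustness check corresponds to moving "Vladimir Zhirinovsky" into the opposition set and re-running the analysis.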
I use a fictional news media outlet since individuals may have predetermined attitudes toward existing sites that could bias the results in unexpected ways. The experiment used a 2×3 between-subjects design. The first factor of variation is whether the source is described as a domestic or a foreign news outlet.

12Results are identical when coding Zhirinovsky as an opposition candidate as well.

We are interested in your opinion on the newly created information resource on the Internet. Strela is a new Russian (American) news agency offering news, analytics, and reports. Please tell me, do you find the following articles interesting?

Individuals are then randomly placed into: (1) a Control Group, where they are exposed to five non-political placebo articles; (2) a One-Sided Criticism Group, where they are asked to look at four placebo articles and one article about Putin’s corruption; or (3) a Balanced Criticism Group, where they are asked to look at three placebo articles, an article about corruption in Putin’s regime, and an article about corruption in the United States. Balance checks indicate the randomization was successful (see Appendix D).
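The assignment into the six cells of the 2×3 design can be sketched as below. This is a hypothetical reconstruction – the survey platform’s own randomizer is not described – and the group labels are my own shorthand.

```python
import random

# Factor 1: how the outlet Strela is described to the respondent.
SOURCES = ["domestic", "foreign"]
# Factor 2: content mix (control, one-sided criticism, balanced criticism).
CONTENTS = ["control", "one_sided", "balanced"]

def assign(n, seed=0):
    """Independently randomize each of n respondents into one of the
    2 (source) x 3 (content) = 6 experimental cells."""
    rng = random.Random(seed)
    return [(rng.choice(SOURCES), rng.choice(CONTENTS)) for _ in range(n)]
```

Calling `assign(956)` yields one (source, content) cell per respondent; balance checks then compare pre-treatment covariates across the six cells.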

Figure 25: Treatment Posts. Individuals in the control groups saw all the articles in rows two and three. Individuals in the criticism groups saw four (random) placebo posts in addition to the criticism of Vladimir Putin. Individuals in the balanced group saw three (random) placebo posts, the criticism of Vladimir Putin, and the criticism of U.S. corruption.

Putin Corruption
Headline: Vladimir Putin’s “inner circle” controls about $24,000,000,000
Text: Vladimir Putin has always denied that he has any serious wealth. Nevertheless, one of his relatives, with a salary of $8,500 a year, somehow managed to accumulate assets of $573 million.

U.S. Corruption
Headline: In America, there is a problem with corruption
Text: Corruption in the United States is much broader than just Trump’s story. Trump is almost certainly just the most corrupt president in American history.

By making the articles look like social media posts and including a mixture of political and non-political content, this design better mimics how individuals encounter foreign news online, thereby increasing the external validity of the study. I focus on the topic of corruption since it is a salient political issue (Filipov 2017). Corruption, many believe, is a point of weakness for the Kremlin that can be used to challenge autocracy in Russia (Farwell 2018, 45).

Table 4: Research Design

             Foreign                              Domestic
Control      Five placebo news posts              Five placebo news posts
One-Sided    Four placebo news posts +            Four placebo news posts +
             one post on Putin’s corruption       one post on Putin’s corruption
Balanced     Three placebo news posts +           Three placebo news posts +
             one post on Putin’s corruption +     one post on Putin’s corruption +
             one post on American corruption      one post on American corruption

After exposure to the treatments, I ask individuals their views on Vladimir Putin’s handling of corruption, his handling of the economy, and their general attitudes toward the president. I use principal component factor analysis to create a single measure of support for Vladimir Putin (µ=0.51, σ=0.26). The scale is normalized to range from 0 to 1. I also assess whether exposure to information about Putin’s corruption leads to: (i) greater support for Alexei Navalny, a prominent anti-corruption advocate in Russia; and (ii) less faith that Russia is a democracy (µ=0.51, σ=0.26). Support for Navalny is measured using a 4-point scale, which I condense into a binary variable indicating whether the individual would consider voting for Navalny (µ=0.24, σ=0.43). By randomly exposing Russians to different types of domestic and foreign criticism of Vladimir Putin, I am able to assess which types of messages are most likely to be effective and to identify which individuals are most persuadable.
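The outcome construction described above – a factor score rescaled to the 0-1 range, and a 4-point Navalny item collapsed to a binary indicator – can be sketched as follows. The min-max rescaling and the binary cutpoint (top two categories coded as 1) are my assumptions; the text does not specify either detail.

```python
def rescale_01(scores):
    """Min-max normalize a list of factor scores to the 0-1 range
    (assumed; the text only says the scale is normalized to 0-1)."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def binarize_navalny(response):
    """Collapse a 4-point 'would consider voting for Navalny' item
    (1 = definitely not ... 4 = definitely yes) into 0/1.
    The cutpoint between categories 2 and 3 is an assumption."""
    return 1 if response >= 3 else 0
```

For example, `rescale_01([2.0, 4.0, 6.0])` returns `[0.0, 0.5, 1.0]`, so a 7% treatment effect on the 0-1 Putin scale corresponds to a 0.07 shift on this rescaled measure.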

Results

I first plot the influence of domestic and foreign criticism on support for Vladimir Putin in the full sample (see Figure 26). Criticism of Russia is more effective when the communicating country also presents criticism of itself. I find that foreign balanced criticism lowers evaluations of Putin by approximately 7% compared to the control (p=0.08). Neither of the domestic criticisms nor the one-sided foreign criticism shifts attitudes toward Putin.
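The treatment effects plotted in Figure 26 are differences in mean Putin support between a treatment arm and the control. A minimal sketch of that estimator, with a normal-approximation confidence interval (z = 1.645 for the 90% intervals shown in Figure 26), is below; the data passed in are placeholders, not the survey’s.

```python
import statistics

def diff_in_means(treated, control, z=1.645):
    """Difference in mean outcomes between a treatment arm and the
    control, with a normal-approximation CI. z = 1.645 gives a 90%
    interval (as in Figure 26); z = 1.96 gives 95% (later figures)."""
    d = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(treated) / len(treated)
          + statistics.variance(control) / len(control)) ** 0.5
    return d, (d - z * se, d + z * se)
```

With real data, a point estimate of about -0.07 whose 90% interval just excludes zero would match the reported foreign balanced-criticism effect (p=0.08 two-sided is marginal at that level).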

[Figure 26 plot: “Influence of Domestic and Foreign Criticism on Support for Putin.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control. Rows: Balanced, One-sided. X-axis: Treatment effect.]

Figure 26: Effect of treatments on support for Vladimir Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 90% confidence intervals for estimates.

Since regime voters, opposition voters, and non-voters are likely to react differently to foreign information (Robertson 2017), I plot the effects of the treatments based on individuals’ self-reported vote choice in the 2018 election (see Figure 27). Disaggregating the effects of the treatments by regime support shows that while most citizens are not receptive to criticism, some citizens can be persuaded. Compared to the control, I find that among non-voters, balanced foreign criticism decreases evaluations of Vladimir Putin by 16% (p=0.01).

[Figure 27 plot: “Influence of Domestic and Foreign Criticism on Support for Putin.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control. Rows: Regime Voter, Opposition Voter, Non-Voter (One-Sided and Balanced). X-axis: Treatment effect.]

Figure 27: Effect of treatments on support for Vladimir Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

Foreign balanced criticism also lowers support for Putin compared to foreign one-sided criticism among non-voters, by 17% (p=0.01). Pointing out Putin’s corruption only decreases support for the president when the source also highlights corruption in the United States. Balanced foreign criticism can lower support for autocratic rulers, but only among non-voters. This is consistent with other research finding that people who disapprove of the regime are more likely to update their beliefs in response to new information from outside groups (Bush and Prather 2017, 933).

While some fear that foreign criticism may be viewed as illegitimate, there is no evidence that domestic criticism is more effective than foreign criticism (Helmus et al. 2018, xii). In fact, domestic criticism is consistently less effective than foreign criticism (see Figure 28). Compared to domestic balanced criticism, I find that foreign balanced criticism significantly decreases support for Putin among non-voters (from 0.47 to 0.30, p=0.001). However, when I compare the effects of domestic one-sided criticism and foreign one-sided criticism, I find no difference – indicating that the source’s nationality and its content have independent moderating effects on persuasiveness (Pornpitakpan 2004). Regime supporters are not reacting adversely to foreign criticism of their president, suggesting that backlash effects may be rarer than some expect (Guess and Coppock 2018).

[Figure 28 plot: “Influence of Domestic vs. Foreign Criticism on Support for Putin.” Panels: Domestic One-Sided vs. Foreign One-Sided; Domestic Balanced vs. Foreign Balanced. Rows: Regime Voter, Opposition Voter, Non-Voter. X-axis: Treatment effect.]

Figure 28: Effect of treatments on support for Vladimir Putin (0-1 scale). Comparing domestic vs. foreign one-sided/balanced treatments. Horizontal lines represent 95% confidence intervals for estimates.

Why is foreign balanced criticism the most effective at shifting attitudes toward autocratic leaders? I predicted that balanced criticism would help signal source unbiasedness and make criticism of the Russian president more persuasive (Allen 1991). I test for this mechanism by assessing the extent to which balanced criticism boosts participants’ perceptions of the source’s credibility. I find that balanced criticism does not increase trust in the source among non-voters (see Figure 29). In other words, non-voters become more anti-Putin following exposure to foreign balanced criticism even though they do not find the foreign outlet more credible.

[Figure 29 plot: “Influence of Domestic and Foreign Criticism on Trust in Strela.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control. Rows: Regime Voter, Opposition Voter, Non-Voter (One-Sided and Balanced). X-axis: Treatment effect.]

Figure 29: Effect of treatments on trust in Strela (1-3 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

In short, non-credible sources can still shift political attitudes (Hass 1981). People can judge an external media network to be untrustworthy yet still be persuaded by its arguments (Smith 1970, 540). In a study of over three thousand Chinese citizens, Truex (2016) finds that “perceptions of bias seem unrelated to a respondent’s self-reported political attitudes. Both regime supporters and discontents acknowledge the pro-regime biases in official media, in sharp contrast to the standard hostile media framework” (24). More generally, I demonstrate that while propagandists can try to bolster their reputation, the success of these initiatives is often uncertain (Bush and Prather 2018; Martin 1971).

I also consider whether the decrease in support for Putin comes about as a result of priming individuals to think about corruption in the country. Being presented with a story about corruption might cause individuals to evaluate Putin by his handling of corruption, an issue on which he is particularly vulnerable (Filipov 2017). I find that the treatments do not increase perceptions that corruption has worsened in Russia under Putin, making it unlikely that the treatments lower evaluations of Putin by increasing the salience of corruption.

Figure 30: Causal mediation plot. The treatment is foreign balanced criticism (compared to foreign one-sided criticism); the mediator is perceptions of source bias; the outcome is support for Vladimir Putin. Horizontal lines represent 95% confidence intervals for estimates.

Causal mediation plots affirm that neither perceptions of source bias nor perceptions of corruption are mechanisms driving the decrease in support for Putin (see Figure 30 and Figure 31). To further elucidate the mechanism behind the experimental findings, I assess several other conditioning variables. All interaction variables were measured pre-treatment to avoid biasing the heterogeneous treatment effects (Montgomery, Nyhan and Torres 2018). In particular, I analyze how pre-existing attitudes toward the United States, trust in foreign media, media viewing habits, sensitivity to criticism, and education moderate the effects of domestic and foreign criticism.

Figure 31: Causal mediation plot. The treatment is foreign balanced criticism (compared to foreign one-sided criticism); the mediator is the perception that corruption has increased during Vladimir Putin’s tenure as president; the outcome is support for Vladimir Putin. Horizontal lines represent 95% confidence intervals for estimates.
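The mediation logic behind Figures 30 and 31 decomposes the total treatment effect into an indirect path through the mediator and a direct path. A bare-bones product-of-coefficients sketch on simulated data is below; the dissertation does not state which mediation estimator was used, so this is illustrative only, and the simulated effect sizes are arbitrary.

```python
import numpy as np

def mediation_effects(t, m, y):
    """Product-of-coefficients decomposition:
    a  = effect of treatment t on mediator m,
    b  = effect of m on outcome y (controlling for t),
    c' = direct effect of t on y.
    Indirect effect = a * b."""
    ones = np.ones_like(t, dtype=float)
    a = np.linalg.lstsq(np.column_stack([ones, t]), m, rcond=None)[0][1]
    coefs = np.linalg.lstsq(np.column_stack([ones, t, m]), y, rcond=None)[0]
    c_prime, b = coefs[1], coefs[2]
    return a * b, c_prime  # (indirect, direct)

# Simulated example mimicking the null-mediation pattern in the text:
# the treatment moves the outcome but not the mediator, so the
# estimated effect is (almost) entirely direct.
rng = np.random.default_rng(0)
t = rng.integers(0, 2, 2000).astype(float)
m = 0.0 * t + rng.normal(size=2000)              # mediator unmoved by t
y = -0.15 * t + 0.1 * m + rng.normal(size=2000)  # effect flows directly
indirect, direct = mediation_effects(t, m, y)
```

In this simulation the indirect estimate hovers near zero while the direct estimate is negative, which is the qualitative pattern the causal mediation plots report for both candidate mediators.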

First, attitudes toward the United States do not influence how criticism from the U.S. shapes attitudes toward Putin (see Figure 32). There is some indication that individuals who hold unfavorable attitudes toward the U.S. grow more pro-Putin when exposed to domestic criticism, but the results are not statistically significant. Contrary to research on foreign cues in the United States, people’s priors toward the communicating country do not moderate the impact of transnational communication (Dragojlovic 2013).

[Figure 32 plot: “Influence of Domestic and Foreign Criticism on Support for Putin By Priors toward United States.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control (One-Sided and Balanced). X-axis: Attitudes toward U.S. (Unfavorable, Neutral, Favorable). Y-axis: Treatment effect on Support for Putin.]

Figure 32: Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

I also find no heterogeneous treatment effects based on individuals’ levels of trust in foreign media. Trust in foreign media is measured on a four-point scale ranging from “do not trust at all” (1) to “fully trust” (4). Individuals who say they trust foreign media are no more likely to become anti-Putin following exposure to either one-sided or balanced criticism of the Russian president (see Figure 33).

[Figure 33 plot: “Influence of Domestic and Foreign Criticism on Support for Putin By Trust in Foreign Media.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control (One-Sided and Balanced). X-axis: Trust Foreign Media (Don’t trust, Trust). Y-axis: Treatment effect.]

Figure 33: Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

Paradoxically, I find that individuals who watch more foreign news media tend to be more supportive of Putin if they are exposed to criticism of the Russian president (see Figure 34).13 In other words, the individuals who are most likely to come across foreign messages exhibit backlash effects when confronted with media that criticizes their president. These findings are robust to the inclusion of a host of control variables that may be related to support for Putin and greater foreign media consumption.

13Foreign Media Use is the number of foreign media outlets an individual selects divided by the total number of media networks (µ=0.06, σ=0.13). The foreign media options include: EuroNews, BBC, Current Time, Meduza, Radio Liberty, and “other foreign media” outlets.
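The index in footnote 13 is a simple proportion. A one-function sketch is below; the outlet list comes from the footnote, while the denominator (the full list of media networks offered in the survey) is not given, so it is left as a parameter, and any non-foreign outlet names are hypothetical.

```python
# Foreign outlets listed in footnote 13.
FOREIGN_OUTLETS = {"EuroNews", "BBC", "Current Time", "Meduza",
                   "Radio Liberty", "other foreign media"}

def foreign_media_share(selected, total_networks):
    """Foreign Media Use index: number of foreign outlets the
    respondent selected, divided by the total number of media
    networks listed in the survey (denominator assumed)."""
    return sum(1 for s in selected if s in FOREIGN_OUTLETS) / total_networks
```

A respondent who selects the BBC plus one domestic channel out of, say, ten listed networks would score 0.1 on this index, consistent with the low sample mean (µ=0.06).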

[Figure 34 plot: “Influence of Domestic and Foreign Criticism on Support for Putin by Foreign Media Consumption.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control (One-Sided and Balanced). X-axis: Consumption of Foreign Media (Proportion), from None to A lot. Y-axis: Treatment effect.]

Figure 34: Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

Aside from pro-government views and political awareness, an individual’s personality can predict support for authoritarianism in Russia (Greene and Robertson 2017). Specifically, individuals’ sensitivity to criticism – how upset they get when their country is criticized – can influence how they process criticism. To measure sensitivity to regime criticism, I ask individuals how upset they feel when Russia is insulted. The exact framing of the question was as follows: “When Russia is insulted in front of me, I get very upset.” Respondents’ answers were coded from 1, meaning “strongly disagree,” to 4, for “strongly agree” (µ=2.97, σ=1.02).

[Figure 35 plot: “Influence of Domestic and Foreign Criticism on Support for Putin By Sensitivity to Criticism.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control (One-Sided and Balanced). X-axis: Get upset when Russia is insulted (Do not agree at all, Somewhat disagree, Somewhat agree, I completely agree). Y-axis: Treatment effect.]

Figure 35: Effect of treatments on support for Putin (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

I find that, holding other factors constant, individuals who claim not to get upset when Russia is criticized are more receptive to balanced foreign criticism (see Figure 35). Importantly, I argue that sensitivity to criticism is not merely a proxy for pre-existing attitudes toward the regime or hostility toward foreign nations. Individuals can vote for opposition candidates but still feel upset when their country is criticized. These findings highlight the need to better understand the role of emotions and personality in explaining regime support and receptivity to outside information.

Building on previous research on foreign cues, I also examine how foreign coverage influences individuals based on their levels of education (Dragojlovic 2015). Martin (1971) asserts that “presenting only one side of an issue is more effective than presenting two sides when the audience is not well educated, already convinced and unlikely to hear the other side,” but “presenting both sides of a case is more effective with those who are well educated or initially opposed” (67). As noted by others, more educated citizens are likely to internalize the political norms of their regime, while less educated citizens are more receptive to persuasion (Geddes and Zaller 1989, 333). I find that foreign balanced criticism persuades individuals with low levels of education, mirroring media effects in democratic regimes (Zaller 1992). Paradoxically, it appears that the most persuadable are the least knowledgeable (Robertson 2017, 18). These citizens are unlikely to challenge the regime since they also tend to be politically disinterested (see Figure 36).

[Figure 36 plot: “Influence of Domestic and Foreign Criticism on Support for Putin By Education.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control (One-Sided and Balanced). X-axis: Education (Low, Medium, High). Y-axis: Treatment effect.]

Figure 36: Effect of the treatments (relative to the control) on support for Vladimir Putin, by education. Horizontal lines represent 95% confidence intervals for estimates.

Finally, I show that while non-voters can grow more anti-Putin after exposure to balanced foreign criticism, they do not become more likely to support opposition candidates or to believe that Russia is less democratic (see Figure 37 and Figure 38). This is important because it demonstrates the limited influence of international broadcasting. Some apolitical citizens can be persuaded to adopt more critical views of their leader, but there is little evidence to suggest broader effects that would lead to protest or democratization.

[Figure 37 plot: “Influence of Domestic and Foreign Criticism on Support for Navalny.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control. Rows: Regime Voter, Opposition Voter, Non-Voter (One-Sided and Balanced). X-axis: Treatment effect.]

Figure 37: Effect of treatments on support for Alexei Navalny (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

[Figure 38 plot: “Influence of Domestic and Foreign Criticism on View that Russia is Democratic.” Panels: Domestic Criticism vs. Control; Foreign Criticism vs. Control. Rows: Regime Voter, Opposition Voter, Non-Voter (One-Sided and Balanced). X-axis: Treatment effect.]

Figure 38: Effect of treatments on the view that Russia is democratic (0-1 scale). Left plot illustrates the effect of domestic criticisms relative to the control. Right plot illustrates the effect of foreign criticisms relative to the control. Horizontal lines represent 95% confidence intervals for estimates.

Conclusion

Some advocate the development of independent regional news agencies to counter propaganda in areas where media choice is limited (Lucas and Pomerantsev 2016, 3). Yet we know little about what influence foreign criticism has on citizens in non-democratic states, what types of citizens may be most receptive to critical messages, and how combinations of different foreign messages shape public opinion. According to Smith (1970), “since most people are taught to expect political propaganda from their nation’s adversaries, this fact has obvious implications for developing effective political communications to the people of other nations” (551).

I analyze what political actors can do to overcome motivated reasoning and minimize backlash toward foreign messages. I assess: (i) whether broadcasters can better balance their news coverage in order to signal unbiasedness and increase the influence of their criticism, and (ii) how the messenger’s nationality impacts receptivity to foreign messages. Unlike other studies, I analyze the conditions under which foreign cues are persuasive in a non-democratic setting (Dragojlovic 2015; Grieco et al. 2011). I am able to identify the causal effect of foreign criticism and estimate heterogeneous treatment effects (Lorimor and Dunn 1968; Huang and Yeh 2017). I consider what governments and civil society organizations can do to better persuade audiences that may be less receptive to foreign criticism (Allport and Simpson 1946; Chmel, Savin and Carpini 2018).

I find that foreign criticism can convince some citizens, but these individuals are the least likely to challenge the regime in the first place. There is no evidence that greater trust in the source is the reason non-voters adopted more anti-regime views when exposed to balanced foreign criticism. Rather, politically apathetic individuals with lower levels of education are the most likely to respond to balanced foreign coverage. Individuals’ sensitivity to national criticism and prior foreign media exposure also heavily moderate the influence of outside information. While most people cannot be swayed by foreign information, I demonstrate that how criticism is packaged matters for audiences’ receptivity, with balanced criticism being more effective than one-sided criticism. One-sided criticism can elicit backlash effects, inadvertently increasing support for non-democratic leaders (Kern and Hainmueller 2009; Peisakhin and Rozenas 2018). The results indicate that it is unlikely we will discover uniform strategies that succeed across all types of foreign audiences (Bush and Prather 2017; Chong and Druckman 2012). Foreign broadcasting may shape the public opinion of some citizens under very specific conditions, but it is unlikely to be a driving force for democratization.
I argue that we should move beyond stating that citizens in autocratic regimes differ in their receptivity toward foreign broadcasting (Allport and Simpson 1946; Robertson 2017), and begin to analyze what communication strategies can effectively persuade citizens living under autocratic governments (Chmel, Savin and Carpini 2018). We need more information about what topics foreign audiences care about, so that international broadcasting content can reflect their concerns, gain a reputation for credibility, and increase demand for foreign perspectives (Bush and Prather 2018; Chmel, Savin and Carpini 2018; Szostek 2018b). Only by gaining foreign audiences’ trust can we expect to win over hearts and minds.

Prior work shows that the influence of foreign content may depend on people’s existing knowledge about specific issues and the extent to which the new information differs from official government propaganda (Tai 2016, 78-79). Future studies should consider alternative mechanisms and assess how subtle variations in international broadcasting can impact receptivity and increase the number of persuadable citizens (Szostek 2018a). It is possible that different types of coverage – like positive coverage of the president – may better signal unbiasedness. Scholars should also assess whether criticism of a leader’s foreign policy, handling of the economy, or human rights violations may be more persuasive (Bush and Prather 2018; Gruffydd-Jones 2019; Robertson 2017). During the Cold War, Radio Liberty was effective due to its intimate understanding of its audiences, which allowed it to better tailor its content to Soviet citizens (Mikkonen 2010, 776).

It would be useful for policy practitioners to take a step back and better consider the needs of their audiences rather than assuming that they crave critical political content. Strategically framing content to emphasize shared social categories may be a more effective way to exert influence (Turner 1991). If democratic states are going to win the information war, “the United States and other actors need to develop strategies for enhancing credibility instead of engaging in hybridized propaganda wars” (Surowiec 2017, 25).
In sum, I present important evidence about the conditions under which U.S. foreign broadcasting is more likely to be effective, with direct implications for U.S. democracy promotion (Bush and Prather 2018; Farwell 2018). While the results are from a single

case, they caution against overstating the efficacy of digital foreign broadcasting. I argue that foreign voices may find it extremely difficult to promote reform in non-democratic states.

Conclusion and Implications

As new ICTs lower the cost of reaching foreign audiences, the perceived threat of international propaganda is likely to grow (Tucker et al. 2017). Foreign actors will continue to rely on disinformation to promote their foreign policy agendas, sow doubts about electoral outcomes, and exacerbate political polarization (Pomerantsev 2015). Nevertheless, we should be skeptical that the development of new digital technologies marks a fundamental shift for democratic governance and national security (Baum and Potter 2019). According to some estimates, “Russian bots tweeted 2.1 million times before the election – certainly a worrisome number. But these represented only 1 percent of all election-related tweets and 0.5 percent of views of election-related tweets” (Nyhan 2018). Russian propaganda reached over 126 million people on Facebook alone in the United States. However, over that same period, Americans received 33 trillion items in their newsfeeds (Fund 2017). We should not ignore foreign actors' attempts to manipulate democratic processes, but neither should we exaggerate their influence (Little 2018).

In this dissertation, I re-conceptualize how we think about effective propaganda by assessing whether exposure to international propaganda has soft power, sharp power, and/or third-person power effects. Propaganda may not fundamentally affect the international balance of power (Lanoszka 2019). In fact, “describing propaganda efforts in military terms indicates a basic misunderstanding of how international propaganda works” (Nicols 1984, 130). International propaganda can have more subtle effects on public opinion, especially by creating the impression that foreign intervention is more effective than it really is (Tomz and Weeks 2019). I assess whether making people aware of Russian propaganda and providing explicit warning messages eliminates the influence of propagandistic messages.
I also consider whether our inoculation strategies have inadvertent effects on public opinion. Finally, I evaluate the effectiveness of offensive broadcasting in an autocratic context, examining how counter-propagandists can craft messages to persuade skeptical audiences. While there is a lot of attention to the “information war,” there is less discussion about how organizations can craft narratives that appeal to diverse foreign audiences or whether these initiatives have any long-lasting political significance (Pomerantsev 2015; Xie and Boyd-Barrett 2015).

In my survey experiments, I find that the direct influence of exposure to Russian disinformation is limited but not nonexistent. Americans exposed to Russian propaganda tended to exhibit less favorable attitudes toward rival states. Older conservatives are more receptive to foreign-backed conspiracy theories that promote a conservative agenda, while younger people with low political awareness tend to be more receptive to disinformation about U.S. foreign policy. I also test whether exposure to messages that criticize the United States makes Americans more politically cynical but find no evidence of this effect.

Notably, I show that some of our initiatives that warn people about foreign propaganda are either ineffective or counter-productive. Offensive measures, which aim to undermine support for autocratic leaders, are also generally ineffective, reaching only audiences who are politically disinterested. We should not assume that citizens in non-democratic states crave content critical of their regime, nor should we overestimate their ignorance about their regimes' actions. I show that while some types of foreign content can lower favorability toward non-democratic rulers, these effects are limited to the least politically engaged individuals. While this group of citizens may become politically active in the future, it is unclear whether they will be a pivotal force for democratization.
Additionally, overly critical content may have inadvertent consequences and help entrench support for autocratic leaders.

Perhaps most interestingly, I demonstrate that the belief that other people are susceptible to propaganda can decrease democratic legitimacy and increase support for censorship.

The perception of propaganda's effectiveness on others seems to have larger effects than exposure to propaganda itself. While we may never know whether Russia swayed the 2016 election, much of that debate is inconsequential, since a significant portion of citizens already think that Russian propaganda shaped the outcome. Perceptions of propaganda's effectiveness can increase support for censorship, which can lead down a slippery slope toward greater media regulation. If citizens aren't vigilant, governments can use the excuse of foreign interference to invalidate elections, suppress civil liberties, and promote their own political agendas (Bermeo 2003; Lanoszka 2019; Tomz and Weeks 2019).

Limitations

This dissertation makes several key contributions to research on foreign propaganda, inoculation strategies, autocratic information politics, and international democratization. However, it also has several limitations. First, in online survey experiments, participants are assigned to read specific articles or other experimental messages. Consequently, some might question the external validity of these studies. Do the results that we find in experiments tell us anything about political communication in the real world? What do we gain by directly measuring the influence of particular messages?

I contend that these survey experiments provide insights into the micro-level effects of particular messages. They allow us to move beyond merely speculating about how people might react if they come into contact with international propaganda, which individuals are most receptive, and what types of messages may have the greatest influence (Lanoszka 2019). Despite long-standing claims that we gain nothing from experiments in the study of propaganda (Ellul 1965), similar studies have provided invaluable insights into political communication (Barabas and Jerit 2010; Brady 2000; Druckman and Leeper 2012; Mullinix, Leeper and Druckman 2015). Survey experiments also help researchers overcome issues of endogeneity and selective exposure in propaganda studies

(Druckman et al. 2006). While my designs can be considered “forced exposure” – since participants are not in control of the content they are presented with – people's exposure to international propaganda on social media means that real-life selection may also be inadvertent (Arceneaux and Johnson 2013). Given that such people likely have very different priors about the issues described and a different interest in the provided information, it is not clear whether we could expect similar results in a typical news consumption setting. That being said, the fact that participants complete the surveys from the comfort of their own home or workplace increases “the mundane realism of the experiment and the generalizability of the results,” as individuals are most likely to be exposed to such messages while browsing social media (Huang and Yeh 2017, 292).

How the treatments are operationalized can also shape experimental findings. For example, participants might be asked to read full articles in experiments, but in the real world, they may only read the headlines of news articles. I present my treatments as Facebook posts to increase the external validity of the experiments by better mimicking how people may be exposed to propaganda in the real world (Druckman and Kam 2011, 8). Additionally, the multiple stimuli that compete for citizens' attention in the real world most likely reduce the effect sizes reported in experiments, where participants are usually exposed to a handful of stimuli and asked to respond immediately (Barabas and Jerit 2010, 238). In some of my experiments, I expose participants to non-political placebo articles to simulate how they would receive information online – mixing politics with advertisements and entertainment. Timing also matters, since individuals may be pre-treated with information, meaning that individuals who are technically in the control group might actually be treated (Slothuus 2016).
It is also possible that the effects in this study could vary as Russia comes in and out of the news cycle (Einstein and Glick 2015). Assessing how effects change over time is a necessary area of research.

Relatedly, I find attitudinal changes on issues that may be low salience for most Americans. For example, it may not be overly surprising that new information about Ukrainian human rights violations, regardless of the source, can change people's attitudes. People's assessments of foreign nations may be highly dependent on reports about a single international event or the country's leader (Balmas 2017, p. 667). Assessing how negative information influences people's views of states about which they have stronger political priors is a necessary area of research (Guardino and Hayes 2017).

One final methodological limitation is the reliance on a convenience sample that under-represents older citizens, conservatives, and the less-educated. The point estimates on interactions, in particular, are noisier because of the small number of individuals with these backgrounds, making the estimated treatment effects for these groups less precise. Participants, especially on crowdsourcing platforms, might also be knowledgeable enough about research methods to pick up on experimental manipulations (Krupnikov and Levine 2014). However, as previous studies have confirmed, the experimental results found in convenience samples do not differ much from those in more representative samples (Buhrmester, Kwang and Gosling 2011; Casler, Bickel and Hackett 2013; Chandler, Mueller and Paolacci 2014; Clifford, Jewell and Waggoner 2015; Coppock 2018).

Despite some of these concerns, prior work finds that the results of survey experiments resemble what takes place in the “real world” and provide valuable information about the direction of effects and mechanisms – making them invaluable tools in political science research (Barabas and Jerit 2010, 239).
Moreover, in conjunction with nationally representative survey data, big-data/text mining, and in-depth knowledge of propaganda structures, these studies provide a more holistic picture of propaganda effects and the consequences of our counter-propaganda programs. Additionally, one can gain greater confidence in the external validity of experimental findings when the results of multiple studies converge (Campbell and Fiske 1959).

Consequently, I rely on a series of survey experiments to obtain a broader picture of how exposure to foreign messages and inoculations shapes mass attitudes. I assert that these experiments reveal a great deal about the direct and indirect impacts of international propaganda that previous studies have overlooked or only hinted at.

Beyond methodological concerns, some fundamentally doubt whether experiments or surveys can tell us much about the effect of propaganda in the first place and argue that these methods are wholly inadequate. Ellul argues that “such experiments take place in a vacuum, in that the individuals subjected to them are cut off from their normal milieu. The normal conditions under which propaganda works are in no way reproduced” (Ellul 1965, 266). These debates highlight the long historical debate over the epistemology of propaganda. While some embrace the neo-Gramscian perspective on propaganda, seeing the study of propaganda as intrinsically tied to theories of ideology and discourse (Allen 1993; Burnett 1989; Combs and Nimmo 1993; Cunningham 2002; Sproule 1987; Stanley 2015), others have embraced the propaganda-as-psychology model by focusing exclusively on the measurable effects of particular messages (Biddle 1931; Henderson 1943; Silverstein 1987). While Ellul and others are undoubtedly correct in noting that no survey or experiment can capture the full influence of propaganda, the positivist approach to communication research has yielded immeasurable benefits to the study of propaganda by giving us insight into who is most persuaded, how they change their attitudes and behaviors, and why they do so.

Future Research

One major contribution of this dissertation is that it raises a host of new questions for future research. Specifically, future research would do well to: (i) test how foreign propaganda on different topics (race, inequality, etc.) can exacerbate affective polarization; (ii) analyze international propaganda's effectiveness outside the United States; (iii) assess

the influence of non-Russian propaganda; (iv) unpack the mechanisms driving propaganda and inoculation effects; and (v) identify propaganda effects on elites. I address each of these in turn.

First, we should assess on which topics foreign propaganda is most effective. While my studies focused on Russian propaganda on a wide variety of issues, including the conflict in Ukraine, deficits in American democratic institutions, the war in Syria, and others, analysts have found that Russian propaganda often tried to stoke divisions over race-related issues. Over 1,900 of 3,500 Russian advertisements on Facebook made explicit references to race politics, with 25% focusing on police and crime (Penzenstadler, Heath and Guynn 2018). Of the five most followed IRA accounts on Twitter, four predominantly covered race-related issues. Content analysis of Russian accounts on Reddit reaffirms that much of the propaganda was devoted to police brutality.14 Researchers found that “Russian ads related to police brutality were issued to coincide with periods of higher unrest” (Etudo, Yoon and Yaraghi 2019, 894). A report on Russian disinformation released in December 2018 argues that repetitive targeting with posts concerning Black Lives Matter and police brutality was meant to discourage voting for Hillary Clinton (Howard et al. 2018, 9). Conversely, posts supporting police officers were meant to appeal to more traditional conservatives and Donald Trump supporters. By playing both sides, the Kremlin aims to exacerbate affective polarization and racial animus in the United States. While much work has been done on the origins of polarization and its consequences, less work has focused on how international actors can exacerbate internal divisions and threaten democratic governance.

Second, we need more research on the factors that predict receptivity to international propaganda in post-Soviet states, which may be particularly vulnerable.
Nowhere has Russian propaganda been a larger threat than in the Baltic States (Chapman and Gerber 2019;

14. https://arcdigital.media/russian-propaganda-on-reddit-7945dc04eb7b

Gerber and Zavisca 2016; Lanoszka 2019; Sarlo 2017). Latvia, in particular, has been a prime target of Russian disinformation due to its shared border with Russia, its history of Soviet occupation, and its ethnic composition. According to Maris Cepuritis, a researcher at the Centre for East European Policy Studies, and Rita Rudusa, director of the Baltic Media Centre of Excellence, “the disproportionally large presence of Russian-language media that attempts to ensure Russia's political influence in Latvia is one of the country's major challenges to its information security” (Kudors 2018). While the consumption of Russian media networks such as RT and Sputnik is limited (12% and 4% of Latvians, respectively), “the total use of Russian media content in TV in Latvia has increased in recent years” (Berzina 2018, 4). Because nearly 90% of Latvians have a working knowledge of Russian, the Kremlin has the potential to shape not only public attitudes toward Russian domestic and foreign policy, but also issues concerning Latvian society, the European Union, and NATO (Bergmane 2016). Research from the Broadcasting Board of Governors finds that “those who use and trust the information they get on Russian media are much more likely to support Russia's actions in Ukraine, Putin's domestic policies and Putin's international policies than those who use Russian Federation media but do not trust it” (BBG 2016, 1). I hope future studies investigate the influence of particular Russian messages in post-Soviet states (Lanoszka 2019).

Third, it is critical to assess how these findings generalize to propaganda that does not come from Russia. Some might wonder what analysis of the Kremlin's external propaganda can teach us about propaganda emanating from China or terrorist organizations like ISIS.
For instance, Tomz and Weeks (2019) find that Americans react similarly to foreign interventions attributed to countries such as China and Pakistan, arguing that “the specific country mentioned had little effect on public perceptions of intervention” (18). However, more research is necessary on when the nationality of the foreign broadcaster matters for persuasion.

Fourth, one limitation of the present studies is their inability to precisely estimate the mechanisms driving some of the significant (and null) findings. For instance, across my studies, I find that inoculation messages have little effect on receptivity to propaganda and, in some cases, backfire and entrench disinformation. However, the studies do not assess whether exposure to information about the source caused individuals to update their beliefs about the propaganda outlets themselves. Future work should explicitly test whether subjects exposed to information about the source thought that the RT and Sputnik articles involved were propaganda, public diplomacy, disinformation, or legitimate news. This would provide insight into the mechanisms driving the null effect of revealing the message source and into whether individuals update their beliefs about the content of foreign networks when they are given additional information.

I also find that exposure to propaganda leads people to think that others are less susceptible to foreign disinformation. I speculate that if exposure to propaganda, and/or an inoculation message, causes people to believe that propaganda is less threatening than they previously thought, they may update their views on propaganda's influence on others (Smith 1970). Messages that decrease the perceived threat of propaganda can lessen propaganda's presumed influence on others (Sun, Pan and Shen 2008, 282). Showing people that foreign propaganda is not as sophisticated or threatening as they previously believed can decrease its perceived influence over politics. However, future work should untangle which components of propaganda and/or inoculation messages heighten or lower propaganda's presumed influence on others, and test precisely why we witness these effects.
I also find that foreign broadcasting may shape the public opinion of some citizens under very specific conditions, but there is little evidence that this strategy is a strong force for international democratization (Martin 1971; Nicols 1984; White 1952). However, contrary to my expectations, there is no evidence that greater trust in the source is the reason non-voters adopted more anti-Putin views when exposed to balanced foreign criticism. Washing

one's dirty linen in public may not improve the credibility of foreign news outlets, and foreign broadcasters may have little control over how they are perceived by international audiences (Truex 2016; White 1952). Future work needs to address when foreign content can be more persuasive than domestic content (and vice versa).

Finally, I focus on how exposure to Russian-sponsored propaganda influences ordinary citizens' political attitudes. Some might question whether ordinary citizens' attitudes matter, given their lack of knowledge on foreign policy issues and the unclear connection between public opinion and policy outcomes (Lanoszka 2019, 13). However, foreign propaganda can also influence political elites who have more direct control over foreign policy. For example, during a presidential rally in Pennsylvania in 2016, Trump claimed to have evidence that Hillary Clinton was responsible for the deaths of four Americans in Benghazi. The alleged smoking gun, an email from Clinton confidant Sidney Blumenthal, was a fake that originally appeared on the Russian-funded website Sputnik. According to the NPR report that covered the incident, “it's unclear how Trump obtained the same misinformation that appeared in Sputnik” (Naylor 2016).

Another troubling example came in February 2017, when Fox News correspondent Bill O'Reilly questioned Trump's admiration of Russian president Vladimir Putin. “But he's a killer though,” O'Reilly said. “Putin's a killer.” Rather than denounce Putin, President Trump responded, “There are a lot of killers. We've got a lot of killers. What do you think – our country's so innocent?” (Pengelly 2017). However one may feel about U.S. conduct abroad, it is unusual for an American president to equate the actions of the United States with those of a rival country.
Perhaps most unnerving, the president's statement sounded similar to arguments expressed in Russian state-sponsored media that deflect criticism of Russian policies by promoting moral equivalence between democratic and non-democratic countries (Kurtzleben 2017). Some have noticed that, since becoming president, Trump has repeated the Kremlin's

disinformation about the Soviet Union's invasion of Afghanistan in 1979, the utility of NATO, and the Ukraine conflict (Boot 2019; Frum 2019). Despite these statements, Trump's policy toward Russia has been marked more by continuity than change, suggesting that rhetoric may not affect grand strategy (Dombrowski and Reich 2017).

In Europe, the Kremlin's ability to shift populist party leaders' political behavior is troubling. Prior research reveals that the intensity of Russia's linkages with populist parties in Europe increased after the outbreak of conflict in Ukraine (Laruelle 2015; Klapsis 2015; Krekó and Győri 2015; Orenstein 2014). According to Vadim Nikitin, “seemingly abandoned by their own governments, some of Europe's disenchanted have started to support previously fringe extremist parties. They have also started to look to Putin for sympathy” (Nikitin 2016). According to Surowiec, while “Russia and the targeted group of European parties enjoy different depths of relationships, their campaigning tends to have polarizing effects on politics in Europe” (Surowiec 2017). Some are concerned that if autocrats continue to establish ties with far-right and far-left parties, they can shift foreign public opinion in ways that undermine democratic governance and the liberal international order (Nye 2017, 16; Rohac, Zgut and Gyori 2017, 11).

Final Thoughts

While I am far from the first to suggest that international propaganda campaigns are ineffective (Martin 1971; White 1952), or effective only under very particular conditions (Howard et al. 2018; Nelson and Taneja 2018; Oates 2017), I am the first to explicitly test the influence of foreign propaganda on public opinion while also analyzing the efficacy of counter-propaganda strategies. We are beginning to understand more about international propaganda campaigns, but there is still a dearth of research on their influence on mass attitudes (Gerber and Zavisca 2016; Peisakhin and Rozenas 2018). This dissertation complements previous work that examines how often international propaganda is viewed or shared to

measure its success (Metzger and Siegel 2019; Orttung and Nelson 2019). Through a combination of survey research, content analysis, text-based machine learning, network analysis, and survey experiments, I believe we can obtain a broader understanding of the effects of international propaganda and better design programs to counter disinformation.

Finally, the findings presented here have broader implications for propaganda in the era of big data. In his analysis of international propaganda, Martin (1971) argued that:

“If we could select our audience on the basis of certain idiocratic factors – objective physical and personal characteristics peculiar to an individual, such as age, sex, race, education – we might increase by a statistically significant fraction the proportion of those influenced by a message. But we should have no control over such factors as personality and susceptibility to persuasion, existing values, beliefs and opinions or attitudes toward the objects, subjects, and situation involved in the persuasive message” (69-70).

However, with the data revolution in communication, actors can increasingly target audiences on the very factors Martin said were impossible, thereby increasing the chance that foreign messages can exert greater political influence. What makes modern propaganda so compelling and potentially harmful is that foreign actors can utilize big data and social media to develop highly targeted political messages. Communicators can leverage decades of research on successful persuasion with increasing volumes of data to craft messages that better resonate with diverse audiences. As technology allows actors to micro-target audiences in new and sophisticated ways, it is important for all of us to pay attention to how governments, businesses, and independent actors use propaganda to undermine our democracy.

References

Adelman, Levi and Nilanjana Dasgupta. 2019. “Effect of Threat and Social Identity on Reactions to Ingroup Criticism: Defensiveness, Openness, and a Remedy.” Personality and Social Psychology Bulletin 45(5):740–753.

Adena, Maja, Ruben Enikolopov, Maria Petrova, Veronica Santarosa and Ekaterina Zhuravskaya. 2015. “Radio and the Rise of the Nazis in Prewar Germany.” The Quarterly Journal of Economics 130(4):1885–1939.

Ahler, Douglas J. and Gaurav Sood. 2018. “The Parties in Our Heads: Misperceptions about Party Compositions and Their Consequences.” The Journal of Politics .

Albright, Jonathan. 2017a. “Instagram, Meme Seeding, and the Truth about Facebook Manipulation, Pt. 1.” Medium .

Albright, Jonathan. 2017b. “Who Hacked the Election? Ad Tech did. Through “Fake News,” Identity Resolution and Hyper-Personalization.” Medium . https://medium.com/tow-center/who-hacked-the-election-43d4019f705f.

Alexander, Michele G., Shana Levin and Peter J. Henry. 2005. “Image Theory, Social Identity, and Social Dominance: Structural Characteristics and Individual Motives Un- derlying International Images.” Political Psychology 26(1):27–45.

Allcott, Hunt and Matthew Gentzkow. 2017. “Social Media and Fake News in the 2016 Election.” Journal of Economic Perspectives 31(2):211–236.

Allcott, Hunt, Matthew Gentzkow and Chuan Yu. 2019. “Trends in the Diffusion of Misinformation on Social Media.” Research and Politics pp. 1–8.

Allen, Barry. 1993. “On the Definition of Propaganda.” In Propaganda and the Ethics of Rhetoric. Ottawa: The Canadian Journal of Rhetorical Studies pp. 1–12.

Allen-Ebrahimian, Bethany, Elias Groll and Robbie Gramer. 2016. “New House Bills Take Aim at Foreign Propaganda.” Foreign Policy .

Allen, Mike. 1991. “Meta-analysis Comparing the Persuasiveness of One-sided and Two- sided Messages.” Western Journal of Speech Communication 55(4):390–404.

Allport, Floyd H. and Mary Mathes Simpson. 1946. “Broadcasting to an Enemy Country: What Appeals are Effective, and Why.” The Journal of Social Psychology 23(2):217– 224.

Althaus, Scott L., Jill Edy, Robert Entman and Patricia Phalen. 1996. “Revising the Indexing Hypothesis: Officials, Media and the Libya Crisis.” Political Communication 13(4):407–421.

Ambrosio, Thomas. 2010. “Constructing a Framework of Authoritarian Diffusion: Concepts, Dynamics, and Future Research.” International Studies Perspectives 11(4):375–392.

Anspach, Nicolas M. and Taylor N. Carlson. 2018. “What to Believe? Social Media Commentary and Belief in Misinformation.” Political Behavior pp. 1–22.

Arceneaux, Kevin and Martin Johnson. 2013. Changing Minds or Changing Channels?: Partisan News in an Age of Choice. University of Chicago Press.

Arceneaux, Kevin and Ryan J. Vander Wielen. 2017. Taming Intuition: How Reflection Minimizes Partisan Reasoning and Promotes Democratic Accountability. Cambridge University Press.

Arif, Ahmer, Leo Graiden Stewart and Kate Starbird. 2018. “Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction 2:1–26.

Ashmore, Richard D., David Bird, Frances K. Del Boca and Robert C. Vanderet. 1979. “An Experimental Investigation of the Double Standard in the Perception of International Affairs.” Political Behavior 1(2):123–135.

Ausderan, Jacob. 2014. “How Naming and Shaming Affects Human Rights Perceptions in the Shamed Country.” Journal of Peace Research 51(1):81–95.

Avgerinos, Katherine P. 2009. “Russia’s Public Diplomacy Effort: What the Kremlin is Doing and Why it’s Not Working.” Journal of Public and International Affairs 20(1):115–132.

Ayalon, Ami, Elad Popovich and Moran Yarchi. 2016. “From Warfare to Imagefare: How States Should Manage Asymmetric Conflicts with Extensive Media Coverage.” Terrorism and Political Violence 28(2):254–273.

Badawy, Adam, Emilio Ferrara and Kristina Lerman. 2018. “Analyzing the Digital Traces of Political Manipulation: The 2016 Russian Interference Twitter Campaign.” arXiv preprint arXiv:1802.04291.

Bailard, Catie Snow. 2012. “A Field Experiment on the Internet’s Effect in an African Election: Savvier Citizens, Disaffected Voters, or Both?” Journal of Communication 62(2):330–344.

Balmas, Meital. 2017. “Bad News: The Changing Coverage of National Leaders in Foreign Media of Western Democracies.” Mass Communication and Society 20(1):663–685.

Balmas, Meital. 2018. “Tell me Who is Your Leader, and I will Tell You Who You Are: Foreign Leaders’ Perceived Personality and Public Attitudes Toward their Countries and Citizenry.” American Journal of Political Science 62(2):499–514.

Banas, John A. and Stephen A. Rains. 2010. “A Meta-analysis of Research on Inoculation Theory.” Communication Monographs 77(3):281–311.

Barabas, Jason and Jennifer Jerit. 2010. “Are Survey Experiments Externally Valid?” American Political Science Review 104(2):226–242.

Barghoorn, Frederick Charles. 1964. Soviet Foreign Propaganda. Princeton University Press.

Baron, Reuben M. and David A. Kenny. 1986. “The Moderator-Mediator Variable Distinction in Social Psychological Research: Conceptual, Strategic, and Statistical Considerations.” Journal of Personality and Social Psychology 51(6):1173–1182.

Baum, Matthew A. and Philip BK Potter. 2019. “Media, Public Opinion, and Foreign Policy in the Age of Social Media.” The Journal of Politics .

Bayer, Lili and Joanna Plucinska. 2018. “Orban’s Media Puppetmaster.” Politico.

BBG. 2016. “Role of Russian Media in the Baltics and Moldova.” Broadcasting Board of Governors .

Behrouzian, Golnoosh, Erik C. Nisbet, Aysenur Dal and Ali Çarkoğlu. 2016. “Resisting Censorship: How Citizens Navigate Closed Media Environments.” International Journal of Communication 10:4345–4367.

Bell, Mark S. and Kai Quek. 2018. “Authoritarian Public Opinion and the Democratic Peace.” International Organization 71(1):227–242.

Benkler, Yochai. 2018. “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.”

Benkler, Yochai, Robert Faris and Hal Roberts. 2018. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Oxford University Press.

Bennett, Lance W. 1996. “An Introduction to Journalism Norms and Representations of Politics.” Political Communication 13(4):373–384.

Bergmane, Una. 2016. “Latvia’s Debate About Russian Propaganda.” Baltic Bulletin . https://www.fpri.org/article/2016/07/latvias-debate-russian-propaganda/.

Bermeo, Nancy Gina. 2003. Ordinary People in Extraordinary Times: The Citizenry and the Breakdown of Democracy. Princeton University Press.

Bernays, Edward L. 1928. Propaganda. New York: H. Liveright.

Berzina, Ieva. 2018. “Political Trust and Russian Media in Latvia.” Journal on Baltic Security 4(2):1–9.

Besley, Timothy and Andrea Prat. 2006. “Handcuffs for the Grabbing Hand? Media Capture and Government Accountability.” American Economic Review 96(3):720–736.

Best, Samuel J., Brian Chmielewski and Brian S. Krueger. 2005. “Selective Exposure to Online Foreign News During the Conflict with Iraq.” Harvard International Journal of Press/Politics 10(4):52–70.

Biddle, William W. 1931. “A Psychological Definition of Propaganda.” The Journal of Abnormal and Social Psychology 26(3):283–295.

Bisgaard, Martin. 2019. “How Getting the Facts Right Can Fuel Partisan-Motivated Reasoning.” American Journal of Political Science.

Bittman, Ladislav. 1985. The KGB and Soviet Disinformation: An Insider’s View. Wash- ington: Pergamon-Brassey’s.

Bjola, Corneliu. 2018. “The Ethics of Countering Digital Propaganda.” Ethics & International Affairs 32(3):305–315.

Boduszynski, Mieczyslaw and Philip Breeden. 2017. “Russian Disinformation and U.S. Public Diplomacy.” CPD Blog .

Bolsover, Gillian and Philip Howard. 2017. “Computational Propaganda and Political Big Data: Moving Toward a More Critical Research Agenda.” Big Data 5(4):273–276.

Boot, Max. 2018. “Without the Russians, Trump wouldn’t have won.” The Washington Post .

Boot, Max. 2019. “Once again, President Trump is repeating talking points from Moscow.” The Washington Post .

Born, Kelly. 2017. “Six Features of the Disinformation Age.” Project Syndicate .

Boukes, Mark and Hajo G. Boomgaarden. 2015. “Soft News With Hard Consequences? Introducing a Nuanced Measure of Soft Versus Hard News Exposure and Its Relationship with Political Cynicism.” Communication Research 42(5):701–731.

Brady, Henry E. 2000. “Contributions of Survey Research to Political Science.” Political Science & Politics 33(1):47–58.

Branscombe, Nyla R., Naomi Ellemers, Russell Spears and Bertjan Doosje. 1999. “The Context and Content of Social Identity Threat.” Social Identity: Context, Commitment, Content pp. 35–58.

Brewer, Marilynn B. 1999. “The Psychology of Prejudice: Ingroup Love and Outgroup Hate?” Journal of Social Issues 55(3):429–444.

Brewer, Paul R. 2006. “National Interest Frames and Public Opinion About World Affairs.” Harvard International Journal of Press/Politics 11(4):89–102.

Brewer, Paul R., Kimberly Gross, Sean Aday and Lars Willnat. 2004. “International Trust and Public Opinion about World Affairs.” American Journal of Political Science 48(1):93–109.

Brock, Maria. 2018. “Political Satire and its Disruptive Potential: Irony and Cynicism in Russia and the US.” Culture, Theory, and Critique 59(3):281–298.

Bronstein, Michael V., Gordon Pennycook, Adam Bear, David G. Rand and Tyrone D. Cannon. 2019. “Belief in Fake News is Associated with Delusionality, Dogmatism, Religious Fundamentalism, and Reduced Analytic Thinking.” Journal of Applied Research in Memory and Cognition 8(1):108–117.

Brooks, Stephen. 2015. Anti-Americanism and the Limits of Public Diplomacy: Winning Hearts and Minds? Routledge.

Budak, Ceren, Divyakant Agrawal and Amr El Abbadi. 2011. “Limiting the Spread of Misinformation in Social Networks.” Proceedings of the 20th International Conference on World Wide Web, ACM.

Budak, Ceren, Sharad Goel and Justin M. Rao. 2016. “Fair and Balanced? Quantifying Media Bias Through Crowdsourced Content Analysis.” Public Opinion Quarterly 80(1):250–271.

Buhrmester, Michael, Tracy Kwang and Samuel D. Gosling. 2011. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-quality, Data?” Perspectives on Psychological Science 6(1):3–5.

Bump, Philip. 2017. “One of the Busiest Websites in the U.S. in 2016 Regularly Linked to Russian Propaganda.” Washington Post. https://wapo.st/2rqUfPv (accessed May 7, 2018).

Burnett, Nicholas. 1989. “Ideology and Propaganda: Toward an Integrative Approach.” Propaganda: A Pluralistic Perspective pp. 127–137.

Bush, Sarah Sunn and Lauren Prather. 2017. “The Promise and Limits of Election Observers in Building Election Credibility.” The Journal of Politics 79(3):921–935.

Bush, Sarah Sunn and Lauren Prather. 2018. “Who’s There? Election Observer Identity and the Local Credibility of Elections.” International Organization 72(3):659–692.

Campbell, Donald T. and Donald W. Fiske. 1959. “Convergent and Discriminant Validation by the Multitrait-Multimethod Matrix.” Psychological Bulletin 56(2):81–105.

Cappella, Joseph N. and Kathleen H. Jamieson. 1996. “News Frames, Political Cynicism, and Media Cynicism.” The Annals of the American Academy of Political and Social Science 546(1):71–84.

Cappella, Joseph N. and Kathleen H. Jamieson. 1997. Spiral of Cynicism: The Press and the Public Good. Oxford University Press.

Carter, Erin B. and Brett Carter. 2018. Fighting for Citizens’ Minds: Autocratic Propaganda in the Information Age. Cambridge University Press.

Casler, Krista, Lydia Bickel and Elizabeth Hackett. 2013. “Separate but equal? A comparison of participants and data gathered via Amazon’s MTurk, social media, and face-to-face behavioral testing.” Computers in Human Behavior 29(6):2156–2160.

CBS. 2017. “U.S. launches TV network as alternative to Russian propaganda.” CBS News . https://www.cbsnews.com/news/us-current-time-tv-network-rfe-russia-russian- propaganda-misinformation-rt/.

Centola, Damon and Michael Macy. 2007. “Complex contagions and the weakness of long ties.” American Journal of Sociology 113(3):702–734.

Chandler, Jesse, Pam Mueller and Gabriele Paolacci. 2014. “Nonnaïveté among Amazon Mechanical Turk workers: Consequences and solutions for behavioral researchers.” Behavior Research Methods 46(1):112–130.

Chapman, Hannah S. and Theodore P. Gerber. 2019. “Opinion-Formation and Issue-Framing Effects of Russian News in Kyrgyzstan.” International Studies Quarterly.

Chen, Dan. 2018. “Political Context and Citizen Information: Propaganda Effects in China.” International Journal of Public Opinion Research .

Chen, Yuyu and David Y. Yang. 2018. “The Impact of Media Censorship: 1984 or Brave New World?” American Economic Review 109(6):2294–2332.

Chiang, Chun-Fang and Brian Knight. 2011. “Media Bias and Influence: Evidence from Newspaper Endorsements.” The Review of Economic Studies 78(3):795–820.

Chmel, Kirill, Nikita Savin and Michael X. Delli Carpini. 2018. “Making Politics Attractive: Political Satire and Exposure to Political Information in New Media Environment in Russia.” Higher School of Economics Research Paper No. WP(BRP 63):1–31.

Chong, Alberto, Ana L. De La O, Dean Karlan and Leonard Wantchekon. 2014. “Does Corruption Information Inspire the Fight or Quash the Hope? A Field Experiment in Mexico on Voter Turnout, Choice, and Party Identification.” The Journal of Politics 77(1):55–71.

Chong, Dennis and James N. Druckman. 2012. “Counterframing Effects.” The Journal of Politics 75(01):1–16.

Citrin, Jack and Laura Stoker. 2018. “Political Trust in a Cynical Age.” Annual Review of Political Science 21:49–70.

Clayton, Katherine, Jase Davis, Kristen Hinckley and Yusaku Horiuchi. 2018. “Partisan Motivated Reasoning and Misinformation in the Media: Is News from Ideologically Uncongenial Sources More Suspicious?” Available at SSRN 3035272.

Clayton, Katherine, Spencer Blair, Jonathan A. Busam, Samuel Forstner, John Glance, Guy Green, Anna Kawata, Akhila Kovvuri, Jonathan Martin, Evan Morgan, Morgan Sandhu, Rachel Sang, Rachel Scholz-Bright, Austin T. Welch, Andrew G. Wolff, Amanda Zhou and Brendan Nyhan. 2019. “Real Solutions for Fake News? Measuring the Effectiveness of General Warnings and Fact-Check Tags in Reducing Belief in False Stories on Social Media.” Political Behavior.

Clifford, Scott, Ryan M. Jewell and Philip D. Waggoner. 2015. “Are Samples Drawn from Mechanical Turk Valid for Research on Political Ideology?” Research & Politics 2(4).

Cohen, Jeremy, Diana Mutz, Vincent Price and Albert Gunther. 1988. “Perceived Impact of Defamation: An Experiment on Third-Person Effects.” Public Opinion Quarterly 52(2):161–173.

Cohen, Jeremy and Robert G. Davis. 1991. “Third-Person Effects and the Differential Impact in Negative Political Advertising.” Journalism Quarterly 68(4):680–688.

Combs, James E. and Dan D. Nimmo. 1993. The New Propaganda: The Dictatorship of Palaver in Contemporary Politics. Longman Publishing Group.

Friedersdorf, Conor. 2018. “Trump and Russia Both Seek to Exacerbate the Same Political Divisions.” The Atlantic. https://www.theatlantic.com/politics/archive/2018/01/trump-russia-twitter/551093/.

Constine, Josh. 2018. “Facebook Shrinks Fake News After Warnings Backfire.” Tech Crunch . https://techcrunch.com/2018/04/27/facebook-false-news/.

Coppock, Alexander. 2018. “Generalizing from Survey Experiments Conducted on Mechanical Turk: A Replication Approach.” Political Science Research and Methods pp. 1–16.

Crabtree, Charles, David Darmofal and Holger L. Kern. 2015. “A Spatial Analysis of the Impact of Western German Television on Protest Mobilization During the East German Revolution.” Journal of Peace Research 52(3):269–284.

Crabtree, Charles, Holger L. Kern and Steven Pfaff. 2018. “Mass Media and the Diffusion of Collective Action in Authoritarian Regimes: The June 1953 East German Uprising.” International Studies Quarterly 62(2):301–314.

Crilley, Rhys. 2017. “Did RT Influence the 2016 US Elections?” E-International Relations.

Crilley, Rhys. 2018. “Should the UK Government Ban Russia Today? Our Research Suggests it Should not ’Shut Up and Go Away’.” AHRC Blog. https://ahrc-blog.com/2018/04/05/should-the-uk-government-ban-russia-today-research-suggests-it-should-not-shut-up-and-go-away/.

Cunningham, Stanley. 2002. The Idea of Propaganda: A Reconstruction. Greenwood Publishing Group.

Dancey, Logan. 2012. “The Consequences of Political Cynicism: How Cynicism Shapes Citizens’ Reactions to Political Scandals.” Political Behavior 34(3):411–423.

Danielson, Elena. 2004. Cold War Broadcasting Impact. In Report on a Conference organized by the Hoover Institution and the Cold War International History Project of the Woodrow Wilson International Center for Scholars.

Darby, Luke. 2019. “Fox News Was Duped by a Seth Rich Conspiracy Pushed by Russian Intelligence.” GQ .

Daskal, Jennifer. 2019. “A ’Fake News’ Law Gives Singapore Worrisome Powers.” The New York Times .

Daugherty, William E. and Morris Janowitz. 1958. A Psychological Warfare Casebook. Johns Hopkins Press.

Dave, Paresh and Christopher Bing. 2019. “Russian Disinformation on YouTube Draws Ads, Lacks Warning Labels: Researchers.” Reuters .

Davison, W. Phillips. 1983. “The Third-Person Effect in Communication.” Public Opinion Quarterly 47(1):1–15.

Deeks, Ashley, Sabrina McCubbin and Cody M. Poplin. 2017. “Addressing Russian Influence: What Can We Learn From U.S. Cold War Counter-Propaganda Efforts?” Lawfare.

DellaVigna, Stefano, Ruben Enikolopov, Vera Mironova, Maria Petrova and Ekaterina Zhuravskaya. 2014. “Cross-Border Media and Nationalism: Evidence from Serbian Radio in Croatia.” American Economic Journal: Applied Economics 6(3):103–132.

DeMarzo, Peter M., Dimitri Vayanos and Jeffrey Zwiebel. 2003. “Persuasion Bias, Social Influence, and Unidimensional Opinions.” The Quarterly Journal of Economics 118(3):909–968.

D’Hooghe, Ingrid. 2014. China’s Public Diplomacy. Martinus Nijhoff Publishers.

Douai, Aziz. 2014. “The “Presumed” Influence of US International Broadcasting: Understanding Arab Audiences’ Responses to Al-Hurra Television.” Democratic Communiqué 26(2):138–159.

Dombrowski, Peter and Simon Reich. 2017. “Does Donald Trump have a Grand Strategy?” International Affairs 93(5):1013–1037.

Dragojlovic, Nick. 2013. “Leaders without Borders: Familiarity as a Moderator of Transnational Source Cue Effects.” Political Communication 30(2):297–316.

Dragojlovic, Nick. 2015. “Listening to Outsiders: The Impact of Messenger Nationality on Transnational Persuasion in the United States.” International Studies Quarterly 59(1):73–85.

Druckman, James N. and Cindy D. Kam. 2011. Students as experimental participants: A defense of the narrow data base. In Cambridge handbook of experimental political science. Cambridge University Press pp. 41–57.

Druckman, James N., Donald P. Green, James H. Kuklinski and Arthur Lupia. 2006. “The Growth and Development of Experimental Research in Political Science.” American Political Science Review 100(4):627–635.

Druckman, James N. and Thomas J. Leeper. 2012. “Learning More from Political Communication Experiments: Pretreatment and Its Effects.” American Journal of Political Science 56(4):875–896.

Dutta-Bergman, Mohan J. 2006. “US Public Diplomacy in the Middle East: A Critical Cultural Approach.” Journal of Communication Inquiry 30(2):102–124.

Einstein, Katherine Levine and David M. Glick. 2015. “Do I think BLS data are BS? The Consequences of Conspiracy Theories.” Political Behavior 37(3):679–701.

Eisinger, Robert M. 1999. “Cynical America? Misunderstanding the Public’s Message.” The Public Perspective 10:45–48.

Eisinger, Robert M. 2000. “Questioning Cynicism.” Society 37(5):55–60.

Elenbaas, Matthijs and Claes H. De Vreese. 2008. “The Effects of Strategic News on Political Cynicism and Vote Choice Among Young Voters.” Journal of Communication 58(3):550–567.

Ellul, Jacques. 1965. Propaganda: The Formation of Men’s Attitudes. Knopf.

Endres, Kyle and Kristin J. Kelly. 2018. “Does Microtargeting Matter? Campaign Contact Strategies and Young Voters.” Journal of Elections, Public Opinion, and Parties 28(1):1–18.

Entman, Robert M. 2004. Projections of Power: Framing News, Public Opinion, and US Foreign Policy. University of Chicago Press.

Erber, Ralph and Richard R. Lau. 1990. “Political Cynicism Revisited: An Information-Processing Reconciliation of Policy-based and Incumbency-based Interpretations of Changes in Trust in Government.” American Journal of Political Science 34(1):236–253.

Ettinger, Karl E. 1946. “Foreign Propaganda in America.” Public Opinion Quarterly 10(3):329–342.

Etudo, Ugo, Victoria Y. Yoon and Niam Yaraghi. 2019. “From Facebook to the Streets: Russian Troll Ads and Black Lives Matter Protests.” Proceedings of the 52nd Hawaii International Conference on System Sciences .

Eveland, William P., Amy I. Nathanson, Benjamin H. Detenber and Douglas M. McLeod. 1999. “Rethinking the Social Distance Corollary: Perceived Likelihood of Exposure and the Third-Person Perception.” Communication Research 26(3):275–302.

Fan, Ying. 2008. “Soft Power: Power of Attraction or Confusion?” Place Branding and Public Diplomacy 4(2):147–158.

Faris, Robert M., Hal Roberts, Bruce Etling, Nikki Bourassa, Ethan Zuckerman and Yochai Benkler. 2017. “Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election.” Berkman Klein Center .

Farkas, Johan, Jannick Schou and Christina Neumayer. 2018. “Cloaked Facebook Pages: Exploring Fake Islamist Propaganda in Social Media.” New Media & Society 20(5):1850–1867.

Farkas, Johan and Marco Bastos. 2018. “IRA Propaganda on Twitter: Stoking Antagonism and Tweeting Local News.” Proceedings of the 9th International Conference on Social Media and Society . http://openaccess.city.ac.uk/19401/1/FarkasBastos-SMS.pdf.

Farwell, James. 2018. “Countering Russian Meddling in US Political Processes.” Parameters 48(1):37–47.

Feinberg, Andrew. 2017. “My Life at a Russian Propaganda Network.” Politico.

Feng, Guangchao Charles and Steve Zhongshi Guo. 2012. “Support for Censorship: A Multilevel Meta-analysis of the Third-person Effect.” Communication Reports 25(1):40–50.

Festinger, Leon. 1957. A Theory of Cognitive Dissonance. Stanford CA: Stanford Univer- sity Press.

Figueira, Alvaro and Luciana Oliveira. 2017. “The Current State of Fake News: Challenges and Opportunities.” Procedia Computer Science pp. 817–825.

Filipov, David. 2017. “A majority of Russians don’t trust Putin to solve corruption. But they trust him to run the country.” The New York Times .

Filloux, Frederic. 2017. “You can’t sell news for what it costs to make.” Medium.

Fisher, Aleksandr. 2019a. “A New Cold War? International Public Opinion of Russia and the United States.” International Journal of Public Opinion Research .

Fisher, Aleksandr. 2019b. “Perceptions of Russian Interference in U.S. Elections Matters as Much as Actual Involvement.” Foreign Policy Research Institute .

Fisher, Aleksandr. 2020. “Trickle Down Soft Power: Do Russia’s Ties to European Parties Influence Public Opinion?” Foreign Policy Analysis .

Fletcher, Richard, Alessio Cornia, Lucas Graves and Rasmus Kleis Nielsen. 2018. “Measuring the reach of “fake news” and online disinformation in Europe.” Reuters Institute for the Study of Journalism.

Flock, Elizabeth. 2018. “After a Week of Russian Propaganda, I Was Questioning Everything.” PBS. https://www.pbs.org/newshour/arts/after-a-week-of-russian-propaganda-i-was-questioning-everything.

Fly, Jamie, Laura Rosenberger and David Salvo. 2018. “Policy Blueprint for Countering Authoritarian Interference in Democracies.” The German Marshall Fund of the United States .

Foyle, Douglas C. 1997. “Public Opinion and Foreign Policy: Elite Beliefs as a Mediating Variable.” International Studies Quarterly 41(1):141–170.

Freeman, Ben. 2018. “How to counter foreign propaganda on TV.” The Hill .

Friestad, Marian and Peter Wright. 1994. “The Persuasion Knowledge Model: How People Cope with Persuasion Attempts.” Journal of Consumer Research 21(1):1–31.

Frum, David. 2019. “Why Is Trump Spouting Russian Propaganda?” The Atlantic .

Frye, Timothy, Scott Gehlbach, Kyle L. Marquardt and Ora John Reuter. 2017. “Is Putin’s Popularity Real?” Post-Soviet Affairs 33(1):1–15.

Fund, John. 2017. “Combating Russian Disinformation: Use Reagan’s Techniques.” National Review. https://www.nationalreview.com/2017/11/russian-propaganda-tech-companies-facebook-twitter-google-should-follow-reagan-technique-counter-propaganda/.

Gagliardone, Iginio. 2013. “China as a Persuader: CCTV Africa’s First Steps in the African Mediasphere.” Ecquid Novi: African Journalism Studies 34(3):35–40.

Gainous, Jason, Kevin M. Wagner and Charles E. Ziegler. 2018. “Digital Media and Political Sophistication in Authoritarian Systems: Russia’s 2011 and 2016 Duma Elections.” Democratization 25(2):209–226.

Gallacher, John D., Vlad Barash, Philip Howard and John Kelly. 2017. “Junk News on Military Affairs and National Security: Social Media Disinformation Campaigns Against US Military Personnel and Veterans.” Working Paper.

Geddes, Barbara and John Zaller. 1989. “Sources of Popular Support for Authoritarian Regimes.” American Journal of Political Science 33(2):319–347.

Gelman, Andrew and Eric Loken. 2014. “The Statistical Crisis in Science.” American Scientist 102(6):460–465.

Gerber, Alan and Donald Green. 1999. “Misperceptions About Perceptual Bias.” Annual Review of Political Science 2:189–210.

Gerber, Theodore P. and Jane Zavisca. 2016. “Does Russian Propaganda Work?” The Washington Quarterly 39(2):79–98.

Gerber, Theodore P. 2014. “Beyond Putin? Nationalism and Xenophobia in Russian Public Opinion.” The Washington Quarterly 37(3):113–134.

Gerstel, Dylan. 2017. “ISIS and Innovative Propaganda: Confronting Extremism in the Digital Age.” Swarthmore International Relations Journal 1(1):1–9.

Gilboa, Eytan. 2008. “Searching for a Theory of Public Diplomacy.” The Annals of the American Academy of Political and Social Science 616(1):55–77.

Golan, Guy J., Ilan Manor and Phillip Arceneaux. 2019. “Mediated Public Diplomacy Redefined: Foreign Stakeholder Engagement via Paid, Earned, Shared, and Owned Media.” American Behavioral Scientist.

Golan, Guy J. and Joon Soo Lim. 2016. “Third-Person Effect of ISIS Recruitment Propaganda: Online Political Self-Efficacy and Social Media Activism.” International Journal of Communication 10:4681–4701.

Goldsmith, Benjamin E. and Yusaku Horiuchi. 2009. “Spinning the Globe? US Public Diplomacy and Foreign Public Opinion.” The Journal of Politics 71(3):863–875.

Goldsmith, Benjamin E. and Yusaku Horiuchi. 2012. “In Search of Soft Power: Does Foreign Public Opinion Matter for US Foreign Policy?” World Politics 64(03):555–585.

Golitsyn, Anatoliy. 1990. New Lies for Old: The Communist Strategy of Deception and Disinformation. Clarion House.

Gordon, Joseph S. 1988. Psychological Operations: The Soviet challenge. Westview Press.

Graves, Lucas. 2018. “Understanding the Promise and Limits of Automated Fact-Checking.” Factsheet.

Greene, Samuel and Graeme Robertson. 2017. “Agreeable Authoritarians: Personality and Politics in Contemporary Russia.” Comparative Political Studies 50(13):1802–1834.

Gregory, Bruce. 2016. “Mapping Boundaries in Diplomacy’s Public Dimension.” The Hague Journal of Diplomacy 11(1):1–25.

Grieco, Joseph M., Christopher Gelpi, Jason Reifler and Peter D. Feaver. 2011. “Let’s Get a Second Opinion: International Institutions and American Public Support for War.” International Studies Quarterly 55(2):563–583.

Grinberg, Nir, Kenneth Joseph, Lisa Friedland, Briony Swire-Thompson and David Lazer. 2019. “Fake News on Twitter During the 2016 US Presidential Election.” Science 363(6425):374–378.

Gruffydd-Jones, Jamie J. 2019. “Citizens and Condemnation: Strategic Uses of International Human Rights Pressure in Authoritarian States.” Comparative Political Studies 52(4):579–612.

Guardino, Matt and Danny Hayes. 2017. “Foreign Voices, Party Cues, and U.S. Public Opinion about Military Action.” International Journal of Public Opinion Research pp. 1–13.

Guess, Andrew. 2015. “Measure for Measure: An Experimental Test of Online Political Media Exposure.” Political Analysis 23(1).

Guess, Andrew and Alexander Coppock. 2018. “Does Counter-Attitudinal Information Cause Backlash? Results from Three Large Survey Experiments.” British Journal of Political Science pp. 1–19.

Guess, Andy, Brendan Nyhan and Jason Reifler. 2018. “Selective Exposure to Disinformation: Evidence from the Consumption of Fake News During the 2016 US Presidential Campaign.” Unpublished Manuscript. https://www.dartmouth.edu/~nyhan/fake-news-2016.pdf.

Gunitsky, Seva. 2015. “Corrupting the Cyber-commons: Social Media as a Tool of Autocratic Stability.” Perspectives on Politics 13(1):42–54.

Gunther, Albert. 1991. “What We Think Others Think: Cause and Consequence in the Third-Person Effect.” Communication Research 18(3):355–372.

Gunther, Albert C. and Douglas J. Storey. 2003. “The Influence of Presumed Influence.” Journal of Communication 53(2):199–215.

Gunther, Albert C. and Paul Mundy. 1993. “Biased Optimism and the Third-Person Effect.” Journalism Quarterly 70(1):58–67.

Hadar, Leon. 2017. “Is Foreign Propaganda Even Effective?” The American Conservative. https://www.theamericanconservative.com/articles/russia-is-foreign-propaganda-even-effective/.

Hall, Holly Kathleen. 2017. “The New Voice of America: Countering Foreign Propaganda and Disinformation Act.” First Amendment Studies pp. 1–13.

Hamborg, Felix, Karsten Donnay and Bela Gipp. 2018. “Automated Identification of Media Bias in News Articles: An Interdisciplinary Literature Review.” International Journal on Digital Libraries pp. 1–25.

Hanlon, Bradley. 2018a. “From Nord Stream to Novichok: Kremlin Propaganda on Google’s Front Page.” Alliance for Securing Democracy .

Hanlon, Bradley. 2018b. “It’s Not Just Facebook: Countering Russia’s Social Media Offensive.” Alliance for Securing Democracy.

Hanlon, Bradley and Grant Bennett. 2018. “Twitter Release Reveals the Kremlin’s News Impersonation Game.” Alliance for Securing Democracy .

Hanlon, Bradley and Thomas Morley. 2019. “Russia’s Network of Millennial Media.” Alliance for Securing Democracy .

Hanson, Gary, Paul Michael Haridakis, Audrey Wagstaff, Rekha Sharma and James D. Ponder. 2010. “The 2008 Presidential Campaign: Political Cynicism in the Age of Facebook, MySpace, and YouTube.” Mass Communication and Society 13(5):584–607.

Hart, Joshua and Molly Graether. 2018. “Something’s Going on Here: Psychological Predictors of Belief in Conspiracy Theories.” Journal of Individual Differences.

Hart, P. Sol and Erik C. Nisbet. 2011. “Boomerang Effects in Science Communication: How Motivated Reasoning and Identity Cues Amplify Opinion Polarization About Climate Mitigation Policies.” Communication Research 39(6):701–723.

Hass, Glen R. 1981. Effects of Source Characteristics on Cognitive Responses and Persuasion. In Cognitive Responses in Persuasion, ed. Richard E. Petty, Thomas M. Ostrom and Timothy C. Brock. Hillsdale, NJ: Lawrence Erlbaum. pp. 141–172.

Hayes, Danny and Matt Guardino. 2011. “The Influence of Foreign Voices on US Public Opinion.” American Journal of Political Science 55(4):831–851.

Hegedűs, Dániel. 2016. “The Kremlin’s Influence in Hungary: Are Russian Vested Interests Wearing Hungarian National Colors.” DGAPkompakt.

Hegelich, Simon and Dietmar Janetzko. 2016. “Are Social Bots on Twitter Political Actors? Empirical Evidence from a Ukrainian Social Botnet.” ICWSM.

Hellman, Maria and Charlotte Wagnsson. 2017. “How Can European States Respond to Russian Information Warfare? An Analytical Framework.” European Security 26(2):153–170.

Helmus, Todd C., Elizabeth Bodine-Baron, Andrew Radin, Madeline Magnuson, Joshua Mendelsohn, William Marcellino, Andriy Bega and Zev Winkelman. 2018. Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe. Rand Corporation.

Henderson, Edgar H. 1943. “Toward a Definition of Propaganda.” The Journal of Social Psychology 18(1):71–87.

Herman, Edward S. and Noam Chomsky. 1988. Manufacturing Consent: The Political Economy of the Mass Media. Random House.

Hern, Alex, Pamela Duncan and Ella Creamer. 2018. “Russian trolls’ tweets cited in more than 100 UK news articles.” The Guardian.

Herpen, Marcel H. Van. 2015. Putin’s Propaganda Machine: Soft Power and Russian Foreign Policy. Rowman & Littlefield.

Herrmann, Richard K. 2017. “How Attachments to the Nation Shape Beliefs About the World: A Theory of Motivated Reasoning.” International Organization 71(S1):S61–S84.

Herrmann, Richard K. and Michael P. Fischerkeller. 1995. “Beyond the Enemy Image and Spiral Model: Cognitive-Strategic Research after the Cold War.” International Organization 49(3):415–450.

Holbert, Lance R., R. Kelly Garrett and Laurel S. Gleason. 2010. “A New Era of Minimal Effects? A Response to Bennett and Iyengar.” Journal of Communication 60(1):15–34.

Hollyer, James R., B. Peter Rosendorff and James Raymond Vreeland. 2015. “Transparency, Protest, and Autocratic Instability.” American Political Science Review 109(4):764–784.

Holsti, Ole R. 2009. Public Opinion and American Foreign Policy. University of Michigan Press.

Honig, Or and Ariel Reichard. 2018. “Evidence-Fabricating in Asymmetric Conflicts: How Weak Actors Prove False Propaganda Narratives.” Studies in Conflict & Terrorism 41(4):297–318.

Hornsey, Matthew J., Mark Trembath and Sasha Gunthorpe. 2004. “’You Can Criticize Because You Care’: Identity Attachment, Constructiveness, and the Intergroup Sensitivity Effect.” European Journal of Social Psychology 34(5):499–518.

Hovland, Carl I. and Wallace Mandell. 1952. “An Experimental Comparison of Conclusion-Drawing by the Communicator and by the Audience.” The Journal of Abnormal and Social Psychology 47(3):581–588.

Hovland, Carl I. and Walter Weiss. 1951. “The Influence of Source Credibility on Communication Effectiveness.” Public Opinion Quarterly 15(4):635–650.

Howard, Philip N., Bharath Ganesh, Dimitra Liotsiou, John Kelly and Camille Francois. 2018. The IRA, Social Media and Political Polarization in the United States, 2012-2018. Oxford University.

Hsu, Stephen. 2018. “Russian Fake Tweets Visualized.” Towards Data Science .

Huang, Haifeng. 2015a. “International Knowledge and Domestic Evaluations in a Changing Society: The Case of China.” American Political Science Review 109(3):613–634.

Huang, Haifeng. 2015b. “Propaganda as Signaling.” Comparative Politics 47(4):419–444.

Huang, Haifeng. 2018. “The Pathology of Hard Propaganda.” The Journal of Politics 80(3):1034–1038.

Huang, Haifeng and Yao-Yuan Yeh. 2017. “Information from Abroad: Foreign Media, Selective Exposure, and Political Support in China.” British Journal of Political Science pp. 1–26.

Hunt, Albert. 2018. “Yes, Russian Election Sabotage Helped Trump Win.” Bloomberg .

Hutchings, Stephen. 2018. “We Must Rethink Russia’s Propaganda Machine in Order to Reset the Dynamic That Drives It.” LSE British Politics and Policy Blog. http://blogs.lse.ac.uk/politicsandpolicy/we-must-rethink-russian-propaganda/.

Hutchings, Stephen and Joanna Szostek. 2015. Dominant Narratives in Russian Political and Media Discourse during the Ukraine Crisis. In Ukraine and Russia: People, Politics, Propaganda and Perspectives. E-International Relations. http://www.e-ir.info/wp-content/uploads/2016/06/Ukraine-and-Russia-E-IR-2016.pdf.

Im, Jane, Eshwar Chandrasekharan, Jackson Sargent, Paige Lighthammer, Taylor Denby, Ankit Bhargava, Libby Hemphill, David Jurgens and Eric Gilbert. 2019. “Still Out There: Modeling and Identifying Russian Troll Accounts on Twitter.” arXiv preprint arXiv:1901.11162 .

Imai, Kosuke, Luke Keele and Teppei Yamamoto. 2010. “Identification, Inference, and Sensitivity Analysis for Causal Mediation Effects.” Statistical Science 25:51–71.

Imhoff, Roland and Pia Lamberty. 2018. “How paranoid are conspiracy believers? Toward a more fine-grained understanding of the connect and disconnect between paranoia and belief in conspiracy theories.” European Journal of Social Psychology 48(7):909–926.

Irion, Frederick. 1950. Public Opinion and Propaganda. New York: Thomas Y. Crowell Company.

Isaac, Mike and Daisuke Wakabayashi. 2017. “Russian Influence Reached 126 Million Through Facebook Alone.” The New York Times.

Isaac, Matthew S. and Kent Grayson. 2017. “Beyond Skepticism: Can Accessing Persuasion Knowledge Bolster Credibility?” Journal of Consumer Research 43(6):895–912.

Ivanov, Bobi, Stephen A. Rains, Sarah A. Geegan, Sarah C. Vos, Nigil D. Haarstad and Kimberly A. Parker. 2017. “Beyond Simple Inoculation: Examining the Persuasive Value of Inoculation for Audiences with Initially Neutral or Opposing Attitudes.” Western Journal of Communication 81(1):105–126.

Jackson, Dan. 2011. “Strategic Media, Cynical Public? Examining the Contingent Effects of Strategic News Frames on Political Cynicism in the United Kingdom.” The International Journal of Press/Politics 16(1):75–101.

Jamieson, Kathleen H. 2018. Cyberwar: How Russian Hackers and Trolls Helped Elect a President: What We Don’t, Can’t, and Do Know. Oxford University Press.

Jang, Mo S. and Joon K. Kim. 2018. “Third Person Effects of Fake News: Fake News Regulation and Media Literacy Interventions.” Computers in Human Behavior 80.

Jones, Seth G. 2018. “Going on the Offensive: A U.S. Strategy to Combat Russian Information Warfare.” CSIS.

Jowett, Garth S. and Victoria O’Donnell. 2014. Propaganda and Persuasion. Sage Publications.

Kahan, Dan M. 2013. “Ideology, Motivated Reasoning, and Cognitive Reflection.” Judgment and Decision Making 8(4):407–424.

Kahne, Joseph and Benjamin Bowyer. 2017. “Educating for Democracy in a Partisan Age: Confronting the Challenges of Motivated Reasoning and Misinformation.” American Educational Research Journal 54(1):3–34.

Kalogeropoulos, Antonis, Richard Fletcher and Rasmus Kleis Nielsen. 2018. “News brand attribution in distributed environments: Do people know where they get their news?” New Media & Society pp. 1–19.

Keating, Vincent Charles and Katarzyna Kaczmarska. 2018. “Russia’s Influence is Much More Than Propaganda and Fake News.” EuroNews. http://www.euronews.com/2018/04/04/russia-s-influence-is-much-more-than-propaganda-and-fake-news-view.

Keeley, Greg. 2018. “Combatting Russian information warfare — in the Baltics.” The Hill. http://thehill.com/opinion/technology/382245-combatting-russian-information-warfare-in-the-baltics.

Keersmaecker, Jonas De and Arne Roets. 2017. “‘Fake news’: Incorrect, but hard to correct. The role of cognitive ability on the impact of false information on social impressions.” Intelligence 65:107–110.

Kern, Holger Lutz and Jens Hainmueller. 2009. “Opium for the Masses: How Foreign Media Can Stabilize Authoritarian Regimes.” Political Analysis 17(4):377–399.

Kertzer, Joshua D. and Ryan Brutger. 2016. “Decomposing Audience Costs: Bringing the Audience Back into Audience Cost Theory.” American Journal of Political Science 60(1):234–249.

Khaldarova, Irina and Mervi Pantti. 2016. “Fake News: The Narrative Battle Over the Ukraine Conflict.” Journalism Practice 10(7):891–901.

Khazan, Olga. 2013. “The Soviet-Era Strategy That Explains What Russia Is Doing With Snowden.” The Atlantic.

Kiesler, Charles A. and Sara B. Kiesler. 1964. “Role of Forewarning in Persuasive Communications.” The Journal of Abnormal and Social Psychology 68(5):547–549.

Kim, Young Mie, Jordan Hsu, David Neiman, Colin Kou, Levi Bankston, Soo Yun Kim, Richard Heinrich, Robyn Baragwanath and Garvesh Raskutti. 2018. “The Stealth Media? Groups and Targets Behind Divisive Issue Campaigns on Facebook.” Political Communication pp. 1–29.

Kinder, Donald R. and Cindy D. Kam. 2009. Us Against Them: Ethnocentric Foundations of American Opinion. University of Chicago Press.

King, Gary, Jennifer Pan and Margaret E. Roberts. 2013. “How Censorship in China Allows Government Criticism but Silences Collective Expression.” American Political Science Review 107(2):326–343.

King, Gary, Jennifer Pan and Margaret E. Roberts. 2017. “How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument.” American Political Science Review 111(3):484–501.

Kirchick, James. 2017. “Why Russia’s RT should register as an agent of a foreign government.” Brookings.

Klapsis, Antonis. 2015. An Unholy Alliance: The European Far Right and Putin’s Russia. Wilfried Martens Centre for European Studies. https://www.martenscentre.eu/sites/default/files/publication-files/far-right-political-parties-in-europe-and-putins-russia.pdf.

Koch, Jeffery W. 2003. “Political Cynicism and Third Party Support in American Presi- dential Elections.” American Politics Research 31(3):48–65.

Krekó, Péter and Lóránt Győri. 2015. “Russia and the European Far Left.” Wilfried Martens Centre for European Studies.

Krupnikov, Yanna and Adam Seth Levine. 2014. “Cross-Sample Comparisons and External Validity.” Journal of Experimental Political Science 1(1):59–80.

Kudors, Andis. 2018. “Latvia: Disinformation Resilience Index.” Ukrainian Prism Foreign Policy Council. http://prismua.org/en/english-latvia-disinformation-resilience-index/.

Kunda, Ziva. 1990. “The Case for Motivated Reasoning.” Psychological Bulletin 108(3):480–498.

Kuran, Timur. 1991. “Now Out of Never: The Element of Surprise in the East European Revolution of 1989.” World Politics 44(1):7–48.

Kurtzleben, Danielle. 2017. “Trump Embraces One Of Russia’s Favorite Propaganda Tactics — Whataboutism.” NPR.

Kux, Dennis. 1985. “Soviet Active Measures and Disinformation.” Parameters 15(4):19–27.

Landon-Murray, Michael, Edin Mujkic and Brian Nussbaum. 2019. “Disinformation in Contemporary US Foreign Policy: Impacts and Ethics in an Era of Fake News, Social Media, and Artificial Intelligence.” Public Integrity pp. 1–11.

Lanoszka, Alexander. 2019. “Disinformation in International Politics.” European Journal of International Security 4(2):227–248.

Laruelle, Marlene. 2015. Eurasianism and the European Far Right: Reshaping the Europe– Russia Relationship. Lexington Books.

Lasswell, Harold D. 1927. “The Theory of Political Propaganda.” American Political Science Review 21(3):627–631.

Lau, Richard R., Lee Sigelman and Ivy Brown Rovner. 2007. “The Effects of Negative Political Campaigns: A Meta-Analytic Reassessment.” The Journal of Politics 69(4):1176–1209.

Lazarsfeld, Paul F., Bernard Berelson and Hazel Gaudet. 1944. The People’s Choice: How the Voter Makes Up His Mind in a Presidential Campaign. New York: Columbia University Press.

Lazer, David MJ, Matthew A. Baum, Yochai Benkler, Adam J. Berinsky, Kelly M. Greenhill, Filippo Menczer, Miriam J. Metzger, Brendan Nyhan, Gordon Pennycook, David Rothschild, Michael Schudson, Steven A. Sloman, Cass R. Sunstein, Emily A. Thorson, Duncan J. Watts and Jonathan L. Zittrain. 2018. “The Science of Fake News.” Science 359(6380):1094–1096.

Leighton, Marian Kirsch. 1991. Soviet Propaganda as a Foreign Policy Tool. University Press of America.

Letterman, Clark. 2018. “Image of Putin, Russia Suffers Internationally.” Pew Research Center. http://www.pewglobal.org/2018/12/06/image-of-putin-russia-suffers-internationally/.

Levay, Kevin E., Jeremy Freese and James N. Druckman. 2016. “The Demographic and Political Composition of Mechanical Turk Samples.” SAGE Open 6(1).

Levendusky, Matthew S. and Neil Malhotra. 2015. “(Mis)perceptions of Partisan Polarization in the American Public.” Public Opinion Quarterly 80(1):378–391.

Levin, Dov H. 2016. “When the Great Power Gets a Vote: The Effects of Great Power Electoral Interventions on Election Results.” International Studies Quarterly 60(2):189–202.

Lewandowsky, Stephan, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz and John Cook. 2012. “Misinformation and its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest 13(3):106–131.

Lichtenstein, Dennis, Katharina Esau, Lena Pavlova, Dmitry Osipov and Nikita Argylov. 2018. “Framing the Ukraine Crisis: A Comparison Between Talk Show Debates in Russian and German Television.” International Communication Gazette pp. 1–23.

Lim, Gabrielle, Etienne Maynier, John Scott-Railton, Alberto Fittarelli, Ned Moran and Ron Deibert. 2019. “Burned After Reading: Endless Mayfly’s Ephemeral Disinformation Campaign.” Citizen Lab.

Linebarger, Paul. 1954. Psychological Warfare. Washington: Combat Forces Press.

Litman, Leib, Jonathan Robinson and Tzvi Abberbock. 2017. “TurkPrime.com: A Versatile Crowdsourcing Data Acquisition Platform for the Behavioral Sciences.” Behavior Research Methods pp. 1–10.

Little, Andrew T. 2017. “Propaganda and Credulity.” Games and Economic Behavior 102:224–232.

Little, Andrew T. 2018. “Fake News, Propaganda, and Lies can be Pervasive even if they aren’t Persuasive.” Critique 11(1):21–34.

Llewellyn, Clare, Laura Cram, Adrian Favero and Robin L. Hill. 2018. “For Whom the Bell Trolls: Troll Behaviour in the Twitter Brexit Debate.” arXiv preprint arXiv:1801.08754.

Lord, Charles G., Lee Ross and Mark R. Lepper. 1979. “Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence.” Journal of Personality and Social Psychology 37(11):2098–2109.

Lorimor, E.S. and S. Watson Dunn. 1968. “Reference Groups, Congruity Theory and Cross-Cultural Persuasion.” Journal of Communication 18(4):354–368.

Lucas, Edward and Ben Nimmo. 2015. “Information Warfare: What is it and How to Win it?” CEPS Policy Briefs .

Lucas, Edward and Peter Pomerantsev. 2016. “Winning the Information War.” CEPA .

Lumsdaine, Arthur A. and Irving L. Janis. 1953. “Resistance to ‘Counterpropaganda’ Produced by One-sided and Two-sided ‘Propaganda’ Presentations.” Public Opinion Quarterly 17(3):311–318.

Lundgren, Per, Nils Hanson, Kjersti Løken Stavrum, Liljan Weihe and Tone Gunhild Haugan-Hepsø. 2018. Fighting Fakes The Nordic Way. Nordic Council of Ministers.

Lynch, Marc. 2006. Voices of the New Arab Public: Iraq, Al-Jazeera, and Middle East Politics Today. Columbia University Press.

Lytvynenko, Jane. 2017. “InfoWars Has Republished More Than 1,000 Articles From RT Without Permission.” BuzzFeed News .

Machleder, Josh. 2015. “Taking the High Road in the Propaganda War.” Foreign Policy. https://foreignpolicy.com/2015/05/12/taking-the-high-road-in-the-propaganda-war-ukraine-russia-media/.

Mackie, Diane M. and Sarah Queller. 2000. The Impact of Group Membership on Persuasion: Revisiting “Who Says What to Whom With What Effect?” In Attitudes, Behavior, and Social Context: The Role of Norms and Group Membership, ed. Deborah J. Terry and Michael A. Hogg. Lawrence Erlbaum Associates, Publishers.

MacKinnon, David P. and Amanda J. Fairchild. 2009. “Current Directions in Mediation Analysis.” Current Directions in Psychological Science 18(1):16–20.

Madrigal, Alexis C. 2018. “Russia’s Troll Operation Was Not That Sophisticated.” The Atlantic .

Mälksoo, Maria. 2018. “Countering Hybrid Warfare as Ontological Security Management: The Emerging Practices of the EU and NATO.” European Security 27(3):374–392.

Manaev, Oleg. 1991. “The Influence of Western Radio on the Democratization of Soviet Youth.” Journal of Communication 41(2):72–91.

Manheim, Jarol B. 1994. Strategic Public Diplomacy and American Foreign Policy: The Evolution of Influence. Oxford University Press.

March, Luke. 2009. “Managing Opposition in a Hybrid Regime: Just Russia and Parastatal Opposition.” Slavic Review 68(3):504–527.

Marinov, Nikolay. 2013. “Voter Attitudes When Democracy Promotion Turns Partisan: Evidence from a Survey Experiment in Lebanon.” Democratization 20(7):1297–1321.

Marinov, Nikolay. 2018. “International Actors as Critics of Domestic Freedoms.” Available at SSRN 2750240 .

Martin, Diego A. and Jacob N. Shapiro. 2019. “Trends in Online Foreign Influence Efforts.” Working Paper pp. 1–64.

Martin, Gregory J. and Ali Yurukoglu. 2017. “Bias in Cable News: Persuasion and Polarization.” American Economic Review 107(9):2565–2599.

Martin, John L. 1971. “Effectiveness of International Propaganda.” The Annals of the American Academy of Political and Social Science 398(1):61–70.

Marwick, Alice and Rebecca Lewis. 2017. “Media Manipulation and Disinformation Online.” New York: Data & Society Research Institute.

Mazella, David. 2007. The Making of Modern Cynicism. University of Virginia Press.

McGeehan, Timothy P. 2018. “Countering Russian Disinformation.” Parameters 48(1):49–57.

McGuire, William J. and Demetrios Papageorgis. 1962. “Effectiveness of Forewarning in Developing Resistance to Persuasion.” Public Opinion Quarterly 26(1):24–34.

McLeod, Douglas M., Benjamin H. Detenber and William P. Eveland Jr. 2001. “Behind the Third-person Effect: Differentiating Perceptual Processes for Self and Other.” Journal of Communication 51(4):678–695.

Mejias, Ulises A. and Nikolai E. Vokuev. 2017. “Disinformation and the Media: The Case of Russia and Ukraine.” Media, Culture, & Society pp. 1–16.

Melissen, Jan. 2005. The New Public Diplomacy: Between Theory and Practice. In The New Public Diplomacy. Palgrave Macmillan UK pp. 3–27.

Metzger, Megan M. and Alexandra A. Siegel. 2019. “When State-Sponsored Media Goes Viral: Russia’s Use of RT to Shape Global Discourse on Syria.” Working Paper.

Metzger, Miriam J., Andrew J. Flanagin and Ryan B. Medders. 2010. “Social and heuristic approaches to credibility evaluation online.” Journal of Communication 60(3):413–439.

Mickiewicz, Ellen. 2014. No Illusions: The Voices of Russia’s Future. Oxford University Press.

Mikelionis, Lukas. 2018. “Russian troll farm made Twitter accounts for fake newspapers to spread real news.” New York Post.

Mikkonen, Simo. 2010. “Stealing the Monopoly of Knowledge? Soviet Reactions to US Cold War Broadcasting.” Kritika: Explorations in Russian and Eurasian History 11(4):771–805.

Montgomery, Jacob, Brendan Nyhan and Michelle Torres. 2018. “How Conditioning on Posttreatment Variables Can Ruin Your Experiment and What to Do about It.” American Journal of Political Science 62(3):760–775.

Müller, Philipp. 2013. “National Identity Building Through Patterns of an International Third-Person Perception in News Coverage.” International Communication Gazette 75(8):732–749.

Mullinix, Kevin J., Thomas J. Leeper and James N. Druckman. 2015. “The Generalizability of Survey Experiments.” Journal of Experimental Political Science 2(2):109–138.

Murray, Shoon. 2014. “Broadening the Debate about War: The Inclusion of Foreign Critics in Media Coverage and Its Potential Impact on US Public Opinion.” Foreign Policy Analysis 10(4):329–350.

Murrock, Eric, Joy Amulya, Mehri Druckman and Tetiana Liubyva. 2018. “Winning the War on State-Sponsored Propaganda: Results from an Impact Study of a Ukrainian News Media and Information Literacy Program.” Journal of Media Literacy Education 10(2):53–85.

Nagorski, Zygmunt. 1971. “Soviet International Propaganda: Its Role, Effectiveness, and Future.” The Annals of the American Academy of Political and Social Science 398(1):130–139.

Naylor, Brian. 2016. “Trump Apparently Quotes Russian Propaganda To Slam Clinton On Benghazi.” NPR .

Nelson, Elizabeth, Robert Orttung and Anthony Livshen. 2015. “Measuring RT’s Impact on YouTube.” Russian Analytical Digest 8(177).

Nelson, Jacob L. and Harsh Taneja. 2018. “The Small, Disloyal Fake News Audience: The Role of Audience Availability in Fake News Consumption.” New Media & Society .

Nelson, Toby. 2019. “How RT Frames Conflict: A Comparative Analysis.” Russian Journal of Communication 11(2):126–140.

Nestler, Steffen and Boris Egloff. 2010. “When Scary Messages Backfire: Influence of Dispositional Cognitive Avoidance on the Effectiveness of Threat Communications.” Journal of Research in Personality 44(1):137–141.

Newman, Nic, Richard Fletcher, David A. L. Levy and Rasmus Kleis Nielsen. 2016. “Reuters Institute Digital News Report 2016.” Reuters Institute for the Study of Journalism.

Nichols, John Spicer. 1984. “Wasting the Propaganda Dollar.” Foreign Policy 56:129–140.

Nikitin, Vadim. 2016. “From Russia with Love - How Putin is Winning Over Hearts and Minds.” The National. https://www.thenational.ae/arts-culture/the-long-read-from-russia-with-love-how-putin-is-winning-over-hearts-and-minds-1.173822.

Nimmo, Ben. 2016. “Propaganda in a New Orbit: Information Warfare Initiative.” Center for European Policy Analysis.

Nimmo, Ben. 2018. “Russia’s Full Spectrum Propaganda.” Medium .

Nye, Joseph S. 2004. Soft Power: The Means to Success in World Politics. PublicAffairs.

Nye, Joseph S. 2017. “Will the Liberal Order Survive? The History of an Idea.” Foreign Affairs 96:10–16.

Nye, Joseph S. 2018. “How Sharp Power Threatens Soft Power.” Foreign Affairs .

Nyhan, Brendan. 2018. “Fake News and Bots May Be Worrisome, but Their Political Power Is Overblown.” The New York Times .

Nyhan, Brendan and Jason Reifler. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32(2):303–330.

Oates, Sarah. 2017. “Kompromat Goes Global?: Assessing a Russian Media Tool in the United States.” Slavic Review 76(1):57–65.

O’Brien, Thomas Nicholas. 1989. Russian Roulette: Disinformation in the U.S. Government and News Media. Master’s thesis, University of South Carolina.

O’Keefe, Daniel J. 1999. “How to Handle Opposing Arguments in Persuasive Messages: A Meta-analytic Review of One-sided and Two-sided Messages.” Annals of the International Communication Association 22(1):209–249.

Oliver, Eric J. and Thomas J. Wood. 2014. “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science 58(4):952–966.

Oppenheimer, Daniel M., Tom Meyvis and Nicolas Davidenko. 2009. “Instructional Manipulation Checks: Detecting Satisficing to Increase Statistical Power.” Journal of Experimental Social Psychology 45(4):867–872.

Orenstein, Mitchell A. 2014. “Putin’s Western Allies.” Foreign Affairs 25. https://www.foreignaffairs.com/articles/russia-fsu/2014-03-25/putins-western-allies.

Orr, Caroline. 2018. “Russian Propaganda On Reddit.” Arc Digital.

Orttung, Robert W. and Elizabeth Nelson. 2019. “Russia Today’s Strategy and Effectiveness on YouTube.” Post-Soviet Affairs 35(2):77–92.

Orwell, George. 1949. 1984. London: Secker and Warburg.

Page, Benjamin I., Robert Y. Shapiro and Glenn R. Dempsey. 1987. “What Moves Public Opinion?” American Political Science Review 81(1):23–44.

Paul, Bryant, Michael B. Salwen and Michel Dupagne. 2000. “The Third-Person Effect: A Meta-Analysis of the Perceptual Hypothesis.” Mass Communication & Society 3(1):57–85.

Paul, Christopher and Miriam Matthews. 2016. “The Russian “Firehose of Falsehood” Propaganda Model.” Rand Corporation .

Paul, Thazha V. 2005. “Soft Balancing in the Age of US Primacy.” International Security 30(1):46–71.

Peisakhin, Leonid and Arturas Rozenas. 2018. “Electoral Effects of Biased Media: Russian Television in Ukraine.” American Journal of Political Science 62(3):535–550.

Pengelly, Martin. 2017. “Donald Trump repeats respect for ’killer’ Putin in Fox Super Bowl interview.” The Guardian .

Pennycook, Gordon and David G. Rand. 2017. “Assessing the Effect of ’Disputed’ Warn- ings and Source Salience on Perceptions of Fake News Accuracy.” SSRN .

Pennycook, Gordon and David G. Rand. 2018. “Lazy, Not Biased: Susceptibility to Partisan Fake News is Better Explained by Lack of Reasoning Than by Motivated Reasoning.” Cognition.

Penzenstadler, Nick, Brad Heath and Jessica Guynn. 2018. “We read every one of the 3,517 Facebook ads bought by Russians. Here’s what we found.” USA Today .

Pereira, Andrea and Jay Van Bavel. 2018. “Identity Concerns Drive Belief in Fake News.” Working Paper .

Perloff, Richard M. 1993. “Third-Person Effect Research 1983-1992: A Review and Synthesis.” International Journal of Public Opinion Research 5(2):167–184.

Perloff, Richard M. 1999. “The Third Person Effect: A Critical Review and Synthesis.” Media Psychology 1(4):353–378.

Petty, Richard E. and John T. Cacioppo. 1977. “Forewarning, Cognitive Responding, and Resistance to Persuasion.” Journal of Personality and Social Psychology 35(9):645–655.

Pirlott, Angela G. and David P. MacKinnon. 2016. “Design Approaches to Experimental Mediation.” Journal of Experimental Social Psychology 66:29–38.

Pomerantsev, Peter. 2014a. Nothing Is True and Everything Is Possible: The Surreal Heart of the New Russia. PublicAffairs.

Pomerantsev, Peter. 2014b. “Russia’s Ideology: There Is No Truth.” The New York Times .

Pomerantsev, Peter. 2015. “The Kremlin’s Information War.” Journal of Democracy 26(4):40–50.

Pornpitakpan, Chanthika. 2004. “The Persuasiveness of Source Credibility: A Critical Review of Five Decades’ Evidence.” Journal of Applied Social Psychology 34(2):243–281.

Prat, Andrea and David Strömberg. 2013. “The Political Economy of Mass Media.” Advances in Economics and Econometrics.

Pratkanis, Anthony R. and Elliot Aronson. 2001. Age of Propaganda: The Everyday Use and Abuse of Persuasion. Macmillan.

Prior, Markus. 2013. “Media and Political Polarization.” Annual Review of Political Science 16:101–127.

Puddington, Arch. 2000. Broadcasting Freedom: The Cold War Triumph of Radio Free Europe and Radio Liberty. University Press of Kentucky.

Qin, Bei, David Strömberg and Yanhui Wu. 2017. “Why Does China Allow Freer Social Media? Protest Versus Surveillance and Propaganda.” Journal of Economic Perspectives 31(1):117–140.

Qiu, Linda. 2017. “Fingerprints of Russian Disinformation: From AIDS to Fake News.” The New York Times .

Qiu, Xiaoyan, Diego FM Oliveira, Alireza Sahami Shirazi, Alessandro Flammini and Filippo Menczer. 2017. “Limited Individual Attention and Online Virality of Low-Quality Information.” Nature Human Behavior 1(7):1–7.

Rawnsley, Gary D. 2015. “To Know Us is to Love Us: Public Diplomacy and International Broadcasting in Contemporary Russia and China.” Politics 35(3-4):273–286.

Redlawsk, David P. 2002. “Hot Cognition or Cool Consideration? Testing the Effects of Motivated Reasoning on Political Decision Making.” Journal of Politics 64(4):1021–1044.

Reid, Scott A. and Michael A. Hogg. 2005. “A Self-Categorization Explanation for the Third-Person Effect.” Human Communication Research 31(1):129–161.

Renz, Bettina. 2016. “Russia and ‘Hybrid Warfare’.” Contemporary Politics 22(3):283–300.

Resnick, Paul, Aviv Ovadya and Garlin Gilchrist. 2018. “Iffy Quotient: A Platform Health Metric for Misinformation.” Center for Social Media Responsibility .

Ribeiro, Filipe N., Lucas Graves, Fabrício Benevenuto, Abhijnan Chakraborty, Juhi Kulshrestha, Mahmoudreza Babaei and Krishna P. Gummadi. 2018. “Media Bias Monitor: Quantifying Biases of Social Media News Outlets at Large-scale.” In Twelfth International AAAI Conference on Web and Social Media.

Rijkhoff, Susanna Afra Maria. 2015. False Alarm! The Measurement and Assessment of Political Cynicism and the Consequences for Political Participation. Master’s thesis, Washington State University.

Risso, Linda. 2013. “Radio Wars: Broadcasting in the Cold War.” Cold War History 13(2):145–152.

Roberts, Margaret E. 2018. Censored: Distraction and Diversion Inside China’s Great Firewall. Princeton University Press.

Robertson, Graeme. 2017. “Political Orientation, Information and Perceptions of Election Fraud: Evidence from Russia.” British Journal of Political Science 47(3):589–608.

Robinson, Michael. 1976. “Public Affairs Television and the Growth of Political Malaise: The Case of “The Selling of the Pentagon”.” American Political Science Review 70(3):409–432.

Roescher, Franziska, Leon Yin, Richard Bonneau, Jonathan Nagler and Joshua Tucker. 2018. “What’s the strategy of Russia’s Internet trolls? We analyzed their tweets to find out.” The Washington Post .

Roese, Neal J. and Gerald N. Sande. 1993. “Backlash Effects in Attack Politics.” Journal of Applied Social Psychology 23(8):632–653.

Roetter, Charles. 1974. The Art of Psychological Warfare, 1914-1945. New York: Stein & Day.

Rohac, Dalibor. 2015. “Cranks, Trolls, and Useful Idiots.” Foreign Policy .

Rohac, Dalibor, Edit Zgut and Lorant Gyori. 2017. “Populism in Europe and its Russian Love Affair.” American Enterprise Institute. http://www.aei.org/wp-content/uploads/2017/01/Populism-in-Europe-and-Its-Russian-Love-Affair.pdf.

Rojas, Hernando. 2010. “Corrective Actions in the Public Sphere: How Perceptions of Media and Media Effects Shape Political Behaviors.” International Journal of Public Opinion Research 22(3):343–363.

Roman, Nataliya, Wayne Wanta and Iuliia Buniak. 2017. “Information Wars: Eastern Ukraine Military Conflict Coverage in the Russian, Ukrainian and US Newscasts.” International Communication Gazette 79(4):357–378.

Rozenas, Arturas and Denis Stukal. 2018. “How Autocrats Manipulate Economic News: Evidence from Russia’s State-Controlled Television.” SSRN .

Rucinski, Dianne and Charles T. Salmon. 1990. “The ’Other’ as the Vulnerable Voter: A Study of the Third-Person Effect in the 1988 US Presidential Campaign.” International Journal of Public Opinion Research 2(4):345–368.

Ruggiero, Thomas E. 2000. “Uses and Gratifications Theory in the 21st Century.” Mass Communication & Society 3(1):3–37.

Rugh, William A. 2006. American Encounters with Arabs: The “Soft Power” of US Public Diplomacy in the Middle East. Greenwood Publishing Group.

Rupprecht, Tobias. 2015. Soviet Internationalism After Stalin: Interaction and Exchange Between the USSR and Latin America During the Cold War. Cambridge University Press.

Sanovich, Sergey, Denis Stukal and Joshua A. Tucker. 2018. “Turning the Virtual Tables: Government Strategies for Addressing Online Opposition with an Application to Russia.” Comparative Politics 50(3):435–482.

Sarlo, Alexandra Wiktorek. 2017. “Fighting Disinformation in the Baltic States.” Baltic Bulletin . https://www.fpri.org/article/2017/07/fighting-disinformation-baltic-states/.

Schafer and Gatov. 2017. “An Evolution of the Post-Soviet Russian Media: An Interview with Vasily Gatov.” Public Diplomacy Magazine . https://bit.ly/2I7WYo7 (accessed May 4, 2018).

Schaffner, Brian F. and Cameron Roche. 2016. “Misinformation and Motivated Reasoning: Responses to Economic News in a Politicized Environment.” Public Opinion Quarterly 81(1):86–110.

Schatz, Edward and Renan Levine. 2010. “Framing, Public Diplomacy, and Anti- Americanism in Central Asia.” International Studies Quarterly 54(3):855–869.

Scherr, Sebastian and Philipp Müller. 2017. “How Perceived Persuasive Intent and Reactance Contribute to Third-Person Perceptions: Evidence from Two Experiments.” Mass Communication and Society 20(3):315–335.

Schultz, Kenneth A. 2017. “Perils of Polarization for US Foreign Policy.” The Washington Quarterly 40(4):7–28.

Seddon, Max. 2018. “Documents Show How Russia’s Troll Army Hit America.” BuzzFeed News.

Seib, Philip. 2008. The Al Jazeera Effect: How the New Global Media are Reshaping World Politics. Potomac Books, Inc.

Shambaugh, David. 2007. “China’s Propaganda System: Institutions, Processes and Efficacy.” The China Journal 57:25–58.

Shambaugh, David. 2013. China Goes Global: The Partial Power. Oxford University Press.

Shao, Chengcheng, Giovanni Luca Ciampaglia, Onur Varol, Alessandro Flammini and Filippo Menczer. 2017. “The Spread of Fake News by Social Bots.” Working Paper.

Shao, Li and Dongshu Liu. 2018. “The Road to Cynicism: The Political Consequences of Online Satire Exposure in China.” Political Studies 67(2):517–536.

Sheafer, Tamir and Itay Gabay. 2009. “Mediated Public Diplomacy: A Strategic Contest Over International Agenda Building and Frame Building.” Political Communication 26(4):447–467.

Sheafer, Tamir and Shaul R. Shenhav. 2009. “Mediated Public Diplomacy in a New Era of Warfare.” The Communication Review 12(3):272–283.

Shehata, Adam. 2014. “Game Frames, Issue Frames, and Mobilization: Disentangling the Effects of Frame Exposure and Motivated News Attention on Political Cynicism and Engagement.” International Journal of Public Opinion Research 26(3):157–177.

Shrout, Patrick E. and Niall Bolger. 2002. “Mediation in Experimental and Nonexperimental Studies: New Procedures and Recommendations.” Psychological Methods 7(4):422–455.

Shulman, Stephen and Stephen Bloom. 2012. “The Legitimacy of Foreign Interventions in Elections: The Ukrainian Response.” Review of International Studies 38(2):445–471.

Silverman, Craig. 2019. “An Iranian Disinformation Operation Impersonated Dozens Of Media Outlets To Spread Fake Articles.” BuzzFeed News .

Silverstein, Brett. 1987. “Towards a Science of Propaganda.” Political Psychology pp. 45–59.

Slothuus, Rune. 2016. “Assessing the Influence of Political Parties on Public Opinion: The Challenge from Pretreatment Effects.” Political Communication 33(2):302–327.

Smith, Don D. 1970. “Some Effects of Radio Moscow’s North American Broadcasts.” Public Opinion Quarterly 34(4):539–551.

Spangher, Alexander, Gireeja Ranade, Besmira Nushi, Adam Fourney and Eric Horvitz. 2018. “Analysis of Strategy and Spread of Russia-sponsored Content in the US in 2017.” arXiv preprint arXiv:1810.10033.

Sproule, Michael J. 1987. “Propaganda Studies in American Social Science: The Rise and Fall of the Critical Paradigm.” Quarterly Journal of Speech 73(1):60–78.

Stanley, Jason. 2015. How Propaganda Works. Princeton University Press.

Sun, Ye, Lijiang Shen and Zhongdang Pan. 2008. “On the Behavioral Component of the Third-Person Effect.” Communication Research 35(2):257–278.

Sun, Ye, Zhongdang Pan and Lijiang Shen. 2008. “Understanding the Third-Person Perception: Evidence from a Meta-Analysis.” Journal of Communication 58(2):280–300.

Surowiec, Pawel. 2017. “Post-Truth Soft Power: Changing Facets of Propaganda, Kompromat, and Democracy.” Georgetown Journal of International Affairs 18(3):21–27.

Susskind, Jamie. 2018. Future Politics: Living Together in a World Transformed by Tech. Oxford University Press.

Szostek, Joanna. 2015. “How Western Plans to Fight Putin’s Propaganda War Could Backfire.” The Conversation. http://theconversation.com/how-western-plans-to-fight-putins-propaganda-war-could-backfire-42868.

Szostek, Joanna. 2017. “The Power and Limits of Russia’s Strategic Narrative in Ukraine: The Role of Linkage.” Perspectives on Politics 15(2):379–395.

Szostek, Joanna. 2018a. “News Media Repertoires and Strategic Narrative Reception: A Paradox of Dis/belief in Authoritarian Russia.” New Media & Society 20(1):68–87.

Szostek, Joanna. 2018b. “Nothing is True? The Credibility of News and Conflicting Narratives during “Information War” in Ukraine.” The International Journal of Press/Politics 23(1):116–135.

Taber, Charles S. and Milton Lodge. 2006. “Motivated Skepticism in the Evaluation of Political Beliefs.” American Journal of Political Science 50(3):755–769.

Tai, Qiuqing. 2016. “Western Media Exposure and Chinese Immigrants’ Political Perceptions.” Political Communication 33(1):78–97.

Tal-Or, Nurit, Jonathan Cohen, Yariv Tsfati and Albert Gunther. 2010. “Testing Causal Direction in the Influence of Presumed Media Influence.” Communication Research 37(6):801–824.

Tamkin, Emily. 2017. “United Russia Completes Report on How U.S. Media Influenced Russian Elections.” Foreign Policy .

Taylor, Kyle. 2017. “Europeans Favoring Right-Wing Populist Parties Are More Positive on Putin.” Pew Research Center. http://www.pewresearch.org/fact-tank/2017/01/24/europeans-favoring-right-wing-populist-parties-are-more-positive-on-putin/.

Tenove, Chris, Jordan Buffie, Spencer McKay and David Moscrop. 2019. “Digital Threats to Democratic Elections: How Foreign Actors Use Digital Techniques to Undermine Democracy.” Working Paper.

Tetlock, Philip E. 2002. “Social Functionalist Framework for Judgment and Choice: Intuitive Politicians, Theologians, and Prosecutors.” Psychological Review 109(3):451–471.

Thomson, Oliver. 1999. Easily Led: A History of Propaganda. Phoenix Mill, UK: Sutton.

Tiedge, James T., Arthur Silverblatt, Michael J. Havice and Richard Rosenfeld. 1991. “Discrepancy Between Perceived First-Person and Perceived Third-Person Mass Media Effects.” Journalism Quarterly 68(1-2):141–154.

Timberg, Craig. 2016. “Russian propaganda effort helped spread ‘fake news’ during election, experts say.” The Washington Post.

Timberg, Craig and Tony Romm. 2019. “It’s not just the Russians anymore as Iranians and others turn up disinformation efforts ahead of 2020 vote.” The Washington Post.

Tingley, Dustin, Teppei Yamamoto, Kentaro Hirose, Luke Keele and Kosuke Imai. 2014. “Mediation: R Package for Causal Mediation Analysis.” Journal of Statistical Software 59(5):1–38.

Tolstrup, Jakob. 2015. “Black Knights and Elections in Authoritarian Regimes: Why and How Russia Supports Authoritarian Incumbents in post-Soviet States.” European Journal of Political Research 54(4):673–690.

Tomlinson, Sam. 2016. “The Rise of Cross-Border News.” PwC UK. https://press.pwc.com/Multimedia/image/the-rise-of-cross-border-news/a/efce95b9-fd1d-42f3-8b5e-86a0d2ea3a04.

Tomz, Michael and Jessica L. Weeks. 2019. “Public Opinion and Foreign Electoral Intervention.” Working Paper.

Truex, Rory. 2016. “Bias and Trust in Authoritarian Media.” SSRN.

Tsfati, Yariv. 2007. “Hostile Media Perceptions, Presumed Media Influence, and Minority Alienation.” Journal of Communication 57(4):632–651.

Tucker, Joshua A., Yannis Theocharis, Margaret E. Roberts and Pablo Barberá. 2017. “From Liberation to Turmoil: Social Media and Democracy.” Journal of Democracy 28(4):46–59.

Tucker, Joshua, Andrew Guess, Pablo Barberá, Cristian Vaccari, Alexandra Siegel, Sergey Sanovich, Denis Stukal and Brendan Nyhan. 2018. “Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature.” Unpublished Manuscript.

Turner, J.C. 1991. Social Influence. Buckingham, UK: Open University Press.

Urban, George R. 1997. Radio Free Europe and the Pursuit of Democracy: My War within the Cold War. Yale University Press.

Uttaro, Ralph A. 1982. “The Voices of America in International Radio Law.” Law and Contemporary Problems 45(1):103–122.

Valentino, Nicholas A., Matthew N. Beckmann and Thomas A. Buhr. 2001. “A Spiral of Cynicism for Some: The Contingent Effects of Campaign News Frames on Participation and Confidence in Government.” Political Communication 18(4):347–367.

Vallone, Robert P., Lee Ross and Mark R. Lepper. 1985. “The Hostile Media Phenomenon: Biased Perception and Perceptions of Media Bias in Coverage of the Beirut Massacre.” Journal of Personality and Social Psychology 49(3):577–585.

Vosoughi, Soroush, Deb Roy and Sinan Aral. 2018. “The Spread of True and False News Online.” Science 359(6380):1146–1151.

Vreese, Claes H. De. 2004. “The Effects of Strategic News on Political Cynicism, Issue Evaluations, and Policy Support: A Two-wave Experiment.” Mass Communication & Society 7(2):191–214.

Vreese, Claes H. De. 2005. “The Spiral of Cynicism Reconsidered.” European Journal of Communication 20(3):283–301.

Vreese, Claes H. De and Holli A. Semetko. 2002. “Cynical and Engaged: Strategic Campaign Coverage, Public Opinion, and Mobilization in a Referendum.” Communication Research 29(6):615–641.

Walker, Christopher. 2018. “What Is ‘Sharp Power’?” Journal of Democracy 29(3):9–23.

Wallace, Jeremy. 2013. “Cities, Redistribution, and Authoritarian Regime Survival.” The Journal of Politics 75(3):632–645.

Wanta, Wayne, Guy Golan and Cheolhan Lee. 2004. “Agenda Setting and International News: Media Influence on Public Perceptions of Foreign Nations.” Journalism & Mass Communication Quarterly 81(2):364–377.

Wardle, Claire and Hossein Derakhshan. 2017. “Information Disorder: Towards an Interdisciplinary Framework for Research and Policy Making.” Council of Europe Report pp. 1–90. https://firstdraftnews.org/coe-report/.

Warren, T. Camber. 2014. “Not by the Sword Alone: Soft Power, Mass Media, and the Production of State Sovereignty.” International Organization 68(1):111–141.

Watanabe, Kohei. 2017. “The Spread of the Kremlin’s Narratives by a Western News Agency During the Ukraine Crisis.” The Journal of International Communication 23(1):138–158.

Way, Lucan Ahmad and Adam E. Casey. 2019. “How Can We Know if Russia is a Threat to Western Democracy? Understanding the Impact of Russia’s Second Wave of Election Interference.” Working Paper.

Wei, Ran, Stella C. Chia and Ven-Hwei Lo. 2011. “Third-Person Effect and Hostile Media Perception Influences on Voter Attitudes toward Polls in the 2008 Presidential Election.” International Journal of Public Opinion Research 23(2):169–190.

West, Darrell M. 2017. “How to Combat Fake News and Disinformation.” Brookings. https://www.brookings.edu/research/how-to-combat-fake-news-and-disinformation/.

Weyland, Kurt. 2017. “Autocratic Diffusion and Cooperation: The Impact of Interests vs. Ideology.” Democratization 24(3):1235–1252.

Whitaker, Eric. 2012. Terrorism Warnings as Strategic Appeals: An Analysis of Press Reporting and Public Reactions. Master’s thesis, University of Nebraska-Lincoln. https://bit.ly/2Qq8mjx.

White, Ralph K. 1952. “The New Resistance to International Propaganda.” Public Opinion Quarterly 16(4):539–551.

Wood, Thomas and Ethan Porter. 2018. “The Elusive Backfire Effect: Mass Attitudes’ Steadfast Factual Adherence.” Political Behavior pp. 1–29.

Xiang, Jun and Jay D. Hmielowski. 2017. “Alternative Views and Eroding Support: The Conditional Indirect Effects of Foreign Media and Internet Use on Regime Support in China.” International Journal of Public Opinion Research 29(3):406–425.

Xie, Shuang and Oliver Boyd-Barrett. 2015. “External-national TV News Networks’ Way to America: Is the United States Losing the Global ‘Information War’?” International Journal of Communication 9:66–83.

Yablokov, Ilya. 2015. “Conspiracy Theories as a Russian Public Diplomacy Tool: The Case of Russia Today (RT).” Politics 35(3-4):301–315.

Yanagizawa-Drott, David. 2014. “Propaganda and Conflict: Evidence from the Rwandan Genocide.” The Quarterly Journal of Economics 129(4):1947–1994.

Golovchenko, Yevgeniy, Mareike Hartmann and Rebecca Adler-Nissen. 2018. “State, Media and Civil Society in the Information Warfare over Ukraine: Citizen Curators of Digital Disinformation.” International Affairs 94(5):975–994.

Yip, Hilton. 2018. “China’s $6 Billion Propaganda Blitz Is a Snooze.” Foreign Policy.

Youmans, William Lafi and Shawn Powers. 2012. “Remote Negotiations: International Broadcasting as Bargaining in the Information Age.” International Journal of Communication 6:2149–2172.

Zaharna, Rhonda S., Amelia Arsenault and Ali Fisher. 2014. Relational, Networked and Collaborative Approaches to Public Diplomacy: The Connective Mindshift. Routledge.

Zaller, John. 1992. The Nature and Origins of Mass Opinion. Cambridge University Press.

Zannettou, Savvas, Tristan Caulfield, Emiliano De Cristofaro, Michael Sirivianos, Gianluca Stringhini and Jeremy Blackburn. 2019. “Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and Their Influence on the Web.” arXiv:1801.09288.

Zhang, Cui and Charles William Meadows. 2012. “International Coverage, Foreign Policy, and National Image: Exploring the Complexities of Media Coverage, Public Opinion, and Presidential Agenda.” International Journal of Communication 6:76–93.

Appendix A: Demonizing The Enemy

Measures

Please tell me if you have a very favorable, somewhat favorable, somewhat unfavorable or very unfavorable opinion of the following countries. United States; Russia; Ukraine; China

1. Very unfavorable

2. Somewhat unfavorable

3. Neither favorable nor unfavorable

4. Somewhat favorable

5. Very favorable

6. Don’t know

Would you say your overall opinion of these countries’ foreign policy is very favorable, somewhat favorable, somewhat unfavorable, or very unfavorable? United States foreign policy; Russian foreign policy; Ukrainian foreign policy; Chinese foreign policy

1. Very unfavorable

2. Somewhat unfavorable

3. Neither favorable nor unfavorable

4. Somewhat favorable

5. Very favorable

6. Don’t know

Would you say your overall opinion towards the following leaders is very favorable, somewhat favorable, somewhat unfavorable, or very unfavorable? Barack Obama; Vladimir Putin; Petro Poroshenko

1. Very unfavorable

2. Somewhat unfavorable

3. Neither favorable nor unfavorable

4. Somewhat favorable

5. Very favorable

6. Don’t know

In response to the situation involving Russia and Ukraine, would you favor or oppose the United States taking the following actions? Increasing economic and diplomatic sanctions on Russia; sending arms and military supplies to the Ukrainian government

1. Strongly oppose

2. Somewhat oppose

3. Neither favor nor oppose

4. Somewhat favor

5. Strongly favor

6. Don’t know

How important to the interests of the United States is what happens between Russia and Ukraine?

1. Extremely important

2. Very important

3. Moderately important

4. Slightly important

5. Not at all important

How much, if anything, have you read or heard about tensions between Russia and Ukraine over territory in eastern Ukraine?

1. A great deal

2. A lot

3. A moderate amount

4. A little

5. Nothing at all

Survey Sample Characteristics

Table 5: Summary Table

                  No.       %
Age Cohort
  18-29           277    31.30
  30-49           442    49.94
  50+             166    18.76
  Total           885   100.00
Female
  Male            403    45.54
  Female          482    54.46
  Total           885   100.00
White
  Non-white       194    21.92
  White           691    78.08
  Total           885   100.00
Education
  High School      79     8.93
  Some College    315    35.59
  College         340    38.42
  Post Graduate   151    17.06
  Total           885   100.00
Party
  Democrat        375    45.02
  Republican      182    21.85
  Independent     276    33.13
  Total           833   100.00

Balance Test

Table 6: Balance Test

                 Control   Info   Source   Intention   Total
Age Cohort
  18-29               71     69       69          68     277
  30-49              112    114      113         103     442
  50+                 44     38       45          39     166
  Total              227    221      227         210     885
  Pearson chi2(6) = 0.7854   Pr = 0.992
Female
  Male               101    108      100          94     403
  Female             126    113      127         116     482
  Total              227    221      227         210     885
  Pearson chi2(3) = 1.3414   Pr = 0.719
White
  Non-white           49     46       53          46     194
  White              178    175      174         164     691
  Total              227    221      227         210     885
  Pearson chi2(3) = 0.4431   Pr = 0.931
Education
  High School         23     15       22          19      79
  Some College        79     82       80          74     315
  College             90     87       86          77     340
  Post Graduate       35     37       39          40     151
  Total              227    221      227         210     885
  Pearson chi2(9) = 3.0371   Pr = 0.963
Party
  Democrat            93     92       97          93     375
  Republican          50     49       45          38     182
  Independent         70     69       70          67     276
  Total              213    210      212         198     833
  Pearson chi2(6) = 1.5349   Pr = 0.957
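The balance tests above are Pearson chi-squared tests of the covariate-by-treatment contingency tables. As an illustration (a generic sketch, not the author's own analysis code), the reported statistic can be recomputed directly from the cell counts; the gender-by-treatment counts below are taken from Table 6.

```python
import numpy as np

def pearson_chi2(table):
    """Pearson chi-squared statistic for an R x C contingency table."""
    table = np.asarray(table, dtype=float)
    row_totals = table.sum(axis=1, keepdims=True)
    col_totals = table.sum(axis=0, keepdims=True)
    # expected counts under independence of rows and columns
    expected = row_totals @ col_totals / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, dof

# Gender counts by arm (Control, Info, Source, Intention) from Table 6.
gender = [[101, 108, 100, 94],   # Male
          [126, 113, 127, 116]]  # Female
stat, dof = pearson_chi2(gender)
# stat ≈ 1.3414 with 3 degrees of freedom, matching the reported chi2(3)
```

A p-value near the reported Pr = 0.719 then follows from the chi-squared survival function with 3 degrees of freedom.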

Robustness Checks

I include the regression tables used to recreate the main figures in the paper, along with robustness checks with controls and on the full sample. In the main text of the article, I excluded individuals who failed the following reading checks after exposure to the treatments. Here, I include them and find the results are nearly identical.

Reading Check 1: Participants who received a treatment were asked whether the article they read was about: (1) Ukrainian human rights violations; (2) Russian human rights violations; (3) a Ukrainian corruption scandal; or (4) Russian corruption. Choices were randomized and individuals who did not answer correctly were removed.

Reading Check 2: Participants who received a treatment were asked what the source of the article they read was. Individuals who did not answer correctly were removed.
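The tables that follow report OLS coefficients with robust standard errors. As a minimal sketch of that estimator (not the author's actual estimation code, and with simulated, illustrative data), ordinary least squares with HC1 heteroskedasticity-robust standard errors can be computed directly:

```python
import numpy as np

def ols_robust(y, X):
    """OLS estimates with heteroskedasticity-robust (HC1) standard errors."""
    X = np.column_stack([np.ones(len(y)), X])     # prepend an intercept
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    meat = (X * (resid ** 2)[:, None]).T @ X      # X' diag(e^2) X
    cov = n / (n - k) * XtX_inv @ meat @ XtX_inv  # HC1 small-sample scaling
    return beta, np.sqrt(np.diag(cov))

# Hypothetical data: one treatment dummy with a true effect of -0.3.
rng = np.random.default_rng(0)
d = rng.integers(0, 2, size=500)
y = 2.9 - 0.3 * d + rng.normal(0, 0.5, size=500)
beta, se = ols_robust(y, d)
# beta[1] is the estimated treatment effect (near -0.3); se[1] its robust SE
```

With several treatment dummies, as in the tables below, each arm's column simply joins the design matrix and the control group is absorbed by the constant.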

Table 7: Regression Table for Figure 1

              Ukraine        Ukraine FP     Poroshenko     Russia         Russia FP      Putin
Info          -0.29∗ (0.09)  -0.26∗ (0.09)  -0.18 (0.09)    0.01 (0.10)   -0.00 (0.10)   -0.02 (0.11)
Source        -0.28∗ (0.09)  -0.32∗ (0.10)  -0.12 (0.09)    0.02 (0.10)    0.01 (0.10)   -0.03 (0.11)
Intention     -0.30∗ (0.10)  -0.33∗ (0.10)  -0.28∗ (0.10)  -0.02 (0.11)   -0.11 (0.10)   -0.21 (0.11)
Constant       2.89∗ (0.06)   2.81∗ (0.07)   2.82∗ (0.07)   2.19∗ (0.07)   2.05∗ (0.07)   2.13∗ (0.08)
Observations   808            689            580            819            774            808
R-Squared      .018           .024           .015           .0002          .0021          .0047

Cells are β (SE). Robust standard errors in parentheses. Note: OLS regression. Only includes individuals who passed reading checks. ∗ p < 0.05

Table 8: Regression Table for Figure 2

              Sanctions      Arms
Info           0.07 (0.11)   -0.02 (0.11)
Source        -0.05 (0.12)   -0.13 (0.11)
Intention     -0.06 (0.13)   -0.25∗ (0.12)
Constant       3.30∗ (0.08)   2.44∗ (0.08)
Observations   803            808
R-Squared      .0016          .007

Cells are β (SE). Robust standard errors in parentheses. Note: OLS regression. ∗ p < 0.05

Table 9: Full Sample Robustness Check

              Ukraine        Ukraine FP     Poroshenko     Russia         Russia FP      Putin
Info          -0.26∗ (0.09)  -0.27∗ (0.09)  -0.18 (0.09)    0.01 (0.10)   -0.00 (0.10)   -0.01 (0.11)
Source        -0.24∗ (0.09)  -0.30∗ (0.09)  -0.09 (0.09)   -0.01 (0.10)   -0.02 (0.10)   -0.06 (0.11)
Intention     -0.24∗ (0.09)  -0.26∗ (0.10)  -0.23∗ (0.10)   0.01 (0.10)   -0.05 (0.10)   -0.12 (0.11)
Constant       2.89∗ (0.06)   2.81∗ (0.07)   2.82∗ (0.07)   2.19∗ (0.07)   2.05∗ (0.07)   2.13∗ (0.08)
Observations   866            743            629            877            831            866
R-Squared      .013           .018           .011           .000086        .00041         .0017

Cells are b (se). Robust standard errors in parentheses. Note: OLS regression. Includes individuals who failed reading checks. ∗ p < 0.05

Table 10: Full Sample Robustness Check

              Sanctions      Arms
Info           0.06 (0.11)    0.00 (0.11)
Source        -0.05 (0.12)   -0.09 (0.11)
Intention     -0.02 (0.12)   -0.16 (0.11)
Constant       3.30∗ (0.08)   2.44∗ (0.08)
Observations   858            865
R-Squared      .0011          .0031

Cells are b (se). Robust standard errors in parentheses. Note: OLS regression. ∗ p < 0.05

Table 11: Treatment Effects with Controls

                   Ukraine        Ukraine FP     Poroshenko     Russia         Russia FP      Putin
Info               -0.30∗ (0.09)  -0.30∗ (0.09)  -0.14 (0.09)   -0.01 (0.10)   -0.10 (0.10)   -0.02 (0.11)
Source             -0.30∗ (0.09)  -0.37∗ (0.10)  -0.12 (0.10)   -0.04 (0.10)   -0.11 (0.10)   -0.08 (0.11)
Intention          -0.29∗ (0.10)  -0.34∗ (0.10)  -0.24∗ (0.10)  -0.03 (0.10)   -0.17 (0.10)   -0.21 (0.11)
Republican          0.05 (0.08)    0.04 (0.08)    0.02 (0.08)    0.13 (0.08)    0.22∗ (0.08)   0.35∗ (0.10)
Independent        -0.05 (0.10)    0.05 (0.10)   -0.22∗ (0.10)   0.24∗ (0.11)   0.21∗ (0.10)   0.26∗ (0.12)
Female             -0.07 (0.07)   -0.09 (0.07)   -0.18∗ (0.07)  -0.02 (0.07)    0.01 (0.07)   -0.03 (0.08)
30-49              -0.04 (0.08)    0.06 (0.08)    0.00 (0.08)   -0.22∗ (0.08)  -0.17∗ (0.08)  -0.05 (0.09)
50+                -0.02 (0.10)    0.12 (0.10)    0.04 (0.10)   -0.35∗ (0.10)  -0.32∗ (0.10)  -0.19 (0.12)
White              -0.03 (0.08)   -0.02 (0.09)    0.03 (0.09)   -0.07 (0.09)   -0.01 (0.09)   -0.12 (0.11)
Some College       -0.13 (0.16)   -0.13 (0.14)   -0.08 (0.17)   -0.24 (0.15)   -0.18 (0.14)   -0.30 (0.18)
College            -0.05 (0.16)    0.02 (0.14)   -0.12 (0.17)   -0.31∗ (0.15)  -0.21 (0.14)   -0.43∗ (0.18)
Post Graduate       0.10 (0.17)    0.08 (0.15)   -0.02 (0.18)   -0.49∗ (0.16)  -0.50∗ (0.15)  -0.67∗ (0.19)
Knowledge Ukraine   0.06 (0.03)    0.03 (0.04)   -0.04 (0.04)   -0.11∗ (0.04)  -0.18∗ (0.04)  -0.05 (0.04)
Constant            2.86∗ (0.20)   2.78∗ (0.19)   3.07∗ (0.22)   2.92∗ (0.21)   2.89∗ (0.20)   2.69∗ (0.24)
Observations        768            653            551            776            734            765
R-Squared           .039           .046           .034           .057           .097           .06

Cells are β (SE). Robust standard errors in parentheses. Note: OLS regression. Only includes individuals who passed reading checks. ∗ p < 0.05

Table 12: Treatment Effects with Controls

                   Sanctions      Arms
Info                0.12 (0.11)   -0.02 (0.12)
Source              0.05 (0.12)   -0.08 (0.12)
Intention          -0.05 (0.12)   -0.25∗ (0.12)
Republican         -0.32∗ (0.10)   0.14 (0.10)
Independent        -0.52∗ (0.14)  -0.35∗ (0.12)
Female             -0.12 (0.09)   -0.09 (0.08)
30-49               0.20∗ (0.10)  -0.07 (0.10)
50+                 0.36∗ (0.13)   0.20 (0.12)
White              -0.02 (0.11)   -0.02 (0.11)
Some College        0.06 (0.17)   -0.30 (0.18)
College             0.25 (0.17)   -0.12 (0.19)
Post Graduate       0.47∗ (0.19)  -0.14 (0.20)
Knowledge Ukraine   0.18∗ (0.04)   0.20∗ (0.04)
Constant            2.70∗ (0.25)   2.17∗ (0.24)
Observations        762            766
R-Squared           .089           .08

Robust standard errors in parentheses. Note: OLS regression. ∗ p < 0.05

Bayesian Additive Regression Trees (BART)

While heterogeneous treatment effects cannot be summarized by looking at the coefficients on interaction terms, one can plot the estimated treatment effect for each individual's covariate profile along with 95% confidence intervals. As these more robust models demonstrate, Russian messages tend to lower evaluations of Ukraine but do not have a significant effect on attitudes toward Russia itself.
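BART itself is typically fit with dedicated R packages. The sketch below illustrates only the underlying idea, covariate-profile-specific treatment effects, using a simple "T-learner" that substitutes linear outcome models for BART's sum-of-trees: fit one outcome model on the treated, one on the controls, and difference the predictions for every respondent. All data here are simulated and illustrative.

```python
import numpy as np

def fit_ols(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

def predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta

def t_learner_ite(X, d, y):
    """Unit-level treatment-effect estimates: separate outcome models for
    treated and control units, each predicted at every covariate profile.
    (BART would use flexible sum-of-trees models in place of OLS here.)"""
    b_treated = fit_ols(X[d == 1], y[d == 1])
    b_control = fit_ols(X[d == 0], y[d == 0])
    return predict(b_treated, X) - predict(b_control, X)

# Simulated example: the treatment effect shrinks with age.
rng = np.random.default_rng(1)
age = rng.uniform(18, 70, 1000)
d = rng.integers(0, 2, 1000)
y = 3.0 + d * (0.5 - 0.01 * age) + rng.normal(0, 0.3, 1000)
ite = t_learner_ite(age[:, None], d, y)
# plotting ite against age would show larger estimated effects for the young
```

BART's advantage over this linear stand-in is that it captures nonlinearities and interactions among covariates without specifying them in advance.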

Figure 39: BART Estimated Treatment Effects

Figure 40: BART Estimated Treatment Effects

Appendix B: The Cynical and the Conspiratorial

Placebo Posts

Figure 41: Placebo Posts

(a) Placebo #1 (b) Placebo #2

(c) Placebo #3 (d) Placebo #4

Figure 42: Placebo Posts

(a) Placebo #5 (b) Placebo #6

(c) Placebo #7 (d) Placebo #8

Survey Sample Characteristics

• Table 1 presents the sample characteristics of the MTurk survey in Study #1.

• Table 2 presents the sample characteristics of the MTurk survey in Study #2.

Table 13: Study 2: Summary Table

                          No.       %
Female
  Male                    417    44.36
  Female                  523    55.64
  Total                   940   100.00
Age Cohort
  18-29                   275    29.26
  30-49                   475    50.53
  50+                     190    20.21
  Total                   940   100.00
White
  Non-white               226    24.04
  White                   714    75.96
  Total                   940   100.00
Education
  High School or Lower    102    10.85
  Some College            324    34.47
  College                 346    36.81
  Postgraduate            168    17.87
  Total                   940   100.00
Party Identification
  Democrat                532    56.60
  Republican              298    31.70
  Independent             110    11.70
  Total                   940   100.00

Table 14: Study 1: Summary Table

                          No.       %
Gender
  Male                    480    48.00
  Female                  520    52.00
  Total                  1000   100.00
Age Cohort
  18-29                   286    28.60
  30-49                   549    54.90
  50+                     165    16.50
  Total                  1000   100.00
White
  Non-White               239    23.90
  White                   761    76.10
  Total                  1000   100.00
Education
  High School or Lower    108    10.80
  Some College            350    35.00
  College                 377    37.70
  Postgraduate            165    16.50
  Total                  1000   100.00
Party
  Republican              331    33.20
  Independent             123    12.34
  Democrat                543    54.46
  Total                   997   100.00

187 Balance Across Treatments

• Table 3 presents the balance test for Study #1.

• Table 4 presents the balance test for Study #2.

Table 15: Balance Across Treatments

                        Control   Propaganda   Inoculation   Total
Age Cohort
  18-29                      90           97            88     275
  30-49                     156          154           165     475
  50+                        68           57            65     190
  Total                     314          308           318     940
  Pearson chi2(4) = 1.7904   Pr = 0.774
Female
  Male                      140          135           142     417
  Female                    174          173           176     523
  Total                     314          308           318     940
  Pearson chi2(2) = 0.0525   Pr = 0.974
White
  Non-white                  70           73            83     226
  White                     244          235           235     714
  Total                     314          308           318     940
  Pearson chi2(2) = 1.2835   Pr = 0.526
Education
  High School or Lower       38           33            31     102
  Some College              107          104           113     324
  College                   119          110           117     346
  Postgraduate               50           61            57     168
  Total                     314          308           318     940
  Pearson chi2(6) = 2.4903   Pr = 0.870
Party
  Independent                99           88            76     263
  Republican                 71           70            89     230
  Democrat                  134          141           145     420
  Total                     304          299           310     913
  Pearson chi2(4) = 6.1903   Pr = 0.185

188 Table 16: Balance Across Treatments

                        Control   Information   Source   Inoculation   Total
Gender
  Male                      113           132      126           109     480
  Female                    140           118      128           134     520
  Total                     253           250      254           243   1,000
  Pearson chi2(3) = 4.6606   Pr = 0.198
Age Cohort
  18-29                      81            63       65            77     286
  30-49                     127           135      152           135     549
  50+                        45            52       37            31     165
  Total                     253           250      254           243   1,000
  Pearson chi2(6) = 11.5556   Pr = 0.073
White
  Non-White                  55            58       62            64     239
  White                     198           192      192           179     761
  Total                     253           250      254           243   1,000
  Pearson chi2(3) = 1.5469   Pr = 0.671
Education
  High School or Lower       33            14       29            32     108
  Some College               91            96       86            77     350
  College                    90           101       92            94     377
  Postgraduate               39            39       47            40     165
  Total                     253           250      254           243   1,000
  Pearson chi2(9) = 12.4589   Pr = 0.189
Party
  Republican                 83            93       79            76     331
  Independent                26            27       30            40     123
  Democrat                  144           129      143           127     543
  Total                     253           249      252           243     997
  Pearson chi2(6) = 7.6021   Pr = 0.269

Robustness with Manipulation Checks

Many experimenters are concerned that participants do not read treatment materials or answer survey questions flippantly, introducing noise and decreasing the validity of their results (Oppenheimer, Meyvis and Davidenko 2009). Excluding participants who fail reading checks, or screening out individuals who completed the survey in less than the minimum amount of time needed to finish it, are potential solutions to this problem (Chen 2018, 7), but they can also lead to post-treatment bias (Montgomery, Nyhan and Torres 2018). I present the results for the full sample in the main text, but provide robustness checks based on reading checks and time taken to complete the survey here. Results are nearly identical across these different samples.

• Figures 1 and 2 present the effect of the treatments across different samples (with controls; those who passed reading checks; those who took at least the minimum amount of time to complete the survey).
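The sample restrictions described above amount to simple post-treatment filters. A minimal sketch in Python; the record fields and the 60-second cutoff are illustrative assumptions, not the survey's actual codebook.

```python
# Hypothetical respondent records (fields are illustrative).
respondents = [
    {"id": 1, "passed_check": True,  "seconds": 240, "cynicism": 3.2},
    {"id": 2, "passed_check": False, "seconds": 250, "cynicism": 2.8},
    {"id": 3, "passed_check": True,  "seconds": 45,  "cynicism": 4.0},
    {"id": 4, "passed_check": True,  "seconds": 300, "cynicism": 3.6},
]

MIN_SECONDS = 60  # illustrative minimum plausible completion time

full_sample   = respondents
passed_checks = [r for r in respondents if r["passed_check"]]
not_speeders  = [r for r in respondents if r["seconds"] >= MIN_SECONDS]

# The treatment-effect models are then re-estimated on each sample;
# because these exclusions happen after random assignment, comparing the
# three samples guards against post-treatment bias driving the results.
```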

Figure 43: Treatment Effects on Political Cynicism Scale

Figure 44: Treatment Effects on Conspiracy Scale

Interaction Effects

I check whether age, political ideology, and education moderate the effect of the treatments on cynicism across the two studies.

Figure 45: Age Interaction - Cynicism

[Study #2: Political Cynicism − Age Interaction. Treatment effect (y-axis) plotted against age, 20–70, for the Propaganda and Inoculation conditions.]

Figure 46: Ideology Interaction - Cynicism

[Study #2: Political Cynicism − Ideology Interaction. Treatment effect plotted across ideology, Very liberal to Very conservative, for the Propaganda and Inoculation conditions.]

192 Figure 47: Education Interaction - Cynicism

[Study #2: Political Cynicism − Education Interaction. Treatment effect plotted across education, High School or Lower to Postgraduate, for the Propaganda and Inoculation conditions.]

Figure 48: Age Interaction - Cynicism

[Study #1: Political Cynicism − Age Interaction. Treatment effect plotted against age, 20–70, for the Information, Source, and Inoculation conditions.]

Figure 49: Ideology Interaction - Cynicism

[Study #1: Political Cynicism − Ideology Interaction. Treatment effect plotted across ideology, Very liberal to Very conservative, for the Information, Source, and Inoculation conditions.]

Figure 50: Education Interaction - Cynicism

[Study #1: Political Cynicism − Education Interaction. Treatment effect plotted across education, High School or Lower to Postgraduate, for the Information, Source, and Inoculation conditions.]

Appendix C: Propaganda’s Presumed Influence

Survey Questions (Dependent Variables)

How effective do you think Russian propaganda was at influencing your voting decision during the 2016 U.S. presidential election?

1. Not at all effective

2. Not very effective

3. Somewhat effective

4. Very effective

How effective do you think Russian propaganda was at influencing other citizens’ voting during the 2016 U.S. presidential election?

1. Not at all effective

2. Not very effective

3. Somewhat effective

4. Very effective

How effective do you think Russian propaganda was at influencing Democrats’ voting during the 2016 U.S. presidential election?

1. Not at all effective

2. Not very effective

3. Somewhat effective

4. Very effective

How effective do you think Russian propaganda was at influencing Republicans’ voting during the 2016 U.S. presidential election?

1. Not at all effective

2. Not very effective

3. Somewhat effective

4. Very effective

Russian propaganda on social media helped sway the results of the 2016 Presidential election

1. Strongly disagree ... 7. Strongly agree

The presence of Russian propaganda during the campaign makes the results of the 2016 Presidential election illegitimate

1. Strongly disagree ... 7. Strongly agree

Any Russian-backed television station should be banned

1. Strongly disagree ... 7. Strongly agree

Congress should increase regulations on online news

1. Strongly disagree ... 7. Strongly agree

Balance Test

People with higher levels of political awareness are more likely to be in the Control group. There are also fewer Republicans in the Inoculation group. Since these factors are related to the outcome variables, I control for a series of covariates in my main results and present robustness checks without controls in Online Appendix F.

Figure 51: Balance Test for Sample

[Covariate balance (Female, 30−49, 50+, White, Some College, College, Postgraduate, Republican, Democrat, Political Awareness, Political Interest, Distrust in Media) plotted for the Control, Propaganda, and Inoculation groups.]

Robustness Checks

Standard errors in parentheses. Model 1 examines the effect of the treatments in the full sample with no controls. Model 2 examines the effect of the treatments in the full sample with controls. Model 3 examines the effect of the treatments in the sample of individuals who took a sufficient amount of time to take the survey with no controls. Model 4 examines the effect of the treatments in the sample of individuals who took a sufficient amount of time to take the survey with controls.

Figure 52: Robustness Checks

Sensitivity Analysis for Mediation Analysis

Figure 53: Sensitivity Analysis for Presumed Propaganda Effect on Others on Illegitimacy

Figure 54: Sensitivity Analysis for Presumed Propaganda Effect on Others on Support for Censorship
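The mediation estimates and their sensitivity analysis follow the causal mediation framework implemented in the mediation R package (Tingley et al. 2014, cited in the references). As a minimal sketch of the core quantity, the average causal mediation effect (ACME) under linear models and sequential ignorability is the product of the treatment-to-mediator and mediator-to-outcome coefficients, with a bootstrap for uncertainty; all variable names and data below are illustrative, not the study's.

```python
import numpy as np

def ols(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

def acme(t, m, y, n_boot=500, seed=0):
    """ACME under linear models and sequential ignorability:
    (effect of t on m) x (effect of m on y, holding t fixed)."""
    def point(t, m, y):
        a = ols(t[:, None], m)[1]               # treatment -> mediator
        b = ols(np.column_stack([t, m]), y)[2]  # mediator -> outcome | t
        return a * b
    est = point(t, m, y)
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(t), (n_boot, len(t)))
    boots = np.array([point(t[i], m[i], y[i]) for i in idx])
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return est, (lo, hi)

# Simulated example with a true ACME of 0.4 * 0.5 = 0.2.
rng = np.random.default_rng(2)
t = rng.integers(0, 2, 800).astype(float)      # treatment (e.g., inoculation)
m = 0.4 * t + rng.normal(0, 1, 800)            # mediator (presumed effect on others)
y = 0.5 * m + 0.1 * t + rng.normal(0, 1, 800)  # outcome (e.g., perceived illegitimacy)
est, ci = acme(t, m, y)
```

The sensitivity plots in Figures 53 and 54 then ask how large a correlation between the mediator and outcome error terms would have to be for the estimated ACME to vanish.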

Online Media Regulation

I find that people who tend to think propaganda shaped other people's voting behavior are also more likely to support more congressional regulation of online media, even when controlling for partisanship and political awareness.

[Predicted agreement with “Congress should increase regulations on online news” (Strongly disagree to Strongly agree), plotted against beliefs about propaganda’s effect on others (Not at all effective to Very effective).]

Figure 55: Dependent variable is support for regulating online media. Independent variable is belief about propaganda’s influence on others’ voting behavior. All controls included. Results disaggregated by partisans (including leaners). Horizontal lines represent 95% confidence intervals for estimates.

Bayesian Additive Regression Trees (BART)

As one can see in the graphs, the inoculation and, to a lesser extent, the propaganda tend to lower individuals' perception that propaganda is effective on others. However, more research with larger and more representative samples will be necessary to assess the full extent of heterogeneous treatment effects, especially since strategic information campaigns are highly adept at micro-targeting specific groups to maximize effectiveness.

Figure 56: BART Estimated Treatment Effects

Appendix D: How To Criticize an Autocrat

Summary Statistics

Table 17: Summary Table

                      No.       %
Female
  Male                457    47.85
  Female              498    52.15
  Total               955   100.00
Age
  18-29               210    21.97
  30-39               259    27.09
  40-49               191    19.98
  50-59               154    16.11
  60+                 142    14.85
  Total               956   100.00
Married
  No                  427    44.67
  Yes                 529    55.33
  Total               956   100.00
Big City
  No                  312    32.67
  Yes                 643    67.33
  Total               955   100.00
Russian
  No                  115    12.03
  Yes                 841    87.97
  Total               956   100.00
Orthodox
  No                  489    51.15
  Yes                 467    48.85
  Total               956   100.00
Regime Support
  Non-Voter           218    22.85
  Opposition Voter    177    18.55
  Regime Voter        559    58.60
  Total               954   100.00

Balance Table

Table 18: Balance Test

                      Domestic  Foreign  Domestic   Foreign    Domestic  Foreign
                      Control   Control  One-Sided  One-Sided  Balanced  Balanced  Total
Female
  Male                      72       74         80         81        77        73    457
  Female                    88       87         79         79        81        84    498
  Total                    160      161        159        160       158       157    955
  Pearson chi2(5) = 1.7960   Pr = 0.877
Age
  18-29                     35       33         44         29        35        34    210
  30-39                     40       48         29         45        47        50    259
  40-49                     39       32         31         31        31        27    191
  50-59                     23       23         33         23        23        29    154
  60+                       24       25         22         32        22        17    142
  Total                    161      161        159        160       158       157    956
  Pearson chi2(20) = 21.5382   Pr = 0.366
Married
  No                        79       75         68         70        66        69    427
  Yes                       82       86         91         90        92        88    529
  Total                    161      161        159        160       158       157    956
  Pearson chi2(5) = 2.3563   Pr = 0.798
Big City
  No                        55       48         58         51        42        58    312
  Yes                      106      113        100        109       116        99    643
  Total                    161      161        158        160       158       157    955
  Pearson chi2(5) = 5.9426   Pr = 0.312
Russian
  No                        21       22         18         15        22        17    115
  Yes                      140      139        141        145       136       140    841
  Total                    161      161        159        160       158       157    956
  Pearson chi2(5) = 2.4541   Pr = 0.783
Orthodox
  No                        87       84         70         82        82        84    489
  Yes                       74       77         89         78        76        73    467
  Total                    161      161        159        160       158       157    956
  Pearson chi2(5) = 4.2190   Pr = 0.518
Regime Support
  Non-Voter                 40       40         35         30        38        35    218
  Opposition Voter          33       26         28         25        33        32    177
  Regime Voter              88       94         96        104        87        90    559
  Total                    161      160        159        159       158       157    954
  Pearson chi2(10) = 6.3235   Pr = 0.787

Robustness Checks

The tables below show the regression results used to recreate the figures in the main text, along with robustness checks that add control variables. They also show the treatment effects under an alternative operationalization of regime support: I check whether excluding Vladimir Zhirinovsky supporters from the regime-supporter category changes the results. None of the main results change. However, opposition voters are now more likely to support Putin when exposed to one-sided foreign criticism, indicating that Zhirinovsky voters are reacting unfavorably to foreign criticism.

Table 19: Domestic Criticism Effect on Support for Putin

                    No Controls       Controls
One-Sided            0.04 (0.03)      0.03 (0.03)
Balanced             0.02 (0.03)      0.03 (0.02)
Female                                0.02 (0.02)
Age                                  -0.00 (0.00)
Married                              -0.03 (0.02)
Big City                             -0.00 (0.02)
Russian                              -0.03 (0.04)
Orthodox                              0.02 (0.02)
Opposition Voter                     -0.07∗ (0.03)
Regime Voter                          0.20∗∗∗ (0.03)
Internet                             -0.09∗∗∗ (0.02)
Constant             0.49∗∗∗ (0.02)   0.50∗∗∗ (0.06)
Observations         492              489
R-Squared            .0045            .26

Note: Dependent variable is support for Vladimir Putin. Treatment effects for individuals who received domestic criticism only. Robust standard errors in parentheses. OLS regression. ∗ p < 0.05, ∗∗ p < 0.01, ∗∗∗ p < 0.001

Table 20: Foreign Criticism Effect on Support for Putin

                    No Controls       Controls
One-Sided            0.01 (0.03)     -0.00 (0.03)
Balanced            -0.05 (0.03)     -0.04 (0.03)
Female                                0.02 (0.02)
Age                                   0.00 (0.00)
Married                              -0.02 (0.02)
Big City                              0.01 (0.02)
Russian                              -0.00 (0.03)
Orthodox                              0.04 (0.02)
Opposition Voter                     -0.09∗∗ (0.03)
Regime Voter                          0.18∗∗∗ (0.03)
Internet                             -0.07∗∗∗ (0.02)
Constant             0.52∗∗∗ (0.02)   0.39∗∗∗ (0.06)
Observations         475              471
R-Squared            .01              .26

Note: Dependent variable is support for Vladimir Putin. Treatment effects for individuals who received foreign criticism only. Robust standard errors in parentheses. OLS regression. ∗ p < 0.05, ∗∗ p < 0.01, ∗∗∗ p < 0.001

Table 21: Domestic Criticism Effect on Support for Putin

                                No Controls     Controls
One-Sided                        0.01 (0.06)    0.03 (0.06)
Balanced                         0.06 (0.06)    0.07 (0.05)
Opposition Voter                -0.05 (0.06)   -0.04 (0.06)
Regime Voter                     0.22∗ (0.04)   0.21∗ (0.04)
One-Sided X Opposition Voter     0.01 (0.09)    0.00 (0.09)
One-Sided X Regime Voter         0.02 (0.07)    0.01 (0.07)
Balanced X Opposition Voter     -0.09 (0.08)   -0.10 (0.08)
Balanced X Regime Voter         -0.04 (0.06)   -0.04 (0.06)
Female                                          0.03 (0.02)
Age                                            -0.00 (0.00)
Married                                        -0.03 (0.02)
Big City                                        0.00 (0.02)
Russian                                        -0.03 (0.04)
Orthodox                                        0.02 (0.02)
Internet                                       -0.09∗ (0.02)
Constant                         0.38∗ (0.04)   0.47∗ (0.07)
Observations                     474            471
R-Squared                        .23            .27

Note: Dependent variable is support for Vladimir Putin. Treatment effects for individuals who received domestic criticism only. Robust standard errors in parentheses. OLS regression. ∗ p < 0.05, ∗∗ p < 0.01, ∗∗∗ p < 0.001

Table 22: Foreign Criticism Effect on Support for Putin

                                No Controls     Controls
One-Sided                        0.00 (0.06)    0.01 (0.06)
Balanced                        -0.16∗ (0.06)  -0.14∗ (0.06)
Opposition Voter                -0.16∗ (0.07)  -0.15∗ (0.07)
Regime Voter                     0.15∗ (0.05)   0.15∗ (0.05)
One-Sided X Opposition Voter     0.03 (0.09)   -0.00 (0.09)
One-Sided X Regime Voter        -0.01 (0.07)   -0.02 (0.07)
Balanced X Opposition Voter      0.20∗ (0.09)   0.17∗ (0.08)
Balanced X Regime Voter          0.13∗ (0.07)   0.11 (0.06)
Female                                          0.01 (0.02)
Age                                             0.00 (0.00)
Married                                        -0.02 (0.02)
Big City                                        0.01 (0.02)
Russian                                         0.00 (0.03)
Orthodox                                        0.04 (0.02)
Internet                                       -0.07∗ (0.02)
Constant                         0.45∗ (0.04)   0.42∗ (0.06)
Observations                     473            471
R-Squared                        .24            .27

Note: Dependent variable is support for Vladimir Putin. Treatment effects for individuals who received foreign criticism only. Robust standard errors in parentheses. OLS regression. ∗ p < 0.05, ∗∗ p < 0.01, ∗∗∗ p < 0.001

Table 23: Domestic Criticism Effect on Support for Putin

                                No Controls     Controls
One-Sided                        0.01 (0.06)    0.03 (0.06)
Balanced                         0.06 (0.06)    0.07 (0.05)
Opposition Voter                -0.01 (0.05)   -0.00 (0.05)
Regime Voter                     0.24∗ (0.05)   0.23∗ (0.04)
One-Sided X Opposition Voter     0.01 (0.08)   -0.01 (0.08)
One-Sided X Regime Voter         0.03 (0.07)    0.01 (0.07)
Balanced X Opposition Voter     -0.10 (0.08)   -0.10 (0.07)
Balanced X Regime Voter         -0.05 (0.07)   -0.05 (0.06)
Female                                          0.03 (0.02)
Age                                            -0.00 (0.00)
Married                                        -0.02 (0.02)
Big City                                       -0.01 (0.02)
Russian                                        -0.03 (0.04)
Orthodox                                        0.02 (0.02)
Internet                                       -0.08∗ (0.02)
Constant                         0.38∗ (0.04)   0.48∗ (0.07)
Observations                     474            471
R-Squared                        .25            .28

Note: Dependent variable is support for Vladimir Putin. Treatment effects for individuals who received domestic criticism only. Robust standard errors in parentheses. OLS regression. ∗ p < 0.05, ∗∗ p < 0.01, ∗∗∗ p < 0.001

Table 24: Foreign Criticism Effect on Support for Putin

                                No Controls     Controls
One-Sided                        0.00 (0.06)    0.00 (0.06)
Balanced                        -0.16∗ (0.06)  -0.14∗ (0.06)
Opposition Voter                -0.13∗ (0.06)  -0.13∗ (0.06)
Regime Voter                     0.20∗ (0.05)   0.19∗ (0.05)
One-Sided X Opposition Voter     0.06 (0.08)    0.05 (0.08)
One-Sided X Regime Voter        -0.04 (0.07)   -0.04 (0.07)
Balanced X Opposition Voter      0.20∗ (0.08)   0.18∗ (0.08)
Balanced X Regime Voter          0.12 (0.06)    0.10 (0.06)
Female                                         -0.00 (0.02)
Age                                             0.00 (0.00)
Married                                        -0.02 (0.02)
Big City                                       -0.00 (0.02)
Russian                                         0.00 (0.03)
Orthodox                                        0.04 (0.02)
Internet                                       -0.07∗ (0.02)
Constant                         0.45∗ (0.04)   0.46∗ (0.06)
Observations                     473            471
R-Squared                        .2731151       .3052137

Note: Dependent variable is support for Vladimir Putin. Treatment effects for individuals who received foreign criticism only. Robust standard errors in parentheses. OLS regression. ∗ p < 0.05, ∗∗ p < 0.01, ∗∗∗ p < 0.001

Characteristics of Different Citizens

I demonstrate how regime voters, opposition voters, and non-voters differ across a host of characteristics. Notably, non-voters are more apolitical than regime voters and opposition voters. They are also more likely to trust only foreign media and to use the internet for news rather than traditional media.

[Covariate profiles of Non-Voters, Opposition Voters, and Regime Voters: Female, Age, Education, Married, Big City, Russian, How Religious, Orthodox, Political Interest, Internet, Social Media Use, Media Optimist, Domestic Media Truther, Foreign Media Truther, and Attitudes toward the U.S.]

Figure 57: Multinomial logistic regression. Horizontal lines represent 95% confidence intervals for estimates.

Bayesian Additive Regression Trees (BART)

Figure 58: BART estimated treatment effects

Figure 59: BART estimated treatment effects
