
From Anti-Vaxxer Moms to Militia Men: Influence Operations, Narrative Weaponization, and the Fracturing of American Identity

Dana Weinberg, PhD (contact) & Jessica Dawson, PhD

ABSTRACT

How in 2020 were anti-vaxxer moms mobilized to attend reopen protests alongside armed militia men? This paper explores the power of weaponized narratives on social media both to create and polarize communities and to mobilize collective action and even violence. We propose that focusing on the invocation of specific narratives and the patterns of narrative combination provides insight into the shared sense of identity and meaning different groups derive from these narratives. We then develop the WARP (Weaponize, Activate, Radicalize, Persuade) framework for understanding the strategic deployment and presentation of narratives in relation to group identity building and individual responses. The approach and framework provide powerful tools for investigating the way narratives may be used to speak to a core audience of believers while also introducing and engaging new and even initially unreceptive audience segments with potent cultural messages, potentially inducting them into a process of radicalization and mobilization.

Keywords: influence operations, narrative, social media, radicalization, weaponized narratives

Introduction

In the spring of 2020, several Facebook (FB) groups coalesced around protests of states' coronavirus-related shutdowns. Many of these FB pages developed out of concerted action by gun-rights activists (Stanley-Becker and Romm 2020). The FB groups ostensibly started as forums for people to express their discontent with their state's shutdown policies, but the bulk of their membership hailed from outside the targeted local communities. Moreover, these online groups quickly devolved into hotbeds of conspiracy theories, malign information, and hate speech (Finkelstein et al. 2020; Perry, Whitehead, and Grubbs 2020; Thomas and Zhang 2020). In addition to voicing concerns about economic issues and individual rights, these groups actively rejected evolving medical advice about recommended precautions such as masks to prevent the spread of COVID-19, promoting both distrust and misinformation. Their messages were spread aggressively by bots and trolls, in patterns similar to Russian interference in the 2016 election, leading to speculation of involvement by foreign actors such as Russia, if not in organizing the groups then in amplifying and promoting them (Benson 2020). Through online promotion of their offline activities, the FB groups exaggerated the reach and size both of the community against quarantine and of those in support of various fringe ideologies (Beskow and Carley 2019; Timberg, Dwoskin, and Balingit 2020). Moreover, in addition to mobilizing people to show up at local protests, these groups have been linked to real-world violence (Schulte and Eggert 2020; Wood 2020). Social media has the potential to bring communities together as well as to polarize them, both in the virtual world and in the real world (Benkler, Faris, and Roberts 2018).
The algorithms of profit-driven social media platforms encourage both community building and polarization (Tufekci 2018), promoting content that encourages active and extended engagement in order to make audiences available to advertisers (Zuboff 2019). Consequently, online engagement reinforces underlying preferences, beliefs, and group memberships, creating echo chambers while distancing us from others who are not like us. It also provides opportunities for geographically and socially distant groups and individuals with common interests to find one another, as well as opportunities for inauthentic actors to fake identities and exploit these online interactions. Thus, social media provides an unprecedented tool for building collective identity among a wide array of interest groups and, in malign cases, for social manipulation and even social cyber warfare (Beskow and Carley 2019). Aside from understanding that social media influencers may magnify the reach and even the resonance of particular messages (McDonald, Bail, and Tavory 2017), we have yet to understand why some messages resonate strongly enough to influence behavior on- and offline. How, for example, did the FB reopen groups persuade anti-vaxxer moms vocally concerned about the safety of vaccines for children to attend reopen protests alongside armed militia men protesting government restriction of personal freedom of movement? Why did the groups' false information and conspiracy theories spread faster and further on social media than accurate information? How were their messages so potent that they encouraged not only attendance at protests but prolonged online engagement well beyond the protests and even offline acts of violence? Previous research suggests the answer lies in large part in the identity-invoking narratives embedded in social media messages.
Taking a cultural sociology approach, we define narratives as stories from which we derive cultural meaning, whether in relation to broader culture or to our individual “scripts of self” (Lamont 2019). In this paper, we argue that narratives are a powerful tool in the affinity-focused social media environment for building group cohesion, deepening

conflict, and influencing behavior. We contend that actors, both foreign and domestic, are leveraging and manipulating narratives in an effort to exacerbate existing schisms in American society. We explore how narratives are being used to make destabilizing messages appeal to mainstream audiences, and we advocate an approach to studying social media and other types of messaging that examines narrative content and narrative combinations and adjacencies. We then propose a theoretical framework for deep and rigorous investigation of narrative content in social media and other types of persuasive messaging, in particular the role of weaponized narratives, in creating messages that resonate deeply enough with people to shape behavior and identities both online and offline, leading in extreme cases to radicalization.

Foreign and Domestic Malign Influence Operations and the Role of Narrative

Since at least 2014, Russia has engaged in a dedicated influence operation against the United States (116th Congress 2017; Greenberg 2019; Pomerantsev 2019; Samuels 2018). Although the approach is distinctly modern in its use of social media platforms, the methods align with old Soviet doctrine of using disinformation to gain a strategic edge over competitors (Bagge 2019; Department of State 1987; Gioe, Lovering, and Pachesny 2020; Greenberg 2019). The U.S. Senate Intelligence Committee suggests that the purpose of this ongoing operation has been to sow discord and undermine Americans' trust in democratic institutions (116th Congress 2017). This concerted operation serves to create a potent cadre of "useful idiots" (Gioe 2018; Jankowicz 2020b; Michael 2019). In keeping with classic Soviet tactics (Rid 2020), social media enables current Russian influence operations to enlist large swaths of the American public as witting or unwitting accomplices in spreading and legitimating misinformation and subversive content that serves to exacerbate existing schisms in American society.
The Russians are not the only foreign power engaged in social cyber warfare with the US. Other foreign powers are also engaged in information warfare on social media, with Twitter, for example, having identified information operations with roots in China, Iran, and Turkey (Chamberlain 2020; Twitter Transparency Center 2020). At the same time, domestic extremist and terrorist groups are launching their own influence campaigns (Department of Homeland Security 2009; Goldenberg and Finkelstein 2020). These operations may plant narrative seeds of discord and radicalism in our communities, but they may also reap and amplify existing, homegrown narratives. Counter-cultural groups online sometimes describe their activities as "slipping red pills to normies" (Nagle 2017), a reference to the Matrix movies, in which the protagonist is offered a red pill that will change his view of everything. Facebook (FB) groups and subreddits have become places where "normies" and "newbies" are exposed to more fringe messages along with the more benign messages that might have attracted them in the first place (Farrell et al. 2020; Shepherd 2020). A growing body of research in cultural sociology has pointed to the persuasive and identity-building capacities of narratives for social movements and the way activists and stigmatized groups make strategic use of narratives, or storytelling, to shape understandings and mobilize participants (Braddock 2020b). As Polletta points out, "storytelling is able to secure a sympathetic hearing for positions unlikely to gain such a hearing otherwise" (Polletta 2009:105), in large part because stories are open to interpretation. Characteristically ambiguous in their moral, narratives invite others to attribute their own meaning, thereby making new or minority ideas or opinions less antagonistic or threatening (see also Polletta and Chen 2012).

Beyond a vehicle for relatively benign introduction of new ideas, narratives also have the capacity to engage new recruits (Polletta 1998). Successful claims to legitimacy require close adherence to dominant cultural codes (Alexander and Smith 1993; Smith 2010), driving the strategic decisions actors make about conforming to and challenging culture in the narratives they choose to tell and how they tell them (Polletta and Chen 2012). The sharing of narratives is a social activity, and the storyteller strings together a chain of events that work together to convey a point to the audience. However, the moral or point of a narrative is often not contained within the story itself but rather is what Polletta (2008) terms "allusive." Polletta and Callahan (2017) posit that the audience derives a story's meaning by reference to other stories it invokes, for example: "We hear a story of a little guy going up against a big guy, and we recognize them as David and Goliath. We hope David will win and, if he does, we take the message that cleverness can triumph over brute force. Stories' persuasive power lies in their ability to call up other compelling stories" (2017:3). In Polletta and Callahan's formulation, the most powerful narratives can be invoked by simple reference, usually to the story's protagonist, because the audience already knows the story. Moreover, stories have the capacity to build collective identity, not through didactic lectures on morals, but through their personal nature. Audiences might take pleasure in catching a reference to a story they already know, especially if it is a story known mostly to insiders (Polletta and Callahan 2017), as in the use of dog whistles (Albertson 2006). But stories also carry deeper collective meaning when they hit a salient emotional chord, whether or not they are true. Hochschild terms these stories that feel as if they are true "deep stories" (Hochschild 2016).
Nostalgia narratives, narratives that hark back to a glorified version of the past, are a potent example of potentially deep stories that can construct or reconstruct identity. Nostalgia narratives provide a way for groups collectively to rewrite their history and affirm a positive identity (Maly, Dalmage, and Michaels 2013) and can be used, for example, to "defend territory, to create a sense of authenticity, and to give legitimacy to a way of life" (Kasinitz and Hillyard 1995:161). As such, these powerful narratives can provide the basis for collective action (Ocejo 2011). In influence operations, the power of narratives to shape identities and to drive interpretation of accounts and events is perhaps their most important feature. In his account of how Cambridge Analytica used micro-targeting to interfere in the 2016 election, Chris Wylie provides insight into the identity-affirming and fortifying aspect of narratives using the example of Fox News:

… the network was conditioning people's sense of identity into something that could be weaponized. Fox fuels anger with its hyperbolic narratives because anger disrupts the ability to seek, rationalize and weigh information. … With their guard down, Fox's audience is then told they are part of a group of 'ordinary Americans'. This identity is hammered home over and over, which is why there are so many references to 'us' and direct chatting to the audience by the moderators. The audience is reminded that if you are really an 'ordinary American,' this is how you—i.e., 'we'—think. This primes people for identity-motivated reasoning, which is a bias that essentially makes people accept or reject information based on how well it serves to build or threaten group identity rather than on the merits of the content. This motivated reasoning is how Democrats and Republicans can watch the exact same newscast and reach the opposite conclusion.
But I began to understand that Fox works because it grafts an identity onto the viewers, who then begin to interpret a debate about ideas as an attack on their identity. This in turn triggers a reactance effect, whereby alternative viewpoints actually strengthen the audience's resolve in their original belief, because they sense a threat to their personal freedom…. This has an insidious effect in which the more debate occurs, the more entrenched the audience becomes. (Wylie 2019:76–77)

Wylie's description portrays a moralizing based not in human universals of harm and fairness, which are the terrain of moral psychology, but in the moral boundaries of group identity (Hitlin and Vaisey 2013). In this process, the moral psychology of harm and fairness becomes defined by the boundaries of group membership (Dawson 2016). Wylie's observations suggest the use of various narrative strategies and help explain the positional entrenchment, polarization, and seeming irrelevance and dismissal of facts that have recently become frequently observed features of the American political and social landscape. While Wylie points explicitly to narrative in Fox's use of "hyperbolic narratives," or what Smith (2010) might refer to as "narrative inflation" or "apocalyptic narratives," his account also points to the strategic use of personalized narratives (direct chatting to the audience and use of personal pronouns) and narratives of group identity. Narratives are thus the tool that invokes group boundaries and delivers moralized content in relation to collective identity (Alexander and Smith 1993). Russian election interference and other influence operations have utilized micro-targeting to deliver personalized messaging to potentially receptive recipients (Cadwalladr and Graham-Harrison 2018; Kaiser 2019; Wylie 2019; Zuboff 2019). The Cambridge Analytica methodology used by both the Russians and political campaigns leveraged both demographic information and personality tests (Flick 2016; Kramer 2012; Matz et al.
2017), focusing especially on "persuadable neurotics" who might most easily be lured into the outrage cycle that sets individuals up for emotion-laden identity formation and entrenchment of positions on political and social issues (Youyou, Kosinski, and Stillwell 2015). Twitter, Facebook, Instagram, and YouTube all have pay-to-play programs that boost visibility to target audiences, but organic reach can also be used to micro-target, an activity the platforms have aided, perhaps unwittingly, with their content suggestion algorithms. In the Cambridge Analytica case (Forbruker Radet 2020; Schudson 1986; Soriano, Au, and Banks 2013), these micro-targeting campaigns were successfully deployed to populations targeted for voter suppression (Channel 4 News Investigations Team 2020:4) and mobilized them for and against a variety of candidates and causes. Applying the lens of narrative to the Cambridge Analytica playbook helps illuminate the process through which susceptible individuals were targeted and emotionally manipulated to influence their behavior. It might be unusual in academic circles to conceptualize influence operations as marketing campaigns, but the marketing angle, especially when viewed in conjunction with narrative, seems to encompass a great deal of what's involved in designing content to appeal to particular types of individuals, delivering it to them, and calling them to action. In sum, narratives elicit and play on emotion by tapping into deeply held beliefs and values. From a cultural sociology perspective, collective identity shapes the moral meaning derived from narratives. From a marketing perspective, narratives may be targeted, deployed, and delivered to particular individuals and market segments to encourage particular attitudes and behaviors. From an influence operations perspective, narrative weaponization can be used to move populations toward open conflict.
Moreover, shared interpretations serve as an expression, not of one's individual morality, but of shared identity based on group membership. As Polletta and Callahan conclude in their research, "stories may have political impact less by persuading than by reminding people which side they are on" (2017:14).

Methodological Approach to Studying Narrative Weaponization

In considering the spread of malign information, much work has examined the network spread of messages. However, a focus on narratives, while compatible with this approach, also offers the possibility of examining the network of narrative content. Such an examination has the potential to provide powerful explanations about the ideological and emotional pathways through which different social groups and movements are linked. For example, the FB groups leveraged opportunities to attract and induct new audiences into ideological communities adjacent to concerns about state quarantine policies and simultaneously mobilized reopen protests. While reopen protestors, anti-vaxxers, and Boogaloo Bois might seem strange bedfellows, they united in the anti-lockdown FB groups around a highly resonant discourse built on existing, homegrown narratives of betrayal by the government and corrupt elites. Narratives of government betrayal invoke the government's failures following Vietnam (Belew 2018; Dawson and Weinberg 2020; Gibson 1994) and the betrayal of soldiers' sacrifices by government officials and elites characterized as looking out for their own self-interest rather than the best interest of the nation (Gibson 1989). These narratives grew to prominence on the American right following the defeat in Vietnam (Belew 2018; Dawson 2019; Gibson 1994) and have continued to gain strength with our protracted military involvement in Iraq and Afghanistan (Bolger 2014). They also gained traction during the Civil Rights Movement, both among groups challenging the state for equal rights and protections and among groups who perceived that the state and liberal elites undermined their interests when those rights and protections were enforced (Rodman 2019).
These narratives tend to present in the company of narratives of betrayal of "regular Americans," particularly by privileged elites, and narratives of being stripped of power and needing to take it back (Belew 2018; Rodman 2019). To date, the narrative research literature is largely qualitative, based in deep content analysis, and situated in meaning. However, intensive qualitative methodologies do not easily translate to the big data scale of social media. For this reason, researchers in the social media space have turned to topic modelling and other machine-learning algorithms but often neglect content and meaning. We propose merging the two approaches to scale narrative analysis to social media and other big data use cases. For example, in Terrified, Chris Bail uses plagiarism software to detect the repeated use of news stories presenting the views of anti-Muslim fringe groups (Bail 2016). While his was a specific use case, it points to the patterned wordplay in narrative accounts. We advocate an approach to studying the narrative content of persuasive messages that recognizes the power of narrative allusion and of narrative combinations. This approach identifies and tracks narrative references and their relationships to one another. It starts with deep content analysis to identify and catalog relevant social narratives. Unlike topic modelling, which starts with words and moves to categories, this approach starts with meaning, in this case narrative categories or families, and then moves to identification of keywords and, in the case of memes, to key images. Through a variety of computational techniques, these narrative identifiers may be applied to a wide variety of content, both textual and pictographic. These reference identifiers enable us to examine when particular narratives are being used, in what context, how they are being repeated and combined, and the latent or "core stories" invoked by families of narrative references.
Figure 1 shows the architectural build from identification of keywords to narrative references to underlying core stories.

Figure 1. Narrative Keywords, References, and Core Stories
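The keyword-to-narrative step of this architecture can be sketched computationally. The following is a minimal illustration, not the authors' actual pipeline: the narrative families and keyword lists in the codebook below are invented for demonstration, standing in for the categories a deep content analysis would produce.

```python
# Minimal sketch of keyword-based narrative tagging: a hand-built codebook
# maps narrative families to their keyword identifiers, and each text is
# tagged with every family whose keywords it contains. All families and
# keywords here are hypothetical examples, not a validated codebook.

CODEBOOK = {
    "anti-establishment": ["basket of deplorables", "bibles and guns", "drain the swamp"],
    "government betrayal": ["stab in the back", "forgotten americans", "deep state"],
    "covid conspiracy": ["plandemic", "fake virus", "crisis actors"],
}

def tag_narratives(text):
    """Return the set of narrative families whose keywords appear in text."""
    lowered = text.lower()
    return {family
            for family, keywords in CODEBOOK.items()
            if any(kw in lowered for kw in keywords)}

post = "They called us a basket of deplorables, but the deep state can't silence us."
print(sorted(tag_narratives(post)))  # ['anti-establishment', 'government betrayal']
```

In practice the matching would need stemming, phrase variants, and image identifiers for memes, but the structure is the same: meaning-first categories, then keyword identifiers applied at scale.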

Particular narratives are invoked through specific keywords. These sets of keywords or images, in whole or in part, signal the sharing of a narrative or a reference to it. For example, the keywords "basket of deplorables" refer to the narrative of Clinton's slurring Trump supporters by calling them a "basket of deplorables," while "Bibles and guns" alludes to an offhand Obama remark about disenfranchised people clinging to their Bibles and guns (Clinton 2016; Pilkington 2008). Together with other related narrative references, these narratives are often used in anti-establishment stories. Taken together with stories of COVID conspiracies and fake news, for example, they tap into a core story of "regular Americans" being betrayed by the government and a corrupt elite. Keywords may reference more than one narrative, and they are often the path through which various narratives and core stories are linked. At the same time, storytellers may combine narratives unrelated by keywords, thereby creating another pathway that connects them. Shared keywords and references as well as common combinations point to narrative adjacencies, helping to explain how one narrative may lead "down the rabbit hole" to a host of others. For example, Dawson and Weinberg (2020) examine various keyword references related to sacrifice narratives in American Rifleman, the mainstream publication of the National Rifle Association (NRA). The analysis shows the shifts in narrative-related keywords, combinations of superficially unrelated narratives, and the attendant shifts in underlying narrative meaning. In their analysis, they demonstrated how narratives of "sacrifice" for the nation, which tie to national identity and a sense of civic responsibility, were increasingly combined with narratives about heroes other than soldiers and narratives about defending freedom, particularly individual freedoms and gun rights, from politicians and others who might threaten them.
In combination, these narrative allusions promoted a message of militant patriotism untethered from either military service or civic obligation to a government that might strip individuals of their freedoms. In this way, the NRA co-opted the American civil religious narrative of soldiers' sacrificing themselves for the nation to wrap an anti-government, counter-cultural message in patriotic trappings and to foster shared collective identity among both anti-government and patriotic factions. We can see a similar move in some of the messages around quarantine, which reference patriotic narratives of defending the Constitution and Bill of Rights alongside anti-government narrative references about political leaders threatening those freedoms. Narrative networks help us understand how social media engagement algorithms may have cross-pollinated groups with adjacent narratives. Jankowicz (2020a) describes the narrative jumps she encountered in researching FB groups around the reopening of a gym in New Jersey:

In the “REOPEN NJ” group, multiple posts alleged that the video of George Floyd’s death had been created by crisis actors to instigate a civil war, to undermine President Trump, and to distract the American people. … And, their authors write, they are protesting the wrong thing anyway: Bill Gates is conducting the “real” genocide against Black children with his vaccine program, meant not only to enrich himself but to plant microchips in our bodies to track and control us. According to these groups, Gates also paid the World Health Organization to label the coronavirus a pandemic. (Jankowicz 2020a) Thus, in the example, we see narrative linkages between various COVID conspiracies with narratives related to Black Lives Matter, vaccines, child safety, and betrayal by corrupt elites.

Moreover, FB's algorithms were complicit in driving even more such material to the author and providing opportunities to join similar groups:

Within 12 hours of joining just two of these groups, Facebook "suggested" I follow a Facebook page whose owner insinuates to her nearly 170,000 followers that the Democratic Party organized the George Floyd protests as part of its election strategy. Thanks to my recent memberships, the top suggestion on Facebook's "Groups" discovery tab was "SHEEP NO MORE," a 10,000-member group with Pepe the Frog, a cartoon beloved by white supremacists, in its cover image. It shared 710 posts a day related to the QAnon conspiracy until Facebook began removing QAnon content in August. Groups that would make sense for me to join based on my 14 years of engagement on the platform are now interspersed with those about 5G conspiracies, "alternative" health remedies, and false-flag operations, thanks to my research for this story. (Jankowicz 2020a)

To hold attention or "increase engagement," social media platforms suggest and promote content (Beer 2017; Daniels 2018; Seok and DaCosta 2012) with similar keywords without attending to the underlying social significance of the stories that very different groups use them to tell. In Figure 2 we depict a theorized narrative network based on keyword references for narrative families that have been described as part of the reopen protests—COVID, gun rights, QAnon, and fake news narratives—as well as adjacent narratives of betrayal by elites and the government, including betrayal of the military, forgotten majority, anti-establishment, and anti-immigrant narratives. The figure graphically depicts the linkages between these various narratives and the pathways through which someone could quickly jump—or be lured or pushed—from one narrative set to another.
Recognizing the power of narrative allusion and combination, this approach to examining the narrative content of messages provides insight into the interrelated, underlying narratives shared by groups that may otherwise seem ideologically disconnected. It offers, for example, a theoretical narrative-based explanation for how anti-vaxxer moms were mobilized to protest alongside armed militia men, not only at reopen protests but also at QAnon rallies.

Figure 2. Theoretical Network of Narrative Adjacency Based on Keyword References
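A network of the kind theorized in Figure 2 can be derived directly from the keyword codebook: two narrative families are adjacent when their keyword sets overlap. The sketch below illustrates the construction with invented families and keywords; it is a toy of the idea, not the network depicted in the figure.

```python
# Hedged sketch of a narrative adjacency network: edges connect narrative
# families that share keyword references. Families and keyword sets are
# invented for illustration only.
from itertools import combinations

FAMILIES = {
    "covid": {"plandemic", "lockdown", "bill gates"},
    "qanon": {"the storm", "bill gates", "crisis actors"},
    "gun rights": {"second amendment", "tyranny", "lockdown"},
    "fake news": {"crisis actors", "mainstream media"},
}

def adjacency_edges(families):
    """Yield (family_a, family_b, shared_keywords) for every overlapping pair."""
    for a, b in combinations(sorted(families), 2):
        shared = families[a] & families[b]
        if shared:
            yield a, b, shared

for a, b, shared in adjacency_edges(FAMILIES):
    print(f"{a} -- {b}: {sorted(shared)}")
```

Paths through this graph model the "rabbit hole": a user engaging with one family's keywords is one shared reference away from an adjacent family, which is exactly the cross-pollination the engagement algorithms exploit.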

The WARP Framework for Understanding Narrative Weaponization for Mobilization and Radicalization

Exposed to particular narratives, not every "normie" will choose the red pill or slide down the rabbit hole. Individuals are inundated with a wide array of information and narrative content on social media, and individual variation in preferences and interests creates algorithmically influenced selection (Noble 2018; Shepherd 2020; Wachter-Boettcher n.d.) into associations and forms of media that influence the strength, saliency, and frequency of exposure. Moreover, some psychological traits may make people more persuadable than others (Kaiser 2019; Lim and Feldman 2020; Wylie 2019). Identifying the effect that exposure to particular narratives has on individual behavior and sentiment is challenging in practice. We propose the WARP framework as a tool to better understand these complex processes. The WARP framework (Figure 3) provides a heuristic for understanding the creation of meaning in message content in relation to narrative deployment (ultimately to Weaponize), identity-building projects (ultimately to Radicalize), rhetorical strategies (ultimately to Persuade), and the responses of individuals (ultimately to Activate). WARP directs attention to the narrative power and strategies in message content in service of a community's collective identity and in relation to individual responses. A framework for understanding the role of narrative in mobilization and radicalization, WARP builds from Braddock's definition of radicalization as "an incremental social and psychological process prompted by and inextricably bound in communication, whereby an individual develops an increased commitment to an extremist ideology resulting in assimilation of the beliefs and attitudes consistent with that ideology" (2015:3, emphasis in original).
WARP goes further in its conceptualization, moving between the psychological and the social, and considers not just the individual's movement into a group but the group's drawing of boundaries around the individuals it recruits. These boundaries then construct the collective identity through which individuals, once inducted, filter and interpret the narratives presented to them. We can conceptualize messengers, like Fox News, MSNBC, or the Russian Internet Research Agency, or even social movements, as seeking through strategic deployment and presentation of narrative incrementally to increase commitment to a group identity grounded in ideology. This building of collective identity is an incremental, iterative, and possibly non-linear process that has similarities to the tactics used by predators to lure victims (Argomaniz and Lynch 2018; Kruglanski et al. 2014).

Weaponize. We can imagine the weaponized functions of narratives deployed in service of ideological messaging and identity building: to "woo" audiences, to "wake" them to particular issues, to "win" them to a particular side, to "worry" them about threats, and to "wed" their interests to those of the broader group. Messengers also may "weaponize" narratives against the opposition, for example, using hate speech. Starbird's (2017) study of alternative narratives around mass shootings concludes that alternative news sources provided a host of conspiracy theories that then proliferated through Twitter networks: "We also detected strong political agendas underlying many of these stories and the domains that hosted them, … with the conspiracy theories serving a secondary purpose of attracting an audience and reflecting or forwarding that agenda" (Starbird 2017:9). Although not the focus of her study, the research is suggestive of several aspects of weaponized narrative

deployment in service of a variety of ideologies connected through narrative. Moreover, her focus both on alternative news sources and on individuals using those sources to construct and proliferate narratives on Twitter suggests weaponized narrative may be the domain not only of organizations, communities, and groups but of freelancing individuals who often gain financial incentives for increasing traffic to their websites (Zuboff 2019). Thus, we see the potential for what we term a "gig economy" of narrative deployment and weaponization, in which individuals take it on themselves to create and disseminate narratives reflecting their own collective identities and ideologies.

Activate. On the receiving side, individuals become incrementally activated by narrative messaging and may exhibit a variety of behaviors that outwardly signal the level of their identification with the group through their interaction with messaged content. They might "attend" to certain types of content or messages from certain groups and even begin to seek them out; "affirm" messaging by showing affinity for ideas related to the group, for example by "liking" or marking content with positive emojis; they might "amplify," spreading narrative messages by sharing messages and statements from others, for example by retweeting on Twitter or by upvoting posts on Reddit; they might "agree" through comments or statements supporting messages. One step beyond agreement is to "argue," speaking out against views that contradict the group in defense of group ideology. Argument potentially indicates response to a perceived identity threat and is more likely to represent action outside the circle of "true believers," where one is more likely to invite disagreement and criticism. Finally, individuals might be influenced to "act" offline in support of group goals.
Through their responses, individuals may not only engage in offline behavior in service of group ideology, but they may also become online ambassadors, spreading the group’s messages to others.

Radicalize. The group identity-building projects at the heart of weaponization and activation are aimed at drawing and enforcing group boundaries and developing a salient collective identity. Groups may seek to “resonate” with individuals through emotionally resonant messaging, to “recruit” members, to “reward” them for their participation or attention, to “redirect” and define their interests in favor of the collective’s, and to “reinforce” or sometimes police the boundaries between “us” and “them,” or between in-groups and out-groups (Braddock 2015). Reinforcing may also involve drawing individuals away from other relationships and severing ties that might pull them out of the group, much as has been commonly observed of cults. In addition, in service of collective identity, groups might seek to construct and emphasize the “risk” or threat of annihilation from “them” (Braddock 2020a) and, in extreme cases, ultimately to “radicalize” individuals by building a collective identity salient enough that individuals are willing to engage in violence in service of the group.

Figure 3. The WARP Framework

Persuade. In order to persuade individuals, messengers employ an array of narrative rhetorical strategies, including to “pyramid,” stacking related narratives for greater emotional and messaging power; to “piggyback,” combining seemingly unrelated references to connect stories or to point to new meanings of old stories (Dawson and Weinberg 2020); to “prejudice,” making negative reference to an out-group or known set of villains (Guhl and Davey 2020); to “promote,” praising extremist ideologues and key figures as heroes (Guhl and Davey 2020); to “plagiarize,” through repeated use of particular narratives using the same evocative terminology (Bail 2016); to “personalize,” using narrative references in combination with personal pronouns (“I,” “we,” “me,” “us”); and to offer “proof,” pointing to seemingly official or trusted sources as sharing the narrative. Messengers might also “privilege” in-group members by sharing references to which only they would be privy, for example, “dog whistles”—references that broadcast loudly to an in-the-know portion of an audience but are subtle enough not to register with others, providing the ability to speak to two audiences at once (Albertson 2006; Bonikowski and Zhang 2020; Dawson and Weinberg 2020). We can also detect strategies for narrative inflation or escalation (Smith 2010), including to “preach,” using religious or civil religious language and narratives, for example invoking “good and evil” or patriotism (Dawson and Weinberg 2020); to “prosecute,” focusing on violent acts committed by out-groups to dehumanize them and justify further violence (Braddock 2015; Guhl and Davey 2020); and to “protect,” referencing the safety, security, or need to protect something vulnerable or sacred (Dawson and Weinberg 2020), through violence if necessary.

Examining social media content and interactions is not enough in itself to fully understand the process of mobilization and radicalization through narrative or the risk factors that make certain people more susceptible to it than others. Individuals have various experiences of exposure to particular content: many shrug it off, while others seek more and ultimately act on it. Some are alone in their positive or negative reactions to it, while others are in community with like-minded people. Some are in contact with the “others” who are the focus of a group’s ire or hatred, while others are entirely segregated and isolated from them. Some people are more easily stirred to high emotion and fear (Wylie 2019), while others are slower to ignite. Very little of this variation is observable in the social media environment, despite some evidence of a networked emotional contagion effect (Kramer, Guillory, and Hancock 2014).
Thus, other types of research and research methods are recommended for a full elaboration of the WARP process. We illustrate aspects of the WARP framework using tweets from Twitter’s comprehensive digital archive of state-backed information operations (Twitter Transparency Center 2020), drawing from its collection of Russian Internet Research Agency (IRA) tweets posted in English between 2009 and May 2020. The tweets from this archive are useful for illustrating WARP, and we pulled examples with narrative references invoking betrayal by the government and elites. The narratives on Twitter are likely to be more mainstream and palatable than those on sites that have become havens for deplatformed extremists (Donovan and Boyd 2018). Thus, while we can identify examples of the extreme positions for weaponize, radicalize, and persuade, the messages are not as strident as those that might be found on other platforms with less or looser monitoring. In addition, the nature of the data does not allow for investigation of individual responses to show a person’s engagement or offline activation. Finally, because English-language tweets in the accounts identified by Twitter peaked around the 2016 election and steadily declined with each new round of crackdowns on these types of accounts, the examples in Table 1 predate the flurry of narrative exchange around quarantine and state shutdown policies. Yet many point to a long seeding of the ground prior to the polarization around this particular issue.

Table 1. Examples of WARP in IRA Tweets Across a Variety of Narratives

WEAPONIZE: NARRATIVE DEPLOYMENT

Woo: We are proud and no one will tell us what to do, especially stupid ignorant politician! #instotus #patriots #patriot https://t.co/H9OwB8LhP4

Wake: RT @HealthRanger: Vaccine-pushers challenged to DRINK MERCURY if it's so "safe" https://t.co/gGIE2FCxnL #antivax #mercury

Win: Militia's built this country. Militia's will win it back

Worry: A terrorist cell in America trained a man who killed 17 children yesterday. We don't have immigrant problem. We have WHITE SUPREMACY problem https://t.co/M2mHiSNPba

Wed: Every politicians who supported Trump and every politician who is funded by NRA. Not surprising these two categories of politicians are mostly the same people #ThingsIdLikeToSeeImpeached

Weaponize: WebSite Caught Selling Ultra-Concealable Knives 4 Slicing Up Trump Supporters At Rallies Antifa, come to Texas, knife to a gun fight https://t.co/xW8fVzn9lZ

RADICALIZE: IDENTITY BUILDING

Resonate: Memorial Day isn't about alcohol and BBQ, it's about the soldiers that died to protect our freedoms. #MemorialDayWeekend.

Recruit: I would rather take care of TEN homeless US Veterans, than 50,000 migrants/illegal aliens.. How About You? https://t.co/Fa3H2Krxix

Reward: For people who support our troops, who love this country & who appreciate the sacrifices of our Vets, thank you! https://t.co/FDHFkzbAoU

Redirect: Shameful! Obama Spends $65,000 On Each illegal alien, who hate us. And Nothing On Vets, Who Fought for America.

Reinforce: This is why so many are taking the knee. Cops are not thinking about country, flag or National Anthem when they are abusing their power. https://t.co/aV7TpdA5lS

Risk: In today's news: - Neo-Nazi wins Illinois GOP primary. - White supremacist murders black folks in Austin with mail bombs. - Unarmed black man Stephon Clark shot & killed by police in his own backyard. This is Trump's America.

Radicalize: RT @mgaudy1980: #Qanon Never give up your guns. Fight! Fight! Fight! because once we lose them we are just the walking dead https://t.co/Gy

PERSUADE: RHETORICAL STRATEGIES

Pyramid: #MAGA hats should be placed right next to Nazi flags as symbols of fascism and white supremacy!

Piggyback: DEMOCRATS: 1. They want to disarm our citizens. 2. They lie to minorities. 3. They steal our healthcare. 4. They tax our citizens into poverty. 5. They spit on our troops. 6. They hate our President. Democrats are the party of illegals and obstructionists

Prejudice: The same people who pushed the Pizzagate conspiracy now justifying Roy Moore molesting a 14 yo girl. What a bunch of lying, hypocritical scumbags.

Promote: White patriots protect #Ferguson with rifles while black thugs are arrested. Fixed it. God Bless the #OATHKEEPERS http://t.co/PTRn4fPYOq

Plagiarize: Trump and Republicans see a 'deep state' foe: https://t.co/YgGxBzbq5Y Trump and Republicans see a 'deep state' foe: Barack Obama https://t.co/c7wKp91VvD https://t.co/JBXGWKI2Li (Tweets, not flagged as retweets, from separate profiles and with different link URLs)

Personalize: I voted for Trump because I knew he's the only man who could save America from liberal degeneracy. I'm still sure that I made a right choice

Privilege: Caravan of ILLEGALS is climbing over the border fence as I'm typing this post. Where is the National Guard & Border Patrol? Boot those invaders right back into Mexico. This is ridiculous how Soros funded Libtard SJW groups helped them get here unabated!! https://t.co/dlBILfVeL0

Proof: RT @HealthRanger: Flashback: UK government admits swine flu #vaccine causes brain damage, awards compensation to 60 families https://t.co/n…

Preach: We believe that every American should stand for the National Anthem, and we proudly pledge allegiance to one NATION UNDER GOD! https://t.co/H1ZSTBahTs https://t.co/fOLKsfPY1g

Prosecute: Over 30% of murders in the United States were committed by illegal aliens. #BeingPatriotic #stopviolence #USA https://t.co/c8DXIGAfoP

Protect: RT @laurie6805: #PizzaGate #PedoFiles Do NOT let this story die out. Our Children are depending on you

The Russians targeted both left- and right-leaning audiences. It is as yet unclear whether they influenced polarization or merely targeted already polarized audiences to foment further unrest (Bail et al. 2019). During the 2016 election, for example, the Internet Research Agency (IRA) pretended to be right-leaning accounts such as the Tennessee GOP (@ten_GOP) or left-leaning accounts such as Blacktivist (@black_matters) (United States Senate Intelligence Committee 2020). This technique, known as astroturfing, creates a false consensus effect that makes messaging look like a grassroots effort rather than outside-sponsored content (Al-Rawi and Rahman 2020; Engel et al. 2019; Orr 2018). These accounts and others pretending to be Americans amplified existing social divisions along racial and political party lines; for example, the fake IRA accounts engaged strongly on opposite sides of the polarized discussion around Black Lives Matter (Arif, Stewart, and Starbird 2018). Research demonstrates that Twitter users who engaged regularly with the fake IRA accounts showed greater changes in their behavior over time (Dutta et al. 2020).

In the Weaponize and Radicalize sections of Table 1, the presentation of messages with narrative content related to betrayal by the government and elites seems to become increasingly intense with each successive narrative maneuver. While the Persuade category includes more and less intense rhetorical strategies, it operates more like a toolbox of persuasive strategies and does not have a rank order; it is common to see messages use more than one rhetorical strategy at a time. In the Weaponize portion of Table 1, narratives are deployed to appeal to an audience, in this case to their sense of pride and power, seeking to hook their interest (woo) and then move through informing them about issues (wake), in this case the danger of vaccines pushed by suspect experts, to inviting support for the people and ideas on the right side of issues (win), in this case support for militias. Then, much as we see in our narrative network example (Figure 2), they tie issues together, for example, energizing an anti-Trump audience to also be anti-NRA (wed). Finally, they weaponize narratives, in this case preparing the audience to bring guns to fight the evildoers from Antifa who are planning to threaten them with knives.

Similarly, the Radicalize examples show progressively aggressive identity-building projects, although they are not representative of maneuvers from one particular group. They start with a message about Memorial Day, using narrative references simultaneously to call upon the audience’s patriotism to remember fallen heroes as well as on resentments about civilians’ disrespect for soldiers’ sacrifice in Vietnam and the more recent anger and frustration of Iraq and Afghanistan veterans, who frequently quipped that “America wasn’t at war; it was at the mall” (Kandra 2006). The next step, recruit, plays even more explicitly on these sensibilities, inviting the audience actively to consider or even show their interest in and concern for veterans, and then there’s the reward—a thank you to the troops and those who support them—followed by a message creating a boundary around the group—we, Americans who support the military, as well as the vets who fought for America. In the next maneuver, the narrative reference serves to reinforce group boundaries by pitting us, the patriots who in this case support kneeling (a reference to the narratives about NFL players’ protests), against the unpatriotic police who trample the Constitution.
We can also see in this example how some of the same narrative families around government betrayal and individual rights that fueled the FB reopen groups could also be used to mobilize Black Lives Matter protests, connecting to yet another narrative set. In the same vein, the Russian tweets used narrative to emphasize the risk to “black folks” from the government. Switching to identity building for another group, the Russians urge the audience to “fight, fight, fight” for the group, in this case supporters of QAnon and gun rights.

It is tempting to imagine a nearly linear relationship wherein a group tries to “slip the red pill” to a new recruit and then deploys narratives to lead them step by step from recruitment to radicalization. On the narrative deployment side, this process would lead progressively from wooing to weaponizing, perhaps with different narratives invoked at each progressive step. These narrative deployments would lead recruits through the identity-building cycle from recruitment to radicalization and utilize a variety of narrative strategies along with progressively more explicit narratives. To the extent these efforts were successful, the new recruits would move from paying attention to content to acting on it. However, the reality may be less straightforward. To the extent that the narrative-focused radicalization process works in a linear way—if it actually does—such linearity may be difficult both to execute and to observe. On Twitter, Reddit, Facebook, and other social media platforms that utilize group posting, various amounts of narrative content are immediately available, not doled out in strategic dribs and drabs. An individual may be repeatedly exposed to content before engaging with it or can find it as soon as engagement begins and gain full exposure without wading in.
Moreover, even though we seem to build communities on these social platforms, the data available to researchers does not show one-on-one communication as in the chat transcripts used in previous work on predation (Edwards, Bradshaw, and Hinsz 2014; Edwards, Edwards, and Martin 2020; Winters and Jeglic 2017).

The online process may have adapted to the constraints of social media platforms and may represent an accumulation and critical mass of messages and responses rather than a step-by-step progression. For example, as in Wylie’s discussion of Fox News, messengers may begin with worrying or weaponized content, creating an identity-building hook from the outset, and then round out the process in ways that deepen individual engagement and commitment to the cause through repeated exposure to various types of narrative deployment (Berry and Sobieraj 2016; Wylie 2019). The COVID pandemic has created a perfect opportunity for foreign influence operations to capitalize on all of the groundwork they laid previously to amplify already existing social divisions, helping to exacerbate destabilizing unrest across the United States. We contend that any polarizing and mobilizing results the IRA and other groups have achieved in their influence operations stem largely from their leveraging of narratives related to poignant social divisions in American society and from their use of the strategic narrative deployment, identity building, and rhetorical maneuvers described in the WARP framework.

Conclusion Disinformation, misinformation, and other types of influence campaigns have become easier with the internet, which allows unlimited broadcasting of social media messages targeted to specific audiences (Cambridge Analytica 2015; Kaiser 2019; Wylie 2019; Zuboff 2020) and designed to sow chaos and enhance existing discord (116th Congress 2017; Froehlich 2019; Galante and Ee 2014). Moreover, social media platforms provide opportunities to target specific individuals at scale based on detailed individual-level information, as has been demonstrated by several political campaigns (Gerber et al. 2011; Issenberg 2013; Wylie 2019). The economics of holding attention in order to sell ads creates a reward-for-outrage loop designed to keep audiences signed on and tuned in. Technological advances in machine learning and artificial intelligence, combined with the for-profit trade in our privacy and attention, have increased the potential visibility of all kinds of groups, providing them with unprecedented opportunities to reach new audiences.

What makes groups effective at spreading their messages to initial non-believers and mobilizing audiences to action? In this paper, we have argued that social media influence operations, regardless of whether they are foreign or domestic, benevolent or malign, leverage and at times weaponize the emotional and persuasive power of narrative, especially around social issues, to influence individuals and mobilize them to action. Moreover, the underlying core stories shared by popular narratives provide pathways through which individuals engage in a wider network of narrative-based messaging, providing linkages between seemingly disparate communities.
Drawing from a well-established literature in cultural sociology on the role of narrative, narrative form, and narrative strategies, we advocate an approach to studying message content that recognizes the allusive, meaning-laden power of narrative and attends to narrative invocations and combinations. This approach can be used across instances of narrative references and across platforms, allowing a world of analytic possibilities, from archival analysis to big-data text mining, that can be deployed for a variety of use cases. Moreover, examining the interrelationship of narrative content provides a means of understanding and tracking the pathways that bridge seemingly disparate ideological messages and communities, such as anti-vaxxer moms and militia men. This content-focused approach complements the more commonly seen computational approaches to studying social media influence, which focus on actors and the spread of messages but spare little attention, beyond the use of topic modeling, to content and its specific social and personal significance.

In addition to alluding to other stories, narratives are also allusive in their conveyance of meaning. The storyteller’s intended meaning is often related to the identity and collective action projects of groups and social movements, and its interpretation and reception depend upon one’s sense of collective identity. Combining these insights on narrative with work on radicalization, we have developed the WARP framework, which points to the way narratives are deployed in service of identity and collective action projects, the narrative strategies used in service of these aims, and variations in individual response and susceptibility to influence operations. The WARP framework stands to provide insight into the way different groups deploy and weaponize narratives, use them in service of building collective identity, and engage rhetorical strategies to enhance narrative, while also considering individuals’ engagement with and responses to narrative projects that may lead to and further feed activation and commitment by themselves and others.

Current events suggest the success of various influence operations in energizing conspiracy theories, exacerbating social divisions, mobilizing protest, and even promoting violence. The time is ripe to study narrative as a weapon of influence and the process through which people become engaged with divisive content to the extent that they are mobilized and even moved to violent action. Such study is a necessary step toward developing effective defensive and offensive measures to combat malign influence operations. It may also be the key to developing effective strategies to heal social polarization and bring our nation together again.

WORKS CITED

116th Congress. 2017. Report of the Select Committee on Intelligence United States Senate on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election. Senate Report. 116–XX. Washington, D.C.: United States Senate Intelligence Committee.

Albertson, Bethany. 2006. “Dog-Whistle Politics, Coded Communication and Religious Appeals.” American Political Science Association and International Society of Political Psychology 38.

Alexander, Jeffrey C., and Philip Smith. 1993. “The Discourse of American Civil Society: A New Proposal for Cultural Studies.” Theory and Society 22(2):151–207. doi: 10.1007/BF00993497.

Al-Rawi, Ahmed, and Anis Rahman. 2020. “Manufacturing Rage: The Russian Internet Research Agency’s Political Astroturfing on Social Media.” First Monday, September.

Argomaniz, Javier, and Orla Lynch. 2018. “Introduction to the Special Issue: The Complexity of Terrorism—Victims, Perpetrators and Radicalization.” Studies in Conflict & Terrorism 41(7):491–506. doi: 10.1080/1057610X.2017.1311101.

Arif, Ahmer, Leo Graiden Stewart, and Kate Starbird. 2018. “Acting the Part: Examining Information Operations within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction. doi: 10.1145/3274289.

Bagge, Daniel. 2019. Unmasking Maskirovka: Russia’s Cyber Influence Operations. Defense Press.

Bail, Christopher A. 2016. Terrified: How Anti-Muslim Fringe Organizations Became Mainstream. Princeton University Press.

Bail, Christopher A., Brian Guay, Emily Maloney, Aidan Combs, D. Sunshine Hillygus, Friedolin Merhout, Deen Freelon, and Alexander Volfovsky. 2019. “Assessing the Russian Internet Research Agency’s Impact on the Political Attitudes and Behaviors of American Twitter Users in Late 2017.” Proceedings of the National Academy of Sciences 201906420. doi: 10.1073/pnas.1906420116.

Beer, David. 2017. “The Social Power of Algorithms.” Information, Communication & Society 20(1):1–13. doi: 10.1080/1369118X.2016.1216147.

Belew, Kathleen. 2018. Bring the War Home: The White Power Movement and Paramilitary America. Cambridge, Mass: Harvard University Press.

Benkler, Yochai, Robert Faris, and Hal Roberts. 2018. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. New York, NY: Oxford University Press.

Benson, Thor. 2020. “Trolls and Bots Are Flooding Social Media with Disinformation Encouraging States to End Quarantine.” Business Insider, April 24.

Berry, Jeffrey M., and Sarah Sobieraj. 2016. The Outrage Industry: Political Opinion Media and the New Incivility. Reprint Edition. Oxford: Oxford University Press.

Beskow, David M., and Kathleen M. Carley. 2019. “Social Cybersecurity: An Emerging National Security Requirement.” Military Review (March-April).

Bolger, Daniel. 2014. Why We Lost: A General’s Inside Account of the Iraq and Afghanistan Wars. First edition. Boston: Eamon Dolan/Houghton Mifflin Harcourt.

Bonikowski, Bart, and Yueran Zhang. 2020. Populism as Dog-Whistle Politics: Anti-Elite Discourse and Sentiments toward Out-Groups. SocArXiv preprint.

Braddock, Kurt. 2015. “The Utility of Narratives for Promoting Radicalization: The Case of the Animal Liberation Front.” Dynamics of Asymmetric Conflict 8(1):38–59. doi: 10.1080/17467586.2014.968794.

Braddock, Kurt. 2020a. “Narrative Persuasion and Violent Extremism: Foundations and Implications.” Pp. 527–38 in Propaganda und Prävention, edited by J. B. Schmitt, J. Ernst, D. Rieger, and H.-J. Roth. Wiesbaden: Springer Fachmedien Wiesbaden.

Braddock, Kurt. 2020b. Weaponized Words: The Strategic Role of Persuasion in Violent Radicalization and Counter-Radicalization. Cambridge University Press.

Cadwalladr, Carole, and Emma Graham-Harrison. 2018. “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach.” The Guardian, March 17.

Cambridge Analytica. 2015. “Key Case Studies.”

Chamberlain, P. R. 2020. “Twitter as a Vector for Disinformation.” 8.

Channel 4 News Investigations Team. 2020. “Revealed: Trump Campaign Strategy to Deter Millions of Black Americans from Voting in 2016.” Channel 4 News. Retrieved October 9, 2020 (https://www.channel4.com/news/revealed-trump-campaign-strategy-to-deter-millions-of-black-americans-from-voting-in-2016).

Clinton, Hillary. 2016. “Read Hillary Clinton’s ‘Basket of Deplorables’ Remarks on Trump Supporters.” Time. Retrieved December 29, 2019 (https://time.com/4486502/hillary-clinton-basket-of-deplorables-transcript/).

Daniels, Jessie. 2018. “The Algorithmic Rise of the ‘Alt-Right.’” Contexts 17(1):60–65. doi: 10.1177/1536504218766547.

Dawson, Jessica. 2016. “Errors in Judgement: How Status, Values, and Moral Foundations Influence Moral Judgments of Guilt and Punishment.”

Dawson, Jessica. 2019. “Shall Not Be Infringed: How the NRA Used Religious Language to Transform the Meaning of the Second Amendment.” Palgrave Communications 5(1):58. doi: 10.1057/s41599-019-0276-z.

Dawson, Jessica, and DB Weinberg. 2020. “These Honored Dead: Sacrifice Narratives in the NRA’s American Rifleman Magazine.” American Journal of Cultural Sociology. doi: https://doi.org/10.1057/s41290-020-00114-x.

Department of Homeland Security. 2009. Rightwing Extremism: Current Economic and Political Climate Fueling Resurgence in Radicalization and Recruitment. Department of Homeland Security.

Department of State. 1987. Soviet Influence Activities: A Report on Active Measures and Propaganda, 1986-87. Department of State Publication 9627. United States Department of State.

Donovan, Joan, and Danah Boyd. 2018. “The Case for Quarantining Extremist Ideas.” The Guardian. Retrieved October 8, 2020 (https://amp.theguardian.com/commentisfree/2018/jun/01/extremist-ideas-media-coverage-kkk).

Dutta, Upasana, Rhett Hanscom, Jason Shuo Zhang, Richard Han, Tamara Lehman, Qin Lv, and Shivakant Mishra. 2020. “Analyzing Twitter Users’ Behavior Before and After Contact by the Internet Research Agency.” ArXiv:2008.01273 [Cs].

Edwards, April, Lynne Edwards, and Alec Martin. 2020. “Cyberbullying Perceptions and Experiences in Diverse Youth.” Pp. 9–16 in International Conference on Applied Human Factors and Ergonomics. Springer.

Edwards, Sarah R., Kathryn A. Bradshaw, and Verlin B. Hinsz. 2014. “Denying Rape but Endorsing Forceful Intercourse: Exploring Differences Among Responders.” Violence and Gender 1(4):188–193.

Engel, Eliot L., Brad Sherman, Gregory W. Meeks, Albio Sires, Gerald E. Connolly, Theodore E. Deutch, Karen Bass, William Keating, David Cicilline, Ami Bera, Joaquin Castro, Dina Titus, Adriano Espaillat, Ted Lieu, Susan Wild, Dean Phillips, Ilhan Omar, Colin Allred, Andy Levin, Abigail Spanberger, Chrissy Houlahan, Tom Malinowski, David Trone, Jim Costa, Juan Vargas, Vicente Gonzalez, Michael T. McCaul, Christopher H. Smith, Steve Chabot, Joe Wilson, Scott Perry, Ted S. Yoho, Adam Kinzinger, Lee Zeldin, Jim Sensenbrenner, Ann Wagner, Brian Mast, Francis Rooney, Brian Fitzpatrick, John Curtis, Ken Buck, Ron Wright, Guy Reschenthaler, Tim Burchett, Greg Pence, Steve Watkins, and Mike Guest. 2019. Undermining Democracy: Kremlin Tools of Malign Political Influence. Hearing. No. 116-41. Washington, D.C.: House of Representatives Committee on Foreign Affairs.

Farrell, Tracie, Oscar Araque, Miriam Fernandez, and Harith Alani. 2020. “On the Use of Jargon and Word Embeddings to Explore Subculture within the Reddit’s Manosphere.” Pp. 221–30 in 12th ACM Conference on Web Science. Southampton: ACM.

Finkelstein, Joel, John K. Donohue, Alex Goldenberg, Jason Baumgartner, John Farmer, Savvas Zannettou, and Jeremy Blackburn. 2020. “COVID-19, Conspiracy and Contagious Sedition.” 15.

Flick, Catherine. 2016. “Informed Consent and the Facebook Emotional Manipulation Study.” Research Ethics 12(1):14–28. doi: 10.1177/1747016115599568.

Forbruker Radet. 2020. Out of Control: How Consumers Are Exploited by the Online Advertising Industry. Oslo: Norwegian Consumer Council.

Froehlich, Thomas. 2019. “Ten Lessons for the Age of Disinformation.”

Galante, Laura, and Shaun Ee. 2014. “Defining Russian Election Interference:” 20.

Gerber, Alan S., Gregory A. Huber, David Doherty, and Conor M. Dowling. 2011. “The Big Five Personality Traits in the Political Arena.” Annual Review of Political Science 14(1):265–87. doi: 10.1146/annurev-polisci-051010-111659.

Gibson, James William. 1989. “American Paramilitary Culture and the Reconstitution of the Vietnam War.” Pp. 10–42 in Vietnam Images: War and Representation, edited by J. Walsh and J. Aulich. London: Palgrave Macmillan UK.

Gibson, James William. 1994. Warrior Dreams: Violence and Manhood in Post-Vietnam America. Reprint edition. New York: Hill & Wang Pub.

Gioe, David V. 2018. “Cyber Operations and Useful Fools: The Approach of Russian Hybrid Intelligence.” Intelligence and National Security 33(7):954–73. doi: 10.1080/02684527.2018.1479345.

Gioe, David V., Richard Lovering, and Tyler Pachesny. 2020. “The Soviet Legacy of Russian Active Measures: New Vodka from Old Stills?” International Journal of Intelligence and CounterIntelligence 1–26. doi: 10.1080/08850607.2020.1725364.

Goldenberg, Alex, and Joel Finkelstein. 2020. Cyber Swarming, Memetic Warfare and Viral Insurgency: How Domestic Militants Organize on Memes to Incite Violent Insurrection and Terror Against Government and Law Enforcement. Princeton, N.J.: Network Contagion Research Institute.

Greenberg, Andy. 2019. Sandworm: A New Era of Cyberwar and the Hunt for the Kremlin’s Most Dangerous Hackers. Doubleday.

Guhl, Jakob, and Jacob Davey. 2020. “A Safe Space to Hate: White Supremacist Mobilisation on Telegram.” ISD. Retrieved September 1, 2020 (https://www.isdglobal.org/isd-publications/a-safe-space-to-hate-white-supremacist-mobilisation-on- telegram/).

Hitlin, Steven, and Stephen Vaisey. 2013. “The New Sociology of Morality.” Annual Review of Sociology 39(1):51–68. doi: 10.1146/annurev-soc-071312-145628.

Hochschild, Arlie Russell. 2016. Strangers in Their Own Land: Anger and Mourning on the American Right. New York: The New Press.

Issenberg, Sasha. 2013. The Victory Lab: The Secret Science of Winning Campaigns. Reprint edition. New York: Broadway Books.

Jankowicz, Nina. 2020a. “How an Anti-Shutdown Celebrity Is Made.” The Atlantic, October 3.

Jankowicz, Nina. 2020b. How to Lose the Information War: Russia, Fake News, and the Future of Conflict. I.B. Tauris.

Kaiser, Brittany. 2019. Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again. Harper.

Kandra, Greg. 2006. “We’re at War; America’s at the Mall.” CBS Evening News. Retrieved October 18, 2020 (https://www.cbsnews.com/news/were-at-war-americas-at-the-mall/).

Kasinitz, Philip, and David Hillyard. 1995. “The Old-Timers’ Tale: The Politics of Nostalgia on the Waterfront.” Journal of Contemporary Ethnography 24(2):139–64.

Kramer, A. D. I., J. E. Guillory, and J. T. Hancock. 2014. “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks.” Proceedings of the National Academy of Sciences 111(24):8788–90. doi: 10.1073/pnas.1320040111.

Kramer, Adam D. I. 2012. “The Spread of Emotion via Facebook.” P. 767 in Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems - CHI ’12. Austin, Texas, USA: ACM Press.

Kruglanski, Arie W., Michele J. Gelfand, Jocelyn J. Bélanger, Anna Sheveland, Malkanthi Hetiarachchi, and Rohan Gunaratna. 2014. “The Psychology of Radicalization and Deradicalization: How Significance Quest Impacts Violent Extremism: Processes of Radicalization and Deradicalization.” Political Psychology 35:69–93. doi: 10.1111/pops.12163.

Lamont, Michèle. 2019. “From ‘Having’ to ‘Being’: Self-worth and the Current Crisis of American Society.” The British Journal of Sociology 70(3):660–707. doi: 10.1111/1468-4446.12667.

Lim, Velvetina, and Gilad Feldman. 2020. “Values and the Dark Side: Meta-Analysis of Links between Dark Triad Traits and Personal Values.”

Maly, Michael, Heather Dalmage, and Nancy Michaels. 2013. “The End of an Idyllic World: Nostalgia Narratives, Race, and the Construction of White Powerlessness.” Critical Sociology 39(5):757–79. doi: 10.1177/0896920512448941.

Matz, S. C., M. Kosinski, G. Nave, and D. J. Stillwell. 2017. “Psychological Targeting as an Effective Approach to Digital Mass Persuasion.” Proceedings of the National Academy of Sciences 114(48):12714–19. doi: 10.1073/pnas.1710966114.

McDonald, Terrence, Christopher Bail, and Iddo Tavory. 2017. “A Theory of Resonance.” Sociological Theory 35(1):1–14.

Michael, George. 2019. “Useful Idiots or Fellow Travelers? The Relationship between the American Far Right and Russia.” Terrorism and Political Violence 31(1):64–83. doi: 10.1080/09546553.2018.1555996.

Nagle, Angela. 2017. Kill All Normies: Online Culture Wars from 4chan and Tumblr to Trump and the Alt-Right. Winchester, UK; Washington, USA: Zero Books.

Noble, Safiya. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. 1 edition. New York: NYU Press.

Ocejo, Richard E. 2011. “The Early Gentrifier: Weaving a Nostalgia Narrative on the Lower East Side.” City & Community 10(3):285–310. doi: 10.1111/j.1540-6040.2011.01372.x.

Orr, Caroline. 2018. “Pro-Trump & Russian-Linked Twitter Accounts Are Posing As Ex-Democrats In New Astroturfed Movement.” Medium. Retrieved June 3, 2020 (https://arcdigital.media/pro-trump-russian-linked-twitter-accounts-are-posing-as-ex-democrats-in-new-astroturfed-movement-20359c1906d3).

Perry, Samuel L., Andrew L. Whitehead, and Joshua B. Grubbs. 2020. “Culture Wars and COVID-19 Conduct: Christian Nationalism, Religiosity, and Americans’ Behavior During the Coronavirus Pandemic.” Journal for the Scientific Study of Religion jssr.12677. doi: 10.1111/jssr.12677.

Pilkington, Ed. 2008. “Obama Angers Midwest Voters with Guns and Religion Remark.” The Guardian, April 14.

Polletta, Francesca. 1998. “‘It Was Like A Fever . . .’ Narrative and Identity in Social Protest.” Social Problems 45(2):137–59.

Polletta, Francesca. 2008. “Storytelling in Politics.” Contexts 7(4):26–31. doi: 10.1525/ctx.2008.7.4.26.

Polletta, Francesca. 2009. It Was Like a Fever: Storytelling in Protest and Politics. University of Chicago Press.

Polletta, Francesca, and Jessica Callahan. 2017. “Deep Stories, Nostalgia Narratives, and Fake News: Storytelling in the Trump Era.” American Journal of Cultural Sociology 5(3):392–408. doi: 10.1057/s41290-017-0037-7.

Polletta, Francesca, and Pang Ching Bobby Chen. 2012. “Narrative and Social Movements.” in The Oxford Handbook of Cultural Sociology, edited by J. C. Alexander, R. Jacobs, and P. Smith. Oxford University Press.

Pomerantsev, Peter. 2019. This Is Not Propaganda: Adventures in the War against Reality.

Rid, Thomas. 2020. Active Measures: The Secret History of Disinformation and Political Warfare. Illustrated Edition. New York: Farrar, Straus and Giroux.

Rodman, Rosamond C. 2019. “Scripturalizing and the Second Amendment.” in The Bible and Feminism: Remapping the Field, edited by Y. Sherwood. Oxford University Press.

Pollock, Cassandra, and Alex Samuels. 2018. “Hysteria over Jade Helm Exercise in Texas Was Fueled by Russians, Former CIA Director Says.” The Texas Tribune. Retrieved March 1, 2020 (https://www.texastribune.org/2018/05/03/hysteria-over-jade-helm-exercise-texas-was-fueled-russians-former-cia-/).

Schudson, Michael. 1986. Advertising, The Uneasy Persuasion: Its Dubious Impact On American Society. Reprint edition. New York: Basic Books.

Schulte, Grant, and David Eggert. 2020. “Michigan Governor Says Trump’s Words Inspire Extremists as 13 Charged in Kidnapping Attempt.” ABC7 New York. Retrieved October 9, 2020 (https://abc7ny.com/6888314/).

Seok, Soonhwa, and Boaventura DaCosta. 2012. “The World’s Most Intense Online Gaming Culture: Addiction and High-Engagement Prevalence Rates among South Korean Adolescents and Young Adults.” Computers in Human Behavior 28(6):2143–51. doi: 10.1016/j.chb.2012.06.019.

Shepherd, Ryan P. 2020. “Gaming Reddit’s Algorithm: R/The_donald, Amplification, and the Rhetoric of Sorting.” Computers and Composition 56:102572. doi: 10.1016/j.compcom.2020.102572.

Smith, Philip. 2010. Why War?: The Cultural Logic of Iraq, the Gulf War, and Suez. 1 edition. University of Chicago Press.

Soriano, Jacopo, Timothy Au, and David Banks. 2013. “Text Mining in Computational Advertising.” Statistical Analysis and Data Mining 6(4):273–85. doi: 10.1002/sam.11197.

Stanley-Becker, Isaac, and Tony Romm. 2020. “Pro-Gun Activists Using Facebook Groups to Push Anti-Quarantine Protests.” Washington Post, April 19.

Starbird, Kate. 2017. “Examining the Alternative Media Ecosystem through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” in Proceedings of the Eleventh International AAAI Conference on Web and Social Media (ICWSM). Montreal, Canada.

Thomas, Elise, and Albert Zhang. 2020. “ID2020, Bill Gates and the Mark of the Beast: How Covid-19 Catalyses Existing Online Conspiracy Movements.” 21.

Timberg, Craig, Elizabeth Dwoskin, and Moriah Balingit. 2020. “Protests Spread, Fueled by Economic Woes and Internet Subcultures.” Washington Post.

Tufekci, Zeynep. 2018. Twitter and Tear Gas: The Power and Fragility of Networked Protest. Reprint edition. New Haven: Yale University Press.

Twitter Transparency Center. 2020. “Information Operations - Twitter Transparency Center.” Retrieved September 1, 2020 (https://transparency.twitter.com/en/reports/information-operations.html).

United States Senate Intelligence Committee. 2020. Report on the Select Committee on Intelligence United States Senate on Russian Active Measures Campaign and Interference in the 2016 U.S. Election: Volume 4: Review of the Intelligence. Washington, D.C.

Wachter-Boettcher, Sara. 2017. Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. New York: W. W. Norton & Company.

Winters, Georgia M., and Elizabeth L. Jeglic. 2017. “Stages of Sexual Grooming: Recognizing Potentially Predatory Behaviors of Child Molesters.” Deviant Behavior 38(6):724–33. doi: 10.1080/01639625.2016.1197656.

Wood, Graeme. 2020. “Kyle Rittenhouse, Kenosha, and the Sheepdog Mentality.” The Atlantic. Retrieved October 9, 2020 (https://www.theatlantic.com/ideas/archive/2020/08/kyle-rittenhouse-kenosha-and-sheepdog-mentality/615805/).

Wylie, Christopher. 2019. Mindf*ck: Cambridge Analytica and the Plot to Break America. New York: Random House.

Youyou, Wu, Michal Kosinski, and David Stillwell. 2015. “Computer-Based Personality Judgments Are More Accurate than Those Made by Humans.” Proceedings of the National Academy of Sciences 112(4):1036–40. doi: 10.1073/pnas.1418680112.

Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. 1 edition. PublicAffairs.

Zuboff, Shoshana. 2020. “You Are Now Remotely Controlled.” The New York Times, January 24.