
Studies in Communication Sciences 18.1 (2018), pp. 69–85

Digital astroturfing in politics: Definition, typology, and countermeasures

Marko Kovic, Zurich Institute of Public Affairs Research (ZIPAR), Switzerland*
Adrian Rauchfleisch, National Taiwan University, Taiwan, and Zurich Institute of Public Affairs Research (ZIPAR), Switzerland
Marc Sele, Zurich Institute of Public Affairs Research (ZIPAR), Switzerland
Christian Caspar, Zurich Institute of Public Affairs Research (ZIPAR), Switzerland

*Corresponding author: [email protected]

Abstract
In recent years, several instances of political actors who created fake activity on the Internet have been uncovered. We propose to call such fake online grassroots activity digital astroturfing, and we define it as a form of manufactured, deceptive and strategic top-down activity on the Internet initiated by political actors that mimics bottom-up activity by autonomous individuals. The goal of this paper is to lay out a conceptual map of the phenomenon of digital astroturfing in politics. To that end, we introduce, first, a typology of digital astroturfing according to three dimensions (target, actor type, goals), and, second, the concept of digital astroturfing repertoires, the possible combinations of tools, venues and actions used in digital astroturfing efforts. Furthermore, we explore possible restrictive and incentivizing countermeasures against digital astroturfing. Finally, we discuss prospects for future research: Even though empirical research on digital astroturfing is difficult, it is neither impossible nor futile.

Keywords: political communication, Internet, digital astroturfing, astroturfing, bots

https://doi.org/10.24434/j.scoms.2018.01.005
© 2018, the authors. This work is licensed under the "Creative Commons Attribution – NonCommercial – NoDerivatives 4.0 International" license (CC BY-NC-ND 4.0).

1 Introduction

Not long after the advent of the World Wide Web, there were high hopes that this new form of access to the Internet might prove a boon for democracy. Thanks to the World Wide Web, citizens could gain access to politically relevant information far easier than ever before. Even more than that: The World Wide Web seemed to carry the potential to make the elusive ideal of deliberative democracy a reality (or, at least, to make instances of deliberative democratic participation easier) (Buchstein, 1997; Dahlberg, 2001; Rheingold, 1993). The positive force of the Internet, the hope went, would make democratic participation easier than ever and thus it would promote democratic values, even in non-democratic countries.

The rapid progress both of Internet penetration and of Internet-related technologies has created circumstances in which the democratic online revolution as envisioned in the Internet's early days could, in principle, take place. Access to the Internet is increasingly common even in relatively poor countries, and both the means for accessing the Internet as well as the ways in which users can access political information and engage in political discourse have proliferated.

Of course, the World Wide Web has not produced the democratic utopia that some might initially have hoped for. But the Internet has certainly had an impact on politics and, more specifically, on political discourse. Citizens do indeed seek political information online and they do engage in new forms of digital political discourse. Unfortunately, the relative ease of engaging in political discourse on the Internet also makes it relatively easy to undermine, or at least negatively affect, that very discourse. One perfidious way in which this can happen: Political actors can create online activity that seems like authentic activity by regular citizens, when it is, in reality, anything but.

Online users who are not what they pretend to be, and who pretend in order to deceive real people, are the opposite of what is hoped for in the ideal of deliberative democracy. Fake online users are not interested in participating in rational political discourse, but rather in deliberately and actively skewing the discourse according to the goals they are pursuing. This is not an entirely new tool. So-called astroturfing, fake grassroots campaigns and organizations that are created and backed by political actors or businesses, has a long tradition (Lyon & Maxwell, 2004; Walker & Rea, 2014). Astroturfing on the Internet is more problematic than the traditional kind: Digital astroturfing is cheaper, has a greater scope, and is potentially much more effective than regular astroturfing.

1.1 A new era of political astroturfing

In the wake of the 2016 U.S. presidential election, the National Intelligence Council at the Office of the Director of National Intelligence released a declassified version of an assessment of Russian state-sponsored activities during the election (National Intelligence Council, 2017). In that report, the National Intelligence Council describes the Russian interference in the 2016 election as a continuation of covert influence campaigns that Russia and the Soviet Union were conducting in the past. However, the report also mentions the activities of the St. Petersburg-based "Internet Research Agency" as part of the Russian influence campaign. The Internet Research Agency is a state-sponsored online astroturfing organization (Chen, 2015; MacFarquhar, 2018) specialized in creating and maintaining sock puppets, fake online personae that mimic regular users (Bu, Xia, & Wang, 2013). The astroturfing campaign conducted by the Internet Research Agency was great in scope. In early 2018, Twitter notified over 1.4 million of its US-based users that they had interacted with sock puppet accounts likely affiliated with the Internet Research Agency (Twitter PublicPolicy, 2018). But the campaign was also sophisticated, as evidenced by the persuasive power of some of its sock puppets, such as the influential, yet fictitious "Jenna Abrams" (Collins & Cox, 2017). The astroturfing campaign of 2016 was so pervasive that it was proposed that the US create a military unit in order to defend against such campaigns (Hart & Klink, 2017).

Digital astroturfing is not limited to U.S. elections. Instances of digital astroturfing have been documented in many countries (Woolley, 2016). Astroturfing efforts can be conducted by outside political actors, such as in the 2017 French election (Ferrara, 2017) or in the 2016 Brexit vote in the United Kingdom (Bastos & Mercea, 2017; Llewellyn et al., 2018), or they can be campaigns of political actors within their own country, such as the systematic digital astroturfing strategy of China (King et al., 2017). Digital astroturfing is also not limited to manually curated sock puppets. For example, many digital astroturfing efforts are conducted with automated bots, whereby computer software impersonates humans by acting and reacting in as "natural" ways as possible (Bessi & Ferrara, 2016; Shao et al., 2017; Woolley, 2016).

1.2 The goal of this paper

The phenomenon of digital astroturfing is of obvious importance. Digital astroturfing represents an inherently deceptive political activity that is detrimental to democratic discourse and to democratic decision-making. In view of this problem, the goal of the present paper is threefold.

First, we want to define digital astroturfing in a generalized, universally applicable manner. Digital astroturfing is a complex political activity. Second, we propose a typology of different forms of digital astroturfing. That typology is based on the type of actor that is engaging in the astroturfing effort, the target that is to be affected by the astroturfing effort, and the goal that is pursued by the astroturfing effort. Third, we discuss possible countermeasures to either prevent digital astroturfing or to lessen its impact.

2 A generalized definition of digital astroturfing in politics

In the previous section, we have provisionally introduced digital astroturfing as a type of political online activity that pretends to be the activity of individual, regular online users, when it is in fact activity created by political actors. Such a provisional definition already describes the nature of digital astroturfing to some degree, but it is derived from an intuitive inductive process of describing what different cases of digital astroturfing have in common. A more precise deductive definition is needed in order to describe digital astroturfing in a generalized way, independent from specific singular occurrences of digital astroturfing. To that end, we propose the following definition:

Digital astroturfing is a form of manufactured, deceptive and strategic top-down activity on the Internet initiated by political actors that mimics bottom-up activity by autonomous individuals.

In this definition, digital astroturfing as a form of political activity consists of five necessary conditions: It takes place on the Internet, it is initiated by political actors, it is manufactured, it is deceptive and it is strategic. Digital astroturfing is manufactured in that it is not an honest expression of autonomous individual opinions, but rather activity created to mimic honest expression of autonomous individual opinions. It is deceptive, because the goal of digital astroturfing is to trick the target, usually the public at large (including individuals and small groups), into believing that the manufactured activity is real. It is strategic, because the political actors who engage in digital astroturfing pursue certain goals in doing so.

This definition is relatively simple, but in simplicity, we believe, lies analytical clarity. So far, only few authors have attempted to define digital astroturfing. The most relevant such attempt is a recent conceptual study by Zhang, Carpenter, and Ko (2013). In that study, the authors define online astroturfing (the authors use the term online rather than digital astroturfing) as follows: "Therefore, we define online astroturfing as the dissemination of deceptive opinions by imposters posing as autonomous individuals on the Internet with the intention of promoting a specific agenda" (p. 3).

This definition is similar to the one we introduce in three respects. First, we have adopted the authors' very apt notion of "autonomous individuals" into our own definition because it describes very precisely what kind of deception is taking place. We expand the descriptive notion of autonomous individuals by contrasting bottom-up and top-down activity; autonomous individuals engage in bottom-up activity, and digital astroturfing is top-down activity pretending to be bottom-up activity. Second, and this is nontrivial, the authors define digital astroturfing as an activity that is taking place on the Internet. And third, they describe digital astroturfing as a deceptive activity. We agree with those three aspects as necessary conditions, but the definition lacks accuracy and generalizability. The definition is inaccurate because it implies that the relevant actors are the imposters posing as autonomous individuals. However, the truly relevant actors are not the imposters themselves, but rather the political actors initiating the astroturfing efforts. The imposters, such as, for example, the employees of the Internet Research Agency in St. Petersburg, are simply performing tasks on behalf of political actors; they are the means to an end. It is possible that in some cases, the actors who initiate the digital astroturfing in pursuit of some goals will also actively carry out the digital astroturfing effort themselves. For example, a politician might himself create a number of fake Twitter accounts with which he would then follow his own, real account in order to feign popularity of his account. But even in such a scenario, the question of who initiates the digital astroturfing effort is clearly separate from how the digital astroturfing effort is actually performed. The origin of digital astroturfing lies with the political actors who pursue certain goals by engaging in digital astroturfing, and the means of actually carrying out the digital astroturfing effort can be, and in many instances probably are, separate from those political actors.

Furthermore, the above definition by Zhang et al. (2013) is not fully generalizable because it does not describe a general principle, but rather one empirical scenario of digital astroturfing: Cases in which individuals create fake online personae, the so-called sock puppets, in order to communicate certain points of view. While such a digital astroturfing scenario might be a very important one, it is not the only possible one, as we will later point out in our typology.

2.1 The difference between regular astroturfing and digital astroturfing

Ours is not the first study to tackle the phenomenon of digital astroturfing. However, in the previous section, we only discuss one definition of digital astroturfing from one recent study. We are not willingly disregarding other definitions of digital astroturfing, but the status quo of the conceptual work on digital astroturfing is rather limited: Most studies that look at digital astroturfing do not operate with an explicit definition of digital astroturfing, but they tend to extend the concept of regular astroturfing into the online realm instead. For example, Mackie (2009) describes astroturfing on the Internet in the following terms: "Astroturfing is the use of paid agents to create falsely the impression of popular sentiment (the grass roots are fake, thus the term astroturf, which is artificial grass)" (p. 32). Even though the author goes on to argue that and why the Internet is vulnerable to astroturfing, he does not propose a separate, explicit definition of astroturfing that is occurring on the Internet. This implies that astroturfing is astroturfing, no matter where it takes place. Furthermore, Mackie (2009) is only considering paid agents to be indicative of astroturfing. We believe that unpaid agents can be involved in digital astroturfing as well, as in the case of sympathizers who become involved in digital astroturfing because they honestly support a cause; we delve into this issue in the sections below. In a similar vein as Mackie (2009), Boulay (2012) explicitly addresses the potentials of digital astroturfing, but she operates with the definition of regular astroturfing: « L'astroturfing est une stratégie de communication dont la source est occultée et qui prétend, à tort, être d'origine citoyenne » (p. 201) [astroturfing is a communication strategy whose source is concealed and which falsely claims to be of citizen origin].

Is a conceptual differentiation between regular and digital astroturfing, as indirectly proposed by our proposal of an explicit definition of digital astroturfing, reasonable, given the fact that several authors analyse digital astroturfing but only apply a definition of regular astroturfing? We believe it is, because digital astroturfing conceptually clearly differs from regular astroturfing. In our definition of digital astroturfing, we identify five necessary conditions: The activities take place on the Internet, they are initiated by political actors, and they are manufactured, deceptive and strategic. If digital astroturfing is to be understood simply as the extension of astroturfing onto the Internet, then four of these five criteria have to apply to regular astroturfing; obviously, regular astroturfing does not have to fulfil the condition of taking place on the Internet. However, regular astroturfing is not adequately described by the remaining four necessary conditions. In regular astroturfing, some political actors are present, and those actors act strategically in that they pursue goals. But those actors are not necessarily always initiating the activities, they are not necessarily directly manufacturing the activities, and they are not necessarily deceitful about their involvement in the activities. Vergani (2014) makes this point clear by introducing the concept of the grassroots orchestra, a situation where genuine grassroots activity takes place, but over time, political actors get involved in that genuine activity and they influence, if not usurp, the grassroots activity in order to steer it in a direction beneficial to themselves. A recent prominent example of a grassroots orchestra is the Tea Party movement in the United States: It seems that the Tea Party was an amalgamation of genuine grassroots activity with coordinating and mobilizing influence by political actors (Formisano, 2012).

The distinction between regular and digital astroturfing, in summary, lies in the fact that digital astroturfing is a clearly demarcated, dichotomous phenomenon, while regular astroturfing is not; in regular astroturfing, varying degrees of genuine grassroots components can be present. Of course, this conceptual difference between digital astroturfing and regular astroturfing does not preclude the possibility that there are instances of astroturf campaigns where regular and digital astroturfing are combined in order to achieve as great an impact as possible (Walker, 2014).

2.2 A typology of digital astroturfing

With our generalized definition of digital astroturfing, it is possible to reflect upon what digital astroturfing can empirically look like. Because of the clandestine nature of digital astroturfing, it could be very misleading to base a typology of digital astroturfing solely on the exposed cases of digital astroturfing, since there is no way of knowing whether those exposed cases, such as the ones briefly mentioned in the introduction, cover all possible types of digital astroturfing. For that reason, we opt for a theoretico-deductive typology of digital astroturfing: It is not possible to inductively gather information on all instances of digital astroturfing, but it is possible to define a set of dimensions that will plausibly exhaust the possible types of digital astroturfing.

We propose three dimensions that encompass the different possible types of digital astroturfing: The political actors who engage in digital astroturfing, the target of the digital astroturfing effort, and the goal of digital astroturfing.

The first dimension of digital astroturfing types are the political actors who engage in digital astroturfing. It is reasonable to use the category of the political actors that engage in digital astroturfing as a starting point for a typology of digital astroturfing – after all, political actors are the conditio sine qua non of digital astroturfing; political actors are the initiating force behind every digital astroturfing effort. We distinguish between five categories of political actors: governments, political parties, interest groups, individual politicians, and individual citizens. The first two actor categories, governments and political parties, are self-explanatory. The third actor category, interest groups, perhaps less so. We use the term interest group to mean a group that pursues political goals, but is neither part of the government nor a political party. This can include loose groups of like-minded people, nonprofit organizations, businesses, and so forth. Interest groups are thus groups that pursue political interests, and to that end, they interact with the political system. Sometimes, interest group activity will take the form of discrete, non-public direct interaction with members of government or parliament, i.e., the form of lobbying. But in other situations, interest groups are advocating their position publicly. The nature of interest groups can be very varied; typical examples of interest groups in Western countries are corporations as employer interest groups and unions as employee interest groups (Walker, 1983). In the context of interest groups as political actors, it is important to keep in mind that digital astroturfing is concerned with political goals. There is a prominent phenomenon of non-political, strictly commercial quasi-astroturfing that some businesses engage in: The practice of faking online consumer reviews to make one's products or services appear more popular than they actually are or to make competitors' products or services appear less popular than they actually are (Yoo & Gretzel, 2009). This type of deceptive online activity by commercial interest groups is not digital astroturfing, because the nature of the activity that the actors pursue is not political. If, however, a business is pursuing political goals and is engaging in digital astroturfing to that end, then we subsume that business in the interest group category of initiating political actors. The fourth category of political actors are individual politicians. Individual politicians are, for the most part, members of groups such as political parties, but there are situations in which individual politicians act on their own in terms of digital astroturfing. We mentioned one such scenario in the introduction: A politician who, as part of his election campaign, paid for Twitter followers.

The second dimension of digital astroturfing types is the persuasion target of digital astroturfing. We differentiate among two general persuasion targets, the public and specific political actors. It is quite obvious to imagine the general public as one target of digital astroturfing, since most known cases of digital astroturfing are cases of public digital astroturfing activities. Of course, political actors can also perceive public digital astroturfing and be affected by it. But another category of digital astroturfing is possible, one that is specifically aimed at political actors and is not public.

The third dimension of the typology of digital astroturfing are the goals the political actors pursue by engaging in digital astroturfing. There are, of course, a myriad of specific goals that specific political actors can pursue in specific political contexts. The idea of this third dimension of the digital astroturfing typology is not to describe every possible situational configuration, but rather to meaningfully divide the spectrum of the goals of digital astroturfing into a small number of categories. We identify two such categories of goals: support for or opposition to policy, and support for or opposition to political actors. We propose only two categories, since the goal of digital astroturfing is to communicate a valence either towards policy or towards political actors, and being positive or negative is simply the property of the valence. As for the objects of the valence, political actors who engage in digital astroturfing will either want to influence public opinion or specific political actors with regard to certain policy issues, either to support their own position on those issues, or to oppose the position of other political actors. The other broad category of goals is support for or opposition to political actors. When political actors pursue this goal, they will either do so in order to feign public support for themselves, or to make other political actors appear unpopular.

In public discussions of digital astroturfing, instances of digital astroturfing are sometimes described as trolling. To equate digital astroturfing with trolling in such a manner is a grave conceptual error. We have defined digital astroturfing as manufactured, deceptive and strategic activity on the Internet initiated by political actors. Of these five necessary conditions, trolling only satisfies two: Trolling takes place on the Internet, and there is a strategic component to it. However, trolling is not initiated on behalf of political actors, it is not manufactured and it is only somewhat deceptive.

Trolling is best understood as malicious, disruptive or disinhibited (Suler, 2004) online behavior by individuals who engage in the activity of trolling of their own individual volition (Hardaker, 2010). Trolling often also appears as a social, coordinated activity (MacKinnon & Zuckerman, 2012). Still, conceptually, digital astroturfing is completely separate from trolling, and mixing these concepts should be avoided, not least because talking about digital astroturfing in terms of trolling trivializes digital astroturfing: Digital astroturfing is a deceptive effort aimed at the public and launched by political actors. However, there is one potential connection between trolling and digital astroturfing: Cases of digital astroturfing in which the deceptive activity takes the form of trolling. In such cases, the trolling that is taking place is not authentic, but manufactured and strategically used by political actors. We believe this conceptual distinction is important. Not every activity that is widely described as trolling is in fact trolling.

However, it is possible that political actors engage in digital astroturfing and the honest bottom-up activity they mimic is trolling. This behavior can still be described in terms of our typology, because a deceptive and strategic use of trolling is meant to disrupt some public debate and expression of opinion that a political actor objects to, and those public debates and expressions of opinion will, naturally, refer to some policy or to some political actors. Fake trolling thus has the goal of indirectly stymieing expressions of support for or criticism of policy or political actors. For example, the Chinese government regularly tries to distract the public and change the subject of critical discussions online with fabricated user comments (King et al., 2017).

Table 1: Typology of digital astroturfing

Initiating political actor   Persuasion target    Goal: Support for or opposition to:
                                                  Policy    Political actors
Government                   The public           1         2
Political party              The public           3         4
Individual politician        The public           5         6
Interest group               The public           7         8
Government                   Political actors     9         10
Political party              Political actors     11        12
Individual politician        Political actors     13        14
Interest group               Political actors     15        16

Note. Each number represents one digital astroturfing scenario as a combination of three dimensions. Reading example: The digital astroturfing type with the number 7 describes a digital astroturfing scenario in which an interest group has the goal of influencing the public opinion on some policy issue.

Combining the three dimensions of the typology leads to the sixteen different types of digital astroturfing reported in Table 1. These types provide a useful framework for analyzing individual cases of digital astroturfing as well as for guiding expectations about digital astroturfing efforts in general. Digital astroturfing can occur, as we argue in the introduction, in very different political contexts, and as a consequence, it can be challenging to analyze separate cases. With our typology, we provide, in essence, a useful heuristic – a way to think about digital astroturfing.
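As an illustration, the sixteen scenarios of Table 1 can be generated mechanically, since the typology is simply the Cartesian product of the three dimensions. The following minimal Python sketch is purely illustrative and not part of the original typology; the labels are taken from Table 1, and the numbering reproduces the one used in the table's reading example.

# Illustrative sketch: enumerate the sixteen scenarios of Table 1 as the
# Cartesian product of the three typology dimensions. Labels follow Table 1;
# the numbering mirrors the table (rows are grouped by persuasion target).
actors = ["government", "political party", "individual politician", "interest group"]
targets = ["the public", "political actors"]
goals = ["policy", "political actors"]

scenarios = {}
number = 1
for target in targets:
    for actor in actors:
        for goal in goals:
            scenarios[number] = (actor, target, goal)
            number += 1

# Reading example from the note to Table 1: scenario 7 is an interest group
# trying to influence public opinion on some policy issue.
print(scenarios[7])  # ('interest group', 'the public', 'policy')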
2.3 Digital astroturfing repertoires

Our typology consists of sixteen digital astroturfing scenarios that describe which actors pursue what kind of goal with their digital astroturfing efforts. However, the different scenarios do not automatically imply what specific measures the political actors take in order to carry out their digital astroturfing efforts. These specific efforts consist of three elements: The specific digital astroturfing tools used, the specific venues where these tools are applied, and the specific actions that are taken with those tools in those venues.

In social movement research, the concept of protest repertoires is used to describe which tools social movements use in which contexts (Della Porta, 2013; Tarrow, 2011). It is useful to use an analogous concept for digital astroturfing: The concept of digital astroturfing repertoires. Digital astroturfing repertoires cannot be defined universally, because the tools and venues available for digital astroturfing are very much bound by time and space and are likely to change – just like the protest repertoires of social movements are bound by time and space and likely to change over time (Biggs, 2013). Also, describing digital astroturfing repertoires is ultimately an inductive task of continued observation of digital astroturfing cases. Even though we cannot specify a definitive and final list of context-specific digital astroturfing repertoires, there are some typical repertoires, we believe, that encompass a large portion of contemporary digital astroturfing repertoires.

Digital astroturfing repertoires can be thought of, as mentioned above, as combinations of tools, venues and actions. Some typical digital astroturfing tools are sock puppets, click farms, and sympathizers. Sock puppets are, as we describe in the introduction, fake online personae (Bu et al., 2013) that can be used for a variety of purposes. Click farms are a relatively recent variant of sock puppets tied to the rise of social media. Click farms provide services that, essentially, boil down to faux social media activity, such as fake followers on Twitter or fake likes on Facebook (Clark, 2015). Both sock puppets and click farms can contain various degrees of automation through bots, programs that operate automatically on behalf of agents (Geer, 2005). For example, fake user profiles used in click farms are usually created manually, but the accounts are subsequently operated automatically. A lot of the current research in the area of digital astroturfing is focused on bots. Bots are certainly important, not least quantitatively, since creating and deploying bots is fairly easy and cheap. However, bots are still, for the most part, not able to persuasively mimic humans (Stieglitz et al., 2017), limiting their potential persuasive power. Bots as automated sock puppets or as automated click farms are certainly important, but manually curated and operated sock puppets and click farm profiles still matter. The idea of sympathizers as digital astroturfing tools might, prima facie, seem a misclassification. After all, sympathizers are real people who honestly hold the opinions they put forward. The question whether sympathizers as digital astroturfing tools honestly believe in what they do is conceptually irrelevant. Political actors are still orchestrating, i.e., manufacturing some activity that is supposed to look like non-manufactured, spontaneous activity; that activity is deceptive (the target of the astroturfing attempt has no knowledge of its manufactured origins); and the activity is strategic from the point of view of the political actor. Whether sympathizers as the tools of digital astroturfing sympathize with the political actors on whose behalf they act has no conceptual impact on the nature of the activity at hand. Howard (2006) describes such a case in which sympathizers were mobilized in an orchestrated attempt without even knowing that they were instrumentalized by a political actor. Finally, paid supporters are similar to sympathizers: They both engage in activities on behalf of political actors without openly declaring so, and they both do so with their true identities rather than with sock puppets. However, paid supporters are not (only) acting out of personal conviction, but out of pecuniary interests – they are being paid for their activities.

Some typical venues of digital astroturfing are social media, websites, comment sections (predominantly those of news websites), and emails. We understand social media as platforms where users, be they individuals, groups or organizations, can connect with other users (Boyd & Ellison, 2007) and where the users themselves are the primary creators of content (Kaplan & Haenlein, 2010). Typical examples of social media are services such as Facebook, Twitter, Snapchat, and so forth. We also consider online petitions to be a form of social media; after all, when signing online petitions, users connect with each other and by doing so create a certain type of content. Websites are separate from social media in that the users who curate websites do not simply create profiles on social media sites, but rather create separately hosted, independent content. Comment sections are the sections on websites that allow website visitors to leave written comments below content on the website, and sometimes, to like or dislike existing comments. In the context of digital astroturfing, comment sections of news websites are potentially one of the most relevant venues. In principle, comment sections could be thought of as just another form of social media, but we treat them as a separate venue. Users are the primary creators of content on social media, but not in comment sections of news websites – in comment sections, users only react to content created by journalists. The final venue of digital astroturfing are direct messages. Describing direct messages (such as emails, online contact forms, private messages on social media, and so forth) as venues is perhaps somewhat confusing, because direct messages aren't public places on the Internet. But what we mean by venue is not a public location, but rather the place where the digital astroturfing tools are applied. However, direct messages are a somewhat special case of digital astroturfing, because direct messages can be targeted at specific political actors, without the general public taking notice.

Finally, there are only two digital astroturfing actions: Actively creating content, or passively signaling (dis-)approval. The difference between active content creation and passive signaling of (dis-)approval can be exemplified with comment sections on websites. Writing a comment constitutes an active creation of content, whereas liking or disliking is merely a passive signaling of approval or disapproval. The different tools, the different venues and the different actions of digital astroturfing are summarized in Table 2.

Table 2: Tools, venues and actions of digital astroturfing repertoires

Tools             Venues             Actions
sock puppets      social media       creating content
click farms       websites           signalling (dis-)approval
sympathizers      comment sections
paid supporters   direct messages

A digital astroturfing repertoire may consist of any combination of tools, venues and actions from Table 2. For example, the combination of sock puppets with comment sections (of news websites) and creating content is the digital astroturfing repertoire used by the Russian Internet Research Agency described in the introduction. The combination of click farms with social media and the signaling of approval is the digital astroturfing repertoire used by the Swiss politician mentioned in the introduction who bought fake Twitter followers. A repertoire consisting of sympathizers combined with direct messages and creating content is a digital astroturfing repertoire sometimes applied by political candidates running for office who encourage sympathizers to send pre-written emails to friends and to newspaper editors (Klotz, 2007).

3 Countermeasures

Digital astroturfing is a clandestine activity, and as such, it is difficult to devise defense mechanisms against it. While there is probably no way to completely prevent digital astroturfing from occurring, there are ways to limit the probability of digital astroturfing occurring or, at least, of digital astroturfing being effective in some venues. More specifically, we identify two broad strategies of countermeasures: Restrictive countermeasures and incentivizing countermeasures. Restrictive countermeasures are measures that limit the spectrum of possible online activities, and this limitation of online activities in general also limits digital astroturfing activities. Incentivizing countermeasures, in contrast, are measures that are conducive to honest forms of online activity. Incentivizing countermeasures do not necessarily prevent digital astroturfing, but by rewarding more openly honest activity, incentivizing countermeasures make digital astroturfing less prominent and thus, potentially, less impactful.

Before we delve into both types of countermeasures in more detail, it is in order to justify why exactly we propose countermeasures in the first place – the fact that we actively discuss countermeasures implies that we regard digital astroturfing to be a problem.

There are two major reasons why digital astroturfing is an important and problematic phenomenon, a normative one and an empirical one. Normatively, digital astroturfing is challenging because the act of engaging in digital astroturfing, essentially, amounts to subterfuge, and subterfuge is nothing more than a form of lying (Grover, 1993). Empirically, digital astroturfing almost certainly has some real-world impact. While it is, for obvious reasons, very difficult to quantify the effects of digital astroturfing on public opinion or on political actors, it is not far-fetched to assume that some effects are very likely to exist. If, for example, a user comment in the comment section of a news website is written by a sock puppet, and some individual reads that comment without realizing that it was written by a sock puppet, then the minimal effect of digital astroturfing has happened – someone was deceived by the digital astroturfing effort. The probability of such a minimal effect of digital astroturfing is very high; if the number of people who get deceived by digital astroturfing efforts is greater than zero, then the minimal effect exists, and consequently, digital astroturfing has an empirical impact. What is not clear, however, is the full extent of the empirical impacts of digital astroturfing. For example, there is some evidence that user-generated comments on news articles can, in general, affect the perception of the content of those news articles (Anderson, Brossard, Scheufele, Xenos, & Ladwig, 2014; Lee, 2012). It is entirely possible that astroturfed comments can produce similar effects.

3.1 Restrictive countermeasures, or: Limiting online activity

The most sweeping restrictive countermeasure against digital astroturfing would be to outlaw it. However, an outright criminalization of digital astroturfing is, at the very least, hardly enforceable (digital astroturfing is, after all, clandestine). Also, digital astroturfing activities that do not use sock puppets but individuals who engage in digital astroturfing activities with their real identities would present a difficult legal situation: While the activity is clearly digital astroturfing (a political actor manufactures deceptive activity in order to pursue his own goals), the individuals who represent the digital astroturfing tools can easily claim to honestly hold the opinions that they express through their activities. Outlawing digital astroturfing thus seems very unrealistic. Furthermore, such action would potentially harm freedom of speech and thus, from a consequentialist point of view, do more harm than good (e.g., Deibert et al., 2008).

More realistic are restrictive countermeasures applied at the different digital astroturfing venues as summarized in Table 2. But not all of the venues are suited for restrictive measures. Specifically, digital astroturfing can neither be meaningfully restricted when it comes to creating websites (it is trivially easy to create a website anonymously) nor when it comes to sending direct messages, such as emails, to political actors. But for the other two venues, restrictive countermeasures can be implemented.

The first of those venues is social media. One potential restrictive measure for social media is the detection of inauthentic user accounts. There already exists considerable research in the area of automated detection of fake social media accounts, usually in the context of spam accounts that post links to products and services or to other unrelated online content (Mukherjee, Liu, & Glance, 2012; Zheng, Zeng, Chen, Yu, & Rong, 2015). While there is some research that extends the question of automated detection of fake profiles to digital astroturfing (Chen et al., 2011), that research is still scarce.
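To give a concrete, if simplified, impression of what such automated detection involves, the following Python sketch scores accounts with a few hand-picked heuristics. It is purely illustrative: the features, weights and threshold are invented for this example and are not taken from the studies cited above, which typically rely on supervised machine learning over much richer feature sets.

# Illustrative sketch of rule-based scoring of account features; the
# features, weights and threshold are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int          # account age in days
    posts_per_day: float   # average posting frequency
    followers: int
    following: int
    default_avatar: bool   # profile picture never changed

def suspicion_score(acc: Account) -> float:
    """Return a score in [0, 1]; higher means more suspicious."""
    score = 0.0
    if acc.age_days < 30:
        score += 0.3       # very young accounts are riskier
    if acc.posts_per_day > 50:
        score += 0.3       # implausibly high activity
    if acc.following > 0 and acc.followers / acc.following < 0.1:
        score += 0.2       # follows many accounts, followed by few
    if acc.default_avatar:
        score += 0.2
    return min(score, 1.0)

def flag_suspicious(accounts, threshold=0.6):
    """Flag accounts whose score exceeds a tunable threshold."""
    return [acc for acc in accounts if suspicion_score(acc) >= threshold]

sample = [
    Account(age_days=12, posts_per_day=80.0, followers=5, following=900, default_avatar=True),
    Account(age_days=2400, posts_per_day=1.5, followers=300, following=280, default_avatar=False),
]
print(flag_suspicious(sample))  # only the first, high-scoring account is flagged

The choice of threshold in such a scheme is exactly the sensitivity trade-off discussed later in this section: a lower threshold flags more inauthentic accounts, but also more authentic ones.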
Another possible restrictive countermeasure for social media platforms is the implementation of a mandatory real-name policy for users. Implementing a mandatory real-name policy might sound drastic, but it has been the official policy of one of the largest social media platforms, Facebook, for years (Wauters, Donoso, & Lievens, 2014; Giles, 2011). However, even though using one's real name is the default policy at Facebook, not every user has to or is able to provide credentials for their identity. Facebook is detecting potential pseudonyms algorithmically and, apparently, selectively, which has led to instances of user accounts being blocked despite being honest expressions of real identities (Dijck, 2013; Lingel & Golub, 2015). Mandatory real-name policies for social media also exist beyond Western social media platforms, notably in China. A prominent step towards banning anonymity on Chinese social media was taken in 2011, when a mandatory real-name policy for so-called microblogging sites that are headquartered in Beijing, such as Sina Weibo, was introduced. Under that policy, users have to register with their real identities, but are allowed to use nicknames on their public profiles. However, the policy has not yet been thoroughly enforced (Jiang, 2016).
trivially easy to create a website anony- The second digital astroturfing venue mously) nor when it comes to sending di- where restrictive countermeasures can be rect messages, such as emails, to political implemented with a high probability of actors. But for the other two venues, re- the desired impact are comment sections strictive countermeasures can be imple- on news websites. By and large, comment mented. sections of news websites are already gov- erned by some degree of restrictive mea- Kovic et al. / Studies in Communication Sciences 18.1 (2018), pp. 69–85 79 sures in that most news websites have reducing or completely preventing in- some moderation policies in place (Can- stances of digital astroturfing. However, ter, 2013; Hille & Bakker, 2014). Comment implementing such restrictive measures moderation policies can be made more can also have some downsides, and we restrictive in order to prevent astroturfed believe that the benefit of reducing digital comments. One restriction measure is to astroturfing by restrictive measures does ban anonymous comments and enforce a not necessarily outweigh the downsides. real-name policy, similar to the social me- When it comes to the automatic detection dia examples cited above. From a logical of unauthentic user profiles on social me- point of view, banning anonymous com- dia platforms, the potential downside are ments will always prevent sock puppets primarily false positives: When automatic from commenting, which means that this means of detecting and deleting inauthen- measure is definitively effective, but only tic user accounts are implemented, there if it is systematically enforced by means of is a chance that a portion of the accounts an explicit verification process. thus detected will actually be authentic Enforcing a real-name policy is not a ones. However, this downside of false pos- viable restrictive countermeasure against itives can be actively controlled and dialed all digital astroturfing tools listed in down. How many false positives there are ­Table 2: Paid supporters and sympathiz- is to a substantial degree a trade off with ers will still be able to be active on social the desired sensitivity of the automatic de- media and to comment on news websites, tection algorithms. If a maximum sensitiv- since they do so with their real identity. ity is desired, meaning that a perfect rate Another possible restrictive measure of true positives is the goal, then the false is not aimed at the tools used for digital positive rate will, probably, be at its high- astroturfing, but at the specific actions of est. But if a lower sensitivity is accepted, digital astroturfing: Disabling the possi- meaning that not all true positives will be bility of signaling approval or disapprov- detected, then the false positive rate can al for existing comments through liking be lowered. or disliking them. From a logical point of The downsides of restrictive counter- view, disabling the possibility of liking or measures in comments sections on news disliking comments will always prevent websites are less open to adjustment than this form of astroturfing. A third and most the downsides of automatic detection of far-reaching restrictive measure with re- inauthentic user accounts on social media. gard to comments is to completely disable The first such measure we discuss above comments. A number of prominent news is enforcement of a real-name policy in websites have opted for disabling their the comment sections. 
The downsides of restrictive countermeasures in comment sections on news websites are less open to adjustment than the downsides of automatic detection of inauthentic user accounts on social media. The first such measure we discuss above is the enforcement of a real-name policy in the comment sections. Perhaps it is not self-evident why non-anonymity should have downsides at all – if one has an honest opinion to publicly share, why not do so using one's true identity? The problem with that notion is that in many non-democratic countries, voicing one's opinion publicly can have dire consequences. A prominent example of such a country is China, where online users criticize the government under the guise of anonymity because such criticism is essentially illegal (Qiang, 2011). Limiting the possibilities of anonymous online commenting in such political contexts will either lead to less expression of opinion, or to an increase in sanctions because those who speak out are easier to track down. Another downside of doing away with anonymous commenting is commercial in nature. Removing anonymous commenting reduces the total comment volume (Fredheim, Moore, & Naughton, 2015), but only a small part of those anonymous comments is likely to be astroturfed. This means that websites will lose a lot of user-generated content and, consequently, website traffic. This can ultimately lead to a loss in revenue, which is a relevant factor for commercial operators of social media and of news websites.

The strongest measure with regard to comments on news websites that we discuss above is the complete disabling of comments. Completely removing the possibility for users to comment is a measure, we believe, that represents a step backwards in terms of the development of the Internet, because it is a measure that reduces the opportunities for honest citizens to participate in public discourse. Of course, the Internet is a big place, and any one user can, for example, easily create a website to express whatever opinions he or she may hold. Furthermore, news organizations obviously have no obligation to host user-generated content (Jönsson & Örnebring, 2011) of any sort on their online platforms. But the possibility for citizens to directly react to news content by commenting on it and by engaging with other users who comment as well does seem like an added discursive value which is lost by disabling comments altogether.

Restrictive countermeasures against digital astroturfing are possible, but they carry with them some degree of downsides. Because of those downsides, it is worth exploring alternative strategies of countermeasures: Countermeasures that do not restrict, but instead incentivize.

3.2 Incentivizing countermeasures, or: Rewarding honesty

The general idea behind incentivizing countermeasures against digital astroturfing is to provide incentives that motivate users to behave in ways that are more likely to be honest. Incentivizing honest behavior does not prevent digital astroturfing from occurring, but by rewarding honest behavior, honest online activity can become publicly discernible from online activity for which there is no information about its authenticity.

One goal of such incentives is to encourage user participation under one's true identity, without enforcing it. In the case of social media, encouraging users to use their true identity consists of two steps. First, there needs to be an actual possibility for users to verify their identity on a given social media platform. Second, after a user has verified his or her identity, the information about the authenticity of the social media profile in question needs to be publicly visible. Some prominent social media platforms, such as Twitter and Facebook, have such a verification program in place, and verified accounts are awarded a publicly visible badge. However, both Twitter and Facebook do not currently implement their verification scheme widely for their whole user base, but only for a relatively small portion of users who are a "public figure, media company or brand" (Facebook, 2015). We believe that users can be incentivized to opt into a verification process by nudging them towards it. Users can be presented with advantages of a verified account, such as the pro-social norm of transparency and authenticity, but also with the rather playful element of receiving a badge that is proof of their verified status. However, such nudging strategies are, to a certain degree, paternalistic in nature (Thaler & Sunstein, 2009). We think that the paternalistic element of verification options should be minimized by explicitly presenting verification as a choice that brings with it certain benefits.

User verification is also possible for the venue of comment sections. Besides identity verification, another type of verification is possible for comment sections: Incentives that allow for anonymous commenting, but reward a history of good and consistent commenting behavior. One such scheme is currently applied by the New York Times, where some commenters are algorithmically selected for a publicly visible verification badge based on their commenting history (Sullivan, 2014). Commenters become eligible for verification if they have not run afoul of comment moderators in the past. The main incentive under that scheme is that verified commenters are able to immediately get their comments published, without prior moderation.

In comparison with restrictive countermeasures, incentivizing ones have the great advantage that they do not limit the spectrum of online activities, but rather expand it by offering rewards for behavior that is more likely to be honest. However, incentivizing countermeasures against digital astroturfing are not without downsides. One major downside is the temporal component of incentivization: It takes time, first, to implement incentives, and second, for a critical part of a given user base to react to incentives. For example, if Twitter and Facebook decided to offer verification to all of their users, that would pose a non-trivial technical challenge, and the widespread adoption of verification would probably not be immediate but rather follow adoption rates and patterns similar to those of other technological innovations (Rogers, 2003).
4 Conclusion: Is research on digital astroturfing feasible?

Laying out a conceptual map of digital astroturfing is only a means to an end – a map is meant to be used to explore. We have described a typology with ideal types (Weber, 1922). In the real world, no two instances of digital astroturfing will be exactly the same, and they might employ any mix of astroturfing repertoires. Still, the map that we propose, even though preliminary and likely incomplete, should be of some use for maneuvering the terrain of digital astroturfing. But how exactly can digital astroturfing be explored empirically? After all, digital astroturfing is a clandestine activity, and if it is carried out successfully, we do not know that it has taken place. This makes any kind of research inherently challenging. For example, research designs that are routinely used in the study of other forms of political communication can be impossible to implement in the study of digital astroturfing, since very basic facts, such as who is doing what in the pursuit of which strategic goals, are, by definition, absent in digital astroturfing. Even though it undoubtedly poses unique challenges, the empirical study of digital astroturfing is not futile.

A first step in the study of digital astroturfing is the establishment of a plausible conceptual framework of digital astroturfing. The very goal of our study is to contribute to this first step. In order to conduct empirical research, we first need a sound understanding of how to think about our object of study. In this sense, we do not think that research on digital astroturfing should be exploratory in nature, as is sometimes suggested for research on regular astroturfing (Boulay, 2013).

Some important recent research efforts in the area of digital astroturfing are focusing on so-called "social bots" (Bastos & Mercea, 2017; Ferrara et al., 2016; Woolley, 2016). Social bots represent one digital astroturfing tool (automated sock puppets) used within one venue (social media). As such, they most certainly warrant scientific scrutiny. However, it is important to note that social bots probably represent the low-hanging fruit of digital astroturfing. Social bots are a relatively crude form of digital astroturfing, and because of their highly automated nature, they are relatively easy to detect. Furthermore, and perhaps just as importantly, the less sophisticated a digital astroturfing effort, the less probable it is that the effort will have the impact intended by the initiating actor behind the digital astroturfing effort. Not only could there be no effect, but there could actually be a negative effect (from the point of view of the initiating actor): When people become aware of persuasion attempts and thus develop persuasion knowledge, they tend not only to not be persuaded, but to actually react negatively (Friestad & Wright, 1994).
69–85 journalistic work is, of course, very valu- are summarized in Table 2 are suitable for able, but a more scientifically vigorous this, social media services and news orga- analysis of digital astroturfing cases is still nizations that have comment sections on necessary, not least because of the need their websites. Both social media services to ground case study research in a sound and news organizations should, in princi- conceptual framework. An obvious dis- ple, be interested in reducing, or at least in advantage of case studies is the fact that understanding digital astroturfing on their the cases that can be analyzed cannot respective platforms. The specific nature be freely selected – only those instances of the collaboration can be twofold. First, of digital astroturfing that have been re- the combined effort of researchers and vealed to be digital astroturfing are avail- operators of social media or of news plat- able for case study analysis. This a priori forms can be focused on the detection of selection could potentially introduce bias, digital astroturfing. The second possible because observed digital astroturfing is kind of collaboration is the implementa- perhaps different from unobserved digital tion and monitoring of countermeasures astroturfing. However, this potential bias against digital astroturfing. For this kind is not an insurmountable problem. Giv- of collaboration, greater weight should en a conceptual foundation that consists, be given to the implementation of incen- among other things, of a typology and a tivizing countermeasures than to restric- reasonable expectation of digital astro- tive ones. Restrictive countermeasures turfing repertoires, the potential selection are mostly a technical affair with poten- bias in case study research can be actively tially big downsides, while incentivizing addressed. For example, if case study re- countermeasures involve not only tech- search consistently failed to identify a cer- nical innovation, but observation of and tain type of digital astroturfing that could interaction with users as well. Restrictive indicate that that type of digital astroturf- countermeasures carry the promise of ing is very successful at remaining hidden. relatively fast short-term solutions, but in Another possible research strategy order to effectively combat digital astro- consists of surveys and interviews of polit- turfing in the long term, we believe that ical actors. It might seem naive to suggest innovations in the form of incentivizing to simply ask political actors whether they countermeasures are inevitable. engage in digital astroturfing. However, it is not at all unheard of for political actors to disclose sensitive information for scientif- References ic purposes as long as they are guaranteed anonymity. But talking to political actors Anderson, A. A., Brossard, D., Scheufele, D. A., about digital astroturfing does not have Xenos, M. A., & Ladwig, P. (2014). The “Nas- to produce a direct “admission of guilt” ty Effect”: Online incivility and risk per- to be useful. For example, the perception ceptions of emerging technologies. Journal of digital astroturfing by political actors is of Computer-Mediated Communication, valuable data, because it could indicate 19(3), 373–387. doi:10.1111/jcc4.12 009. how prevalent a phenomenon digital as- Apfl, S., & Kleiner, S. (2014). Die Netzflüsterer troturfing is. Talking to people who are [The net whisperers]. DATUM, 2014(11). 
There are several empirical research strategies that can be applied to the study of digital astroturfing. A very important one is case studies. To date, most research on cases of digital astroturfing is done by investigative journalists. Such journalistic work is, of course, very valuable, but a more scientifically rigorous analysis of digital astroturfing cases is still necessary, not least because of the need to ground case study research in a sound conceptual framework. An obvious disadvantage of case studies is the fact that the cases that can be analyzed cannot be freely selected – only those instances of digital astroturfing that have been revealed to be digital astroturfing are available for case study analysis. This a priori selection could potentially introduce bias, because observed digital astroturfing is perhaps different from unobserved digital astroturfing. However, this potential bias is not an insurmountable problem. Given a conceptual foundation that consists, among other things, of a typology and a reasonable expectation of digital astroturfing repertoires, the potential selection bias in case study research can be actively addressed. For example, if case study research consistently failed to identify a certain type of digital astroturfing, that could indicate that this type of digital astroturfing is very successful at remaining hidden.

Another possible research strategy consists of surveys and interviews of political actors. It might seem naive to suggest simply asking political actors whether they engage in digital astroturfing. However, it is not at all unheard of for political actors to disclose sensitive information for scientific purposes as long as they are guaranteed anonymity. But talking to political actors about digital astroturfing does not have to produce a direct “admission of guilt” to be useful. For example, the perception of digital astroturfing by political actors is valuable data, because it could indicate how prevalent a phenomenon digital astroturfing is. Talking to people who are potentially involved in or knowledgeable about digital astroturfing does not have to be limited to political actors. Communication professionals in the business of political consulting are also willing to talk about their line of work, given a sufficient layer of anonymity (e.g., Serazio, 2014).

A third promising research strategy is the direct collaboration with venues of digital astroturfing. Two of the venues that are summarized in Table 2 are suitable for this: social media services and news organizations that have comment sections on their websites. Both social media services and news organizations should, in principle, be interested in reducing, or at least in understanding, digital astroturfing on their respective platforms. The specific nature of the collaboration can be twofold. First, the combined effort of researchers and operators of social media or of news platforms can be focused on the detection of digital astroturfing. The second possible kind of collaboration is the implementation and monitoring of countermeasures against digital astroturfing. For this kind of collaboration, greater weight should be given to the implementation of incentivizing countermeasures than to restrictive ones. Restrictive countermeasures are mostly a technical affair with potentially big downsides, while incentivizing countermeasures involve not only technical innovation, but observation of and interaction with users as well. Restrictive countermeasures carry the promise of relatively fast short-term solutions, but in order to effectively combat digital astroturfing in the long term, we believe that innovations in the form of incentivizing countermeasures are inevitable.
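For the second kind of collaboration, the implementation and monitoring of countermeasures, even a very simple shared data structure can be useful. The following sketch is hypothetical: the field names and the rollout scenario are invented for illustration, and a plain before/after comparison of this kind is descriptive monitoring rather than a causal estimate. It computes how the share of comments from accounts flagged as suspicious changes after a platform introduces an incentivizing countermeasure such as optional verification; the flagging itself is assumed to come from a separate detection step.

from dataclasses import dataclass

@dataclass
class Comment:
    author_id: str
    posted_after_rollout: bool
    flagged_as_suspicious: bool

def flagged_share(comments, after_rollout):
    """Share of comments from flagged accounts, before or after the rollout."""
    subset = [c for c in comments if c.posted_after_rollout == after_rollout]
    if not subset:
        return 0.0
    return sum(c.flagged_as_suspicious for c in subset) / len(subset)

def rollout_effect(comments):
    """Descriptive change in the flagged share after vs. before the rollout."""
    return flagged_share(comments, True) - flagged_share(comments, False)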

References

Anderson, A. A., Brossard, D., Scheufele, D. A., Xenos, M. A., & Ladwig, P. (2014). The “Nasty Effect”: Online incivility and risk perceptions of emerging technologies. Journal of Computer-Mediated Communication, 19(3), 373–387. doi:10.1111/jcc4.12009.
Apfl, S., & Kleiner, S. (2014). Die Netzflüsterer [The net whisperers]. DATUM, 2014(11). Retrieved from https://datum.at/die-netzfluesterer/.
Bastos, M. T., & Mercea, D. (2017). The Brexit botnet and user-generated hyperpartisan news. Social Science Computer Review. Advance online publication, 1–18. doi:10.1177/0894439317734157.
Bessi, A., & Ferrara, E. (2016). Social bots distort the 2016 U.S. Presidential election online discussion. First Monday, 21(11). Retrieved from http://firstmonday.org/article/view/7090/5653.
Biggs, M. (2013). How repertoires evolve: The diffusion of suicide protest in the twentieth century. Mobilization: An International Quarterly, 18(4), 407–428. doi:10.17813/maiq.18.4.njnu779530x55082.
Boulay, S. (2012). Quelle(s) considération(s) pour l’éthique dans l’usage des technologies d’information et de communication en relations publiques? Analyse de cas d’astroturfing et réflexion critique [What consideration(s) for ethics in the use of information and communication technologies in public relations? An analysis of astroturfing cases and a critical reflection]. Revista Internacional de Relaciones Públicas, 2(4), 201–220.
Boulay, S. (2013). Can methodological requirements be fulfilled when studying concealed or unethical research objects? The case of astroturfing. ESSACHESS – Journal for Communication Studies, 6(2/12), 177–187.
Boyd, D. M., & Ellison, N. B. (2007). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), 210–230. doi:10.1111/j.1083-6101.2007.00393.x.
Bu, Z., Xia, Z., & Wang, J. (2013). A sock puppet detection algorithm on virtual spaces. Knowledge-Based Systems, 37, 366–377. doi:10.1016/j.knosys.2012.08.016.
Buchstein, H. (1997). Bytes that bite: The Internet and deliberative democracy. Constellations, 4(2), 248–263. doi:10.1111/1467-8675.00052.
Canter, L. (2013). The misconception of online comment threads. Journalism Practice, 7(5), 604–619. doi:10.1080/17512786.2012.740172.
Chen, A. (June 2, 2015). The agency. The New York Times. Retrieved from http://www.nytimes.com/2015/06/07/magazine/the-agency.html.
Chen, C., Wu, K., Srinivasan, V., & Zhang, X. (2011). Battling the Internet water army: Detection of hidden paid posters. arXiv:1111.4297 [cs].
Clark, D. B. (April 21, 2015). The bot bubble: How click farms have inflated social media currency. The New Republic. Retrieved from https://newrepublic.com/article/121551/bot-bubble-click-farms-have-inflated-social-media-currency.
Dahlberg, L. (2001). The Internet and democratic discourse: Exploring the prospects of online deliberative forums extending the public sphere. Information, Communication & Society, 4(4), 615–633. doi:10.1080/13691180110097030.
Deibert, R., Palfrey, J., Rohozinski, R., & Zittrain, J. (Eds.) (2008). Access denied: The practice and policy of global Internet filtering. The information revolution and global politics. Cambridge, MA: MIT Press.
Della Porta, D. (2013). Repertoires of contention. In D. A. Snow, D. Della Porta, B. Klandermans, & D. McAdam (Eds.), The Wiley-Blackwell encyclopedia of social and political movements. Oxford, UK: Blackwell. doi:10.1002/9780470674871.wbespm178.
DePaulo, B. M., Kashy, D. A., Kirkendol, S. E., Wyer, M. M., & Epstein, J. A. (1996). Lying in everyday life. Journal of Personality and Social Psychology, 70(5), 979–995. doi:10.1037/0022-3514.70.5.979.
Dijck, J. v. (2013). “You have one identity”: Performing the self on Facebook and LinkedIn. Media, Culture & Society, 35(2), 199–215. doi:10.1177/0163443712468605.
Erat, S., & Gneezy, U. (2011). White lies. Management Science, 58(4), 723–733. doi:10.1287/mnsc.1110.1449.
Facebook (2015). What is a verified page or profile? Retrieved from https://www.facebook.com/help/196050490547892.
Ferrara, E. (2017). Disinformation and social bot operations in the run up to the 2017 French presidential election. First Monday, 22(8). Retrieved from https://arxiv.org/ftp/arxiv/papers/1707/1707.00086.pdf.
Fielding, N., & Cobain, I. (March 17, 2011). Revealed: US spy operation that manipulates social media. The Guardian. Retrieved from http://www.theguardian.com/technology/2011/mar/17/us-spy-operation-social-networks.
Finley, K. (October 8, 2015). A brief history of the end of the comments. WIRED. Retrieved from http://www.wired.com/2015/10/brief-history-of-the-demise-of-the-comments-timeline/.
Formisano, R. P. (2012). The Tea Party: A brief history. Baltimore, MD: Johns Hopkins University Press.
Fredheim, R., Moore, A., & Naughton, J. (2015). Anonymity and online commenting: An empirical study (SSRN Scholarly Paper No. 2591299). Rochester, NY: Social Science Research Network.

Geer, D. (2005). Malicious bots threaten network security. Computer, 38(1), 18–20. doi:10.1109/MC.2005.26.
Giles, J. (2011). Zeroing in on the real you. New Scientist, 212(2836), 50–53. doi:10.1016/S0262-4079(11)62669-9.
Grover, S. L. (1993). Lying, deceit, and subterfuge: A model of dishonesty in the workplace. Organization Science, 4(3), 478–495. doi:10.1287/orsc.4.3.478.
Hardaker, C. (2010). Trolling in asynchronous computer-mediated communication: From user discussions to academic definitions. Journal of Politeness Research. Language, Behaviour, Culture, 6(2), 215–242. doi:10.1515/JPLR.2010.011.
Hart, S. W., & Klink, M. C. (2017). 1st Troll Battalion: Influencing military and strategic operations through cyber-personas. In 2017 International Conference on Cyber Conflict (CyCon U.S.) (pp. 97–104). doi:10.1109/CYCONUS.2017.8167503.
Hille, S., & Bakker, P. (2014). Engaging the social news user. Journalism Practice, 8(5), 563–572. doi:10.1080/17512786.2014.899758.
Howard, P. N. (2006). New media campaigns and the managed citizen. Communication, society and politics. Cambridge: Cambridge University Press.
Jiang, M. (2016). Managing the micro-self: The governmentality of real name registration policy in Chinese microblogosphere. Information, Communication & Society, 19(2), 202–220. doi:10.1080/1369118X.2015.1060723.
Jönsson, A. M., & Örnebring, H. (2011). User-generated content and the news. Journalism Practice, 5(2), 127–144. doi:10.1080/17512786.2010.501155.
Kaplan, A. M., & Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of social media. Business Horizons, 53(1), 59–68. doi:10.1016/j.bushor.2009.09.003.
King, G., Pan, J., & Roberts, M. E. (2017). How the Chinese government fabricates social media posts for strategic distraction, not engaged argument. American Political Science Review, 111(3), 484–501. doi:10.1017/S0003055417000144.
Klotz, R. J. (2007). Internet campaigning for grassroots and astroturf support. Social Science Computer Review, 25(1), 3–12. doi:10.1177/0894439306289105.
Lee, E.-J. (2012). That’s not the way it is: How user-generated comments on the news affect perceived media bias. Journal of Computer-Mediated Communication, 18(1), 32–45. doi:10.1111/j.1083-6101.2012.01597.x.
Lingel, J., & Golub, A. (2015). In face on Facebook: Brooklyn’s drag community and sociotechnical practices of online communication. Journal of Computer-Mediated Communication, 20(5), 536–553. doi:10.1111/jcc4.12125.
Llewellyn, C., Cram, L., Favero, A., & Hill, R. L. (2018). For whom the bell trolls: Troll behaviour in the Twitter Brexit debate. arXiv:1801.08754 [cs]. Retrieved from http://arxiv.org/abs/1801.08754.
Lyon, T. P., & Maxwell, J. W. (2004). Astroturf: Interest group lobbying and corporate strategy. Journal of Economics & Management Strategy, 13(4), 561–597. doi:10.1111/j.1430-9134.2004.00023.x.
MacFarquhar, N. (February 18, 2018). Inside the Russian troll factory: Zombies and a breakneck pace. The New York Times. Retrieved from https://www.nytimes.com/2018/02/18/world/europe/russia-troll-factory.html.
Mackie, G. (2009). Astroturfing infotopia. Theoria: A Journal of Social and Political Theory, 56(119), 30–56.
MacKinnon, R., & Zuckerman, E. (2012). Don’t feed the trolls. Index on Censorship, 41(4), 14–24. doi:10.1177/0306422012467413.
McCornack, S. A., & Levine, T. R. (1990). When lies are uncovered: Emotional and relational outcomes of discovered deception. Communication Monographs, 57(2), 119–138. doi:10.1080/03637759009376190.
National Intelligence Council. (2017). Assessing Russian activities and intentions in recent US elections (Intelligence Community Assessment). Washington, DC. Retrieved from https://www.dni.gov/files/documents/ICA_2017_01.pdf.
Neuhaus, C. (September 2, 2015). Hans-Peter Portmanns Twitter-Wunder: 11 000 Freunde sollt ihr sein [Hans-Peter Portmann’s Twitter miracle: 11,000 friends you shall be]. Neue Zürcher Zeitung.

Qiang, X. (2011). The battle for the Chinese Internet. Journal of Democracy, 22(2), 47–61. doi:10.1353/jod.2011.0020.
Rheingold, H. (1993). The virtual community: Homesteading on the electronic frontier. Cambridge, MA: MIT Press.
Rogers, E. M. (2003). Diffusion of innovations. New York: Free Press.
Serazio, M. (2014). The new media designs of political consultants: Campaign production in a fragmented era. Journal of Communication, 64(4), 743–763. doi:10.1111/jcom.12078.
Shao, C., Ciampaglia, G. L., Varol, O., Flammini, A., & Menczer, F. (2017). The spread of fake news by social bots. arXiv:1707.07592 [physics]. Retrieved from http://arxiv.org/abs/1707.07592.
Stieglitz, S., Brachten, F., Berthelé, D., Schlaus, M., Venetopoulou, C., & Veutgen, D. (2017). Do social bots (still) act different to humans? – Comparing metrics of social bots with those of humans. In G. Meiselwitz (Ed.), Social computing and social media. Human behavior (pp. 379–395). Cham, Switzerland: Springer. doi:10.1007/978-3-319-58559-8_30.
Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321–326. doi:10.1089/1094931041291295.
Sullivan, M. (April 28, 2014). How “verified commenters” earn their status at The Times, and why. The New York Times. Retrieved from http://publiceditor.blogs.nytimes.com/2014/04/28/how-verified-commenters-earn-their-status-at-the-times-and-why/.
Tarrow, S. G. (2011). Power in movement: Social movements and contentious politics. Cambridge, MA: Cambridge University Press.
Thaler, R. H., & Sunstein, C. R. (2009). Nudge: Improving decisions about health, wealth, and happiness. New York: Penguin.
Twitter PublicPolicy. (2018). Update on Twitter’s review of the 2016 U.S. election. Retrieved from https://blog.twitter.com/official/en_us/topics/company/2018/2016-election-update.html.
Vergani, M. (2014). Rethinking grassroots campaigners in the digital media: The “grassroots orchestra” in Italy. Australian Journal of Political Science, 49(2), 237–251. doi:10.1080/10361146.2014.898129.
Walker, E. T. (2014). Grassroots for hire: Public affairs consultants in American democracy. Cambridge, MA: Cambridge University Press.
Walker, E. T., & Rea, C. M. (2014). The political mobilization of firms and industries. Annual Review of Sociology, 40(1), 281–304. doi:10.1146/annurev-soc-071913-043215.
Walker, J. L. (1983). The origins and maintenance of interest groups in America. The American Political Science Review, 77(2), 390–406. doi:10.2307/1958924.
Wauters, E., Donoso, V., & Lievens, E. (2014). Optimizing transparency for users in social networking sites. info, 16(6), 8–23. doi:10.1108/info-06-2014-0026.
Weber, M. (1922). Wirtschaft und Gesellschaft [Economy and society]. Tübingen, Germany: Mohr.
Woolley, S. C. (2016). Automating power: Social bot interference in global politics. First Monday, 21(4).
Yoo, K.-H., & Gretzel, U. (2009). Comparison of deceptive and truthful travel reviews. In W. Höpken, U. Gretzel, & R. Law (Eds.), Information and communication technologies in tourism 2009 (pp. 37–47). Vienna: Springer. doi:10.1007/978-3-211-93971-0_4.
Zhang, J., Carpenter, D., & Ko, M. (2013). Online astroturfing: A theoretical perspective. AMCIS 2013 Proceedings.
Zheng, X., Zeng, Z., Chen, Z., Yu, Y., & Rong, C. (2015). Detecting spammers on social networks. Neurocomputing, 159, 27–34. doi:10.1016/j.neucom.2015.02.047.