Digital Astroturfing


Digital Astroturfing: Definition, typology, and countermeasures

Marko Kovic,* Adrian Rauchfleisch,† and Marc Sele‡
ZIPAR - Zurich Institute of Public Affairs Research, Zurich, Switzerland

First draft: December 22, 2015. This version: October 12, 2016.

*[email protected]  †adrian.rauchfl[email protected]  ‡[email protected]

Abstract

In recent years, several instances of political actors creating fake grassroots activity on the Internet have been uncovered. We propose to call such fake online grassroots activity digital astroturfing. In this paper, we lay out a conceptual map of the phenomenon of digital astroturfing. To that end, we first define digital astroturfing as a form of manufactured, deceptive and strategic top-down activity on the Internet initiated by political actors that mimics bottom-up activity by autonomous individuals. Next, we explore a typology of digital astroturfing along three dimensions: the target of digital astroturfing, the political actors who engage in it, and the goals of the activity. Following the discussion of our proposed typology, we introduce the concept of digital astroturfing repertoires, the possible combinations of tools, venues and actions used for digital astroturfing efforts. Finally, we discuss how to prevent or curb digital astroturfing by implementing certain restrictive or incentivizing countermeasures. The main use of this conceptual study is to serve as a basis for future empirical work. Even though empirical research on digital astroturfing is inherently difficult, since digital astroturfing is a clandestine activity, it is not impossible. We suggest some viable research strategies.

1 Introduction: The many faces of digital astroturfing

On August 18, 2015, Lyudmila Savchuk won a lawsuit against her former employer, the St. Petersburg based “Internet Research Agency”¹. Formally, Ms. Savchuk filed the lawsuit against her former employer because of irregularities concerning the employment contract and the payment of wages. Such a dispute between an employee and their employer is a rather mundane affair, but Ms. Savchuk’s case was quite remarkable for another reason. The Internet Research Agency is an organization that, according to different sources – Ms. Savchuk, who is an investigative journalist, being one of them – specialized in a specific form of propaganda² on behalf of the Russian government under president Vladimir Putin: the Internet Research Agency’s employees would create a number of fake online personæ, sometimes referred to metaphorically as sock puppets (Bu, Xia, & Wang, 2013), and spread messages supportive of the Russian government across different channels, such as the comment sections of news websites (A. Chen, 2015; France-Presse, 2015; Khazan, 2013).

¹ “Агентство интернет-исследований”
² We understand propaganda as a form of persuasive political communication which does not appeal to reason and argument but rather to emotion by means of manipulation through symbols (Jowett & O’Donnell, 2014; Lasswell, 1927).

The case of the Internet Research Agency has been prominently discussed thanks to thorough news reporting, but it is far from the only occurrence of online activities that are not genuine but rather created by political actors. For example, it has been observed for a number of years that the Chinese government pays users to promote pro-government standpoints in online discussions.
The Chinese government’s activity in that respect has become so notorious that the paid users are sometimes referred to as the “50 Cent army” (Cook, 2011; Hung, 2010) or the “Internet water army”³ (C. Chen, Wu, Srinivasan, & Zhang, 2011). Dishonest and manipulative online activities are not limited to (former) Communist countries. A prominent Western example of such activities has been the United States Central Command’s attempt at creating an “online persona management service”, i.e., a system for creating and deploying sock puppets in online communication flows (Fielding & Cobain, 2011). It is not always governments that engage in such activities; other actors do so as well, often on a smaller scale. For example, in 2014 it came to light that a number of Austrian companies and political parties had hired a marketing agency to create user comments on news websites that cast them in a positive light (Apfl & Kleiner, 2014). Some instances of dishonest online propaganda verge on the hapless, such as a Swiss politician’s attempt at radiating popularity by paying for followers on the social media service Twitter (Neuhaus, 2015).

³ The name “Internet water army” is a metaphor for the large number of users who “flood” online discussions with pro-government opinions.

The cases mentioned above are quite different from each other: different political actors in different countries do different things with, presumably, different goals. However, all of these cases also have something in common: they are all examples of a type of political communication on the Internet that hides its true nature and that is intended to give the impression of being the honest opinion of individual online users when it is, in fact, not. We call this type of political online communication digital astroturfing. The term “astroturfing” is usually used to describe “fake grassroots” campaigns and organizations backed by businesses or political actors (Lyon & Maxwell, 2004; E. T. Walker & Rea, 2014); the term is an allusion to the “AstroTurf” brand of synthetic turf. With the concept of digital astroturfing, we aim to offer an understanding of the general principle of astroturfing for the specific domain of Internet-based activities.

The goal of the present paper is threefold. First, we define digital astroturfing in a generalized, universally applicable manner. Second, we propose a typology of different forms of digital astroturfing, based on the type of actor and the type of goal pursued by the digital astroturfing effort. Third, we discuss possible countermeasures to either prevent digital astroturfing or to lessen its impact.

2 A generalized definition of digital astroturfing

In the previous section, we provisionally introduced digital astroturfing as a type of political online activity that pretends to be the activity of individual, regular online users, when it is in fact activity created by political actors. Such a provisional definition already describes the nature of digital astroturfing to some degree, but it is derived from an intuitive inductive process of describing what different cases of digital astroturfing have in common. A more precise deductive definition is needed in order to describe digital astroturfing in a generalized way, independent from specific singular occurrences of digital astroturfing.
To that end, we propose the following definition: Digital astroturfing is a form of manufactured, deceptive and strategic top-down activity on the Internet initiated by political actors that mimics bottom-up activity by autonomous individuals.

In this definition, digital astroturfing as a form of political activity consists of five necessary conditions: it takes place on the Internet, it is initiated by political actors, it is manufactured, it is deceptive, and it is strategic. Digital astroturfing is manufactured in that it is not an honest expression of autonomous individual opinions, but rather activity created to mimic such honest expression. It is deceptive, because the goal of digital astroturfing is to trick the audience, usually the public at large, into believing that the manufactured activity is real. It is strategic, because the political actors who engage in digital astroturfing pursue certain goals in doing so. This definition is relatively simple, but in simplicity, we believe, lies analytical clarity.

So far, only a few authors have attempted to define digital astroturfing. The most relevant such attempt is a recent conceptual study by Zhang, Carpenter, and Ko (2013, p. 3). In that study, the authors define online astroturfing (the authors use the term “online” rather than “digital” astroturfing) as follows:

    Therefore, we define online astroturfing as the dissemination of deceptive opinions by imposters posing as autonomous individuals on the Internet with the intention of promoting a specific agenda. [Italics in the original.]

This definition is similar to the one we introduce in three respects. First, we have adopted the authors’ very apt notion of “autonomous individuals” into our own definition because it describes very precisely what kind of deception is taking place. We expand the descriptive notion of autonomous individuals by contrasting bottom-up and top-down activity: autonomous individuals engage in bottom-up activity, and digital astroturfing is top-down activity pretending to be bottom-up activity. Second, and this is non-trivial, the authors define digital astroturfing as an activity that is taking place on the Internet. And third, they describe digital astroturfing as a deceptive activity. We agree with those three aspects as necessary conditions, but the definition lacks accuracy and generalizability. The definition is inaccurate because it implies that the relevant actors are the “imposters” posing as autonomous individuals. However, the truly relevant actors are not the “imposters” themselves, but rather the political actors initiating the astroturfing efforts. The “imposters”, such as, for example, the employees of the Internet Research Agency in St. Petersburg, are simply performing tasks on behalf of
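The definition above is a conjunction: all five conditions are necessary, and no single one is sufficient. Purely as an illustration of that logical structure (the class and function names below are our own, not part of the paper), the definition can be encoded as a simple predicate:

    from dataclasses import dataclass

    @dataclass
    class OnlineActivity:
        """Illustrative model of a political online activity."""
        on_internet: bool          # takes place on the Internet
        political_initiator: bool  # initiated by a political actor
        manufactured: bool         # created to mimic autonomous individual opinion
        deceptive: bool            # meant to trick the audience into believing it is real
        strategic: bool            # pursues the initiating actor's goals

    def is_digital_astroturfing(activity: OnlineActivity) -> bool:
        # All five necessary conditions must hold simultaneously.
        return all((activity.on_internet, activity.political_initiator,
                    activity.manufactured, activity.deceptive, activity.strategic))

Under this reading, an honest partisan blogger fails the manufactured and deceptive conditions, while a government-run sock-puppet campaign satisfies all five.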
Recommended publications
  • An Examination of the Impact of Astroturfing on Nationalism: A Persuasion Knowledge Perspective
    Article, Social Sciences. Kenneth M. Henrie (Stiller School of Business, Champlain College, Burlington, VT 05401, USA; correspondence: [email protected]; Tel.: +1-802-865-8446) and Christian Gilde (Department of Business and Technology, University of Montana Western, Dillon, MT 59725, USA; [email protected]). Received: 7 December 2018; Accepted: 23 January 2019; Published: 28 January 2019.

    Abstract: One communication approach that has lately become more common is astroturfing, which has grown more prominent since the proliferation of social media platforms. In this context, astroturfing is a fake grass-roots political campaign that aims to manipulate a certain audience. This exploratory research examined how effective astroturfing is in mitigating citizens’ natural defenses against politically persuasive messages. An experimental method was used to examine the persuasiveness of social media messages related to coal energy, their ability to persuade citizens, and their capacity to increase citizens’ level of nationalism. The results suggest that citizens exposed to an astroturfed message are more likely to be persuaded than those exposed to a non-astroturfed message, regardless of their political leanings. However, the messages were not successful in altering an individual’s nationalistic views at the moment of exposure. The authors discuss these findings and propose how, in a long-term context, astroturfing is a dangerous addition to persuasive communication.

    Keywords: astroturfing; persuasion knowledge; nationalism; coal energy; social media
  • Political Astroturfing Across the World
    Political astroturfing across the world. Franziska B. Keller (Hong Kong University of Science and Technology), David Schoch (The University of Manchester), Sebastian Stier (GESIS – Leibniz Institute for the Social Sciences), JungHwan Yang (University of Illinois at Urbana-Champaign). Paper prepared for the Harvard Disinformation Workshop.

    1 Introduction

    At the very latest since the Russian Internet Research Agency’s (IRA) intervention in the U.S. presidential election, scholars and the broader public have become wary of coordinated disinformation campaigns. These hidden activities aim to deteriorate the public’s trust in electoral institutions or the government’s legitimacy, and can exacerbate political polarization. But unfortunately, academic and public debates on the topic are haunted by conceptual ambiguities, and rely on few memorable examples, epitomized by the often cited “social bots” who are accused of having tried to influence public opinion in various contemporary political events. In this paper, we examine “political astroturfing,” a particular form of centrally coordinated disinformation campaign in which participants pretend to be ordinary citizens acting independently (Kovic, Rauchfleisch, Sele, & Caspar, 2018). While the accounts associated with a disinformation campaign may not necessarily spread incorrect information, they deceive the audience about the identity and motivation of the person manning the account. And even though social bots have been the focus of most academic research (Ferrara, Varol, Davis, Menczer, & Flammini, 2016; Stella, Ferrara, & De Domenico, 2018), seemingly automated accounts make up only a small part – if any – of most astroturfing campaigns. For instigators of astroturfing, relying exclusively on social bots is not a promising strategy, as humans are good at detecting low-quality information (Pennycook & Rand, 2019). On the other hand, many bots detected by state-of-the-art social bot detection methods are not part of an astroturfing campaign, but unrelated non-political spam bots.
  • Freedom on the Net 2016
    FREEDOM ON THE NET 2016: China

    Population: 1.371 billion
    Internet Penetration 2015 (ITU): 50 percent
    Social Media/ICT Apps Blocked: Yes
    Political/Social Content Blocked: Yes
    Bloggers/ICT Users Arrested: Yes
    Press Freedom 2016 Status: Not Free

    Scores (0 = most free, 100 = least free):       2015      2016
    Internet Freedom Status                         Not Free  Not Free
    Obstacles to Access (0-25)                      18        18
    Limits on Content (0-35)                        30        30
    Violations of User Rights (0-40)                40        40
    TOTAL (0-100)                                   88        88

    Key Developments: June 2015 – May 2016

    • A draft cybersecurity law could step up requirements for internet companies to store data in China, censor information, and shut down services for security reasons, under the auspices of the Cyberspace Administration of China (see Legal Environment).
    • An antiterrorism law passed in December 2015 requires technology companies to cooperate with authorities to decrypt data, and introduced content restrictions that could suppress legitimate speech (see Content Removal and Surveillance, Privacy, and Anonymity).
    • A criminal law amendment effective since November 2015 introduced penalties of up to seven years in prison for posting misinformation on social media (see Legal Environment).
    • Real-name registration requirements were tightened for internet users, with unregistered mobile phone accounts closed in September 2015, and app providers instructed to register and store user data in 2016 (see Surveillance, Privacy, and Anonymity).
    • Websites operated by the South China Morning Post, The Economist and Time magazine were among those newly blocked for reporting perceived as critical of President Xi Jinping (see Blocking and Filtering).

    Introduction: China was the world’s worst abuser of internet freedom in the 2016 Freedom on the Net survey for the second consecutive year.
  • What Is Astroturfing? (churchofgodbigsandy.com)
    What Is Astroturfing? The following information is from the “Edifying the Body” section of the Church of God Big Sandy’s website, churchofgodbigsandy.com. It was posted for the weekend of Dec. 10, 2016. Compiled by Dave Havir.

    A definition titled “Astroturf” was posted at powerbase.info. Following is an excerpt of the definition.

    “Astroturf” refers to grassroots groups or coalitions which are actually fake; often created or heavily funded by corporations, public relations firms, industry trade associations, and political interests. Astroturfing is used by organizations to give the illusion of genuine public support to their cause, manufacturing public opinion in what some commentators have called “democracy for hire.” As a deceptive use of third-party technique, astroturfing can be considered a form of propaganda. Unlike genuine grassroots activism, which tends to be people-rich but cash-poor, astroturf activism is normally people-poor but cash-rich. Astroturf campaigns work by recruiting the support of less-informed activists and individuals to their cause, often by means of deception. Astroturfing can mislead the public into believing that the views of the astroturfer are mainstream and that widespread genuine support actually exists, when in most cases it does not. Deceptive astroturf campaigns are thus most likely to occur where the interests of wealthy or powerful interests come into conflict with the interests of the public.

    A definition titled “Astroturfing” was posted at weebly.com. Following is an excerpt of the definition.

    “Astroturfing” denotes political, advertising, or public relations campaigns that are formally planned by an organization, but are disguised as spontaneous, popular “grassroots” behavior.
  • Battling the Internet Water Army: Detection of Hidden Paid Posters
    Battling the Internet Water Army: Detection of Hidden Paid Posters. Cheng Chen, Kui Wu, Venkatesh Srinivasan (Dept. of Computer Science, University of Victoria, Victoria, BC, Canada); Xudong Zhang (Dept. of Computer Science, Peking University, Beijing, China).

    Abstract: We initiate a systematic study to help distinguish a special group of online users, called hidden paid posters, or termed “Internet water army” in China, from the legitimate ones. On the Internet, the paid posters represent a new type of online job opportunity. They get paid for posting comments and new threads or articles on different online communities and websites for some hidden purposes, e.g., to influence the opinion of other people towards certain social events or business markets. Though an interesting strategy in business marketing, paid posters may create a significant negative effect on the online communities, since the information from paid posters is usually not trustworthy. When two competitive companies hire paid [...]

    [...] on different online communities and websites. Companies are always interested in effective strategies to attract public attention towards their products. The idea of online paid posters is similar to word-of-mouth advertisement. If a company hires enough online users, it would be able to create hot and trending topics designed to gain popularity. Furthermore, the articles or comments from a group of paid posters are also likely to capture the attention of common users and influence their decision. In this way, online paid posters present a powerful and efficient strategy for companies.
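    The excerpt ends before describing the detection method itself. As a rough illustration of the general approach of separating paid posters from legitimate users by their behavioral statistics (the feature set and classifier below are our assumptions, not the ones used by Chen et al.), one might train a standard classifier on per-user activity features:

        from dataclasses import dataclass
        from sklearn.ensemble import RandomForestClassifier

        @dataclass
        class UserStats:
            posts_per_day: float       # paid posters tend to post in bursts
            reply_ratio: float         # share of posts that are replies to others
            median_gap_minutes: float  # median time between consecutive posts
            active_days: int           # paid accounts are often short-lived

        def to_features(u: UserStats) -> list[float]:
            return [u.posts_per_day, u.reply_ratio, u.median_gap_minutes, u.active_days]

        def train_detector(labeled: list[tuple[UserStats, int]]) -> RandomForestClassifier:
            """Fit a classifier on (stats, label) pairs, where label 1 = paid poster."""
            X = [to_features(u) for u, _ in labeled]
            y = [label for _, label in labeled]
            return RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    Any such sketch stands or falls with the labeled training data; in practice, ground truth about who is a paid poster is exactly what is hard to obtain.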
  • Introduction: What Is Computational Propaganda?
    Introduction: What Is Computational Propaganda?

    Digital technologies hold great promise for democracy. Social media tools and the wider resources of the Internet offer tremendous access to data, knowledge, social networks, and collective engagement opportunities, and can help us to build better democracies (Howard, 2015; Margetts et al., 2015). Unwelcome obstacles are, however, disrupting the creative democratic applications of information technologies (Woolley, 2016; Gallacher et al., 2017; Vosoughi, Roy, & Aral, 2018). Massive social platforms like Facebook and Twitter are struggling to come to grips with the ways their creations can be used for political control. Social media algorithms may be creating echo chambers in which public conversations get polluted and polarized. Surveillance capabilities are outstripping civil protections. Political “bots” (software agents used to generate simple messages and “conversations” on social media) are masquerading as genuine grassroots movements to manipulate public opinion. Online hate speech is gaining currency. Malicious actors and digital marketers run junk news factories that disseminate misinformation to harm opponents or earn click-through advertising revenue.

    It is no exaggeration to say that coordinated efforts are even now working to seed chaos in many political systems worldwide. Some militaries and intelligence agencies are making use of social media as conduits to undermine democratic processes and bring down democratic institutions altogether (Bradshaw & Howard, 2017). Most democratic governments are preparing their legal and regulatory responses. But unintended consequences from over-regulation, or regulation uninformed by systematic research, may be as damaging to democratic systems as the threats themselves.

    We live in a time of extraordinary political upheaval and change, with political movements and parties rising and declining rapidly (Kreiss, 2016; Anstead, 2017).
  • Complex Perceptions of Social Media Misinformation in China
    The Government’s Dividend: Complex Perceptions of Social Media Misinformation in China. Zhicong Lu (University of Toronto), Yue Jiang (University of Maryland), Cheng Lu (University of Toronto), Mor Naaman (Cornell Tech), Daniel Wigdor (University of Toronto). {luzhc, ericlu, daniel}@dgp.toronto.edu, [email protected], [email protected]

    Abstract: The social media environment in China has become the dominant source of information and news over the past decade. This news environment has naturally suffered from challenges related to mis- and dis-information, encumbered by an increasingly complex landscape of factors and players including social media services, fact-checkers, censorship policies, and astroturfing. Interviews with 44 Chinese WeChat users were conducted to understand how individuals perceive misinformation and how it impacts their news consumption practices. Overall, this work exposes the diverse attitudes and coping strategies that Chinese users employ in complex social media environments.

    [...] involves a wide array of actors including traditional news outlets, professional or casual news-reporting individuals, user-generated content, and third-party fact-checkers. As a result, norms for disseminating information through social media and the networks through which information reaches audiences are more diverse and intricate. This changing environment makes it challenging for users to evaluate the trustworthiness of information on social media [14]. In China, the media landscape is even more complicated because government interventions in the ecosystem are becoming increasingly prevalent. Most people only use local social media platforms, such as WeChat, Weibo, and Toutiao, to socialize with others and consume news online.
  • From Anti-Vaxxer Moms to Militia Men: Influence Operations, Narrative Weaponization, and the Fracturing of American Identity
    From Anti-Vaxxer Moms to Militia Men: Influence Operations, Narrative Weaponization, and the Fracturing of American Identity. Dana Weinberg, PhD (contact) & Jessica Dawson, PhD.

    Abstract: How in 2020 were anti-vaxxer moms mobilized to attend reopen protests alongside armed militia men? This paper explores the power of weaponized narratives on social media both to create and polarize communities and to mobilize collective action and even violence. We propose that focusing on the invocation of specific narratives and the patterns of narrative combination provides insight into the shared sense of identity and meaning different groups derive from these narratives. We then develop the WARP (Weaponize, Activate, Radicalize, Persuade) framework for understanding the strategic deployment and presentation of narratives in relation to group identity building and individual responses. The approach and framework provide powerful tools for investigating the way narratives may be used both to speak to a core audience of believers while also introducing and engaging new and even initially unreceptive audience segments to potent cultural messages, potentially inducting them into a process of radicalization and mobilization.

    Keywords: influence operations, narrative, social media, radicalization, weaponized narratives

    1 Introduction

    In the spring of 2020, several Facebook groups coalesced around protests of states’ coronavirus-related shutdowns. Many of these Facebook pages developed in a concerted action by gun-rights activists (Stanley-Becker and Romm 2020). The groups ostensibly started as forums for people to express their discontent with their state’s shutdown policies, but the bulk of their membership hailed from outside the targeted local communities. Moreover, these online groups quickly devolved into hotbeds of conspiracy theories, malign information, and hate speech (Finkelstein et al.
  • The Weaponization of Twitter Bots in the Gulf Crisis
    International Journal of Communication 13(2019), 1389–1415. Propaganda, Fake News, and Fake Trends: The Weaponization of Twitter Bots in the Gulf Crisis. Marc Owen Jones, Hamad bin Khalifa University, Qatar.

    Abstract: To address the dual need to examine the weaponization of social media and the nature of non-Western propaganda, this article explores the use of Twitter bots in the Gulf crisis that began in 2017. Twitter account-creation dates within hashtag samples are used as a primary indicator for detecting Twitter bots. Following identification, the various modalities of their deployment in the crisis are analyzed. It is argued that bots were used during the crisis primarily to increase negative information and propaganda from the blockading countries toward Qatar. In terms of modalities, this study reveals how bots were used to manipulate Twitter trends, promote fake news, increase the ranking of anti-Qatar tweets from specific political figures, present the illusion of grassroots Qatari opposition to the Tamim regime, and pollute the information sphere around Qatar, thus amplifying propaganda discourses beyond regional and national news channels.

    Keywords: bots, Twitter, Qatar, computational propaganda, Gulf crisis, Middle East

    The Gulf Crisis: Propaganda, Twitter, and Pretexts

    The recent Gulf crisis, in which Qatar was isolated by Saudi Arabia, Egypt, Bahrain, and the United Arab Emirates (UAE), has been accompanied by a huge social media propaganda campaign. This isolation, termed by some as “the blockade,” is an important opportunity for scholars of global communications to address what Oliver Boyd-Barrett (2017) calls the “imperative” need to explore “characteristics of non-Western propaganda” (p.
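    The abstract’s detection heuristic, account-creation dates within hashtag samples, lends itself to a compact illustration. The sketch below is our own reading of that idea, not the article’s actual procedure, and the clustering threshold is an arbitrary assumption:

        from collections import Counter
        from datetime import date

        def creation_burst_suspects(accounts: list[tuple[str, date]],
                                    min_cluster: int = 50) -> list[str]:
            """Flag accounts in a hashtag sample whose creation dates cluster.

            accounts: (user_id, account_creation_date) pairs for every account
            tweeting a given hashtag. Organic audiences tend to show creation
            dates spread over many years; a large same-day cluster suggests
            accounts registered in bulk for a campaign.
            """
            per_day = Counter(created for _, created in accounts)
            burst_days = {d for d, n in per_day.items() if n >= min_cluster}
            return [uid for uid, created in accounts if created in burst_days]

    A follow-up step in the spirit of the article would be to inspect what the flagged accounts actually tweet, since creation-date clustering alone can also catch benign mass-registered accounts.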
  • Producers of Disinformation
    MediaWell Live Research Review: Producers of Disinformation, Version 1.0. By Samuel Spies, January 22, 2020.

    Citation: Spies, Samuel. 2020. “Producers of Disinformation” V1.0. Social Science Research Council, MediaWell. January 22, 2020. https://mediawell.ssrc.org/literature-reviews/producers-of-disinformation/versions/1-0/

    Table of Contents: Introduction; Disinformation in the internet age; Who produces disinformation, and why?; Commercial motivations; Political motivations; Conclusion; Avenues for future study.

    Introduction

    One of the promises of the internet has been that anyone with a device and a data connection can find an audience. Until relatively recently, many social scientists and activists joined technologists in their optimism that social media and digital publishing would connect the world, give a voice to marginalized communities, bridge social and cultural divides, and topple authoritarian regimes. Reality has proven to be far more complicated. Despite that theoretical access to an audience, enormous asymmetries and disparities persist in the way information is produced. Elites, authoritarian regimes, and corporations still have greater influence and access to communicative power. Those elites have retained their ability to inject favorable narratives into the massive media conglomerates that still dominate mediascapes in much of the world. At the same time, we have observed the darker side of the internet’s democratic promise of audience. In a variety of global contexts, we see political extremists, hate groups, and misguided health activists recruiting members, vilifying minorities, and spreading misinformation. These online activities can have very real life-or-death consequences offline, including mass shootings, ethnic strife, disease outbreaks, and social upheaval.
  • Ephemeral Astroturfing Attacks: The Case of Fake Twitter Trends (arXiv:1910.07783v4 [cs.CR], 11 Mar 2021)
    Ephemeral Astroturfing Attacks: The Case of Fake Twitter Trends. Tuğrulcan Elmas, Rebekah Overdorf, Ahmed Furkan Özkalay, Karl Aberer (EPFL, Lausanne, Switzerland). tugrulcan.elmas@epfl.ch, rebekah.overdorf@epfl.ch, ahmed.ozkalay@epfl.ch, karl.aberer@epfl.ch

    Abstract: We uncover a previously unknown, ongoing astroturfing attack on the popularity mechanisms of social media platforms: ephemeral astroturfing attacks. In this attack, a chosen keyword or topic is artificially promoted by coordinated and inauthentic activity to appear popular, and, crucially, this activity is removed as part of the attack. We observe such attacks on Twitter trends and find that these attacks are not only successful but also pervasive. We detected over 19,000 unique fake trends promoted by over 108,000 accounts, including not only fake but also compromised accounts, many of which remained active and continued participating in the attacks. Trends astroturfed by these attacks account for at least 20% of the top 10 global trends.

    [...] the top of users’ news feeds [37], and bots can be used on Reddit to artificially “upvote” posts to increase their visibility [22]. In the case of Twitter trends, adversaries, sometimes from abroad [68], boost disinformation and conspiracy theories to make them trend so that they are further amplified [1], as in the case of QAnon followers hijacking the trend #SaveTheChildren [71]. Due to this incident, many called on Twitter to stop curating trends by using the hashtag #UntrendOctober [33]. Attacks on popularity mechanisms rely on making inauthentic content or actions appear organic. Borrowing terminology used to refer to campaigns that fake grassroots organizing on social media, we call them “astroturfing”
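    The attack’s defining signature, promotion that is deleted as part of the attack, suggests a simple detection heuristic. The sketch below is our own illustration of that signature, not the authors’ detection pipeline; the time window and deletion-share threshold are arbitrary assumptions:

        from datetime import datetime, timedelta

        # Each tweet: (trend_keyword, created_at, deleted_at or None if never deleted).
        Tweet = tuple[str, datetime, datetime | None]

        def ephemeral_trend_suspects(tweets: list[Tweet],
                                     window: timedelta = timedelta(minutes=10),
                                     min_deleted_share: float = 0.7) -> list[str]:
            """Flag trends mostly promoted by tweets deleted soon after posting."""
            by_trend: dict[str, list[Tweet]] = {}
            for t in tweets:
                by_trend.setdefault(t[0], []).append(t)
            suspects = []
            for trend, ts in by_trend.items():
                quickly_deleted = sum(
                    1 for _, created, deleted in ts
                    if deleted is not None and deleted - created <= window
                )
                if quickly_deleted / len(ts) >= min_deleted_share:
                    suspects.append(trend)
            return suspects

    Note that such a detector presupposes access to tweets before they vanish; the very ephemerality the paper describes is what makes retrospective detection hard.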
  • Astroturfing As a Global Phenomenon
    ASTROTURFING AS A GLOBAL PHENOMENON. Johanna Emilia Haikarainen. Master’s thesis, Organizational communication & PR, Department of Communication, University of Jyväskylä, 2.11.2014.

    Faculty: Faculty of humanities
    Department: Department of communication
    Author: Johanna Emilia Haikarainen
    Title: Astroturfing as a global phenomenon
    Subject: Organizational communication & PR
    Level: Master’s degree
    Month and year: 11 / 2014
    Number of pages: 45

    Abstract: This study is a literature review of a global phenomenon called astroturfing. There has been a lack of academic research concerning astroturfing, and this study is conducted in order to give a basis for future research on the subject. The purpose of this study was to find out what kind of phenomenon astroturfing is and how it has evolved in the course of time. The study also focuses on the criticism and concerns that astroturfing has raised. The theoretical part of this study provides basic information about astroturfing. It introduces various definitions of the term and provides an overview of cases concerning the subject. Also, special terminology related to the subject is introduced and astroturfing detection practices are presented. The method of this study is a literature review. A total of 58 articles were thoroughly examined and 15 of them were selected for the analysis. The analysis was completed by extracting the most relevant information concerning astroturfing from each article, while keeping the focus on the research questions. The results indicate that astroturfing is a global phenomenon of a concerning nature and thus needs to be taken seriously. Astroturfing poses a threat to the credibility and reliability of organizations. It also raises questions of business ethics and is considered deceptive.