Online Trolling, Victim Speech, and Bystander Mood


Effect of Online Trolling and Victim Speech on Bystanders' Mood

Daniël van den Corput
Tilburg University

Abstract

Despite the growing interest in trolling as an academic subject, research has mainly been theoretical and based on second-hand data collection. The field struggles with theoretical discrepancies and still requires empirical validation of its theoretical claims. Moreover, while prevention programs for cyberbullying are gradually coming into existence, none have yet been applied to trolling, despite its similar negative consequences. This study therefore aims to experimentally explore the effects of in-team trolling victimization, and of victim speech, on the mood of bystanders in the competitive online game League of Legends. Consistent with prior non-experimental research, results indicated that participants' mood increased after playing a regular game, whereas mood decreased after witnessing trolling victimization. No significant effect of victim speech on bystanders' mood was found. Theoretical implications in relation to emotional contagion and general strain theory are discussed, and victim agency is examined within a trolling and bystander context. The study concludes with possible explanations and limitations of the results and methods, and proposes suggestions for future research on bystanders in trolling victimization and victim speech.

Keywords:
Bystanders, Online Gaming, Trolling, Victim Speech

Effects of Online In-Team Trolling and Victim Speech on the Mood of Bystanders

Bullying is arguably the most extensively researched form of harassment, with numerous studies addressing different aspects of bullying, such as the motivations of all concerned parties (Gasser & Keller, 2009) and intervention programs (Merrell, Gueldner, Ross & Isava, 2008). Since the internet became ubiquitous at the start of the 21st century, the problem of bullying has repeated itself online (Patchin & Hinduja, 2006). The technology we use substantially diminishes the traditional time-distance barrier, while significantly widening the target audience and creating a sense of anonymity (Slonje & Smith, 2008; Hardaker, 2010). This kind of bullying, called cyberbullying, involves using online messaging, websites, emails and social networks to spread a wide variety of content intended to harm other people (Nicol, 2012).

One phenomenon similar to cyberbullying is trolling. People who engage in trolling, i.e. trolls, are characterized as people who communicate online with provocative, offensive or menacing intentions in order to create conflict and provoke distress for their own enjoyment (March, Grieve, Marrington & Jonason, 2017). While trolling has been studied by means of surveys, interviews and second-hand data collection, very few experimental studies have investigated trolling behavior. Moreover, experimental studies in which participants experience trolling first hand have not been conducted at all, even though trolls have been identified as disruptive (Hardaker, 2010), transgressive, subversive (Bishop, 2014) and even criminal (Shin, 2008).
On top of that, trolling victims, who are typically part of a minority group, oftentimes experience harassment and discrimination (Herring, Job-Sluder, Scheckler & Barab, 2002), calling for support and intervention programs similar to those for traditional bullying and cyberbullying.

Bystander research with regard to cyberbullying is prevalent, ranging from the consequences of witnessing cyberbullying (e.g. Schenk & Fremouw, 2012; Hinduja & Patchin, 2007) to the effect bystanders can have on cyberbullying behavior, such as intervening (e.g. Bastiaensens, Vandebosch, Poels, van Cleemput, Desmet & De Bourdeaudhuij, 2014) or engaging in hostile behavior (Neff, 2013). Despite this extensive framework of bystander research in cyberbullying environments, bystander research with respect to trolling is scarce. The present study aims to address this lack of experimental research, in particular with regard to bystanders, by experimentally examining the consequences of witnessing trolling behavior in an online environment. To do so, League of Legends (LoL), a game accompanied by what is conceivably online gaming's most toxic community (Kou & Nardi, 2014), will be used to investigate how in-game team trolling behavior influences the mood of bystanders. Results of this study will contribute to addressing these problems by answering the following question: How does in-game team communication affect the emotional response of players in a competitive online game?

Theoretical background

Online gaming

The effects of trolling on bystanders will be explored within the scope of the game accompanied by online gaming's most toxic community: League of Legends (Kou & Nardi, 2014). LoL is a multiplayer online battle arena (MOBA): an online matchmaking game in which players compete with each other in teams (Park & Kim, 2014).
In LoL, players control different kinds of characters, all with unique abilities to 'kill' other players and non-player characters in order to gain virtual money and experience points. Castronova (2008) examined human interactions within massively multiplayer online role-playing games (MMORPGs) and argued that the boundary between these MMORPGs and the real world has become ambiguous for many players. Some virtual currencies are worth more than some real currencies, new virtual worlds are arising at the rate of Moore's Law, and numerous lawsuits have been filed (and won) by players who were mistreated in their virtual environment (Castronova, 2008). Castronova (2008) further states that twenty percent of users claim to perceive the online environment in which they reside as their real world. While Johnson, Nacke and Wyeth (2015) state that players in MOBAs are typically more frustrated than players in MMORPGs due to the higher degree of competitiveness, MOBA players still exhibit teamwork-driven interactions similar to those in MMORPGs.

In these and most other online games, players can interact with each other via computer-mediated communication (CMC). CMC is a text-based type of communication that is heavily reliant on the system and interface design through which it is transmitted (Romiszowski & Mason, 1996). One might think that users are limited in transferring emotions via CMC since it is text-based, but Vandergriff (2013) as well as Riordan and Kreuz (2010) found that individuals elicit just as many 'nonverbal' cues via CMC as in face-to-face (F2F) communication. These cues mainly serve to indicate emotion, disambiguate an utterance, or express contextual cues and implicit meanings. In CMC, individuals achieve the same effect by applying other cues such as emoticons, typographic markers and lexical surrogates (Vandergriff, 2013; Riordan & Kreuz, 2010).
Derks, Fischer and Bos (2008) even argued that more emotions are conveyed via CMC than via F2F communication, as people tend to disambiguate their expressions more often via CMC. They also state that negative emotions are communicated more efficiently via CMC, as people sense fewer negative appraisals from others (Derks, Fischer & Bos, 2008). Individuals feel more anonymous when communicating via CMC and consequently tend to be more direct and explicit, leading to more negative and unconcealed expressions (Derks, Fischer & Bos, 2008). Participants are therefore likely to be affected by witnessing trolling, as virtual environments such as LoL feature interpersonal relationships, communities and demeanors similar to those in the real world.

On the other hand, research has shown that playing casual video games can improve mood and decrease stress (Russoniello, O'Brien & Parks, 2009). Also of note is the finding that playing a violent video game resulted in more arousal among children than playing a nonviolent game (Fleming & Rickwood, 2001). Even though LoL is not classified as a casual game according to the conditions provided by Russoniello and colleagues (2009), its popularity implies that a vast number of players from all over the world consider it enjoyable. Consequently, the mood of participants is possibly affected by the arousal induced by playing a game of LoL.

Trolling

In the present study, trolls are considered to be CMC users with a falsely constructed identity who appear to wish to be part of a particular group, but in reality want to cause disruption and/or trigger conflict for their own amusement (Hardaker, 2010). This definition is by no means exhaustive in all contexts, as disagreements regarding different characteristics are still the subject of current studies, but it is sufficient within the scope of this study.
The definition of trolling is the subject of considerable discussion, as studies differ in which characteristics they include or exclude. Coles and West (2016) dedicated an entire study to the ambiguity of the terms 'troll' and 'trolling' by analyzing the most common uses of both words.