Algorithmic Extremism: Examining YouTube's Rabbit Hole of Radicalization

Mark Ledwich
Brisbane, Australia
[email protected]

Anna Zaitsev
The School of Information
University of California, Berkeley
Berkeley, United States
[email protected]

arXiv:1912.11211v1 [cs.SI] 24 Dec 2019

Abstract—The role that YouTube and its behind-the-scenes recommendation algorithm plays in encouraging online radicalization has been suggested by both journalists and academics alike. This study directly quantifies these claims by examining the role that YouTube's algorithm plays in suggesting radicalized content. After categorizing nearly 800 political channels, we were able to differentiate between political schemas in order to analyze the algorithm's traffic flows out of and between each group. After conducting a detailed analysis of the recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels, with a slant towards left-leaning or politically neutral channels. Our study thus suggests that YouTube's recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.

Index Terms—YouTube, recommendation algorithm, radicalization

I. INTRODUCTION

The internet can be both a powerful force for good, enabling prosocial behaviors by providing means for civic participation and community organization [1], and an attractor for antisocial behaviors that create polarizing extremism [2]. This dual nature of the internet has been evident since the early days of online communication, where "flame-wars" and "trolling" have been present in online communities for over two decades [3][4][5]. While such behaviors were previously confined to Usenet message boards and limited IRC channels, with the expansion of social media, blogs, and microblogging following the rapid growth of internet participation rates, these inflammatory behaviors are no longer confined and have left their early back-channels to enter public consciousness [6].

The explosion of platforms, as well as ebbs and flows in the political climate, has exacerbated the prevalence of antisocial messaging [7]. Research focusing on uninhibited or antisocial communication, as well as extremist messaging online, has previously been conducted on platforms including Facebook [8], Twitter [9], Reddit [10], 4chan and 8chan [11][12], Tumblr [13], and even knitting forums such as Ravelry [14].

In addition to these prior studies on other platforms, attention has recently been paid to the role that YouTube may play as a platform for radicalization [15][16][17]. As a content host, YouTube provides a great opportunity for broadcasting a large and widely diverse set of ideas to millions of people worldwide. Included among general content creators are those who specifically target users with polarizing and radicalizing political content. While YouTube and other social media platforms have generally taken a strict stance against most inflammatory material on their platforms, extremist groups, from jihadi terrorist organizations [18][19] and various political positions [20] to conspiracy theorists, have nonetheless been able to permeate the content barrier [21].

Extreme content exists on a spectrum. YouTube and other social media platforms have generally taken a strict stance against the most inflammatory materials or materials that are outright illegal. No social media platform tolerates ISIS beheading videos, child pornography, or videos depicting cruelty towards animals. There seems to be a consensus amongst all social media platforms that human moderators or moderation algorithms will remove this type of content [22].

YouTube's automatic removal of the most extreme content, such as explicitly violent acts, child pornography, and animal cruelty, has created a new era of algorithmic data mining [23][24][13]. These methods range from metadata scans [25] to sentiment analysis [26]. Nevertheless, content that falls within an ideological grey area, or that can nonetheless be perceived as "radicalizing", exists on YouTube [27]. Definitions of free speech differ from country to country. However, YouTube operates on a global scale within the cultural background of the United States, with robust legislation that protects speech [7]. Even if there are limitations to what YouTube will broadcast, the platform does allow a fair bit of content that could be deemed radicalizing, either by accident or through a lack of monitoring resources.

Measures such as demonetization, flagging, and comment limiting are among the several tools available to content moderators on YouTube [28]. Nevertheless, removing or demonetizing videos or channels that present inflammatory content has not curtailed scrutiny of YouTube by popular media [29]. Recently, the New York Times published a series of articles, notably critiquing YouTube's recommendation algorithm, which suggests related videos to users based on their prior preferences and on users with similar preferences [30][15]. The argument put forward by the NYT is that users would not otherwise have stumbled upon extremist content if they were not actively searching for it, since the role of recommendation algorithms for content on other websites is less prevalent. As such, YouTube's algorithm may have a role in guiding content, and to some extent preferences, towards more extremist predispositions. Critical to this critique is that while previous comments on the role that social media websites play in spreading radicalization have focused on user contributions, the implications of the recommendation algorithm strictly implicate YouTube's programming as an offender.
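To make the mechanism under critique concrete, the following is a minimal sketch of the collaborative-filtering idea described above, in which unseen videos are scored by the overlap between a user's watch history and the histories of similar users. It is illustrative only: YouTube's actual recommender is proprietary and far more elaborate, and the user names, video identifiers, and Jaccard similarity measure here are our own hypothetical stand-ins.

```python
# Minimal, illustrative sketch of user-based collaborative filtering.
# Not YouTube's algorithm: the watch data, similarity measure, and scoring
# scheme are hypothetical stand-ins used only to show the general idea.
from collections import defaultdict

watch_history = {                        # hypothetical viewing data
    "user_a": {"vid1", "vid2", "vid3"},
    "user_b": {"vid2", "vid3", "vid4"},
    "user_c": {"vid9"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap between two users' watch sets."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def recommend(user: str, k: int = 3) -> list:
    """Score videos the user has not seen by the similarity of the users who watched them."""
    seen = watch_history[user]
    scores = defaultdict(float)
    for other, videos in watch_history.items():
        if other == user:
            continue
        sim = jaccard(seen, videos)
        if sim == 0.0:
            continue
        for vid in videos - seen:        # only suggest unseen videos
            scores[vid] += sim
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("user_a"))               # ['vid4'], surfaced via the similar user_b
```

Even in this toy form, a user is steered toward whatever similar users already watch, which is precisely the property that the critique of the recommendation algorithm targets.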
The critique of the recommendation algorithm is another difference that sets YouTube apart from other platforms. In most cases, researchers are looking at how users apply social media tools as ways to spread jihadism [18] or alt-right messages of white supremacy [12]. Studies are also focusing on the methods content creators might use to recruit more participants in various movements; for example, radical left-wing Antifa protests [31]. Nevertheless, the premise is that users of Facebook, Tumblr, or Twitter would not stumble upon extremists if they are not actively searching for them, since the role of recommendation algorithms is less prevalent. There are always some edge cases where innocuous Twitter hashtags can be co-opted for malicious purposes by extremists or trolls [19], but in general, users get what they specifically seek. However, the case of YouTube is different: the recommendation algorithm is seen as a major factor in how users engage with YouTube content. Thus, the claims about YouTube's role in radicalization are twofold. First, there are content creators that publish content that has the potential to radicalize [15]. Second, YouTube is being scrutinized for how and where the recommendation algorithm directs user traffic [17][15]. Nevertheless, empirical evidence of YouTube's role in radicalization is insufficient [32]. There are anecdotes of a radicalization pipeline and a hate-group rabbit hole, but academic literature on the topic is scant, as we discuss in the next section.

II. PRIOR ACADEMIC STUDIES ON YOUTUBE RADICALIZATION

Data-driven [...] more mainstream right-wing content. They have chosen the conspiracy theorist Alex Jones' InfoWars (nowadays removed from YouTube) as their seed channel, and their list of right-wing channels reflects this particular niche. InfoWars and other conspiracy channels represent only a small segment of right-wing channels. Besides, the study applies a topic analysis method derived from the Implicit Association Test (IAT) [34]. However, the validity of the IAT has been contested [35]. In conclusion, we consider the seed channel selection problematic and the range of comparison channels too vaguely explained [36].

In addition to content analysis of YouTube's videos, Ribeiro et al. (2019) took a novel approach by analyzing the content of video comment sections, examining which types of videos individual users were likely to comment on over time. Categorizing videos into four categories, namely alt-right, alt-light, the intellectual dark web (IDW), and a final control group, the authors found inconclusive evidence of migration between groups of videos.
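To make concrete what such a commenter-based analysis measures, the sketch below counts users whose earliest comments appear under one category of videos and who later comment under another. It is a simplified illustration under our own assumptions, not Ribeiro et al.'s actual pipeline; the comment records and category labels are hypothetical.

```python
# Simplified sketch of commenter-migration counting between video categories.
# Not Ribeiro et al.'s method, only an illustration of the idea; the records
# below (user, video category, year of comment) are hypothetical.
from collections import defaultdict

comments = [
    ("u1", "IDW", 2017), ("u1", "alt-light", 2018),
    ("u2", "IDW", 2017), ("u2", "IDW", 2018),
    ("u3", "control", 2017), ("u3", "alt-right", 2018),
]

def migration_counts(records):
    """Count users whose first-seen category differs from a later comment's category."""
    first_seen = {}
    later = defaultdict(set)
    for user, cat, year in sorted(records, key=lambda r: r[2]):
        if user not in first_seen:
            first_seen[user] = cat          # category where the user first commented
        elif cat != first_seen[user]:
            later[user].add(cat)            # categories the user moved to afterwards
    flows = defaultdict(int)
    for user, cats in later.items():
        for cat in cats:
            flows[(first_seen[user], cat)] += 1
    return dict(flows)

print(migration_counts(comments))
# {('IDW', 'alt-light'): 1, ('control', 'alt-right'): 1}
```

Even in this toy form, the measure can only see users who comment at all, which is one of the limitations we discuss next.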
The analysis shows that a portion of commenters does migrate from IDW videos to alt-light videos. There is also a tiny portion of commenter migration from the centrist IDW to the potentially radicalizing alt-right videos. However, we believe that one cannot conclude that YouTube is a radicalizing force based on commenter traffic alone. There are several flaws in the setting of the study. Even though the study is commendable, it also omits migration from the center to the left-of-center altogether, presenting a somewhat skewed view of the commenter traffic. In addition, only a tiny fraction of YouTube viewers engage in commenting. For example, the most popular video by Jordan Peterson, a central character of the IDW, has 4.7 million views but only ten thousand comments, roughly 0.2% of views. Besides, commenting on a video does not necessarily mean agreement with the content. A comment on a controversial topic might stem from a desire to get a reaction (trolling or flaming) from either the content creator or other viewers [37][5]. We are hesitant