<<

White Rabbit: the logic and proportion of conspiracy theory videos on YouTube: a Foucauldian discourse analysis

M K Sam Birch

Supervisor: Emil Stjernholm

MA Media & Communication Studies: Culture, Collaborative Media, and Creative Industries

Thesis

Spring/Summer 2019

Abstract

Conspiracy theories are everywhere. The internet has provided people with the tools for instantaneous, global communication and this has encouraged the spread of alternative narratives on a variety of platforms. This study is designed as a Foucauldian discourse analysis of how conspiracy theories disseminate and proliferate on YouTube, the pre-eminent provider of video streaming services. It aims to understand how the platform reflexively influences, and is influenced by, narratives of opposition and conflict, creating an online ‘reality’ which has the power to shape the world offline. Primarily, it addresses questions of whether YouTube promotes subversive thinking by accentuating conspiracy theory videos and recommending progressively extreme content. It also investigates how the design of the platform, including its powerful algorithm and embedded ‘social media’ functions, affects user experience and favours particular discourses over others. To achieve this, search terms were entered into YouTube, with recommended videos being studied and coded to reveal any extant bias. In addition to the content of the videos themselves, titles, descriptions, comments, further side-bar recommendations, likes, dislikes and view-counts were all recorded according to bias, providing an extensive overview of influences inherent to the YouTube platform. Results were analysed according to a Foucauldian discourse analysis which, following a multiperspectival approach, was subsequently summarised using a framework developed by Uldam & Kaun. Patterns were discovered indicating a propensity towards the propagation of increasingly extremist material. Furthermore, various discourses, including those of ambiguity, conflict and surprise, were found to proliferate on YouTube, with conspiracy theories actively benefitting from the algorithmic and thematic functions of the website. Overall, the study elucidates how power struggles enacted in online spaces are affected by their environments and, in turn, can have an effect on the beliefs and behaviour of people worldwide.

Keywords: Conspiracy theory, YouTube, video, recommendations, social media, algorithm, comments, extremism, Foucauldian discourse analysis, mixed methods, multiperspectivalism, content analysis, truth.

Contents

1 Introduction
2 (Background &) Literature Review
2.1 YouTube
2.1.1 YouTube and how it works
2.1.2 The YouTube recommendations system
2.1.3 Algorithms and the challenge of studying them
2.1.4 A balancing act
2.1.5 YouTube and conspiracy theories
2.1.6 YouTube’s community functions
2.2 Conspiracy theories
2.2.1 Conspiracy theories today
2.2.2 What is a conspiracy theory?
2.2.3 What are the effects of conspiracy theories?
2.2.4 Why do people believe in conspiracy theories?
2.2.5 What are common characteristics of conspiracy theories?
2.3 Background on Conspiracy theories Chosen for the Study
2.3.1 Climate change
2.3.2 Flat Earth theory
2.3.3 Fluoridation of water
2.3.4 HAARP
2.3.5 Denver airport
3 Theoretical Framework
3.1 Foucauldian Discourse Analysis (FDA)
3.1.1 Discourse
3.1.2 FDA and YouTube
3.1.3 FDA and conspiracy theories
3.2 Providing stability: a multiperspectival approach
4 Method
4.1 Research Questions
4.2 Research Process
4.2.1 Search terms used
4.2.2 Data collection
4.3 Weaknesses and limitations

4.4 Ethics
5 Results and Analysis
5.1 Recommendations
5.1.1 Initial top ten search results
5.1.2 Further recommendations: following the side-bar
5.1.3 Side-bar recommendations
5.2 Views
5.3 Likes and dislikes
5.3.1 Likes/dislikes compared to views
5.4 Comments
5.4.1 Comment bias
5.5 Video titles
5.6 Analysis framed by Uldam & Kaun’s four dimensional model
6 Conclusion
Bibliography

1 Introduction

In January of this year, YouTube released a statement asserting that it would “[continue its] work to improve recommendations”1 on its platform. It proclaimed that it intended to reduce recommendations of “content that could misinform users,”2 giving the particular examples of popular conspiracy theory subjects such as flat earth theory, so-called miracle cures and the 9/11 terror attacks.3 This study investigates how conspiracy theories are highlighted by the platform’s algorithm, whether it acts as a gateway to more extreme ideas and whether there are other ways in which the website (knowingly or otherwise) promotes the spread of potentially detrimental messages, theories and opinions.

Since its inception in 2005, YouTube has become an increasingly influential media outlet, with independent “produsers”4 of videos able to build and maintain massive global fanbases. Despite being a relatively recent development, YouTube reaches over 1 billion people per day5 and is the second most-used search engine on the internet.6 This gives it incredible power, allowing hosted videos to reach international audiences and impact upon global narratives. Furthermore, recent journalistic studies into apparent YouTube recommendation bias during the 2016 US Presidential Election suggest that there was considerable partiality towards videos which favoured the eventual winner, Donald Trump.7 It is therefore important to consider the ways in which YouTube’s functionality might (intentionally or inadvertently) create, maintain or encourage alternative and potentially toxic world-views.

The development of online media channels has altered the traditional binary relationship between media producers and audiences, with the power of production passing from established, often nationally-regulated groups (journalists, photographers, TV executives, etc) to ‘cottage industry’ individuals unbound by the same corporate or public-service considerations. Whilst television, radio, newspapers et al have never been immune from producing sensationalist material, now everyone is able to do so. Furthermore, these new producers have platforms which offer access to global audiences and the means of profiting financially from spouting opinions which might be untrue or perhaps even dangerous. The digitalisation of conspiracy theories, with their rapid online formation, dissemination and assumption into other conspiratorial narratives, makes them a potent and sometimes destructive force. Recent news items featuring sensitive subjects can quickly become ‘hijacked’ by intentionally disruptive or malicious parties pushing their own interests and ideologies: examples include the Sandy Hook Elementary School shooting, which was branded a ‘hoax’ enacted by crisis actors,8 the Nipsey Hussle murder, which was immediately linked to a conspiracy theory suggesting ‘Big Pharma’ was withholding cures for serious diseases,9 and the Christchurch mosque attacks, which drew

1 YouTube (2019a) 2 Ibid 3 Ibid 4 Bruns (2007) 5 Nicas (2017) 6 Wagner (2017) 7 Lewis (2018); Lewis & McCormick (2018) 8 Svrluga (2019) 9 Bell (2019)

aspersions from right-wing commentators suggesting it could be a ‘false flag’ action by the political left.10

This thesis focuses on more established conspiracy theories, rather than those developing as reactionary responses to current affairs, and investigates how the YouTube processes favouring certain videos intersect with human propensities towards psychologically and socially validating these theories. It also examines the comments sections which accompany videos to understand how they might provide a breeding ground for misinformation by allowing people to disparage factually correct videos, to show support for content which spreads untruths and to make statements which connect otherwise disparate conspiracy theories. Furthermore, the study addresses the intentional use of ambivalence in both titles and content, which effectively lures audiences into watching videos which may be offensive or harmful.

Promoting the UK campaign to leave the European Union, Michael Gove MP asserted that “people in this country have had enough of experts”11 and contemporary society certainly seems more able, if not actively willing, to reject the testimony of scientists, economists and eye-witnesses in favour of principles (and actions) that can harm themselves and others. This thesis is an attempt to understand how faulty narratives interact, disseminate and multiply. YouTube is a particularly useful area for study as it reflects modern values whilst also acting to shape them.

The results of this investigation are framed as a Foucauldian discourse analysis (FDA), albeit one that has been specifically oriented to face the particular challenges of interrogating an opaque algorithm and examining ‘social media’-type data. YouTube is a media platform which allows users to not only watch practically unlimited content, but to create it as well. There are accusations that the website’s recommendations algorithm is responsible for “radicalizing”12 its audience by proffering videos depicting increasingly extreme content, such as right-wing attitudes.13 However, whilst the algorithm might have significant influence over what is viewed on YouTube, the platform is primarily a ‘host’ and so any disruptive content must be produced by its users. Correspondingly, the recommendations system is informed by the behaviour of people visiting the website. Political power in this case is not a tool wielded by the media but rather negotiated through the availability of multiple discourses, the choices of the general public and the inscrutable machinations of the YouTube algorithm. FDA allows a ‘top-down’ approach to the issue of false narratives; one can investigate how the discourses are developing through their interaction before essentially ‘working backwards’ to the individual videos, or texts. This should help to form an overview of how conspiracy theories function in the framework of YouTube, which may then be extrapolated to other aspects of society. The methodological framework also provides a standpoint from which we can try to analyse the algorithm as a text, perhaps shedding light on its inner workings.

The study comprises a content analysis of material found on YouTube. Searches thematically related to conspiracy theories are made, with the results and subsequently recommended videos being assessed. For each text, the title, description and content of the video are evaluated for bias; functions reflecting user engagement (view-count, ‘like’ buttons, etc) are examined, as is the

10 Moritz-Rabson (2019) 11 Gove (2016) 12 Tufekci (2018) 13 Ibid

comments section on the webpage. A sample of side-bar recommendations for each video is then recorded, with those deemed relevant to conspiracy theories being subsequently investigated to indicate whether there is a pattern to YouTube’s recommendations. These results are then analysed according to an FDA to delineate the dominant narratives in an attempt to understand how conspiracy theories function on YouTube: what makes them popular, how the public interacts with them, how they spread across the platform and, ultimately, how they impact upon wider society. Following a multiperspectival approach, these discursive findings are then contextualised by placing them within the four-dimensional framework developed by Uldam and Kaun for studying political participation in social media.14
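
By way of illustration, the collection logic described above can be sketched in a few lines of Python. This is a minimal sketch only: the helper functions are hypothetical stand-ins for manual observation of the YouTube interface, not calls to any real API, and the breadth-first traversal simply mirrors the ‘search, then follow the side-bar’ procedure.

    from collections import deque

    def search_results(term):
        """Hypothetical stand-in: the top ten results of a YouTube search."""
        return [f"{term} result {i}" for i in range(1, 11)]

    def sidebar_sample(video):
        """Hypothetical stand-in: a sample of a video's side-bar recommendations."""
        return [f"{video} > rec {i}" for i in range(1, 4)]

    def code_bias(video):
        """Hypothetical stand-in: manual coding of a video's bias."""
        return 0  # e.g. -1 sceptical, 0 ambiguous, +1 pro-conspiracy (assumed scheme)

    def collect(term, max_depth=2):
        """Walk breadth-first from a search term through side-bar recommendations."""
        records, seen = [], set()
        queue = deque((video, 0) for video in search_results(term))
        while queue:
            video, depth = queue.popleft()
            if video in seen:
                continue
            seen.add(video)
            records.append({"video": video, "depth": depth, "bias": code_bias(video)})
            if depth < max_depth:
                queue.extend((rec, depth + 1) for rec in sidebar_sample(video))
        return records

    print(len(collect("flat earth")))  # 10 seeds + 30 first-step + 90 second-step = 130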

14 Uldam & Kaun (2017), p.190

2 (Background &) Literature Review

2.1 YouTube

2.1.1 YouTube and how it works

Since its inception in 2005,15 YouTube has become one of the goliaths of the internet. Owned by Google,16 it is the second most-visited website worldwide17 and the foremost provider of online video-streaming services.18 There are approximately 2.2 billion individual users of the platform,19 who, combined, watch over a billion hours of content every single day.20

Although when “Me at the zoo”21 was uploaded on 23rd April 2005 the founders of YouTube had technically created 100% of the available content, the website was actually designed to host other people’s videos, providing “a convenient and usable platform for online video sharing.”22 Since that time, YouTube has gone from strength to strength, creating a symbiotic relationship with amateur video-makers worldwide, leading to its pre-eminent position online and allowing it to affect multifarious global narratives. One key driver for this success is the huge variety of videos hosted by YouTube, which ensures there is something to suit every taste and demographic. Michael Strangelove neatly describes the appeal of the website thus: “you would have to be dead inside not to find something emotionally or intellectually compelling on YouTube.”23 Through showcasing the antics, endeavours and productions of a global community of video-makers, YouTube naturally features content which reflects the interests, opinions, ambitions, partialities, prejudices and proclivities of a vast sample of humankind. Burgess & Green recognise that the practical ubiquity of YouTube is less related to its “top-down activities”24 than to the endeavours of its users: “various forms of cultural, social, and economic values are collectively produced by users en masse, via their consumption, evaluation, and entrepreneurial activities.”25 The nexus of YouTube and its users has been an ‘active agent’ in the development of media in the 21st century, shaping audience expectations as well as the type of content being produced.

Entertainment is big business and YouTube is no different; although official figures are not readily available, “analysts at Morgan Stanley estimate that the service’s revenue will top $22 billion in 2019.”26 Content hosted on the video-sharing website is monetised by Google, who use advertising embedded in the site (and pop-ups in the videos themselves) to promote commercial partners: “the longer people stay on YouTube, the more money Google makes.”27 However, it is not just YouTube that sees financial benefits from the attention of its audience, as individual users can also reap

15 Wikipedia (2019) 16 Ibid 17 Alexa (2019) 18 Nicas (2017) 19 Popken (2018) 20 Nicas (2017) 21 Karim (2005) 22 Burgess & Green (2009), p.11 23 Strangelove (2010), p.3 24 Burgess & Green (2009), p.11 25 Ibid, p.11-12 26 Shaw & Bergen (2018) 27 Tufekci (2018)

advertising revenue by choosing to monetise the videos they have created. In addition, the platform offers a number of services which create profit for the business whilst similarly providing a financial incentive to video-producers; these include channel memberships, a ‘merchandise shelf’ store, Super Chat income and money derived from YouTube Premium subscribers.28 Furthermore, there are practically unlimited ways in which YouTube users can make money from their videos without deriving it directly from Google, some examples of which are “links for direct donations in their video descriptions, their online merchandise stores, affiliate links for apps or paid mentions within the videos.”29 It is therefore in the interests of both YouTube and its money-making video producers to keep the audience engaged for as long as possible. This creates a dynamic whereby both the hosting website and the user-producers are incentivised to ensure that people watching the videos keep clicking on different content which, somewhat unsurprisingly, has encouraged some questionable practices.

2.1.2 The YouTube recommendations system

A significant player in the ‘keep-ball’ game of audience retention is the YouTube algorithm. Described by the website’s engineers as one of the “largest scale and most sophisticated industrial recommendation systems in existence,”30 the algorithm is responsible for enticing viewers to ‘stay awhile longer’ and watch more videos. When you conduct a search in YouTube, the results are shaped by the algorithm. When you are watching a video, the side-bar recommendations are chosen by the algorithm. When the video finishes, YouTube automatically enqueues and plays another video, chosen by the algorithm. The product of “using a field of artificial intelligence called machine learning to parse massive databases of user history,”31 the algorithm has been modelled on “human data”32 and thus utilises patterns discoverable from previous users’ behaviour to recommend content and predict future behaviour. Unfortunately, whilst YouTube state that they “update our recommendations system all the time... to make sure we’re suggesting videos that people actually want to watch,”33 there is plentiful evidence that the algorithm is less than perfect, recommending content that is potentially harmful to individual users and society as a whole.
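
Although the real system is proprietary and far more complex, the incentive structure just described can be caricatured in a few lines of Python. This is an illustrative sketch only, not YouTube’s actual logic, and the numbers are invented: a model predicts how long a viewer would keep watching each candidate video, and candidates are served in descending order of that prediction.

    def rank_recommendations(candidates, predicted_watch_seconds):
        """Order candidates by predicted watch time: whatever the model expects
        to hold this viewer's attention longest is surfaced first."""
        return sorted(candidates, key=predicted_watch_seconds.get, reverse=True)

    # A watch-time objective has no concept of accuracy, only of retention.
    predictions = {"measured explainer": 140.0, "sensational claim": 310.0, "news clip": 95.0}
    print(rank_recommendations(list(predictions), predictions))
    # ['sensational claim', 'measured explainer', 'news clip']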

Perhaps because it is partly based on an individual’s previous behaviour, the YouTube recommendation system can seem somewhat blinkered. At the more innocuous end of the spectrum, the influence of algorithms can lead to a lack of variety in our online entertainment: Christo Wilson (a computer-science professor) is quoted in the Wall Street Journal as saying “if I only watch heavy-metal videos, of course it's serving me more of those. But then I'm missing out on the diversity of culture that exists.”34 Attempted personalisation of content can become restrictive, creating a “filter bubble”35 around the user which removes “conflicting information that the algorithm deems unnecessary.”36 Essentially, a recommendation system is unlikely to suggest

28 YouTube Help (2019) 29 Popken (2018) 30 Lewis (2018) 31 Nicas (2017) 32 Popken (2018) 33 YouTube (2019a) 34 Nicas (2017) 35 Frangos et al (2018), p.259 36 Ibid

classical music, timber-sports, particle physics or polar bears if you have shown no interest in these (or closely-related) subjects previously, regardless of whether you might actually have a predisposition to be interested in them. Although boredom, or general lack of stimulation, might seem the most terrible outcome of such a system, the effects of a ‘filter bubble’ can be much more detrimental to an individual, skewing their world-view and even pushing them towards extreme ideologies.

Humans’ tendency to value and trust information that fits with their existing beliefs and opinions is a recognised psychological phenomenon known as ‘confirmation bias’ or ‘myside bias,’37 and it is this predilection which can be enhanced by the repetition of content pushed by particular algorithms. Put simply, if you like tennis, think it is important and watch a few online videos of the sport, then the (possibly endless) recommendation of further videos related to tennis will serve to ‘confirm’ your belief that tennis is both important and worthwhile. This can be more problematic when the subject is not racquet-sports, with evidence suggesting that algorithms can favour far-right politics, conspiracy theories and literally any other subject in a similar way, with recommendations instantly available to ‘confirm’ even the most tentative personal beliefs in potentially unsavoury ideologies. Furthermore, this ‘confirmation bias’ is not just enacted through repetition but, it is argued by commentators, through increasingly radical intensification of the message.

Critics have variously represented the process of following YouTube recommendations as going down a “radical”38 ‘rabbit hole’ “of extremism”39 or “of untruth.”40 This is because the algorithm seems to advance ever more extreme videos regardless of subject matter: in a newspaper article, Tufekci describes how a preliminary study of YouTube recommendations “[appeared] to constantly up the stakes,”41 taking even fairly mundane interests to their outermost limits: “videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.”42 Lewis, a journalist for The Guardian, describes how his journey along the “conveyor belt of clips”43 made scheduled stops at “men mocking distraught teenage fans of Logan Paul... children stealing things and... children having their teeth pulled out with bizarre, homemade contraptions.”44 This trend may be significantly more problematic when transposed to political subjects. Kaiser & Rauchfleisch’s research project in the field of ‘disinformation studies’ connects far-right radicalisation with the ‘filter bubbles’ created within YouTube and the platform’s propensity towards recommending ever-more radical videos.45 Recently, a UK government Home Affairs Select Committee convened to question representatives from YouTube (and other platforms) about the proliferation of extremist and hate-related content apparently being prioritised by the website, with the chairperson, Yvette Cooper, visibly upset by the incrementally extreme far-right content (with seemingly escalating levels of racism) that had been proffered by the website.46 This inclination towards radical content is conceivably related to the ways in which users interact with YouTube,

37 Weigmann (2018), p.2 38 Kaiser & Rauchfleisch (2018) 39 Tufekci (2018) 40 Shaw & Bergen (2018) 41 Tufekci (2018) 42 Ibid 43 Lewis (2018) 44 Ibid 45 Kaiser & Rauchfleisch (2018) 46 Home Affairs Select Committee (2019)

with the algorithm ‘learning’ what type of content makes people click on more videos and therefore simply ‘holding a mirror up’ to human behaviour (replete with its flaws and psychological weaknesses). Varshney describes how there is an increasing need for “surprise to capture attention”47 and, with people having vast amounts of data and choice at their fingertips, “highly surprising signals [are necessary] to get attention.”48 This creates an environment where facts devalue relative to the element of surprise and, furthermore, negative sentiments appreciate as they are considered more unusual (or ‘surprising’) than positive ones.49 The YouTube website already uses ‘position bias’ to highlight additional videos, placing them in algorithm-defined order of preference,50 in addition to ‘auto-playing’ the next recommended video. However, individual video producers must compete for their videos to be clicked, leading to an ‘arms race’ to generate surprise, with many instances of unrealistic thumbnail pictures, sensationalist titles and outlandish content being used to motivate viewers. Humans are hard-wired “to pay attention to danger, food, and sex in preference to other subjects”51 and these themes are often writ large on YouTube thumbnails to entice the online audience, even where their appearance in a video is fleeting (or completely non-existent). YouTube insists that progress has been made in reducing so-called ‘clickbait’ from its recommendations, explaining how tweaks to its algorithm have improved the overall veracity of suggested videos: “You might remember that a few years ago, viewers were getting frustrated with clickbaity videos with misleading titles and descriptions... We responded by updating our system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys, and time well spent, all while recommending clickbait videos less often.”52 Regardless of these improvements, even a perfunctory surf through the YouTube website provides many examples of video recommendations promising the improbable, the fantastic or the shocking. This, unlike many of the claims of various video-producers on the website,53 should not be a huge surprise. The platform was designed to host user-generated videos and collect revenue from advertisements, so any social responsibilities that have arisen from the service’s popularity and the content provided by its user-base are always likely to be a secondary consideration.
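
To make the ‘position bias’ point above concrete: click-model research commonly assumes that the probability of a viewer even examining an item decays with its rank k, for instance geometrically, P(examine rank k) = γ^(k−1) with 0 < γ < 1. Under an illustrative γ = 0.7, the fifth recommendation is examined only about a quarter as often as the first (0.7^4 ≈ 0.24), so the ordering chosen by the algorithm largely determines which videos receive any exposure at all, independently of their content. The geometric form is an assumption for illustration here, not a model taken from Lerman.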

2.1.3 Algorithms and the challenge of studying them

Algorithms are used for everything. They essentially ‘crunch the data’ to create credit scores,54 Spotify playlists,55 warnings that your bank card is being used in a ‘suspicious’ way56 and video recommendations on YouTube.57 They can contain vast reserves of data and, dependent on their individual remit, can be considered both powerful and influential. As discussed above, the YouTube

47 Varshney (2019), p.82 48 Ibid, p.86 49 Ibid, p.82 50 Lerman (2016), p.5 51 Brodie (2009), p.72 52 YouTube (2019a) 53 e.g. ApexTV (2017); Interesting Facts (2018) 54 Zarsky (2016), p.126 55 Bucher (2018), p.54 56 Ibid, p.57 57 Ibid, p.48

algorithm is central to the website’s functioning; “organizing and gatekeeping”58 but also “personalising ‘recommendations’”59 which have a direct influence on the platform’s viewers. It is necessary in this thesis to investigate how YouTube’s algorithm affects both its own users and narratives occurring in wider society. Unfortunately, and for a variety of reasons, algorithms are notoriously difficult to interrogate. This section therefore concentrates on exploring the characteristics of these computational phenomena and how they might be better understood, relying on Bucher’s comprehensive research for significant insight into the matter.

Firstly, when this study repeatedly refers to ‘the YouTube algorithm’ or ‘recommendation system,’ it is something of a misnomer. Algorithms are actually multiplicitous,60 with many different algorithms working simultaneously “to create a unified experience.”61 There is “not... one single algorithm”62 driving YouTube, but a complex “networked [system]”63 of “constantly changing”64 algorithms interacting to create audience recommendations and shape user experience. One cannot, therefore, think about dissecting a single algorithm, because they are legion and defined as much by their relationships to one another as by their own constituent parts.

Secondly, as Tufekci says in a TED Talk addressing this subject, “we no longer really understand how these complex algorithms work.”65 She cites the sheer quantity of ‘big data’ that they contain, explaining that trying to comprehend an algorithm’s rationale through looking at its “giant matrices... maybe millions of rows and columns”66 is like discerning what she is thinking through taking “a cross-section of my brain.”67 Bucher conducts an in-depth study of algorithms as “black boxes,”68 a common analogy for “an object whose inner functioning cannot be known.”69 Whilst admitting that the ‘black box’ paradigm is useful for outlining the “seemingly inaccessible or secretive nature” of algorithms, Bucher questions whether depicting them as ‘unknowable’ is factually correct, or at all useful.70 Essentially, algorithms are produced by humans, the programmers who code them and the people whose data ‘feeds’ them, so their “general principles and operational logics”71 are, in many ways, already known. Their outputs are also observable, which is a key point developed later.

Although algorithms may be difficult to scrutinise, due to their multiplicity and seemingly ‘closed’ nature, they still have a distinctive character. Zarsky highlights two main attributes as being “opacity and... automation”72 with the former relating to the regular absence of transparency and the latter

58 Ibid, p.781 59 Ibid, p.785 60 Ibid, p.48 61 Ibid, p.47 62 Ibid, p.48 63 Ibid, p.47 64 Ibid, p.48 65 Tufekci (2017) 66 Ibid 67 Ibid 68 Bucher (2018), p.41 69 Ibid, p.43 70 Ibid, p.47 71 Ibid, p.57 72 Zarsky (2016), p.119

describing the automatic data analyses they perform.73 It is the mechanical function of data-processing that arguably makes the algorithm an amoral entity: “they are not inherently good or bad, neutral or biased;”74 they just process information according to a set of parameters. This means that whilst Bucher describes how many algorithms have “harmful or discriminatory effects”75 and Tufekci raises concerns about them collectively constituting an “infrastructure of surveillance authoritarianism,”76 it is not the algorithm’s fault: its outputs are simply a result of its inputs. Bucher explains that this issue emphasises the difficulties in attributing “agency”77 to algorithms, as it is unclear where any (unpalatable) discrimination comes from: the “implicated”78 human influence or the calculations derived from “machine-learning.”79 Furthermore, algorithms are perhaps designed (albeit unintentionally) to produce objectionable results; a study by Schmitt et al illustrates how extremist media content is always likely to be conflated with antithetical ‘counter-messages’ because of their related topics,80 whilst one can also easily predict that a processing system predicated on differentiating between subjects based on established patterns is inevitably going to produce discriminatory results. Whether positive, negative or neutral, it is hard to ascertain exactly from where this discrimination originates due to algorithms’ lack of transparency.81

This thesis endeavours to understand whether YouTube favours certain types of content over others and, with the website’s recommendations system being fundamental to its operation, it is imperative that one understands (as far as is possible) the algorithm(s) behind it. The traits mentioned above (multiplicity, complexity, ‘black box’ similarity, mutability) are not the only ones which make algorithms difficult to study. The creators of specific algorithms are unwilling to reveal the inner workings of their prized assets for commercial reasons (to safeguard intellectual property), for practical reasons (to stop people from “gaming the system”82) and also to protect themselves against accusations of unacceptable practices which they might then be pressured to change. The defence of ignorance, inherent in the description of algorithms as unknowable or autonomous, is useful; it provides developers and owners with a ‘get out of jail free card’ allowing them to deflect unsavoury accusations by claiming “that detection or prior knowledge was impossible.”83 Finally, the ‘personalisation’ of content that algorithms allow ensures everyone’s experience is bespoke, which makes any study even more complicated.

So, can we learn anything about these oblique, multiple, ever-changing, heavily guarded, seemingly impenetrable, interrelated bodies of data? Bucher certainly believes so. She suggests it is not necessary to prise open ‘the box’ to gain a better understanding of these oddities,84 instead proposing that “the first step... is to unknow them.”85 This is a process of distancing, whereby the

73 Ibid 74 Bucher (2018), p.56 75 Ibid, p.45 76 Tufekci (2017) 77 Bucher (2018), p.51 78 Ibid, p.54 79 Ibid, p.53 80 Schmitt et al (2018), p.780 81 Zarsky (2016), p.126 82 Bucher (2018), p.44 83 Ibid, p.56 84 Bucher, p.58 85 Ibid, p.46

researcher stops focussing directly upon what the algorithm is, shifting their gaze to observe what effects it has,86 how it affects (or attracts the awareness of) people87 and what its purposes are.88 This approach is particularly relevant to the question of conspiracy theory proliferation on YouTube when approached via a Foucauldian discourse analysis. Bucher advocates a methodology where one avoids thinking about “why the algorithm came to a certain conclusion,”89 but rather concentrates on “what that conclusion suggests about the kinds of realities that are emerging because people are using algorithms.”90 This could be interpreted as an invitation to evaluate the discursive impacts of algorithmic processes to decipher what they mean for our social reality. In theoretical terms, discourse analysis seems to be a suitable way of attempting to indirectly decode the algorithm as text. In practical terms, Bucher’s suggestion that we stop trying to look inside algorithms and instead use “speculative experimentation”91 with “inputs and outputs”92 has certainly informed the method employed in this study. These are both explored in greater detail in the respective sections below.

2.1.4 A balancing act

In her widely-disseminated piece for the New York Times, Tufekci’s assessment of YouTube’s situation is scathing: they “make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.”93 The platform is accused in many journalistic reports of being a breeding-ground for “lies, hoaxes and misinformation,”94 full of “videos light on facts but rife with wild speculation”95 where “fiction is outperforming reality.”96 By providing a platform which is open to everyone, but where popularity (and financial reward) is distributed according to ‘views,’ YouTube is implicated in promoting objectionable practices ranging from “how to make explosives,”97 through right-wing radicalisation,98 to “rants by... Holocaust deniers.”99 To make matters worse, the company is profiting from the questionable content it contains.100 YouTube is aware that it harbours contentious material, with its user ‘terms of service’ waiving any rights as regards content which is “factually inaccurate, offensive, indecent, or otherwise objectionable.”101 The recent missive indicating that the company is “taking a closer look at how we can reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines”102 purports that it is addressing issues around algorithmic recommendation of misleading information, whilst stopping short of banning it

86 Ibid, p.61 87 Ibid, p.63 88 Ibid, p.64 89 Ibid, p.58 90 Ibid, p.58 91 Ibid, p.61 92 Ibid, p.60 93 Tufekci (2018) 94 Ibid 95 Popken (2018) 96 Chaslot (2018) 97 Strangelove (2010), p.151 98 Kaiser & Rauchfleisch (2018) 99 Shaw & Bergen (2018) 100 Wakefield (2019) 101 YouTube (2019b) 102 YouTube (2019a)

altogether: “to be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube.”103

This stated “balance between maintaining a platform for free speech and living up to our responsibility to users”104 is perfectly understandable from a company perspective, even though outside agencies (i.e. journalists and politicians) might find it frustrating. Lewis and McCormick describe how YouTube puts “a wall around its data... [protecting] itself from scrutiny,”105 yet as “the single most important engine of YouTube’s growth,”106 the algorithm (and the information it is built upon) is undoubtedly of incredible value to the company. Secrecy in this case is justifiable from a commercial perspective. No business wants their most highly-prized assets in the public domain where they can be meddled with by external bodies or ‘ripped off’ by competitors. There is also a “fear of censorship within the Internet community”107 which has considerable influence online; if a social media site, or video-hosting platform, alienates a proportion of its users through heavy-handed proscriptive practices, entire communities can shift to other websites, domains and services. YouTube have a successful formula/algorithm and they are naturally unwilling to implement any practices which might compromise their dominant market position. Another issue, already intimated above, is that the algorithm itself may be functioning in a manner that is beyond the understanding (and thus the effective control) of the platform’s own programmers; a situation that would surely increase international scrutiny of the company.

To summarise, it is in the interests of the organisation to avoid being subject to wide-ranging regulation. Ensuring compliance with rules imposed by governments and regulatory bodies can be a significant additional expense, especially when a business operates internationally across innumerable jurisdictions. Although YouTube’s basic position is to “disclaim all and any liability in connection with Content”108 (effectively discharging responsibility to the content-producers), it has been pushed to make statements (and presumably changes) which indicate a greater social responsibility. Public announcements regarding the company’s “commitment and sense of responsibility to improve the recommendations experience”109 admit there is a public service aspect to the business whilst intimating that it is capable of effective self-regulation. However, for a corporation accused of “long [prioritizing] growth over safety”110 which has effectively established a dynamic media platform “because it was unfettered by producers, network executives, or regulators,”111 there still appears to be a significant schism between its public relations discourse and the types of content available on (and actively recommended by) the website.112

103 Ibid 104 Ibid 105 Lewis & McCormick (2018) 106 Lewis (2018) 107 Strangelove (2010), p.108 108 YouTube (2019b) 109 YouTube (2018a) 110 Shaw & Bergen (2018) 111 Ibid 112 Strangelove (2010), p.107

2.1.5 YouTube and conspiracy theories

YouTube invites scrutiny because it hosts videos which contain objectionable material. In addition to “pornography, violence, and racism,”113 there is concern about hate speech and far-right content.114 Furthermore, the social media, or community, functions of the YouTube platform also seem to act as a highly-visible, largely-unregulated area, with the ‘public comments’ section “notorious for online trolling, flaming and abuse.”115

One of the most concerning features of YouTube is the proliferation of misinformation. The company already indemnifies itself against legal recourse from “content that is factually inaccurate”116 but that does nothing to protect the public from potential harm. In media reports, Popken states that “YouTube, as one of our primary windows into the world, is shaping our understanding of it”117 whilst Tufekci emphasises “how many... young people — turn to YouTube for information.”118 There are numerous detrimental effects of disseminating misleading information, which are delineated later in the study, but the availability of false narratives on YouTube is simultaneously undeniable and troubling.

A prime example of YouTube foregrounding questionable information is the abundance of videos related to conspiracy theories. In the company’s statement “Continuing our work to improve recommendations on YouTube,”119 the link between conspiracy theories and their possibly damaging effects is made explicit, with a few specific examples spotlighted: “we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”120 The relationship between misinformation, conspiracy theories and radicalisation has been illustrated by various commentators,121 with Lewis reporting that, even after YouTube had begun sanctioning a particular video for ‘violating its guidelines,’ it was simultaneously recommending it to viewers.122 The company has removed advertising revenue from “content like mass shootings”123 to inhibit people from profiting directly from tragedy (although other indirect forms of income are still possible), yet there appears to be a significant number of YouTube users who are making money by creating and distributing false information, such as implausible conspiracy theories. When “a jailed radical preacher ranks top for search term ‘British Muslim spokesman,’”124 when independent evidence shows that during the 2016 US election “YouTube was six times more likely to recommend videos that aided Trump than his adversary”125 and when the algorithm is accused of “systematically [amplifying] videos that are divisive, sensational and conspiratorial,”126 there certainly seems to be something

113 Ibid, p.106 114 Wakefield (2019) 115 Murthy & Sharma (2018), p.192 116 YouTube (2019b) 117 Popken (2018) 118 Tufekci (2018) 119 YouTube (2019a) 120 Ibid 121 Kaiser & Rauchfleisch (2018); Shaw & Bergen (2018); Lewis (2018); Tufekci (2018) 122 Lewis (2018) 123 Popken (2018) 124 Wakefield (2019) 125 Lewis (2018) 126 Ibid

disconcerting happening. With YouTube’s projected turnover for 2019 exceeding $22 billion127 from “[racking] up the ad sales,”128 the company’s expressed principles become rather more unpalatable.

2.1.6 YouTube’s community functions

Whilst the primary purpose of YouTube is to allow users to share and watch videos, the platform also includes a number of embedded ancillary functions which provoke further user interaction. These “people-focussed features,”129 such as the ability to comment on videos, ‘like’ or ‘dislike’ content and subscribe to producer-run ‘channels,’ make YouTube a more immersive experience, akin to a social media site.130 A prominent component of the website is the comments forum that accompanies a video, although a minority of producers choose to ‘disable,’ or remove, it. As briefly mentioned earlier, these fora have a reputation for attracting divisive, inflammatory and abusive comments from other users.131 This tendency towards hostility is not exceptional to YouTube, with studies of newspapers’ online discussion boards also revealing a propensity for negativity,132 conflict133 and vitriolic discourse.134

This study included the collection of data related to use of the ‘like’ and ‘dislike’ buttons on the YouTube website, as well as recording the number of views of each video. Following Lerman’s assertion that “social influence bias, communicated through social signals, helps direct attention to online content that has been liked, shared or approved by many others,”135 the presence of a high number of ‘likes’ or views may influence user behaviour. Perhaps most importantly, there is an investigation into the comments on every video studied, noting the total number posted and conducting a content analysis of those considered ‘most popular.’ Madden et al consider these fora to be “a large repository of user-generated information which may be mined”136 and the visible, yet semi-anonymous, opinions contained within them provide an important insight into the narratives surrounding each video.
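
To indicate the shape of the data gathered, the per-video record can be sketched as a simple Python structure. The field names mirror the categories described above rather than any published coding scheme, and the -1/0/+1 bias coding is an assumed convention for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class VideoRecord:
        """One observation in the study: a video plus its engagement signals."""
        title: str
        bias: int          # assumed coding: -1 sceptical, 0 ambiguous, +1 pro-conspiracy
        views: int
        likes: int
        dislikes: int
        comment_count: int
        top_comment_biases: list = field(default_factory=list)

        def approval_ratio(self):
            """Likes as a share of all ratings, comparable across view-counts."""
            rated = self.likes + self.dislikes
            return self.likes / rated if rated else None

    v = VideoRecord("example", bias=0, views=26432647, likes=500000,
                    dislikes=20000, comment_count=80000)
    print(round(v.approval_ratio(), 3))  # 0.962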

2.2 Conspiracy theories

2.2.1 Conspiracy theories today

Conspiracy theories and the narratives around them have occurred throughout history. From “precursors in antiquity,”137 conspiracy theories have been omnipresent, attaching themselves to “every major event of the last 2,000 years.”138

127 Shaw & Bergen (2018) 128 Tufekci (2018) 129 Madden et al (2012), p.694 130 Murthy & Sharma (2018), p.194 131 Murthy & Sharma (2018), p.192; Strangelove (2010), p.106; Madden et al (2012), p.699 132 Slavtcheva-Petkova (2016), p.1125 133 Ibid, p.1129 134 Ksiazek et al (2015), p.850 135 Lerman (2016), p.5 136 Madden et al (2012), p.698 137 Uscinski (2018d), p.33 138 Hunt (2008)

Despite “the increasing authority of science over our knowledge,”139 conspiracy theories remain part of the popular consciousness into the 21st century. If anything, with technological developments democratising the media landscape140 by decentralising power from traditional outlets141 and providing phenomenal instant global access to information,142 conspiracy theories have become more relevant to a modern audience; to paraphrase Uscinski, timelessness has become timeliness.143 Conspiracy theories are not just “alive and well,”144 they are thriving, with even comparatively small-scale conspiracy theories able to gain traction online, “instantly jump borders”145 and be accepted into “non-state-based alliances among global users.”146

Michael Wood posits that the vast quantity of information available in the internet age provides the “raw material”147 for conspiracy theorising. The apparent corollary to this hypothesis is that, in a world where open access to data is considered the norm, an “information stream available to the public... [which] is demonstrably incomplete”148 creates a by-product of suspicion which, in turn, causes people to investigate and speculate.

Recently, research into conspiracy theories has increased at a seemingly exponential rate,149 with papers being published on the subject “nearly every day.”150 This drive to research has been catalysed by a post-truth151 world where ‘fake news’ and “credible falsehoods”152 permeate all aspects of social interaction; in brief, “conspiracy theories are everywhere.”153

2.2.2 What is a conspiracy theory?

Defining a ‘conspiracy theory’ is not straightforward. Although it is an enduring and well-known concept, and despite many previous attempts, there is still no overwhelming consensus on what the phrase means. This is partly because the term has been used pejoratively and, often, haphazardly to describe any unexplained event or phenomenon, regardless of merit.

Conspiracy theories have a “bad reputation”154 and, whilst that is at least partially justified, there is an increasing willingness in the academic community to move away from broad-brush rejection, dispensing with derogatory language and accepting that there are often good reasons for these beliefs (even if they say more about the ‘believers’ than the subject of their theory). Hofstadter’s article “The Paranoid Style in American Politics” justifiably continues to have a strong influence over

139 Fassin (2011), p.41 140 Strangelove (2010), p.158 141 Antony & Thomas (2010), p.1283 142 Slavtcheva-Petkova (2016), p.1116 143 Uscinski (2018), p.1 144 Goertzel (1994), p.738 145 Aaronovitch (2009) 146 Jacobs (2017), p.337 147 Wood (2013), p.32 148 Jacobs (2017), p.352 149 Uscinski (2018d), p.42 150 Uscinski (2018), p.2 151 OED (2019a) 152 Jacobs (2014), p.334 153 Uscinski (2018), p.10 154 Dentith (2018), p.196

studies in the field yet, even by the author’s own admission, the term ‘paranoid style’ is intentionally negative.155 Conspiracy theorists have subsequently been summarily excluded from discussions because they are “portrayed as suffering from crippled epistemologies [and] being paranoid(-esque) in their thinking.”156 The term ‘conspiracy theory’ had arguably become so toxic that association with it could undermine a person’s credibility.157 Orr & Husting highlight how disparaging someone who “[challenges] authority and power”158 as a conspiracy theorist allows “an accuser to ‘go meta’”159 on them, dodging the subject at hand by directly “impugning their character, intelligence, and... emotional maturity.”160 This inclination towards dismissing conspiracy theories ‘out of hand’ is perhaps why they are frequently associated with the oppressed, the powerless and the disenfranchised.

Conspiracy theories often assume an anti-establishment stance or tone. This pits the theorists against hegemonic mechanisms of control which traditionally act as arbiters of knowledge, money and power. This is sometimes conceived as a ‘David and Goliath’ struggle between an individual (or small group) and powerful bodies like corporations,161 governments, NGOs or academic institutes. Regardless of the specific theory being espoused, many critics attribute particular significance to this concept of railing against authority.162 Fassin notes that “people who... consider that they have been... dominated or discriminated against, are particularly prone to conspiracy theories”163 and many critics have found links between racial minorities and these beliefs. There are various reasons advanced for this: Goertzel cites “conspiracies... directed specifically at blacks,”164 Hofstadter references those who are “shut out of the political process,”165 Simmons & Parsons demonstrate that African-American acceptance of conspiracy theories is inversely correlated with perceptions of the community’s political power166 and Orr & Husting suggest that derogatory terminology can have racial undertones, silencing minorities through undermining their legitimacy.167 By excoriating ‘conspiracy theorists’ en masse, powerful in-groups weaponise the term to maintain existing structures of authority, including those delineated by race. The idea that public debate is unfairly biased against conspiracy theories and their proponents is therefore “not entirely unfounded.”168

Conspiracy theories are not, however, just the domain of the disenfranchised. Perhaps the most high-profile contemporary exponent of subversive thought is also one of the most influential people in the world: Donald Trump. Through his speeches, policies and (in)famous tweets, the 45th President of the United States has expressed extreme scepticism about global warming (at one stage suggesting it was invented by the Chinese to undermine US manufacturing)169 and supported “a

155 Hofstadter (1964), p.77 156 Dentith (2018), p.202 157 Orr & Husting (2018), p.85 158 Ibid, p.82 159 Ibid, p.83 160 Ibid 161 Weigmann (2018), p.3 162 Thresher-Andrews (2013), p.6; Neville-Shepard (2018), p.122 163 Fassin (2011), p.46 164 Goertzel (1994), p.736 165 Hofstadter (1964), p.86 166 Simmons & Parsons (2005), p.594 167 Orr & Husting (2018), p.82 & 90 168 Dentith (2018), p.203 169 Weigmann (2018), p.1

range of seemingly unrelated matters [which] could all be boiled down to one singular overarching conspiracy narrative: political elites sold out the interests of regular Americans to foreign interests.”170 In fact, Trump falls into many demographics commonly associated with conspiracy belief and science denialism: he is white, male and demonstrably right-wing.171

Aaronovitch talks about conspiracy theories as “history for losers”172 and it is this association with powerlessness and failed epistemologies that, perhaps unfairly, taints the entire subject. There is certainly a technical problem in talking about ‘successful’ conspiracy theories: if a theory is proved true, then it is no longer a ‘theory’ so is removed from the field of speculation and appropriated by those who administer actual knowledge. Essentially, the realm of conspiracy theory can only contain artefacts which are (as yet) unverified and unresolved, giving the whole subject the semblance of failure. However, there are many examples of conspiracy theories that have been proven to be true (see cover-up activities around the JFK assassination,173 Operation Northwoods and MK-ULTRA174) and, whilst this element of rightness precludes them from remaining ‘conspiracy theories’ per se, it indicates that some theories are definitely worthwhile.

There is a suggestion that some official narratives, where an authority has stated something as true which has later been proved to be false, are another example of conspiracy theory in action. Apter, in her review of literature, succinctly describes how governments espouse narratives that ask people to believe in some conspiracy theories (that Iraq has ‘weapons of mass destruction,’ that al-Qaeda is a coherent international network of militant jihadists, etc) whilst deterring people from making unofficial connections between the conflict in Iraq and oil supplies, or political motivations behind the fear-mongering ‘war on terror.’175

Whilst some conspiracy theories are proved true, the vast majority are probably not. In saying “when conspiracy theorists are right, it is by chance,”176 Uscinski intimates a kind of infinite monkeys/works of Shakespeare dynamic; when innumerable theories are fired out, a few will hit the target. It is this proliferation of false theories that contributes to the poor reputation of the entire genre. Some of the ideas being disseminated are unreasonably outlandish (“powerful leaders of this planet are members of a race of lizards”177), contradict millennia-old facts (‘flat earthers’) or involve such a vast conspiracy that they are nigh-on impossible (global warming is a hoax). Taking a purposive view, it is likely that many conspiracy theorists are promoting subversive concepts to follow commercial, political or psychological personal agendas.

This variation underscores a key problem in defining conspiracy theories. Many are untrue, unlikely or otherwise unconscionable. A few have been proved correct, some contain elements of truth and there is a whole range of others from the probable to the impossible. This creates a wide spectrum

170 Uscinski (2018), p.3 171 Hansson (2017), p.39 172 Aaronovitch (2009) 173 Hagen (2018), p.28 174 Hunt (2008) 175 Apter (2006), p.369 176 Uscinski (2018c), p.109 177 Franks et al (2013), p.2

of veracity. Whereas some have epistemological value, others “toward the fringe”178 stray from reality, making any overarching definition based on truth difficult to impose.

So, what can be said about conspiracy theories? According to Dentith, “most scholars… [think] there is something suspicious”179 about them, and this sentiment is certainly observable in the work of Fassin,180 Hofstadter,181 Wood182 and Brotherton.183 The use of ‘suspicious’ is compelling, because conspiracy theories appear to fulfil both definitions of the word: they are outwardly suspicious of mainstream narratives whilst themselves being intrinsically suspicious (or questionable) in nature. Whilst Dentith lightly condemns the widespread use of ‘suspicion’ to describe conspiracy theories,184 he also advocates “[assessing] such beliefs on a case-by-case basis,”185 a strategy that involves taking an investigative (or ‘suspicious’) look at each theory before passing judgment. Although this technique admits the possibility that a given theory could be ‘beyond suspicion,’ Dentith’s general position might feasibly be described as treating conspiracy theories as ‘suspicious until proven otherwise.’

Despite the alleged rejection of ‘suspicion,’ Dentith’s particularist186 definition of conspiracy theory is a good starting point; it primarily denies any inherent association with falsehood and encourages the appraisal of each theory according to its “individual merits.”187 This study into representations on YouTube contains a number of very different conspiracy theories spanning the spectrums of veracity and popularity; each one is therefore individually evaluated to ensure the results are contextualised and reflect the variation possible within this continuum of beliefs. The definition used will also follow Dentith’s assertion “that conspiracy theories are theories about conspiracies,”188 whilst applying Uscinski’s exclusions of “strictly paranormal or supernatural phenomena… for example, Bigfoot, Loch Ness and Chupacabra.”189

It is worth remembering that conspiracy theorists are contrarians; they posit ideas which challenge conventional knowledge. Their views are often speculative and can be irrational, “inconsistent and implausible, not to say absurd.”190 Even Dentith makes it clear that conspiracy theories are not “prima facie rational.”191 They visibly champion values of subversion, even where the basis for their opposition is built on flimsy foundations. The definition of ‘conspiracy theory’ used therefore indicates an awareness of the propensity towards conflict (and susceptibility to fallaciousness) that characterises many examples of the genre. Brotherton’s description is a useful reminder of the

178 Uscinski (2018), p.15 179 Dentith (2018b), p.97 180 Fassin (2011), p.40 181 Hofstadter (1964), p.77 182 Wood (2013), p.32 183 Brotherton (2013), p.9 184 Dentith (2018), p.197; Dentith (2018b), p.94 185 Dentith (2018b), p.104 186 Dentith (2018), p.197 187 Ibid 188 Dentith (2018b), p.94 189 Uscinski (2018b), p.49 190 Fassin (2011), p.46 191 Dentith (2018), p.104

potential imperfections of conspiracy theories: “I define conspiracy theory as an unverified claim of conspiracy which is not the most plausible account of an event or situation.”192

Finally, any definition of conspiracy theory should indicate that they are increasingly important to society and self-perpetuating in nature. They can undermine a person’s social impulses,193 constitute a “public health issue”194 and influence presidential elections or national referenda.195 Furthermore, multiple studies have shown that people exposed to conspiracy theories are more likely to believe other conspiracy theories.196 This “slippery slope”197 towards accepting many unproven theories is compounded by the fact that many of them are “interwoven”198 by common threads and themes, with one conspiracy acting as a “gateway”199 to others. Uscinski also describes how conspiracy theories multiply “like tribbles,”200 with politicians fending off accusations of conspiracy by advancing alternative conspiracy theories until the truth is completely buried underneath unsubstantiated supposition and conjecture.201

In summary, conspiracy theories are found everywhere and can be disseminated by everyone from the “tinfoil hat crowd”202 to ‘official sources’ like presidents. Marginalised groups have used them to question governments and powerful actors have employed them to maintain their privileged positions. They exist on a spectrum of veracity, from theories which have proven true to obdurately irrational and unverifiable fantasies. It is therefore necessary to evaluate each conspiracy theory on its particular strengths and weaknesses, never assuming intrinsic falsehood. One must, however, keep in mind that these ideologies are tools of contradiction which may be primarily intended to challenge authority, or represent anti-establishment opinion, rather than prove a specific theory is correct. Regardless of their provenance or intention, contemporary conspiracy theories can have a serious impact on public behaviour, which makes them an especially salient subject for study.

2.2.3 What are the effects of conspiracy theories?

There is much interest in how conspiracy theories affect both society and individuals. With theories proliferating ever more quickly, it is imperative that we understand their potential benefits and detriments. Whilst they are considered simple, innocuous ‘talking-points’ by some, “[representing] a typical and healthy by-product of a thriving and open democratic society,”203 there are many indications that conspiracy theorising could cause serious, if not catastrophic, harm to the planet and its inhabitants.

192 Emphasis added. Brotherton (2013), p.9
193 Van der Linden (2015), p.173
194 Glick & Booth (2014), p.798
195 Uscinski (2018), p.10
196 Goertzel (1994), p.731; Lewandowsky et al (2013), p.8; Lantian (2013), p.19; Lewandowsky (2018), p.152; Glick & Booth (2014), p.799; Van der Linden (2015), p.171
197 Van der Linden (2015), p.171
198 Mersereau (2018)
199 Ibid
200 Uscinski (2018), p.5
201 Ibid, p.5-8
202 Wolfson (2018)
203 Thresher-Andrews (2013), p.7

Firstly, let us address the notable positive effects of conspiracy theories. Some have been proven correct, vindicating the ‘believers’ and forcing conspiring actors to admit collusion.204 They allow ordinary people to question official discourses, demanding accountability from governments and commercial entities. Dentith highlights the problem with conflating ‘official,’ establishment-proffered theories with “epistemic authority.”205 This idea is extended by Hagen, who believes that academia’s tendency towards attributing “a level of moral purity… to presidents and other high officials… seems at least inappropriate, if not bizarre.”206 Conspiracy theory can form the framework allowing individuals to construct social critiques before introducing them to the popular consciousness. In turn, this element of ‘public investigation’ can encourage politicians and leaders towards “transparency and good behaviour,”207 because people have the legitimate means to ‘weed out’ corruption. Even if a conspiracy theory is incorrect, its focus can provide an indication of public concerns, “[expressing] social imaginaries and political anxieties that [might otherwise] remain... unheard.”208 Through giving an individual the voice to address external (societal, political, moral, etc) concerns, conspiracy theories also invite personal reflection, education and the exercise of skills related to critical thinking and reasoning.209

Perhaps the most significant benefit that most people derive from conspiracy theories is also probably the most frivolous: they are entertaining. Tales of UFOs landing at Roswell, faked moon landings and lizard people infiltrating the upper echelons of society are fun. Considering the abundance of conspiracy theory videos on YouTube, it is clear that they are proliferating because of their power to entertain: when a video by Vsauce called “Is Earth Actually Flat?”, describing various ‘flat-earth’ theories (but not comprehensively expressing an opinion on the matter), is viewed 26,432,647 times,210 one can assume there are large numbers of people watching recreationally. In addition to providing simple diversion, these peculiar tales can have further positive effects: Denver airport has embraced ‘serious’ conspiracy theories, making them into a valuable advertising gimmick.211 Unfortunately, even where a conspiracy theory has been employed for the purposes of entertainment it can have unintended detrimental effects.

Conspiracy theories are a contemporary concern precisely because they can sway public opinion, influence government policy and lead to actions which cause direct harm to individuals, specific groups or even “the long-term sustainability of human civilization.”212 As discussed earlier, they appear to self-perpetuate, in that “[exposure] to conspiracy narratives increases the belief in various conspiracy theories,”213 meaning even outwardly anodyne speculations can accumulate to influence people’s thinking and behaviour. Fasce & Picó assert that “unwarranted beliefs are not personally or socially innocuous”214 and one can argue that lending credence to some ‘harmless’ yet irrational beliefs can make it easier for other, harmful, irrational ideas to circulate. Beyond this blanket criticism, there are various ways that conspiracy theories are considered to have potentially injurious consequences; how injurious depends on the type of theory being expounded, its position on the spectrum of veracity and the intentions of the people involved in its dissemination.

204 Uscinski (2018b), p.49
205 Dentith (2018b), p.101
206 Hagen (2018), p.34
207 Uscinski (2018), p.20
208 Fassin (2011), p.41
209 Břízová et al (2018), p.1
210 Vsauce (2014)
211 Wolfson (2018)
212 Hansson (2017), p.39
213 Lantian (2013), p.19
214 Fasce & Picó (2019), p.111

Some conspiracy theories are designed to undermine conventional knowledge to maintain, or improve, business interests. Tobacco lobbies or pesticide makers might produce literature or fund studies which counter conventional criticism of the public safety of their products. Weart asserts that, in these cases, it is not necessary (or even the intention) to prove that the scientific consensus is wrong, rather to sow seeds of doubt: “[raising] enough questions to convince the public that there was no consensus.”215 This technique is apparent in the output of climate change deniers who frequently conceive of global warming as a conspiracy whilst supplying cherry-picked evidence designed to complicate the layperson’s understanding of the problem.216 This ‘muddying of the water’ makes it more difficult for the general public to learn about issues which could have a serious effect on their lives. Despite substantial scientific evidence which proves that (amongst other things) fluoridating water supplies is a good method for preventing tooth decay,217 climate change is being caused by human behaviour,218 genetically-modified foods are safe219 and there is no link between the MMR vaccination and autism,220 misinformation on these subjects is still rife. Even by addressing just these examples, we reveal a number of damaging effects to humankind: respectively, public bodies who reject water fluoridation expose their communities to increased risk of dental caries,221 urgent action is not being taken to arrest climate change, third-world farmers are unable to use disease-resistant crops (thus protecting their harvests and the people who rely on them) and children are dying from diseases which can be prevented by vaccination. In all these cases, conspiracy theories which question the scientific consensus and spread misinformation are responsible for causing damage directly to society and its inhabitants.

An indirect consequence of either malicious or reckless misinformation is time-wasting. In its most innocent form, this accusation could be levelled at most YouTube content; when someone watches a conspiracy theory video about UFOs it might inhibit their thesis-writing but does not adversely affect anyone else. The problem arises when misleading or fallacious beliefs which contradict established fact gain in popularity. Weart’s example of climate scientists who “found a large part of their time had to be spent not doing research, as they would have preferred, but responding to attacks and denial”222 highlights how experts are distracted from developing their areas of expertise by a constant need to defend and justify their work. In many disciplines, this is becoming a permanent bureaucratic requirement, with the scientific community forced to spend valuable time considering how best to present factual information to the public to avoid (unjustified) rejection.223 When this wasted time could “otherwise have been devoted to research,”224 human progress is being perceptibly retarded.

215 Weart (2011), p.45
216 Hansson (2017), p.41
217 Horowitz (2003), p.6
218 Weigmann (2018), p.1
219 Ibid, p.4
220 NHS (2019)
221 Uscinski (2018), p.11
222 Weart (2011), p.48
223 Weigmann (2018), p.4; Lewandowsky (2018), p.172; Cullen (2018), p.144-5

Sadly, the actions of conspiracy theorists can impact more than just an individual’s use of their time. There are several accounts of scientists being targeted for their participation in studies which some theories reject. This can involve abusive language, attempts to discredit their work, exhortations to violence and death threats.225 Jay Cullen, a marine chemist, gives a personal account of insults, threats and harassment suffered at the hands of conspiracy theorists who accused him of fabricating his research on the potential fall-out from the Fukushima nuclear disaster.226 Similarly, Lewandowsky, a psychologist studying ‘climate scepticism,’ was subjected to racial abuse and his academic work targeted by a campaign to ‘silence’ him.227 This sustained attack succeeded in disrupting the publishing of articles pertinent to climate change denial and wasted a huge amount of time, effort and money.228

Conspiracy theories can also cause societal problems. They constitute a challenge to extant authorities and create uncertainty about who, or what, can be believed. Whilst there are specific cases where such theorising has been justified (through the uncovering of genuine cover-ups), in the vast majority of cases this serves to destabilise functioning mechanisms for spreading information: “establishment institutions are designed to ensure stability; conspiracy theories, on the other hand, are instruments of disruption.”229 Traditionally-trusted groups lose discursive power, leading to “the erosion of scientific authority”230 and a movement towards “hundreds of relative truths,”231 each problematised by the existence of the others. This leads to divisive, polarised and partisan worldviews which, in turn, instigate conflict between proponents of different viewpoints.232 Belief can therefore become inextricably entwined with political affiliation, with a comparative ‘truth’ depending on the ‘side’ you take. A salient example of this situation is the observation by different critics that conservatives, Republicans and other persons on the political right are more likely to deny anthropogenic climate change.233

Overall, whilst the capacity for motivating people to action and independent, critical thought could be laudable, there appear to be significantly more instances of detrimental effects than there are of positive ones. Uscinski correlates “recent displays of populism, nationalism, xenophobia, and racism”234 with conspiracy thinking, and the language of conspiracy theory is often attacking, destructive and focussed on conflict. There can be adverse results for individuals on both sides of the divide, with some following misleading advice and others having their legitimate work undermined. Society can suffer when the concept of ‘truth’ becomes a matter of opinion and when contentious narratives are extended to score political points. Franks et al summarise these potential social effects as follows: “[Conspiracy theories] may generate a highly cynical form of inter-group engagement in civic society, reducing trust, and driving strategic action which undermines the functioning of a modern public sphere… This is where much of the public concern… arises, since they can create a climate of mistrust or lead to disengagement from mainstream society or from officially recommended practices.”235 A cacophony of voices, each pouring forth its own version of reality, can only serve to confuse people as to which sources of information are reliable. When conspiracy theories begin to affect public health policy on crucial subjects like global warming and vaccinations, then the consequences can be devastating.

224 Lewandowsky (2018), p.168
225 Mersereau (2018)
226 Cullen (2018), p.135-145
227 Lewandowsky (2018), p.149-172
228 Ibid, p.170
229 Uscinski (2018), p.19
230 Harambam & Aupers (2015), p.447
231 Jacobs (2017), p.336
232 Weart (2011), p.47
233 Van der Linden (2015), p.173; Weart (2011), p.45; Uscinski & Olivella (2017), p.1
234 Uscinski (2018), p.1

2.2.4 Why do people believe in conspiracy theories?

There are many reasons why people engage with conspiracy theories, and academics from a variety of fields, spanning psychology, sociology, political theory, cultural studies and philosophy, have tried to identify the root of their allure.

According to Thresher-Andrews, conspiracy theorists typically exhibit common personality traits: “high levels of anomie (a lack or rejection of social norms), authoritarianism, and powerlessness, together with low levels of self-esteem and trust.”236 These characteristics seem to fit with the psychological gratifications that conspiracy thinking offers. It provides a framework for opposing convention as “part of a brave insurgency against a corrupt elite or a stifling orthodoxy”237 with privileged access to “secrets and deeper truths.”238 Hofstadter’s concept of ‘paranoia’ fills the gap between feelings of powerlessness239 and the conspiracy theorists’ self-assured championing of subversive ideologies: “From a psychiatric perspective, paranoia can be diagnosed as the combination of persecution (the idea of being the victim of a plot) and megalomania (the sense of one’s own greatness). The two features indeed coalesce in all conspiracy theories.”240

Like conspiracy theories, conspiracy theorists are not all the same. Whilst people may display some of the psychological traits described above, there are too many people who admit to conspiratorial thinking241 for it to be a purely psychological condition. Hofstadter suggests that his ‘paranoid style’ only affects “a modest minority of the population,”242 so it would make sense that, whilst a specific personality-type might increase a person’s likelihood of subscribing to conspiracy theories in general, there are other psychological and social factors which make conspiracy thinking “a widespread tendency across the entire ideological spectrum.”243

235 Franks et al (2013), p.8
236 Thresher-Andrews (2013), p.7
237 Aaronovitch (2009)
238 Tufekci (2018)
239 Thresher-Andrews (2013), p.7
240 Fassin (2011), p.42
241 Glick & Booth (2014), p.798
242 Hofstadter (1964), p.86
243 Oliver & Wood (2014), p.264

Conspiracy theories are shown to provide people with simple explanations of different phenomena. These are attractive because science and medicine can be complex and the results of their studies abstruse. Aaronovitch indicates that the rapid changes brought about by globalisation leave “people desperate for clear, comforting answers.”244 There is also a “basic human need for being safe and in control”245 so when a “superstorm”246 or “a horrific event like a mass murder of schoolchildren”247 occurs, it is more comforting (and psychologically acceptable) to believe it is the handiwork of a malign conspiracy rather than accept the world is chaotic,248 “cruel and unsafe.”249

The introduction of conspiracy as an explanation for unexplained, involute or distressing information also provides another useful trope for the individual: something to blame.250 There is a “culprit”251 or “tangible enemy”252 towards whom a person can focus their “angry feelings,”253 sadness or confusion. This is another way of retaining a modicum of control. Antagonism of this kind creates the conflict that typifies conspiracy theories: the “Manichean narrative”254 of good versus evil. Hofstadter was concerned that people displaying the paranoid style were drawn to concepts of “absolute good and absolute evil”255 with a willingness to fight valued above the ability to compromise.256 By creating polarised positions between right and wrong, conspiracy theories provide a simple narrative where people can choose ‘a side’ (like the Republican party) and essentially support its stance on various issues (like climate change or immigration).

More general psychological influences also encourage conspiracy thinking. One such cognitive bias is “illusory pattern perception”257 whereby the brain searches for, and ‘identifies,’ patterns or meaning in random data. This is a form of self-deception, as the mind tries to find evidence of collusion in places where there may be none, envisaging “agency where there… [is] randomness, ‘cock-up’ or structural dynamics which nobody in particular designed.”258

Individuals can also derive a ‘sense of belonging’ from conspiracy theories, becoming part of an exclusive in-group that unites in challenging convention. They then receive further cues on what to believe and how to act: “the milieus or groups to which people belong predispose them to accept certain ideas more easily while rejecting others, as they make sense of the events to which they are exposed.”259 Joining ‘the club’ gives unique access to “unusual and perceptive ways of looking at things”260 as well as divergent techniques for investigation and epistemological standards to be met.

244 Aaronovitch (2018)
245 Weigmann (2018), p.2-3
246 Mersereau (2013)
247 Wood (2013), p.33
248 Apter (2006), p.370
249 Thresher-Andrews (2013), p.5
250 Franks et al (2013), p.7
251 Ibid, p.8
252 Goertzel (1994), p.739
253 Ibid
254 Oliver & Wood (2014), p.954
255 Hofstadter (1964), p.82
256 Ibid
257 Weigmann (2018), p.2
258 Franks et al (2013), p.1
259 Franks et al (2013), p.6
260 Aaronovitch (2009)

Apart from the psychological, there are societal factors which promote conspiracy theories. Progenitors of these theories may decide to appeal directly to public opinion and there are numerous instances of compellingly spun narratives which have been accepted by the public despite having little epistemic warrant (“the set of reliable reasons to believe in [them]”261). As discussed earlier, conspiracy theories are somewhat self-perpetuating, with belief in,262 or even awareness of,263 one theory making a person more likely to subscribe to others. Their introduction to the internet has therefore turned them into a ‘plague bacillus,’ predisposed to multiply and infect online spaces and the minds of the people who frequent them.

Finally, there are motivations behind conspiracy beliefs that extend beyond purely social and psychological factors. Perhaps the greatest of these is money. In the New York Observer, Mersereau hypothesises that the ‘red thread’ which connects all conspiracy theories, regardless of veracity, is that “the people and organizations pushing [them]… [are] profiting from doing so.”264 There are “clear financial incentives,”265 whether it is advertising revenue paid by Google for videos on YouTube, book sales, public donations or fees received for interviews. This situation has been enhanced by the internet, which positively encourages the creation and dissemination of increasingly surprising, outlandish and vitriolic theories to catch a person’s attention (and thus related advertising revenue).266

2.2.5 What are common characteristics of conspiracy theories?

Conspiracy theories are often accused of “faulty reasoning”267 with beliefs borne of pseudoscientific or antiscientific practices. Although many theorists declare their approval for scientific values of enquiry and proof through experiment, there can be a simultaneous rejection of scientific institutions and the perceived crippling dogmatism of their work.268 Allegedly ‘scientific’ evidence submitted by conspiracy theorists can also be questionable and often falls short of the standards required for bona fide scientific research. Conventional science has a long and successful history of verification through peer-review and accreditation.269 Having initially “[scaled] a wall of skepticism,”270 every scientific idea is subject to cross-examination for the rest of its functional life. On the other hand, conspiracy theories are considered less robust in their ‘internal’ investigations, applying a different set of rules which may be less stringent and more permissive of conflicting, or ‘cherry-picked,’ evidence.271

One pseudoscientific technique sometimes used in conspiracy theories is “quote-mining.”272 Evidence is finessed with numerous citations which apparently support the theorist’s stance (on everything from climate change to creationism), many of which are taken out-of-context or are derived from propitious, but flawed, sources.273 Another similar practice, the “cross-citation,”274 sees authors citing each other’s works concurrently, so that there is an infinite loop of quotation: X cites Y, Y cites Z, Z cites X, etc.275 Through citing other works, however spuriously, the conspiracy theorist gives his work the semblance of authority and respectability.

261 Fasce & Picó (2019), p.111
262 Goertzel (1994), p.731
263 Lantian (2013), p.19
264 Mersereau (2018)
265 Ibid
266 Varshney (2019), p.82-86
267 Weigmann (2018), p.2
268 Harambam & Aupers (2015), p.477
269 Dentith (2018), p.200
270 Weart (2011), p.41
271 Hansson (2017), p.40
272 Hansson (2017), p.41

When challenging established narratives, conspiracy theories might invoke “zombie arguments.”276 These are old contentions which have already been discredited, demonstrably repelled or otherwise ‘killed off’ but are brought ‘back to life’ to fight another battle. Although they are decrepit and falling apart, constant attacks by hordes of these undead allegations can obfuscate the truth (or at least waste a lot of time).

Conspiracy theories often use ‘questioning’ as a technique to counter criticism and suggest that “there is something wrong with the official story.”277 Even where the questions raised add nothing to their own ideological viewpoint, this method of objection can cause sufficient doubt to destabilise hypotheses which might otherwise be considered scientific fact.

Michael Wood suggests that theories are becoming “more and more vague,”278 with unsubstantiated propositions being thrust inchoate into the public sphere, with the “specifics of what happened [left] as an exercise for the reader.”279 This acts as a defence mechanism which deflects criticism of inconsistent premises by leaving them out in the first place. Conspiracy theory believers can then make ‘leaps of faith’ which are most consistent with their existing worldview: a vague statement that 9/11 was an ‘inside job’ may be equally supported by a person who suspects the FBI as by one who blames the ‘new world order.’

Conspiracy theories can be extremely resistant to logic and naturally self-sealing against criticism. Weigmann outlines how conspiracy theories are “difficult to argue with reason”280 as their proponents are capable of “[developing] elaborate rationalizations... to justify their beliefs.”281 Furthermore, a tendency towards mistrusting conventional knowledge sources can result in instant, a priori rejection of scientific expertise.282 Where critiques of conspiracy theories do arise, they are regularly accepted as evidence in favour of the theory: if, for instance, a key premise is that a governmental body is lying, then their denial of conspiracy is considered ‘evidence’ of their mendacity. The fact that counter-arguments against conspiracy can be repurposed by theorists to support their own causes283 indicates the kind of “determined flexibility”284 that characterises conspiracy thinking, whereby “reverses can be accommodated within the theory itself”285 and crucial evidence can be simply ignored. Perhaps this aberrance from logic is due to the psychological motivations of conspiracy advocates or may be considered justifiable because of social factors. Either way, it can be very difficult to prove these theories wrong or otherwise entirely remove them from the public narrative.

273 Ibid
274 Aaronovitch (2009)
275 Ibid
276 Weart (2011), p.48
277 Brotherton (2013), p.10
278 Wood (2013), p.32
279 Ibid
280 Weigmann (2018), p.1
281 Ibid, p.2
282 Cullen (2018), p.145
283 Aaronovitch (2009)
284 Ibid
285 Ibid

2.3 Background on Conspiracy theories Chosen for the Study

Not all conspiracy theories are created equal. Some are more truthful than others, some are more probable than others, some are better evidenced than others and some are more harmful than others. It is quite clear that conspiracy theories “need to be assessed on their individual merits”286 and, to that end, there follows a short evaluation of each of the specific theories which are used in this study. Although the exact method used is described in detail later, it is necessary at this stage to give a brief overview of how the data was collected.

A word or phrase relating to a conspiracy theory was entered into the YouTube search box. The recommended video results of this search were viewed and assessed, and the results collated. There were five original search terms used, each associated with a different conspiracy theory. The subjects investigated intentionally cover a broad range of conspiracy theories, from global concerns to local eccentricities, from seemingly futile arguments against proven fact to matters which are more probable (or at least provide better reasons for the belief). Searches encompassed the following ideas of conspiracy: global warming caused by humankind is unproven or a hoax, the Earth is actually flat, fluoridation of water is harmful, the HAARP scientific facility is a terrifying weapon and Denver airport is a secret headquarters for an evil cartel (the Freemasons, the Illuminati, lizard people, etc).

2.3.1 Climate change

Anthropogenic climate change is a matter of scientific consensus.287 In a huge meta-analysis of 11,944 climate articles, Cook et al discovered that “among abstracts expressing a position... 97.1% endorsed the consensus position that humans are causing global warming.”288 Despite being (more or less) undeniable scientific fact,289 some people still deny that climate change is caused by human activity, with accusations that it is a “hoax constructed by scheming interests.”290 This “significant gap between public perception and reality”291 is highlighted by approximately “40% of Americans [rejecting] the consensus.”292 So, when climate change is the “defining issue of our time,”293 why are some people still questioning whether it is even happening?

Firstly, climate change science is complex, based on “non-trivial mathematics”294 and not particularly accessible to the layperson. The second issue “has everything to do with politics;”295 there is a recognised correlation between climate change denial and right-wing politicians.296 Thirdly, the whole gamut of conspiracy theory tactics has been used to oppose the scientific consensus: prominent strategies of denial include quote-mining,297 questioning,298 “zombie arguments”299 and suggestions that the matter is still a subject of meaningful debate.300 The latter point has been exacerbated by media praxis whereby a minority of climate change denialists have been given an unjustifiably equal platform to those representing the scientific consensus.301 Recently, the BBC’s director of news and current affairs issued internal guidelines, making it clear that “man-made climate change exists”302 and warning against creating “false balance”303 when reporting on climate change.

286 Dentith (2018), p.197
287 Weigmann (2018), p.1; Weart (2011), p.45; Lewandowsky (2018), p.149; Cook et al (2013), p.1; Uscinski & Olivella (2017), p.1
288 Cook et al (2013), p.1
289 Weart (2011), p.41-5
290 Uscinski (2018), p.10
291 Cook et al (2013), p.6
292 Uscinski & Olivella (2017), p.1
293 UN (2019)
294 Hansson (2017), p.43

Finally, there can be a psychological aspect to climate change denial. Weart suggests it may be a type of defence mechanism that individuals use to deal with uncomfortable or threatening information.304 Cullen describes how some people are pre-programmed to reject the scientific evidence outright.305 Both these responses may be enhanced by psychological factors common to conspiracy theories, including the sense of belonging to a certain (political) group, the desire to be safely in control and the preference for simple explanations.

2.3.2 Flat Earth theory

Like climate change, the subject matter contested by conspiracy theorists in this case is “so well established and supported by so much rigorous evidence that [it is] legitimately referred to as fact.”306 However, unlike climate change, the scientific debate “about the shape of the Earth has been settled for over two thousand years”307 and therefore constitutes a “scientific truth”308 that underpins practically everything we know. The evidence that we live on a ‘round’ planet is comprehensive, pervasive and verifiable by the layperson.309 It is therefore difficult to understand why anyone would believe that (a) the Earth is flat and (b) there is a conspiracy to conceal that information.

In Skeptic Magazine, Loxton cites “intuition and fundamentalist religious faith”310 as the two primary motivators for the flat Earth theory. The first of these is the subject of an interesting study by Claus-Christian Carbon which shows that whilst people understand the world is spherical, when calculating distances they cognitively ‘flatten’ the Earth, simplifying the estimated measurement.311 Secondly, whilst religion based on faith might cause problems, in practice it is rare to find a credo which defies centuries-old evidence of a ‘round’ Earth.

295 Lewandowsky (2018), p.149
296 Van der Linden (2015), p.173; Weart (2011), p.45; Uscinski & Olivella (2017), p.1
297 Hansson (2017), p.41
298 Cook et al (2013), p.6
299 Weart (2011), p.46
300 Hansson (2017), p.41
301 Weart (2011), p.46; Weigmann (2018), p.3; Cook et al (2013), p.6
302 Unsworth (2018)
303 Ibid
304 Weart (2011), p.49
305 Cullen (2018), p.145
306 Lewandowsky (2018), p.149
307 Loxton (2018), p.8
308 Carbon (2010), p.130
309 Břízová et al (2018), p.5; Carbon (2010), p.130
310 Loxton (2018), p.9

The idea that an actual conspiracy exists to hide the truth about the Earth’s flatness is hard to understand. Loxton is excusably critical in his assessment of these claims, indicating the absurdity of such beliefs, particularly when the concept of a flat Earth already has such thin epistemological warrant.312

Of the five conspiracy theories outlined here, ‘flat Earth’ has the least evidential merit and so the actions of its proponents are probably best evaluated according to general tendencies. Some people will surely be benefitting financially from the theory and there will be psychological gratifications for others (belonging to a group, opposition to the establishment, etc). It is because “flat Earth ideas make no sense and explain nothing”313 that it is included in the study. YouTube talks about “reducing recommendations of borderline content,”314 referring to this specific theory as “content that could misinform users in harmful ways,”315 so it should act as an indicator of how successfully YouTube shapes its recommendations and deals with conspiracy theories, regardless of their merit.

2.3.3 Fluoridation of water

Focussed on the public health measure of adding fluoride to water supplies to prevent tooth decay, fluoridation conspiracy theories consider the practice to be harmful in a variety of ways. Fluoridation has been accused of increasing the risk of debilitating health conditions including “cancer, Down syndrome, heart disease, osteoporosis and bone fracture”316 amongst others, despite there being no evidence to support these claims.317 Horowitz describes how fluoride has also been incorrectly called a “cumulative poison, gradually accumulating in body tissues to toxic levels.”318 Glick & Booth list some more radical accusations levelled by conspiracy theorists, who link water fluoridation with disposal of industrial waste, Russian prison camp experiments to produce schizophrenia, Axis power efforts during the second world war to keep populations subdued, a cover-up for US failures to provide dental care to the poor and the regularly-cited plots (by the illuminati, communists, fascists, etc) to ‘take over the world.’319

In contrast, the CDC (Centers for Disease Control and Prevention), which is the principal public health institution in America, considers fluoridation to be one of the great public health achievements of the 20th century.320 Fluoride “works to prevent and control dental decay”321 and people from areas where water is fluoridated “[exhibit] a lower prevalence of dental caries.”322 Uscinski provides the example of Calgary City Council, which stopped fluoridating the municipal water in 2011; studies indicated an almost immediate rise in tooth decay amongst children.323

311 Carbon (2010), p.130-5
312 Loxton (2018), p.10-1
313 Ibid, p.11
314 YouTube (2019a)
315 Ibid
316 CDC (1999), p.938
317 Ibid
318 Horowitz (2003), p.5
319 Glick & Booth (2014), p.799
320 CDC (1999), p.933
321 Horowitz (2003), p.3
322 Frangos et al (2018), p.253

Whilst fluoridation appears similar to climate change or flat Earth theory, with conspiracy theorists facing off against scientific consensus, it is perhaps more nuanced. There is a perceivable ‘side-effect’ called fluorosis, which can lead to discolouration of the teeth. Whilst Horowitz considers this a minor issue when compared to benefits related to tooth decay,324 individuals whose teeth become flecked with white spots may be less psychologically inclined to consider them an indication of excellent dental health. There are also concerns that fluoridating water supplies is a form of “unethical socialised medicine”325 which, regardless of conspiracy theorists’ concerns over nefarious effects (stupefaction, pacification or mind control), may seem unacceptably prescriptive to some people. From a public health perspective, these are minor issues, but they can form a populist entry point for conspiracy narratives to infiltrate mainstream public opinion.

Frangos et al’s comprehensive study into the quality (and quantity) of salient information online provides an overview of the conflicting opinions surrounding the subject.326 Whilst only a minority of web pages took an anti-fluoridation stance, these were often accompanied by “attention-grabbing claims (albeit unsubstantiated).”327 The study concludes with the assertion that “public health professionals... will need to enhance their promotion of fluoridation,” a point echoed by Horowitz328 and one which suggests that control over the ‘truth’ of the matter is still an ongoing struggle.

2.3.4 HAARP

The High-Frequency Active Auroral Research Program, or HAARP, is an Alaskan scientific research station329 capable of heating “small regions of the ionosphere and [observing] the effects.”330 Originally developed by the US Air Force to explore new avenues for communicating with nuclear submarines,331 the facility was transferred to the University of Alaska in 2015.332 As with many subjects of conspiracy theories, an internet search for HAARP “can get really strange, really fast.”333 Its capabilities are imagined to be powerful and wide-ranging, being variously accredited with the following: modifying the weather,334 controlling natural disasters (including earthquakes) worldwide,335 transforming airborne gas into a ‘killer shield’ to protect from missile attacks,336 mind control,337 firing “death beams,”338 being capable of geophysical manipulation339 and bringing down the Columbia space shuttle.340

323 Uscinski (2018), p.11
324 Horowitz (2003), p.7
325 Frangos et al (2018), p.253
326 Ibid
327 Ibid, p.259
328 Horowitz (2003), p.7
329 Streep (2008), p.61
330 Fournier (2017)
331 Weinberger (2008), p.930
332 Pederson (2015), p.73; Mitchell (2017)
333 Fournier (2017)
334 Mersereau (2018); Fournier (2017); Streep (2008), p.60
335 Mersereau (2018); Jacobs (2017), p.331
336 Weinberger (2008), p.931; Streep (2008), p.60

Although many of the claims made about HAARP are outlandish, the facility is a more credible target for conspiracy theorists. There are several factors which invite scrutiny from an inquisitive public, for example military involvement, remote geographic location and obscure scientific purposes. Regarding its role as a base for military research, Jacobs reported that “the information stream available to the public was and is demonstrably incomplete.”341 Furthermore, she found (in 2015) that around 65% of official records on HAARP were ‘withheld’ from the public, with about another 6% being partially withheld.342 This lack of transparency, and the gaps in knowledge it creates, leaves ample room for speculation and, moreover, conspiracy theories.

HAARP is the most high-powered ionosphere heater in the world.343 It is used to “send a focused beam of radio-wave energy into the aurora zone”344 which basically heats up a portion of the atmosphere. Scientists have reported that the facility has been able to create an artificial aurora borealis (or Northern lights).345 To the layperson, who might not understand the science behind these experiments (even if access to it were not prohibitively restricted), making changes to the Earth’s atmosphere can appear dangerous or suspicious. In the absence of visible facts, people are predisposed to impose their own understanding, prejudices and fantasies on the HAARP station. One significant influence worth noting is the book “Angels Don’t Play This HAARP” by Nick Begich, which portends “a new class of weapons that could change our world profoundly... could mess up the weather”346 and has earned the author “probably a million bucks.”347

Nevertheless, the rumours that surround HAARP are, in the main, unsubstantiated conspiracy theories: there is no credible evidence for the more far-fetched accusations, the facility is not top-secret,348 it is not the only ionospheric heater in the world,349 its effects dissipate almost immediately once it is ‘switched off’350 and its intensity is “miniscule compared with… a lightning flash.”351 As part of the University of Alaska Fairbanks, it is now a research facility for hire.352 If HAARP were indeed capable of mind control, it would make the Alaskan university a pretty big player in global academia.

337 Fournier (2017); Streep (2008), p.60; Jacobs (2017), p.330-1
338 Jacobs (2017), p.330
339 Ibid, p.331
340 Streep (2008), p.63
341 Jacobs (2017), p.352
342 Ibid, p.331
343 Pederson (2015), p.72; Jacobs (2017), p.331
344 Rozell (2016)
345 Fournier (2017)
346 Begich (2007), p.8
347 VICE News (2017)
348 Rozell (2016)
349 Weinberger (2008), p.932
350 Fournier (2017)
351 Streep (2008), p.63
352 Rozell (2016)

2.3.5 Denver airport

The most frivolous of the conspiracy theory subjects used in this study, Denver International Airport has nonetheless been a magnet for strange hypotheses since it opened in 1995.353 It has been included because theories associated with the airport rest upon a few slightly odd facts and little verifiable evidence. Mostly the conspiracies evoked are ‘grand narratives’ in which Denver airport appears to be a suitable accomplice.

The allegations are often related to the scale of the airport, with suggestions it houses a secret military base,354 a concentration camp,355 or an underground headquarters for ‘lizard people,’356 the global elite,357 the New World Order358 or the illuminati.359 These assertions are largely based on trivial aspects of the airport: it is huge,360 construction costs were significantly over-budget,361 the subterranean level is massive362 and the configuration of six runways looks (vaguely) like a swastika.363 There are other thematically-related eccentricities to the airport, such as “strange and disturbing”364 artworks by Leo Tanguma, an imposing ‘devilish’ sculpture of a blue mustang nicknamed ‘Blucifer’ sitting outside the airport365 and a dedication stone donated by the Freemasons containing cryptic references to the ‘New World Order’366 and the highest order of the masons.367 Taken at face-value, this ‘evidence’ of conspiracy all appears to be rather tenuous.

Denver airport is interesting because unremarkable facts, like inaccurate budget estimates, large basements and sculptures with tragic histories, are used to support the speculation. The epistemic warrant of the ‘hidden base’ theories is demonstrably poor, yet they continue to circulate. One reason why these ideas persist, especially in mainstream media, is because they are quite fun, with the airport itself using them to raise its profile.368 At the more ‘harmless’ end of the spectrum, Denver airport should be a useful starting point for investigating whether YouTube video recommendations do tend towards the extreme.

353 Wolfson (2018)
354 Smith (2015)
355 Ibid
356 Wolfson (2018)
357 Buzzfeed Multiplayer (2016)
358 Ibid
359 Ibid
360 Smith (2015)
361 Ibid
362 Wolfson (2018)
363 Smith (2015)
364 Hunt (2008)
365 Smith (2015)
366 Ibid
367 Buzzfeed Multiplayer (2016)
368 Smith (2015)

3 Theoretical Framework

This study has been conceived as a critical discourse analysis because, whilst YouTube’s algorithm decides which videos are recommended, it is informed by the behaviour of billions of global viewers and, ultimately, can only recommend artefacts which already exist on the platform. Wooffitt describes how “anything can be analysed as a text”369 and on YouTube there are various texts interacting to create meaning; these include (but are not limited to) videos, viewer comments, visible view-counts and the recommendations output of the platform’s algorithm. The functions of the algorithm might be envisaged as bringing together numerous coexistent narratives (viewer predispositions, commercial considerations, pre-programmed normative restrictions, etc) which are distilled into a series of viewing options, presenting a selection of ‘texts’ which the platform considers most relevant. However, the algorithm itself can also be considered as a text, as it plays an essential role in collating and interpreting data collected by YouTube. This concept of algorithm-as-text corresponds with Cheek’s description: “texts not only represent and reflect a certain version of reality, they also play a part in the very construction and maintenance of that reality itself.”370 Hodkinson states that “discourse analysis attempts to place [texts] in context”371 and it is through understanding the broad social, political, psychological and cultural background framing YouTube content that we might appreciate why conspiracy theories are popular, what makes them “float to the surface,”372 and how their prevalence impacts upon our understanding of the world.

Conspiracy theories challenge conventional knowledge and structures of understanding. YouTube is a marketplace of entertainment where videos compete for the attention of a global audience, with successful user-producers profiting from Google’s advertising revenue. As anyone can post content on YouTube, it reflects the unlimited opinions, interests, discriminations and fallibilities of humankind, with viewer choices reflecting personal preferences and prejudices. Strangelove calls YouTube “a battlefield, a contested ground”373 and it is this evocation of competition that makes the subject ideal for a Foucauldian discourse analysis (FDA). Michel Foucault envisaged discourse as “closely intertwined with relations of power”374 and just a cursory look at the proliferation of videos promoting or debunking conspiracy theories on YouTube reveals various struggles: for views, for validation, for popularity, for the favour of the recommendation algorithm and for authority over the perceived ‘truth’ of the matter.

YouTube is a meeting place for discourse where an algorithm plays a significant role “organizing... gatekeeping”375 and prioritising the texts available to the viewer. It essentially ‘regulates’ the discourse flow through the platform and thus acts as a “[realiser] of discourse,”376 amplifying some voices (and messages) whilst silencing others. The algorithm is therefore a necessary object for study and yet, as explained earlier, it is distinctively difficult to interrogate. This is one reason why an FDA is employed in this study. Through focussing on discourse, one can evaluate the algorithmic process indirectly, “[experimenting] with inputs and outputs”377 to reveal what it does and, subsequently, how that affects the narratives surrounding it. By using this method, the origin of the output (programmer, artificial intelligence, user data, machine-learning, “human or bot”378) becomes insignificant, with observable discourses (and their influence) being the only relevant factor.379 It is not necessary to search for an objective truth, rather to understand that there are ‘truth games’ being played out, “with the audience determining which truth is the victor.”380

369 Wooffitt (2005), p.148
370 Cheek (2004), p.1144
371 Hodkinson (2017), p.68
372 Bouvier (2015), p.153
373 Strangelove (2010), p.4
374 Hodkinson (2017), p.67
375 Schmitt et al (2018), p.781
376 Bouvier (2015), p.153

Whilst FDA appears to be a suitable methodology for this study, it is inherently problematic. As we shall see, with an FDA “theory and method are intertwined”381 and, furthermore, there is no general method (or framework) stipulated by the theory.382 The methodology therefore appears untethered, with ever-changing, seemingly-capricious discourses unable to act as significant anchors for any subsequent analysis. On the other side, we have multiplicitous, impenetrable algorithms performing vast, arguably-incomprehensible calculations in the online ‘cloud.’ The question therefore arises of how to effectively ‘ground’ the study, when its constituent elements are apparently elusive. To address this issue, the FDA is primarily designed to be built upon, and informed by, its context. Furthermore, whilst stopping short of a fully ‘mixed methods’ approach, the analysis also employs multiperspectival383 aspects, using a content analysis framework to clarify the discursive limits and act as a ‘fixed point’ between FDA and the various texts.

3.1 Foucauldian Discourse Analysis (FDA)

Foucauldian discourse analysis is a subgenre of critical discourse analysis (CDA) focussing on knowledge (what is considered valid, where does it come from, what effects does it have, how does it shape society)384 and how it is negotiated through discourse. Knowledge is considered ‘conditional,’ replicating the subjective conditions of the time and place of its formation.385 Consequently, there is no objective truth: the subject matter has no “transcendental”386 form that can raise it above the “giant milling mass of overall societal discourse”387 and therefore it is endlessly exposed to potential reconfiguration by the pressure of interconnected narratives. This means two things: firstly, knowledge changes over time and, secondly, there is no escaping discourse (or possibility for taking an objective, detached view of it). Regarding the first point, Jäger & Maier explain how every discourse strand (which represents a more concrete utterance of discourse) has “a diachronic and a synchronic dimension,”388 meaning that whilst one can take a synchronic ‘snapshot’ of a given subject at a given point in time, it will still be coloured by diachronic considerations (of background and history) so can never be fully isolated from the vicissitudes of time.389 Foucault maintains that it is impossible to escape the realm of discourse or take an ‘outside’ view of it390 and his method invariably involves “direct personal experience”391 of the issue. However, rather than lament “how trapped you are,”392 you are invited to incite change ‘from the inside.’

377 Bucher (2018), p.61
378 Sam (2019), p.337
379 Ibid
380 Ibid, p.339
381 Jørgensen & Phillips (2002), p.4
382 Foucault (1991), p.29
383 Feltham-King & Macleod (2016), p.1
384 Jäger & Maier (2009), p.34
385 Ibid
386 Foucault (1980), p.117
387 Jäger & Maier (2009), p.35
388 Ibid, p.46

Every time you address a discourse, you contribute to its credence. A discourse which is not being discussed, invoked or engaged with loses meaning, returning to a “blank state.”393 This is part of the wider struggle to remain relevant and influential. When a YouTube video indicates support for the theory that fluoridation is poisoning water supplies, it is supporting countless discourses: anti-fluoridation, anti-science, anti-establishment, poisoning people is bad, YouTube is a valuable tool for disseminating information, the video-producer’s opinion is worthwhile, etc. The current scrutiny of YouTube’s recommendation mechanism highlights a variety of power struggles being played out in a very public forum. The platform is caught between a drive towards growth (increasing views, time spent and advertising revenue) and a purported responsibility towards the safety of its users. This conflict is echoed by the contrast between its commitment to free speech394 and its desire to avoid inhibiting external regulation. Conspiracy theorists expound ever more surprising and divisive ideas to catch the attention of larger online audiences, but must avoid being “so extreme in their claims that they are unpalatable to those outside of the echo chambers in which they were developed.”395

Foucauldian discourse analysis allows a “top-down approach to the study of language which is concerned with power, ideology, discourses, texts and subject positions,”396 meaning that one can ‘get to grips’ with the political, social or cultural discourse being negotiated before dealing with the grammar and syntax of individual discursive artefacts. This is because FDA is born of an academic social psychology that does not approach language from a strictly linguistic basis, but considers it “rooted in social practice.”397 If “pictures of reality”398 are being constantly constructed, challenged and destroyed, it is possible to interrogate these manifestations of ‘truth’ directly to potentially discern what power structures underpin them.

Whilst Foucault’s method provides indications of how to approach a research subject, it “is not an integrated paradigm”399 and stops some way short of prescribing a comprehensive methodology. Jäger & Maier explain how CDA is “not a rigid formula that can be followed mechanically to produce results,”400 whilst Foucault himself declares that he “[does not] develop deductive systems to apply uniformly in different fields of research.”401 An FDA has “no general method,”402 is “at most… instrumental, visionary and dream-like”403 and is more of an invitation404 than an instruction book. This makes any study more personal and certainly fits with the experiential405 quality of being inescapably immersed in the ongoing discourses, reflecting the subjective nature of truth and necessitating the surrender of any personal desire to monopolise, or regulate, knowledge.

389 Ibid
390 Jørgensen & Phillips (2002), p.14
391 Foucault (1991), p.38
392 Ibid, p.174
393 Jäger & Maier (2009), p.40
394 YouTube (2019a)
395 Wood (2013), p.32
396 Wooffitt (2005), p.149
397 Ibid, p.147
398 Jørgensen & Phillips (2002), p.14
399 Diaz-Bone et al (2007), para.1
400 Jäger & Maier (2009), p.56
401 Foucault (1991), p.27
402 Ibid, p.29

3.1.1 Discourse

What does it mean to analyse discourse? Discourses are inextricably bonded to narratives of power. Although they reflect human behaviour and mores, they are also instruments for negotiating and exercising social power.406 Whilst their meaning can shift through time, they form discrete “material [realities]”407 which, when totalled, “not only shape, but enable (social) reality.”408 An analysis of discourse is therefore not just an exercise in separating the various conflicting and complementary discourses from one another, but an investigation into what connects them, what makes them appear truthful (or authentic) and where the analyst stands in the overall narrative. Jäger & Maier, whose work has been invaluable to this definition, explain how CDA is designed to reveal the inconsistencies and contradictions between discourses, demarcating the “limits of what can be said and done”409 and revealing how “discourse makes particular statements seem rational and beyond all doubt.”410 They also highlight the schism between the synchronic and diachronic aspects of discourse; whilst an analysis is only “valid at one time and place,”411 the meaning visible in that instance must be understood as a fleeting moment captured from an “ongoing production of reality.”412 Finally, as discussed earlier, anyone conducting a CDA must accept that he is entrenched in the discursive milieu and that any exercise of individual agency is enacted by the discourses, not him.413

Foucault’s estimation of discourse is that it represents a constructive and “productive”414 conduit for power. Similarly, knowledge, truth, power and discourse are all related: more powerful actors have more access to discursive tools (such as media), thus they enjoy more control over the type of knowledge produced and are more likely to shape social ‘truth.’415 Through opening up online media channels to practically everyone, YouTube can be considered a new battleground where discourses concerning who, or what, constitutes a credible source of information are less settled than those evident in traditional media. Conspiracy theories are in a struggle to affect different narratives (who should we believe, where should our knowledge come from, what do we value, etc) and are gaining considerable traction in, what Jacobs calls, the “digitally facilitated ‘noosphere.’”416

403 Ibid
404 Ibid, p.40
405 Ibid, p.63
406 Jäger & Maier (2009), p.35
407 Ibid, p.37
408 Ibid, p.36
409 Ibid
410 Ibid
411 Ibid
412 Ibid, p.37
413 Ibid, p.36
414 Jørgensen & Phillips (2002), p.13
415 Ibid, p.14-15

When it comes to the attribution of meaning, Foucault connects discourse and objects through the dispositive: a mechanism which describes the interaction between discourse, non-discourse and material things.417 Whereas discourse describes what is “already-said,”418 non-discourses are all the potential statements, propositions and utterances which have never been verbalised, the “never-said… [or] not-said.”419 The dispositive functions to assign meaning to objects and thus create what is considered to be reality.420 Jäger & Maier expand Foucault’s identified relationship between discourse and things to suggest that “an object that is not assigned any meaning is not an object.”421 It follows from this reasoning that when discourses shift or lose power, then the material reality of an object can similarly shift or disappear; in this case the identity of an object is usurped or lost as the discourse around it changes.422

3.1.2 FDA and YouTube

In describing how YouTube’s value is not primarily about the creative output of the company, but relies on the activities of its users, Burgess & Green outline a decent analogy for the creation and negotiation of discourses that occurs on the platform: “various forms of cultural, social, and economic values are collectively produced by users en masse, via their consumption, evaluation, and entrepreneurial activities.”423 YouTube users are, through discursive practices, establishing which narratives deserve their attention, which are worth producing and which should be ignored; they can essentially “set the agenda themselves.”424 The website is then “machine learning”425 lessons from these choices and, via its algorithm, reinforcing some discourses whilst neglecting others. This digital-Darwinist ‘natural selection’ is referred to by Strangelove as the “cultural wars of YouTube… [where] racist, sexist, [and] homophobic”426 discourses intersect with those concerning “elections, religion, and armed conflicts.”427

These discussions, or power struggles, are often enacted in the ‘comments section’ that accompanies most videos and which “provides a forum for public dialog and argument.”428 Providing an apparently under-regulated space for semi-anonymous opinion, these sections can be filled with more niche, or otherwise-restricted, examples of “controversial, racist or offensive”429 discourses. Although unlikely to break through into the mainstream, unpalatable attitudes expressed on a public YouTube forum keep the related narrative ‘alive’ and relevant, creating a modicum of power (however small) for the discourse and interlinking it with discourses attached to what is acceptable to be written on YouTube, on message boards, on the internet, as well as what it is acceptable to say or think. This relates directly to Foucauldian “archaeology” whereby one ‘excavates,’ or deconstructs, the rules and structures that legitimate certain discourses to establish how they (historically) came to exist.430 Through attempting to identify ‘statements,’ which are described by Foucault as atom-like “undecomposable [elements] that can be isolated,” he seeks to reconceptualise history as relating to the subject of discourse (not its manifestations).431

416 Jacobs (2017), p.337
417 Jäger & Maier (2009), p.39
418 Foucault (1972), p.25
419 Ibid
420 Ibid, p.39-40
421 Ibid, p.42
422 Ibid, p.43
423 Burgess & Green (2009), p.11-12
424 Antony & Thomas (2010), p.1284
425 Nicas (2017)
426 Strangelove (2010), p.4
427 Ibid
428 Antony & Thomas (2010), p.1281
429 Madden et al (2012), p.699

YouTube has particular rules and norms which shape how discourse is conducted on the platform. There are the company’s prescribed “Community Guidelines”432 that are designed to regulate user behaviour. Whilst the website retains the right to remove any content which fails to abide by these guidelines, they have only a minor impact on what gets initially uploaded. In reality, people are likely to be more heavily influenced by activity already visible on the website which, in turn, is informed by general conventions of internet use. Citing as key the “anonymous nature of virtual participatory environments,”433 Antony & Thomas describe how a common feature of “online discussion forums”434 is a tendency to “become controlled by the belligerent and aggressive.”435 This is a recurring theme in social media, message boards and practically any online space which allows user-generated content. Before exploring how hate-filled, attack-minded and vexatiously disruptive dialogue has taken root in digital environments, it is worth mentioning how the virtual anonymity afforded by the internet invites the critic to focus on discourse: any author is necessarily removed from the information (such as videos or comments) that they upload, information which can subsequently be confirmed, challenged or repurposed by agents beyond the original author’s control. Using an ‘archaeological’ approach, rather than attempting to ‘weigh up’ the disparate influences of YouTube guidelines, online language conventions and anonymity, these are best described according to discourse: here a discourse of conflict is enabled by an acceptance of anonymity (and a subsequent indifference to repercussions) that similarly invites an ignorance or rejection of YouTube’s “Community Guidelines.”436

The internet has its own language of hatred. Inflammatory or abusive ‘flaming’437 is rife, whilst antagonistic ‘trolling’438 is used to incite argument, anger and conflict. Strangelove describes how “the Internet has unleashed a tsunami of hatred”439 which itself would appear to reflect a wider trend by “politicians... talk-show radio hosts... [and] commercial cultural products such as rap music”440 in attributing value to hateful language and rhetoric. Returning to Varshney, who suggested that negative information is more surprising and surprise is the key to grabbing (and keeping) attention,441 it is clear that negative or hostile language is more likely to elicit an audience response. This is not to say that all discourse on YouTube is driven by hatred or a desire for conflict: whilst accepting that antagonism is widespread on YouTube’s comments sections, Murthy & Sharma maintain that there are “purposeful discussions”442 amongst the barbs and snares. Furthermore, they contend that even instances of trolling may not be simple ‘bear-baiting’ or ‘spoiling for a fight’ but can be otherwise motivated by “mockery, sarcasm, [or] the ‘lulz,’”443 the last of which describes general amusement or even light-hearted fun.

430 Jørgensen & Phillips (2002), p.12; Bourke & Lidstone (2015), p.835
431 Foucault (1972), p.138
432 YouTube Help (2019c)
433 Antony & Thomas (2010), p.1292
434 Ibid
435 Ibid
436 YouTube Help (2019c)
437 OED (2019b)
438 OED (2019c)
439 Strangelove (2010), p.148
440 Ibid

Ksiazek et al assert that online hostility “[does] not contribute to quality discussion”;444 however, there is no escaping that an adversarial approach works for many users of YouTube and, by stimulating video-posting and audience-engagement, for the platform itself. The phenomenon of video response, once an advertised feature of YouTube,445 can create dialogues between different user-producers as they post sequential commentaries and reactions to one another’s videos. It is evident that ‘conversations’ are being held across YouTube and, although the forms of communication used are not always civil (and can even stray into unconscionable “offensive language”446 or “vicious personal attacks”447), they are reflexively shaping the reality of both our on- and off-line existences.

3.1.3 FDA and conspiracy theories
Conspiracy theories cover such an extreme range of subjects, from the possibly-true to the patently absurd, that their dispositive impact upon the individual subject is likely to be less important than their overall influence on public discourse and how it is conducted (and validated). One would think that the near-spheroid form of the Earth would be accepted as proven, unnecessary-to-contest fact, and yet the ‘flat Earth’ movement continues to enjoy astonishing support. This may indicate that the conspiracy theorists have a genuine desire to reconfigure the public’s perception of their planet, but may equally be underpinned by other ideological considerations or discourses. People might believe the Earth is flat for religious reasons, or because they believe other theories and are predisposed to scepticism of established knowledge; alternatively, they might see the movement as a lucrative opportunity and consider personal financial gain to be worth more than the roundness of the Earth; they may even be doing it for the ‘lulz,’ considering the whole exploit amusing and their amusement to be worthwhile. Whatever the incentive, the raft of apparent support for flat Earth will not change the shape of the Earth, but it can change some people’s perceptions of the planet and, perhaps more importantly, the way they learn to see the world (both literally and figuratively). Essentially, whilst a single conspiracy theory can have a small influence on one narrative, collectively they all address the issue of knowledge, where it comes from and who decides what is ‘right.’

441 Varshney (2019), p.82-6
442 Murthy & Sharma (2018), p.192
443 Ibid, p.206
444 Ksiazek et al (2015), p.852
445 YouTube (2013)
446 Ksiazek et al (2015), p.852
447 Hansson (2017), p.44

Foucault considered discourse to be “closely intertwined with relations of power”448 and it is this power-struggle that underlines the real importance of conspiracy theories. Oliver & Wood rightly assert that “conspiracy theories are simply another type of political discourse that provides a frame of interpretation for public events,”449 with the implication being that the frame is an alternative to dominant or establishment views of the same events. Whilst Uscinski asserts that “the establishment is right far more often than conspiracy theories... [and] when conspiracy theorists are right, it is by chance,”450 he also insists that the “battle of ideas”451 between the two sides is completely necessary in a healthy, free society. In investigating why conspiracy theorists reject scientific authorities, Harambam & Aupers implicate the “politico-economic power structures”452 that might unduly undermine the supposedly objective view attributed to scientists: the idea of impartial neutrality superimposed onto science might actually be a facade utilised by the hegemonic “modern capitalist enterprise”453 to legitimate its own commercial and material interests. The compelling suggestion that scientific information presented as knowledge “is the outcome of power games”454 (i.e. “history is written by the winners”455) would appear to justify some suspicion around how facts are ‘created’ and presented to the public.

Conflict ensues between authorities and conspiracy theorists. On one side, establishment actors are seen to “create boundaries and discursively exclude conspiracy theorists from public debate,”456 using their “money, power, visibility, and expertise”457 to dominate the narrative and effectively shape reality. On the other, conspiracy theorists strive to ensure these “boundaries are contested, negotiated, and re-defined”458 through their questions, investigations and meticulous scrutiny. Both sides are capable of underhand tactics, with the establishment using “the epithet conspiracy theorist... to silence, stigmatize, or belittle,”459 essentially going “meta”460 on the opposition, whilst the anti-establishment theorists might employ observably “faulty reasoning,”461 use questionable techniques (like quote-mining462) or “[resort] to ad hominem attacks.”463 This is a no-holds-barred struggle for control over what constitutes reality. Although the odds are stacked in favour of the traditional arbiters of knowledge, conspiracy theorists producing “alternative facts”464 will always be ready to fight and may “occasionally win the battle for public opinion.”465

448 Hodkinson (2017), p.67
449 Oliver & Wood (2014), p.953
450 Uscinski (2018c), p.109
451 Ibid
452 Harambam & Aupers (2015), p.474
453 Ibid
454 Ibid, p.473
455 Orwell (1944)
456 Harambam & Aupers (2015), p.470
457 Uscinski (2018), p.1
458 Harambam & Aupers (2015), p.470
459 Orr & Husting (2018), p.82
460 Ibid, p.83
461 Weigmann (2018), p.2
462 Hansson (2017), p.41
463 Weart (2011), p.41
464 Uscinski (2018c), p.109
465 Uscinski (2018), p.17

Dissenting voices are necessary to maintain a balanced society and keep dominant agencies from exploiting their position as authorities over knowledge. Although conspiracy theorists may be disruptive (at times unjustifiably so), they can constitute a type of safeguard through their constant enquiry. Their probing involvement in any given discourse could be considered constructive, as it maintains some discourses (including that of questioning authority) even though the techniques used may be destructive in nature. Alternative ideas act as a counterpoint to dominant ideology, helping to interconnect narratives and ultimately ‘paint’ the “pictures of reality”466 that are created from a vibrant palette of discourse.

3.2 Providing stability: a multiperspectival approach
This study advances a methodology with no formal structure to tackle the sometimes ephemeral and opaque texts available on YouTube. In order to provide a firm ‘grounding’ for the analysis, emphasis is placed on the research context, whilst a multiperspectival approach is introduced ‘after the event’ to allow the results to be viewed in relation to a content analysis framework.

Following Sam, “the context of [this] study informs the methodology”467 and the above discussions of both YouTube and conspiracy theories in relation to discourse are intended to provide a functional background to the research analysis. Sam also explains that whilst there is no fixed method or formula to an FDA, Foucault provides “methodological guides”468 which have also been described as “Foucault’s toolbox.”469 The first stage of ‘grounding’ this study is therefore through “positionality”:470 using and understanding Foucauldian interpretive techniques having already positioned the research in the discursive context of YouTube and conspiracy theories.

In addition to this substantial theoretical and discursive background, this thesis incorporates a more experimental multiperspectival dimension to its analysis. Described by Feltham-King & Macleod as a means of “mixing quantitative analysis with the qualitative elements of discourse analysis,”471 multiperspectivalism is shown to enhance discourse analysis “through careful supplementation with the quantification allowed in content analysis.”472 Referring to this process as one of creating a bespoke and suitable methodological “package,”473 Jørgensen & Phillips corroborate that “multiperspectival work is not only permissible but positively valued in most forms of discourse analysis.”474 There is, however, some suggestion that mixed methods bridging the interpretative/structuralist divide can lead to “confusion and contradiction.”475 Furthermore, St Pierre warns against becoming trapped in traditional positivist binaries when using antithetical post-structuralist modes of analysis, advocating instead pure analyses perhaps based on concepts rather than methods.476

466 Jørgensen & Phillips (2002), p.14
467 Sam (2019), p.338
468 Ibid, p.338
469 Bourke & Lidstone (2015), p.833
470 Sam (2019), p.338
471 Feltham-King & Macleod (2016), p.1
472 Ibid
473 Jørgensen & Phillips (2002), p.4
474 Ibid
475 St Pierre (2014), p.9

Taking into account the potential for “confusion”477 and the fact that the inclusion of alternative methods into discourse analysis “has received little [critical] attention,”478 the multiperspectival aspect of this study (whilst justifiable) is restricted to providing an intermediate representative structure which acts as a fixed ‘pivot point’ relatable to both the observable discourses and the texts which inform them. It is also intended to circumvent the criticism levied at discourse analysis concerning “its inability to ascertain the extent to which a particular discourse is being deployed, beyond a general sense obtained by the researcher,”479 through the provision of another lens through which to view the results. The framework interposed in this situation is one conceived by Uldam & Kaun for studying political participation in social media.480 The model appears at least complementary to discourse analysis as it focuses on “the complex media ecology”481 where the practices of media platforms are placed in the perspective of “their embeddedness in wider societal conditions.”482 As outlined in the diagram below, Uldam & Kaun propose four dimensions which are inextricably interlinked: power relations, technological affordances, social media practices and, crucially, discourses.483 Furthermore, the authors advocate empirical research concentrating on one given dimension, in this case discourse, using the others contextually.484 They also envision the model being used to illustrate mixed method approaches to social media which associate discourse analysis with other methods.

Figure 1: Uldam & Kaun's four dimensions of political participation in social media485

476 Ibid, p.7-16
477 Ibid, p.9
478 Feltham-King & Macleod (2016), p.1
479 Ibid, p.6
480 Uldam & Kaun (2017)
481 Ibid, p.191
482 Ibid, p.192
483 Ibid, p.191
484 Ibid, p.190
485 Ibid, p.191

The results of the study are therefore subjected to an FDA before a content analysis based on Uldam & Kaun’s proposed framework is used for further clarification. One might think of this approach as a book containing intermittent illustrations; the ‘written word’ of the FDA carries the narrative, whilst the ‘pictures’ presented by the above framework, though strictly unnecessary, provide further illustration of key points.

4 Method

4.1 Research Questions
This research is directed towards a number of key questions. Some respond to YouTube’s statement that it is “reducing recommendations of borderline content and content that could misinform users in harmful ways,”486 whilst critics are simultaneously raising concerns that the platform’s powerful algorithm promotes false information and leads to increasingly extreme content.487 Firstly, through collection of relevant data, the study aims to discover whether YouTube favours videos containing conspiracy theories, or alternative facts, above more credible content. Secondly, it asks whether following the recommendation system of the website leads down “an ever more radical rabbit hole,”488 with content suggestions becoming incrementally more extreme. Considering conspiracy theories specifically, both these questions address whether anti-establishment ideas or unproven speculation are given an undue level of exposure by the platform’s algorithmic process, thus increasing their visibility, imbuing them with a misleading sense of validity and making the viewer susceptible to increasingly radical messages.

These issues in attributing preference to videos lead into questions about how YouTube, acting as a frame for content uploaded by innumerable producers, influences both the information it contains and the types of interaction that occur between the website’s users. YouTube provides a forum (or perhaps thousands of individual fora) in which people can present different ideological positions, yet it also appears reluctant to regulate that material, purportedly intending to “[maintain] a platform for free speech.”489 This study, therefore, investigates how the design of the platform and actions of the company stimulate discursive activity and how discourse might reflect values encouraged by YouTube and the functionality of its recommendations system. Furthermore, attention is paid to how integrated ‘social’ features of the website (such as the comments section, or like/dislike buttons) impact upon the viewer experience, through creating signals within the visual field which serve to corroborate or contradict information presented by a video. The research thus aims to examine which influences (other than the recommendations algorithm) are embedded in the platform, how these may shape any discourse surrounding the video content and what stimulates increased engagement (views, likes, comments, etc.) in a particular narrative. Considering YouTube as a “contested ground”490 where power over knowledge is being constantly negotiated, the ultimate aim of this study is to better understand whether “divisive, sensational and conspiratorial”491 videos are truly dominating the platform, how this might be encouraged by the recommendation algorithm and the behaviour of its users, and what impact this may have on discourses concerning the creation and distribution of knowledge, who controls the truth and what is valued in modern society.

The research questions for this study can therefore be summarised as follows:
• Does YouTube favour conspiracy theories or alternative facts above more credible content?
• Does the YouTube recommendations system inevitably lead to more extreme content?
• How does YouTube influence both its users and the content it hosts?

486 YouTube (2019a)
487 Tufekci (2018); Lewis (2018)
488 Kaiser & Rauchfleisch (2018)
489 YouTube (2019a)
490 Strangelove (2010), p.4
491 Lewis (2018)

• How does the design of YouTube’s platform stimulate discursive activity and promote the values of the company?
• What embedded functions, apart from the algorithm, exist on the website and how do they interact with and influence individual users?
• What consequences does discursive activity on YouTube have for wider society?

4.2 Research Process
In this study, data was collected from YouTube to establish how different narratives are represented on the platform: which ones are promoted by YouTube’s recommendation system, what kind of videos receive the most approbation from users (in terms of views and likes), and how the comments sections interact with the visual content. Basic searches were conducted by entering conspiracy theory-related terms into YouTube’s embedded search box, with the results being recorded.

YouTube “is a particularly unstable object of study, marked by dynamic change”492 and, whilst in its own communiqué the website suggests improvements will be “gradual,”493 there are near-constant modifications, with “hundreds of changes”494 possible in a single year. Aiming for a synchronic ‘snapshot’ of YouTube processes and discourses, the data was collected over a period of five days. One computer was used by one person to ensure a consistent process was followed.

To ensure that YouTube was not taking cues from any previous activity, the computer’s cookies (data stored by websites) were deleted and the platform was accessed using a ‘private browser.’ This provided a ‘clean slate,’ ensuring any recommendations suggested by the website were uninformed by discernible personal preferences. YouTube was opened directly from the browser window by inputting www.youtube.com into the address bar and, upon arriving on the website’s home page, a predetermined search term was entered into the YouTube search box. A sketch of how this procedure could be automated is given below.
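For illustration only, the ‘clean slate’ procedure described above could be scripted along the following lines. This is a minimal sketch in Python using Selenium with Firefox; the field name search_query is an assumption about YouTube’s markup rather than a documented identifier, and the study’s own collection, as described, was carried out in an ordinary browser.

    # A minimal sketch of the 'clean slate' search procedure, assuming Python
    # and Selenium with Firefox. The field name "search_query" is an assumption
    # about YouTube's markup and may change without notice.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    options = webdriver.FirefoxOptions()
    options.add_argument("-private")               # open a private browsing window

    driver = webdriver.Firefox(options=options)
    driver.delete_all_cookies()                    # clear any stored cookies
    driver.get("https://www.youtube.com")          # arrive on the home page

    search_box = driver.find_element(By.NAME, "search_query")  # assumed field name
    search_box.send_keys("climate change")         # one of the five search terms
    search_box.submit()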

4.2.1 Search terms used
Oliver & Wood have commented that “there is no research that systematically examines support for a wide selection of conspiratorial narratives”495 and, whilst they specifically cite American support, this study attempts to encompass an analogously broad range of theories across an even larger (global) demographic. Through investigating how very different types of conspiracy theory are treated by YouTube, one can come to understand the discursive activity of the platform, whether there are patterns in how it treats uploaded content and whether there is any discrimination between theories. The search terms employed in the study were chosen to associate with the specified conspiracy theories (anthropogenic climate change, flat Earth theory, water fluoridation, HAARP and Denver airport), but also to be as neutral as possible: there is little point searching for ‘flat Earth theory’ or ‘fluoridation conspiracy’ because the results will inevitably be about conspiracy theories. Search terms were therefore picked to admit minimal conspiracy bias, whilst allowing the possibility of conspiracy theory results. These search terms are as follows:

492 Burgess & Green (2009), p.12
493 YouTube (2019a)
494 Ibid
495 Oliver & Wood (2014), p.952

• Climate change
• Earth shape
• Fluoridation
• HAARP
• Denver airport

Whilst it should be possible to imagine innumerable videos which could address the above topics without being conspiracy theories, it could be argued that the phrase ‘earth shape’ might still be unnecessarily partial to conspiracy theories (as the facts are so well established that any content is likely to take a contrary stand). However, there are still ongoing discussions of how round the Earth is, due to it not being a perfect geometric sphere but a kind of oblate spheroid (fatter in the middle)496 and, pleasingly, videos which address this fact materialised during data collection.497

4.2.2 Data collection
When the results page loaded after one of the designated search terms had been entered into YouTube, the top ten recommendations were opened sequentially in new tabs (still in the private browser window), providing a cache of ten open pages. Any videos in the top ten results which displayed the ‘Ad’ sign, indicating that they were a paid-for promotion, were entirely ignored (not opened in a new tab nor otherwise considered part of the search results). This YouTube-created hierarchy of recommendations provided the first data to be recorded: each video was noted for being in the ‘top ten’ of the given search term, as well as having its ‘chart position’ logged. Studies have shown a “position bias,”498 whereby information displayed most conspicuously, such as at the top of a list of search results, is believed to be more important and interesting than material less prominently located. Once this data had been collected, the initial ‘search results’ page was closed down and each individual video page was assessed in turn.
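The ad-exclusion and position-logging rule lends itself to a short illustration. The sketch below (Python, for consistency with the other examples) assumes the results have already been extracted into dictionaries; the keys is_ad, title and url are hypothetical, not YouTube’s own field names.

    # Illustrative sketch: keep the first ten organic (non-'Ad') results and
    # log each video's chart position. The dictionary keys are hypothetical.
    def top_ten_organic(results):
        organic = [r for r in results if not r["is_ad"]]  # paid promotions ignored
        return [{"position": i, **video}
                for i, video in enumerate(organic[:10], start=1)]

    sample = [
        {"title": "Sponsored result", "url": "...", "is_ad": True},
        {"title": "What is HAARP?", "url": "...", "is_ad": False},
    ]
    print(top_ten_organic(sample))  # the organic video only, at chart position 1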

A significant amount of data was harvested from each video page. In addition to information appealing to ‘position bias,’ like the side-bar video recommendations, the study logged statistics related to “social influence bias”499 (where social signals imply importance or interest), such as view-counts and ‘likes,’ and some data which straddled both biases, namely the top-rated comments. For each of the initial ‘top ten’ videos, the following data was recorded:

Title – The name of the video on YouTube.

Title Bias – The content of each title was analysed to establish whether it indicated partiality towards, or against, conspiracy theories. Titles were adjudged to support a conspiracy theory (CT), contradict a conspiracy theory (xCT), raise questions about the subject (Q), be ambivalent – usually by suggesting the video offers a standpoint or opinion but not being clear about what it is (A), be neutral (N) or not relate to the subject at all (U). This process follows the common research technique of content analysis through “qualitative, or non-numeric, data”500 with the stipulated categories designed to allow effective compartmentalisation of information received. An extensive selection of categories was chosen because it allows “more detailed or finer distinctions regarding the data (if this is so desired)”501 but, should simplification become necessary or preferable, the questioning, ambivalent and neutral classifications could be grouped together under a ‘non-biased’ umbrella heading.

496 Wikipedia (2019b)
497 Manley (2019)
498 Lerman (2016), p.5
499 Ibid
500 Miller (2018), p.4
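As a minimal sketch of the bookkeeping (assuming Python were used; the study does not prescribe any particular tooling), the bias codebook and the optional ‘non-biased’ grouping could be represented as follows:

    # The six bias codes described above, with the optional collapse of the
    # questioning, ambivalent and neutral categories into a 'non-biased' group.
    BIAS_CODES = {
        "CT":  "supports a conspiracy theory",
        "xCT": "contradicts a conspiracy theory",
        "Q":   "raises questions about the subject",
        "A":   "ambivalent - a standpoint is implied but unclear",
        "N":   "neutral",
        "U":   "unrelated to the subject",
    }

    def simplify(code: str) -> str:
        """Collapse Q, A and N under the 'non-biased' umbrella heading."""
        return "non-biased" if code in {"Q", "A", "N"} else code

    assert simplify("A") == "non-biased"
    assert simplify("CT") == "CT"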

Description Bias – Most videos on YouTube have a description below the title, with the first couple of lines also appearing on the website’s search results page. This was assessed to ascertain pro- or anti-conspiracy theory bias, using the same categories as ‘Title Bias’ above with one addition for when the video had no description whatsoever (0).

Content Bias – One of the most important analyses for the study, ‘content bias’ was uncovered by watching the video. Evaluating bias was not always straightforward because, whilst there might be visual or audible clues, often presenters and narrators used intentionally vague language, raised questions rather than offering opinions or made ambivalent statements. When attributing bias it was therefore imperative to ensure that it was justified. In cases of doubt, no bias was inferred, with the video being usually considered ambivalent, questioning or neutral. However, where there was good cause to believe that a video signified a preference for either position (pro- or anti-conspiracy theory), the legal concept of the ‘man on the Clapham omnibus’ was invoked: if a paradigmatically reasonable person would think the video exhibited bias, then it was adjudged to have done so.

Date published – The date the video was ‘published’ on YouTube.

Views – The exact total number of views the video had accumulated when evaluated, providing an indication of both reach and popularity.

Likes – Indicated by a ‘thumbs up’ symbol, users can easily click to ‘like’ a video. The total number was recorded as it is presented on the website, so the figures are exact up until 999; anything between 1,000 and 9,999 is rounded down to the nearest hundred (e.g. 2,180 would be shown as 2.1k and recorded as 2,100); anything above 10,000 is rounded down to the nearest thousand; anything above 100,000 is rounded down to the nearest ten-thousand, and so on (see the sketch following the ‘Dislikes’ entry below).

Dislikes – Works in the same way as ‘likes,’ with a ‘thumbs down’ button, and is a good indication of general disapproval.
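To make the recording rule for these figures concrete, the sketch below shows how the website’s abbreviated counts could be normalised to integers; it assumes display strings such as ‘2.1K’, which reflects the format at the time of data collection and may since have changed.

    # Normalise a displayed like/dislike count ('987', '2.1K', '1.2M') to an
    # integer; truncation mirrors YouTube's rounding-down of the display.
    def parse_count(shown: str) -> int:
        shown = shown.strip().upper()
        multipliers = {"K": 1_000, "M": 1_000_000}
        if shown and shown[-1] in multipliers:
            return int(float(shown[:-1]) * multipliers[shown[-1]])
        return int(shown.replace(",", ""))

    assert parse_count("2.1K") == 2_100   # 2,180 likes displayed as 2.1k
    assert parse_count("987") == 987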

Name of user-producer – Person, or company, responsible for creating and/or uploading the video. The name acts as a hyperlink to the user’s particular YouTube channel.

Whether user’s channel is verified – When the user-producer’s name is accompanied by a ‘tick’ badge their channel is considered ‘verified.’502 This means they have over 100,000 subscribers and have applied for the accreditation or that the channel is the official YouTube page for a “brand, business or organisation.”503 The affirmative aspect of this symbol could be considered a visual signifier of correctness, encouraging viewers to trust the channel.

Comments – A significant part of the research, data was collected from the comments section underneath each video. Firstly, it was logged whether the comments section was ‘open’ as, in a minority of cases, the comments forum had been disabled by the video’s producer. Secondly, the total exact number of comments posted to the video was logged. Thirdly, a sample of the comments was subjected to a content analysis. There are two possibilities for ordering what people have written below the video: ‘newest first,’ which sorts them according to date posted, and, the default, ‘top comments.’ Miller, who also makes a content analysis of YouTube comments, advocates “[using] a random selection process where only a subset of the comments are used”504 to avoid creating an “unwieldy”505 sample. In this study, the ten ‘top comments’ were extracted to be evaluated, ignoring any replies they might have. The comments were then assessed in reference to the video to see whether they conveyed support or opposition to conspiracy theories. This was not always straightforward as “it is quite possible for a user to write things in the comments box that are not relevant to the video”506 and, in many cases, “the video needs to be watched closely in order to properly classify the content [of the comment].”507 Furthermore, there were many instances where comments were ambiguous despite it being clear that the writer was trying to make a meaningful statement. Sometimes this was caused by poor language use. Often sarcasm appears to have been employed (a popular intonation on internet message boards but notoriously difficult to ascertain); however, with “short comment length, there may be few contextual clues to help determine whether the intended sentiment is positive or not.”508 Murthy & Sharma provide a good synopsis of issues faced when trying to classify these, often extremely brief or obtuse, comments: “It can be difficult to judge the rationale and intentionality of YouTube users posting and responding, and to disentangle troll-like constructive/destructive, humorous/offensive and serious/banal commentary... The forms of expression can be apocryphal or ambivalent and entangled alongside other antagonistic and non-antagonistic commentary.”509

501 Ibid, p.5
502 YouTube Help (2019b)
503 Ibid

‘Pinned’ comments (which are produced by the channel owner and stick steadfastly to the top of the comments stream) and those written in any non-English language were ignored, in order to avoid misclassification through mistranslation. If there were fewer than ten available comments (excluding replies, which were universally disregarded), then the shortfall was classified as missing (O). Comments were not logged in full but rather subjected to a similar analysis and categorisation system to the ‘bias’ analyses explained above. Reflecting the diversity of comments on display, and their relationship to the video, a larger set of categories was used. Each comment was catalogued as one of the following (a sketch of this sampling rule is given after the list):

• iCT – Intimating support for the conspiracy theory
• CT – Unequivocal support for the conspiracy theory
• eCT – Embellishes or expands the conspiracy theory by offering information not contained in the video
• eoCT – Expands to include other conspiracy theories which are not the specific subject of the video
• N – Neutral

504 Miller (2018), p.6
505 Ibid
506 Madden et al (2012), p.695
507 Ibid, p.712
508 Ibid, p.706
509 Murthy & Sharma (2018), p.209

• A – Ambivalent, presenting an opinion on the conspiracy theory, but unclear which side it favours
• Q – Questioning
• ixCT – Intimating opposition to the conspiracy theory
• xCT – Explicitly opposes the conspiracy theory
• axCT – Provides additional information to challenge the conspiracy theory
• xoCT – Opposes other conspiracy theories not mentioned in the video
• U – Comment unrelated to the conspiracy theory
• O – Comment missing (there are fewer than ten available comments)
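A minimal sketch of how this comment-sampling rule could be expressed (assuming Python; the keys is_pinned, is_reply and is_english are illustrative stand-ins for the filtering applied by hand):

    # Take the ten 'top comments' of a video: skip pinned comments, replies
    # and non-English comments, code the remainder, pad any shortfall with "O".
    from collections import Counter

    def code_comments(comments, classify, n=10):
        eligible = [c for c in comments
                    if not c["is_pinned"] and not c["is_reply"] and c["is_english"]]
        codes = [classify(c["text"]) for c in eligible[:n]]
        codes += ["O"] * (n - len(codes))   # fewer than ten comments available
        return codes

    # Category frequencies across the sample can then be tallied, e.g.:
    print(Counter(["CT", "xCT", "O", "CT"]).most_common())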

Side bar recommendations – The final tranche of data collected concerns the recommendations side-bar which appears on each video page. These recommendations are chosen by the YouTube algorithm, with the content at the top of the list comprising an ‘up next’ video which is enqueued to play automatically once the current video is finished. The content of the first five videos on this list was assessed to see if they related to the same conspiracy theory, other conspiracy theories or unrelated matters. The results of this assessment were classified into these groups:

• CT – Supports the same conspiracy theory
• MCT – Supports multiple conspiracy theories (including the original one)
• DCT – Supports a different conspiracy theory or theories
• N – Neutral view of same conspiracy theory
• NDCT – Neutral view of a different conspiracy theory
• A – Ambivalent take on same conspiracy theory
• Q – Questioning video
• xCT – Challenges the conspiracy theory
• xMCT – Challenges multiple conspiracy theories (including the original one)
• xDCT – Challenges a different conspiracy theory or theories.

To establish whether the subject matter of a new, side-bar recommended video related to a conspiracy theory, a number of indicators were used. If the video used the word ‘conspiracy’ in its title or description, it was considered a pertinent subject. Otherwise, subject matter was cross-referenced against the Telegraph’s 2008 article “The 30 greatest conspiracy theories”510 and Wikipedia’s “List of conspiracy theories”511 to see if there was a match. Where a ‘side bar’ video was found to be related to one or more conspiracy theories, this video was opened in a new tab to undergo a complete assessment as described above.
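The relevance test reduces to a simple keyword-and-list check, sketched below; the REFERENCE_TOPICS set stands in for the Telegraph and Wikipedia compilations actually consulted and is illustrative only.

    # Decide whether a side-bar video is conspiracy-related: an explicit
    # 'conspiracy' mention, or a match against the reference lists.
    REFERENCE_TOPICS = {"flat earth", "chemtrails", "moon landing", "new world order"}

    def relates_to_conspiracy(title: str, description: str) -> bool:
        text = f"{title} {description}".lower()
        if "conspiracy" in text:               # explicit self-identification
            return True
        return any(topic in text for topic in REFERENCE_TOPICS)

    assert relates_to_conspiracy("Flat Earth PROOF", "") is True
    assert relates_to_conspiracy("Landing at Denver airport", "cockpit view") is False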

Once all the data had been collected from the ‘top ten’ original results page recommendations (from the initial search term), any relevant ‘side bar recommendation’ videos were assessed following the same procedure. The only difference was that any secondary side-bar recommendation videos from this stage of the data collection were NOT opened to be individually evaluated.

This entire process, from deletion of cookies and opening of a new ‘private browser’ window, was completed in full for all five of the designated search terms. If, at any stage, a video was recommended multiple times by YouTube, these instances were noted (whether it was a ‘top ten results’ or ‘side bar’ recommendation) with the data collected for the specific video but, obviously, the video was not reassessed, thus avoiding duplication.

510 Hunt (2008)
511 Wikipedia (2019c)

The method followed produced a data set of 103 videos: 50 from the original searches and 53 from side-bar recommendations.
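The duplicate-handling rule described above amounts to a simple ‘seen’ set; a minimal sketch follows, assuming video URLs serve as unique identifiers:

    # Note repeat recommendations but never assess the same video twice.
    assessed = set()
    repeat_log = []

    def handle_recommendation(video_url: str) -> bool:
        """Return True if the video still needs a full assessment."""
        if video_url in assessed:
            repeat_log.append(video_url)   # the repeat instance is noted
            return False                   # ...but the video is not reassessed
        assessed.add(video_url)
        return True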

4.3 Weaknesses and limitations
The data was collected using a desktop PC in France. Although private browser windows were employed, websites are still able to access basic geographical information from an internet connection’s IP address and shape their content according to geolocation, which can have “a significant impact on the results generated.”512 This was obvious in that, after cookies had been cleared, the majority of pre-search recommendations on the YouTube homepage were French-language. However, because the search terms used were in English (except for HAARP, which is an English-language acronym), a clear majority of results displayed were acceptable to the study. The only anomalies were a couple of videos recommended about HAARP which were in German (and thus not included in the data collected). However, despite the use of English to conduct the searches, YouTube is still likely to have made recommendations which “reflect the pre-existing beliefs of the particular region based on previous Internet traffic,”513 so results may be flavoured by peculiarly French or, even more local, Alpine tendencies.

When evaluating the ‘side bar’ recommendations, it was necessary to decide whether a video indicated an opinion regarding conspiracy theories. This was more straightforward when considering subjects like Denver airport, where non-conspiracy videos could be about planned improvements to the terminals or show an airplane landing: themes that do not intersect with the conspiracy narrative at all. Unfortunately, videos around the ‘flat Earth’ topic were sometimes more difficult to code. Some side-bar recommendations derived from pro- or anti- ‘flat Earth’ conspiracy theories did not talk about the shape of the planet at all but DID feature issues which were intrinsically based on the concept of a ‘round’ Earth; the video may be about space travel, showing images of the (spherical) world from above, or might address any number of other matters where the Earth’s shape is implicit but rarely mentioned. This is because the spheroid nature of the planet is considered scientific fact and is thus inextricable from many different subjects. In this study, only videos whose content specifically addressed the issue of the shape of the Earth were included. This could cause some reverse bias: because the ‘round’ Earth is so normal to most people, it is perhaps not considered worth a special discussion, and thus the majority of videos ‘expressing an opinion’ might predominantly offer non-conventional views.

As discussed in the section regarding the method of data collection above, the message conveyed by some YouTube videos is vague or non-committal, whilst comments can be written in ambiguous or indecipherable language. This creates a need for interpretation, which has been addressed according to the methods outlined. Whilst every attempt has been made to code accurately and omit any ambivalent material, it is possible that the intention of the video-creator or a comment-author may be misinterpreted when the information presented is less-than-clear.

512 Frangos et al (2018), p.259
513 Ibid, p.259

Despite there being a number of relevant comparisons between conspiracy theories and religion, to avoid unnecessary complication this study has been conducted following the premise that conspiracy theories and religious beliefs are two separate areas of enquiry. In practical terms, this means that where a YouTube recommendation highlights a purely theological matter (e.g. “The Reality of Noah’s Ark”514) it is not considered a ‘conspiracy theory’ for the purposes of the study. However, any video which addresses a relevant conspiracy topic, but where religion is an additional or incidental consideration (e.g. “Is The Earth Flat, Islamic Perspective”515), is counted.

Another limitation of this research is that it relies on a shallow sample of information taken from an otherwise unfathomably “large repository of textual material.”516 Whilst data has been amassed from over a hundred relevant videos, this still constitutes a mere ‘drop in the ocean’ of content available on YouTube. Also, despite a considerable effort being made to investigate how the YouTube algorithm directs an audience (by listing side-bar recommendations, subsequently studying the relevant videos arising from them, before recording the next ‘stage’ of side-bar recommendations), this process only delves three ‘levels’ deep: whilst pertinent results were found, this procedure does not allow us to travel too far into the “rabbit hole of extremism.”517

Finally, there is the issue of synchronicity. The data was collected as quickly as possible during a single week. However, with YouTube professing that they “update our recommendations system all the time,”518 even that short time period may have seen significant changes to the hidden functionality of the platform. From an analytical perspective, the results of the study must be understood as a ‘snapshot,’ reflecting a momentary reality, or what Jäger & Maier might describe as “a synchronic cut through a discourse strand.”519 However, these critics also insist that analysts must “keep an eye on its history,”520 recognising that a study cannot be isolated from diachronic considerations (what happened before and after). This invites a Foucauldian archaeological understanding of the historical foundations of the discourse. Burgess & Green describe YouTube as “a particularly unstable object of study, marked by dynamic change”521 and it is this rapid development, driven by a combination of company platform-development, the uploads of its user-producers and the behaviour of its over two billion interacting viewers,522 that makes it more suited to a discourse analysis than to an attempt to uncover intransigent certainties from a time-specific dataset.

514 World Video Bible School (WVBS) (2016)
515 ILovUAllah™ (2018)
516 Madden et al (2012), p.712
517 Tufekci (2018)
518 YouTube (2019a)
519 Jäger & Maier (2009), p.46
520 Ibid
521 Burgess & Green (2012), p.12
522 Popken (2018)

4.4 Ethics
The ethical considerations of this thesis have been primarily shaped by three main influences: YouTube’s “Terms of Service,”523 the General Data Protection Regulation (GDPR) guidance issued by Malmö University,524 and Paul Reilly’s exceptionally insightful study “The ‘Battle of Stokes Croft’ on YouTube: The Development of an Ethical Stance for the Study of Online Comments.”525 The prime clause applicable in YouTube’s “Terms of Service” at the time of data collection is the following general restriction: “5.1.I. you agree not to collect or harvest any personal data of any user of the Website or any Service (and agree that this shall be deemed to include YouTube account names).”526 Consequently, no record of account names or personal details was made as part of the research. Channel names were recorded but, as a single account can manage multiple channels and it is possible to use a different name on YouTube to the Google account one signs in with,527 this was considered reasonable. Also, as channel names appear instantly to the public when videos are found by inputting the title into a YouTube search, it seemed unnecessary to omit this information from the data collected.

YouTube’s stipulation about personal data echoes Malmö University guidance specifying a restriction on documenting “user names”528 or “any combination of data that make it possible to link the information to a living person.”529 As regards comments, which are posted publicly with a hyperlink to the author’s own channel, these were coded according to bias but no words were transcribed nor were author details retained. This mirrors Reilly’s standards: he does not quote directly from user comments, as quotations could be easily traced “using search engines”530 and may “[cause] reputational harm to these participants.”531 Furthermore, whilst channel names from publicly posted videos were input to the research database, these were used as markers and no correlation between them has been analysed for the purposes of the study. Reilly advocates the paraphrasing of comments to protect authors from “potential harm,”532 but this practice was not considered necessary in this analysis as individual comments were not required to elucidate the codified results.

Reilly believes that “researchers need to assess each specific research context and tailor their ethical stance accordingly”533 and this study has been conducted following a procedure that keeps specific, traceable information to a minimum whilst still showing the key bases of the data collected. As no personal data has been collected or presented, there was no requirement to seek informed consent as per the GDPR guidance,534 nor did the majority of related stipulations within the document apply.

523 YouTube (2019b)
524 Malmö University (2019)
525 Reilly (2014)
526 YouTube (2019b)
527 YouTube Help (2019d)
528 Malmö University (2019)
529 Ibid
530 Reilly (2014), p.7
531 Ibid, p.6
532 Ibid, p.7
533 Reilly (2014), p.9
534 Malmö University (2019)

5 Results and Analysis
The basic method involved making a search for each of the five chosen conspiracy theory subjects before opening the top ten recommendations as presented by YouTube. This gave an initial sample of 50 videos. These were then assessed according to the specified criteria, and any of the top five side-bar recommendations which also related to conspiracy theories were opened in separate browser tabs and assessed. Having omitted any duplicate results, the overall sample size following this process was 103 videos, with 53 side-bar recommendation videos being appended to the initial 50 search recommendations.

The top five side-bar recommendations for every video evaluated were coded and recorded, providing a dataset of 515 side-bar recommendations. For the majority of videos the ‘comments section’ was enabled, allowing the top ten comments to be coded and recorded. However, as this feature was disabled in 13 of the videos and many which were enabled had not gained ten discrete comments, the overall sample of comments numbered 844 (from a possible maximum of 1,030: the 13 disabled comment sections account for 130 of the 186-comment shortfall, with the remaining 56 comments absent from videos carrying fewer than ten).

5.1 Recommendations

5.1.1 Initial top ten search results
The first search results indicate that discourses of conspiracy are clearly being maintained by YouTube, despite being considered “not the most plausible account of an event or situation.”535 Of the 50 videos recommended across five individual searches, 16 have content promoting or supporting conspiracy theories (Pro-CT), 22 contain a message contradicting or countering conspiracy theories (Anti-CT), whilst 12 are considered neutral (Neu). The neutral appellation in this case is applied to any content which does not take a clear stance on the conspiracy theory and was coded as follows: ambiguous (6 videos), neutral (3), unrelated (3) or questioning (0).

Initial Recommendations (50 videos)
Pro-CT – 16 videos (32%)
Anti-CT – 22 videos (44%)
Neutral – 12 videos (24%)
Table 1.1: Initial search recommendations

These opening recommendations also indicate interesting differences between the five conspiracy theories chosen for the study. Whereas videos recommended for climate change were almost exclusively Anti-CT (challenging the conspiracy theory rejecting anthropogenic global warming), the HAARP facility attracted predominately Pro-CT recommendations, with fluoridation seeing an equal split between the two. These results indicate clear differences between how the five subjects are initially represented by YouTube and could indicate various factors at play. On a basic level, the suggested videos might simply mirror public understanding, with conspiracy theories being more prevalent where there is less public consensus on the ‘truth’ of a given matter. They possibly reflect intervention by the platform on some subjects “to improve the quality of recommendations,”536 whilst others may be considered too niche to deserve such specialist treatment (although YouTube specifically claims to target “content that could misinform users in harmful ways... [like] claiming the earth is flat”537 when three videos supporting a ‘flat earth’ conspiracy appear in the top ten results). The weighting for or against conspiracy theories may indicate the balance of power as discerned by YouTube, with the parity of ‘fluoridation’ results potentially denoting a fertile battleground of opinion. Regardless, conspiracy theories appear regularly in prime-spot search results, suggesting that conspiracy discourse ‘punches above its weight’ when it comes to visibility on YouTube.

Initial Recommendations by Subject
Climate Change – 0 Pro-CT, 8 Anti-CT, 2 Neutral
Denver Airport – 1 Pro-CT, 2 Anti-CT, 7 Neutral
Flat Earth – 3 Pro-CT, 6 Anti-CT, 1 Neutral
Fluoridation – 5 Pro-CT, 5 Anti-CT, 0 Neutral
HAARP – 7 Pro-CT, 1 Anti-CT, 2 Neutral
Table 1.2: Initial search recommendations by specific subject

535 Brotherton (2013), p.9

5.1.2 Further recommendations: following the side-bar
After assessing the initial results above, each video’s side-bar recommendations were investigated to see which were related to conspiracy theories. Duplicates aside, this process produced 53 additional videos for evaluation: 35 were connected to the five conspiracy theories already under investigation, whilst 18 related to other relevant theories. When compared to the ‘top ten’ search recommendations above, those appearing in the side-bar illustrated a clear shift towards conspiracy theory narratives. The proportion of Pro-CT videos rose for every subject (except HAARP, which started with the highest share of Pro-CT content and gained only one neutral video from the side-bars), suggesting a concerted movement towards conspiracy theory thinking. The most significant change was in the ‘fluoridation’ subject, which saw the addition of five new videos, all Pro-CT. Here, notwithstanding the bias of the initial video, the algorithm is presenting side-bar recommendations which favour conspiracy discourses.

Recommended video content, all videos
Climate Change – initial 10 videos: 0 Pro-CT (0%), 8 Anti-CT (80%), 2 Neutral (20%); all 26 videos (16 side-bar): 3 Pro-CT (11.5%), 20 Anti-CT (76.9%), 3 Neutral (11.5%)
Denver Airport – initial 10 videos: 1 Pro-CT (10%), 2 Anti-CT (20%), 7 Neutral (70%); all 11 videos (1 side-bar): 2 Pro-CT (18.2%), 2 Anti-CT (18.2%), 7 Neutral (63.6%)
Flat Earth – initial 10 videos: 3 Pro-CT (30%), 6 Anti-CT (60%), 1 Neutral (10%); all 22 videos (12 side-bar): 7 Pro-CT (31.8%), 13 Anti-CT (59.1%), 2 Neutral (9.1%)
Fluoridation – initial 10 videos: 5 Pro-CT (50%), 5 Anti-CT (50%), 0 Neutral (0%); all 15 videos (5 side-bar): 10 Pro-CT (66.7%), 5 Anti-CT (33.3%), 0 Neutral (0%)
HAARP – initial 10 videos: 7 Pro-CT (70%), 1 Anti-CT (10%), 2 Neutral (20%); all 11 videos (1 side-bar): 7 Pro-CT (63.6%), 1 Anti-CT (9.1%), 3 Neutral (27.3%)
Other CT subjects – initial 0 videos; all 18 videos (all side-bar): 10 Pro-CT (55.6%), 4 Anti-CT (22.2%), 4 Neutral (22.2%)
TOTALS – initial 50 videos: 16 Pro-CT (32%), 22 Anti-CT (44%), 12 Neutral (24%); all 103 videos (53 side-bar): 39 Pro-CT (37.9%), 45 Anti-CT (43.7%), 19 Neutral (18.4%)
Table 1.3: Recommended video content for all analysed videos

The inclusion of the side-bar recommendations indicates a swing away from neutral videos towards Pro-CT ones (discourses of ‘neutrality’ are being replaced by those of ‘opinion’). This is a significant shift towards conspiracy thinking, especially as the initial searches contained a majority share of Anti-CT videos (44%), which one might assume would produce more Anti-CT recommendations rather than those favouring the converse position.

536 YouTube (2019a)
537 Ibid

5.1.3 Side-bar recommendations
Whereas the above results refer to the total sample of videos evaluated, 515 side-bar recommendations were actually assessed for bias (the top five side-bar recommendations for each of the 103 evaluated videos). This figure exceeds the number of fully evaluated videos due to duplicates (where recommendations are made for videos already assessed), videos which are completely unrelated to conspiracy theories, and the video recommendations following the ‘second round’ of evaluations, which were not subject to a full assessment.

The total number of recommendations showed a near-equal split between videos featuring Pro-CT content and those which were Anti-CT. The vast majority of recommendations were considered neutral, largely due to the appearance of suggestions entirely unrelated to conspiracy theories (340 instances).

Side-Bar Recommendations (515)
Pro-CT – 74 videos (14.4%)
Anti-CT – 78 videos (15.1%)
Neutral – 363 videos (70.5%)
Table 1.4: Side-bar recommendations by bias

To understand the balance between side-bar recommendations which expressed a definite opinion on conspiracy theories, the neutral results were omitted to isolate the relevant results. Pro-CT videos were offered to the viewer almost as frequently as Anti-CT videos. Furthermore, when comparing the side-bar recommendations to the original search recommendations (removing all neutral videos in both cases), one can perceive a shift towards Pro-CT material.

Recommendations with neutral content omitted
Pro-CT – Initial Results: 16 of 38 videos (42.1%); Side-Bar Recommendations: 74 of 152 videos (48.7%); Swing: +6.6%
Anti-CT – Initial Results: 22 of 38 videos (57.9%); Side-Bar Recommendations: 78 of 152 videos (51.3%); Swing: -6.6%
Table 1.5: All video recommendations with neutral content omitted

There are also interesting results if one categorises the full sample of 103 evaluated videos by content and then looks at their respective side-bar recommendations. Whilst neutral videos predominate, Pro-CT recommendations appear more likely to follow from Pro-CT videos (26.7%) than Anti-CT recommendations from Anti-CT videos (21.8%). Also, the recommendations algorithm suggests videos with conspiracy theory content (either Pro-CT or Anti-CT) more often for Pro-CT videos than for either Anti-CT or neutral videos. Where a video contains neutral content, the split between Pro-CT and Anti-CT recommendations is similar (10.5% and 11.6% respectively).
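The swing reported in Table 1.5 can be verified directly from the counts above; a short check, assuming nothing beyond the figures already reported:

    # Recompute Table 1.5: shares with neutral videos omitted, and the swing.
    initial_pro, initial_anti = 16, 22        # 38 initial-search videos
    sidebar_pro, sidebar_anti = 74, 78        # 152 side-bar recommendations

    initial_share = initial_pro / (initial_pro + initial_anti)   # 0.421
    sidebar_share = sidebar_pro / (sidebar_pro + sidebar_anti)   # 0.487

    print(f"Pro-CT swing: {(sidebar_share - initial_share) * 100:+.1f}%")  # +6.6%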

Breakdown of side-bar recommendations according to initial video content
Pro-CT video (39 videos, 195 recommendations) – 52 Pro-CT (26.7%), 18 Anti-CT (9.2%), 125 Neutral (64.1%)
Anti-CT video (45 videos, 225 recommendations) – 12 Pro-CT (5.3%), 49 Anti-CT (21.8%), 164 Neutral (72.9%)
Neutral video (19 videos, 95 recommendations) – 10 Pro-CT (10.5%), 11 Anti-CT (11.6%), 74 Neutral (77.9%)
Table 1.6: Side-bar recommendations with reference to the content bias of the initial video

Overall, the side-bar recommendations data seems to support the hypothesis that YouTube favours the distribution of conspiracy theories. Although recommendations for Anti-CT videos outnumber those for Pro-CT, the difference is relatively slight, creating a kind of “false balance”538 whereby Pro-CT content is framed as practically equal in value and substance. A pertinent example is climate change: the BBC’s director of news and current affairs has ruled against providing “outright deniers”539 of anthropogenic climate change with a platform, as it is an undeniable reality.540 On YouTube, however, of the 22 assessed videos related to climate change, 3 of those recommended in the side-bar (on 4 separate occasions) take a Pro-CT stance, thus providing a visible, disruptive alternative to the “nearly unanimous”541 scientific consensus.

Foucauldian discourse analysis considers knowledge to be conditional, indivisible from the relationships of power that create it.542 YouTube can be regarded as a “power/knowledge complex”543 wherein a variety of influences make often competing representations which effectively shape what people see and what they believe. Through appealing to popular interests and exploiting psychological predispositions, video producers on the platform can increase their viewership and thus negotiate a greater share of influence over what constitutes truth or reality. The attainment of views, or audience engagement through the acquisition of ‘likes’ or ‘comments,’ can influence the recommendations algorithm of the website which, in turn, can stimulate more views, more engagement, and a higher ‘market share’ for the ideology presented in the video. The side-bar recommendations in YouTube appear to treat conspiracy theories as an ‘equal partner’ in various discourses, with the percentages of Pro-CT videos suggesting that they constitute mainstream thought, rather than being niche, unproven or alternative ideas. YouTube is fulfilling a dispositive function, allowing the discourse of climate change denial to survive (and potentially thrive), rather than lose power and, thus, meaning. In some ways, this effect is self-fulfilling, as YouTube’s algorithm enhances activity already occurring on the website. As YouTube is the predominant global host for online videos (over a billion hours of content being watched every day544), the reality created within its boundaries inevitably impacts upon discourses being negotiated elsewhere, both on- and off-line.

As well as providing disproportionate weight to conspiracy narratives, information collected from YouTube’s side-bar recommendations data also demonstrates a tendency towards promotion of anti-establishment hypotheses. This study only provides data from two strata of YouTube videos (videos selected from an initial search and, subsequently, videos selected from their side-bar recommendations) yet Pro-CT content is shown to gain momentum with further searches, whilst Anti-CT content does not develop (or multiply) at the same rate. The statistical bias towards Pro-CT development suggests an incremental extrapolation from these initial stages would only increase the proportion of Pro-CT recommendations on offer. This is perhaps the “rabbit hole of extremism”545 Tufekci warns about.

538 Hickman (2018)
539 Ibid
540 Ibid
541 Uscinski & Olivella (2017), p.1
542 Jäger & Maier (2009), p.34
543 Ibid, p.35
544 Nicas (2017)
545 Tufekci (2018)

In actively advertising progressively contentious material, YouTube is arguably exploiting the human weakness for “highly surprising signals.”546 In doing so, it adds credence to the discourse of surprise or, more particularly, to the notion that surprise generated by unconventional or “outlying”547 information has significant value. It is furthermore possible that YouTube, in attempting to maximise views and hours watched, has created an environment that encourages users to produce ever-more surprising and therefore psychologically-gratifying content, so their videos may gain favour with the algorithm and thus increase both view-count and visibility. This may be symptomatic of a platform that “has long prioritized growth,”548 causing other discourses such as public safety, scientific truth or discursive civility549 to ‘lose ground’ in YouTube’s marketplace of ideas (and ideals).

5.2 Views

The entire sample of 103 videos had a total view count of 110,334,387, giving an average of 1,071,207 each. As all the videos evaluated were discovered through ‘initial search’ or side-bar recommendations, this intimates high viewership numbers for YouTube-favoured content. The importance of position bias in this hierarchy of recommendation is further exemplified by the combined view-counts for the 50 top ten ‘initial search’ recommendations, which give the higher average of 1,212,565, whilst tallying just the top three ‘initial search’ results amounts to 2,105,480 views each.

Views: position bias

                           # of videos    Average views
Total sample               103            1,071,207
Initial Search Top Ten     50             1,212,565
Initial Search Top Three   15             2,105,480

Table 2.1: Number of views per video according to position on page

When the view-counts are organised according to video content, the results are significant. Whilst Pro-CT videos average fewer than 400k views, the figure for Anti-CT videos is more than double that. The most remarkable finding, however, is that videos with neutral content (ambivalent, neutral, questioning or unrelated) average almost 3 million views each. Even if we remove ‘unrelated’ videos from the neutral count, the average remains similar (2,895,348).

Views: breakdown by content

                    # of videos    Average views
Pro-CT              39             390,421
Anti-CT             45             883,729
Neutral             19             2,912,640
Neu (- unrelated)   16             2,895,348

Table 2.2: Number of views according to video content

The disparity between the different view counts respective to content is the first surprise, as one might presume that views were a prime influence on the website’s recommendations algorithm. If this seems anomalous, then the fact that Pro-CT videos have significantly fewer views than all the others is perhaps even more peculiar: if Anti-CT content draws a greater audience, why are Pro-CT videos so prevalent? Finally, it appears that videos which do not take an unequivocal stance on conspiracy theories are, by far, the most popular. Whether it is through providing a neutral perspective on an issue, being ambiguous in the message presented or simply refusing to ‘take a side,’ those videos which leave open questions are the most watched. This is an interesting correlation which suffuses the data collected throughout the study and, as such, it is investigated further below.
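The averages in Tables 2.1 and 2.2 reduce to simple arithmetic over the coded sample. The following is a minimal sketch of that calculation, assuming the coding sheets have been transcribed into simple records; the field names and example view-counts are illustrative, not the study’s actual data format.

```python
# Minimal sketch of the per-category averages behind Tables 2.1 and 2.2.
# The record layout and example view-counts are assumed for illustration;
# the full study sample contained 103 coded videos.
videos = [
    {"bias": "Pro-CT", "views": 120_000},
    {"bias": "Anti-CT", "views": 950_000},
    {"bias": "Neutral", "views": 2_400_000},
    # ... one record per coded video ...
]

def average_views(records, bias=None):
    """Mean view-count, optionally restricted to one bias category."""
    counts = [r["views"] for r in records if bias is None or r["bias"] == bias]
    return sum(counts) / len(counts) if counts else 0.0

print(average_views(videos))            # whole-sample average
print(average_views(videos, "Pro-CT"))  # category average, as in Table 2.2
```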

546 Varshney (2019), p.82
547 Ibid
548 Shaw & Bergen (2018)
549 Ksiazek et al (2015), p.851

5.3 Likes and dislikes

The data regarding likes, resulting from viewers clicking on a thumbs-up ‘like’ button, reflects the results for video views noted above. As expected, Pro-CT videos average fewer likes than Anti-CT videos, whilst both category averages are dwarfed by the number of likes gained by neutral videos. This pattern is repeated with dislikes, again mirroring viewership (higher view numbers naturally beget higher like/dislike numbers).

Breakdown of side-bar recommendations according to initial video content

               # of videos    Av. Likes    Like %    Av. Dislikes    Dislike %
Total sample   103            10,493       87.9%     1,450           12.1%
Pro-CT         39             4,802        92.6%     386             7.4%
Anti-CT        45             6,026        82.3%     1,292           17.7%
Neutral        19             32,764       89.1%     4,011           10.9%

Table 3.1: Breakdown of side-bar recommendation views, ‘likes’ and ‘dislikes’ according to initial video content

A deeper investigation into the statistics provides insight into audience engagement. Through comparing the quantity of likes with dislikes, it becomes clear that Pro-CT content is more favourably received (92.6% positive, 7.4% negative) than Anti-CT content (82.3% positive, 17.7% negative). This could indicate that proponents of conspiracy thinking are more likely to actively engage with videos, or that the public considers conspiracy theories more palatable than videos countering, or rejecting, those theories. According to these results, Pro-CT videos receive more positive or supportive feedback than Anti-CT videos.

5.3.1 Likes/dislikes compared to views

People engage more with Pro-CT videos than with Anti-CT or neutral videos. Anti-CT videos gain fewer likes per view than the other two categories, whilst also receiving more dislikes per view. This suggests that Anti-CT content attracts more negative feedback overall. Although Pro-CT videos might average fewer views, the watching audience is more likely to express an opinion (like/dislike) and that opinion is generally more favourable than for neutral or Anti-CT videos.

Quantity of likes and dislikes when compared to view-count

          # of videos    Total views    Likes              Dislikes          Total Likes & Dislikes
Pro-CT    39             15,226,410     187,267 (1.23%)    15,051 (0.1%)     202,318 (1.33%)
Anti-CT   45             39,767,820     271,186 (0.68%)    58,140 (0.15%)    329,326 (0.83%)
Neutral   19             55,340,157     622,517 (1.12%)    76,208 (0.14%)    698,725 (1.26%)

Table 3.2: Video ‘likes’ and ‘dislikes’ compared to view count

When viewing a video on YouTube, the counts for views, likes and dislikes are visible directly below the title, assuming a significant (and rather unmissable) position on the webpage. This gives the audience some indication of the video’s popularity and may influence a person’s decision whether to watch the video, or even how to feel about its content. However, although the statistics show a distinct difference (0.55%) between how regularly individuals attribute likes to Pro-CT and Anti-CT videos, in practice this will not be immediately discernible by the viewer: with average view counts numbering above one million, the proportions of likes and dislikes will appear similar to ‘the naked eye.’ This comparison of views to likes/dislikes therefore gives better information about audience engagement with the types of content than about the influence these prominently-positioned figures might have on viewer opinion.
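The two ratios used above can be stated precisely: Table 3.1 reports likes as a share of all reactions, while Table 3.2 reports reactions as a share of views. A minimal sketch follows, demonstrated with the Pro-CT totals from Table 3.2; the function names are illustrative, not the study’s.

```python
# The two engagement ratios behind Tables 3.1 and 3.2, demonstrated with the
# Pro-CT totals from Table 3.2. Function names are illustrative only.
def like_share(likes: int, dislikes: int) -> float:
    """Proportion of reactions that are positive (the 'Like %' of Table 3.1)."""
    return likes / (likes + dislikes)

def reactions_per_view(likes: int, dislikes: int, views: int) -> float:
    """Proportion of views that produce any reaction (Table 3.2)."""
    return (likes + dislikes) / views

print(f"{like_share(187_267, 15_051):.1%}")                      # 92.6%
print(f"{reactions_per_view(187_267, 15_051, 15_226_410):.2%}")  # 1.33%
```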

The results suggest that supporters of Pro-CT content (or detractors of Anti-CT content) are more vociferous than their ‘opponents.’ Strangelove describes the platform as “a fierce battleground over public opinion”550 and it is one which seems more highly-prized by conspiracy theorists. Uscinski suggests that, despite “the establishment”551 having more tools at their disposal, “conspiracy theories occasionally win the battle for public opinion”552 and it is perhaps because authorities have majority-control over traditional media, scientific journals and education that they are slow to identify new ‘battlegrounds’ which may influence the general public. A number of scientists and critics have called for better communication, promotion of scientific fact and increased visibility to prevent disruptive elements from ‘hijacking’ their subjects of expertise.553 This drive towards improvement might conceivably involve concerted efforts to occupy online spaces like YouTube.

Conspiracy theories are based on conflict, taking an oppositional stance in order to challenge existing narratives. As their raison d’être is conflict, they are well-suited to the kind of local skirmishes that play out on platforms like YouTube. Pro-CT adherents are perhaps better-equipped in these online discursive environments as they are primed to “fight it out for supremacy,”554 using every apparatus available (likes, comments, views, etc) to propel their argument. By offering both ‘like’ and ‘dislike’ buttons, YouTube provides a simple arena for the conflict discourse, and one which is always likely to be embraced (or dominated) by those championing conspiracy discourses.

5.4 Comments

In terms of basic engagement, there were 404,459 comments across the 103 evaluated videos. Thirteen videos had their comments sections disabled by the uploading user, reducing the sample to 90 and producing an average of 4,494 comments per video. The mean numbers by content reflect the view counts, with Pro-CT videos collecting the fewest comments whilst neutral videos accumulate the most. A comparison of the quantity of comments with view-counts shows that audience engagement remains fairly consistent regardless of content, with neutral videos showing a marginal lead over Pro-CT then Anti-CT videos.

Comments on videos with enabled comments sections

                    Pro-CT       Anti-CT      Neutral
Videos              33           39           18
Comments            55,545       132,482      216,432
Average Comments    1,683        3,397        12,024
Views               14,288,686   34,876,212   52,724,458
Comments per view   0.00389      0.00380      0.00410

Table 4.1: Number of comments on videos with enabled comments sections

This means that, as well as being the most watched, neutral videos are the most commented upon and actually receive more comments per view than videos whose content demonstrates an explicit stance on conspiracy thinking. Neutrality, ambivalence and ambiguity thus

550 Strangelove (2010), p.155
551 Uscinski (2018), p.17
552 Ibid
553 Weigmann (2018); Hansson (2017); Fasce & Picó (2019); Cullen (2018); Lewandowsky (2018); Glick & Booth (2014)
554 Uscinski (2018c), p.109

appear to be valuable traits for YouTube videos, drawing superior viewing figures and elevated levels of engagement.

5.4.1 Comment bias

For each video, in addition to recording the overall number of comments, the ‘top ten’ were assessed and coded according to bias. Excluding videos where the comments section had been disabled, and ignoring comments coded as ‘O’ (in cases where there were fewer than ten comments available), a total sample of 844 comments was recorded. The majority of these comments (48%) fell into a neutral category consisting of neutral, ambivalent, questioning or unrelated statements. This will have been influenced by the aforementioned issues in discerning bias from expressions which are often succinct, sometimes poorly constructed and perhaps intentionally ambiguous in meaning.

Comment Bias

             Pro-CT   Anti-CT   Neutral
Comments     314      129       401
Percentage   37%      15%       48%

Table 4.2: Comment bias for entire sample

Those comments which articulated a clear attitude towards conspiracy theories were weighted heavily towards Pro-CT sentiment, with 37% of comments insinuating support for the conspiracy theory relevant to the video (iCT), showing outright support (CT), elaborating upon the theory (eCT), or expanding to endorse other conspiracy theories (eoCT). In contrast, comments which reject a conspiracy theory, through intimation (ixCT), substantive refutation (xCT), providing additional reasons (xaCT) or rejecting related theories (xoCT), amounted to only 15%. This is a significant indication that most people commenting on videos are supporting a conspiracy narrative.
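Aggregating the hand-coded comments into the three headline categories of Table 4.2 is mechanical once the codes are assigned. A minimal sketch follows: the Pro-CT and Anti-CT codes are those defined above, while the neutral sub-codes and the example input are hypothetical.

```python
from collections import Counter

# Pro-CT and Anti-CT codes as defined in the coding scheme above; "O" marks
# a missing slot where a video had fewer than ten comments. Any other code
# (the neutral/ambivalent/questioning/unrelated sub-codes, whose labels are
# hypothetical here) falls into the Neutral bucket.
PRO_CT = {"iCT", "CT", "eCT", "eoCT"}
ANTI_CT = {"ixCT", "xCT", "xaCT", "xoCT"}

def summarise_comment_bias(coded_comments):
    """Tally coded 'top ten' comments into Pro-CT / Anti-CT / Neutral."""
    tally = Counter()
    for code in coded_comments:
        if code == "O":
            continue
        bucket = ("Pro-CT" if code in PRO_CT
                  else "Anti-CT" if code in ANTI_CT
                  else "Neutral")
        tally[bucket] += 1
    total = sum(tally.values())
    return {k: (v, round(100 * v / total)) for k, v in tally.items()}

print(summarise_comment_bias(["CT", "iCT", "xCT", "N", "O"]))
# {'Pro-CT': (2, 50), 'Anti-CT': (1, 25), 'Neutral': (1, 25)}
```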

When comment bias is considered according to video content, the results are compelling. Pro-CT videos induce Pro-CT comments (69.1%) with practically zero opposition (3%). Anti-CT videos are supported by Anti-CT comments but with much less vehemence (24.8%) whilst Pro-CT comments are considerably more visible (15.3%). For videos with neutral content, the statistics are understandably more balanced although Pro-CT beliefs have a slight advantage.

Comment bias according to video content

                            Comment bias
Video bias     Pro-CT         Anti-CT        Neutral
Pro-CT         228 (69.1%)    10 (3%)        92 (27.9%)
Anti-CT        55 (15.3%)     89 (24.8%)     215 (59.9%)
Neutral        31 (28.6%)     30 (27.7%)     94 (60.6%)

Table 4.3: Comment bias compared with video content bias

In this study the comments sections are dominated by statements which support conspiracy theories. Where a video conveys a Pro-CT message, it is immediately reinforced by robust Pro-CT comments on the webpage. Videos offering Anti-CT content are challenged by the persistent appearance of Pro-CT opinions in the ‘top ten’ comments. In an online world where people regularly rely on peer reviews to inform their actions, this predominance of prominent Pro-CT thought can be considered a powerful validating force for conspiracy theories and their supporters.

When the comments section is arranged according to ‘top comments,’ a comment must engage with other users to rise up the rankings. This can be achieved by replying to the comment, or clicking to ‘like’ or ‘dislike’ it. Although Pro-CT thought appeared more prominent in the liking/disliking of the videos themselves, it was not necessarily obvious to an audience (who would have to statistically

analyse view-counts for a real understanding of relative approval). With the comments section, however, individuals supporting Pro-CT beliefs can actively further their cause by liking relevant comments; these comments are then ‘pushed up’ the order, making them more visible to people visiting the webpage. This, in turn, normalises and encourages Pro-CT hypotheses. The willingness of ‘grass roots’ actors to like, or otherwise engage with, Pro-CT thought can foreground its position on the platform whilst strengthening pro-conspiracy discourse.
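The mechanism described above can be caricatured in a few lines. YouTube’s actual ‘top comments’ ordering is proprietary, so the scoring below is purely hypothetical: it simply illustrates how likes and replies, whatever their ideological source, would push a comment up the visible order.

```python
# Purely hypothetical 'top comments' ordering: YouTube's real ranking is not
# public. This toy score only illustrates the mechanism described above, where
# likes and replies push a comment up the list; the weights are invented.
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    likes: int = 0
    replies: int = 0

def toy_top_comments(comments, like_weight=1.0, reply_weight=2.0):
    """Sort comments by a simple engagement score, highest first."""
    return sorted(
        comments,
        key=lambda c: like_weight * c.likes + reply_weight * c.replies,
        reverse=True,
    )
```

On any such scheme, a bloc of users willing to like and reply to sympathetic comments can reliably occupy the visible ‘top ten,’ which is exactly the grass-roots dynamic observed in the data.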

The commitment of conspiracy theory supporters is also evident in the intensity of conviction observable in comments made on YouTube. Whereas most of the Anti-CT comments merely insinuate that a conspiracy is wrong, Pro-CT comment writers generally display much stronger beliefs and frequently elaborate upon their position or cite other conspiracy theories considered pertinent by the writer. This means that Pro-CT ideas in YouTube’s comment sections are both more visible and more forceful; a powerful combination.

Strength of comment conviction

          Total    Insinuate CT    Support CT    Elaborate CT    Elaborate other CT
Pro-CT    314      131 (41.7%)     78 (24.8%)    74 (23.6%)      31 (9.9%)
Anti-CT   129      103 (79.4%)     19 (14.7%)    5 (3.9%)        2 (1.6%)

Table 4.4: Number of Pro-CT and Anti-CT comments according to strength of conviction

Compared to the machinations of YouTube’s recommendations system, which are “concealed from public view,”555 the comments sections provide an open and largely-unregulated repository of data where one can observe the ‘truth’ of a particular narrative being actively contested. The existence of these fora, and their popularity, give credence to the discourse that public opinions (i.e. those expressed in comments) are worthwhile. Additionally, when these opinions gain the approval of other platform users (through ‘likes’ and replies) they are pushed up the rankings, making them appear even more important and relevant. This can lead to a populism of thought which may not be attached to values of truth or verifiability. Again, the YouTube platform is providing a tool (the comments section) which appears predisposed to favour discourses valued or exploited by conspiracy theorists (multiplicity of opinion/truth, validity of non-expert opinion, conflict, populism, etc).

5.5 Video titles

The results have already shown that videos taking a neutral stance on conspiracy subjects average higher viewer numbers and compel more audience engagement in terms of likes/dislikes and comments. It would appear that ambiguity ‘sells’ a video, and an examination of title bias confirms this notion. YouTube video titles can be up to 100 characters in length yet, in this study, a surprising number fail to indicate whether they support or dispute the specified conspiracy theories. Over three-quarters of the video titles were neutral, ambivalent, questioning or outwardly unrelated. Pro-CT and Anti-CT titles were therefore relatively rare, although the former did slightly outnumber the latter.

Title Bias

             Pro-CT   Anti-CT   Neutral
Titles       13       11        79
Percentage   12.6%    10.7%     76.7%

Table 5.1: Title bias in video sample

555 Lewis & McCormick (2018)

This apparent ascription of value to equivocal, but related, titles is borne out when these neutral titles are considered according to their view-counts. Neutral titles draw significantly more views than those positing a Pro-CT or Anti-CT bias. Of these neutral-type titles, those which contain a question have higher view-counts, whilst the single video with an apparently unrelated title has just a fraction of the others. The results suggest that people interested in conspiracy theories, whether they support or disagree with them, are drawn to unresolved video titles and their intrinsic promise of conflict.

Title bias and views

                # of videos    Total views    Av. views
Pro-CT          13             3,315,369      301,397
Anti-CT         11             11,530,866     886,990
Neutral         79             95,488,152     1,208,711
 - Questioning  13             28,266,927     2,174,379
 - Neutral      25             32,998,018     1,319,921
 - Ambivalent   40             34,221,387     855,535
 - Unrelated    1              1,820          1,820

Table 5.2: Number of views according to title bias

Returning to Varshney’s assertion that for modern audiences “outlying information is prioritized,”556 the predominance of neutral, ambivalent and questioning videos in terms of both viewership and engagement perhaps indicates not only that human attention is being ‘captured’ by “surprising signals”557 but that individuals are essentially seeking out outlying information by directing themselves towards narratives that are intentionally ambiguous or vague. This behaviour leads to battlegrounds of ideology where a person can choose to agree or disagree with the content presented which may, in turn, influence subsequent viewers of the material. Furthermore, as YouTube prioritises its recommendations according to factors including views, likes, comments and dislikes,558 audience interaction can catalyse further engagement which appeals to the platform’s algorithmic preferences. The potential for conflict draws an audience; viewers then express conflicting views and the website elevates the video in the rankings, inviting further conflict.

This appeal to conflict raises its discursive value. Argumentative and hostile traits have been shown to thrive online559 and the use of neutral titles or ambiguous content on YouTube invites everyone, regardless of personal bias, to the video webpage. This can lead to pitched battles where like and dislike buttons are used to indicate affiliation and comments fora are packed with message threads expressing support, raising other conspiracy-related issues or degenerating into squabbles over specific facts. When a video achieves this sort of engagement, its stock increases with YouTube’s recommendation algorithm, inviting more views and thus more conflict. This, once again, ‘plays into the hands’ of conspiracy theorists.
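This feedback loop can be caricatured as a simple simulation. The sketch below is illustrative only, not a model of YouTube’s actual system: the update rule, weights and audience figures are all invented, and serve only to show how a small difference in per-view engagement compounds into a large difference in exposure over time.

```python
# Illustrative engagement feedback loop; not YouTube's actual algorithm.
# A video's (invented) rank score grows with engagement, and a higher score
# exposes the video to more viewers in the next round.
def simulate_exposure(engagement_rate, rounds=50, base_audience=1_000):
    score, total_views = 1.0, 0
    for _ in range(rounds):
        new_views = int(base_audience * score)  # rank drives exposure
        total_views += new_views
        score *= 1 + engagement_rate            # reactions lift the rank
    return total_views

# A video provoking reactions from 5% of viewers, versus a placid 1%:
print(simulate_exposure(0.01), simulate_exposure(0.05))
```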

Although individual conspiracy theories might concern vaccinations or lizard people, the overarching discourse expounded by a majority of these theorists is anti-establishment. They are positing alternative facts to those which are widely accepted by the populace. To effectively overthrow the official narrative, conspiracy theorists must attack extant authorities and undermine the systems

556 Varshney (2019), p.82
557 Ibid, p.86
558 Briggsby (2018)
559 Murthy & Sharma (2018), p.192

which are seen to create and authenticate knowledge. Conspiracy theory discourse is therefore combative and uncompromising. For some subjects, like outright denial of anthropogenic climate change, merely having ‘a seat at the table’ is sufficient to manipulate public opinion and disrupt international responses to a prospective crisis. On a platform like YouTube, where even vexatious engagement can raise the profile of a video and where practically-unregulated message boards allow concurrent commentaries and elaborations of video content, conspiracy thought can truly thrive. Supported by the recommendations algorithm, which amplifies the behaviour of the website’s users, all discourses begin to converge around conflict. Through active participation and support for Pro-CT content on YouTube, conspiracy theorists successfully turn a narrative into a dispute; ‘bringing it down to their level and beating you with experience.’

5.6 Analysis framed by Uldam & Kaun’s four-dimensional model

The results thus far have been analysed according to discourse. To provide a ‘fixed point’ between FDA and the texts as presented by YouTube, these discursive findings will now be summarised and contextualised according to the other three dimensions of Uldam & Kaun’s framework: power relations, technological affordances and social media practices.560

Figure 2: Uldam & Kaun's four dimensions of political participation in social media561

YouTube notably acts to maintain its own predominance as a video provider whilst destabilising other power relations. The adherence to “free speech”,562 which not only allows climate change denial but perhaps unjustifiably recommends such content, keeps the discourse of denial alive whilst presenting the platform as welcoming to people who harbour such world-views. Through purported self-regulation the website invites discourses of conflict and open ‘conversations,’ which can over-emphasise the importance or relevance of conspiracy thinking. Somewhat ironically, this false depiction of balance serves to weaken official (or more justifiable) narratives, negatively affecting

560 Uldam & Kaun (2017), p.190
561 Ibid, p.191
562 YouTube (2019a)

the establishment/public power relationship whilst maintaining YouTube’s monopoly over online video hosting. Furthermore, through normalising a discourse of conflict where contradictory messages are always available, the platform may be considered to further undermine traditional media outlets which stick to one official view (or perhaps refuse to spotlight unconscionable views on certain subjects, like climate change).563 Finally, whilst power relations between users are being constantly negotiated through the prevalence of videos and comments supporting certain discourses, the platform itself becomes stronger, with each engagement adding data to its algorithm.

In terms of technological affordances, it is the algorithm (or multiplicitous combination of algorithms) that drives certain discourses, amplifies particular voices and maintains the discourses of conflict and inclusion that draw and retain an audience. Pro-conspiracy videos seem to be favoured, even where they receive fewer views, perhaps suggesting that a ‘false balance’ is more effective at maximising viewer numbers across the platform than favouring a single narrative; in this way Pro-CT, Anti-CT and neutral users can all be engaged without loss. The influence of the algorithm perhaps invites video-producers to broaden the type and number of subjects available, providing a niche for every predilection and ensuring all tastes are catered for (even where they are objectionable or harmful). The platform’s affordance of Google advertising revenue upholds the capitalist discourse expounded by YouTube as a commercial entity. Through allowing user-producers to benefit financially from their videos, the website encourages competition and advocates the acquiescence of other discourses (truth, entertainment, etc) to this overarching interest. As discussed earlier, the YouTube comments section and like/dislike functions positively promote discourses of conflict and engagement. By embedding these social media functions on each webpage, the platform supports values which favour “outlying”564 conspiracy thinking above respect for official (or traditional) sources of knowledge.

The social media practices of validating, reviewing/critiquing, commenting and expanding upon the issues being presented benefit the website by both engaging viewers and creating more data for its algorithm. These functions comprise important opportunities for conspiracy theorists: they are able to express their opinions and allegiances (which may constitute an important ingroup performance), they can act as dissenting voices (questioning authority) and they can partake in conflict (such as partisan good v. evil struggles).565 The social media aspects of the platform also draw out values common to the internet, with their semi-anonymous nature supporting discourses of irreverence, hatred566 and trolling. As conceded in the results, use of the ‘like’ and ‘dislike’ buttons is likely to fulfil a psychological gratification for the ‘clicker’ rather than influence subsequent visitors to the page. However, all these functions have the effect of publicising engagement-as-discourse, ensuring certain narratives persist in the public consciousness when they might otherwise “lose... their identity.”567

563 Hickman (2018)
564 Varshney (2019), p.82
565 Oliver & Wood (2014), p.954; Hofstadter (1964), p.82
566 Strangelove (2010), p.148
567 Jäger & Maier (2009), p.43

6 Conclusion

YouTube is a successful commercial entity. It hosts billions of hours of video content uploaded by global users whilst deriving revenue from advertising space sold on its website. Not only has it assumed a dominant market position in terms of streaming video content but it has revolutionised media, allowing individuals to produce content, make it available to a worldwide audience and profit from doing so. The platform has been driven by growth and incorporates social media-type functionality to increase viewer engagement and better understand its visitors’ conduct when ‘on-site.’ YouTube’s powerful algorithm evaluates colossal amounts of data in order to recommend content which will keep people on the website, watching videos and increasing its advertising income. Despite, or more likely because of, these factors, YouTube is a very blunt instrument and one with pronounced personality flaws that pervade the platform, adversely influence its users’ behaviour and impact upon society as a transnational whole.

YouTube’s expansion was rapid but there is a definite suggestion that, until recently, when it began explicitly addressing issues surrounding “our responsibility to users,”568 its duty to public safety was often neglected, if not ignored.569 Understandably, the company has pursued the commercial objectives of raising view-counts, increasing user engagement and maximising revenue. Although making videos can be a purely creative pursuit, the revenue-driven impetus of the platform compels user-producers to tailor their output towards attaining an improved algorithmic ranking, achieving more favourable positioning in search results or side-bar recommendations and increasing their potential for financial income and personal exposure. Through creating the ‘rules of engagement’ and implicitly valuing view-count, view-time and user engagement above content, YouTube has formed an online environment which reflects and promotes its own commercial values. A capitalist discourse of ‘making money’ is therefore favoured above all others. Unlike the BBC, with its Charter in the UK,570 or French radio, with its quotas for francophone content,571 the website has no national public service obligations and essentially self-regulates.572 Similarly, people uploading videos or posting comments are given significant freedom as regards subject matter, with frequent concerns being raised about the prominence of “racist, sexist, homophobic, and verbally violent”573 material. The platform was built to grow, with the recommendations algorithm simply learning how best to shepherd the website’s visitors. Beyond that, it is now purportedly attempting to retrospectively alter its approach, “reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines”574 and perhaps change the behaviour of users it has already negatively influenced.

Whilst YouTube may have reinforced certain values, its laissez-faire approach to regulation has allowed other ideals and informal practices to develop between its various users. It is possible that the psychological tendency to favour outlying or surprising information has, supported by the virtual echo chamber of the recommendation algorithm, come to occupy the forefront of people’s minds

568 YouTube (2019a)
569 Shaw & Bergen (2018)
570 BBC (2016)
571 Coalition Français pour la Diversité Culturelle (2008), p.44
572 Strangelove (2010), p.106
573 Ibid, p.4
574 YouTube (2019a)

(and their webpages). “Clickbaity”575 thumbnails and descriptions have proliferated, using sensationalist titles to pique an audience’s curiosity and keep people watching video after video. It is this permissive but competitive environment that has allowed conspiracy theories to flourish, becoming embedded in the very fabric of the platform. Conversely, whilst there was clearly value in gaining more views, a video’s veracity was always likely to become a secondary concern; what is the point of making a film which is informative, reliable and based on verifiable facts if no-one actually watches it? Furthermore, as the results of this study have shown, it is not even necessary to accurately describe what the video is about. Neutral, ambivalent, questioning and downright vague titles draw more views and more engagement from a global audience than those giving an indication of what to expect. Similarly, in many cases you can watch a video in full without gaining a definitive indication of its ideological bearing. If you adhere to the values of the platform, this makes sense, as a precise title or description might deter a section of the potential audience from clicking on the video link in the first place. Ambivalent content allows viewers to project their own prejudices or preferences onto the video, making it potentially acceptable to individuals who hold diametrically opposing beliefs. People could also be drawn to the ‘battlegrounds’ of unresolved discourse that the ‘blank canvas’ of neutrality offers, providing them with an opportunity to vent their opinions in active comment sections.

YouTube’s recommendations system, which is seen to lead viewers down a conspiracy theory “rabbit hole”576 by offering incrementally more outlying content, is logical when the predilections of the platform are taken into account. Regardless of the subject matter, people are unlikely to watch the same thing continually, as the material will quickly appear repetitive. If, however, the content becomes more extreme or surprising, it is more likely to keep audience attention. One would assume that this movement is naturally gradual, as suddenly radical content might be considered unpalatable (and therefore left ‘unclicked’), whereas a gentle shift provides sufficient interest for the viewer to stay on the website whilst perhaps simultaneously desensitising them and lowering their inhibitions as regards the material. It is also possible that gently ‘raising the stakes’ keeps the audience in the game for longer, increasing view-count and time spent on the platform: by building a narrative incrementally (through ever-more intense videos), the recommendations system leaves itself with ‘somewhere to go.’ Once again, we are faced with the reality that YouTube’s algorithm is an imperfect tool; it can only be effective by suggesting more specific (and therefore more radical) videos. Informed by previous user behaviour, the recommendations made will always be nostalgic, or backwards-looking, as all predictions of predilection are produced through analysing previous demonstrations of preference. The content becomes immaterial as the algorithm attempts to create an appealing pathway for the viewer.

YouTube has normalised the idea that unbelievable ‘facts’ are available to view, and can be proven, by videos on the website. Conspiracy theories have become eminently visible with ridiculous or taboo subjects appearing on the platform. These inevitably become topics which are recommended, as viewers’ searches and choices inform the algorithm.

The comments sections accessible on YouTube are discourse in action. Conspiracy theorists are in the ascendancy in these fora, as the propensity towards conflict favours the oppositional approach of

575 YouTube (2019a)
576 Uscinski (2018)

the theorists themselves. Ambiguity in both message and logic is acceptable in these discussions, with engagement through comments, replies and likes sufficient to redirect the focus of the narrative. YouTube’s unwillingness to regulate too stringently in these social media-type functions allows many disparate attitudes to arise, including those which are offensive, outlandish or just plain ludicrous. The comments sections are particularly interesting as they constitute a meeting-place for internet standards of communication, the intrinsic values of the platform, the message of the video producer and the opinions of the audience. YouTube’s commercial, growth-driven aspect has created an environment where conspiracy theories can thrive and, despite the corporation’s near-monopoly over streaming video, the conspiracy theorists themselves appear happy to accept this particular establishment actor to safeguard their interests and concerns. Whilst this is a seemingly unlikely partnership, it nonetheless functions to promote the professedly contrasting values of the two parties: retaining influence and challenging dominant narratives. Perhaps, ultimately, both YouTube and the conspiracy theorists who operate within its bounds are creating discursive battlegrounds which are predisposed to maintaining their own values.

Bibliography

Aaronovitch, David (2009)

“A Conspiracy-Theory Theory: How to fend off the people who insist they know the 'real story' behind everything” from Wall Street Journal (Eastern Edition) published 19th December 2009, accessed via https://www.wsj.com/articles/SB10001424052748704238104574602042125998498 on 23rd April 2019

Alexa (2019)

Youtube.com traffic statistics taken from Alexa.com website accessed via https://www.alexa.com/siteinfo/youtube.com on 20th April 2019

Antony, Mary Grace & Ryan J. Thomas (2010)

“‘This is citizen journalism at its finest’: YouTube and the public sphere in the Oscar Grant shooting incident” from New Media & Society 12(8): 1280–1296

Anti-Defamation League (2009)

“Rense Web Site Promotes Anti-Semitic Views” from ADL.org accessed via https://www.adl.org/news/article/rense-web-site-promotes-anti-semitic-view on 14th July 2019

ApexTV (2018)

5 Mermaids Caught on Tape & Spotted in Real Life published 20th July 2017, accessed via https://www.youtube.com/watch?v=KWeDLac6rEo on 4th May 2019

Apter, Emily (2006)

“On Oneworldedness: Or Paranoia as a World System” from American Literary History 18(2): 365-389

BBC (2016)

Broadcasting: Copy of Royal Charter for the continuance of the British Broadcasting Corporation (London, Crown, 2016) accessed via https://www.bbc.com/aboutthebbc/governance/charter on 16th July 2019

Begich, Nick (2007)

Angels Don’t Play This HAARP (Anchorage, Earthpulse Press, 2007)

Bell, Chris (2019)

“BBC Trending: Conspiracy theories spread after Nipsey Hussle shooting” from BBC.com published on 2nd April 2019, accessed via https://www.bbc.com/news/blogs-trending-47785688 on 16th July 2019

Bourke, Terri & John Lidstone (2015)

“What is Plan B? Using Foucault’s archaeology to enhance policy analysis” from Discourse: Studies in the Cultural Politics of Education 36(6): 833–853

Bouvier, Gwen (2015)

“What is a discourse approach to Twitter, Facebook, YouTube and other social media: connecting with other academic fields?” from Journal of Multicultural Discourses 10(2): 149- 162

Briggsby (2018)

“YouTube SEO Ranking Factor Study” from Briggsby.com website, published 26th March 2018, accessed via https://www.briggsby.com/reverse-engineering-youtube-search on 20th July 2019

Břízová, Leontýna., Kelsey Gerbec, Jiří Šauer & Jan Šlégr (2018)

“Flat Earth theory: an exercise in critical thinking” from Physics Education 53(045014)

Brodie, Richard (2009)

Virus of the mind: the revolutionary new science of the meme and how it can help you (London, Hay House UK, 2009)

Brotherton, Robert (2013)

“Towards a definition of ‘conspiracy theory’” from Psypag Quarterly 88: 9-14

Bruns, Axel (2007)

“Produsage: Towards a Broader Framework for User-Led Content Creation” from Proceedings Creativity & Cognition 6: 1-8. Accessed via http://eprints.qut.edu.au/6623/1/6623.pdf on 28th December 2018

Bucher, Taina (2018)

“Neither Black nor Box” from If... Then: Algorithmic Power and Politics (Oxford, OUP, 2018)

Burgess, Jean & Joshua Green (2009)

YouTube: Online Video and Participatory Culture (Cambridge, Polity Press, 2009)

Buzzfeed Multiplayer (2016)

7 Creepy Conspiracy theories About The Denver Airport published on 12th April 2016, accessed via https://www.youtube.com/watch?v=NN8DO-Xv9wA on 16th April 2019

Carbon, Claus-Christian (2010)

“The Earth is flat when personally significant experiences with the sphericity of the Earth are absent” from Cognition 116: 130-135

Chaslot, Guillaume (2018)

Quoted by Paul Lewis in “'Fiction is outperforming reality': how YouTube's algorithm distorts truth: An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos. Did they harm Hillary Clinton’s bid for the presidency?” from The Guardian online published 2nd February 2018, accessed via https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth on 4th April 2019

Cheek, Julianne (2004)

“At the Margins? Discourse Analysis and Qualitative Research” from Qualitative Health Research 14(8): 1140-1150

Coalition Français pour la Diversité Culturelle (2008)

Cultural Policies in France published December 2008, accessed via http://www.coalitionfrancaise.org/wp-content/uploads/2013/11/Cultural-policies-in-France.pdf on 16th July 2019

Cook, John., Dana Nuccitelli, Sarah A. Green, Mark Richardson, Bärbel Winkler, Rob Painting, Robert Way, Peter Jacobs & Andrew Skuce (2013)

“Quantifying the consensus on anthropogenic global warming in the scientific literature” from Environmental Research Letters 8(024024)

Cullen, Jay T. (2018)

“Learning about Conspiracy theories: Experiences in Science and Risk Communication with the Public about the Fukushima Daiichi Disaster” from Joseph E. Uscinski (ed) Conspiracy theories and the People Who Believe Them (Oxford, Oxford University Press, 2018)

Dentith, M. R. X. (2018)

“Expertise and Conspiracy theories” from Social Epistemology 32(3): 196-208

Dentith, M. R. X. (2018b)

“Conspiracy theories and Philosophy Bringing the Epistemology of a Freighted Term into the Social Sciences” from Joseph E. Uscinski (ed) Conspiracy theories and the People Who Believe Them (Oxford, Oxford University Press, 2018)

Diaz-Bone, Rainer., Andrea D. Bührmann, Encarnación Gutiérrez Rodríguez, Werner Schneider, Gavin Kendall & Francisco Tirado (2007)

“The Field of Foucaultian Discourse Analysis: Structures, Developments and Perspectives” from Forum: Qualitative Social Research 8(2)

Fassin, Didier (2011)

“The Politics of Conspiracy theories: On AIDS in South Africa and a Few Other Global Plots” from Brown Journal of World Affairs 17(2): 39-50

Feltham-King, Tracey & Catriona Macleod (2016)

“How Content Analysis may Complement and Extend the Insights of Discourse Analysis: An Example of Research on Constructions of Abortion in South African Newspapers 1978–2005” from International Journal of Qualitative Methods January-December 2016: 1–9

Flaherty, Emma & Laura Roselle (2018)

“Contentious narratives and Europe: Conspiracy theories and strategic narratives surrounding RT’s Brexit news coverage” from Journal of International Affairs 71(1.5): 53-60

Foucault, Michel (1972)

The Archaeology of Knowledge and The Discourse on Language (translated by A. M. Sheridan Smith), (New York, Pantheon Books, 1972)

Foucault, Michel (1980)

“Truth and Power” from C. Gordon (ed) Power/Knowledge: Selected Interviews and Other Writings 1972-1977 by Michel Foucault (New York, Pantheon, 1980)

Foucault, Michel (1991)

Remarks on Marx: Conversations with Duccio Trombadori (translated by R. James Goldstein and James Cascaito), (New York, Semiotext(e), 1991)

Fournier, Elle (2017)

“HAARP research attracts conspiracies, misunderstandings” from University of Alaska Fairbanks News and Information website, published 24th May 2017, accessed via https://news.uaf.edu/haarp-research-attracts-conspiracies-misunderstandings/ on 18th April 2019

Frangos, Zachary., Maryke Steffens & Julie Leask (2018)

“Water fluoridation and the quality of information available online” from International Dental Journal 68: 253–261

Franks, Bradley., Adrian Bangerter & Martin W. Bauer (2013)

“Conspiracy theories as quasi-religious mentality: an integrated account from cognitive science, social representations theory, and frame theory” from Frontiers in Psychology 4(424): 1-12

Glick, Michael & H. Austin Booth (2014)

“Conspiracy ideation: a public health scourge?” from JADA (Journal of the American Dentists Association) 145(8): 798-799

Goertzel, Ted (1994)

“Belief in Conspiracy theories” from Political Psychology 15(4): 731-742

Gove, Michael (2016)

“Michael Gove Argues For The UK To Leave The EU In A Live Sky Q&A” (Faisal Islam, Interviewer) from Sky News, published 4th June 2016, accessed via https://news.sky.com/video/michael-gove-argues-for-the-uk-to-leave-the-eu-in-a-live-sky-q- a-10303640 on 16th July 2019

Hagen, Kurtis (2018)

“Conspiracy theories and the Paranoid Style: Do Conspiracy theories Posit Implausibly Vast and Evil Conspiracies?” from Social Epistemology 32(1): 24-40

Hansson, Sven Ove (2017)

“Science denial as a form of pseudoscience” from Studies in History and Philosophy of Science 63: 39-47

Harambam, Jaron & Stef Aupers (2015)

“Contesting epistemic authority: Conspiracy theories on the boundaries of science” from Public Understanding of Science 24(4): 466–480

Heller, Karen (2016)

“Meet the elite group of authors who sell 100 million books – or 350 million” from The Independent online published on 28th December 2016, accessed via https://www.independent.co.uk/arts-entertainment/books/meet-the-elite-group-of-authors-who-sell-100-million-books-or-350-million-paolo-coelho-stephen-king-a7499096.html on 12th May 2019

History TV (2019)

“Ancient Aliens” from History TV website accessed via https://www.history.co.uk/shows/ancient-aliens on 12th May 2019

Hodkinson, Paul (2017)

Media, Culture and Society (SAGE Publications, London, 2017)

Home Affairs Select Committee (2019)

Oral evidence: Hate crime and its violent consequences (HC 683) launched 27th October 2017, committee meeting dated 24th April 2019, accessed via

https://www.parliamentlive.tv/Event/Index/1d39d677-7440-4ac1-9d8b-d7812cb90ee2 on 30th April 2019, transcription at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/home-affairs-committee/hate-crime-and-its-violent-consequences/oral/100660.pdf

Horowitz, Herschel S. (2003)

“The 2001 CDC Recommendations for Using Fluoride to Prevent and Control Dental Caries in the United States” from Journal of Public Health Dentistry 63(1): 3-8

Hunt, H.E. (2008)

“The 30 greatest conspiracy theories” from The Telegraph online (19th November 2008) accessed via https://www.telegraph.co.uk/news/newstopics/howaboutthat/3483477/The-30-greatest-conspiracy-theories-part-1.html and https://www.telegraph.co.uk/news/newstopics/howaboutthat/3483652/The-30-greatest-conspiracy-theories-part-2.html

ILovUAllah™ (2018)

Is The Earth Flat? Islamic Perspective published on 26th May 2018, accessed via https://www.youtube.com/watch?v=R7uAf8-GlTE on 14th April 2019

Interesting Facts (2018)

10 REAL LIFE GIANTS You Won't Believe Actually EXIST published 2nd October 2018, accessed via https://www.youtube.com/watch?v=45dbnB_tZ4M on 5th May 2019

Jacobs, Karen (2017)

“Theorising the new geomancy: the case of HAARP” from Culture, Theory and Critique 58(4): 330-356

Jäger, Siegfried & Florentine Maier (2009)

“Theoretical and Methodological Aspects of Foucauldian Critical Discourse Analysis and Dispositive Analysis” from Ruth Wodak & Michael Meyer (eds), Methods of Critical Discourse Analysis (London, Sage Publications, 2009)

Jørgensen, Marianne & Louise Phillips (2002)

Discourse analysis: as theory and method (London, Sage Publications, 2002)

Kaiser, Jonas & Adrian Rauchfleisch (2018)

“Unite the Right? How YouTube’s Recommendation Algorithm Connects The U.S. Far-Right” from Medium.com website published 11th April 2018, accessed via https://medium.com/@MediaManipulation/unite-the-right-how-youtubes-recommendation-algorithm-connects-the-u-s-far-right-9f1387ccfabd on 20th April 2018

Karim, Jawed [jawed] (2005)

Me at the zoo published 24th April 2005, accessed via https://www.youtube.com/watch?v=jNQXAC9IVRw on 1st May 2019

Ksiazek, Thomas B., Limor Peer & Andrew Zivic (2015)

“Discussing the News: civility and hostility in user comments” from Digital Journalism 3(6): 850-870

Lantian, Anthony (2013)

“A review of different approaches to study belief in conspiracy theories” from Psypag Quarterly 88: 19-21

Lerman, Kristina (2016)

“Information Is Not a Virus, and Other Consequences of Human Cognitive Limits” from Future Internet 8(21)

Lewandowsky, Stephan (2018)

“In Whose Hands the Future?” from Joseph E. Uscinski (ed) Conspiracy theories and the People Who Believe Them (Oxford, Oxford University Press, 2018)

Lewandowsky, Stephan., Gilles E. Gignac, & Klaus Oberauer (2013)

“The Role of Conspiracist Ideation and Worldviews in Predicting Rejection of Science” from PLoS ONE 8(10)

Lewis, Paul (2018)

“'Fiction is outperforming reality': how YouTube's algorithm distorts truth: An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos. Did they harm Hillary Clinton’s bid for the presidency?” from The Guardian online published 2nd February 2018, accessed via https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth on 4th April 2019

Lewis, Paul & Erin McCormick (2018)

“How an ex-YouTube insider investigated its secret algorithm: The methodology Guillaume Chaslot used to detect videos YouTube was recommending during the election – and how the Guardian analysed the data” from The Guardian online published 2nd February 2018 accessed via https://www.theguardian.com/technology/2018/feb/02/youtube-algorithm-election-clinton-trump-guillaume-chaslot on 4th April 2019

Loxton, Daniel (2018)

“Is the Earth Flat? Flat Earthers Are Back—You Know the Earth Is Round, But How do You Best Make the Argument?” from Skeptic Magazine 23(2): 8-13

Madden, Amy., Ian Ruthven & David McMenemy (2013)

“A classification scheme for content analyses of YouTube video comments” from Journal of Documentation 69(5): 693-714

Malmö University (2019)

General Data Protection Regulation information, internal university guidelines, unpublished

Manley, Scott (2018)

Satellite Photos Show The True Shape Of The Earth published 4th June 2018, accessed via https://www.youtube.com/watch?v=tjx0KcDH7pQ on 14th April 2019

Mersereau, Dennis (2018)

“Why Do People Believe in Weather Control Conspiracy theories?” from The New York Observer (14th May 2018)

Miller, Eric D. (2018)

“Content Analysis of YouTube Comments From Differing Videos: An Overview and Key Methodological Considerations” from SAGE Research Methods Cases Part 2 (London, Sage Publications, 2018)

Mitchell, Sue (2017)

“UAF plans HAARP research campaign” from University of Alaska Fairbanks News and Information website, published 10th February 2017, accessed via https://news.uaf.edu/haarp-research-campaign-planned/ on 18th April 2019

Moretz-Rabson, Daniel (2019)

“Rush Limbaugh Claims New Zealand Mosque Shootings Were False Flag Operation, Offers No Evidence” from Newsweek.com published on 15th March 2019, accessed via https://www.newsweek.com/rush-limbaugh-claims-mosque-attacks-false-flag-1365260 on 16th July 2019

Murthy, Dhiraj & Sanjay Sharma (2018)

“Visualizing YouTube’s comment space: online hostility as a networked phenomena” from New Media & Society 21(1): 191–213

Neville-Shepard, Ryan (2018)

“Paranoid Style and Subtextual Form in Modern Conspiracy Rhetoric” from Southern Communication Journal 83(2): 119-132

NHS (2019)

“'No link between MMR and autism,' major study finds” from NHS Website published 5th March 2019, accessed via https://www.nhs.uk/news/medication/no-link-between-mmr-and-autism-major-study-finds/ on 12th May 2019

Nicas, Jack (2017)

“YouTube Notches Global Video Milestone --- Google unit's viewers watch more than 1 billion hours a day, on pace to eclipse TV” from Wall Street Journal (Eastern Edition) published 28th February 2017

OED (2019a)

“Post-truth” from Oxford English Dictionary 3rd Edition (OED.com) published July 2017, accessed via http://www.oed.com/view/Entry/58609044

OED (2019b)

“Flaming” from Oxford English Dictionary 3rd Edition (OED.com) draft editions October 2001, accessed via http://www.oed.com/view/Entry/71018

OED (2019c)

“Trolling” from Oxford English Dictionary 3rd Edition (OED.com) draft editions March 2006, accessed via http://www.oed.com/view/Entry/206615

Oliver, J. Eric & Thomas J. Wood (2014)

“Conspiracy theories and the Paranoid Style(s) of Mass Opinion” from American Journal of Political Science 58(4): 952–966

Orr, Martin & Ginna Husting (2018)

“Media Marginalization of Racial Minorities: “Conspiracy Theorists” in U.S. Ghettos and on the “Arab Street”” from Joseph E. Uscinski (ed) Conspiracy theories and the People Who Believe Them (Oxford, Oxford University Press, 2018)

Orwell, George (1944)

“As I Please” from Tribune published 4th February 1944

Pederson, Todd (2015)

“HAARP, the most powerful ionosphere heater on Earth” from Physics Today 68: 72-73

Popken, Ben (2018)

“As algorithms take over, YouTube's recommendations highlight a human problem: A supercomputer playing chess against your mind to get you to keep watching” from NBC news website published 20th April 2018 accessed via https://www.nbcnews.com/tech/social-media/algorithms-take-over-youtube-s-recommendations-highlight-human-problem-n867596 on 20th April 2019

Reilly, Paul (2014)

“The ‘Battle of Stokes Croft’ on YouTube: The Development of an Ethical Stance for the Study of Online Comments” from SAGE Research Methods Cases (London, Sage Publications, 2014)

Rozell, Ned (2016)

“HAARP ready for business under UAF management” from University of Alaska Fairbanks News and Information website, published 21st September 2016, accessed via https://news.uaf.edu/haarp-ready-business-uaf-management/ on 18th April 2019

Sam, Cecile H. (2019)

“Shaping Discourse Through Social Media: Using Foucauldian Discourse Analysis to Explore the Narratives That Influence Educational Policy” from American Behavioral Scientist 63(3): 333–350

Schmitt, Josephine B., Diana Rieger, Olivia Rutkowski, & Julian Ernst (2018)

“Counter-messages as Prevention or Promotion of Extremism?! The Potential Role of YouTube: Recommendation Algorithms” from Journal of Communication 68: 780–808

Shaw, Lucas & Mark Bergen (2018)

“YouTube’s Plan to Clean Up the Mess That Made It Rich” from Bloomberg Businessweek online published 26th April 2018, accessed via https://www.bloomberg.com/news/features/2018-04-26/youtube-may-be-a-horror-show-but-no-one-can-stop-watching on 23rd April 2019

Slavtcheva-Petkova, Vera (2016)

“Are Newspapers’ Online Discussion Boards Democratic Tools or Conspiracy theories’ Engines? A Case Study on an Eastern European “Media War”” from Journalism & Mass Communication Quarterly 93(4): 1115–1134

Smith, Oliver (2015)

“The airport that launched a thousand conspiracy theories; Denver International, 20 years old today” from The Telegraph online published 28th February 2015, accessed via https://www.telegraph.co.uk/travel/destinations/north-america/united-states/colorado/articles/The-airport-that-launched-a-thousand-conspiracy-theories/ on 20th April 2019

St Pierre, Elizabeth Adams (2014)

“A Brief and Personal History of Post Qualitative Research: Toward ‘Post Inquiry’” from Journal of Curriculum Theorizing 30(2)

Strangelove, Michael (2010)

Watching YouTube: Extraordinary Videos by Ordinary People (London, University of Toronto Press, 2010)

Streep, Abe (2008)

“The Military’s Mystery Machine: The High-frequency Active Auroral Research Program, or HAARP, has been called a missile-defense tool and a mind-control device. The truth is a bit less ominous” from Popular Science 1st July 2008: 60-63

Svrluga, Susan (2019)

“First, they lost their children. Then the conspiracy theories started. Now, the parents of Newtown are fighting back” from The Washington Post, published 8th July 2019, accessed via https://www.washingtonpost.com/local/education/first-they-lost-their-children-then-the-conspiracies-started-now-the-parents-of-newtown-are-fighting-back/2019/07/08/f167b880-9cef-11e9-9ed4-c9089972ad5a_story.html on 16th July 2019

Thresher-Andrews, Christopher (2013)

“An introduction into the world of conspiracy” from Psypag Quarterly 88: 5-8

Tufekci, Zeynep (2017)

Zeynep Tufekci: We’re building a dystopia just to make people click on ads [TED Talk from TEDGlobal>NYC], accessed via https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads on 29th August 2019

Tufekci, Zeynep (2018)

“YouTube, the Great Radicalizer” from The New York Times online published 10th March 2018 via https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html on 4th April 2019

Uldam, Julie & Anne Kaun (2017)

“Towards a framework for studying political participation in social media” from Wimmer, Wallner, Winter & Oelsner (eds) (Mis)Understanding Political Participation: Digital Practices, New Forms of Participation and the Renewal of Democracy (Abingdon, Routledge, 2017)

UN – United Nations (2019)

“Climate Change” from UN.org (United Nations website) accessed via https://www.un.org/en/sections/issues-depth/climate-change/ on 12th May 2019

Unsworth, Fran (2018)

“Internal editorial guidelines crib sheet” quoted by Leo Hickman in “Exclusive: BBC issues internal guidance on how to report climate change” from Carbon Brief website, accessed via

https://www.carbonbrief.org/exclusive-bbc-issues-internal-guidance-on-how-to-report-climate-change on 20th April 2018

Uscinski, Joseph E. (2018)

“Down the Rabbit Hole We Go!” from Joseph E. Uscinski (ed) Conspiracy theories and the People Who Believe Them (Oxford, Oxford University Press, 2018)

Uscinski, Joseph E. (2018c)

“How Do Conspiracy Theorists and Non-Conspiracy Theorists Interact?” from Joseph E. Uscinski (ed) Conspiracy theories and the People Who Believe Them (Oxford, Oxford University Press, 2018)

Uscinski, Joseph E. (2018d)

“The History of Conspiracy theory Research” from Joseph E. Uscinski (ed) Conspiracy theories and the People Who Believe Them (Oxford, Oxford University Press, 2018)

Uscinski, Joseph E. & Santiago Olivella (2017)

“The conditional effect of conspiracy thinking on attitudes toward climate change” from Research and Politics (Oct-Dec 2017): 1-9

Van der Linden, Sander (2015)

“The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance” from Personality and Individual Differences 87: 171-173

Varshney, Lav R. (2019)

“Must Surprise Trump Information?” from IEEE Technology and Society Magazine 38(1): 81-87

VICE News (2017)

The Truth Behind This Big Alaskan Conspiracy theory (HBO) published on 4th October 2017, accessed via https://www.youtube.com/watch?v=BoZf9feQATc on 15th April 2019

Vsauce (2014)

Is Earth Actually Flat? published on 4th December 2014, accessed via https://www.youtube.com/watch?v=VNqNnUJVcVs on 14th May 2019

Wagner, Adam (2017)

“Are You Maximizing The Use Of Video In Your Content Marketing Strategy?” from Forbes.com website, published on 15th May 2017, accessed via https://www.forbes.com/sites/forbesagencycouncil/2017/05/15/are-you-maximizing-the-use-of-video-in-your-content-marketing-strategy/#3aa9b39f3584 on 16th July 2019

Wakefield, Jane (2019)

“'Extremist' Google algorithms concern ex-police chief” from BBC.com, published 26th April 2019 accessed via https://www.bbc.com/news/technology-48068912 on 27th April 2019

Weart, Spencer (2011)

“Global warming: How skepticism became denial” from Bulletin of the Atomic Scientists 67(1): 41–50

Weigmann, Katrin (2018)

“The genesis of a conspiracy theory: Why do people believe in scientific conspiracy theories and how do they spread?” from EMBO reports 19(e45935)

Weinberger, Sharon (2008)

“Heating Up the Heavens” from Nature 452 (24th April 2008): 930-932

Wikipedia (2019)

“YouTube” from Wikipedia: The Free Encyclopedia, accessed via https://en.wikipedia.org/wiki/YouTube on 1st May 2019

Wikipedia (2019b)

“Figure of the Earth” from Wikipedia: The Free Encyclopedia, accessed via https://en.wikipedia.org/wiki/Figure_of_the_Earth on 13th May 2019

Wikipedia (2019c)

“List of conspiracy theories” from Wikipedia: The Free Encyclopedia, accessed via https://en.wikipedia.org/wiki/List_of_conspiracy_theories on 12th April 2019

Wolfson, Sam (2018)

“'Remodelling the lizard people's lair': Denver airport trolls conspiracy theorists” from The Guardian online published 7th September 2018, accessed via https://www.theguardian.com/us-news/2018/sep/07/denver-airport-construction-conspiracy-lizard-people on 20th April 2019

Wood, Michael (2013)

“Has the internet been good for conspiracy theorising?” from Psypag Quarterly 88: 31-34

Wooffitt, Robin (2005)

Conversation analysis and discourse analysis (SAGE Publications, London, 2005)

World Video Bible School [WVBS] (2016)

The Ark | The Reality of Noah's Ark published on 2nd February 2016, accessed via https://www.youtube.com/watch?v=O8A-tCJQdp4 on 13th July 2019

YouTube (2013)

“So long video responses... Next up: better ways to connect” from YouTube Creator Blog published 27th August 2013, accessed via https://youtube-creators.googleblog.com/2013/08/so-long-video-responsesnext-up-better.html on 14th July 2019

YouTube (2019a)

“Continuing our work to improve recommendations on YouTube” from YouTube Official Blog published 25th January 2019, accessed via https://youtube.googleblog.com/2019/01/continuing-our-work-to-improve.html on 4th April 2019

YouTube (2019b)

“Terms of Service” (effective from 22nd January 2019) from YouTube website accessed via https://www.youtube.com/static?gl=GB&template=terms on 28th April 2019

YouTube Help (2019)

“How to earn money on YouTube” from YouTube Help accessed via https://support.google.com/youtube/answer/72857?hl=en on 1st May 2019

YouTube Help (2019b)

“Verification badges on channels” from YouTube Help accessed via https://support.google.com/youtube/answer/3046484?hl=en-GB on 16th May 2019

YouTube Help (2019c)

“Community Guidelines” from YouTube Help accessed via https://support.google.com/youtube/answer/9288567 on 18th July 2019

YouTube Help (2019d)

“Use a different name on YouTube from your Google Account” from YouTube Help accessed via https://support.google.com/youtube/answer/2897336 on 18th July 2019

Zarsky, Tal (2016)

“The Trouble with Algorithmic Decisions: An Analytic Road Map to Examine Efficiency and Fairness in Automated and Opaque Decision Making” from Science, Technology, & Human Values 41(1): 118-132