RESILIENCE AGAINST THE DARK ARTS

A COMPARATIVE STUDY OF BRITISH AND SWEDISH GOVERNMENT STRATEGIES COMBATTING DISINFORMATION

John Peter Fee

Peace and Conflict Studies III
Bachelor Thesis, 12.0 hp
Spring 2021
Supervisor: Kristian Steiner
Word count: 12,606

Abstract

Western liberal democracies currently face a significant challenge from the growing proliferation of disinformation. With research suggesting that disinformation increases the risk of violence and intergroup conflict, this thesis sought to understand precisely what is being done by states to decrease the likelihood of this happening—specifically, how the United Kingdom compares with and differs from Sweden in the type of resilience strategies employed to combat disinformation. To answer this question, this thesis conducted a qualitative comparative content analysis of government communications, identifying, codifying, and describing the different types of resilience strategies combatting disinformation as practised by the United Kingdom and Sweden, to serve as a repository aid in future intervention planning. Utilising a bespoke analytical framework to make sense of resilience strategies of differing scales, a micro-macro perspective was adopted to capture (1) bottom-up strategies, which sought to enhance an individual’s ability to independently evaluate the accuracy of the information that they consume, and (2) top-down strategies, which sought to reduce societal disinformation exposure through structural interventions. This thesis demonstrates that the United Kingdom and Sweden share approximately two-thirds of their disinformation resilience strategy with one another. From 472 items sourced from British and Swedish government communications, this study uncovered 15 micro strategies and 59 macro strategies in total—which, at face value, suggests a genuine bias in favour of a macro strategic resilience approach. Whether this bias is suitable for effective societal resilience against disinformation remains inconclusive and warrants further research.

Keywords: Disinformation, Resilience, Qualitative, Comparative, Content Analysis, United Kingdom, Sweden.

List of Tables

Table 1 Logic of a Micro Strategy
Table 2 Logic of a Macro Strategy
Table 3 Content Selection Rules
Table 4 Strategy Identification Rules
Table 5 Example of a Macro Strategy
Table 6 British Media and Information Literacy sub-strategies
Table 7 Swedish Media and Information Literacy sub-strategies
Table 8 British Digital Educational Material sub-strategies
Table 9 Swedish Digital Educational Material sub-strategies
Table 10 British Independent Journalism and Media sub-strategies
Table 11 Swedish Independent Journalism and Media sub-strategies
Table 12 British Strategic Communication sub-strategies
Table 13 Swedish Strategic Communication sub-strategies
Table 14 British Situational Awareness sub-strategies
Table 15 Swedish Situational Awareness sub-strategies
Table 16 British Civil Society sub-strategies
Table 17 Swedish Civil Society sub-strategies
Table 18 British Technological sub-strategies
Table 19 Swedish Psychological Defence sub-strategies
Table 20 British Legislative sub-strategies
Table 21 Components of the Micro Strategy
Table 22 Components of the Macro Strategy

Table of Contents

1. Problematisation
  1.1. Introduction
  1.2. Purpose and Problem Formulation
2. Literature Review
  2.1. The Contested Information Environment
  2.2. Disinformation, Definitions and Epistemology
  2.3. Previous Peace Research on Disinformation
3. Analytical Framework
  3.1. Introduction to Resilience
  3.2. Reflections on Resilience
  3.3. Resilience Framework
4. Methodology
  4.1. Design
  4.2. Source Discussion
  4.3. Data Collection Process
  4.4. Analytical Discussion
  4.5. Reliability
5. Analysis
  5.1. Micro Strategies
  5.2. Macro Strategies
  5.3. Discussion
6. Conclusion
References
  Literature
  Data collection

1. Problematisation

“War today is in the process of undergoing another evolution in response to social and political conditions, namely the speed and interconnectivity associated with contemporary globalisation and the information revolution.” —Emile Simpson, War from the Ground Up

1.1. Introduction

The use of information as an instrument of power, both in times of war and peace, is as old as civilisation itself, but with over half the world’s population now connected to an instantly responsive global communications network, a considerable number of novel challenges have emerged concerning the rise of subversive influences (World Economic Forum, 2021, pp. 53–54). Governments, political campaigns, corporations and regular citizens from all over the world are increasingly utilising human resources and digital technologies to carry out large scale manipulation in a deliberate attempt to shape social life (Bradshaw et al., 2020; Woolley & Howard, 2019b). Of particular note are sophisticated information campaigns that precede and augment military operations, deliberately targeting civil society with misleading content in support of political objectives—as witnessed during the Russia–Ukraine conflict in 2014 (Mejias & Vokuev, 2017). This sort of informational weaponry is commonly referred to as disinformation—deliberately misleading information that has been fabricated to exploit a target audience to attain some vested interest (Lanoszka, 2019). What is more, research suggests that disinformation increases the risk of intergroup conflict and violence (Arayankalam & Krishnan, 2021; Ward & Beyer, 2018). To make matters more complex, state actors are increasingly outsourcing disinformation activities to a multitude of third-party actors (e.g. private strategic communications firms) to reap the advantages of plausible deniability (Brookings, 2021). This erosion of the distinction between the public and private concerning the attribution of disinformation further complicates conventional understandings of war and peace. In light of these developments, European states are now taking political action to raise awareness and strengthen societal resilience against disinformation (European Commission, 2018, p. 5).

1.2. Purpose and Problem Formulation

One of the primary goals of conflict resolution is to get in front of a potential conflict before it begins, to “increase the range of situations where violence is not a possibility” (Ramsbotham et al., 2017b, p. 146). In consideration of disinformation’s proliferation and the evidence suggesting an increased risk of intergroup conflict and violence, this thesis seeks to understand what strategies are being employed by state actors to mitigate the likelihood of violence arising from disinformation, for the benefit of future intervention planning.

The United Kingdom (UK) and Sweden both offer promise as two worthwhile cases to study for the purposes of this task for several reasons. Firstly, the UK and Sweden are two state actors who have been targeted by state-sanctioned disinformation attacks (DIIS, 2020; Internetkunskap, 2021). Secondly, these two states have taken deliberate measures to safeguard their societies against disinformation (MSB, 2019; UK Government Communication Service, 2019). Lastly, the researcher is acquainted with the languages and national security strategies of these two states—which, in principle, ought to lead to more compelling and productive work on behalf of the researcher.

The ambition of this study is not to provide an exhaustive account of all counter-disinformation strategies that are presently employed within the UK and Sweden, but rather, to illuminate the different dimensions of strategy that are being employed to combat disinformation as identified within government communications. By analysing the various types of strategies that are being employed by the British and Swedish governments to strengthen societal resilience against disinformation, I aim to illuminate tried and tested practices to serve as a repository aid in future intervention planning.

To accomplish these aims, this thesis seeks to answer the following research question:

1. How does the United Kingdom compare with/differ from Sweden in the type of disinformation resilience strategy it employs?

To answer this research question, the following sub-questions will be analysed:

I. What strategies have the United Kingdom and Sweden employed to strengthen the individual’s ability to evaluate the information they are exposed to?

II. What strategies have the United Kingdom and Sweden employed to reduce societal disinformation exposure?

2. Literature Review

Our diffuse and complex digital information ecology makes the study of disinformation a highly challenging task for any researcher or analyst. Therefore, the following literature review aims to synthesise perspectives from across a wide variety of fields for the purpose of clarifying the wider social and technological backdrop that underpins subversive communications and the emergence of disinformation as a contemporary security threat. Additionally, the conceptual and epistemological debate around the term ‘disinformation’ will be discussed, before concluding with a review of previous research on disinformation within the field of Peace and Conflict Studies (PACS).

2.1. The Contested Information Environment

Kornienko et al. (2015) claim that the development of our modern communicative information society has transformed the very nature of global power relations. Traditional and non-traditional actors alike are becoming increasingly empowered by emerging information technologies to influence rules, institutions and social outcomes (Padovani & Pavan, 2011, pp. 553–554). Furthermore, research shows that greater access to information, social networks, and collective opportunities proves to be a significant enabler for increased civic engagement and political activity (Vaccari, 2017), not least among formerly underprivileged groups, who can now participate in political affairs at levels previously unimaginable (Singh, 2002, 2013). What is more, the traditional gatekeepers and information brokers of the twentieth century no longer hold the same influence over their targeted audiences now that knowledge curation has effectively become ‘democratised’ (Hussain, 2012; Waltzman, 2017, p. 2). Simply put, a growing multiplicity of actors now have the opportunity to communicate to far-reaching audiences all over the globe at extremely low costs—which includes the possibilities of enhanced autonomy, anonymity and the chance to be whoever, whatever and wherever one wishes (Vartapetiance & Gillam, 2014).

Following these substantial social changes are profound structural transformations across the global information environment. In 2018, the International Data Corporation (IDC) released a report predicting that the global datasphere will increase fivefold over seven years—from 33 zettabytes (ZB) in 2018 to 175 ZB in 2025 (Rainsel et al., 2018). This surge in data growth follows the rising number of people and devices connecting to the internet each year, with projections estimating that 5.3 billion people will be online in 2023, up from the 3.9 billion people that were reportedly online in 2018 (Cisco, 2020, p. 5). Likewise, networked devices are predicted to reach 29.3 billion globally in 2023, up from the 18.4 billion networked devices in 2018 (Cisco, 2020, p. 29). This amplification of global communications activity has amassed alongside the proliferation of algorithms and bots, which is profoundly altering human information exposure and consumption habits due to enhanced automation of content delivery (Sarts, 2021; Woolley & Howard, 2019b). As the abundance of information intensifies, some scholars have noted how peoples’ motivation and ability to attentively process information diminishes (Petty & Cacioppo, 1986, p. 128). In the words of Weng et al., “the abundance of information to which we are exposed through online social networks and other socio-technical systems is exceeding our capacity to consume it” (Weng et al., 2012, p. 1). It is against this background that Lin (2019) observes subversive influences competing to capture our scarce attention within an increasingly vast and contested information environment. Each and every day, people the world over are confronted with more and more information that is designed to bolster or challenge previously held beliefs and values (Taber & Lodge, 2006). Jack (2017) highlights the many different forms of subversive influence that now target audiences to achieve these ends—like advertising, public relations and information operations.

Of grave concern here, and the focus of this thesis, is the growing prevalence of actors who deliberately misrepresent information to achieve their desired aims (Woolley & Howard, 2019b). This concern is shared by many people throughout the world: a comprehensive 2020 survey found that 56% of sampled respondents (across 40 countries worldwide) are now concerned over the accuracy of the news that they read online (Newman et al., 2020). Elsewhere, some of the world’s most powerful states now openly acknowledge that information technology is being harnessed by a wide range of actors to target and manipulate the perceptions of their domestic populations for political ends (FMPRC, 2020; Library of Congress, 2020; Official Journal of the European Union, 2018). Moreover, mounting evidence suggests that disinformation is becoming more and more prevalent within election interference efforts globally (Baines & Jones, 2018; Freedom House, 2017). According to Whyte and Etudo (2020, p. 125), the bulk of this effort is accomplished via direct audience engagement—using trolls, bots and an array of ‘useful idiots’ to disseminate fabricated information. Simply put, the ever-growing global information environment has given rise to an increased ‘attack surface’ of vulnerabilities that are open to exploitation from a multiplicity of actors (Whyte et al., 2020, p. 3). As Dr Kello puts it, it is now possible to cause “significant harm to a nation’s political, economic and social life without firing a single gun” (Select Committee on International Relations, 2018, p. 29).

One notable case that illuminates the grave implications of disinformation is Russia’s sustained information campaign during the Russia–Ukraine conflict in 2014. Khaldarova and Pantti (2016, pp. 3–4) describe how Russian state-owned media outlets were used by the government to disseminate ‘strategic narratives’ throughout the conflict to shape the perceptions and actions of the domestic population and compatriots in Russia’s near abroad. One such narrative centred on a rising existential fascist threat in Ukraine that leaned heavily on pre-existing enemy symbolism from World War II. Other narratives were composed of strong identity and ethnic themes, like a ‘Russian Slavic Orthodox Civilization’ standing in defiant resistance to a ‘decadent’ Europe (NATO StratCom, 2015, p. 4). Such influence tactics are said to prey upon pre-existing social divisions and anxieties using deliberately misleading information to sow distrust and division amongst communities—so as to divide and control the targeted audience (Karlsen, 2019). A more evident effect of disinformation during the Russia–Ukraine conflict was demonstrated in February 2014, when unmarked Russian military units surfaced across Ukraine’s Crimean Peninsula amidst the political turmoil—seizing airfields, key administrative buildings and other strategic points, including Crimea’s parliament building, which would promptly facilitate the snap referendum that would bring about Russia’s annexation of Crimea (Grant, 2015). Responding to these unfolding developments at the time, Russian state-owned media espoused what would soon be understood as another government ‘strategic narrative’—describing the unmarked armed units as “similarly dressed and equipped to the local ethnic Russian ‘self-defence squads’” (The Guardian, 2014). Likewise, responding to direct questioning on the origins of these units in Crimea, Russian president Vladimir Putin made the state’s position firmly clear: “those were local self-defence units” (Kremlin Russia, 2014). It would not be until almost one year after Russia’s annexation of Crimea that Vladimir Putin would publicly lay bare Russia’s role:

“In order to block and disarm 20,000 well-armed [Ukrainian soldiers], you need a specific set of personnel. And not just in numbers, but with skill. We needed specialists who know how to do it”

“That’s why I gave orders to the Defense Ministry -- why hide it? -- to deploy special forces of the GRU (military intelligence) as well as marines and commandos there under the guise of reinforcing security for our military facilities in Crimea” (RFE/RL, 2019).

By partaking in informational manoeuvring of this sort to create ambiguity and produce bandwidth challenges for onlooking states, actors can remain beneath the threshold of war and bypass potential military confrontation with other states to achieve favourable political outcomes (Deibler, 2020, p. 136). This is arguably why a number of states now seek to achieve ‘discourse dominance’ or ‘information advantage’ in pursuit of political advantage through the shaping of audience perceptions (Kania, 2020; UK Ministry of Defence, 2018). Moreover, because of this rising competitive pressure within the global information environment, states are co-evolving and updating their national security strategies to compete accordingly, with inevitable cascading consequences (Ruhmann & Bernhardt, 2019). Copious military doctrines now reveal how this information contest stretches deep into the civilian realm (UK Ministry of Defence, 2018, pp. 1–2; U.S. Department of Defense, 2016, p. 2), eroding distinctions between military and civilian—war and peace—and perhaps amounting to what Dr Kello refers to as “unpeace” (Select Committee on International Relations, 2018, p. 29). It is no exaggeration to say that state coordinated information operations now extend well beyond conflict zones and deep into the heart of public life.

The growing prevalence of subversive influences resulting from sophisticated strategic manipulation is now widely studied across a wide array of disciplines—from international relations (Wohlforth, 2020), journalism studies (Waisbord, 2018) and media and communications (Taddicken & Wolff, 2020) to security studies (Carter & Carter, 2021). Likewise, a number of different terms are being used to capture and describe the challenges emerging from the prevalence of false information in our digitalised communications society—like computational propaganda (Woolley & Howard, 2019a), disinformation (Lanoszka, 2019), fake news (Vargo et al., 2018), information disorder (Wardle & Derakhshan, 2017), information warfare (Klein, 2018), misinformation (Vraga & Bode, 2020), political warfare (Smith, 1989), post-truth (Haack, 2019) and truth decay (Kavanagh & Rich, 2018).

Owing to the far-reaching nature of this phenomenon and the copious terms being used to describe the driving forces behind it, it is necessary to alleviate the conceptual saturation around this study’s focus area and justify the appropriate terminology for the purposes of this research. As mentioned earlier, this thesis is exclusively concerned with the type of information that contains deliberately misleading elements—that is, information that is misleading by intent, rather than by accident. There are a number of terms that are used interchangeably to describe misleading information that is spread under the guise of factual information—these are ‘propaganda’, ‘misinformation’, and ‘disinformation’ (Guess & Lyons, 2020). The term propaganda remains vague and ill-defined; however, whilst most scholars accept that it constitutes a deliberate process to persuade an audience to further some agenda, at times unethically, many would agree that propaganda still has the capacity to be truthful (Guess & Lyons, 2020; Taylor, 2003). Misinformation and disinformation, on the other hand, both unequivocally refer to inaccurate information, with the underlying intentions of the message being the key differentiator between the two (Wu et al., 2019). Quite simply, misinformation can be an unintentional affair, where individuals with benign motivations spread inaccurate information within their networks because they genuinely believe it to be true, despite facts to the contrary. In contrast, disinformation is intentionally misleading, which makes this concept the primary focus of this thesis’s investigation.

2.2. Disinformation, Definitions and Epistemology

Fallis (2015) defines disinformation as “misleading information that has the function of misleading” (Fallis, 2015, p. 413). This definition shares parallels with the Swedish understanding of disinformation, which defers to the Swedish Civil Contingencies Agency (MSB) definition: “disinformation refers to incorrect or manipulated information that is deliberately disseminated for the purpose of misleading” (MSB, 2018, p. 25). However, in the strictest sense, these definitions of disinformation would capture harmless things, like jokes and satire, which function to mislead their targeted audience, albeit temporarily, to achieve humorous ends. But as some scholars point out, satire and humour are typically absent from accusations of disinformation on account of the short-lived nature of the deception and the humorous intentions of the communicator (Lanoszka, 2019, p. 3; Meinert et al., 2018, p. 486). The implication of this suggests the need to probe deeper into the nature of the intentions underpinning the message.

Wardle and Derakhshan (2017) provide a definition of disinformation that accounts for the hostile intentions that can underpin a message: “information that is false and deliberately created to harm a person, social group, organization or country” (Wardle & Derakhshan, 2017, p. 20). This definition would exclude lighthearted things like jokes, satire, and sarcasm—but it would also exclude marketing campaigns that seek to deliberately mislead audiences for financial gain. For example, a guerrilla marketing campaign may seek to influence the audience’s purchasing behaviour using deliberately misleading information (Rtec Auto Design, 2016), or a public relations company may be paid to pump out misleading positive narratives for a would-be political candidate during election time (Buzzfeed News, 2020). In both these cases, one would find it extremely difficult to demonstrate harmful intentions; however, misleading schemes of this sort are still arguably counterproductive to preserving society’s ability to communicate in a peaceful and legitimate manner. In contrast, if we examine how the UK’s Government Communication Service defines disinformation, we find a much more comprehensive definition that captures the wider spectrum of interests that may underpin disinformation campaigns: “disinformation is the deliberate creation and dissemination of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain” (UK Government Communication Service, 2019, p. 6).

No less important than unpacking disinformation’s conceptual variance is the need to clarify what exactly is required to make an allegation of disinformation in the first place. In other words, how does one know disinformation when one sees it? After all, it is all too easy to assume that disinformation is merely related to some objective truth—which it is, if thought of in strictly rational terms. In practice, however, how does one measure the intentions behind a message? Or measure the truthfulness of a claim? Particularly in political matters, where knowledge is continually contested and in flux? With respect to the latter point, Vraga and Bode (2020) provide an invaluable assessment of the epistemological considerations that underpin the confidence behind misinformation allegations, particularly in proving that information is inaccurate—a criterion that is equally found within the concept of disinformation. Vraga and Bode’s research demonstrates that consensus among experts and the levels of evidence available within the information environment are the key determining factors that permit allegations pertaining to an information’s inaccuracy. This raises some significant questions: who qualifies as an expert? What level of agreement must experts satisfy to reach expert consensus? What qualifies as the best evidence for emerging issues? Circling back to the first epistemological challenge concerning disinformation, namely how one can practically determine the intentions underpinning information, Wanless and Pamment (2019, p. 4) highlight the inherent immeasurability of such a requirement, particularly at scale, which proves to be extremely problematic for researchers and analysts looking to capture instances of disinformation, in addition to the difficulties in discerning between misinformation and disinformation respectively. This conundrum was even acknowledged within a recent Swedish government bill, which recognised how external information influence campaigns can be extremely difficult to separate from legitimate domestic opinion (Försvarsdepartementet, 2020, p. 61). The epistemological challenges outlined here serve as a warning to conflict scholars and analysts that a cautionary approach to their methods in the study of disinformation is vital.

2.3. Previous Peace Research on Disinformation

The work of the present generation of conflict resolvers takes place “in a world that is changing rapidly, most especially under the impact of global information communications technology” (Ramsbotham et al., 2017a, p. 425). Mass communications are a significant area of study for scholars of PACS on account of their influence over how societies organise themselves (Ramsbotham et al., 2017a, p. 420). The impact of mass communications upon successful conflict resolution is typically expressed as a double-edged sword. On the one hand, communications inform, educate and unite people—but on the other hand, they also mislead, divide and turn people against one another (Gallacher et al., 2021). The latter occurs all too naturally amid the circulation of disinformation (Ward & Beyer, 2018), not least since information distortion via disinformation can lead to dangerous misperceptions about another actor’s intentions—which may lead to devastating consequences in the event of a miscalculation (Dreze, 2000, p. 1177).

The study of malign communications within PACS generally falls under the purview of ‘Peace Journalism’ (Lynch & McGoldrick, 2007). In the 2007 Handbook of Peace and Conflict Studies, disinformation is briefly mentioned as a distinct military technique that plays a role in the shaping of misleading news representations about a conflict (Lynch & McGoldrick, 2007, pp. 248–249). However, disinformation’s significance within Peace Journalism is typically regarded as a subsidiary element that shapes news representations of war and not as significant in and of itself, much less during peacetime. Perhaps this is why there appears to be a distinct absence of research on the topic of disinformation within the prevailing PACS scholarship. The Journal of Peace Research (JPR), Journal of Conflict Resolution (JCR) and the journal of Conflict Management and Peace Science (CMPS) hold no publications that contain the word ‘disinformation’ within the title of their works. The Peace Review journal returned one publication from 1993 titled ‘DisInformation, DatInformation’, a five-page article that discusses U.S. CIA disinformation campaigns in South America during the Cold War (Sharkey, 1993). Quite simply, there is a distinct lack of research and debate within PACS on the phenomenon of disinformation as a discernible risk to intergroup conflict and violence. In view of this gap within the field of PACS, this thesis seeks to provide a modest offering that describes functioning resilience strategies against disinformation to encourage further debate and contribute towards future intervention planning.

3. Analytical Framework

Here this study presents a bespoke analytical framework for understanding the types of strategies that are currently being employed by the United Kingdom and Sweden to reduce the threat of disinformation, in consideration of the increased risk of violence and intergroup conflict that disinformation poses (Arayankalam & Krishnan, 2021; Ward & Beyer, 2018). The framework is developed from insights derived from the academic literature on resilience, in particular insights that enable this study to make sense of the forthcoming resilience strategies for the purposes of classification.

3.1. Introduction to Resilience

As a form of non-kinetic attack that does not quite meet conventional understandings of aggression, let alone the threshold of war, disinformation is proving to be extremely problematic to deter within Western liberal democracies—not least due to the difficulties in sourcing attribution (EU DisinfoLab, 2020). Such challenges are contributing to a growing perception within state policy of a more risky, complex and uncertain world (Joseph, 2016). In light of these developments, state decision-making is seen as too fragmented and weakly coordinated to be capable of dealing with the rapid pace of technical and ecological change (Duit et al., 2010). It is for this reason that there is a growing recognition that responding to each national security threat is now realistically beyond the scope of the state’s security apparatus (Braw, 2020). This has led to increased calls for the strengthening of societal resilience—a concept that emphasises personal responsibility, self-awareness and self-regulation in the face of increasing shocks and uncertainty (Joseph, 2016, p. 14). Western states, multilateral organisations and NGOs are increasingly advocating for societal resilience to combat and mitigate the sort of challenges posed by threats such as disinformation (Atlantic Council, 2019; Council of the European Union, 2020; NATO, 2020). As Chandler puts it: “Governing complexity is thereby understood to be a process whereby failures or unintended outcomes can be seen as an inevitable part of that process and the key aspect is how failure is reflected upon to shape future policy-making” (Chandler, 2014, p. 12).

Etymologically, the term ‘resilience’ originates from the Latin verb resilire (to rebound or recoil), which entered the English language sometime in the early 17th century—undergoing significant transformation in its meaning up until the present day (McAslan, 2010). Resilience has been understood variously as the property of a material that can withstand pressure and maintain its form, the capacity of an ecosystem to tolerate shock and recover, the ability of children and adults alike to tolerate traumatic and adverse situations, and the ability of a system to withstand and overcome crises in a complex world (McAslan, 2010). Today, however, particularly in the field of international security, resilience generally refers to society’s ability (encompassing all of its individuals) to respond and recover from the impact of shocks—including society’s ability to adapt through learning, changing and re-organising to better cope with future threats (Cutter et al., 2008, p. 599). Fundamentally, resilience relates to a system’s capacity and not some outcome, a point that is conceptually vital for understanding the concept’s underlying essence—which is primarily about improving the individual’s ability to make decisions over their own lives in order to build more resilient societies (Béné et al., 2016, p. 125). In the words of Chandler (2012), “the resilient subject (at both individual and collective levels) is never conceived as passive or as lacking agency, but is conceived only as an active agent, capable of achieving self-transformation” (Chandler, 2012, p. 217). Chandler recognises that resilience is a normative concept, on account of its relative quality: if one wants to measure it adequately, resilience must be placed in relation to some preconceived outcome, since it is not something that is directly observable in and of itself (Sturgess, 2015, p. 7). So in practical terms, resilience is context-specific and necessitates tailor-made solutions to address specific risks (European Commission, 2017, p. 23).

Since resilience is ultimately a normative, capacity-enabling endeavour, the question then becomes a matter of determining the appropriate means to build it. In the case of strengthening resilience against disinformation, practitioners generally adopt a systems-oriented approach—focusing on exogenous influences and proposing top-down initiatives to shape them, for instance by creating fact-checking institutions or communicating positive narratives to the targeted audience (European Commission, 2017, p. 16). Such interventions are what Lazer et al. (2018, p. 1095) refer to as ‘structural’ interventions—whereby practitioners strive to shape external influences to prevent individuals from being exposed to disinformation in the first place. But there are clear risks in leaning too much on this approach, as Lazer et al. note: “fact checking might even be counterproductive under certain circumstances” (Lazer et al., 2018, p. 1095). By way of example, a 2014 randomised trial that set out to increase child vaccination rates by testing the effectiveness of counterattitudinal persuasive interventions turned out to be massively detrimental to the desired outcome (Nyhan et al., 2014). The trial sought to reduce misunderstandings surrounding vaccine scepticism by using factual communications to clear up general misconceptions, an endeavour that ended up having zero impact on parental intent to vaccinate a future child. On top of that, parents who had the least favourable vaccine attitudes going into the study were reportedly even less likely to vaccinate their children after these deliberate interventions. The implication of these findings is that for distrustful people who are likely to resist persuasion, the targeted attitude often becomes significantly stronger after attempted counterattitudinal persuasive interventions, like fact-checking—a conclusion that is supported by the psychological literature (Tormala & Petty, 2004). Likewise, these findings may indicate that concentrating one’s intervention strategy against disinformation on shaping exogenous factors, at the expense of improving the individual’s cognition, may be counterproductive towards building effective resilience. In the words of Julia Koller, a lead developer for learning solutions: “information is only as reliable as the people who are receiving it. If readers do not change or improve their ability to seek out and identify reliable information sources, the information environment will not improve” (Pew Research Center, 2017).

3.2. Reflections on Resilience

Against this background, it seems clear that if individual cognition is overlooked, certain resilience strategies are going to be unsatisfactory, if not at times harmful, to the desired capacity that resilience advocates are seeking. In contrast to focusing on exogenous influences, an approach that focuses on the individual one-on-one, like an education initiative that seeks to strengthen a person’s capacity to critically evaluate the information they consume, rather than telling them what is true, places the agency of the individual most in need of help front and centre (Chandler, 2012, p. 216). Research suggests that such bottom-up interventions, even on the small scale, are rather effective at assisting an individual’s ability to perceive the accuracy of false information (Guess et al., 2020). If we recall Chandler’s understanding of resilience mentioned earlier, resilience can be understood as the symbiosis of a group of individual capacities coming together. Reasoning from this, societal resilience is simply individual capacity at scale. To suggest that resilience transcends the individual in some way, as some scholars do (Humprecht et al., 2020), is arguably to commit a fallacy of misplaced concreteness—mistaking something quite abstract for the way things are in actuality. Likewise, to prioritise shaping exogenous factors within a resilience strategy in lieu of building individual capacity is to potentially undermine the very endeavour of resilience building entirely.

3.3. Resilience Framework

Since this thesis aims to identify and analyse the different types of resilience strategies being employed by the UK and Sweden, the following analytical framework provides a predetermined structure for classifying the different forms of strategies that will be identified in the forthcoming analysis. In view of the preceding considerations concerning individual-focused and structural-focused interventions, inspired by Lazer et al.’s (2018) dichotomy of interventions against fake news, the following analytical framework is designed to describe a state’s resilience strategy through two distinct strategy types. The first type of strategy is focused on strengthening an individual’s ability to make decisions over their own lives, whilst the second type is focused on exercising top-down initiatives that aim to shape exogenous influences. To be capable of identifying these distinct strategies within a body of text, there ought to be clear indicators for each type of resilience strategy to guide the forthcoming analysis.

The first type of resilience strategy of interest to this study is specifically concerned with efforts to strengthen the individual’s ability to evaluate the information they are exposed to, which will be referred to as a micro strategy for the purposes of this study (see table 1). To locate a micro strategy within a piece of content, the researcher will search for the combination of (1) a learning effect—constituting some desired effect that is believed to enhance an individual’s ability to independently evaluate the accuracy of information and (2) the necessary means—constituting the specified solution to achieve the desired learning outcome in question.

Table 1 Logic of a Micro Strategy

Element | Indicator | Identifier
Learning Effect | A sentence constituting some desired effect that is believed to increase an individual’s capacity to independently evaluate the accuracy of information. | L
Necessary Means | A sentence constituting the solution to achieve L | N

The second type of resilience strategy that this study is interested in finding is specifically concerned with state efforts to influence exogenous factors for the purposes of reducing societal disinformation exposure, which will be referred to as a macro strategy for the purposes of this study (see table 2). To locate a macro strategy within a piece of content, the researcher will search for (1) a structural effect—constituting some desired effect that is believed to reduce the risk of disinformation in the environment and (2) a policy approach—the presumed solution to achieve the desired structural effect.

Table 2 Logic of a Macro Strategy

Element | Indicator | Identifier
Structural Effect | A sentence constituting some desired effect that is believed to reduce the risk of disinformation in the environment. | S
Policy Approach | A sentence constituting the presumed solution to achieve S | P
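Since the framework in tables 1 and 2 functions as a coding scheme, a brief illustration may help: the following Python sketch shows one way the two strategy types could be represented during coding. The class and field names render the framework's elements as illustrative identifiers; none of this is part of the thesis's actual instrument.

```python
# Illustrative only: a minimal data model for the two strategy types in
# tables 1 and 2. Field names mirror the framework's elements; nothing
# here is prescribed by the thesis itself.
from dataclasses import dataclass, field

@dataclass
class MicroStrategy:
    """Learning effect (L) paired with one or more necessary means (N)."""
    learning_effect: str                                       # sentence coded 'L'
    necessary_means: list[str] = field(default_factory=list)   # sentences coded 'N'

    def is_complete(self) -> bool:
        # A lone element without its supporting element is not a strategy.
        return bool(self.learning_effect and self.necessary_means)

@dataclass
class MacroStrategy:
    """Structural effect (S) paired with one or more policy approaches (P)."""
    structural_effect: str                                     # sentence coded 'S'
    policy_approaches: list[str] = field(default_factory=list) # sentences coded 'P'

    def is_complete(self) -> bool:
        return bool(self.structural_effect and self.policy_approaches)
```

Representing each strategy as a desired effect plus a list of supporting elements reflects the framework's rule that several means or approaches may attach to a single effect.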

4. Methodology

This chapter presents the methodological strategy that guided the forthcoming analysis to ensure that this study’s research question was reliably answered to the best of the researcher’s ability.

4.1. Design

This thesis chose to conduct a qualitative comparative content analysis for the purpose of answering the research question: how does the United Kingdom compare with/differ from Sweden in the type of disinformation resilience strategy it employs? A qualitative approach is ideally suited to the investigation of meanings in context, permitting the researcher to act as the research instrument in order to gather and interpret the underlying meanings behind the gathered data (Merriam & Tisdell, 2015, p. 2). Given that the UK and Sweden’s resilience strategies are scattered throughout copious publications, absent of classification or grouping, a qualitative approach seems best suited to capture, codify and describe the different dimensions of strategy that are being employed to combat disinformation. The comparative element of this approach constitutes a systematic, rule-based approach to the analysis of the “informational contents of textual data” (Forman & Damschroder, 2007, p. 39), which serves two mutually reinforcing purposes for this study: (1) context, enabling the researcher to know what the UK and Sweden’s resilience strategies are like, and (2) classification, reducing the complexity of the UK and Sweden’s resilience strategies through the placement of empirical evidence into relevant ‘data containers’ (Landman, 2008, p. 4).

4.2. Source Discussion

This study sought to understand and compare the different types of resilience strategies that are currently being practised by the UK and Swedish governments. Therefore, the official government websites of the UK and Sweden were chosen as the source of primary data for this analysis, on account of their direct representation of each government’s official policies. The researcher recognises how the chosen sources may diminish the study’s content validity—which refers to whether the researcher is measuring all the things that ought to be measured in relation to the intended construct (Creswell & Creswell, 2018, p. 215). One could argue that the choice to exclusively source this study’s data from the official government websites is inadequate to faithfully represent the full extent of the UK and Sweden’s resilience strategies. Which is to say, different organs of state and their respective publications may very well have provided a richer understanding of each state’s resilience strategy than solely focusing on one source. Acknowledging this consideration, an abridged representation of the UK and Sweden’s resilience strategy is likely the best that can be achieved within the scope of this study and its chosen sources. Likewise, the study’s choice to exclusively focus on state policy clearly omits the role that civil society has in strengthening societal resilience against disinformation, which may have afforded a richer understanding of resilience had it been taken into consideration (Aslama, 2019, 'Meso Level: Public Media and Collaborations'). However, there was simply not enough time to incorporate additional sources that may have been relevant to the purposes of this study.

4.3. Data Collection Process

The study adopted a multi-stage purposeful sampling approach for selecting the data for this study’s analysis—an approach commonly used for the selection of cases related to information-rich phenomena (Palinkas et al., 2015). Purposeful sampling refers to when the researcher has selected their data “based on a specific purpose rather than randomly” (Tashakkori & Teddlie, 2003, p. 713). The study sourced all data pertaining to the UK from gov.uk—the official website of the UK government. Likewise, the researcher sourced all data pertaining to Sweden from regeringen.se—the official website of the government of Sweden. It must be noted that all data sourced from regeringen.se is in Swedish and required translation into English by the researcher for the purposes of this analysis. Hence, all representations of Sweden’s resilience strategy presented within this study are the product of the researcher’s best attempts to faithfully represent the position of the Swedish government in the English language.

4.3.1. Part One: Content Selection

In the first stage of the data collection process, the researcher searched for the term ‘disinformation’ (UK) and ‘desinformation’ (Sweden) using the respective search functions of both websites. The search process identified 241 results from gov.uk (UK) and 231 results from regeringen.se (Sweden). It should be noted that the results often contained additional resources that represented the main focus of the content (e.g. an attached PDF report). In such instances, all attached publications were analysed in addition to the parent item, so long as they fulfilled the appropriate selection criteria (see table 3).

Table 3 Content Selection Rules

Rule | Criteria
Search terms | ‘disinformation’ (UK) & ‘desinformation’ (Sweden)
Eligible content types | Web page, .PDF
Eligible date range | January 1st, 2014 to May 1st, 2021
Languages | English (UK) and Swedish (Sweden)
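Purely as an illustration of how the rules in table 3 combine, the sketch below expresses them as a single filter over a search result's metadata. The field names (matched_search_term, content_type, published, language) are hypothetical, since the actual screening was performed manually through each website's search function.

```python
# A hedged sketch of table 3's selection rules; all metadata fields are assumed.
from datetime import date

ELIGIBLE_TYPES = {"web page", "pdf"}
DATE_RANGE = (date(2014, 1, 1), date(2021, 5, 1))
EXPECTED_LANGUAGE = {"gov.uk": "en", "regeringen.se": "sv"}

def is_eligible(item: dict, site: str) -> bool:
    """Return True if a search result satisfies all four table 3 criteria."""
    return (
        item["matched_search_term"]                 # hit for 'disinformation'/'desinformation'
        and item["content_type"] in ELIGIBLE_TYPES
        and DATE_RANGE[0] <= item["published"] <= DATE_RANGE[1]
        and item["language"] == EXPECTED_LANGUAGE[site]
    )
```

Under these assumptions, an attached PDF would be screened with the same call as its parent web page.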

4.3.2. Part Two: Strategy Identification

The next stage of the data collection process involved the analysis of all 241 results from gov.uk (UK) and all 231 results from regeringen.se (Sweden). The analytical framework outlined in the previous chapter provided the basis for the following sub-questions to guide this process:

1. What strategies have the United Kingdom and Sweden employed to strengthen the individual’s ability to evaluate the information they are exposed to? (Micro)

2. What strategies have the United Kingdom and Sweden employed to reduce societal disinformation exposure? (Macro)

The unit of analysis within the chosen dataset is the sentence, which was coded accordingly. Using the analytical framework outlined in the previous chapter, all text from the 241 gov.uk results (UK) and the 231 regeringen.se results (Sweden) was examined to detect the combination of any two elements that qualify as a micro strategy or macro strategy respectively (see table 4).

Table 4 Strategy Identification Rules

Type | Element | Indicator | Identifier
Micro Strategy (individual) | Learning Effect | A sentence constituting some desired effect that is believed to increase an individual’s capacity to independently evaluate the accuracy of information. | L
Micro Strategy (individual) | Necessary Means | A sentence constituting the solution to achieve L | N
Macro Strategy (exogenous) | Structural Effect | A sentence constituting some desired effect that is believed to reduce the risk of disinformation in the environment. | S
Macro Strategy (exogenous) | Policy Approach | A sentence constituting the presumed solution to achieve S | P
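To illustrate how the pairing rule in table 4 might operate on coded sentences, here is a sketch under the assumption that each sentence has already been hand-labelled with one of the four element codes and that supporting elements attach to the most recent desired effect; the aggregation and disqualification rules follow the description in the next paragraph.

```python
# Sketch: pair a coded document's sentences into candidate strategies,
# assuming each sentence was manually labelled 'L', 'N', 'S', 'P' or None.
def assemble(coded: list) -> dict:
    """coded is a list of (sentence, code) pairs in document order."""
    micro, macro = [], []
    for sentence, code in coded:
        if code == "L":
            micro.append({"L": sentence, "N": []})
        elif code == "N" and micro:
            micro[-1]["N"].append(sentence)   # aggregate supporting means
        elif code == "S":
            macro.append({"S": sentence, "P": []})
        elif code == "P" and macro:
            macro[-1]["P"].append(sentence)   # aggregate policy approaches
    # Lone desired effects without a supporting element are disregarded.
    return {"micro": [m for m in micro if m["N"]],
            "macro": [m for m in macro if m["P"]]}
```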

The particular constellation of words and statements within a sentence that qualifies as an element by meeting the aforementioned criteria was referred to as a meaning unit for the purposes of coding (see table 5). A meaning unit refers to a sentence that corresponds to a single central meaning, whether that is a learning effect, the necessary means, a structural effect, or a policy approach (Graneheim & Lundman, 2004, p. 106). It should be noted that the discovery of a single element within a publication (e.g. a learning effect) without its corresponding supporting element (e.g. the necessary means) fails to qualify as a complete strategy and was disregarded accordingly. On the other hand, content that contained more than one corresponding supporting element (e.g. the necessary means or a policy approach) for a desired effect was aggregated to the strategy in question. Lastly, any duplicate strategies identified throughout the course of the analysis were discarded for the purposes of clarity.

Each strategy identified within the analysis was given a unique three-digit identification number—with the letters ‘MI’ placed beforehand to denote a micro strategy (e.g. MI001) and the letters ‘MA’ placed beforehand to denote a macro strategy (e.g. MA001). Likewise, each piece of content that revealed a strategy during the analysis was provided a unique three-digit identification number for the purposes of ‘source’ classification—with the letters ‘UK’ placed beforehand to denote the item’s British origins (e.g. UK001) and the letters ‘SE’ placed beforehand to denote the item’s Swedish origins (e.g. SE001).

Table 5 Example of a Macro Strategy

ID | Source | Element | Meaning Units
MA001 | UK001 | S | “empower independent media”
 | | P | “building capacity and raising the professionalism of journalists, as well as providing higher quality products to local audiences, helping counter disinformation”

Any supplementary publications (e.g. an attached PDF report) found within a piece of content that revealed a strategy were given the same unique three-digit identification as their parent content holder. Content that bore no strategies or produced only duplicate strategies was discarded and omitted from the classification process. Finally, all content that produced at least one strategy can be found in the references chapter under ‘data collection’.
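The numbering scheme just described is mechanical enough to express in a few lines. The sketch below is a hypothetical helper, not the author's tooling: it hands out MI/MA numbers for strategies and UK/SE numbers for source items, and returns the parent's existing ID when a supplementary publication is keyed to the same parent item.

```python
# Hypothetical ID registry for the MI###/MA### and UK###/SE### labels.
from itertools import count

class IdRegistry:
    def __init__(self) -> None:
        self._counters = {p: count(1) for p in ("MI", "MA", "UK", "SE")}
        self._sources: dict[str, str] = {}   # parent item -> assigned source ID

    def strategy_id(self, kind: str) -> str:
        """kind is 'MI' (micro) or 'MA' (macro); e.g. strategy_id('MI') -> 'MI001'."""
        return f"{kind}{next(self._counters[kind]):03d}"

    def source_id(self, parent_item: str, country: str) -> str:
        """country is 'UK' or 'SE'; attachments keyed to the same parent
        item inherit its ID, mirroring the rule described above."""
        if parent_item not in self._sources:
            self._sources[parent_item] = f"{country}{next(self._counters[country]):03d}"
        return self._sources[parent_item]
```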

4.4. Analytical Discussion

Upon completion of the data collection process, the analysis progressed with the development of subcategories—a core feature of any qualitative content analysis (Graneheim & Lundman, 2004, p. 107). Subcategories were determined via pattern discovery between strategies that share a high degree of commonality (Krippendorff, 2004, p. 50). Krippendorff notes how these categories must be mutually exclusive of one another, which is what enables the data of a content analysis to be “informative” (Krippendorff, 2004, p. 155). Likewise, since the comparative aspect of this study sought to understand the differences between the actors under investigation, the facilitation of mutually exclusive categories provides the logic that substantiates the differences between the UK and Sweden’s resilience strategies, should they arise.

4.5. Reliability

Reliability generally refers to the consistency of a particular research tool, procedure or approach in different circumstances, granted that nothing else has changed (Roberts et al., 2006, p. 41). With respect to the qualitative research approach undertaken in this study, reliability refers to the consistency of the researcher’s analytical procedures (Noble & Smith, 2015, p. 34). Since the researcher is acting as the instrument, would the chosen procedures of this study provide consistent results if different researchers applied the same procedures in different research settings? One way to mitigate this risk would have been to perform an intercoder reliability test to ensure that the coding procedures would provide consistent results with different researchers (Chambliss & Schutt, 2019, p. 98). However, this study was unable to perform the necessary diligence checks on account of the time constraints surrounding this research. Therefore, this study acknowledges the diminished reliability of the implemented approach.
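Should a future replication attempt the intercoder check described above, one conventional route would be Cohen's kappa over two coders' sentence-level element labels. The sketch below is a generic implementation of that statistic, offered as an assumption about how such a test might be run rather than as a procedure from this study.

```python
# Generic Cohen's kappa for two coders' sentence labels ('L', 'N', 'S', 'P', or None).
from collections import Counter

def cohens_kappa(coder_a: list, coder_b: list) -> float:
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: probability both coders pick the same label independently.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return 1.0 if expected == 1.0 else (observed - expected) / (1 - expected)

# Example: five of six sentence labels agree, giving kappa of roughly 0.78.
a = ["L", None, "N", "S", None, "P"]
b = ["L", None, "N", None, None, "P"]
print(round(cohens_kappa(a, b), 2))
```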

5. Analysis

From the 472 pieces of content examined in this analysis (241 items from gov.uk and 231 items from regeringen.se), 43 British and 17 Swedish pieces of content unveiled 74 strategies admissible to this study’s analytical framework.

5.1. Micro Strategies

From the 74 strategies discovered in this study, 15 micro strategies were identified that sought to strengthen the individual’s ability to evaluate the information they are exposed to—eight via British sources and seven via Swedish sources. From these 15 micro strategies, two distinct sub-strategies were conceived in accordance with the commonalities shared between them.

5.1.1. Media and Information Literacy

The first sub-strategy derived from the collection of micro strategies is the media and information literacy sub-strategy. This type of micro strategy seeks to strengthen an individual’s critical thinking skills with a particular emphasis on improving their competence in navigating the digital environment. Educational practices to evaluate, use and create information are advocated to fulfil this aim within this type of strategy. Both the UK and Sweden practised this form of sub-strategy, which was identified on five occasions via UK sources and on five occasions via Swedish sources.

5.1.1.1. United Kingdom Results:

Table 6 British Media and Information Literacy sub-strategies

ID | Source | Element | Meaning Units
MI001 | UK006 | L | “help [individuals] think critically about things they might come across online, like disinformation”
 | | N | “online media literacy strategy”
MI002 | UK014 | L | “help increase user awareness of, and resilience to, disinformation and misinformation online”
 | | N | “The [online harms] regulatory framework will build on Ofcom’s existing duties to promote media literacy”
MI003 | UK014 | L | “enabling people to critically assess, appraise and challenge information online”
 | | N | “The forthcoming online media literacy strategy will set out more action to improve and strengthen audience resilience”
MI004 | UK016 | L | “develop people’s knowledge and confidence in navigating the online world and the information promulgated through it”
 | | N | “promoting work already in train across library services, and [the DCMS Libraries Team's] important role in education”
MI005 | UK032 | L | “enabling Ukrainian youth to better discern fact from fiction in the media and social media space and to make informed decisions as to which information they consume, share and produce”
 | | N | “UK programme assistance to Ukraine”
 | | N | “media literacy and critical thinking for secondary schools”

5.1.1.2. Sweden Results:

Table 7 Swedish Media and Information Literacy sub-strategies

ID | Source | Element | Meaning Units
MI006 | SE006 | L | “A strong knowledge base can also contribute to science-based policy and practice and build resilience to disinformation”
 | | N | “The Government therefore considers that scientific publications, which are the result of research funded by public funds, shall be immediately available with effect from 2021”
MI007 | SE008 | L | “develop critical thinking”
 | | N | “The school is the social institution that has the task of systematically and over time imparting knowledge and values to all children and young people in Sweden”
 | | N | “In the case of pre-school, compulsory school and upper secondary school, parts of the education can strengthen children and young people’s information literacy and resilience against disinformation, propaganda and online hate”
MI008 | SE008 | L | “increase awareness of disinformation, propaganda and online hate and to spread knowledge about media and information literacy and other resistance-building methods to as many people as possible”
 | | N | “an important starting point for the implementation of the external work has been to cooperate with and meet actors who pass on knowledge to others, so-called intermediaries”
 | | N | “Intermediaries can be organisations or individuals who, in their activities, have the potential to reach out to many people”
MI009 | SE011 | L | “promote more digital competence within the general public, including in MIK issues”
 | | N | “The Royal Library has received SEK 25 million annually 2018–2020 for an initiative called 'Digital first'”
MI010 | SE013 | L | “strengthen digital competence and media and information literacy”
 | | N | “The Government instructs the Swedish Media Council to develop frameworks for a strengthened collaboration of media and information literacy (MIK) initiatives as of the 1st of October 2018”

5.1.2. Digital Educational Material

The second and final sub-strategy derived from the collection of micro strategies is the digital educational material sub-strategy. This type of micro strategy aims to strengthen an individual’s critical thinking skills with a particular emphasis on raising disinformation awareness. Simple online checklists, governmental guidance publications and digital classroom material are the methods generally employed to achieve this aim within this type of strategy. Both the UK and Sweden practised this form of sub-strategy, which was identified on three occasions via UK sources and on two occasions via Swedish sources.

5.1.2.1. United Kingdom Results:

Table 8 British Digital Educational Material sub-strategies

ID Source Element Meaning Units

MI011 UK026 L “help [the individual] understand and spot disinformation”

N “use the SHARE checklist to make sure you’re not contributing to the spread of harmful content: Source - make sure information comes from a trusted source Headline - always read beyond the headline Analyse - check the facts Retouched - does the image or video look as though it has been doctored? Error - look out for bad grammar and spelling”

MI012 UK035 L “identify, assess and respond to disinformation”

N “the ‘RESIST’ toolkit, which enables organisations to develop a strategic counter-disinformation capability”

MI013 UK035 L “increase the audience’s ability to spot disinformation”

N “providing them with straightforward advice to help them check whether content is likely to be false or intentionally misleading”

N “behaviour change campaign [S.H.A.R.E]”

5.1.2.2. Sweden Results:

Table 9 Swedish Digital Educational Material sub-strategies

ID Source Element Meaning Units

MI014 SE007 L “stimulates critical thinking and source criticism”

N “The Living History Forum has produced the digital classroom material "Propaganda - Risk of Influence", which explains the mechanisms of propaganda”

MI015 SE016 L “to distinguish facts and independent reporting from fake news and disinformation”

N “The government therefore decided earlier this year on strengthened digital competence in both curricula and degree objectives and individual course and subject plans”

N “The State Media Council has, amongst other things, produced and developed the digital education material ‘MIK for me’ and educational material about propaganda and the power of images, for children and young people”

5.2. Macro Strategies

From the 74 strategies discovered in this study, 59 macro strategies were identified that sought to reduce societal disinformation exposure—46 via British sources and 13 via Swedish sources. From these 59 macro strategies, seven distinct sub-strategies were conceived in accordance with the commonalities shared between them.

5.2.1. Independent Journalism and Media

The first sub-strategy derived from the collection of macro strategies is the independent journalism and media sub-strategy. This type of macro strategy aims to cultivate a pluralistic media landscape in and around a disinformation threat actor. In the case of the UK and Sweden, both states sought to cultivate such a landscape in and around Russia’s near abroad, on account of both states recognising Russia as the biggest threat actor in the dissemination of disinformation (Swedish Government, 2017; UK Government, 2021, p. 75). Policies that create, support and promote independent journalism and media are typically employed to facilitate this end. Both the UK and Sweden practised this form of sub-strategy, which was identified on 14 occasions via UK sources and on two occasions via Swedish sources.

5.2.1.1. United Kingdom Results:

Table 10 British Independent Journalism and Media sub-strategies

ID Source Element Meaning Units

MA001 UK002 S “empower independent media”

P “building capacity and raising the professionalism of journalists, as well as providing higher quality products to local audiences, helping counter disinformation”

MA002 UK003 S “to support independent media, especially in Russia’s near abroad”

P “the Counter Disinformation and Media Development programme will fund initiatives to understand and expose the disinformation threat”

MA003 UK018 S “increasing capacity and professionalism of [Kyrgyzstan’s] journalists”

P “the British Embassy Bishkek is looking to support two projects for activity before March 2021”

MA004 UK019 S “promote independent media working in Belarus”

P “The UK has announced it is doubling its support to independent media, human rights organisations and community groups in Belarus, with an extra £1.5m for projects over the next two years”

MA005 UK021 S “enhance knowledge and understanding of emerging foreign and security policy issues among Czech communities by through activities to support quality independent journalism on [hybrid threats, including countering disinformation, promoting media literacy and cyber security]”

P “The Prague Programme Fund is a small and short-term funding mechanism, which allows British Embassy Prague to support local organisations seeking to deliver real and measurable outcomes”

MA006 UK022 S “Egypt: a project training journalists on countering disinformation and fake news”

P “The UK ran a major international campaign in 2019 on Media Freedom”

MA007 UK031 S “empower independent media [in the Eastern Europe and Central Asia (EECA) region]”

P “building capacity and raising the professionalism of journalists, as well as providing higher quality products to local audiences, helping counter disinformation”

MA008 UK033 S “counter disinformation across Eastern Europe and strengthen independent media in the Western Balkans”

P “£18 million over 3 years will support freedom of expression and strengthen independent media”

P “The funding from the Conflict, Stability and Security Fund (CSSF) will support freedom of expression and independent local voices in the Western Balkans to boost the creation of balanced, non-biased content”

P “The funding for Eastern Europe and Central Asia is part of a £100 million, 5-year commitment to counter disinformation and support independent media”

MA009 UK034 S “support independent media in Ukraine”

P “£9 million project which will strengthen societal resilience to disinformation and help increase Government accountability by developing independent sources of information in Ukraine and across the Eastern Partnership countries”

MA010 UK037 S “building the capacity of independent media outlets in Ukraine to hold power accountable and enable more informed and active citizens”

P “Support to National Anti-corruption Bureau of Ukraine with development of an information security management system (ISMS) strategy”

MA011 UK037 S “support peace-building efforts by helping establish an independent media space for people in the region to engage with decision makers and civil society”

P “supports the civic radio broadcaster, Hromadske Radio, to provide unbiased, factual reporting and news to the east of Ukraine (including the conflict affected areas), where access to independent media is limited and Russian disinformation is readily available”

MA012 UK038 S “build resilience to Russian disinformation and build plurality and balance across media landscapes”

P “joint actions [UK and Poland] aimed at supporting independent media in Eastern Partnership countries”

MA013 UK039 S “countering Russian disinformation”

P “more investment in public service and independent media operating in the Russian language, both through projects in the Baltic States, Ukraine, Moldova and Georgia”

P “And through reinvigorating the BBC Russia Service as an independent source of news for Russian speakers”

MA014 UK041 S “support trust- and peace-building efforts by offering an alternative media space for the people in the region [Donetsk and Luhansk regions] to engage with decision makers and civil society in the news and views format”

P “Funding: £350,000”

P “Supports a newly established civic radio broadcaster in becoming an independent and trustworthy information provider to the region, where trust in the central government is low, access to independent media limited and impact of Russian disinformation significant”

P “funding supports installation of 16 FM transmitters in Ukraine-controlled areas along the contact line and “grey zone” in the east, reaching an estimated audience of 2 million people”

5.2.1.2. Sweden Results:

Table 11 Swedish Independent Journalism and Media sub-strategies

ID Source Element Meaning Units

MA015 SE010 S “A pluralistic media landscape [in Russia]”

P “support for independent media in a broad sense, both traditional and new media”

P “In this area, cooperation with the EU and the European External Action Service 'East StratCom Task Force' is important”

MA016 SE017 S “Support free and independent media in the Baltic countries, Ukraine and in the Eastern Partnership”

P “Through the Swedish Institute and the Nordic Council of Ministers, for example, we have supported the new independent Russian-speaking public service channel in Estonia”

5.2.2. Strategic Communication

The second sub-strategy derived from the collection of macro strategies is the strategic communication sub-strategy. This type of macro strategy aims to shape the information environment in order to command the strategic narrative. Fact checking, targeted messaging, communicative capacity building and information dissemination were the policy approaches commonly favoured to achieve this end. Both the UK and Sweden practised this form of sub-strategy, which was identified on 17 occasions via UK sources and on five occasions via Swedish sources.

5.2.2.1. United Kingdom Results:

Table 12 British Strategic Communication sub-strategies

ID Source Element Meaning Units

MA017 UK001 S “tackle harmful disinformation and inaccurate reporting around the world”

P “£8 million of funding for BBC World Service”

MA018 UK003 S “improve our response to disinformation campaigns”

P “Investment in the Government’s behavioural science expertise, horizon scanning and strategic communications”

MA019 UK004 S “tackle mis- and disinformation among ethnic minorities”

P “the government is regularly producing myth-busting content and utilising trusted platforms and messengers within communities and taking specific targeting approaches on social media channels (such as Facebook and Instagram which allows for better targeting)”

P “We are also using native language publisher sites and targeting specific media outlets (Asian Voice, Leader, The Nation, JC and Desi Express) as part of ongoing partnership work”

MA020 UK005 S “mythbust false information about COVID-19 and the vaccine”

P “A cross-government Counter Disinformation Unit”

MA021 UK007 S “address vaccine disinformation”

P “engagement at local level via trusted religious and community leaders, sharing examples of what is known to work well in nearby areas, and encouraging community-led efforts to address vaccine disinformation”

MA022 UK008 S “dispel any vaccine myths and disinformation”

P “established a network of 'Community Champions'”

P “the champions' role developed to become 'vaccine champions' to ensure as many residents as possible are vaccinated, whilst at the same time helping dispel any vaccine myths and disinformation”

P “By the end of 2020, there were 600 community champions”

MA023 UK011 S “countering vaccine disinformation”

P “transparency, openness and proactive and positive communications”

MA024 UK020 S “tackle disinformation about this revolutionary mobile technology [5G]”

P “new guidance on the safety and benefits of 5G so councils can give people the facts”

MA025 UK022 S “reduce the impact of Russian disinformation across wider Europe”

P “From 2017/18 until 2019/20, we have spent £62 million, to reduce the impact of Russian disinformation across wider Europe, through our Counter Disinformation and Media Development programme, funded by the Conflict, Stability and Security Fund (CSSF)”

MA026 UK023 S “combatting a range of harmful online narratives”

P “A small team from the Ministry of Defence, including members of 77 Brigade, is supporting the Cabinet Office’s Rapid Response Unit in its efforts to tackle disinformation”

MA027 UK023 S “combat the spread of harmful, false and misleading narratives”

P “deploying two British Army experts to NATO’s new COVID-19 Communications Hub, where they are helping to lead the fight against disinformation”

MA028 UK027 S “combat false and misleading narratives about coronavirus”

P “The Rapid Response Unit, operating from within the Cabinet Office and No10, is tackling a range of harmful narratives online”

MA029 UK029 S “counter gendered disinformation [in Conflict and Stabilisation Contexts]”

P “develop alternative and counter-narratives to disinformation campaigns”

MA030 UK032 S “Ukrainian government strategic communication capacity building”

P “expert support to help build the strategic communication capability of the Ukrainian government, including via a UK advisor to the Ministry of Foreign Affairs”

P “The focus of this work is on planning, coordination and evaluation of communication outputs, countering the effects of Russian disinformation and improving Ukraine’s international communication efforts”

MA031 UK037 S “countering Russian disinformation and improving Ukraine’s communication of its reform agenda”

P “embedded experts within the Cabinet of Ministers and a UK advisor to the Ministry of Foreign Affairs”

MA032 UK040 S “counter Russian disinformation in the region”

P “new joint strategic communications projects”

P “deliver valuable support to Belsat, a Polish-funded TV channel providing unbiased, free and frank reporting for Belarussians”

P “The UK will provide £5 million”

MA033 UK043 S “improve communications as a means of countering disinformation and propaganda [in Ukraine]”

P “The UK Conflict Stability and Security Fund (CSSF) seeks applications from organisations with experience of working on conflict prevention and mitigation to undertake work that will contribute to strengthening citizens’ engagement in peacebuilding in Ukraine”

P “developing the capacity of the Ukrainian government to design and implement a coherent communications strategy, reaching out to all parts of the population of Ukraine and to the international community”

P “supporting civil society to develop successful campaigns projecting positive narratives about Ukraine”

P “developing the ability of media outlets to report sensitive and controversial issues with accuracy and balance, with a greater understanding of the principles of conflict-sensitive reporting”

5.2.2.2. Sweden Results:

Table 13 Swedish Strategic Communication sub-strategies

ID Source Element Meaning Units

MA034 SE003 S “counteracting disinformation and misleading information regarding the vaccinations against COVID-19”

P “the Swedish Civil Contingencies Agency (MSB) conducts several training sessions to counteract this”

P “The lectures are mainly aimed at communicators and other relevant staff at municipalities, regions and county administrative boards”

MA035 SE004 S “we must ensure that there are reliable sources as a counterforce [to disinformation]”

P “The Swedish Research Council is therefore commissioned, in collaboration with the Swedish Agency for Civil Protection and Emergency Planning and the Swedish Public Health Agency, to conduct communication efforts and contribute to the communication of scientifically substantiated facts about vaccination against COVID-19”

MA036 SE005 S “coordinate, strengthen and develop communication efforts on the COVID-19 pandemic aimed at the general public”

P “Swedish Civil Contingencies Agency, MSB, receive an increased allocation of SEK 89 million kronor”

MA037 SE008 S “detect and counter misleading information and attempts to influence other actors”

P “The Psyops Association contributes to creating resistance to undue information influence through education and information dissemination”

P “The Psyops Association is a nationwide association within the voluntary defence organisation, The Defence Educators, and is a network for people with knowledge and interest in information dissemination and initiatives that contribute to strengthened resilience and information influence within the total defence”

MA038 SE015 S “increase the ability to detect and act when disinformation risks damaging the image of Sweden”

P “The Ministry of Foreign Affairs’ and the Swedish Institute's (SI) external monitoring can be better coordinated through more frequent reporting of disinformation efforts directed at Sweden observed within SI's Big Data analysis”

P “A readiness is needed to deploy a focused communications team (rapid reaction force) with high competence, well-developed plans and decision-making power to be able to act quickly when the image of Sweden is attacked through lies, factual distortion or other discrediting information”

5.2.3. Situational Awareness

The third sub-strategy derived from the collection of macro strategies is the situational awareness sub-strategy. This type of macro strategy seeks to enhance an actor’s situational awareness of disinformation within the information environment. Monitoring, analysis, and risk assessments are generally employed to achieve this effect. Both the UK and Sweden practised this form of sub-strategy, which was identified on two occasions via UK sources and on one occasion via Swedish sources.

5.2.3.1. United Kingdom Results:

Table 14 British Situational Awareness sub-strategies

ID Source Element Meaning Units

MA039 UK014 S “provide a comprehensive picture of the extent, scope and the reach of disinformation and misinformation”

P “Department for Digital, Culture, Media and Sport-led cross-Whitehall Counter Disinformation Unit”

MA040 UK028 S “provide a comprehensive picture on the potential extent, scope and impact of disinformation on coronavirus in the UK”

P “The Department for Digital, Culture, Media and Sport earlier this week announced the creation of a new cross-Government counter disinformation unit bringing together expert teams”

5.2.3.2. Sweden Results:

Table 15 Swedish Situational Awareness sub-strategies

ID Source Element Meaning Units

MA041 SE008 S “identify and meet information influence and other dissemination of misleading information aimed at Sweden”

P “As part of MSB's work to identify, analyse and respond to information influence, a continuous external monitoring, analysis of vulnerabilities in Sweden and attempts at information influence, as well as analysis of the consequences of influence activities, are carried out.”

P “The work also includes responding to information influence by alerting to ongoing influence activities, creating situational awareness and reports, educating and supporting other actors, raising public awareness of threats and vulnerabilities and actively communicating accurate information.”

P “MSB has financed knowledge overviews on information influence from certain environments, produced analyses of threats, risks and vulnerabilities in the media industry, as well as the handbook for communicators, Countering information influence activities”

5.2.4. Civil Society

The fourth sub-strategy derived from the collection of macro strategies is the civil society sub-strategy. This type of macro strategy aims to establish partnerships with civil society to address disinformation. To realise this goal, policy approaches generally involve the use of government grants and embassy collaborations with NGOs and other civil society actors. Both the UK and Sweden practised this form of sub-strategy, which was identified on three occasions via UK sources and on three occasions via Swedish sources.

5.2.4.1. United Kingdom Results:

Table 16 British Civil Society sub-strategies

ID Source Element Meaning Units

MA042 UK025 S “expose and reduce disinformation around COVID-19 in the Ukrainian media space and social networks, in particular, disinformation originating from temporarily occupied territories of and Crimea and from Russia”

P “The British Embassy administers this support to Ukraine through our International Programme COVID-19 Enabling Fund 2020 to 2021”

P “invite proposals for project work through civil society organisations as well as government bodies, in all regions of Ukraine, in support of Ukraine’s response to the consequences of COVID-19”

MA043 UK037 S “tackle the negative effects of disinformation and propaganda about Ukraine (within Ukraine and around the world)”

P “supports the NGO StopFake.org in its fact-checking, debunking and information verification work”

MA044 UK042 S “Promote the economic and societal advantages of EU integration processes in the Republic of Moldova in order to counter disinformation particularly among conflict affected communities”

P “The UK Conflict Stability and Security Fund (CSSF) seeks applications from organisations with experience of working on conflict transformation and confidence building in conflict-affected communities to undertake work that will contribute to strengthening citizens’ and political engagement in the conflict transformation in the Republic of Moldova”

5.2.4.2. Sweden Results:

Table 17 Swedish Civil Society sub-strategies

ID Source Element Meaning Units

MA045 SE009 S “democratic development [overseas]”

P “In 2017, the government allocated funds for a special investment over four years in freedom of expression”

P “Sweden contributed in 2019 to strengthened capacity in civil society organizations and increased resilience among actors in civil society, in particular media, against external influences”

P “Support was provided to, among others, civil society organizations in Russia, to the exchange of journalists through Barents Press, to courses in media management at SSE Riga and for projects through the Fojo: Media Institute at Linnaeus University to strengthen the development of a free, independent and professional media in Eastern Europe”

MA046 SE009 S “counteracting disinformation in the Baltic Sea Region”

P “The grant is proposed to be increased by SEK 10 000 000 in 2021 for a continued effort to increase measures for issues in the areas of freedom of expression, cyber security, contacts with civil society and counteracting disinformation”

MA047 SE012 S “We need to invest more in international relations with countries where democracy is declining and we need to provide even better support to civil society and positive forces in these countries”

P “In Poland, school projects have been done on media and disinformation”

P “A large part of the work is done through our embassies”

5.2.5. Technological

The fifth sub-strategy derived from the collection of macro strategies is the technological sub-strategy. This type of macro strategy backs technological solutions that function to reduce societal disinformation exposure through public-private partnerships. Two distinctive aspects of this approach were distinguished in the analysis. The first aspect entailed top-down software design guidance directed at the private sector, which aimed to steer software development away from designs that limit the opportunities a user has to critically engage with online content in their browsing experience. The second aspect involved the funding of software that could identify and flag potential disinformation for the user in real time, as sketched below.
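To make the second aspect concrete, the following minimal sketch illustrates the general logic of such a flagging tool: assessing the source of a link before the user engages with or shares it. This is an illustration only; the function name, domain lists and labels are hypothetical, and no actual UK-funded product is depicted.

```python
# Minimal sketch of real-time source flagging, of the kind the funded
# "apps and plugins" are described as providing. All domain lists and
# labels here are hypothetical illustrations, not any real tool's data.
from urllib.parse import urlparse

KNOWN_UNRELIABLE = {"example-disinfo.test"}  # hypothetical flagged outlets
KNOWN_TRUSTED = {"gov.uk", "who.int"}        # hypothetical trusted outlets

def flag_source(url: str) -> str:
    """Return a coarse reliability label for a link before it is shared."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host in KNOWN_UNRELIABLE:
        return "warn: source previously associated with disinformation"
    if host in KNOWN_TRUSTED or any(host.endswith("." + d) for d in KNOWN_TRUSTED):
        return "ok: recognised trusted source"
    return "unknown: verify this source before sharing"

print(flag_source("https://www.gov.uk/guidance"))          # ok
print(flag_source("http://example-disinfo.test/article"))  # warn
```

In practice, tools of this kind would rely on continuously maintained databases and richer signals than a static domain list, but the user-facing effect, advising on the reliability of a website before content is consumed or shared, follows the same pattern.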

5.2.5.1. United Kingdom Results:

Table 18 British Technological sub-strategies

ID Source Element Meaning Units

MA048 UK009 S “Flagging of content with false, misleading and/or harmful narratives”

P “Apps and plugins aimed at users can help build audience resilience to both misinformation and disinformation by advising them on the reliability of a website or news article as well as steer them towards trustworthy content”

MA049 UK014 S “address the risk of misinformation and disinformation spreading on services”

P “publishing a safety by design framework”

P “Online services and products can be designed in a way that limits the ability of users to engage critically with online content”

MA050 UK017 S “develop new technologies to help human moderators tackle the spread of online disinformation and identify harms linked to online targeting and manipulation”

P “funding from UK Research and Innovation’s (UKRI) Strategic Priorities Fund”

MA051 UK024 S “identify potential sources of disinformation and promote verified facts”

P “boost the Safety Tech sector in the UK”

MA052 UK030 S “tackle some of the dangers of the online world from privacy abuses and wrongful use of data like disinformation and online fraud”

P “A further project, backed by £18 million government investment, through the Strategic Priorities Fund (SPF)”

P “The project will help understand what businesses and individuals need to reduce the harm they are exposed to by using online platforms and will aim to develop more trustworthy technology”

MA053 UK036 S “development of cutting-edge new technology to combat disinformation online”

P “$250,000 grant”

P “Semantic Visions secured the funding award to finance their platform which provides real-time detection of adversarial propaganda and disinformation and gives user joint situational awareness of event and emerging trends”

5.2.6. Psychological Defence

The sixth sub-strategy derived from the collection of macro strategies is the psychological defence sub-strategy. This type of macro strategy aims to establish a psychological defence against disruptive information activities that target the population’s perceptions with the aim of reducing their willingness to resist attacks directed against the state. This would be realised through an authority with the function to identify, analyse and respond to malign information activities in order to prevent them from influencing the population. This form of sub-strategy was unique to Sweden and was identified on two occasions.

5.2.6.1. Sweden Results:

Table 19 Swedish Psychological Defence sub-strategies

ID Source Element Meaning Units

MA054 SE001 S “identify, analyse, meet and prevent improper information influence and other misleading information aimed at weakening Sweden's resilience and the will of the people to defend themselves or to improperly influence the perceptions, behaviours and decision-making of different target groups”

P “Psychological defence”

MA055 SE002 S “strengthen Sweden's psychological defences”

P “Next year, the government intends to establish a new authority that will have the overall responsibility for developing and coordinating Sweden's psychological defence”

5.2.7. Legislative

The seventh and final sub-strategy derived from the collection of macro strategies is the legislative sub-strategy. This type of macro strategy seeks to shape the law to reduce societal disinformation exposure. Varying legislative functions were identified within this analysis. One proposal required political actors, campaigners, and others to identify who they are when promoting political campaign content online. Another sought to compel private social media firms to spell out, in their terms and conditions, how disinformation on their platforms will be handled, with enforcement and sanction powers available to the state should these firms fail to uphold the agreed terms (a worked illustration of the sanction ceiling follows the results table below). This form of sub-strategy was unique to the UK and was identified on four occasions.

5.2.7.1. United Kingdom Results:

Table 20 British Legislative sub-strategies

ID Source Element Meaning Units

MA056 UK010 S “safeguard UK elections from intimidation, influence and disinformation”

P “implement a digital imprints regime for online campaigning material”

MA057 UK012 S “setting and enforcing clear terms and conditions which explicitly state how [popular social media sites] will handle content”

P “Ofcom is now confirmed as the regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher”

P “It will have the power to block non-compliant services from being accessed in the UK”

P “legislation includes provisions to impose criminal sanctions on senior managers”

MA058 UK013 S “[companies] will need to set out what content, including many types of misinformation and disinformation on social media platforms, such as anti-vaccination content and falsehoods about Covid-19, is and is not acceptable in their terms and conditions”

P “If what is appearing on their platforms doesn’t match up with the promises made to users, Ofcom will be able to take enforcement action”

MA059 UK015 S “limit the spread of vaccine misinformation and disinformation”

P “Facebook, Twitter and Google committed to the principle that no company should profit from or promote COVID-19 anti-vaccine disinformation, to respond to flagged content more swiftly, and to work with authorities to promote scientifically accurate messages”
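As a worked illustration of the sanction ceiling quoted under MA057 above (a fine of up to £18 million or ten per cent of annual global turnover, whichever is higher), the short sketch below computes the upper bound of the available fine. The turnover figures are invented purely for demonstration.

```python
# Worked illustration of the sanction ceiling quoted in MA057: up to
# GBP 18 million or ten per cent of annual global turnover, whichever
# is higher. The turnover figures below are invented for demonstration.
def max_fine(annual_global_turnover: float) -> float:
    """Return the upper bound of the fine available to the regulator."""
    return max(18_000_000, 0.10 * annual_global_turnover)

print(f"£{max_fine(50_000_000):,.0f}")     # smaller firm: the £18m floor applies
print(f"£{max_fine(2_000_000_000):,.0f}")  # large platform: 10% of turnover applies
```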

5.3. Discussion

From the 74 strategies analysed in this study, nine distinct sub-strategies were unveiled based upon discernible commonalities shared between the various micro and macro strategies under investigation. Of the nine distinct sub-strategies identified within this study, the UK and Sweden jointly practised six of them, with two distinct sub-strategies exclusively practised by the UK and one exclusively practised by Sweden.

5.3.1. The Micro Strategy Summarised

Two of the nine distinct sub-strategies identified within this study were of the micro type—a strategy that aimed to enhance an individual’s ability to independently evaluate the accuracy of information. The UK and Sweden jointly practised these two sub-strategies (see table 21).

Table 21 Components of the Micro Strategy

Sub-strategy: Media and Information Literacy
Learning effects: Strengthening critical thinking with an emphasis on improving competence in navigating the digital environment
Necessary means: Educational practices to evaluate, use and create information (media literacy)
Practised by: UK and Sweden

Sub-strategy: Digital Educational Material
Learning effects: Strengthening critical thinking with an emphasis on raising disinformation awareness
Necessary means: Simple online checklists, classroom material or governmental guidance publications
Practised by: UK and Sweden

5.3.2. The Macro Strategy Summarised

Seven of the nine distinct sub-strategies identified within this study were of the macro type—a strategy that aimed to reduce societal disinformation exposure. Four of these sub-strategies were jointly practised by the UK and Sweden, whilst two were exclusively practised by the UK and one was exclusively practised by Sweden (see table 22).

Table 22 Components of the Macro Strategy

Sub-strategy: Independent Journalism and Media
Structural effects: Create a pluralistic media landscape
Policy approach: Creating, supporting, and promoting independent journalism and media
Practised by: UK and Sweden

Sub-strategy: Strategic Communication
Structural effects: Shape the information environment to command the strategic narrative
Policy approach: Fact checking, targeted messaging, communicative capacity building or information dissemination
Practised by: UK and Sweden

Sub-strategy: Situational Awareness
Structural effects: Enhance situational awareness of disinformation within the information environment
Policy approach: Monitoring, analysis, and risk assessments
Practised by: UK and Sweden

Sub-strategy: Civil Society
Structural effects: Establish partnerships with civil society to address disinformation
Policy approach: Government grants and embassy collaborations
Practised by: UK and Sweden

Sub-strategy: Technological
Structural effects: Back technological solutions that function to reduce societal disinformation exposure through public-private partnerships
Policy approach: Software design guidance and government grants for real-time disinformation monitoring
Practised by: UK only

Sub-strategy: Psychological Defence
Structural effects: Establish a psychological defence against disruptive information activities that target the population’s perceptions with the aim of reducing their willingness to resist attacks directed against the state
Policy approach: Establishing an authority to identify, analyse and respond to malign information activities to prevent them from influencing the population
Practised by: Sweden only

Sub-strategy: Legislative
Structural effects: Shape the law to reduce societal disinformation exposure
Policy approach: Digital imprints regime, enforcement and sanction powers against social media firms who fail to adhere to agreed standards
Practised by: UK only

6. Conclusion

This thesis sought to understand what resilience strategies were being employed by the UK and Sweden to combat disinformation, in order to add perspective and repository knowledge to future intervention planning. The findings of this thesis demonstrate that both states share approximately two-thirds of their disinformation resilience strategy with one another—sharing six of the nine distinct sub-strategies identified within this study, while two were exclusively practised by the UK and one was exclusively practised by Sweden.

Using two criteria—micro and macro—this thesis has shown how resilience can be understood at scale across 74 strategies to discern between bottom-up and top-down focused strategies against disinformation. Of the micro kind, which aimed to improve an individual’s ability to make decisions over their own lives, two distinct sub-strategies were identified from 15 micro strategies that sought to enhance the individual’s ability to independently evaluate the accuracy of information. Of the macro kind, which aimed to shape exogenous influences in the environment, seven distinct sub-strategies were identified from 59 macro strategies that sought to reduce societal disinformation exposure. This study, however, is reluctant to infer any bold conclusions from the imbalance observed in the frequency between micro and macro strategies, given the descriptive nature of this thesis and the abridged representation of resilience that was limited by the scope of this research project. Nevertheless, at face value, this study’s findings hint towards a genuine bias in favour of macro strategies from both the UK and Sweden—which, given the issues identified within this study’s resilience chapter concerning top-down strategies overlooking the role of individual cognition, ought to call for additional research to assess the merits of this hypothesis (Lazer et al., 2018; Nyhan et al., 2014).

The breadth of disinformation resilience strategies uncovered in these findings provides a preliminary descriptive framework for scholars who are examining plausible interventions against the threat of disinformation. This might aid questions such as: what is being done to combat disinformation? What are the different forms of resilience that can be undertaken against disinformation? How can civil society contribute towards disinformation resilience? And of course, how are the United Kingdom and Sweden combatting disinformation? Furthermore, in accordance with this study’s aim to contribute towards future disinformation intervention planning, this thesis encourages the undertaking of a normative research project to evaluate the impact and effectiveness of the various resilience strategies uncovered in this analysis, for the purposes of providing prescriptive guidance to combat disinformation.

References

Literature

Arayankalam, J., & Krishnan, S. (2021). Relating Foreign Disinformation through Social Media, Domestic Online Media Fractionalization, Government’s Control over Cyberspace, and Social Media-induced Offline Violence: Insights from the Agenda-building Theoretical Perspective. Technological Forecasting and Social Change, 166, 120661. https://doi.org/10.1016/j.techfore.2021.120661

Aslama, M. (2019). Disinformation as Warfare in the Digital Age: Dimensions, Dilemmas, and Solutions. Journal of Vincentian Social Action, 4(2), 6–21.

Atlantic Council. (2019). Democratic Defense Against Disinformation 2.0. https://www.atlanticcouncil.org/in-depth-research-reports/report/democratic-defense-against-disinformation-2-0/ [Accessed 16th May 2021].

Baines, P., & Jones, N. (2018). Influence and Interference in Foreign Elections: The Evolution of its Practice. The RUSI Journal, 163(1), 12–19. https://doi.org/10.1080/03071847.2018.1446723

Béné, C., Headey, D., Haddad, L., & von Grebmer, K. (2016). Is Resilience a Useful Concept in the Context of Food Security and Nutrition Programmes? Some Conceptual and Practical Considerations. Food Security, 8(1), 123–138. https://doi.org/10.1007/s12571-015-0526-x

Bradshaw, S., Bailey, H., & Howard, P. N. (2020). Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation. University of Oxford. https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/127/2021/02/CyberTroop-Report20-Draft9.pdf [Accessed 16th May 2021].

Braw, E. (2020). The Case for National Resilience Training for Teenagers. Royal United Services Institute for Defence and Security Studies. https://rusi.org/sites/default/files/rusi_pub_174_2019_12_resilience_braw_final.pdf [Accessed 16th May 2021].

Brookings. (2021, January). How Disinformation Evolved in 2020. https://www.brookings.edu/techstream/how-disinformation-evolved-in-2020/ [Accessed 16th May 2021].

Buzzfeed News. (2020). Disinformation for Hire: How A New Breed of PR Firms is Selling Lies Online. https://www.buzzfeednews.com/article/craigsilverman/disinformation-for-hire-black-pr-firms [Accessed 16th May 2021].

Carter, E. B., & Carter, B. L. (2021). Questioning More: RT, Outward-Facing Propaganda, and the Post-West World Order. Security Studies, 30(1), 49–78. https://doi.org/10.1080/09636412.2021.1885730

Chambliss, D. F., & Schutt, R. K. (Eds.). (2019). Conceptualization and Measurement. In Making Sense of the Social World: Methods of Investigation (Sixth edition, pp. 91– 117). SAGE.

Chandler, D. (2012). Resilience and Human Security: The Post-Interventionist Paradigm. Security Dialogue, 43(3), 213–229. https://doi.org/10.1177/0967010612444151

Chandler, D. (2014). Resilience: The Governance of Complexity. Routledge.

Cisco. (2020). Cisco Annual Internet Report (2018–2023) [White Paper]. https://www.cisco.com/c/en/us/solutions/collateral/executive-perspectives/annual-internet-report/white-paper-c11-741490.pdf [Accessed 16th May 2021].

Council of the European Union. (2020). Council calls for strengthening resilience and countering hybrid threats, including disinformation in the context of the COVID-19 pandemic. https://www.consilium.europa.eu/en/press/press-releases/2020/12/15/council-calls-for-strengthening-resilience-and-countering-hybrid-threats-including-disinformation-in-the-context-of-the-covid-19-pandemic/ [Accessed 16th May 2021].

Creswell, J. W., & Creswell, J. D. (2018). Research Design: Qualitative, Quantitative, and Mixed Methods Approaches (Fifth edition). SAGE.

Cutter, S. L., Barnes, L., Berry, M., Burton, C., Evans, E., Tate, E., & Webb, J. (2008). A Place-based Model for Understanding Community Resilience to Natural Disasters. Global Environmental Change, 18(4), 598–606. https://doi.org/10.1016/j.gloenvcha.2008.07.013

Deibler, J. (2020). Contesting the Psychological Domain during Great Power Competition. Global Security and Intelligence Studies, 5(1), 125–150. https://doi.org/10.18278/gsis.5.1.9

DIIS. (2020). Russian Disinformation: An Example. https://www.diis.dk/en/russian-disinformation-an-example [Accessed 16th May 2021].

Dreze, J. (2000). Militarism, Development and Democracy. Economic and Political Weekly, 35(14), 1171–1183. https://doi.org/10.2307/4409112

Duit, A., Galaz, V., Eckerberg, K., & Ebbesson, J. (2010). Governance, Complexity, and Resilience. Global Environmental Change, 20(3), 363–368. https://doi.org/10.1016/j.gloenvcha.2010.04.006

EU DisinfoLab. (2020). Being Cautious with Attribution: Foreign Interference & COVID-19 Disinformation. https://www.disinfo.eu/wp-content/uploads/2020/04/20200414_foreignintereferencecovid19-1.pdf [Accessed 16th May 2021].

European Commission. (2017). A Strategic Approach to Resilience in the EU’s External Action. https://eeas.europa.eu/sites/default/files/join_2017_21_f1_communication_from_commission_to_inst_en_v7_p1_916039.pdf [Accessed 16th May 2021].

European Commission. (2018). Action Plan Against Disinformation. https://eeas.europa.eu/sites/default/files/action_plan_against_disinformation.pdf [Accessed 16th May 2021].

Fallis, D. (2015). What is Disinformation? Library Trends, 63(3), 401–426. https://doi.org/10.1353/lib.2015.0014

FMPRC. (2020). Chinese and Russian Foreign Ministry Spokespersons Held Consultations and Agreed to Cooperate in Combating Disinformation. https://www.fmprc.gov.cn/mfa_eng/wjbxw/t1800619.shtml [Accessed 16th May 2021].

Forman, J., & Damschroder, L. (2007). Qualitative Content Analysis. In Advances in Bioethics (Vol. 11, pp. 39–62). Elsevier. https://doi.org/10.1016/S1479-3709(07)11003-7

Försvarsdepartementet. (2020). Regeringens proposition 2020/21:30—Totalförsvaret 2021–2025. https://www.regeringen.se/4a965d/globalassets/regeringen/dokument/forsvarsdepartementet/forsvarsproposition-2021-2025/totalforsvaret-2021-2025-prop.-20202130.pdf [Accessed 16th May 2021].

Freedom House. (2017). Manipulating Social Media to Undermine Democracy. https://freedomhouse.org/report/freedom-net/2017/manipulating-social-media-undermine-democracy [Accessed 16th May 2021].

Gallacher, J. D., Heerdink, M. W., & Hewstone, M. (2021). Online Engagement Between Opposing Political Protest Groups via Social Media is Linked to Physical Violence of Offline Encounters. Social Media + Society, 7(1), 205630512098444. https://doi.org/10.1177/2056305120984445

Graneheim, U. H., & Lundman, B. (2004). Qualitative Content Analysis in Nursing Research: Concepts, Procedures and Measures to Achieve Trustworthiness. Nurse Education Today, 24(2), 105–112. https://doi.org/10.1016/j.nedt.2003.10.001

Grant, T. D. (2015). Annexation of Crimea. The American Journal of International Law, 109(1), 68. https://doi.org/10.5305/amerjintelaw.109.1.0068

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A Digital Media Literacy Intervention Increases Discernment Between Mainstream and False News in the United States and India. Proceedings of the National Academy of Sciences, 117(27), 15536–15545. https://doi.org/10.1073/pnas.1920498117

Guess, A. M., & Lyons, B. A. (2020). Misinformation, Disinformation, and Online Propaganda. In N. Persily & J. A. Tucker (Eds.), Social Media and Democracy (1st ed., pp. 10–33). Cambridge University Press. https://doi.org/10.1017/9781108890960.003

Haack, S. (2019). Post “Post‐Truth”: Are We There Yet? Theoria, 85(4), 258–275. https://doi.org/10.1111/theo.12198

Humprecht, E., Esser, F., & Van Aelst, P. (2020). Resilience to Online Disinformation: A Framework for Cross-National Comparative Research. The International Journal of Press/Politics, 25(3), 493–516. https://doi.org/10.1177/1940161219900126

Hussain, M. M. (2012). Journalism’s Digital Disconnect: The Growth of Campaign Content and Entertainment Gatekeepers in Viral Political Information. Journalism: Theory, Practice & Criticism, 13(8), 1024–1040. https://doi.org/10.1177/1464884911433253

Internetkunskap. (2021). Behöver vi oroa oss för desinformation? https://internetkunskap.se/desinformation/behover-vi-oroa-oss-for-desinformation/ [Accessed 16th May 2021].

Jack, C. (2017). Lexicon of Lies: Terms for Problematic Information (p. 20). Data & Society. https://datasociety.net/wp-content/uploads/2017/08/DataAndSociety_LexiconofLies.pdf [Accessed 16th May 2021].

Joseph, J. (2016). Governing through Failure and Denial: The New Resilience Agenda. Millennium: Journal of International Studies, 44(3), 370–390. https://doi.org/10.1177/0305829816638166

Kania, E. B. (2020). The Ideological Battlefield. In C. Whyte, A. T. Thrall, & B. M. Mazanec (Eds.), Information Warfare in the Age of Cyber Conflict (pp. 42–53). Routledge/Taylor & Francis Group.

Karlsen, G. H. (2019). Divide and Rule: Ten Lessons about Russian Political Influence Activities in Europe. Palgrave Communications, 5(1), 19. https://doi.org/10.1057/s41599-019-0227-8

Kavanagh, J., & Rich, M. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation. https://doi.org/10.7249/RR2314

Khaldarova, I., & Pantti, M. (2016). Fake News: The Narrative Battle over the Ukrainian Conflict. Journalism Practice, 10(7), 891–901. https://doi.org/10.1080/17512786.2016.1163237

Klein, H. (2018). Information Warfare and Information Operations: Russian and U.S. Perspectives. Journal of International Affairs, 71(1.5), 135–142.

Kornienko, A. A., Kornienko, A. V., Fofanov, O. B., & Chubik, M. P. (2015). The Nature of Knowledge Power in Communicative Information Society. Procedia - Social and Behavioral Sciences, 166, 595–600. https://doi.org/10.1016/j.sbspro.2014.12.579

Kremlin Russia. (2014). Vladimir Putin Answered Journalists’ Questions on the Situation in Ukraine. http://en.kremlin.ru/events/president/news/20366 [Accessed 16th May 2021].

Krippendorff, K. (2004). Content Analysis: An Introduction to its Methodology (2nd ed). Sage.

Landman, T. (2008). Issues and Methods in Comparative Politics: An Introduction (3rd ed). Routledge.

Lanoszka, A. (2019). Disinformation in International Politics. European Journal of International Security, 4(2), 227–248. https://doi.org/10.1017/eis.2019.6

Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The Science of Fake News. Science, 359(6380), 1094–1096. https://doi.org/10.1126/science.aao2998

Library of Congress. (2020). Government Responses to Disinformation on Social Media Platforms: Comparative Summary. https://www.loc.gov/law/help/social-media-disinformation/compsum.php [Accessed 16th May 2021].

Lin, C. A. (2019). The Challenge of Information and Communication Divides in the Age of Disruptive Technology. Journal of Broadcasting & Electronic Media, 63(4), 587–594. https://doi.org/10.1080/08838151.2019.1699677

Lynch, J., & McGoldrick, A. (2007). Peace Journalism. In C. Webel & J. Galtung (Eds.), Handbook of Peace and Conflict Studies (pp. 248–264). Routledge.

McAslan, A. (2010). The Concept of Resilience: Understanding its Origins, Meaning and Utility (p. 13). Torrens Resilience Institute. https://www.flinders.edu.au/content/dam/documents/research/torrens-resilience-institute/resilience-origins-and-utility.pdf [Accessed 16th May 2021].

Meinert, J., Mirbabaie, M., Dungs, S., & Aker, A. (2018). Is It Really Fake? – Towards an Understanding of Fake News in Social Media Communication. In G. Meiselwitz (Ed.), Social Computing and Social Media. User Experience and Behavior (Vol. 10913, pp. 484–497). Springer International Publishing. https://doi.org/10.1007/978-3-319-91521-0_35

Mejias, U. A., & Vokuev, N. E. (2017). Disinformation and the Media: The Case of Russia and Ukraine. Media, Culture & Society, 39(7), 1027–1042. https://doi.org/10.1177/0163443716686672

Merriam, S. B., & Tisdell, E. J. (2015). Part One: The Design of Qualitative Research. In Qualitative Research: A Guide to Design and Implementation (Fourth edition, pp. 1–104). John Wiley & Sons.

MSB. (2018). Att Möta Informationspåverkan: Handbok för Kommunikatörer. https://rib.msb.se/filer/pdf/28778.pdf [Accessed 16th May 2021].

MSB. (2019). Countering Information Influence Activities: A Handbook for Communicators. (p. 46). https://www.msb.se/RibData/Filer/pdf/28698.pdf [Accessed 16th May 2021].

NATO. (2020). NATO’s Approach to Countering Disinformation: A focus on COVID-19. https://www.nato.int/cps/en/natohq/177273.htm [Accessed 16th May 2021].

NATO StratCom. (2015). Analysis of Russia’s Information Campaign against Ukraine (p. 40). https://stratcomcoe.org/cuploads/pfiles/russian_information_campaign_public_12012016fin.pdf [Accessed 16th May 2021].

Newman, N., Fletcher, R., Schulz, A., Andi, S., & Nielsen, R. K. (2020). Reuters Institute Digital News Report 2020 (p. 111). Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2020-06/DNR_2020_FINAL.pdf [Accessed 16th May 2021].

Noble, H., & Smith, J. (2015). Issues of Validity and Reliability in Qualitative Research. Evidence Based Nursing, 18(2), 34–35. https://doi.org/10.1136/eb-2015-102054

Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective Messages in Vaccine Promotion: A Randomized Trial. Pediatrics, 133(4). https://doi.org/10.1542/peds.2013-2365

Official Journal of the European Union. (2018). EU strategic communication to counteract anti-EU propaganda by third parties (C 224/58). European Parliament. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52016IP0441&from=EN [Accessed 16th May 2021].

Padovani, C., & Pavan, E. (2011). Actors and Interactions in Global Communication Governance: The Heuristic Potential of a Network Approach. In R. Mansell & M. Raboy (Eds.), The Handbook of Global Media and Communication Policy (pp. 543–563). Wiley-Blackwell. https://doi.org/10.1002/9781444395433.ch33

Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan, N., & Hoagwood, K. (2015). Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. Administration and Policy in Mental Health and Mental Health Services Research, 42(5), 533–544. https://doi.org/10.1007/s10488-013-0528-y

Petty, R. E., & Cacioppo, J. T. (1986). The Elaboration Likelihood Model of Persuasion. In Advances in Experimental Social Psychology (Vol. 19, pp. 123–205). Elsevier. https://doi.org/10.1016/S0065-2601(08)60214-2

Pew Research Center. (2017). The Future of Truth and Misinformation Online. https://www.pewresearch.org/internet/2017/10/19/the-future-of-truth-and-misinformation-online/ [Accessed 16th May 2021].

Rainsel, D., Gantz, J., & Rydning, J. (2018). The Digitization of the World: From Edge to Core [White Paper]. IDC. https://www.seagate.com/files/www-content/our-story/trends/files/idc-seagate-dataage-whitepaper.pdf [Accessed 16th May 2021].

Ramsbotham, O., Woodhouse, T., & Miall, H. (2017a). Conflict Resolution, the Media and the Communications Revolution. In Contemporary Conflict Resolution (Fourth edition, pp. 420–442). Polity Press.

Ramsbotham, O., Woodhouse, T., & Miall, H. (2017b). Preventing Violent Conflict. In Contemporary Conflict Resolution (Fourth edition, pp. 144–172). Polity Press.

RFE/RL. (2019). From ‘Not Us’ To ‘Why Hide It?’: How Russia Denied Its Crimea Invasion, Then Admitted It. https://www.rferl.org/a/from-not-us-to-why-hide-it-how-russia-denied-its-crimea-invasion-then-admitted-it/29791806.html [Accessed 16th May 2021].

Roberts, P., Priest, H., & Traynor, M. (2006). Reliability and Validity in Research. Nursing Standard, 20(44), 41–45. https://doi.org/10.7748/ns.20.44.41.s56

Rtec Auto Design. (2016). Official Video REVERE NOT REVENGE viral making of the Range Rover #cheater which went viral. Youtube. https://www.youtube.com/watch?time_continue=2&v=rgzsod1sDbc [Accessed 16th May 2021].

Ruhmann, I., & Bernhardt, U. (2019). Information Warfare – From Doctrine to Permanent Conflict. In C. Reuter (Ed.), Information Technology for Peace and Security (pp. 63–82). Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-25652-4_4

Sarts, J. (2021). Disinformation as a Threat to National Security. In S. Jayakumar, B. Ang, & N. D. Anwar, Disinformation and Fake News (pp. 23–33).

Select Committee on International Relations. (2018). UK Foreign Policy in a Shifting World Order (HL Paper 250; p. 117). UK House of Lords. https://publications.parliament.uk/pa/ld201719/ldselect/ldintrel/250/25006.htm [Accessed 16th May 2021].

Sharkey, J. (1993). DisInformation, DatInformation. Peace Review, 5(1), 65–69. https://doi.org/10.1080/10402659308425695

Singh, J. P. (2002). Negotiating Regime Change: The Weak, the Strong and the WTO Telecom Accord. In J. N. Rosenau & J. P. Singh (Eds.), Information Technologies and Global Politics: The Changing Scope of Power and Governance (pp. 239–272). State University of New York Press.

Singh, J. P. (2013). Information Technologies, Meta-power, and Transformations in Global Politics. International Studies Review, 15(1), 5–29. https://doi.org/10.1111/misr.12025

Smith, P. A. (1989). On Political War. National Defense University Press Publications.

Sturgess, P. (2015). Measuring Resilience. Evidence on Demand. https://doi.org/10.12774/eod_tg.may2016.sturgess2

Swedish Government. (2017). Speech by Minister for Defence Peter Hultqvist on Northern European Security. https://www.government.se/speeches/2017/05/speech-by-minister-for-defence-peter-hultqvist-on-northern-european-security/ [Accessed 16th May 2021].

Taber, C. S., & Lodge, M. (2006). Motivated Skepticism in the Evaluation of Political Beliefs. American Journal of Political Science, 50(3), 755–769. https://doi.org/10.1111/j.1540-5907.2006.00214.x

Taddicken, M., & Wolff, L. (2020). ‘Fake News’ in Science Communication: Emotions and Strategies of Coping with Dissonance Online. Media and Communication, 8(1), 206–217. https://doi.org/10.17645/mac.v8i1.2495

Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of Mixed Methods in Social & Behavioral Research. SAGE Publications.

Taylor, P. M. (2003). Introduction Looking Through a Glass Onion: Propaganda, Psychological Warfare and Persuasion. In Munitions of the Mind: A History of Propaganda from the Ancient World to the Present Era (3rd ed, pp. 1–16). Manchester University Press.

The Guardian. (2014). Conflict Fears Rise After Pro-Russian Gunmen Seize Crimean Parliament. https://www.theguardian.com/world/2014/feb/24/ukraine-crimea-russia-secession [Accessed 16th May 2021].

Tormala, Z. L., & Petty, R. E. (2004). Source Credibility and Attitude Certainty: A Metacognitive Analysis of Resistance to Persuasion. Journal of Consumer Psychology, 14(4), 427–442. https://doi.org/10.1207/s15327663jcp1404_11

UK Government. (2021). Global Britain in a Competitive Age: The Integrated Review of Security, Defence, Development and Foreign Policy. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/975077/Global_Britain_in_a_Competitive_Age-_the_Integrated_Review_of_Security__Defence__Development_and_Foreign_Policy.pdf [Accessed 16th May 2021].

UK Government Communication Service. (2019). RESIST Counter Disinformation Toolkit (p. 72). https://3x7ip91ron4ju9ehf2unqrm1-wpengine.netdna-ssl.com/wp-content/uploads/2020/03/RESIST-Counter-Disinformation-Toolkit.pdf [Accessed 16th May 2021].

UK Ministry of Defence. (2018). Information Advantage (JCN 2/18). https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/919262/20200909-JCN_2_18_Information_Advantage_accessible.pdf [Accessed 16th May 2021].

U.S. Department of Defense. (2016). Department of Defense Strategy for Operations in the Information Environment. https://dod.defense.gov/Portals/1/Documents/pubs/DoD-Strategy-for-Operations-in-the-IE-Signed-20160613.pdf [Accessed 16th May 2021].

Vaccari, C. (2017). Online Mobilization in Comparative Perspective: Digital Appeals and Political Engagement in Germany, Italy, and the United Kingdom. Political Communication, 34(1), 69–88. https://doi.org/10.1080/10584609.2016.1201558

Vargo, C. J., Guo, L., & Amazeen, M. A. (2018). The Agenda-setting Power of Fake News: A Big Data Analysis of the Online Media Landscape from 2014 to 2016. New Media & Society, 20(5), 2028–2049. https://doi.org/10.1177/1461444817712086

Vartapetiance, A., & Gillam, L. (2014). Deception Detection: Dependable or Defective? Social Network Analysis and Mining, 4(1), 166. https://doi.org/10.1007/s13278-014-0166-8

Vraga, E. K., & Bode, L. (2020). Defining Misinformation and Understanding its Bounded Nature: Using Expertise and Evidence for Describing Misinformation. Political Communication, 37(1), 136–144. https://doi.org/10.1080/10584609.2020.1716500

Waisbord, S. (2018). Truth is What Happens to News: On Journalism, Fake News, and Post-truth. Journalism Studies, 19(13), 1866–1878. https://doi.org/10.1080/1461670X.2018.1492881

Waltzman, R. (2017). The Weaponization of Information: The Need for Cognitive Security. RAND Corporation. https://doi.org/10.7249/CT473

Wanless, A., & Pamment, J. (2019). How do you Define a Problem like Influence? Journal of Information Warfare, 18(3), 1–14.

Ward, M., & Beyer, J. (2018). Vulnerable Landscapes: Case Studies of Violence and Disinformation. The Wilson Center. https://www.wilsoncenter.org/sites/default/files/media/documents/publication/vulnerable_landscapes_case_studies.pdf [Accessed 16th May 2021].

Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Council of Europe. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c [Accessed 16th May 2021].

Weng, L., Flammini, A., Vespignani, A., & Menczer, F. (2012). Competition Among Memes in a World with Limited Attention. Scientific Reports, 2(1), 335. https://doi.org/10.1038/srep00335

Whyte, C., & Etudo, U. (2020). Cyber by a Different Logic. In C. Whyte, A. T. Thrall, & B. M. Mazanec (Eds.), Information Warfare in the Age of Cyber Conflict (pp. 114–131). Routledge/Taylor & Francis Group.

Whyte, C., Thrall, A. T., & Mazanec, B. M. (Eds.). (2020). Introduction. In Information Warfare in the Age of Cyber Conflict (pp. 1–11). Routledge/Taylor & Francis Group.

Wohlforth, W. C. (2020). Realism and Great Power Subversion. International Relations, 34(4), 459–481. https://doi.org/10.1177/0047117820968858

Woolley, S. C., & Howard, P. N. (Eds.). (2019a). Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media. Oxford University Press.

Woolley, S. C., & Howard, P. N. (Eds.). (2019b). Introduction. In Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media (pp. 3–18). Oxford University Press.

World Economic Forum. (2021). The Global Risks Report 2021 (16th Edition). http://www3.weforum.org/docs/WEF_The_Global_Risks_Report_2021.pdf [Accessed 16th May 2021].

Wu, L., Morstatter, F., Carley, K. M., & Liu, H. (2019). Misinformation in Social Media: Definition, Manipulation, and Detection. ACM SIGKDD Explorations Newsletter, 21(2), 80–90. https://doi.org/10.1145/3373464.3373475

Data collection

SE001, Sveriges regering. (2021). Struktur för ökad motståndskraft [Structure for increased resilience], SOU 2021:25. https://www.regeringen.se/rattsliga-dokument/statens-offentliga-utredningar/2021/04/sou-202125/ [Accessed 10th May 2021].

SE002, Sveriges regering. (2021). Anförande av Mikael Damberg på konferensen Mötesplats Samhällssäkerhet [Speech by Mikael Damberg at the Mötesplats Samhällssäkerhet conference]. https://www.regeringen.se/tal/2021/03/anforande-av-mikael-damberg-pa-konferensen-motesplats-samhallssakerhet/ [Accessed 10th May 2021].

SE003, Sveriges regering. (2021). Mikael Damberg deltar på föreläsning om desinformation kring vaccin [Mikael Damberg takes part in a lecture on disinformation about vaccines]. https://www.regeringen.se/pressmeddelanden/2021/03/mikael-damberg-deltar-pa-forelasning-om-desinformation-kring-vaccin/ [Accessed 10th May 2021].

SE004, Sveriges regering. (2021). Vetenskapsrådet ges i uppdrag att bedriva kommunikationsinsatser om vaccinationer [The Swedish Research Council is tasked with carrying out communication initiatives on vaccinations]. https://www.regeringen.se/pressmeddelanden/2021/02/vetenskapsradet-ges-i-uppdrag-att-bedriva-kommunikationsinsatser-om-vaccinationer/ [Accessed 10th May 2021].

SE005, Sveriges regering. (2021). Ökat anslag med 89 miljoner kronor till MSB för insatser under coronapandemin [Increased funding of SEK 89 million to MSB for efforts during the coronavirus pandemic]. https://www.regeringen.se/pressmeddelanden/2021/01/okat-anslag-med-89-miljoner-kronor-till-msb-for-insatser-under-coronapandemin/ [Accessed 10th May 2021].

SE006, Sveriges regering. (2020). Forskning, frihet, framtid – kunskap och innovation för Sverige [Research, freedom, future – knowledge and innovation for Sweden], Prop. 2020/21:60. https://www.regeringen.se/rattsliga-dokument/proposition/2020/12/forskning-frihet-framtid--kunskap-och-innovation-for-sverige/ [Accessed 10th May 2021].

SE007, Sveriges regering. (2020). Åtgärder mot terrorism sedan slutet av 2014 [Measures against terrorism since the end of 2014]. https://www.regeringen.se/regeringens-politik/bekampning-av-terrorism/atgarder-mot-terrorism-sedan-slutet-av-2014/ [Accessed 10th May 2021].

SE008, Sveriges regering. (2020). Det demokratiska samtalet i en digital tid – Så stärker vi motståndskraften mot desinformation, propaganda och näthat [The democratic conversation in a digital age – how we strengthen resilience against disinformation, propaganda and online hate]. https://www.regeringen.se/rattsliga-dokument/statens-offentliga-utredningar/2020/09/sou-202056/ [Accessed 10th May 2021].

SE009, Sveriges regering. (2020). Budgetpropositionen för 2021 [The Budget Bill for 2021], Prop. 2020/21:1. https://www.regeringen.se/rattsliga-dokument/proposition/2020/09/prop.-2020211/ [Accessed 10th May 2021].

SE010, Sveriges regering. (2020). Strategi för Sveriges stöd till demokrati, mänskliga rättigheter och miljö i Ryssland 2020–2024 [Strategy for Sweden's support for democracy, human rights and the environment in Russia 2020–2024]. https://www.regeringen.se/land--och-regionsstrategier/2020/02/strategi-for-sveriges-stod-till-demokrati-manskliga-rattigheter-och-miljo-i-ryssland-20202024/ [Accessed 10th May 2021].

SE011, Sveriges regering. (2019). Uppdrag till Statens medieråd att förstärka arbetet för ökad medie- och informationskunnighet [Commission to the Swedish Media Council to strengthen its work on increased media and information literacy]. https://www.regeringen.se/regeringsuppdrag/2019/09/uppdrag-till-statens-medierad-att-forstarka-arbetet-for-okad-medie--och-informationskunnighet/ [Accessed 10th May 2021].

SE012, Sveriges regering. (2019). Anförande av utrikesminister Margot Wallström vid lansering av rapporter om mänskliga rättigheter, demokrati och rättsstatens principer [Speech by Minister for Foreign Affairs Margot Wallström at the launch of reports on human rights, democracy and the rule of law]. https://www.regeringen.se/tal/2019/02/anforande-vid-lansering-av-rapporter-om-mr-demokrati-och-rattsstatens-principer-uppsala-universitet/ [Accessed 10th May 2021].

SE013, Sveriges regering. (2018). Uppdrag till Statens medieråd att utveckla former för en förstärkt samverkan av insatser för medie- och informationskunnighet (MIK) [Commission to the Swedish Media Council to develop forms for strengthened coordination of media and information literacy (MIL) initiatives]. https://www.regeringen.se/regeringsuppdrag/2018/09/uppdrag-till-statens-medierad-att-utveckla-former-for-en-forstarkt-samverkan-av-insatser-for-medie--och-informationskunnighet-mik/ [Accessed 10th May 2021].

SE014, Sveriges regering. (2018). En ny myndighet för psykologiskt försvar [A new agency for psychological defence], Dir. 2018:80. https://www.regeringen.se/rattsliga-dokument/kommittedirektiv/2018/08/dir.-201880/ [Accessed 10th May 2021].

SE015, Sveriges regering. (2018). Bättre kommunikation för fler investeringar [Better communication for more investment], SOU 2018:56. https://www.regeringen.se/rattsliga-dokument/statens-offentliga-utredningar/2018/06/sou-201856/ [Accessed 10th May 2021].

SE016, Sveriges regering. (2017). Handlingsplan: Till det fria ordets försvar – Åtgärder mot utsatthet för hot och hat bland journalister, förtroendevalda och konstnärer [Action plan: In defence of free speech – measures against exposure to threats and hatred among journalists, elected representatives and artists]. https://www.regeringen.se/informationsmaterial/2017/07/regeringen-antar-handlingsplan-mot-hat-och-hot-mot-journalister-fortroendevalda-och-konstnarer/ [Accessed 10th May 2021].

SE017, Sveriges regering. (2016). Margot Wallströms tal vid Folk och Försvars rikskonferens [Margot Wallström's speech at the Folk och Försvar national conference]. https://www.regeringen.se/tal/2016/01/ums-tal-vid-folk-och-forsvar/ [Accessed 10th May 2021].

UK001, UK Government. (2021). Foreign Secretary boosts BBC funding to fight fake news. https://www.gov.uk/government/news/foreign-secretary-boosts-bbc-funding-to-fight-fake-news [Accessed 10th May 2021].

UK002, UK Government. (2021). Official Development Assistance (ODA): FCDO International Programme spend objectives 2020 to 2021. https://www.gov.uk/government/publications/official-development-assistance-oda-fcdo-international-programme-spend-objectives-2020-to-2021 [Accessed 10th May 2021].

UK003, UK Government. (2021). Global Britain in a Competitive Age: The Integrated Review of Security, Defence, Development and Foreign Policy. https://www.gov.uk/government/publications/global-britain-in-a-competitive-age-the-integrated-review-of-security-defence-development-and-foreign-policy [Accessed 10th May 2021].

UK004, UK Government. (2021). Second quarterly report on progress to address COVID-19 health inequalities. https://www.gov.uk/government/publications/second-quarterly-report-on-progress-to-address-covid-19-health-inequalities [Accessed 10th May 2021].

UK005, UK Government. (2021). Government updates on identifying and tackling COVID-19 disparities. https://www.gov.uk/government/news/government-updates-on-identifying-and-tackling-covid-19-disparities [Accessed 10th May 2021].

UK006, UK Government. (2021). Government response to the House of Lords Select Committee on Artificial Intelligence. https://www.gov.uk/government/publications/government-response-to-the-house-of-lords-select-committee-on-artificial-intelligence [Accessed 10th May 2021].

UK007, UK Government. (2021). COVID-19 Response—Spring 2021 (Roadmap). https://www.gov.uk/government/publications/covid-19-response-spring-2021 [Accessed 10th May 2021].

UK008, UK Government. (2021). UK COVID-19 vaccination uptake plan. https://www.gov.uk/government/publications/covid-19-vaccination-uptake-plan [Accessed 10th May 2021].

UK009, UK Government. (2020). Directory of UK Safety Tech Providers. https://www.gov.uk/government/publications/directory-of-uk-safety-tech-providers [Accessed 10th May 2021].

UK010, UK Government. (2020). Intimidation in Public Life: Progress report on recommendations. https://www.gov.uk/government/publications/intimidation-in-public-life-progress-report-on-recommendations [Accessed 10th May 2021].

UK011, UK Government. (2020). Argentina and the UK Joint Webinar: COVID-19 vaccines in Latin America and the Caribbean. https://www.gov.uk/government/news/argentina-and-the-uk-joint-webinar-covid-19-vaccines-in-latin-america-and-the-caribbean [Accessed 10th May 2021].

UK012, UK Government. (2020). UK leads the way in a ‘new age of accountability’ for social media. https://www.gov.uk/government/news/uk-leads-the-way-in-a-new-age-of-accountability-for-social-media [Accessed 10th May 2021].

UK013, UK Government. (2020). Fact sheet—Online Harms Full Government Response. https://www.gov.uk/government/publications/fact-sheet-online-harms-full-government-response [Accessed 10th May 2021].

UK014, UK Government. (2020). Online Harms White Paper. https://www.gov.uk/government/consultations/online-harms-white-paper [Accessed 10th May 2021].

UK015, UK Government. (2020). Social media giants agree package of measures with UK Government to tackle vaccine disinformation. https://www.gov.uk/government/news/social-media-giants-agree-package-of-measures-with-uk-government-to-tackle-vaccine-disinformation [Accessed 10th May 2021].

UK016, UK Government. (2020). Libraries Taskforce: Six month progress report (October 2019 to March 2020). https://www.gov.uk/government/publications/libraries-taskforce-six-month-progress-report-october-2019-to-march-2020 [Accessed 10th May 2021].

UK017, UK Government. (2020). £29 million government funding to boost digital revolution and help keep people safe online. https://www.gov.uk/government/news/29-million-government-funding-to-boost-digital-revolution-and-help-keep-people-safe-online [Accessed 10th May 2021].

UK018, UK Government. (2020). British Embassy Bishkek: Call for 2020-2021 project bids. https://www.gov.uk/government/publications/british-embassy-bishkek-call-for-2020-2021-project-bids [Accessed 10th May 2021].

UK019, UK Government. (2020). UK to increase support to civil society and independent media in Belarus. https://www.gov.uk/government/news/uk-to-increase-support-to-civil-society-and-independent-media-in-belarus [Accessed 10th May 2021].

UK020, UK Government. (2020). Ministers call on councils to help deliver digital connectivity ambitions. https://www.gov.uk/government/news/ministers-call-on-councils-to-help-deliver-digital-connectivity-ambitions [Accessed 10th May 2021].

UK021, UK Government. (2020). Prague Programme Fund: Call for bids 2020/21. https://www.gov.uk/government/news/prague-programme-fund-call-for-bids-202021 [Accessed 10th May 2021].

UK022, UK Government. (2020). Human Rights and Democracy Report 2019. https://www.gov.uk/government/publications/human-rights-and-democracy-report-2019 [Accessed 10th May 2021].

UK023, UK Government. (2020). COVID Support Force: The MOD’s contribution to the coronavirus response. https://www.gov.uk/guidance/covid-support-force-the-mods-contribution-to-the-coronavirus-response [Accessed 10th May 2021].

UK024, UK Government. (2020). Safer technology, safer users: The UK as a world-leader in Safety Tech. https://www.gov.uk/government/publications/safer-technology-safer-users-the-uk-as-a-world-leader-in-safety-tech [Accessed 10th May 2021].

UK025, UK Government. (2020). The British Embassy Kyiv: Call for project proposals 2020 to 2021. https://www.gov.uk/government/news/the-british-embassy-kyiv-call-for-project-proposals [Accessed 10th May 2021].

UK026, UK Government. (2020). Coronavirus (COVID-19)—Staying safe online. https://www.gov.uk/guidance/covid-19-staying-safe-online [Accessed 10th May 2021].

UK027, UK Government. (2020). Government cracks down on spread of false coronavirus information online. https://www.gov.uk/government/news/government-cracks-down-on-spread-of-false-coronavirus-information-online [Accessed 10th May 2021].

UK028, UK Government. (2020). UK aid to tackle global spread of coronavirus ‘fake news’. https://www.gov.uk/government/news/uk-aid-to-tackle-global-spread-of-coronavirus-fake-news [Accessed 10th May 2021].

UK029, UK Government. (2020). How to Guide on Gender and Strategic Communication in Conflict and Stabilisation Contexts—January 2020. https://www.gov.uk/government/publications/how-to-guide-on-gender-and-strategic-communication-in-conflict-and-stabilisation-contexts-january-2020 [Accessed 10th May 2021].

UK030, UK Government. (2019). Confronting cyber threats to businesses and personal data. https://www.gov.uk/government/news/confronting-cyber-threats-to-businesses-and-personal-data [Accessed 10th May 2021].

UK031, UK Government. (2019). Official Development Assistance (ODA): FCO International Programme spend objectives 2019 to 2020. https://www.gov.uk/government/publications/official-development-assistance-oda-fco-international-programme-spend-objectives-2019-to-2020 [Accessed 10th May 2021].

UK032, UK Government. (2019). UK programme assistance to Ukraine in 2019-2020. https://www.gov.uk/government/news/uk-programme-assistance-to-ukraine-2019-2020 [Accessed 10th May 2021].

UK033, UK Government. (2019). UK steps up fight against fake news. https://www.gov.uk/government/news/uk-steps-up-fight-against-fake-news [Accessed 10th May 2021].

UK034, UK Government. (2019). UK announces £9 million project to support independent media in Ukraine. https://www.gov.uk/government/news/uk-announces-9m-project-to-support-independent-media-in-ukraine [Accessed 10th May 2021].

UK035, UK Government. (2019). UK to introduce world first online safety laws. https://www.gov.uk/government/news/uk-to-introduce-world-first-online-safety-laws [Accessed 10th May 2021].

UK036, UK Government. (2019). Semantic Visions wins $250,000 Tech Challenge to Combat Disinformation. https://www.gov.uk/government/news/semantic-visions-wins-250000-tech-challenge-to-combat-disinformation [Accessed 10th May 2021].

UK037, UK Government. (2018). UK programme assistance to Ukraine 2017-2018. https://www.gov.uk/government/news/uk-programme-assistance-to-ukraine-2017-2018 [Accessed 10th May 2021].

UK038, UK Government. (2018). United Kingdom—Poland Quadriga 2018: Joint communiqué. https://www.gov.uk/government/news/united-kingdom-poland-quadriga-joint-communique [Accessed 10th May 2021].

UK039, UK Government. (2018). PM Commons statement on National Security and Russia: 26 March 2018. https://www.gov.uk/government/speeches/pm-commons-statement-on-national-security-and-russia-26-march-2018 [Accessed 10th May 2021].

UK040, UK Government. (2017). PM announces landmark new package of defence and security cooperation with Poland. https://www.gov.uk/government/news/pm-announces-landmark-new-package-of-defence-and-security-cooperation-with-poland [Accessed 10th May 2021].

UK041, UK Government. (2017). UK programme assistance to Ukraine 2016-2017. https://www.gov.uk/government/news/uk-programme-assistance-to-ukraine-2016-2017 [Accessed 10th May 2021].

UK042, UK Government. (2015). Conflict Stability and Security Fund for 2015-2016 in Moldova. https://www.gov.uk/government/news/conflict-stability-and-security-fund-for-2015-2016-in-moldova [Accessed 10th May 2021].

UK043, UK Government. (2015). Conflict Stability and Security Fund: Call for project proposals for 2015-2016 in Ukraine. https://www.gov.uk/government/news/conflict-stability-and-security-fund-call-for-project-proposals-for-2015-2016-in-ukraine [Accessed 10th May 2021].
