CURRENT TRENDS IN DISINFORMATION DISSEMINATION

CONTENTS

Summary
Introduction
Political aspects of disinformation campaigns
Methods of disinformation dissemination
Technological trends
The strategic components in the fight against disinformation
Conclusion
Sources

SUMMARY

Combating disinformation is now a regular part of state security agendas. Hostile state and non-state actors - using available technological tools and exploiting often insufficient online regulation - see disinformation as an effective tool for promoting their own interests.

Whether through election interference or by influencing public opinion and state decision-making processes, the goal of systematically spreading disinformation is often political destabilization, social polarization and the erosion of trust in public institutions.

The move of political communication online - and its transformation there - has opened up vulnerabilities that disseminators are exploiting.

The absence of regulation and the business models of social networks are important factors in the rapid, widespread dissemination of disinformation. This situation is abused by actors with established dissemination strategies: digital astroturfing; flooding the zone with spam bots; media Trojan horses; and so-called source hacking. These also include offline activities such as acquiring and influencing "the influencers".

In the future, the combination of artificial intelligence, deepfakes and big data tools will be a major challenge in the fight against the spread of disinformation.

Primary responsibility for combating disinformation lies with states. They must mount society-wide efforts involving the private sector, universities, the media, professionals and the general public, and harmonize these strategies with allies at the transnational level within NATO and the EU.

Creating a central entity to combat disinformation at the supra-ministerial level would facilitate data collection, analysis and communications coordination across public administration.

In the long run, the best solution to combat disinformation is quality state strategic communication, transparency and appropriate public policies.

INTRODUCTION

The fight against disinformation is now a regular part of state security agendas. The severity of its impact resonated significantly after the 2016 US presidential election, in which Russia demonstrably interfered [1].

However, this was not the first case of a foreign actor using targeted disinformation to destabilize another state's domestic politics. As early as 2007, Russia attacked Estonia with a combination of cyberattacks and disinformation campaigns [2]. An extensive disinformation campaign was also visible during the 2014 Ukrainian crisis [3].

A similar scenario was repeated before the 2016 British referendum on withdrawal from the European Union, when Russian state media such as Russia Today and Sputnik published several hundred anti-EU articles in a matter of months and influenced public opinion in other ways as well [4].

It is clear that disinformation is a frequently used tool for promoting hostile interests. Whether it is one of several elements of hybrid action or used in stand-alone campaigns, it has significant potential to destabilize, polarize and otherwise damage the foundations of open, democratic societies and states.

However, disinformation campaigns are also adaptable and mirror the dynamically changing environment. A prerequisite for the strategic fight against disinformation is therefore an understanding of the trends in its dissemination. The text below aims to present the basic trends in disinformation dissemination.

[1] Select Committee on Intelligence, United States Senate (2019)
[2] Grassegger & Krogerus (2017)
[3] Macfarquhar (2016)
[4] UK Parliament (2018)

POLITICAL ASPECTS OF DISINFORMATION CAMPAIGNS

The main task of hostile actors is to destabilize particular states through disinformation campaigns that target societal weaknesses. Disinformation campaigns thus contribute primarily to greater societal polarization, which spills over into political life and is reflected not only in domestic politics but also in the broader geopolitical context.

DIGITAL OLIGARCHY

It is expected that today's diversity of technology companies will be replaced by a digital oligarchy. Through market consolidation, only about 10 supercorporations out of the 70 most influential companies of 2017 will probably remain by 2050 [5].

These economically, technologically and politically influential giants will also come to be perceived by democracies as national security concerns, leading to efforts to limit their autonomy. In the case of authoritarian states, such technology companies should always be seen as potential tools of the ruling regime.

Current trends suggest a split of the Internet in the near future - primarily between the US, China and Russia [6]. This will create several parallel ecosystems of social networks and platforms in direct competition. Specific examples are the Western WhatsApp and the Chinese WeChat.

Related to this is the tendency of authoritarian regimes to centralize control over the Internet and subsequently isolate their national market from the global one, as in Russia [7]. The state monitors and regulates information dissemination online within its territory, preventing a healthy, competitive media and digital market.

INTERFERENCE OF STATE AND NON-STATE ACTORS IN NATIONAL PROCESSES OF OTHER STATES

Through long-term disinformation campaigns, these actors influence democratic processes at global and local levels. It is expected that foreign interference in national elections will continue, and it may link disinformation campaigns to cyberattacks, making for a complex, two-pronged action.

Russia has been shown to have intervened in the 2016 US presidential election, and the latest intelligence information points to the same scenario in 2020. China and Iran have recently joined Russia in similar efforts [8].

In 2017, the current French President Emmanuel Macron fell victim to Russian hackers, who published several thousand emails concerning his campaign two days before the presidential election [9]. One year after the incident, France passed an anti-fake-news law [10].

[5] European Commission (2020)
[6] NATO STRATCOM COE (1/2020)
[7] Epifanova (2020)
[8] Associated Press News (2020)
[9] Erickson (2017)
[10] Gouvernement Français (2018)

China is also copying Russian disinformation techniques and applying them against the EU. A prominent example was the disinformation campaign on the origins of COVID-19, which China targeted primarily at the EU and its neighborhood while denying responsibility for the global pandemic [11].

ONLINE POLITICAL COMMUNICATION TRANSFORMATION

The emergence of new media and the Internet's expansion have moved political campaigns online. The 2020 Slovak parliamentary elections were proof of that. For example, the SMER political party built its digital campaign on misleading animated videos depicting former President Andrej Kiska [12].

The absence of content regulation and clearly defined rules online lowers transparency in the financing and organization of political campaigns, encourages the domestic spread of disinformation by political opponents, and facilitates the incitement of hatred.

Political actors are generally able to build visibility without any contact with traditional media: exclusively through party portals that can look like independent sites, or through profiles and groups on social networks that do not acknowledge party affiliation.

An example is the ĽSNS party's communication channels, which effectively mobilize audiences and disseminate party narratives using the above-mentioned tools, such as the Kulturblog, Hlavné Správy or Magazín1 news outlets.

ECHO CHAMBERS AND SOCIETAL POLARIZATION

Polarization is mainly driven by two types of factors: algorithmic and political. Preference-mining algorithms create echo chambers, reducing the diversity of information and opinion. This business model, which successfully targets consumers of advertising, is also reflected in the fragmentation of opinion platforms.
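
To make this mechanism concrete, the following minimal sketch (hypothetical logic written for this text, not any platform's actual recommender) shows how an engagement-driven feedback loop narrows the range of topics a user is shown:

```python
import random
from collections import Counter

TOPICS = ["politics", "sports", "science", "culture", "health"]

def recommend(history: Counter) -> str:
    # Cold start or occasional exploration: serve a random topic.
    if not history or random.random() < 0.1:
        return random.choice(TOPICS)
    # Otherwise exploit: serve the topic with the strongest engagement so far.
    return history.most_common(1)[0][0]

history = Counter()
feed = []
for _ in range(100):
    topic = recommend(history)
    feed.append(topic)
    history[topic] += 1  # each impression counts as engagement, reinforcing itself

print("distinct topics in the first 10 items:", len(set(feed[:10])))
print("distinct topics in the last 10 items:", len(set(feed[-10:])))
```

After a few iterations the loop almost always serves a single topic: the "chamber" emerges from the optimization itself, with no editorial intent required.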

According to the GLOBSEC study, open groups on Facebook served as echo chambers before the 2020 Slovak parliamentary elections, reinforcing existing currents of opinion. One in five posts published in these groups accused opponents of manipulating the election results [13].

[11] Rankin (2020)
[12] iDnes (2020)
[13] Klingová et al. (2020)

This favors actors whose strategies are geared toward mobilizing extreme supporters and demobilizing their opponents.

In addition to algorithmic polarization, political polarization is also visible. Through cognitive disinformation operations, foreign actors target groups prone to radicalization and thereby subvert the societies of their political opponents.

Problematic social network algorithms thus create a space full of weaknesses that favors disinformation content and is fully exploited by malicious actors. An example is the above-mentioned external interference in the domestic politics of other states through disinformation campaigns on social networks.

METHODS OF DISINFORMATION DISSEMINATION

There are countless ways to spread disinformation online and offline. These are usually inexpensive and readily available strategies; ultimately, disinformation dissemination is accessible to almost anyone.

BEHAVIORAL COMMERCIALIZATION ON SOCIAL NETWORKS

Today, it is possible to buy tens of thousands of likes or followers for a few hundred euros and thus significantly influence discussions on social networks. Algorithms readily favor popular and heavily commented content, which in turn gains even more traction.
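
As a rough illustration, the toy model below (with invented weights; real ranking formulas are proprietary and far more complex) shows why purchased likes and comments translate directly into visibility:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    shares: int

def engagement_score(p: Post) -> float:
    # Assumed weights: comments and shares are valued above bare likes.
    return 1.0 * p.likes + 3.0 * p.comments + 5.0 * p.shares

feed = [
    Post("organic_user", likes=120, comments=15, shares=4),
    Post("disinfo_page", likes=25_000, comments=3_500, shares=900),  # bought engagement
]

# Rank the feed the way an engagement-maximizing algorithm would.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{post.author}: score={engagement_score(post):,.0f}")
```

Whatever the exact weights, any ranking that rewards raw engagement will place the account with purchased interactions above organic content.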

A NATO STRATCOM COE experiment illustrates this: for 300 euros, researchers were able to buy 3,530 comments; 25,750 likes; 20,000 views; and 5,100 followers, demonstrating that social networks are failing in this regard. Although the purchased accounts showed suspicious behavior, four out of five services were still active four weeks after purchase, and 95% of the accounts remained active three weeks after being reported for inauthentic behavior [14].

This form of manipulation is therefore available to virtually anyone, even without deeper technical knowledge and skills. All you have to do is pay for it.

DIGITAL ASTROTURFING

Astroturfing refers to a method of influencing consumer behavior or political preferences by simulating authentic, spontaneous grassroots ("from below") activity. In reality, however, such campaigns are professionally organized [15].

Many groups on social networks pretend to be bottom-up movements seeking to change local politics but in reality purposefully manipulate public opinion and polarize the masses. Alongside these non-transparent groups, personalized propaganda is also emerging through advertisement micro-targeting [16].

Facebook groups created by the Russian troll farm IRA are an example. One was the Heart of Texas group, which supposedly united the Texas secessionist movement but in fact promoted conspiracy theories, far-right ideology, xenophobia, and anti-LGBT and anti-Muslim rhetoric.

In real life, the group staged several protests and, at one point, had nearly a quarter of a million followers - more than the official Democratic and Republican pages in Texas combined [17].

[14] NATO STRATCOM COE (2019)
[15] Urban (2015)
[16] Kornbluh (2020)
[17] Casey (2017)

FLOODING THE ZONE AND USING BOTS

Harmful actors can use networks of infected computers to continuously flood social networks with disinformation. This - coupled with a general lack of knowledge about algorithms, their non-transparency, and the artificial intelligence that regulates content - poses another significant problem [18].

For example, bots - accounts that generate disinformation automatically - are used to "flood the zone". Botnets - i.e., groups or networks of such bots - spread disinformation at great speed through social networks. They are most often seen on Twitter, and the latest have already appeared on Instagram.
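
The flip side is that flooding leaves statistical traces. The sketch below (with assumed thresholds, not any platform's real detection system) illustrates the kind of simple heuristics analysts use to separate bot-like flooding from organic activity:

```python
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    posts_per_day: float
    distinct_messages: int
    total_messages: int

def looks_like_bot(a: Account) -> bool:
    # Share of posts that are near-duplicates of earlier ones.
    duplication = 1 - a.distinct_messages / max(a.total_messages, 1)
    # Assumed thresholds: inhuman posting rates or heavily copy-pasted content.
    return a.posts_per_day > 100 or duplication > 0.9

accounts = [
    Account("citizen42", posts_per_day=6, distinct_messages=180, total_messages=200),
    Account("newsflood_77", posts_per_day=480, distinct_messages=5, total_messages=4000),
]

for a in accounts:
    print(a.name, "->", "bot-like" if looks_like_bot(a) else "looks organic")
```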

Another similar phenomenon is trolls: real people who initiate conflict and prevent informed discussion online, often by sharing disinformation. In doing so, trolls further polarize other users. They are especially active on Facebook and in discussion forums [19].

MEDIA TROJAN HORSES

There are fake websites that pretend to be relevant media outlets but spread disinformation and propaganda. These usually present themselves as independent local newspapers, but the topics and slant of their articles are dictated from abroad [20]. An example is the Baltnews group of portals, which spreads Russian propaganda in the Baltics [21].

Russia also pays freelance journalists whose political convictions align with pro-Russian propaganda and who add legitimacy to media Trojan horses. In the USA, the Russian-run Peace Data site appeared, hiring freelance journalists from both America and Europe who, without knowing their employer's origin, wrote articles for English-speaking audiences [22].

SOURCE HACKING

Source hacking is a way of manipulating content to gain influence on social media without raising suspicion. Its primary targets are journalists and other influential figures, who then share the planted content with an even wider audience.

In general, there are four types of source hacking: viral sloganeering, fake document leaks, evidence collage, and keyword squatting.

[18] Kornbluh (2020)
[19] Barojan (2018)
[20] Kornbluh (2020)
[21] Roonemaa (2018)
[22] Frenkel (2020)

In the famous case of the Macron Leaks, two days before the French presidential election, a combination of genuine and forged documents was leaked to suggest that Emmanuel Macron had committed tax fraud. The aim of the leaks was to discredit Macron's campaign during the election moratorium, when the French media could not comment on the content. The documents nevertheless circulated on social networks, where they received considerable attention.

An example of an evidence collage is the "PizzaGate" conspiracy theory. Information spread on the internet about a fictitious pedophile network allegedly involving prominent Democrats. Throughout this long-running campaign, collages of misleading information and photos spread across social media. The campaign culminated in an armed attack on a pizzeria in Washington, D.C., whose perpetrator said he wanted to verify the allegations himself.

This disinformation campaign is only a fragment of the much larger QAnon conspiracy web, which has for several years supported Donald Trump and similar politicians in their fight against "the establishment" [23].

THE ACQUISITION AND “REARING” OF INFLUENCERS

Winning celebrities over to various causes has a major impact in both the offline and online worlds. A study by the Reuters Institute at Oxford found that influencers and celebrities are among the biggest contributors to the spread of coronavirus disinformation. While celebrities account for only 20% of disinformation content, their posts generate as much as 69% of all social media interactions [24].

In the Czech Republic, the singer Ilona Csáková has become a prominent figure in a circle of easily influenced celebrities who spread coronavirus disinformation [25]. Another significant group consists of people who spread misleading, spurious and shocking content in the hope of going viral. One example is the Czech YouTuber Ondřej Tesárek, also known as Bratříček (Little Brother) [26].

Aspiring or novice influencers are an appealing target for malign actors. By providing them with technological or financial assistance, these malign actors broaden the audience reached by their disinformation narratives.

[23] Donovan & Friedberg (2019)
[24] Waterson (2020)
[25] Valášek & Dragoun (2020)
[26] Cemper (2019)

TECHNOLOGICAL TRENDS

Disinformation campaigns do not exist in a vacuum; they dynamically adapt to technological progress and innovation. In the future, three factors will be crucial in the spread of disinformation.

THE VIRAL POTENTIAL OF AUDIOVISUAL CONTENT AND DEEPFAKES

Psychological studies show that users remember information up to four times better when they consume it in audiovisual form rather than by reading [27]. Digital marketers therefore predict significant annual increases in video content at the expense of written text. By 2022, online video is expected to account for 82% of all internet traffic, 15 times more than in 2017 [28].

As early as ten years ago, videos on social media generated 12 times more shares than images and text combined [29]. In the future, it will therefore be vital to reckon with manipulation through audiovisual disinformation, virtual reality and deepfakes [30]. Popular memes, comics and Art Noir are also important vehicles of audiovisual manipulation.

DISINFORMATION CAMPAIGNS USING ARTIFICIAL INTELLIGENCE

The 2019 Worldwide Threat Assessment predicts that technologically advanced adversaries of the US will combine technologies such as deepfakes, artificial intelligence, big data and machine learning to create sophisticated information campaigns [31].

The Global Risks Report emphasizes the role of artificial intelligence as a double-edged sword: it describes artificial intelligence as the innovation with "the greatest impact" as well as the innovation that poses "the greatest existential threat" [32]. In 2020, the global market for artificial intelligence software was expected to grow to 22.6 billion USD, a 54% increase on the previous year [33].

With the help of bots, artificial intelligence is already able to generate text and publish automated messages indistinguishable from text written by humans. The boundary between authentic and inauthentic behavior in the online space, which has hitherto been relatively clear, will be almost completely blurred.
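
As a rough illustration of how low the barrier already is, a few lines of Python using the open-source Hugging Face transformers library and the small public GPT-2 model (chosen here purely for the example; real operations would presumably use far more capable generators) can mass-produce plausible-sounding post variants:

```python
# pip install transformers torch
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the example reproducible

prompt = "BREAKING: Sources close to the government confirm that"
outputs = generator(prompt, max_length=40, num_return_sequences=3, do_sample=True)

for i, out in enumerate(outputs, 1):
    print(f"--- variant {i} ---")
    print(out["generated_text"])
```

Each run yields fluent variations on the same seed narrative, which bots can then publish at scale.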

[27] Hewlett-Packard Development Company (2004)
[28] Biteable (2020)
[29] Bullas (2012)
[30] NATO STRATCOM COE (5/2020)
[31] Coats (2019)
[32] World Economic Forum (2020)
[33] Liu (2020)

DATA-MINING, MICRO-TARGETING AND DARK POSTING

Personalized messages are becoming increasingly widespread, mainly due to micro-targeting: the targeting of advertisements and other posts in ways that can effectively manipulate public political opinion, especially with the aim of polarizing and negatively affecting the target audience.

Artificial intelligence that generates personalized messages through data mining is an increasingly popular marketing tool. Marketers who used artificial intelligence increased their turnover by 41% [34]. Similarly, advertisements created with the help of micro-targeting garner 670% more clicks than non-targeted advertisements [35].
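
A minimal sketch of the data-mining step behind such targeting, using synthetic data and invented interest features, might cluster users into segments that can each be served a tailored message:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Each row is one user; columns are assumed engagement scores for the
# topics [immigration, economy, health, culture].
users = rng.random((200, 4))

# Group users into four audience segments by interest profile.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(users)

for segment in range(4):
    members = users[kmeans.labels_ == segment]
    dominant = int(members.mean(axis=0).argmax())
    print(f"segment {segment}: {len(members)} users, dominant topic index {dominant}")
```

A campaign would then craft a different, emotionally tuned message for each segment; with rich enough data, the same logic shrinks a segment down to a single person, i.e., nano-targeting.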

With a sufficient level of information, it is even possible to perform what is known as nano-targeting - targeting information towards a single individual.

Targeting is also the method behind dark posting. A dark post is a paid advertisement that reaches targeted users as a sponsored post but does not appear on the advertiser's profile and usually redirects users outside the social media platform [36].

The personalization of advertising and content causes the division of society into smaller groups and leads to the isolation of individuals. This was demonstrated, for example, in the case of Cambridge Analytica, which had an impact on the 2016 Brexit referendum [37] as well as the 2017 presidential election in Kenya [38].

Based on the information gathered by big data, actors are able to determine which sections of society are more prone to fringe views, and create content based on their preferences, thereby mobilizing and radicalizing these individuals.

[34] Guttmann (2019)
[35] Barbu-Kleitsch (2014)
[36] Cakl (2020)
[37] Scott (2019)
[38] Moore (2018)

THE STRATEGIC COMPONENTS IN THE FIGHT AGAINST DISINFORMATION

With the growing presence of information and disinformation operations, individual countries increasingly emphasize the need to develop strategies to combat hybrid threats. Each country is building its resilience and security strategy based on its own geopolitical and historical context. Although there is no widely accepted consensus on the best way to wage this battle, it is possible to identify several approaches that countries share.

The following strategies are used mainly by the Baltic countries, which face Russian propaganda and influence operations on a daily basis.

CONFRONTATION is characterized by active capacity-building and the dissemination of counter-narratives that push back against disinformation campaigns. A typical example is Estonia's establishment of the Russian-language national television channel ETV+ in 2015. Its aim is to reach Russian-speaking minorities in Estonia and counter Russian propaganda narratives [39].

BLOCKING is very similar to confrontation; however, the goal is not to disseminate opposing narratives but to stop disinformation campaigns and foreign propaganda from spreading within the country's territory. Countries that have pursued this strategy include Lithuania, Latvia and Estonia, which have gradually blocked media outlets such as Rossiya RTR and Sputnik [40].

DOCUMENTATION AND ANALYSIS OF THREATS helps to increase public awareness and to identify the weak spots of a particular society. It is these weak spots, which threaten social cohesion, that are often attacked and abused by hostile actors, both domestic and foreign. By identifying them correctly and in good time, a country contributes to building a comprehensive defense strategy and increases its resilience [41].

BUILDING RESILIENCE combines research, education of the general public, strategic communication of a country's values, social identity and trust in institutions, and cooperation with the private and non-profit sectors in the fight against disinformation. The goal of this strategy is to immunize individuals, and society as a whole, against harmful narratives.

One example is Lithuania's strengthening of its strategic capacity through the Demaskuok fact-checking software and the involvement of volunteers, the so-called elves [42]. Another initiative already in operation is the European Centre of Excellence for Countering Hybrid Threats, which brings together experts from around the world to combat international disinformation.

[39] Hellman & Wagnsson (2017)
[40] Hellman & Wagnsson (2017)
[41] European Values (2019)
[42] The Economist (2019)

CONCLUSION

Information and disinformation operations have become a common component of political as well as geopolitical power struggles. We are currently aware of several significant foreign actors who have strategic interests in Slovakia and neighboring countries.

Disinformation campaigns by foreign actors in Slovakia last year aimed mainly to subvert the cohesion of NATO and the EU. Slovakia, as a member state of both groups, was subject to disinformation campaigns that sought to weaken public confidence in the allies’ common values, solidarity, commitment to a common defense mechanism and professional competence. Russian sources, such as official media, experts, think tanks and the like, played a leading role in this offensive [43].

The second prominent player was China. It primarily spread pro-Chinese propaganda, its One China Policy narrative, and a positive image of China abroad. It used the Chinese diaspora and local Chinese media to achieve this goal [44].

Several foreign actors actively use modern and readily available technologies and social media platforms to spread disinformation, propaganda, and polarizing narratives in order to disrupt and destabilize the socio-political cohesiveness of other countries. Malign actors abuse the weak spots in the environment created by the successful business models of social media sites for their own benefit.

The outlined trends indicate the ever-increasing influence of technological and social media giants. Without sufficient regulation and the platforms' own willingness to act, the coming wave of technological progress will largely destroy our ability to distinguish between authentic and inauthentic content.

The ongoing deformation of the information space will thus normalize the spread of disinformation narratives and expand their scope, which may ultimately have fatal consequences for stability in society and international politics more broadly.

[43] Slovenská informačná služba (2020)
[44] Slovenská informačná služba (2020)

SOURCES

Associated Press News. Russia, China, Iran work to undermine US election: US intel chief. AlJazeera [online]. 7.8.2020 [cit. 6.12.2020]. Available at: https://www.aljazeera.com/news/2020/08/07/russia-china-iran-work-to-undermine-us-election-us-intel-chief/.

Barbu-Kleitsch, Oana. Advertising, Microtargeting and Social Media. Procedia - Social and Behavioral Sciences [online]. Vol. 163, December 2014 [cit. 6.12.2020]. Available at: https://www.researchgate.net/publication/270107608_Advertising_Microtargeting_and_Social_Media.

Barojan, Donara. Understanding bots, botnets and trolls [online]. International Journalists’ Network: 13.11.2018 [cit. 6.12.2020]. Available at: https://ijnet.org/en/story/understanding-bots-botnets-and-trolls.

Biteable. 55 video marketing statistics for 2020 [online]. 13.5.2020 [cit. 6.12.2020]. Available at: https://biteable.com/blog/video-marketing-statistics/.

Bullas, Jeff. The Facts and Figures about the Power of Visual Content - Infographic [online]. 27.8.2012 [cit. 6.12.2020]. Available at: https://www.jeffbullas.com/the-facts-and-figures-about-the-power-of-visual-content-infographic/.

Cakl, Ondřej. Dark posting: systematizované pokrytectví. Transparency International Česká republika [online]. Last updated: 6.1.2020 [cit. 6.12.2020]. Available at: https://www.transparency.cz/dark-posting-systematizovane-pokrytectvi/.

Casey, Michel. How Russia Created the Most Popular Texas Secession Page on Facebook. Extra News Feed [online]. 8.9.2017 [cit. 6.12.2020]. Available at: https://extranewsfeed.com/how-russia-created-the-most-popular-texas-secession-page-on-facebook-fd4dfd05ee5c.

Cemper, Jan. Kdo trollí platformu #Jsmetu? Koordinátoři Trikolóry, odštěpenci od Svobodných i syn vysoce postavené osoby v Rosatomu. Manipulatori.cz [online]. 25.11.2019 [cit. 6.12.2020]. Available at: https://manipulatori.cz/kdo-trolli-platformu-jsmetu-koordinatori-trikolory-odstepenci-od-svobodnych-i-syn-vysoce-postavene-osoby-v-rusatomu/.

Coats, Daniel. Worldwide Threat Assessment of the US Intelligence Community [online]. Senate Select Committee on Intelligence: 29.1.2019 [cit. 6.12.2020]. Available at: https://www.dni.gov/files/ODNI/documents/2019-ATA-SFR---SSCI.pdf.

Donovan, Joan & Brian Friedberg. Source Hacking: Media Manipulation in Practice [online]. Data & Society: 4.9.2019 [cit. 6.12.2020]. Available at: https://datasociety.net/library/source-hacking-media-manipulation-in-practice/.

Epifanova, Alena. Deciphering Russia’s “Sovereign Internet Law” [online]. DGAP: 16.11.2020 [cit. 6.12.2020]. Available at: https://dgap.org/en/research/publications/deciphering-russias-sovereign-internet-law.

Erickson, Amanda. Macron’s emails got hacked. Here’s why French voters won’t hear much about them before Sunday’s election. The Washington Post [online]. 6.5.2017 [cit. 6.12.2020]. Available at: https://www.washingtonpost.com/news/worldviews/wp/2017/05/06/macrons-emails-got-hacked-heres-why-french-voters-wont-hear-much-about-them-before-sundays-election/.

European Commission. Digital Oligarchy. In: Knowledge for Policy: Foresight [online]. 2020 [cit. 6.12.2020]. Available at: https://knowledge4policy.ec.europa.eu/foresight/topic/diversifying-inequalities/new-digital-oligarchy_en.

European Values. Kremlin Watch Strategy for Countering Hostile Russian Interference [online]. 3.12.2019 [cit. 6.12.2020]. Available at: https://www.europeanvalues.net/wp-content/uploads/2019/12/Kremlin-Watch-Strategy.pdf.

Frenkel, Sheera. A Freelance Writer Learns He Was Working for the Russians. The New York Times [online]. 2.9.2020 [cit. 6.12.2020]. Available at: https://www.nytimes.com/2020/09/02/technology/peacedata-writer-russian-misinformation.html.

Gouvernement Français. Against information manipulation [online]. 2018 [cit. 6.12.2020]. Available at: https://www.gouvernement.fr/en/against-information-manipulation.

Grassegger, Hannes & Mikael Krogerus. Fake news and botnets: how Russia weaponised the web. Guardian [online]. 2.12.2017 [cit. 6.12.2020]. Available at: https://www.theguardian.com/technology/2017/dec/02/fake-news-botnets-how-russia-weaponised-the-web-cyber-attack-estonia.

Guttmann, A. Positive impact of AI use in e-mail marketing in the U.S. 2018. Statista.com [online]. 23.1.2019 [cit. 6.12.2020]. Available at: https://www.statista.com/statistics/959437/impact-ai-use-in-email-marketing/.

Hellman, Maria & Charlotte Wagnsson. How can European states respond to Russian information warfare? An analytical framework. European Security [online]. Vol. 26, no. 2, 1.3.2017 [cit. 6.12.2020]. Available at: https://doi.org/10.1080/09662839.2017.1294162.

Hewlett-Packard Development Company. The Power of Visual Communication [online]. 2004. Available at: https://policyviz.com/wp-content/uploads/2015/10/power-of-visual-communication.pdf.

iDnes.cz. Kiska přivede tisíce migrantů, děsí Směr-SD Slováky v předvolební kampani. iDnes.cz: Zpravodajství [online]. 16.1.2020 [cit. 6.12.2020]. Available at: https://www.idnes.cz/zpravy/zahranicni/slovensko-volby-kampan-smer-kiska.A200116_120021_zahranicni_zaz.

Klingová, Katarína et al. Slovak parliamentary election 2020: Liberalism as a threat, Facebook as a battlefield [online]. GLOBSEC: 2020 [cit. 6.12.2020]. Available at: https://www.globsec.org/wp-content/uploads/2020/04/Slovak-parliamentary-election-2020.pdf.

Kornbluh, Karen et al. Safeguarding Democracy Against Disinformation [online]. The German Marshall Fund of the United States: 24.3.2020 [cit. 6.12.2020]. Available at: https://www.gmfus.org/publications/safeguarding-democracy-against-disinformation.

Liu, Shanhong. Artificial intelligence software market growth forecast worldwide 2019-2025. Statista.com [online]. 17.8.2020 [cit. 6.12.2020]. Available at: https://www.statista.com/statistics/607960/worldwide-artificial-intelligence-market-growth/.

Macfarquhar, Neil. A Powerful Russian Weapon: The Spread of False Stories. The New York Times [online]. 28.8.2016 [cit. 6.12.2020]. Available at: https://www.nytimes.com/2016/08/29/world/europe/russia-sweden-disinformation.html.

Moore, Jina. Cambridge Analytica Had a Role in Kenya Election, Too. The New York Times [online]. 20.3.2018 [cit. 6.12.2020]. Available at: https://www.nytimes.com/2018/03/20/world/africa/kenya-cambridge-analytica-election.html.

NATO STRATCOM COE. Deepfakes - Primer and Forecast [online]. May 2020 [cit. 6.12.2020]. Available at: https://www.stratcomcoe.org/deepfakes-primer-and-forecast.

NATO STRATCOM COE. Disinformation as a Global Problem – Regional Perspectives [online]. January 2020 [cit. 6.12.2020]. Available at: https://www.stratcomcoe.org/disinformation-global-problem-regional-perspectives.

NATO STRATCOM COE. How Social Media Companies are Failing to Combat Inauthentic Behaviour Online [online]. November 2019 [cit. 6.12.2020]. Available at: https://www.stratcomcoe.org/how-social-media-companies-are-failing-combat-inauthentic-behaviour-online.

Rankin, Jennifer. EU says China behind 'huge wave' of Covid-19 disinformation. Guardian [online]. 10.6.2020 [cit. 6.12.2020]. Available at: https://www.theguardian.com/world/2020/jun/10/eu-says-china-behind-huge-wave-covid-19-disinformation-campaign.

Roonemaa, Holger & Inga Springe. This Is How Russian Propaganda Actually Works In The 21st Century. BuzzFeed News [online]. Last updated: 31.8.2018 [cit. 6.12.2020]. Available at: https://www.buzzfeednews.com/article/holgerroonemaa/russia-propaganda-baltics-baltnews.

Scott, Mark. Cambridge Analytica did work for Brexit groups, says ex-staffer. Politico [online]. 30.7.2019 [cit. 6.12.2020]. Available at: https://www.politico.eu/article/cambridge-analytica-leave-eu-ukip-brexit-facebook/.

Select Committee on Intelligence, United States Senate. Russian Active Measures Campaigns and Interference in the 2016 U.S. Election [online]. 116th Congress, Senate, 1st Session. Vol. 2, Report 116-XX [cit. 6.12.2020]. Available at: https://www.intelligence.senate.gov/sites/default/files/documents/Report_Volume2.pdf.

Slovenská informačná služba. Správa o činnosti SIS za rok 2019 [online]. Bratislava, September 2020 [cit. 6.12.2020]. Available at: https://www.sis.gov.sk/pre-vas/sprava-o-cinnosti.html#zahranicnopoliticka-oblast.

The Economist. Lithuanians are using software to fight back against fake news. Science & Technology: Disinformation [online]. 24.10.2019 [cit. 6.12.2020]. Available at: https://www.economist.com/science-and-technology/2019/10/24/lithuanians-are-using-software-to-fight-back-against-fake-news.

UK Parliament. Russian influence in political campaigns. Disinformation and ‘fake news’: Interim Report Contents [online]. 29.7.2018 [cit. 6.12.2020]. Available at: https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/36308.htm.

Urban, František. Pochybné marketingové praktiky online – astroturfing [online]. TouchIt: 2.8.2015 [cit. 6.12.2020]. Available at: https://touchit.sk/pochybne-marketingove-praktiky-on-line-astroturfing/19026.

Valášek, Lukáš & Radek Dragoun. Kvůli lžím o koronaviru mohou umírat lidé, vláda zcela selhává, upozorňují experti. Aktualne.cz [online]. 21.10.2020 [cit. 6.12.2020]. Available at: https://zpravy.aktualne.cz/domaci/kvuli-lzim-o-koronaviru-mohou-umirat-lide/r~6d4687c412d811eb95caac1f6b220ee8/.

Waterson, Jim. Influencers among 'key distributors' of coronavirus misinformation. Guardian [online]. 8.4.2020 [cit. 6.12.2020]. Available at: https://www.theguardian.com/media/2020/apr/08/influencers-being-key-distributors-of-coronavirus-fake-news.

World Economic Forum. The Global Risks Report 2020 [online]. 15th edition, 15.1.2020 [cit. 6.12.2020]. Available at: https://www.weforum.org/reports/the-global-risks-report-2020.


Author: Eva Húsková, SSPI - Slovak Security Policy Institute

Editors: Matej Kandrík, Stratpol - Strategic Policy Institute Peter Köles, SSPI - Slovak Security Policy Institute Kristína Urbanová, Stratpol - Strategic Policy Institute

Thanks: Sincere thanks to all those who were involved in the research, willing respondents, specialists and, last but not least, to the implementation team.

© STRATPOL, SSPI, Evropské hodnoty 2020

This research was carried out within a project funded by the subsidy schemes of the Ministry of Defence of the Slovak Republic, the Embassy of the United States of America in Bratislava, the NATO Public Diplomacy Division and the Friedrich Naumann Stiftung.

All views expressed in this document are those of the author only; they do not reflect the views of the donors. © STRATPOL, SSPI 2020. All rights reserved.