
VYTAUTAS MAGNUS UNIVERSITY FACULTY OF POLITICAL SCIENCE AND DIPLOMACY

PUBLIC COMMUNICATIONS DEPARTMENT

Maryna Kupriienko

INFLUENCING PUBLIC OPINION IN SOCIAL MEDIA. TROLLING AND DISINFORMATION STRATEGIES

Final Master Thesis

Journalism and Media Industries Study Program, state code 621P50002 Degree in Journalism

Supervisor prof. Robert van Voren (acad. title, name, surname)

Defended prof. Šarūnas Liekis (Dean of the Faculty)

Kaunas, 2019

VYTAUTO DIDŽIOJO UNIVERSITETAS POLITIKOS MOKSLŲ IR DIPLOMATIJOS FAKULTETAS

VIEŠOSIOS KOMUNIKACIJOS KATEDRA

Maryna Kupriienko

VIEŠOS NUOMONĖS ĮTAKA SOCIALINĖSE MEDIJOSE. TROLINIMO IR DEZINFORMACIJOS STRATEGIJOS

Magistro baigiamasis darbas

Žurnalistikos ir medijų industrijų studijų programa, valstybinis kodas 621P50002 Žurnalistikos studijų kryptis

Vadovas (-ė) prof. Robert van Voren (Moksl. laipsnis, vardas, pavardė)

Apginta prof. Šarūnas Liekis (Fakulteto/studijų instituto dekanas/direktorius)

Kaunas, 2019

CONTENTS

SUMMARY
SANTRAUKA
INTRODUCTION
1. THEORETICAL FRAMEWORK
1.1. Theoretical aspects of the interaction of state authorities and the media. The phenomenon of social media in modern political science
1.2. Social media as a tool to influence public opinion
1.3. The artificial activity of Internet users as a way to simulate the involvement of citizens in socio-political campaigns
2. THE ROLE OF THE IRA IN
2.1. IRA experience in creating artificial activity: structure of the organisation and implementation practice
2.2. Strategies and technologies of the IRA on social media
2.3. The geopolitical impact of the IRA: global trends
3. REFLECTION ON ACCOMPLISHED CREATIVE APPLIED PROJECT
3.1 Methodological framework
3.2 and political persuasion
3.3 Russian interference in foreign policies
3.4 Evaluation of the effectiveness of mechanisms for creating artificial activity during the socio-political campaigns in
CONCLUSIONS
REFERENCES
APPENDICES

SUMMARY

Mass media have become the primary tool for disseminating messages that affect public consciousness. What does not enter the channels of mass communication nowadays has almost no effect on the development of society; thus, a modern person cannot escape the influence of the media. Under these conditions, it becomes necessary to determine what kind of influence mass media exert on the formation of public opinion. The thesis is conceptualised around the activity of paid trolls and the influencing of public opinion in social media, particularly the case of the IRA, and focuses on the problem of mapping pro-Kremlin trolls through media channels. The relevance of this research stems from the critical situation in Ukraine over the last five years as well as from the case of Russia's interference in the 2016 US elections. Many are inclined to believe that all the events that occurred in early 2014 are the result of brazen manipulation of information. This research examines the trend of IRA activities and explores the opinions held by journalists and experts on trolling and disinformation strategies in social media. The purpose of the investigation is to comprehensively study the activities of the IRA and their impact on the formation of public opinion in social networks; to determine the factors behind the effectiveness of the influence of Russian trolls; to synthesise their disinformation strategies and the methods by which they form public opinion regarding political forces; and to identify the best ways to accelerate the development of democratic, independent journalism in the state and to increase the effectiveness of its operation. Is there any difference in how Russians and Americans look at the same events, and if so, what explains it? To answer this, the thesis analyses an example of known state-sponsored internet propaganda used by the Russian government and the most persistent ideas about the IRA circulated by the American, Russian and European media.
Keywords: Russian trolls, disinformation, troll factory, Kremlin bots, information warfare.


SANTRAUKA

Žiniasklaida tapo pagrindiniu įrankiu žinių plėtimuisi, kurios įtakoja viešą sąmonę. Tai kas nepateko į kanalus masinės komunikacijos, mūsų laikais, beveik neįtakoja visuomenės vystymosi. Nors, modernus žmogus negali išvengti medijų įtakos. Tokiomis sąlygomis, tampa būtina užtikrinti kokio tipo įtaka žiniasklaidoje formuoja viešą nuomonę. Disertacija konceptualizuoja apmokamų trolių veiklą ir įtaką viešos nuomonės socialinėse medijose, ypač IRA atveju, bei sutelkia dėmesį į kartografavimo Kremliaus trolių problemą medijų kanaluose. Aktualija šio mokslinio tyrimo yra dėl kritinės situacijos Ukrainoje per paskutinius penkis metus, taipogi dėl Rusijos trukdžių Jungtinių Amerikos Valstijų rinkimuose 2016 metais atvejo. Dauguma yra linkusi tikėti, kad visi įvykiai nutikę ankstyvuosiuose 2014 metais yra akivaizdžios informacijos manipuliacijos rezultatas. Šis mokslinis tyrimas nagrinėja IRA veiklos tendenciją, tyrinėja žurnalistų, bei ekspertų nuomonę apie „trolinimą“ bei dezinformacijos strategijas socialinėse medijose. Tyrimo tikslas išsamiai ištirti IRA veiklą ir jų poveikį formuojant viešą nuomonę socialiniuose tinkluose. Taipogi tyrimas linksta užtikrinti Rusijos „trolių“ įtakos efektyvumo faktorius, dezinformacijos strategijų sintezes ir metodus formuojant viešą nuomonę ryšium su politinėmis jėgomis, nustatyti geriausius būdus paspartinti demokratiškos nepriklausomos žurnalistikos vystymąsi šalyje ir didėjimą įtakos jos operacijose. Ar yra skirtumas tarp Rusų ir Amerikiečių žiūrinčių į vieną įvykį ir jei yra, kas tai apibūdina? Tyrimas atskleidžia tiri žinomus šalies remiamus interneto propagandos pavyzdžius naudotus Rusų vyriausybės, stabiliausias idėjas apie IRA paskirstytas Amerikiečių, Rusų, bei Europos medijų.

Raktažodžiai: Rusijos troliai, dezinformacija, trolių fabrikas, kremliaus parankiniai, informacinis karas.


INTRODUCTION

In the modern world, the struggle for markets, energy sources and the expansion of political influence continues. In this arena the interests of states, corporations and, ultimately, individuals clash. In this confrontation, propaganda is an essential tool and often an effective weapon for achieving goals. It is therefore worth studying the experience of conducting propaganda at the times when it was most in demand and became, in fact, a matter of vital necessity. The importance of wars without blood is increasing: they cause no enormous destruction, yet they are capable of solving the most critical political and economic problems facing several world powers. Thanks to the achievements of technological progress in the field of mass communications, radio, television and the development of social networks, the primary weapons of modern war are no longer information as such, but meanings and values. While information warfare as a type of warfare is widely known, cognitive warfare is a new phenomenon that people cannot yet resist. In the meantime, imposed meanings seep deep into society, destroying and subjugating it. Depending on uncontrolled circumstances, the human factor significantly impairs the overall picture of the processes of informatisation. Inaccuracy, inappropriate use of data, censorship and conscious manipulation of facts are questions that concern the whole mass of existing data. After all, how much poison must be poured into a glass for it to be considered poisonous? Two drops may not kill, but who knows whether the glass is not already half full? Such doubts take root in the mind and dramatically influence the subsequent thinking of the audience. Even though people themselves learned to convey information and later invented language, they are now subjected to its influence, and it is no longer possible to identify who is managing whom.
Analysis of media frames is of particular interest to sociological science, as it allows us to understand the hidden mechanisms of the media's social influence. News is not an objective view of reality, but rather a reconstruction of a small part of reality from different angles. Media framing is a subtle but powerful way of influencing an audience, and studying media frames can help identify and explore essential points in the study of public opinion. Mass media have become the primary tool for disseminating messages that affect public consciousness. What does not enter the channels of mass communication nowadays has almost no effect on the development of society; thus, a modern person cannot escape the influence of the media. Under these conditions, it becomes necessary to determine what kind of influence mass media exert on the formation of public opinion. The research is conceptualised around the activity of paid trolls and the influencing of public opinion in social media, particularly the case of the IRA, and focuses on the problem of mapping pro-Kremlin trolls through media channels.
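Frame analysis can also be approximated computationally. As a minimal illustrative sketch (not part of the thesis methodology, with hypothetical frame keyword lists), the snippet below counts how many headlines in a sample invoke each frame:

```python
from collections import Counter

# Hypothetical frame-indicative keyword lists (illustrative assumption,
# not derived from any coding scheme used in this thesis).
FRAMES = {
    "conflict": {"clash", "battle", "attack", "war"},
    "victim": {"suffer", "victim", "crisis", "threat"},
}

def frame_counts(headlines):
    """Count how many headlines contain at least one keyword of each frame."""
    counts = Counter()
    for headline in headlines:
        words = set(headline.lower().split())
        for frame, keywords in FRAMES.items():
            if words & keywords:  # headline touches this frame
                counts[frame] += 1
    return counts

headlines = [
    "Protesters clash with police",
    "Families suffer as crisis deepens",
    "Parliament debates new budget",
]
print(frame_counts(headlines))  # Counter({'conflict': 1, 'victim': 1})
```

A real frame analysis would rely on a validated codebook and human coders; the keyword-matching shortcut only illustrates how such counts could be scaled to large corpora.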

The relevance of my research stems from the critical situation in Ukraine over the last five years as well as from the case of the 2016 US elections. With an understanding of the notions of information warfare and propaganda and the methods of its influence, it is much easier to distinguish it within ordinary text and, accordingly, to withstand it. Many are inclined to believe that all the events that occurred in early 2014 are the result of brazen manipulation of information. If earlier wars were won with weapons, now information is a primary strategic resource. The novelty of the chosen topic is that with globalisation and the development of technologies, society changes and new types of manipulation arise; new goals are pursued, and for this purpose many new techniques and tools are being devised. The methods of propaganda in the digital era have been studied insufficiently. In the publicly available literature there are no works that fully systematise the whole variety of forms of influence characteristic of propaganda, and in particular there is a lack of studies of Russian interference in the domestic politics of many democratic countries today. Creating fictitious public opinion is not difficult and is widely practised by agencies such as the IRA, which has gained universal visibility in recent years. However, what are the effectiveness and practical significance of influencing public opinion on social networks? Up to now, quite a few researchers and journalists have been engaged in this topic; in particular, the assessment of IRA activities, the activities of the “troll factory”, has been repeatedly covered by both Russian and foreign media. In its empirical part, this work draws on the practical research currently being carried out in Lithuania by the Sakharov Center, “Mapping Russian Trolls in Lithuania”.
The relevance and novelty of this work are also explained by the fact that the activities of the IRA in European countries, as well as the methods of dealing with its real influence on the masses, have not been studied enough. The paper studies the manipulative technologies of Russian political propaganda and uses specific examples to determine the effectiveness of their influence on target audiences. The subject of research is the illumination of trolling and disinformation strategies in social networks on the example of the IRA. The object of the thesis is the materials in Russian media and mass culture. The thesis aims to comprehensively study the activities of the Internet Research Agency (IRA) and their impact on the formation of public opinion in social networks; to determine the factors behind the effectiveness of the influence of Russian trolls; to synthesise their disinformation strategies and the methods by which they form public opinion regarding political forces; to identify the best ways to accelerate the development of democratic, independent journalism in the state and to increase the effectiveness of its operation; and to reveal the essence and features of Russian propaganda, proving that in the XXI century it is necessary to treat information carefully, along with the persons who possess it and the possible consequences of its misuse in their interests. Tasks and objectives: to explore and identify the importance of social media during informational warfare; to analyse the impact of pro-Russian trolls on the formation of public opinion; to evaluate the effectiveness of IRA activities during socio-political campaigns. This study intends to analyse Russian attempts at artificial activity through online media in democratic societies. Hypothesis: Russia's involvement in the main geopolitical conflicts of the XXI century exerts its impact through informational warfare in social media, influencing public opinion.
Research methods: to achieve the goals, materials were selected on the research topic and interviews with experts in the field were conducted. The following main methods were used in the study: a qualitative and quantitative approach, oral history, and interviews. The first chapter presents the theoretical background of the research area, based on a literature overview in the field of information warfare and political manipulation. It describes the phenomenon of media and politics during conflict and information warfare, and in particular the use of social media and paid artificial activity. The second chapter analyses sources on the artificial activity of Russia and discusses the implementation of the IRA in the domestic politics of democratic countries, identifying and evaluating the impact of the IRA during three main geopolitical conflicts of the XXI century: ISIS, the US election of 2016, and the conflict in Ukraine. The third chapter presents a reflection on the creative applied project: a personal investigation of IRA activities; an analysis of interviews with experts and journalists, with an overview of the collective arguments and controversial opinions; an evaluation of the effectiveness of research on trolling activity in European countries, including the Sakharov Center's ongoing research “Mapping Russian Trolls in Lithuania”; and an overview of conversations with former workers of the troll factory. The conclusion summarises the results of the investigation and identifies future fields of work.


1. THEORETICAL FRAMEWORK

The theoretical framework consists of a detailed overview of the problematic issues: the practice and theory of political manipulation have received quite extensive scientific development and practical application. The basis of the success of a social movement, a party or a particular candidate in an election is their rating and popularity. Therefore, the formation of the public opinion that candidates for power need is of particular importance. These methods of influence often go beyond the legal field and represent a severe danger, so it becomes necessary to analyse and understand the mechanisms of political manipulation. The general technology of nation-wide manipulation is usually based on the introduction into the mass consciousness of cultural and political myths, which assert specific values and norms and are perceived mostly on faith, without rational, critical understanding.

1.1. Theoretical aspects of the interaction of state authorities and the media. The phenomenon of social media in modern political science

Significant structural shifts characterise the current state of the world media market. Freedom of the media is a necessary component of the general principle of “freedom of information” and a necessary condition for a democratic regime, ensuring political pluralism and cultural diversity. The political stability of society, the socio-psychological state of the population, and their attitude to events depend on the mass media and the level of public confidence in them. The media promptly, publicly and regularly disseminate opinions and assessments and keep the public up to date with essential socio-political events. The following conclusions relate directly to the problem of improving the areas of interaction between state authorities and the media. Each state in its foreign policy seeks to achieve three main goals: representation in the desired light, the establishment of its influence over certain processes or subjects of international relations, and guarantees of its own stable and secure existence and development. Among the various ways to achieve this, the most optimal is information itself: information is a non-material resource, but thanks to an extensive and increasingly globalised media network it is also comprehensive, which allows for multilevel influence through a variety of forms and channels. At the same time, the actualisation of such an immaterial concept as the image of the country is its representation: a virtual image that shapes people's perceptions of this country. The image of the country is its strategic asset because, if we follow the rhetoric of modern leaders of public opinion (politicians, heads of state and government, heads of international organisations), it is at the moment an indicator of the international community's assessment and

determines the level of cooperation with other countries. However, the decisive role in constructing the image of a country is often played not by its own media resources, but by information provided by the mass media of large states and neighbouring countries. In politics the image defines a lot, primarily victory in elections, which is one of the most serious acts of acquiring power. It is fairly evident that governments have used social media as one of the main tools of manipulation. State-sponsored internet propaganda, in other words the use of paid internet propagandists by a government in order to change mass views and opinions online, is well practised in many democratic societies. In the early 2000s, “ministries of truth” in different countries (the USA, Israel and others) began to use new tactics to promote the desired point of view on the Internet. Instead of single eloquent speakers, tens of thousands of “ordinary” people began to be used, who pushed the fundamental idea while presenting it as their own opinion (Riley, Etter, Pradhan, 2018). Known examples of state-sponsored trolling and agencies include the troll army of North Korea (Firn, 2013), the “keyboard trolls” in the Philippines (Placido, 2017), the Islamic Republic's troll army in Iran (Hajizade, 2018), the “AK Trolls” in Turkey (Hunter, 2015), and the “Trolls from Olgino”, known as the Internet Research Agency (IRA), in Russia (DiResta, Ruppel, 2018). Unfortunately, during the last decade the level of false information spreading through social networks has radically increased in democratic societies. This activity negatively affects the choice of citizens, turning it from a free, informed decision into a formal act pre-programmed by experts in the formation of mass consciousness. Modern manipulators, understanding the enormous role of the masses in the political process, skilfully use the laws of mass psychology (Bennett, Livingston, 2018). Any information affecting a person can create in him a socio-psychological attitude, i.e.
internal readiness for a particular action. Manipulators use this feature of the human psyche to manipulate public opinion. In politics, manipulation through the environment is widely used, as it allows one to act more imperceptibly: communication takes place through other people and circumstances, has a long-lasting impact, and has a profound effect on the mind, the subconscious and human behaviour. Manipulation is also closely linked with state-owned awareness-raising campaigns that promote or activate certain forms of behaviour. Manipulation can take the form of postcards, posters, TV and radio broadcasts, and it can spread to other mass media. Political manipulation is a concept that unites advertising, PR and, in general, all the means that generate sustainable socio-political ideas in a person and encourage him to one or another political activity. Machiavellian political scientists believe that politics is the art of manipulating people. Every political manipulation involves a certain closeness, an illusion, an invisible mechanism and, of course, a psychological effect on the consciousness of the individual in order to form certain beliefs or preferences.

Success is guaranteed when a person subjected to manipulation believes that what is happening to him is correct. In a political context, the manipulation of consciousness includes the distortion of the materials of sociological research and of candidates' ratings in order to produce the desired changes in the mass consciousness; the specific composition of election commissions; interference by officials in the course of the campaign; the unreliable storage of ballots; illegal campaigning; bribing the media; and bribing voters. The spread of disinformation leads to a poor understanding by society of basic facts related, for instance, to public policy, the economy, public health and safety, climate change and the environment. It is known that, to achieve the greatest success, the manipulation of public consciousness must remain imperceptible; success is guaranteed if the addressee perceives everything that is happening as natural and inevitable. Thus, the elite stage events that, by definition, the audience will perceive positively, and the television media turn them into reality. In politics there are two most common ways of manipulation, so-called “social” and “ideological”. The first is the impact on a person as part of society: considering that any individual belongs to society and is located in its cultural field, “social” manipulation is designed for all citizens. “Ideological” manipulation is designed to increase the level of patriotism in the state (Endsley, 2018). The goal of political manipulation is to obtain, implement and preserve power; that is, the goals of manipulators are reduced to forming a particular opinion in people, controlling a person, and attracting and keeping attention. This is a kind of realisation of hidden intentions, the achievement of which directly depends on their support.
Political manipulation is the hidden control of the political consciousness and behaviour of people, imposed in the form of hidden influence, in order to force them to act (or not act) in the interests of the manipulators. It is primarily concerned with the technology of covering political processes in the media. Currently we are witnessing the substitution of what was previously understood as politics: the place of discussions and political decisions is increasingly occupied by specific symbolic actions. This symbolic policy appears where power cannot or does not want to change anything, where the expectations encouraged among the population by election promises cannot be met. We often see, hear and read fake news that exists only insofar as it is told. Such fake news closes the road to events and critical thoughts that are truly important for society. The competitive struggle for audience and circulation increasingly forces journalists to exaggerate the importance of the news, to notice the unusual where there is none, and to seek out imaginary sensations or even create them. In the political reality of the last decades, a symbolic

policy comes to the fore, and the main merit in this belongs precisely to the rapid development of mass media. It is evident that the media are the primary tool of political manipulation because they have precious and virtually unlimited resource capabilities to influence mass consciousness. Thus, the basis of political manipulation is the artificial reality the media create, which can fundamentally change the proportions of the actual model of the world. An essential prerequisite for manipulation is also the fact that, having a monopoly on information, the media set the priorities of events. Millions of events occur in the world, but only the part of them that the media bring to the attention of the audience is discussed. “The spread of Internet access, however, has radically extended the range of news source options available to the public even in many non-democratic states” (Szostek, 2016). The manipulative capabilities of the media also lie in the need to interpret facts and comment on them. Here everything is determined by the level of social responsibility of the journalist: in what context he embeds the facts, in what words he describes them, what he emphasises, what he keeps silent about, and how he evaluates. Elections are especially revealing for studying political manipulation: first, during political campaigns all manipulating resources are mobilised to provoke the masses into active support of one or another political party; second, during elections we can also observe the results of manipulative operations. Success in elections is impossible without significant public support. Consequently, the goals of manipulators are reduced to forming a particular opinion among voters and encouraging them to support a given social group in the elections. The manipulator must guess the social expectation and offer the optimal image of the candidate or election programme.
Manipulators act, on the one hand, openly, often hiding behind slogans about the common good, but in most cases there is a hidden mechanism, goal or idea that is not visible to others. The development of political manipulation is also influenced by the type of political system of a society, as well as by the way conflicts are resolved and political protest is prevented under that system. The limits of manipulating public opinion are determined primarily by the already established mass consciousness, stereotypes and views of people. In order to be effective, manipulation must be based on the mentality and current perceptions of the population, although under the influence of propaganda these ideas may gradually change. Creating a specific public opinion through the media, manipulating public consciousness and exerting an impact on it are increasingly becoming an integral part of the life of a democratic society. The growth of this influence on social relations is connected with the democratisation of life, the rising cultural and educational level of the population, and the processes of globalisation. The central premise of this research is information attacks directed against political parties and target audiences, and it is primarily focused on understanding such mechanisms. The media show their manipulative capabilities most openly during election campaigns. They manipulate the results of opinion polls especially effectively: after all, the same figure can be presented in one context as a success and in another as a disaster. The manipulative arsenal of the media is rich enough: deliberate distortion of the real state of affairs by hushing up some facts and highlighting others, publishing false messages, and awakening negative emotions in audiences through verbal images.
All these techniques differ in strength and content, but one thing unites them: they are all aimed at creating particular emotional and psychological attitudes among the audience.

1.2. Social media as a tool to influence public opinion

Journalism theory generally holds that news reports should be objective, giving the reader an exact background and analysis of the subject. Advertising, on the other hand, has grown out of traditional commercial advertising and has attracted new types in the form of paid articles or programmes presented as news. These mostly cover issues in a very subjective and often inappropriate way, aiming to convince rather than inform, and usually use hidden methods of propaganda that are not used in traditional commercial advertising. If readers believe that paid advertisements are in fact news reports, then the message the advertiser is trying to convey is more easily “perceived on faith” or “perceived as one's own opinion”. Such advertising is considered an obvious example of “hidden” propaganda because it looks like objective information rather than overt propaganda. Laws in different countries therefore often require that advertising served as a news message be labelled as a paid advertisement. In this way journalism shapes the consumer, which supports the desire of ruling elites across the world to rule the “herd” rather than, where possible, to raise a generation of educated and discerning people; this in turn has shaped the ways broadcasting channels are funded. Possessing such immeasurable power, journalism is nonetheless sold to political forces for a penny (in comparison with the possible consequences of not fulfilling the customer's requirements) and wastes enormous resources on goods nobody needs, whereas, were the effort directed the right way, a brilliant level of development could be achieved.
One of the reasons for the widespread use of such manipulation technologies is the stereotypical thinking of modern man, which allows the manipulator to construct a false picture of the world based on stereotypes that exist in the mind, by creating new stereotype systems in the mass consciousness or modifying old ones. The basis of the existence and spread of stereotypes is considered to be an apparent lack of reliable, proven knowledge in the relevant areas of life. The introduction of a message into consciousness occurs either with the support of already existing stereotypes, simultaneously strengthened or corrected by shifting the emphasis of the message, or by replacing them with other, more emotionally coloured ones. Stereotypes deeply affect the whole process of perception and participate in the creation of sustainable attitudes, while often distorting reality and creating an illusory picture of the world within which a person begins to act. Schematic thinking sets specific reactions to a standardised message. It is worth noting that stereotypes are a system of beliefs and attitudes that do not depend on a person's social experience; therefore, by introducing certain beliefs and attitudes into people's consciousness, the manipulator gets the opportunity to change an individual's attitude to reality regardless of his social experience. Globalisation in the information sphere and the development of social networks create a virtual opportunity for the targeted manipulation of the consciousness of particular groups of people, regions or countries in order to impose the necessary values, views and norms of behaviour and, if necessary, to destabilise power and public structures through unauthorised interference in internal affairs. Extremely threatening is the possibility of the targeted destruction of a society by undermining the goals, views and outlook of the population, carried out through disinformation and the manipulation of public opinion. The increasing use of social media by politicians and authorities as a tool to influence the target audience has turned social networks into a source of new data on conflict (Zeitzoff, 2017).

With the advent of the possibility of free commenting directly on media sites, editors of publications faced the problem of regulating the conversation under this or that material without violating the fundamental human right to freedom of speech and expression, which applies equally on the Internet and in life. It was necessary to find a solution so that people of polar opinions did not descend into rough confrontation and no offensive comments appeared in the disputes themselves. At the stage of introducing this opportunity, the situation on sites and forums was controlled manually by moderators, according to pre-established rules for using the service. In the course of its development, the media became more interactive, and authors could get an instant reaction to a piece. However, restraint and accuracy in open areas quickly disappeared. The media advanced to social networks, where the broad masses joined regular users. Information began to spread faster through reposts, ratings and messages, and so there were more opinions. A considerable number of statements were made faceless by their authors, and regulating commentary became physically impracticable. In addition to the various types of cyber-attacks committed against the media, there is the so-called artificial activity of users. It is a comprehensive, highly creative and hidden method of attack or promotion. It covers an extensive toolkit and includes several stages of development: articles, reports, websites and statements that can mislead recipients (Allcott, Gentzkow, 2017). The purposes of creating artificial user activity include disinformation, slanderous campaigns, promotion, dark PR and dark SMM. The method can be used for both offensive and defensive purposes, often in order to promote a service page.
Creating artificial activity and conducting this kind of information campaign requires considerable creative and human resources, unlike other technologies that can be implemented in software code and require only monitoring during execution. Building on slanderous campaigns and artificial activity, numerous agencies today offer services in this direction (promotion, statistics boosting, dark SMM). Implementing a campaign of artificial user activity involves the work of specialists in media management, graphic design and IT.

1.3. The artificial activity of Internet users as a way to simulate the involvement of citizens in socio-political campaigns

The first units in the creation of artificial activity are user accounts. These are personal virtual profiles that, depending on the Internet resource, store certain information about the user and through which real people become participants on specific portals. Creating personal profiles on social networks, blogging sites and video portals does not require binding the page to a real person; more precisely, there are no means of ensuring that the person listed on the page is the one who runs it. Almost anyone can manage the account. Anonymity, in this case, allows one to broadcast or leave any messages or opinions without real consequences for the author. Thus, if a social network or portal becomes a platform for an information campaign, the primary step in its implementation will be the creation of the required number of controlled characters (accounts) through which it is possible to become a full participant in the social network. The task of the author of such an unrealistic profile is to give all the other network members a feeling of trust in the page and make them believe that the account is run in the first person by a real individual. For these purposes, the profile is filled in like any genuine one, with plausible data (full name, date of birth, education, interests), photos for the visual component and various other materials depending on the service, though it remains possible to recognise the fake page after all. Participation in a social network enables the anonymous party to do the following: publish and distribute its own materials (text, photos, videos, etc.) on personal or other pages, as well as in social network communities; comment on other people’s and its own publications, social

communities, groups, events and meetings; evaluate opinions under various materials (“like”, “dislike”, “share”, etc.); generate statistics (views, visits, participants); purchase targeted advertising; and view personal statistics. Any of these actions, in particular a comment or publication by the anonymous party, can provoke a reaction from other members of the network, including approval or disregard. The decisive factor in the response is the participants’ subjective confidence in the author’s page. If implausible, illogical, unrelated, aggressive or offensive comments come from a user whose page is strangely filled with false information, then most likely other members of the network will immediately recognise the comment and all subsequent actions as trolling. In some cases, several fake characters can be used to create the effect of a dispute: two accounts will actively express extreme views in order to create or maintain a conversation artificially. The behaviour and the number of fake accounts are determined by the goal that has been set. In the case of an aggressive raid on open sites to spread disinformation, clog a service or disable it, quantitative suppression is essential, so it matters less how thoroughly the profiles are filled in. Conventionally, it can be said that the quality of a fake account is directly related to the sense of trust it creates. Creating a perfect fake requires creativity, attention and time. A high-quality account has to be “grown”: news published regularly, photos and personal information updated, new real friends added. For queries such as “buying accounts” or “boosting statistics, likes, polls”, search engines return an extensive list of exchanges where fake pages can be bought for specific sums and artificial activity created to order: artificial pages add ratings, complete polls, leave comments and influence view statistics.
Due to the high demand for these services, natural competition between the exchanges is growing, and within the industry some significant leaders have managed to scale up their business. Creating fake accounts is only a tool within campaigns of artificial user activity. When developing a project of this kind, it is essential to understand what function the profiles will perform; from this, a strategy and a plan for campaign implementation are formed. The strategy of an information campaign is determined by the tasks confronting it. These can be divided into tasks solved by the number and quality of accounts in the long or short term. A “get to the top” task is a short-term task that requires a large number of accounts actively publishing posts with a specific hashtag. The quantitative factor plays a vital role here, since the top positions go to the hashtags with the highest number of mentions. As soon as the hashtag reaches the top for a while, the goal is achieved and the creation of new accounts and publications ceases. Keeping

the hashtag in the top is a task with a longer perspective, requiring the number of mentions to be maintained at a specific level. Thanks to profile-purchase services, it is possible to develop more complex mechanisms of artificial activity and push opinions in a virtual environment: mass raids on a community to disable it, completed polls, inflated statistics. Hypothetically, one person can moderate a conversation from different accounts in order to attract an audience, writing opposing opinions as different people and staging a verbal skirmish or dispute. One of the strategic techniques is trolling, which can be divided into export and import. Exporting trolling means writing comments on third-party resources and platforms; importing trolling means using it on previously created private sites controlled by the same SMM campaign. Political trolls aim to convince the target audience that some opinion is generally accepted (Phillips, 2016). “Trolling behaviours are extremely diverse, varying by context, tactics, motivations, and impact. Definitions, perceptions of, and reactions to online trolling behaviours vary.” (Sanfilippo, Yang, & Fichman, 2017, p. 1802) “Tactics in the political trolling range from partisan baiting of ideological opponents into arguments, as on news forums and comments sections. Scholars studied political trolls targeting President Obama, associated with the Occupy movement or the 2011 London riots, and generating discussions about possible social issues, such as race, age, gender, and sexuality. Responses to trolling are as diverse as the behaviours themselves, with both preventative and remedial interventions.” (Sanfilippo, Yang, & Fichman, 2017, pp. 1802-1803) The objective of trolling is for Internet mercenaries to attack a service in order to lobby massively for positive opinions and attitudes, promote pages or spread spam; the main instruments are comments and ratings.
Political trolls exploit a fundamental passion of the individual: being with people, either with the majority or with a minority if it is an elite one (Phillips, 2016). The prospects for implementation are short- to medium-term. Any action within the network that comes from a fictitious account is pre-arranged and paid for, so each publication, comment or rating has its price, and the active period depends only on the budget allocated for trolling; hence the short- or medium-term use. Along with the creation of a single profile on the network, the creation of a community, or of a whole network of communities administered by fictitious accounts, does not seem to be a problem either. The task of a fictitious community is to cover as many network users as possible and broadcast content of a specific nature, in a hidden or open way. Here the prospects for

implementation are long-term. Implementing a network of communities requires a more extended preparation period, and the period of activity can be measured in years. The exchanges listed among the market leaders are entirely open and serve only as auxiliary tools within PR and promotion campaigns. Complex formations and organisations that operate for political purposes and perform strategic tasks exist in a hidden manner under cover of a legal organisation. Employees of a dark SMM company like the Olginsky trolls may, at the job placement stage, sign a non-disclosure agreement and undergo questioning with a polygraph. How can a fake profile be distinguished? Despite specialists’ and ordinary users’ understanding and theoretical knowledge of the mechanisms for creating artificial activity, the process of recognising fake accounts remains ambiguous and unconvincing; nevertheless, we will try to collect all known criteria for evaluating a page and suggest our own approach to recognising a bot. In modern conditions, no universal method or tool gives a one-hundred-per-cent guarantee that an account is conducted by a third party. Even the applications developed for detecting bots (Fakers App, VkFake), and compliance with all of the criteria listed below, do not give sufficient grounds to say that the account in front of you does not belong to the person on the avatar; the analysis can speak only of an increased probability of a fake. Signs of third-party account management in social networks (e.g., Vkontakte):
• As an avatar, the account uses a photo of celebrities or other people, candid images, or an abstract picture.
• Suspicious names or nicknames; suspicious personal information about the user, or a lack thereof.
• The number of friends and subscribers of a potential friend is in the thousands or, conversely, minimal (units).
• Among the friends and subscribers of the account are the same bots and artificially created profiles.
• There are no personal photos of the person on the page. Living people love to share holiday photos, family photos and photo reports from events.
• The account does not publish its own content, limits itself to reposts, or publishes a lot of advertising content.
• One of the most important criteria is the frequency of added materials. If the posts and photos on the page are all dated to a single day, it is likely that the profile was filled in once and immediately, which is not natural for real people, who gradually spread information about themselves, adding albums and publications over time.
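The criteria above amount to a heuristic checklist, and as an illustration they can be sketched as a simple scoring function. This is a minimal sketch under assumed field names, thresholds and weights; it is not a validated detector, and it is not how tools such as Fakers App actually work.

```python
# Illustrative sketch only: the fake-profile checklist above expressed as a
# heuristic score. All field names, thresholds, and weights are assumptions
# for demonstration; real detection tools are far more involved.

def fake_profile_score(profile: dict) -> int:
    """Return a rough 0..5 suspicion score; higher means more likely fake."""
    score = 0
    if profile.get("avatar_is_stock_or_celebrity"):   # stolen/abstract avatar
        score += 1
    if not profile.get("personal_info"):              # missing or suspicious info
        score += 1
    friends = profile.get("friend_count", 0)
    if friends < 5 or friends > 5000:                 # extreme friend counts
        score += 1
    if profile.get("original_posts", 0) == 0:         # only reposts/advertising
        score += 1
    dates = profile.get("post_dates", [])
    if dates and len(set(dates)) == 1:                # everything added in one day
        score += 1
    return score

suspect = {
    "avatar_is_stock_or_celebrity": True,
    "personal_info": "",
    "friend_count": 2,
    "original_posts": 0,
    "post_dates": ["2019-03-01"] * 40,
}
print(fake_profile_score(suspect))  # → 5
```

As the chapter stresses, even a maximal score indicates only an increased probability of a fake, never a guarantee.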


In conclusion, it should be said that the phenomenon of trolling can in most cases be considered precisely an aggressive form of social-communicative interaction. The purpose of trolling is to supply specific stimuli that cause adverse reactions from members of forums or other virtual communities. The goal of the troll himself is also apparent: obtaining some form of satisfaction from being, at that moment, the epicentre of the unfolding discussions and confrontations. The result of this kind of manipulative action is always a conflict with social overtones and consequences. The most successful trolls can agitate several communities at once, skilfully pushing them together and using public projections in the media to attract the attention of the general public. “Trolling is increasingly pervasive, indicating that efforts to stop trolls are relatively unsuccessful. Thus, there is a gap between scholarly understanding, public practice, and desired outcomes that support the development of appropriate responses to trolling. Specifically, there is a need to consider non-deviant, social and political trolling.” (Sanfilippo, Yang, & Fichman, 2017, p. 1803)


2. THE ROLE OF THE IRA IN RUSSIA

Russia’s image is determined not so much by its real strengths or weaknesses as by the perception of the Western recipient, which is shaped by historically established stereotypes and political prejudices. The mass media are the property of the political elite, which sets the information vector in pursuit of its goals. In this chapter, I will try to identify and analyse the methods used by the Russian media to create a modern image of Russia, as well as how representatives of one culture understand the image of another in this connection. Is there any difference in how Russians and Americans look at the same events, and if so, what explains it? To explore this, I analyse a well-known example of state-sponsored internet propaganda used by the Russian government and the most stable ideas about the IRA distributed by the American and European media.

2.1. IRA experience in creating artificial activity: structure of the organisation and implementation practice

An international conference on cybersecurity and counterterrorism was recently hosted (Westminster Town Hall Forum, 2018). One of the speakers, a former FBI agent and army officer, now one of the country’s leading experts in network hygiene, served on the Senate intelligence committee inquiry into Trump and Russia’s influence on elections. He conducted a master class for the dominant power structures of the United States, where he presented the idea that the Russians had created a unique system which is now being studied and adopted by other countries. It works at three levels: professionals (paid trolls, also known as “Trolls from Olgino” or “Olginsky trolls”), like-minded people, and the “useful masses”, who are the majority. The Olginsky trolls gather like-minded people into groups where they communicate, unaware that the group is organised by the Russians; information is then poured into the group, the like-minded people scribble it across the network, and the main work is done by the useful masses, who pick it up and distribute it to the rest of society. This creates a virus of the information the Kremlin needs. Since the beginning of 2010, after mass protests against dishonest elections, Russia has relied on the systemic manipulation of public opinion through such tools (Klishin, 2014). This work was recognised as so effective that it was decided to aim these weapons outside Russia, towards American and European audiences. The current surge of interest in “bots” and “trolls” promoting “Kremlin interests” on the Internet is associated mostly with the orientation of a part of them towards Western consumers. The “troll factory”, “Kremlin bots” or “Olginsky trolls” are the names given to the St. Petersburg Internet Research Agency (IRA), which manipulates public opinion through publications on the network.
The IRA was first reported on in a Russian newspaper in 2013. The office of the IRA was located in St. Petersburg in the district called Olgino (Garmazhapova, 2013); they have since changed addresses multiple times, but it remains common to call them “Olginsky trolls” after the location of their first registered office. From the information in the Russian Unified State Register of Legal Entities (2019), it follows that the organisation was registered on July 26, 2013. In 2014, Russian hackers from the group Anonymous International posted online the correspondence of IRA employees. The documents published by Anonymous International testified to a large-scale campaign to change attitudes towards Russia in the global information space. The agency’s identified owner is known as “Putin’s chef” (Uainfo, 2014). According to the Russian investigative journalist Timofeyeva (2018), in the beginning “200 people initially worked at the ‘factory’. However, they did it inefficiently. The main burden was laid on a team of professionals, who had about ten people in the company. Trolls were assigned to dump information into blogs and search for negative information on the network by keywords.” In 2015, the researcher and social media analyst Alexander Lawrence revealed more than 20.5 thousand Kremlin troll accounts. Studying the social connections of all such accounts, the researcher found that 2,900 of them formed a closely related group whose topology differs sharply from a natural one (a random selection of accounts). Lawrence applied the same technique to four phrases that users had identified as emanating from the Kremlin bosses - “about Novaya Gazeta”, battles near a certain town, the beginning of the “big war” - and an RSS error message that got into the stream.
From the four groups, it was possible to single out 17,590 “socially isolated” accounts: 93% of their profiles contained no location information, 96% no time zone, and the pattern held across 97% of the selected tweets. Moreover, although the accounts of the four groups had been selected on independent grounds, they all turned out to be closely related to each other. “This picture is very different from the random control sample: the final sample did not contain separate groups at all. It corresponds to the hypothesis that these are bots created by a common agent - and the evidence complex indicates Moscow.” (Lawrence, 2015) Later, Lawrence (2015) conducted a second study with the help of Google analytics and found a connection between the IRA and a network of Kremlin-related websites involved in spreading news mocking Ukrainian and Western leaders and praising Putin. Pro-Kremlin trolls are not limited to comments on the Internet: they troll through SMS, inventing man-made disasters and causing information panic in entire US cities. The journalist Chen discovered an interconnection between the St. Petersburg Internet Research Agency and the sensational political throw-ins distributed on social networks. According to Chen, the Russian “troll factory” is

not only writing political comments but also planning more complex information campaigns aimed at changing perceptions of Russia and putting pressure on the US president (Chen, 2015).
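The metadata-gap analysis described above - flagging a group of accounts as “socially isolated” when nearly all of them lack basic profile fields - can be illustrated with a small sketch. The field names and the 0.9 threshold below are assumptions chosen for demonstration only, not the analyst's actual method.

```python
# Sketch of the metadata-gap signal described above: within a candidate group
# of accounts, measure what share lack location and time-zone fields, and flag
# the group when the gaps far exceed any organic baseline. Field names and the
# 0.9 threshold are illustrative assumptions, not Lawrence's actual method.

def metadata_gap_fractions(accounts):
    """Return the fractions of accounts missing location and time-zone data."""
    n = len(accounts)
    no_location = sum(1 for a in accounts if not a.get("location")) / n
    no_timezone = sum(1 for a in accounts if not a.get("time_zone")) / n
    return no_location, no_timezone

def looks_coordinated(accounts, threshold=0.9):
    """Flag a group whose metadata gaps suggest mass-produced accounts."""
    no_location, no_timezone = metadata_gap_fractions(accounts)
    return no_location >= threshold and no_timezone >= threshold

# 93 of 100 accounts have empty metadata, echoing the proportions cited above.
group = [{"location": "", "time_zone": ""} for _ in range(93)] + \
        [{"location": "Kaunas", "time_zone": "EET"} for _ in range(7)]
print(looks_coordinated(group))  # → True
```

The point of the technique is comparative: a random control sample of organic accounts would not show such uniform gaps, which is why the pattern is treated as evidence of a common creator.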

2.2. Strategies and technologies of the IRA on social media

Internet users are daily confronted with fictitious activity on their pages: bots, inflated views, and propaganda pushed through various tools. Moreover, while users have already mastered the vocabulary for naming these phenomena, the corresponding work of systematisation, synthesis and analysis of these processes by media experts and researchers has not yet followed. “The primary target group for the St Petersburg troll factory seems to be ordinary citizens, but politicians and other public figures are targeted as well.” (Aro, 2016) Here is just one of the schemes: an account is created on Facebook with a picture of a person stolen from the network and a Ukrainian flag attached to the photo. The account remains active with permanent reposts of patriotic Ukrainian filler and membership in Ukrainian communities (from “Interesting Kiev” to “Lovers of Ukrainian embroidery”). Then come analysis of visitors and selection of the “like-minded”, communication in private messages, discussion of urgent matters, the building of virtual friendships, and the creation of the account’s own group - saying, for instance, “There is no corruption in Ukraine” - to which the like-minded are invited. At the same time, several such “avatars” are led by one person, and they actively “communicate” with each other. Any topic picked up by trolls looks like a real conversation, and it draws in “like-minded people” and unsuspecting guests. The main goal is the so-called “useful masses”, because they pick up the “stuffing” and spread it over the network. In addition to polarised electoral news there are fakes, which are also scattered over the network (Watts, 2018). “Disinformation is designed to manipulate the receiver's feelings. Younger and more visually oriented people are lured in with memes, caricatures, and videos. The messages conveyed by trolls’ memes are simple: Western political leaders are often depicted as ‘Nazis’ or ‘fascists’.
Images of corpses and alleged war crimes committed by Ukrainian soldiers are distributed, as well as photos of Ukrainian teenage girls wearing t-shirts with Nazi symbols on them - in reality, these have been edited in Photoshop.” (Aro, 2016) Another main tactic of Kremlin trolls on Facebook is mass complaints about the accounts of Russian oppositionists and of Ukrainian and Polish politicians. The complaints are processed automatically, so blocking is guaranteed. According to the worldwide community of Ukrainian and Russian FB users (2015): “An army of shills on state payroll has been daily submitting thousands of policy violation reports, targeting popular bloggers who dare to criticise the Russian government. Facebook indiscriminately reacts to these reports by blocking the accounts of prominent Ukrainian public

figures and Russian dissenters. Lately, the bans have become so frequent that we can now claim that Facebook has become an efficient tool of the Kremlin.” On April 4, 2018, Facebook blocked Kremlin trolls: Mark Zuckerberg (2018) reported that 270 accounts and pages related to the IRA had been blocked, and that a million people had subscribed to one of these Facebook pages. Activity by Kremlin trolls was also reported not only during the American elections but also during the elections in France and elsewhere. The number of Kremlin trolls on Twitter is estimated at approximately 18,000 accounts (Lawrence, 2015). “Social media attacks can be seemingly small, for example, a 140-character tweet. The influence of a small message can grow when it is repeated, and some trolls have called tweeters the same nasty names hundreds of times.” (Aro, 2016) On YouTube, the number of dislikes sometimes exceeds the number of views of a video, which causes cognitive dissonance - for example, with video bloggers whose subscriber counts are minimal but who hit a very sore point. The Observer YouTube channel, which released a video on how to use a program for identifying Kremlin trolls in comments, caught many dislikes. In the comments, Kremlin troll posts can gain hundreds of likes in a few minutes on videos with barely a few thousand views, thus giving themselves away; these comments invariably slander the opposition and express love for the authorities, or try to steer the discussion away from the essence of the problem. (Lawrence, 2015)
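The giveaway described above - comment likes far out of proportion to a video's own view count - can be expressed as a simple ratio check. A minimal sketch, assuming a hypothetical 5% threshold and data layout:

```python
# Sketch of the giveaway described above: comments whose like counts are
# implausibly high relative to the video's own view count. The 0.05 ratio is
# a hypothetical threshold chosen for illustration, not an established value.

def anomalous_comments(video_views, comments, ratio=0.05):
    """Return comments whose likes exceed `ratio` of the video's views."""
    return [c for c in comments if c["likes"] > video_views * ratio]

comments = [
    {"text": "great video", "likes": 12},
    {"text": "the opposition lies!", "likes": 450},
]
flagged = anomalous_comments(video_views=3000, comments=comments)
print([c["text"] for c in flagged])  # → ['the opposition lies!']
```

Such a ratio is only a weak signal on its own; organic viral comments can also accumulate likes quickly, so it would need to be combined with the profile-level criteria discussed in chapter 1.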

2.3. The geopolitical impact of the IRA: global trends

Using various languages, Russian trolls distribute false information and have had a significant impact on freedom of speech in many democratic societies (Aro, 2016).

Ukraine

This situation manifested especially vividly in connection with the protracted Russian-Ukrainian conflict. In the information sphere, the Russian mass media have gradually become a mechanism of interference in the internal affairs of Ukraine, as evidenced by the media’s position in, for example, covering the political crisis in Ukraine. The Russian Federation’s permanent attempts to strike at the image of Ukraine are aimed at limiting Ukraine’s reorientation towards economic, political and cultural cooperation with the Western community and at holding Ukraine within Russia’s sphere of influence. The Russian media cover events in Ukraine selectively, composing the desired picture out of appropriately selected real facts. The main aggressors are commercial media, in comparison with

which the position of the Russian state media can be called the most streamlined. The position of many Russian media in covering events in Ukraine is therefore determined primarily by the interests of the Russian authorities in preserving or establishing their influence, as well as by Russian capital in the Ukrainian market. Russian media illustrate Ukraine’s aspirations to join the European Union and NATO with images of Nazi soldiers and the NATO emblem against the background of the Ukrainian state flag. Apart from the domestic Russian level (influence on its own citizens), Russian information policy also has an internal Ukrainian level designed to influence Ukrainian citizens. The goal is to sustain a climate of fear, to counterpose people’s local and regional self-identification to an all-Ukrainian one, to preserve and deepen the inter-regional split, and ultimately to delegitimise Ukrainian power. The part of Ukrainian society that is under the influence of the Russian information environment - that consumes the products of Russian mass culture or watches Russian news and programs - actually receives the same information as the Russian population. This tendency can be defined as the modern phenomenon of “interpreting audiences”, since the world today no longer distinguishes between local and foreign, Russian or post-Soviet audiences. Appeal to the myths of everyday consciousness also determines the concrete forms in which the analysed scenario is realised, the defining ones being ostensibly non-propagandistic, non-ideological means of influence. These can be, say, retro films of the Soviet period, or informal interviews with scientists, pop stars, cultural figures, famous theatre actors and cinematographers.
The idea of “Soviet” solidarity arose as a spontaneous reaction of the mass consciousness against radical nationalism, as a protest against the artificial separation of cultures and people in favour of political games, ambitions and political fanaticism. However, there are no grounds to argue that the “Soviet” card is not used, or cannot in principle be used, as a planned scenario of targeted influence on the population of Ukraine - as a very effective means of cultivating civic indifference, that is, as an instrument capable of preventing the ever more expressive and massive process of civic self-identification in Ukraine. Today, Ukraine still actively and extensively uses the information space it shares with Russia. It is also no secret that this state of affairs not only brings significant benefits but at the same time generates numerous and very significant problems. Among the latter, there are grounds for highlighting the problem of the various types of influence that are deliberately or inadvertently exercised (or can be exercised) through this information space. “One of the goals of info-war is to create chaos not only in the information sphere but also within society itself.” (Aro, 2016) It is well known, for example, that any state intending to remain independent cannot but aim to control its information space. Such an aspiration is caused by many diverse factors which share one common feature: all of them, ultimately, directly or indirectly determine the security of the state, and therefore deserve at least to be recorded and comprehended. It is quite clear that the active use of an information space shared with another state necessitates awareness of both the real and the potential threats of such use.

USA

Studying fake accounts on Twitter, Chen (2015) found among the tweets references to a high-profile event: a fake technological disaster in Louisiana that allegedly occurred on September 11, 2014. “The Columbian Chemicals was not some simple prank by a bored sadist. It was a highly coordinated disinformation campaign, involving dozens of fake accounts that posted hundreds of tweets for hours, targeting a list of figures precisely chosen to generate maximum attention. And the hoax was just one in a wave of similar attacks during the second half of last year.” (Chen, 2015) The news about the catastrophe in Louisiana did not come from a satirical fake-news site but was planned disinformation. Residents were sent SMS messages about an explosion allegedly happening at the local chemical plant, and reporters received tweets asking them to tell the details of the incident. As indirect evidence, the organisers of the information attack used specially created sites of local TV channels and companies, as well as a story allegedly released on an Arab TV channel reporting that ISIS had taken responsibility for what happened (Lawrence, 2015). In their posts, they indirectly blamed Obama for the events, for example asking him to bomb Iraq in return. Later, the same network spread false reports about an outbreak, as well as information about an unarmed black woman allegedly shot by a police officer. According to Chen (2015), the group played on the real fears of American society to ignite a wave of discontent, but used YouTube videos of dubious quality for this.

The election in the USA, 2016

“There is no denying the role of social media in shaping what we think about politically. A recent investigation into Russian meddling in the 2016 election revealed that Russian agents intending to sow discord among American citizens disseminated inflammatory posts that reached 126 million users on Facebook, published more than 131,000 messages on Twitter and uploaded over 1,000 videos to Google’s YouTube service.” (Isaac, Wakabayashi, 2017) The Troll Factory was mentioned in the public part of the report of the US special services on Russia’s interference in the election of the American president, published in early January 2017 (Sakharov, Rysayeva, 2017). The other important investigation is the Mueller report, a 448-page document that does not bring charges against Trump for the criminal offence of obstruction of justice; at the same time, however, it says that the facts do not permit a definite conclusion about the president’s innocence. The report lists several situations that, in the opinion of the Mueller team, could potentially contain signs of obstruction of justice. The document contains already known data on Russian interference in the elections, concerning two main episodes - the actions of Russian hackers and of the so-called “trolls”: the actions of the IRA and of the entrepreneur Yevgeny Prigogine, who financed the “troll factory” and is, as is well known, connected with the Russian president; the actions of Russian hackers against the election headquarters of the Democratic candidate and the US Democratic Party; the role of Wikileaks; and the contacts of Russian representatives with Trump’s election headquarters before and after his election as president of the United States. (Mueller, Helderman, Zapotosky, 2019)


3. REFLECTION ON ACCOMPLISHED CREATIVE APPLIED PROJECT

3.1 Methodological framework

The empirical part of the thesis consists of a qualitative analysis of interview data. Research strategy: case study; methods: content analysis, interviews, and secondary data analysis. The interviews were analysed from a variety of perspectives - for instance, content analysis was used to examine the interviews and identify central themes. In analysing the interviews, a combination of two strategies was adopted: inductive and deductive approaches. The reflection concerns an accomplished documentary film based on interviews conducted during March and April 2019 in Kaunas, Lithuania. In a certain way, the methodology applied in my thesis differs from the methodological choices of the majority of scholars. To conduct this research, primary and secondary data were used: interviews with experts, news and broadcast materials. The applied project aims to analyse the opinions held by experts on trolling and disinformation strategies, and particularly the case of the IRA as an example of influencing operations in social media. The Baltic states, as post-Soviet states, have significant populations of Russian speakers; Lithuania in particular, where Russians are the second largest ethnic minority after Poles, is at risk of Russia strategically influencing public opinion (Ethnicity, mother tongue and religion, 2019). The Andrei Sakharov Research Center for Democratic Development, with Robert van Voren and Dainius Genys, is currently working on research called “Mapping Russian Trolls in Lithuania: how social networks are used for propaganda purposes?”, conceptualised around the activity of paid trolls and the influencing of public opinion in social media - particularly the case of the IRA - and focused on the problem of mapping pro-Kremlin trolls through media channels and their impact in Lithuania.
The qualitative analysis of the interview data is used to evaluate the effectiveness of Russian attempts to affect domestic politics in democratic societies and their role during the main geopolitical events of the twenty-first century, as well as to identify future directions for the research field. This investigation illustrates organised trolling efforts and their outcomes in numerous countries. Semi-structured interviews for the documentary were conducted with five respondents: experts in the field of propaganda and cybersecurity and a former worker of the IRA.

3.2 Propaganda and political persuasion

Professor Gintautas Mazeikis, the author of numerous studies on propaganda and political theory, considers that the term propaganda means strategic persuasion, or persuasion based on a central ideology and a central vision. He distinguishes propaganda from simple persuasion,

advertisement, and agitation. If a random persuasion or random influencing communication happens, we do not call it propaganda, merely an expression of people’s opinion: “If we would like to find is it propaganda or not propaganda, to evaluate the situation, first of all, we need to explain motives, the motivation of this action and to show to what kind of politics or what kind of ideology it corresponds.” (Mazeikis, 2019) Propaganda is also divided by purpose into long-term, mid-term and short-term propaganda; according to Mazeikis, contemporary propaganda ministries and agencies are mostly oriented towards the short-term effect. The short-term effect presupposes three instruments: first, fake news; second, trolls, that is, inflammatory comments on social networks; and third, the imitation of events. The last is not just fake news but staged events drawn into a more or less influential public discussion, to show that an event happened; nobody will deny that the artificial event happened, because it was an initialization, a staging. They then immediately start to discuss this provocation as reality, because it is quite challenging to prove that the event was staged. This distinction matters because only very “yellow” web pages or troll companies work at the level of outright fake news. A more or less serious television or radio company, or a more or less trusted newspaper, even in Russia, needs not fake news but organized, staged events, for instance the Columbian Chemicals hoax (Chen, 2015) described in the second chapter. For this reason, they use different tactics than trolls.
“So you see these trolls means comments on fake news and serious newspaper they need not a troll and fake news, but they need more or less trusted information, and for this reason they need to organize the events, it means that propaganda organize the event, they pay money, they invest in chaos, they invest into action, and then after it they take some interview on the field, and then trying to comment on more or less trusted newspapers, so to translate to the world, to the whole world, that this situation happened.” (Mazeikis, 2019) There is a multiplicity of approaches, and every approach is used for different purposes. According to Ignas Kalpokas, the principle of hacking and trolling is quite similar: in conventional hacking the aim is to hack computer systems, while with trolling and fake news the aim is essentially to hack the human mind. This is particularly true today, when it is quite easy to target specific audiences: through data analysis it is not particularly challenging to identify what sort of preferences, expectations and hopes your target audience has, and then you can tailor the message to address what the audiences think and, most importantly, how the audiences feel. With thought, you can apply some critical thinking and consider whether a message sounds legitimate or not, but when emotions are targeted, even the most critically minded can be affected, provided the right triggers are placed at the right time.

“Whether trolling or fake comments and things like that I think you get an even further inroad into human mind because it does not seem like a message coming from some either official source or self-interested politician or whatever. It seems to be coming from somebody just like you, sort of concerned individuals who were sharing their experiences, their information, their thoughts. However, of course, all of those have been strategically placed to be discovered by the target audiences in question. So surely there is part of the danger in that.” (Kalpokas, 2019) One branch of trolling activity is primarily focused on spreading chaos among particular audiences: provoking moral or real panic by reporting fake events and catastrophes within a target audience, or distracting public attention from something happening in the background. Domestic politicians sometimes use the same device, artificially creating a small scandal in order to distract attention from the actual policies being passed. This smokescreen function of trolling certainly exists, but at the same time much trolling activity should be seen as part of a long-term plan to sway public opinion in predetermined ways, mainly at the level of emotions, where it is more difficult to resist. With sentiment analysis, and data analysis more broadly, basic emotions are quite easy to gather and analyse, so trolling communication models can be run and simulated. Many things can be adjusted in trolling activity, but either way they typically work towards eroding trust within society and, more broadly, trust between the government and society. Provoking those emotions is one of the trolls’ main tasks, because the general idea of any trolling activity is to divide not only opinion but society at large.
The general message is “Do not trust the system”, whether it is the system in Lithuania or in the United States: “They want to they ask a different kind of stories usually, they want to provoke different emotions, among different social circles within the country. So that is why I think we need a more scientific approach to demonstrate and in other words to rationalise their activity, to rationalise the impact that they make, cause now it is not very difficult to become hysteric speaking about the trolls. However, I think that is one of the emotion that they want to provoke for you or me. They want to eliminate the rational debate.” (Genys, 2019) By contrast, Robert van Voren emphasises that hacking differs remarkably from trolling: hacking is very often focused on messing up a system, getting into a system and then either posting false information or simply making sure that the system no longer functions, which is quite evident.

Moreover, sooner or later, usually very soon, you see that the system has been hacked and does not function. In the case of trolling, when it is done professionally, it takes quite a while for people to find out that trolling is taking place. A considerable portion of the audience does not even understand that they have been trolled, or that the information is coming from a trolling source and is not real, truthful information. Trolling is, or has to be, most effective during the first period after it has been implemented, because people do not yet recognise it as trolling and no countermeasures yet exist to limit its effect or to ban the sources of trolling from social media. All the countries being targeted start to realise that it has taken place and how, and develop mechanisms to counter it. (Van Voren, 2019)

3.3 Russian interference in foreign policies

We know that Russians are meddling with domestic politics in various countries. There is no doubt that the goal is to divert the attention of debaters from criticism of Russia and at the same time to praise Russian politics and Putin. At the same time there is criticism of the US and the European Union; the statements are structured to cause the audience to mistrust the policies of Western countries. The St. Petersburg troll factory has already been accused of interfering with government processes. We witnessed a bit of that during the last elections in both Germany and France. In Germany, there was a hack into the Christian Democratic party’s email accounts and some documents; at first it was thought they would be used against Angela Merkel and her party, but this did not turn out to be damaging, as perhaps not much compromising information was found. There was a similar hack during the French presidential election campaign targeting Emmanuel Macron’s team, but because the French press was forbidden to publish anything election-related immediately before the elections, the damage was again contained. There were attempts, but they were not particularly successful, certainly in comparison with the Democratic National Committee hack in the USA, which was quite damaging to Hillary Clinton’s campaign. So there are attempts to influence political processes and elections in the EU. (Kalpokas, 2019) If we look at the American elections of 2016, we may think that it is the first time a foreign power tried to influence elections in the United States, the first time these types of measures were used, and the first time they were used at this scale, but influencing foreign elections is actually long-standing practice, part of life. “I am convinced that Moscow is trying to influence, but not as massively as in the past, not as successfully.
The second part is that the trolling they are now doing is not even trying

to oppose a truth with their own one fake truth. The whole campaign is bringing so many varieties of the story that people stop believing in things altogether. It is destroying truth as a factor.” (Van Voren, 2019) After the intervention in the US elections was practically proven, there is no more doubt about the influence: there were trolls, there were attacks on Facebook. In February 2018, a grand jury of the federal court in the US officially indicted the IRA for interfering in the 2016 US presidential election: “Perhaps its scale is somewhat exaggerated, perhaps the “troll factory” has become only one link in the chain of interference in elections. However, the fact that no one doubted this, and if we look at the last parliamentary elections in the US, then the track of the trolls was no longer visible, I do not think that it was, at least we did not notice them, the journalists did not see them.” (Bespalov, 2019) It is clear that Lithuania is one of the target countries. On top of that, Lithuania has a fairly outspoken president who is, let us put it that way, not the most loved person in the Kremlin. Only by understanding the ways this trolling takes place, its extent, and the way it works in the country can countermeasures be developed; research is needed in order to say which measures are required to counteract it. These debates are constant in Lithuania, but there are no data on how deeply public opinion is being trolled. There are certain commentators and certain articles by specialists who follow the problem, but still no solid account of the issue. (Van Voren, 2019) We still need to talk about practices and methodologies which would help to demonstrate the Russian impact.
There are different rumours, and there are some initiatives that have been dealing with Russian involvement and with the trolling of our public perception, public view, and public attitude. (Genys, 2019) The research “Mapping Russian Trolls in Lithuania: how social networks are used for propaganda purposes?” is currently a small project initiative of the Sakharov centre in Kaunas, carried out by Robert van Voren and Dainius Genys in cooperation with the Dutch journalists Robert van der Noordaa and Coen van de Ven. Their first attempt to map Russian trolls in social media was made in the Netherlands during the general elections in March 2017, where they analysed 3 million tweets. The research focused on Twitter because it is the only social network that opens up its archive and allows various analyses of it. They use open sources and analytical instruments such as Facebook graphs and others available to the public. The Lithuanian research will be accomplished by analysing the activity of Russian trolls on Twitter. Its findings can help scientists to better understand the ongoing IRA activities in Lithuanian social media. The main task of the research is to find out to what extent Russian trolls are involved in the Lithuanian public sphere and to identify the questions they are interested in, in other words the trolling habits and behaviours in the Lithuanian public sphere. The scientific idea of the project is to see to what extent the trolls want to impact the various perceptions of our society, what kind of sources they are interested in, what kind of subjects they elaborate, and whether the impact they make can be grasped. No systematic research had been done before, so the goal of the project is to enter the field and make it more reliable, more scientific.
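To illustrate the kind of archive analysis such a mapping project relies on, a minimal sketch is given below. It is not the project's actual pipeline: the account names, tweet texts and the "suspected" list are invented for illustration, and a real study would run over millions of exported tweets rather than an inline sample.

```python
# Minimal sketch (assumed workflow, not the project's real code) of
# screening a tweet archive for coordinated troll activity: count how
# often each account posts, and which hashtags suspected accounts push.
from collections import Counter

def screen_archive(tweets, suspected_accounts):
    """Return per-account tweet counts and hashtag frequencies
    for the suspected accounts."""
    activity = Counter(t["user"] for t in tweets)
    hashtags = Counter(
        tag.lower()
        for t in tweets
        if t["user"] in suspected_accounts
        for tag in t["text"].split()
        if tag.startswith("#")
    )
    return activity, hashtags

# Invented sample data standing in for an exported tweet archive.
sample = [
    {"user": "acct_a", "text": "Do not trust the system #distrust"},
    {"user": "acct_a", "text": "Elections are rigged #distrust #chaos"},
    {"user": "acct_b", "text": "Nice weather in Kaunas today"},
]
activity, hashtags = screen_archive(sample, {"acct_a"})
print(activity["acct_a"])       # how active the suspected account is
print(hashtags.most_common(1))  # its most pushed hashtag
```

A real analysis would of course add sentiment and network measures on top of such raw counts, but the basic idea of aggregating account activity and recurring topics is the same.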
As noted above, the trolls want to divide not only opinion but society at large; the general message is “Do not trust the system”, whether in Lithuania or the United States, and they want to provoke different emotions among different social circles within the country: “I think that what the colleagues did in the United States, they stick to the same approach, they wanted to see what happened in reality: what kind of impact did Russian trolls in the United States in the process of election because now there are different stories, different rumours. On the one hand, we have stories that Russians have made the biggest impact in favour of Donald Trump.” (Genys, 2019)

3.4 Evaluation of the effectiveness of mechanisms for creating artificial activity during the socio-political campaigns in Ukraine

Political events in Ukraine, where an acute political crisis continues day after day, remain at the centre of attention of the Russian and international media, on the front pages of newspapers and in numerous comments. The events in Ukraine revealed that a new type of war is unfolding, in which the primary means of suppressing the enemy are disinformation and various electronic means of attack. According to Gintautas Mazeikis, in the Ukrainian case, especially in the sphere of social media, we meet two types of people’s action. One is a very spontaneous reaction to events, just an expression of opinion and emotions. On the other side is specially organized persuasion on certain social media addresses, for example VKontakte: Russia organized a lot of groups which are coherent, similar to each other, and which all say the same thing, and this is the reason why it can be called propaganda. “For example, we could take some information from Kremlin, official speakers from Kremlin, then we could take from so-called Donbass or from Lugansk informational agencies or

their propaganda ministries or whatever they call themselves if we compare their text they consist 99% Kremlin persuasion.” (Mazeikis, 2019) If we pay attention to whom they address, it is not all of Ukraine: first of all, they address the people who read the texts and messages from so-called Novorossiya, meaning that they want to persuade, first of all, the people attracted by this one problem. They use completely different approaches to Kiev, to Western Ukraine, and indeed to all of Ukraine except the Donbass region, just as they use a different approach to the Baltic states or Poland. “First of all they work with these countries in order to discredit and criticize local power to show how it’s stupid: contemporary Ukrainian president and to influence in some way presidential election or criticize so-called Banderovci, or to criticize green brothers, or forest brothers from Lithuania after second world war, or to criticize pro-Nazi position of Estonian and Latvian volunteers, to show Estonian and Latvian exceptional nationalism and even fascism, to try to discredit Baltic economy, to show the impossibility of Ukraine to build normal European economy.” (Mazeikis, 2019) In other words, they use one strategy of criticising, discrediting and delegitimising local power, creating as much chaos and distrust as possible; these are two different approaches. If we take the so-called Novorossiya texts on social networks, we read endless heroic texts about how they sacrifice themselves in the battles against Maidan Ukraine, how they fight for the Russian truth and the Russian spirit, how heroic they are, and how villainous Ukrainian soldiers and Ukrainian volunteers are.
Looking at the Ukrainian elections after Maidan, Russia has been trying to influence them not only by spreading false information but also by creating crises, for instance by creating war or by creating internal opposition which turns out to be funded by outside sources. (Mazeikis, 2019) Despite some negative factors, in particular influence from Russia and the spread of fakes, the 2019 elections in Ukraine ended in an overwhelming victory for Zelensky with 73.3% of the vote. Dissatisfaction and disappointment over the results of the last five years led to a choice against the former elite: “I think the main message and the outcome of the elections are some people call it “electoral Maidan”, you know it revolves against Poroshenko period, people completely fed up with Poroshenko and his policies and anything is better than a continuation of the Poroshenko period while the only difference or fundamental difference between them is that Zelensky is talking about a normal country and Poroshenko with his fake patriotism is talking about a Great country.” (Van Voren, 2019)


This is a much more sophisticated situation than in 2014. Right after Maidan we had black and white; now there is no black and white, it is a much more complicated political setting. The context in which these elections took place matters, especially the situation in Crimea and in some eastern parts of the country. Ukraine is in a state of war: Russia sent troops for the annexation of Crimea, occupied areas of the Donetsk and Lugansk regions, and constantly provides all kinds of military and political support to the separatists. The direct consequences are 13 thousand victims, tens of thousands of wounded, millions of refugees and internally displaced persons, massive violations of human rights, a humanitarian catastrophe, and destruction: this is the context of these elections. Zelensky won the election on the basis of a virtual programme; nobody actually knows what he wants, but to a large degree his success is the result of social media and media as such. The TV series “Sluga Naroda” made him well known, and on the base of that wave he very nicely managed to get himself elected. “I think it is the social media campaign around Poroshenko played a major role in Zelensky’s victory, so not was the Moscow was doing but what was Poroshenkoists were doing. Part of them are former Maidanovtsi, but I know that the large part of them are very disappointed with Poroshenko because there were promises after Maidan and most of them were never fulfilled.” (Van Voren, 2019) This whole attempt to build Poroshenko up through social media as a prominent patriot, crediting him with things he did not do, had the contrary effect and pushed Zelensky much higher than he would otherwise have been. Moscow rejected the claim that Russia interferes in the elections in Ukraine through social networks. For Moscow, Zelensky is much worse than Poroshenko, because Putin and the FSB leadership need clarity about what the policy is.
With Poroshenko everything is clear about his strategy, with Zelensky nothing is clear, it is one big black hole. “For Moscow, the outcome that in Ukraine 5 years after Maidan you have democratic elections which run smoothly, really democratic, and a Jew becomes next president of Ukraine. You know, if you for years have been telling that this is a fascist country run by Junta, and suddenly this fascist country run by Jew, you know, it is complicated. You need a little bit more than a troll factory to deal with it.” (Van Voren, 2019)


CONCLUSIONS

The purpose of the research was to consider the theory and practice of influencing public opinion on social media. The main goal was to clarify what technologies political manipulators use to achieve their goals and objectives. In the modern world, the struggle for markets, energy sources and the expansion of political influence continues, and in this arena the interests of states, corporations and, ultimately, individuals clash. In this confrontation, propaganda is an essential tool and often an effective weapon for achieving goals. It is therefore important to study the experience of conducting propaganda during political warfare, that is, at the time when it was most in demand and became, in fact, a matter of vital necessity. The dynamics of the processes taking place in modern society require not only an adequate presentation of facts to the public but, given the increasing number of events, information flows and means of their expression, also a competent, timely and impartial selection and interpretation of these events and facts, as well as an analysis of their causes and possible consequences; in other words, the role and responsibility of the journalist is also increasing. The influence of the media on everything that happens in society has become universal; it has become the primary information weapon in the modern world. Estimates of the growing influence of the media on politics and society are far from unequivocal, if not opposite. Some authors see in it the sprouts of a new, higher and more humane civilisation, the information society, and a real movement towards an anti-bureaucratic, creative state that can successfully resolve the most acute conflicts of the modern world. Other experts, noting the devastating and destructive impact of the media on the individual, society and culture in general, assess the growing role of information power more pessimistically. Experience shows that the media can potentially serve various purposes.
On the one hand, they can educate people, support their competent participation in public life, and contribute to personal development. On the other hand, as often happens today, they can spiritually enslave, misinform, sometimes unwittingly incite mass hatred, and sow mistrust and fear. Nowadays Russia is experiencing a quantitative growth of means and forms of political manipulation, due to the development of new information technologies. How should the Kremlin’s trolling and disinformation strategies be countered? “As trolling, hacking and other oppression techniques will only get worse in the future, governments need to find ways to defend individuals from information attacks.” (Aro, 2016) In general, it can be stated that democracy and liberalism in modern Russia are still at an early stage of development. The low level of legal and political education of citizens will encourage the manipulation of the mass political consciousness of voters and inhibit the development of liberal and democratic principles in modern Russian society. The imperfection of the regulatory framework, along with its constant change, will on the one hand facilitate manipulation by its subjects and, on the other, leave the mass political consciousness of its objects unprotected from manipulation, since the latter will not be able to follow the sequence of changes in the regulatory framework, the legal acts, and the fundamentals of the electoral process in the Russian Federation. Political advertising in an election campaign is based on the principle of a marketing approach to information, and there are advertising technologies that manipulate human consciousness.
Based on the above, I can conclude that there is a rich arsenal of manipulative technologies that do not always act lawfully. The possibilities of manipulation are great, but not limitless. The limits of manipulation depend on a person’s consciousness, stereotypes and views; obstacles in the manipulators’ path can be people’s own experience and systems not controlled by the authorities: family, friends, and the like. The topic I have tried to examine in this thesis will continue to be researched to a more significant extent in the future. The main task is to find out to what extent Russian trolls are involved in our public sphere and which questions they are interested in; in other words, we hope to grasp the trolling habits and behaviours in the Lithuanian public sphere. The further goal would be to publicise the results of this kind of research, showing them both to the public and to specific institutions, including universities and governmental institutions, so that the government can begin a focused effort to combat disinformation. The thesis has provided a current perspective on the influencing operations of the IRA in social media and on the research mapping Russian trolls. This contribution might assist future researchers in examining similar topics, as well as informing the public of the available evidence.


REFERENCES

1. Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), 211-236. DOI: https://doi.org/10.1257/jep.31.2.211.
2. Aro, J. (2016). The Cyberspace War: Propaganda and Trolling as Warfare Tools. European View, 15(1), 121-132. DOI: https://doi.org/10.1007/s12290-016-0395-5.
3. Asenas, J. J., & Hubble, B. (2018). Trolling Free Speech Rallies: Social Media Practices and the (Un)Democratic Spectacle of Dissent. Taboo: The Journal of Culture and Education, 17(2). DOI: 10.31390/taboo.17.2.06.
4. Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122-139. DOI: https://doi.org/10.1177/0267323118760317.
5. Chadwick, A., Vaccari, C., & O’Loughlin, B. (2018). Do tabloids poison the well of social media? Explaining democratically dysfunctional news sharing. New Media & Society, 20(11), 4255-4274. DOI: https://doi.org/10.1177/1461444818769689.
6. Chan, R. (2012, May 21). Retrieved April 08, 2019, from https://www.youtube.com/watch?v=qQelMU9S_ZM&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=39.
7. Chen, A. (2015, June 02). The Agency. The New York Times. Retrieved March 13, 2019, from https://www.nytimes.com/2015/06/07/magazine/the-agency.html?_r=1.
8. CNN. (2013, April 05). Retrieved May 02, 2019, from https://www.youtube.com/watch?v=sZdxnQhnXH8&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=38.
9. DiResta, R., & Ruppel, B. (2018). The Tactics & Tropes of the Internet Research Agency (pp. 1-101, Rep.). United States: New Knowledge.
10. Endsley, M. R. (2018). Combating Information Attacks in the Age of the Internet: New Challenges for Cognitive Engineering. Human Factors: The Journal of the Human Factors and Ergonomics Society, 60(8), 1081-1094. DOI: https://doi.org/10.1177/0018720818807357.
11. Ethnicity, mother tongue and religion. (2013, March 15). Official Statistics Portal. Statistics Lithuania.
Retrieved April 5, 2019, from https://osp.stat.gov.lt/informaciniai-pranesimai?articleId=223122.
12. Firn, M. (2013, August 13). North Korea builds online troll army of 3,000. Retrieved April 10, 2019, from https://www.telegraph.co.uk/news/worldnews/asia/northkorea/10239283/North-Korea-builds-online-troll-army-of-3000.html.
13. Garmazhapova, A. (2013, September 09). Где живут тролли. Как работают интернет-провокаторы в Санкт-Петербурге и кто ими заправляет (Where are the trolls: The internet

provocateurs in St. Petersburg and who funds them). Novaya Gazeta. Retrieved February/March, 2019, from https://www.novayagazeta.ru/articles/2013/09/09/56265-gde-zhivut-trolli-kak-rabotayut-internet-provokatory-v-sankt-peterburge-i-kto-imi-zapravlyaet.
14. Gaufman, E. (2018). The Trump carnival: Popular appeal in the age of . International Relations, 32(4), 410-429. DOI: https://doi.org/10.1177/0047117818773130.
15. Hajizade, A. (2018, January 17). ANALYSIS: Unveiling Iranian pro-government trolls and cyber-warriors. Retrieved April 10, 2019, from http://english.alarabiya.net/en/perspective/features/2018/01/17/ANALYSIS-Unveiling-Iranian-pro-government-trolls-and-cyber-warriors.html.
16. Hunter, I. (2015, June 06). Turkish ruling party's social media campaigners deny being a troll. Retrieved April 10, 2019, from https://www.independent.co.uk/news/world/europe/turkish-president-s-social-media-campaigners-deny-being-a-troll-army-10301599.html.
17. Isaac, M., & Wakabayashi, D. (2017, October 30). Russian Influence Reached 126 Million Through Facebook Alone. The New York Times. Retrieved March 03, 2019, from https://www.nytimes.com/2017/10/30/technology/facebook-google-russia.html.
18. Klishin, I. (2014, May 21). Максимальный ретвит: Лайки на Запад (Maximum retweet: Likes for the West). Vedomosti. Retrieved March 24, 2019, from https://www.vedomosti.ru/newspaper/articles/2014/05/21/lajki-na-zapad.
19. Kostyuk, N., & Zhukov, Y. M. (2017). Invisible Digital Front: Can Cyber Attacks Shape Battlefield Events? Journal of Conflict Resolution, 63(2), 317-347. DOI: https://doi.org/10.1177/0022002717737138.
20. Kurowska, X., & Reshetnikov, A. (2018). Neutrollization: Industrialized trolling as a pro-Kremlin strategy of desecuritization. Security Dialogue, 49(5), 345-363. DOI: https://doi.org/10.1177/0967010618785102.
21. Lawrence, A. (2015, April 03). Social Network Analysis Reveals Full Scale of Kremlin's Campaign · Global Voices.
Retrieved March 21, 2019, from https://globalvoices.org/2015/04/02/analyzing-kremlin-twitter-bots.
22. Lawrence, A. (2015, July 13). Open-Source Information Reveals Pro-Kremlin Web Campaign · Global Voices. Retrieved March 21, 2019, from https://globalvoices.org/2015/07/13/open-source-information-reveals-pro-kremlin-web-campaign.
23. Mejias, U. A., & Vokuev, N. E. (2017). Disinformation and the media: The case of Russia and Ukraine. Media, Culture & Society, 39(7), 1027-1042. DOI: https://doi.org/10.1177/0163443716686672.


24. Mueller, R. S., Helderman, R. S., & Zapotosky, M. (2019). The Mueller report. New York, NY: Scribner, an imprint of Simon & Schuster.
25. Myr, I. (2015, June 22). Retrieved April 19, 2019, from https://www.youtube.com/watch?v=QNXMQZ0-_wA&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=56.
26. National, C. N. (2018, March 05). Retrieved March 14, 2019, from https://www.youtube.com/watch?v=jEPNdjs6b74&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=59&t=0s.
27. News, V. (2017, April 03). Retrieved April 04, 2019, from https://www.youtube.com/watch?v=jlstDYFcJZ0&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=17.
28. NewsHour, P. (2017, July 11). Retrieved March 15, 2019, from https://www.youtube.com/watch?v=xSIkkza9TVI&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=55.
29. Newsnight, B. (2017, May 26). Retrieved March 02, 2019, from https://www.youtube.com/watch?v=Yk8YWyihgno&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=16.
30. Phillips, W. (2016). This is why we can't have nice things: Mapping the relationship between online trolling and mainstream culture. Cambridge, MA: The MIT Press.
31. Placido, D. (2017, July 24). Duterte: No money for trolls, rigged surveys. Retrieved April 7, 2019, from https://news.abs-cbn.com/news/07/24/17/duterte-no-money-for-trolls-rigged-surveys.
32. Riley, M., Etter, L., & Pradhan, B. (2018, July 19). A Global Guide to State-Sponsored Trolling. Retrieved March 19, 2019, from https://www.bloomberg.com/features/2018-government-sponsored-cyber-militia-cookbook.
33. Russell, A. (2018). 'This time it's different': Covering threats to journalism and the eroding public sphere. Journalism, 20(1), 32-35. DOI: https://doi.org/10.1177/1464884918809245.
34. Russian Unified State Register of Legal Entities (USRLE/USRIE). Retrieved March 25, 2019, from https://egrul.nalog.ru/index.html.
35. Sakharov, A., & Rysayeva, P. (2017, March 24). Расследование РБК: как из «фабрики троллей» выросла «фабрика медиа» (Investigation of RBC: How the "media factory" has grown from the "troll factory"). RBC.
Retrieved April 14, 2019, from https://www.rbc.ru/magazine/2017/04/58d106b09a794710fa8934ac.


36. Sanfilippo, M. R., Yang, S., & Fichman, P. (2017). Managing Online Trolling: From Deviant to Social and Political Trolls. Proceedings of the 50th Hawaii International Conference on System Sciences (2017), 1802-1811. DOI: 10.24251/hicss.2017.219.
37. Sherr, J. (2015). The New East-West Discord: Russian Objectives, Western Interests (pp. 1-76, Rep.). Clingendael, Netherlands: Netherlands Institute of International Relations Clingendael.
38. Stewart, L. G., Arif, A., & Starbird, K. (2018). Examining Trolls and Polarization with a Retweet Network. In Proceedings of the WSDM Workshop on Misinformation and Misbehavior Mining on the Web (MIS2) (pp. 1-6). New York, USA: ACM. DOI: https://doi.org/10.475/123_4.
39. Szostek, J. (2016). News media repertoires and strategic narrative reception: A paradox of dis/belief in authoritarian Russia. New Media & Society, 20(1), 68-87. DOI: https://doi.org/10.1177/1461444816656638.
40. The worldwide community of Ukrainian and Russian FB users. (2015). Stop political blocking on Facebook. Retrieved April 04, 2019, from https://www.change.org/p/facebook-stop-political-blocking-on-facebook.
41. Timofeyeva, E. (2018, November 9). Один из создателей «фабрики троллей» рассказал о работе на Пригожина (One of the creators of the "troll factory" talked about working for Prigozhin). Snob. Retrieved March 19, 2019, from https://snob.ru/news/167926.
42. Uainfo. (2014, May 31). В Сети действительно работает банда кремлевских троллей. Имена, адреса, документы (A gang of Kremlin trolls really does operate on the web. Names, addresses, documents). Retrieved March/April 2019, from https://uainfo.org/blognews/332255-v-seti-deystvitelno-rabotaet-banda-kremlevskih-trolley-imena-adresa-dokumenty.html.
43. Vox. (2018, April 24). Retrieved March 03, 2019, from https://www.youtube.com/watch?v=q1Hl9bRzwEs&list=LLdE_LxGaZ6d9PaNPGEwx94A&index=57.
44. Watts, C. (2018). Messing with the enemy: Surviving in a social media world of hackers, terrorists, Russians, and fake news. New York, NY: Harper.
45. Westminster Town Hall Forum (2018, September 27). Clint Watts - Hackers, Terrorists, Russians, and Fake News. Retrieved March 16, 2019, from http://www.westminsterforum.org/forum/hackers-terrorists-russians-and-fake-news.
46. Zeitzoff, T. (2017). How Social Media Is Changing Conflict. Journal of Conflict Resolution, 61(9), 1970-1991. DOI: https://doi.org/10.1177/0022002717721392.
47. Zuckerberg, M. (2018, April 3). Today we're taking an important step to protect the integrity of elections. Facebook. Retrieved April 05, 2019, from https://www.facebook.com/zuck/posts/10104771321644971.

APPENDICES

Vitaly Bespalov, 07/03/2019

"I ended up there completely by accident. I had moved to St. Petersburg from the city of Tyumen, where I studied at university. I moved to St. Petersburg, was looking for a job, and quite by chance came across a vacancy on the Headhunter website (https://hh.ru). The vacancy was called 'content manager' and was no different from other similar vacancies except for the pay, which was considerably higher than usual for that position. At the time it was 45,000 rubles, and all the rank-and-file employees of that building received that money; those who held higher positions received 65,000, 75,000 and so on. Forty-five thousand rubles at the exchange rate of the time was roughly 900 dollars, I think close to 1,000 dollars. For that position, in St. Petersburg, that is quite a high salary. I got in completely by accident and knew nothing about the place at all. Moreover, this happened shortly after the Internet Research Agency had moved from Olgino to Savushkina Street. It had changed its address, and there was no information anywhere on the internet that they were now located at Savushkina 55; that they were at this address was not yet known. I came to the interview, and it seemed very strange to me, because simply to get into the building you had to fill in forms with all your personal data: passport details, place of registration, actual place of residence, just to get inside. That was very strange; the interview was strange. I was told that my work would be connected with the topic of Ukraine, that I would work in the department 'Ukraine 2' and deal with news about Ukraine. When I asked what format the news would take — this was right at the height, probably the very height, of the conflict in eastern Ukraine — they told me: 'No, no, what are you talking about, these will be objective news about Ukraine.'

Having started working there, within two days I understood where I had ended up, and I faced a choice: to quit, because this activity contradicts my political views and my understanding of what a journalist should do, or to stay. Essentially, either quit at once, or work for a while in order to learn more about it and then tell about it, including to you — to conduct, so to speak, a spontaneous journalistic investigation. At that point — and everything I am describing is what was happening at the end of 2014, because now everything has changed. Now, as far as I know, the trolls sit in different buildings, and on Savushkina Street only a few departments remain, doing other, administrative work. That is, the trolls themselves are located at other addresses; some are known, some are not. So in 2014, when I worked there, it was a four-storey building packed full of people; I think there were more than 150 of them. Well, 150-200; again, I did not count heads, that is my approximate figure. I worked in the department 'Ukraine 2'. My work was not the work of a troll — mine specifically; that is, it was trolling, of course, but not the kind of trolling where you write comments. It was fake websites, fake Ukrainian websites, several of them; I alone handled three sites in total. Quite small sites with news in Russian: we found news connected with Ukraine, especially with the military conflict in Ukraine, and altered them, replacing, say, 'terrorists' with 'militiamen', that sort of thing; that is, we made them the way they are presented, for example, on Russian television. I dealt with this for some time; then, since I was also curious to find out what was happening in other departments, it so happened that I later worked in the 'social media marketing' department. That is precisely the department that spammed links to these fake Ukrainian news stories, especially on VKontakte and, to a lesser degree, on Facebook. That is, we go into some VKontakte group and see spam there saying 'look what an outrage is happening in Ukraine'. That is the 'social media marketing' department. And there were departments with trolls; I did not work in them, so I can speak only of what I heard in passing. The work there was on Facebook, on YouTube, on LiveJournal, which back then was still more relevant than it is now, and especially on VKontakte. In principle, the work was aimed mainly at the domestic Russian audience; that was the main work, and the foreign, English-language department was only just being formed.

That is, when I worked there, around November 2014, it was only being formed: vacancies appeared like the one I had applied for, only requiring knowledge of the language and with a higher salary. And as far as I know, the selection there was strict; random people like me could not get in at all. That is precisely why we see so few sources, and, as far as I know, no sources willing to show their faces, talking about work in the English-language department: everything there was much harsher. There were also departments that made memes — or, as they were also called, pictures with captions, demotivators — political ones too; the people who made them were called caricature artists, and they too were paid for it. At the time — again, this is the end of 2014 — a great deal of activity was devoted to the topic of Ukraine. All those labels, 'Banderite junta' and the like, are roughly what was born within the walls of that building, among other places; what was born in the comments, what was popularized. The employees themselves were mostly young men and women around my age — I was 23 then — roughly 22 to 27; there were few older people.

Who pays the money? Naturally, there was no direct talk of us being pro-Kremlin. Everyone guessed, but formally we worked for the company 'Internet Research'. As for the surname of Yevgeny Prigozhin, the man whom the media call the owner of the factory — according to the information I trust, which was later confirmed — his surname was not uttered there once, by anyone. Although all paths lead precisely to this man: Putin's chef, a billionaire with a very criminal past and a no less criminal present. His surname was never mentioned. And the phrase that we were 'Kremlin people' I heard just once, from my boss, when I said I was quitting. I quit because I had learned everything I could — mission accomplished; I understood there was no more information for me to find there. Of course, at the time I did not grasp the scale of what that building would produce or how interesting it would become. I did not think there would be worldwide interest in this topic; as for the US elections, there were still two years to go, the subject never came up at all, and Trump's surname appeared nowhere. So when I told her I was quitting — told her honestly that I was fed up with it all, that it contradicted my views; I got rather emotional — my boss, Anna Botneva, said this phrase: 'Well, here we sit, the Kremlin people, while over there across the river work people who write for American money, doing the same thing we do.' That was said once and never again. Well, it is clear where it all comes from; I think that is clear to everyone.

What is the future of the factory? It seems to me that now, after the interference in the US elections has been practically proven, hardly anyone doubts that the influence existed, that there were trolls, that there were attacks on Facebook. Perhaps its scale is somewhat exaggerated; perhaps the 'troll factory' was only one link in the chain of election interference — there were also the hackers, who made a far bigger contribution, and so on. But that it happened, nobody doubts. And if we look at the latest parliamentary elections in the US, the trace of the trolls was no longer visible there; I do not think it existed. Although when I was asked before those elections, I was certain there would be similar attacks — but no, at least we did not notice them; journalists did not see them. As for the future: given the political, economic and social situation in Russia, where we are, the reality we now live in, it is quite possible that the near future will bring a new wave of opposition sentiment, which appeared after the raising of the retirement age in Russia, and on and on. And it is quite possible that the troll factory will once again concentrate its attention on Russia — on Russian topics, on Russian problems, on the Russian audience — in order to suppress protest moods. That is how it seems to me; naturally, it is only one hypothesis, I cannot know for certain. But given how many countries now fear troll interference in their affairs, it seems to me one can relax a little. Well, not relax exactly, but turn one's attention to the hackers, because that is a far more promising direction, and a far more dangerous one."

Dainius Genys, 26/03/2019

"My name is Dainius Genys; I am a sociologist, and I received my PhD in sociology in 2011. For a while I was working in the field of civil society research, and then, after a couple of years, our administration decided to establish the Andrei Sakharov Research Centre for Democratic Development. They invited me to become a small partner and offered me a role as a researcher and as an office manager, let's say, so I am helping out with both administrative and scientific work. We are quite a new centre; it was established at the end of December 2017. We are now in our second year, and, as you can see on the wall, we have held a few events, mostly public lectures, and one bigger conference. Our tasks are threefold. The first is to keep organising public lectures, seminars, training and scientific conferences dedicated to important questions like the democratisation processes in Russia and the broader region; our main mission is to talk about the legacy of Andrei Dmitrievich Sakharov and about human rights in the region, in Russia and in a broader context. The second task is to maintain the archival holdings, which we keep expanding: we now have a few deposits from different organisations; the archive we hold is dedicated, of course, to human rights, from various angles, and we are still waiting for one set of materials from a Maidan oral history project. And the last task is to carry out scientific study and research, also regarding the democratisation process in the region. About half a year ago my colleague Robert van Voren, who happens to be the head of the centre — he is Dutch himself — went to his home country, picked up a newspaper and read an interesting story about mapping Russian trolls in the Netherlands. So when he came back, he introduced me to the story.
He said that there are a couple of interesting journalists in the Netherlands who are investigating this kind of Russian activity, and so we started talking and came up with the idea that it might be interesting to repeat similar research in Lithuania, or to run some kind of training for our students and for a general audience. We started to elaborate on this idea, and we have now decided to do both. So we submitted a proposal to the Lithuanian scientific council for the scientific side of the project, and we are now organising a training seminar for students and for specialists who would be interested in the subject; we are hoping to hold it on the 23rd and 24th of May. We want to see whether it is a big problem here. We know that Russians are meddling in domestic politics in various countries: as I said, the Netherlands; we know what happened in the United States; and some research has been done in Germany. These debates are constant in Lithuania, but we do not actually have this kind of data: how deeply they are trolling our public opinion. We have certain commentators, certain articles from specialists who are into the problem, but we still do not have a more solid account of the issue. So our scientific idea was to do this kind of research and to see to what extent they are trying to impact the various perceptions of our society, what kind of sources they are interested in, what kind of subjects they are elaborating, and whether we can grasp the impact they are having. It is quite challenging; I do not know how we will succeed, but we will see. I think it is a big issue, because Lithuania and the other Baltic countries share a common history with Russia, with the former Soviet Union, which is not a happy history.
Well, Europe used to blame us for being too superstitious, maybe too scared, too sensitive to the Russian question, let's say, regarding the involvement in our public sphere. But now, after the proof that Russian trolls are involved in various countries' domestic politics was brought into the daylight, everybody understands that our criticism was not oversensitive, was not scaremongering; it was not related to some kind of emotional aspect but was based on our ability to resist Russian propaganda. And we still need to talk about practices and methodologies which would help us to demonstrate the Russian impact. There are various rumours, and there are in fact some initiatives that have been dealing with Russian involvement and with the trolling of our public perception, let's say — public views, public attitudes. But, as I said, there is no systematic research, so we want to jump into the field and make it more solid, more scientific. We have submitted a proposal, and it has been accepted. We have already received the results of the first-period evaluation, and they were quite positive, so our hope is that we will get the needed support, but we still have to wait for the final decision; it is difficult to say at this point. We have also been in contact with different organisations, like universities and institutions, and everybody is very supportive, but we will need to raise some money in order to make the project more solid. Everybody we have contacted agreed to take part in this kind of project, but we decided to define their roles a little later. So for now it is a small project initiative that belongs just to our centre, and we are in contact with partners from the Netherlands, the journalists I mentioned, Robert van der Noordaa and Coen van de Ven.
They agreed to come to Lithuania and run these training sessions, so we hope that it will happen; the support we have is from our university and from the embassy of the Netherlands. As I was talking with these guys from the Netherlands: in their previous research, for instance, they analysed 3 million tweets in the Netherlands. Their research was focused on Twitter, because Twitter is, I think, the only social network that opens up its archive and allows various analyses of it; I do not think that Facebook or other platforms allow that. So what they do is basically use open sources, and analytical instruments like Facebook graphics and everything else that is available to the public. Our main goal was not to focus on research that would be pricey and affordable only to certain people. We wanted to organise this training so that everybody willing to take part in it could maintain this analytical capacity in the future, so that it would be available to everybody. Our main task is to find out to what extent Russian trolls are involved in our public sphere. After that we can see what questions they are interested in; in other words, we want and hope to grasp the trolling habits and trolling behaviours in the Lithuanian public sphere, and our main goal would be to demonstrate these research results. As for what we would do with the results afterwards: I think the main goal will be to publicise them and show them to the public as well as to specific institutions, including universities and some governmental institutions. It will definitely be open source, so everybody interested in the results can download them from our website and use them as they wish. Well, it is also difficult to say.
When I mentioned that one of the reasons we need this kind of research is to make the discussion more scientific: as I said, we now have plenty of emotions, and I think that provoking those emotions is one of the main tasks of the trolls. That is why I think we need a more scientific approach to demonstrate — in other words, to rationalise — their activity, to rationalise the impact they make, because right now it is not very difficult to become hysterical when speaking about the trolls. But I think that is exactly one of the emotions they want to provoke in you or in me: they want to eliminate rational debate in every area they are interested in and working in. So I think that is what colleagues did in the United States; they stuck to the same approach; they wanted to see what happened in reality: what kind of impact Russian trolls really had in the United States during the election process, because now there are different stories, different rumours. On the one hand, we have stories that the Russians made the biggest impact in favour of Donald Trump; others counter that their impact was actually irrelevant and that they did nothing to change the outcome. You know, standing on the sidelines, without basing your argument on somewhat comprehensive research, it is difficult to validate your argument — to say who speaks the truth and what the real impact of the trolling initiative was."

Gintautas Mazeikis, 2/04/2019

"First of all, I need to describe a little what I mean by propaganda. Propaganda, in my theories and in my consideration, means first of all strategic persuasion, or persuasion based on a main ideology and a main vision. And I separate from propaganda simple persuasion, as well as advertisement and agitation. It means that if some kind of random persuasion or random influencing communication happens, I will not call it propaganda, just an expression of people's opinion. If we would like to find out whether something is propaganda or not, to evaluate the situation, first of all we need to explain the motives, the motivation of the action, and to show what kind of politics or what kind of ideology it corresponds to. I mean that when we are talking about the Ukrainian case, especially in the sphere of social media, we encounter two types of people's action. One of them is a very spontaneous reaction to some events, and I try not to call that propaganda or political persuasion, just an expression of opinion and emotions. But on the other side we can see specially organised persuasion of some people at some social media addresses, for example Vkontakte.ru. It means that Russia organised a lot of groups which are coherent, or similar to each other, and they all say the same thing, and this is the reason why we can call it propaganda. For example, we could take some information from the Kremlin, from official Kremlin speakers; then we could take material from the so-called Donbass or Lugansk informational agencies, or their propaganda ministries or whatever they call themselves: if we compare their texts, they consist 99% of Kremlin persuasion. Then take, for example, Vladimir Solovyov's entertainments on the TV screen, or some pictures or pages from social networks — from, for example, Donetsk or Lugansk — and you will get approximately the same text, 60-70%, even 90%.
Actually, if you pay attention to whom they appeal, first of all it is not all of Ukraine; first of all they appeal just to the people who read the texts and messages from so-called Novorossiya. It means that they want to persuade, first of all, the people who are attracted by this one problem, and they invest money in this area. Why do they not do it differently? Because Ukraine, the Baltic countries and Poland have cut off the technical possibility for persuasion and propaganda in their countries about Donbass and Novorossiya issues. And they do not like to spend their money on heavy propaganda or everyday propaganda. It means that they use completely different approaches towards Kiev or Western Ukraine, or indeed all of Ukraine except the Donbass region, and a completely different approach as well towards the Baltic States or Poland. First of all, they work with these countries in order to discredit and criticise the local power, to show how stupid it is — the contemporary Ukrainian president — and to influence in some way the presidential election; or to criticise the so-called Banderovci; or to criticise the green brothers, the forest brothers, in Lithuania after the Second World War; or to criticise the pro-Nazi position of Estonian and Latvian volunteers, to show Estonian and Latvian exceptional nationalism and even fascism; to try to discredit the Baltic economy; to show the impossibility of Ukraine building a normal European economy. It means they use a single strategy: to criticise, to discredit and to delegitimise the local power, and to create as much chaos and as much distrust as possible. These are two different approaches.
If we take the so-called Novorossiya texts on social networks, you will read an endless stream of heroic texts: how they sacrifice themselves in the battles against, you know, Maidanian Ukraine, how they fight for the Russian truth and for the Russian spirit, how heroic they are — and how villainous Ukrainian soldiers and Ukrainian volunteers are. These, you see, are different elements. The third moment I would like to emphasise is the so-called moment of reaction. Propaganda is divided into long-term, mid-term and short-term propaganda, and for these they use different tactics than trolls. Trolls mean comments on fake news; a serious newspaper needs not a troll and fake news but more or less trusted information. And for this reason they need to organise events: propaganda organises the event; they pay money, they invest in chaos, they invest in action, and then afterwards they take some interviews in the field and try to get comments into more or less trusted newspapers, so as to broadcast to the world, to the whole world, that this situation happened. And only this sort of information — the so-called Sputnik, the propaganda channels — they broadcast to the Western countries. So you see, they have a multiplicity of approaches, and every approach is used for a different purpose."

Ignas Kalpokas, 26/04/2019

"So my name is Ignas Kalpokas, and I am a lecturer here at the Department of Public Communication. My research mostly centres on cybersecurity and information security.

Nowadays I'm mostly interested in disinformation, fake news and the so-called phenomenon of post-truth, which is the subject of my latest book, where I'm trying to understand post-truth through both its psychological components and more structural factors: the online environment, social media, and also the motivation of political actors who are willing to engage in post-truth as a form of political discourse. Of course, we witnessed a bit of that during the last elections in both Germany and France. In Germany there was a hack into the Christian Democratic party's email accounts and some documents, and early on it was thought that this would be used against Angela Merkel and her party. It did not turn out to be terribly damaging; perhaps not much compromising information was found. There was a similar hack during the French presidential election campaign, targeting Emmanuel Macron's team, but there, because the French press was forbidden to publish anything election-related immediately prior to the elections, the damage was again contained. So there were attempts, but they were not particularly successful — particularly in comparison with, let's say, the Democratic National Committee hack in the USA, which was quite damaging to Hillary Clinton's campaign. So there are definitely attempts to influence political processes, and any election in the EU, I think, is something where Russia potentially sees some ground for affecting politics in a way that is useful to them. The principle is quite similar, and with trolling or fake comments and things like that I think you get an even further inroad into the human mind, because it does not seem like a message coming from an official source or a self-interested politician or whatever. It seems to be coming from somebody just like you — concerned individuals who are apparently sharing their experiences, their information, their thoughts.
But of course, all of those have been strategically placed to be discovered by the target audiences in question, so surely part of the danger lies in that. I mean, to some extent, who needs trolling and informational warfare when we have the current government; but on a more serious note, there is this issue of creating an atmosphere in which the government, and the state more broadly, are seen as somehow scamming and oppressing the common person, selling everybody out to either big banks or human rights organisations or whatever, and hence the supposed necessity to turn back to, you know, something more predictable, traditional values and so on, which is of course a big theme of Russia's self-representation. All of those different strands of public discourse can then be connected to a perhaps Russia-friendly attitude — not necessarily directly, because a positive attitude towards Russia does not really sell that well in the public sphere, but by subterraneously bringing public discourse closer to that propagated by Russia. But the topics themselves are absolutely conditional on what is happening in society. Anything can be adjusted and twisted to that."

Robert van Voren, 29/04/2019

"What I very often think about this whole trolling business is whether we are watching something completely new, or something which has been around all the time but has simply been put into new packaging. As far as comparing hacking with trolling is concerned, I think there is a phenomenal difference, in the sense that hacking is very often focused on messing up a system — getting into a system and then either posting false information or just making sure that the system does not function anymore — which is quite evident. So I think trolling is a much more sophisticated method; but at the same time my feeling is, you know, that spreading false information has always been with us. It existed in the ancient world, in the times of the Greeks and of the Roman Empire, in the Middle Ages, and later there have been these waves of spreading false information all the time. Consider only that one of the sources of antisemitism is the false story that Jews kidnap poor Christian children and slaughter them in order to drink their blood. Completely false information, but at times it has been very effective in creating pogroms and mobilising people against Jewish communities. So this has always been around, and I think something like trolling is, or has been, most effective during the first period of its use, because people did not at first recognise it as trolling, and secondly there were no countermeasures to limit its effect or to ban the sources of trolling from social media. And I think slowly but surely all the countries being targeted are starting to realise that it has taken place and how, and are developing mechanisms to counter it. So I think in a way it is part of natural development, of progress.
I always use as an example the first train in the Netherlands, which ran between Amsterdam and Haarlem in, I think, 1839, with a top speed of 15 km/hour. There was a huge protest by people living near the railroad tracks, because they said this was an incredible speed and the cows were so scared that they would never give milk again. Well, nowadays any scooter, any electric scooter, has a higher speed than 15 km/hour. So it is also a matter of getting adjusted to something and knowing that it is there. And actually, you know, I think if you keep your eyes open and you really want to figure it out, it is quite easy to examine whether something is false or not. If we look at the American elections, we think it is the first time that a foreign power has tried to influence elections in the United States to this extent. It is the first time they have used this type of measures, the first time they have used them on this scale, but influencing foreign elections is something which has been standard; it is part of life.

Look at the Ukrainian elections after Maidan, for instance: of course Russia has been trying to influence them. It is not only by spreading false information; you can also influence elections by creating crisis situations, for instance by creating war, or by creating internal opposition which turns out to be funded by outside sources. So I think this is something we simply need to live with, and we need to learn how to discern it and deal with it. For me, the recent elections in Ukraine are very interesting, because it was an overwhelming victory for Zelensky; 73.3% is really big. I think the main message and outcome of the elections — some people call it an "electoral Maidan" — is a revolt against the Poroshenko period: people are completely fed up with Poroshenko and his policies, and anything is better than a continuation of the Poroshenko period. But Zelensky actually won the election on the basis of a totally virtual program; nobody actually knows what he wants. Yes, if you went to the polls on the 21st of April, there was a program hanging on the wall — I mean, Poroshenko also had a program hanging on the wall, and the differences are not that big. The only difference I see, or the fundamental difference, is that Zelensky talks about a normal country, while Poroshenko, with his fake patriotism, talks about a Great country. But to a large degree the success of Zelensky is the result of social media, and of media as such. You know, it is this series "Sluga Naroda" which made him well known, and on the basis of that wave he very nicely managed to get himself elected. And it is very interesting how politics is going to develop now: is it going to be a politics in which social media plays a much more prominent role than it has in the past, or is it slowly going to slide back into the type of politics we are used to? Time will tell; there will be elections in and we will see what the outcome is.
But it is really interesting; things are really moving into the next gear, you could say. And I do not think it is something very unusual, in the sense that it is a normal process in the progress of mankind. In 1970 the American futurologist Alvin Toffler published his book "Future Shock", in which he said, you know, that the world is changing fundamentally and people are getting so much information that they cannot cope with it anymore, and everything is going quicker and quicker. One day there is a building standing on the street corner; a year later the building is gone and it is a kindergarten; two years later there is another building standing there; there are so many changes you cannot cope with it. A few years later Orson Welles narrated a television program based on it, about how we would almost develop mental illness, would be in future shock, because so many things change. Well, if you look at how the world is changing now, it was actually very nice and slow at that time. So we adjust ourselves; we are very good at adjusting. People, humankind, have a strong survival instinct at the core, and so this is something that we will just internalise, and we will not know a situation without trolling anymore.

I think things worked out differently, and what played a major role in Zelensky's victory is the social media campaign around Poroshenko, so not what Moscow was doing but what the Poroshenko people were doing. Part of them are former Maidanovtsi, but I know that a large part of them are actually very disappointed with Poroshenko, because there were promises after Maidan and most of them were never fulfilled, and instead of fulfilling the Maidan promises he filled the country with Roshen shops. I think the whole social media campaign was trying to promote Poroshenko as this wonderful big patriot. What is his main slogan? "Armiya, vera, mova", so "Army, faith, and the language". I think for a lot of people it had the opposite effect. Because if we talk about the "army", the first thing people say is "army, yes", but then, if the army is so important, why didn't your son go to the army, instead of going to the Verkhovna Rada? If you are talking about "vera", this whole issue with the Donbass: Poroshenko, in a very Soviet, old-fashioned way, abused the religious issue for his political campaign. "Language": he promotes himself as anti-Russian while a large portion of the population has Russian as their mother tongue; they got insulted. This whole attempt to build up Poroshenko through social media as a big patriot, over things that he did not do, had a very contrary effect and actually pushed Zelensky much higher than he would otherwise have been. And I think that actually for Moscow Zelensky is much worse than Poroshenko, because Putin and the FSB leadership need to have clarity about what the policy is. With Poroshenko everything is clear about his strategy; with Zelensky nothing is clear, it is one big black hole. So I think in Moscow they are as worried as some of the people in Ukraine are: in which direction is this going, and how to respond to it?
In the same way, I believe that for Moscow it would have been better if Hillary Clinton had become president rather than Trump. Because with Clinton everything is clear and you can develop a strategy. With Trump, you know, this guy is crazy enough to say "a" today and "b" tomorrow, and then turn around again. And so every day there is something new and something unexpected. I am sure, I am convinced, that Moscow is trying to influence, but not as massively as in the past, and not as successfully. Because this is a much more sophisticated situation than in 2014. Right after Maidan we had black and white; now there is no black and white, now it is a much more complicated political setting. And think about the outcome for Moscow: in Ukraine, five years after Maidan, you have democratic elections which run smoothly, really democratic, and a Jew becomes the next president of Ukraine. You know, if for years you have been saying that this is a fascist country run by a junta, and suddenly this "fascist" country is run by a Jew, you know, it is kind of complicated. You need a little bit more than a troll factory to deal with that.

I think first of all they misjudged Trump. The FSB has a psychological department; they must have profiles, the psychological composition of Trump. I do not think they understood to what level he is a psychopath. Trump is for them not so much a guy they can do business with or rely on. He is destructive for the United States, so the United States has turned into a country which is only busy with itself, trying to deal with an idiot as a president. With a Congress which is fundamentally divided, with a Republican Party that does not want to stand up to Trump because they are afraid that they will lose votes, not understanding what it will cost them in the long run to keep this guy in place. And this is already good for Russia, because with America busy with itself, it kind of disappears from the international scene. The second part is that the trolling they are now doing is not even trying to oppose the truth with one fake truth of their own. The whole campaign brings so many varieties of the story that people stop believing anything altogether. It is destroying truth as a factor. So if we look at MH17, they have been spreading multiple varieties, and that is fine; nobody believes in these varieties anymore, because there is so much proof that what they have been telling is absolute nonsense. But the general feeling is: where there is smoke there is fire, the truth is always somewhere in the middle, so Russia is lying, but probably the Dutch are lying as well. And that is enough, because people stop believing in the fact that there might be truthful information, and that is a success for the trolling. And I think that this is what we need to figure out: how to deal with this. And that is a matter of time; four or five years ago trolling was something new, and now it is a standard.
I think what we are looking at is two main outcomes: one is to understand to what extent trolling also takes place in Lithuania, with the upcoming elections and political developments in the country. Because it is clear that Lithuania is one of the target countries; we do not have a Russian community of the size of those in Latvia and Estonia, so you cannot use it to create trouble in the country, so you have to use other means. And I think the most interesting thing about the training that we are going to organise is that we are going to pull in people from the younger generation who are interested in this direction and have the first basic skills, and who can further become experts in this field. Especially with MH17, because the investigation is taking place there: the research is taking place in the Netherlands, so the country is a kind of centre of the investigation, and it has been seen as a prime country in which to try to influence the outcome of the investigation. And of course, initially the attempt was to spread alternative varieties of the story, so that the Dutch would lose track and the Dutch population would stop believing the work of the investigating community. So now I think the main focus is to create disbelief in the outcome, in order to limit the damage that the outcome will do. So this is the key. The same goes for all the things in regard to immigration, immigration from the , , from North Africa: trying to help stimulate anti-immigrant attitudes within the country, to support the views of the right-wing populist parties. And again, in a way we do not see anything new. The mechanisms they were using were completely different, but the essence is completely the same: try to make a voice within the country stronger than it actually is. And if we look at the outcome of the Dutch elections of several weeks ago, I think it was quite successful. Because we now have this populist party in the Netherlands which is, I think, even more dangerous than the previous one. Because they mix a so-called intellectual message with very nasty messages from the 1930s, that is, neo-fascist, and I think that is much more dangerous."
