

SOCIAL MEDIA MANIPULATION 2020: HOW SOCIAL MEDIA COMPANIES ARE FAILING TO COMBAT INAUTHENTIC BEHAVIOUR ONLINE

PREPARED BY THE NATO STRATEGIC COMMUNICATIONS CENTRE OF EXCELLENCE

ISBN: 978-9934-564-92-5
Project Manager: Rolf Fredheim
Authors: Sebastian Bay, Anton Dek, Iryna Dek, Rolf Fredheim
Research: Singularex
Copy Editing: Anna Reynolds
Design: Kārlis Ulmanis

This report was completed in November 2020, based on an experiment conducted from September to October 2020.

Singularex is a Social Media Intelligence and Analytics company based in Kharkiv, Ukraine. Website: www.singularex.com Email: [email protected]

NATO STRATCOM COE 11b Kalnciema Iela Riga LV1048, Latvia www.stratcomcoe.org /stratcomcoe @stratcomcoe

This publication does not represent the opinions or policies of NATO or NATO StratCom COE. © All rights reserved by the NATO StratCom COE. Reports may not be copied, reproduced, distributed or publicly displayed without reference to the NATO StratCom COE. The views expressed here do not represent the views of NATO.

EXECUTIVE SUMMARY

Introduction

Antagonists, from foreign governments to terror groups, anti-democratic groups, and commercial companies, continually seek to manipulate public debate through the use of coordinated social media manipulation campaigns. These groups rely on fake accounts and inauthentic behaviour to undermine online conversations, causing online and offline harm to both society and individuals.

As a testament to the continued interest of antagonists and opportunists alike in manipulating social media, a string of social media companies, researchers, intelligence services, and interest groups have detailed attempts to manipulate social media conversations during the past year. Therefore, it continues to be essential to evaluate whether the social media companies are living up to their commitments to counter misuse of their platforms.

In an attempt to contribute to the evaluation of social media platforms, we re-ran our ground-breaking experiment to assess their ability to counter the malicious use of their services. This year we spent significant effort to improve our methodology further, and we also added a fifth social media platform, TikTok, to our experiment.

The experiment

To test the ability of social media companies to identify and remove manipulation, we bought engagement on thirty-nine Facebook, Instagram, Twitter, YouTube, and TikTok posts, using three high-quality Russian social media manipulation service providers. For 300 € we received inauthentic engagement in the form of 1 150 comments, 9 690 likes, 323 202 views, and 3 726 shares on Facebook, Instagram, YouTube, Twitter, and TikTok, enabling us to identify 8 036 accounts being used for social media manipulation.

While measuring the ability of social media platforms to block fake account creation, to identify and remove fake activity, and to respond to user reports of inauthentic accounts, we noted that some of the platforms studied had made important improvements. However, other platforms exhibited a continued inability to combat manipulation. Of the 337 768 fake engagements purchased, more than 98 per cent remained online and active after four weeks, and even discounting fake views, more than 80 per cent of the 14 566 other fake engagements remained active after a month.

While these numbers aren't much to celebrate, we did observe significant pushback, primarily from Facebook and Twitter. In some cases, Facebook managed to remove as much as 90 per cent of the fake views before the manipulation service providers restored their work. This is an important step in the right direction.

Yet another win this year was a significant improvement by Facebook in blocking the creation of inauthentic accounts. But despite being the most effective at this, Facebook was the only platform studied to show an increasing half-life for active inauthentic accounts. While none of the measures introduced by any of the platforms are robust enough to stop persistent users or organisations from manipulation, both Facebook and Twitter have made it significantly more challenging.

Twitter stands out in its ability to remove inauthentic accounts from the platform; fake accounts disappear 40 per cent faster than in 2019, indicating that Twitter is three times faster than Facebook at removing accounts engaged in inauthentic activity.

Twitter and Facebook are now relatively good at blocking simpler automatic manipulation services, most notably automated comments, pushing manipulation service providers to rely on human labour to a greater extent. While there isn't any clear indication that this is driving up prices yet, any human-based manipulation will be more expensive than an automatic solution.

YouTube maintains its position as the least effective of the four major platforms we tested last year, and we have not seen any meaningful improvements since then. In fact, since our first report, YouTube has been overtaken by both Facebook and Instagram.

TikTok, the latest platform added to our experiment, seems to be nearly defenceless against platform manipulation. None of the manipulation was prevented or removed by the platform, making it by some distance the easiest to manipulate. In hindsight, we now realise that the other platforms could be doing worse.

Only YouTube can counteract fake views in any significant way. On Facebook, Instagram, Twitter, and TikTok, fake views continue to be quickly delivered, and are seemingly never removed. On Instagram, we managed to get 250 000 fake views delivered within an hour. Antagonists use fake views to manipulate platform algorithms to influence what social media users see. This remains a significant challenge for all platforms.

Our investigation has also shown that reporting an account for confirmed social media manipulation does not induce a platform to block that account. Of the 499 accounts (100 to Facebook, 100 to Instagram, 100 to YouTube, 100 to Twitter and 99 to TikTok) we reported as verified to have engaged in inauthentic behaviour, 482 accounts remained active five days after they were reported. Indeed, in all cases where we received feedback on the accounts we reported, it was to state that the user in question did not violate the platform's terms of service. We conclude that reporting and moderation mechanisms must be improved so that a larger share of inauthentic accounts reported to the platforms are removed—even if only a single user reports them. Reported inauthentic accounts must not regularly escape sanction.

Conclusions

The most important insight from this experiment is that platforms continue to vary in their ability to counter manipulation of their services. Facebook-owned Instagram shows how this variation exists even within companies. Instagram remains much easier to manipulate than Facebook and appears to lack serious safeguards. Tellingly, the cost of manipulating Instagram is roughly one tenth of the cost of targeting Facebook.

In 2020, Twitter is still the industry leader in combating manipulation, but Facebook is rapidly closing the gap with impressive improvements. Instagram and YouTube are still struggling behind, but while Instagram is slowly moving in the right direction, YouTube seems to have given up. TikTok is the defenceless newcomer with much to learn.

Despite significant improvements by some, none of the five platforms is doing enough to prevent the manipulation of their services. Manipulation service providers are still winning.

Based on our experiment, we recommend that governments introduce measures to:

1. Increase transparency and develop new safety standards for social media platforms
2. Establish independent and well-resourced oversight of social media platforms
3. Increase efforts to deter social media manipulation
4. Continue to pressure social media platforms to do more to counter the abuse of their services

This ongoing and evolving threat has underscored the need for a whole-of-society approach to defining acceptable online behaviour, and to developing the frameworks necessary to impose economic, diplomatic, or legal penalties potent enough to deter governments, organisations, and companies from breaking the norms of online behaviour.

The evidence is clear: schemers around the world take every opportunity to manipulate social media platforms for a variety of commercial, criminal, and political reasons.

INTRODUCTION

One year ago, the NATO StratCom Centre of Excellence carried out a ground-breaking experiment to assess the ability of social media companies to counter the malicious use of their services. We showed that an entire industry had developed around the manipulation of social media,1 and concluded that social media companies were experiencing significant challenges in countering the manipulation of their platforms.

Since then the companies have committed to improving their defences, especially ahead of the 2020 US presidential elections. Facebook,2 Google,3 and Twitter4 have all updated their policies and increased transparency regarding the manipulation of their platforms during the past year, but it continues to be difficult, often impossible, to independently assess the effectiveness of their efforts.

In September 2020 the European Commission presented an assessment of the effectiveness of the Code of Practice on Disinformation one year after implementation. The Commission concludes that the social media platforms have put policies in place to counter the manipulation of their services, but the lack of transparency is still too great to enable a thorough evaluation of the impact of social media manipulation.5 As a response to this enduring challenge, the European Commission's recently launched Action Plan for Democracy outlines a commitment by the Commission 'to overhaul the Code of Practice on Disinformation into a co-regulatory framework of obligations and accountability of online platforms'.6

As antagonists and opportunists continue to ramp up their efforts to manipulate social media, researchers, intelligence services, independent interest groups, and the social media companies themselves have published numerous reports over the past year detailing attempts to manipulate conversations on these platforms. The Ukrainian Security Service (SSU) documented their recent discovery of a number of advanced bot farms,7,8,9 and many other agencies have reported on the continued activity of the now-notorious Internet Research Agency in St Petersburg,10 and on a multitude of state actors and other organisations ranging from Iran11 and China12 to lobby groups13 and terror organisations14 involved in social media manipulation. There are also many academic articles and studies detailing the systematic manipulation of conversations, including our own quarterly Robotrolling15 report and several recent studies on the US presidential election.16

The evidence is clear: schemers around the world take every opportunity to manipulate social media platforms for a variety of commercial, criminal, and political reasons. It will remain important to evaluate how well social media companies are living up to their commitments, and to independently verify their ability to counter the misuse of their platforms.

Building on our previous work we decided to re-run the research17 we conducted in 2019, using experimental methods to assess the ability of social media companies to counter manipulation on their platforms. In order to further refine our methodology for this iteration of the experiment, we decided to focus on a smaller number of posts manipulated by a smaller number of manipulation service providers, which allowed us to track differences among the responses of the platforms studied, rather than the relative performance of the manipulators. Another change was the addition of the Chinese-owned social media platform TikTok to our experiment. TikTok has become one of the fastest growing social media platforms, currently ranked as the seventh largest with almost 700 million active users.18

In the context of the US presidential election we partnered with US Senators Chuck Grassley (Republican Party) and Chris Murphy (Democratic Party) to assess to what extent their social media accounts in particular, and verified social media accounts in general, are protected against manipulation.

Our experiment provides original, and much needed, insight into the ability of social media companies to counter the abuse of their platforms.

The Social Media Manipulation Industry

Many of the conclusions from our initial report, The Black Market for Social Media Manipulation,19 and from last year's20 iteration of this report still hold true—the manipulation market remains functional and most orders are delivered in a timely and accurate manner. Social media manipulation remains widely available, cheap, and efficient, and continues to be used by antagonists and spoilers seeking to influence elections, polarise public opinion, sidetrack legitimate political discussions, and manipulate commercial interests online.

The industry feeds the market for inauthentic comments, clicks, likes, and follows. Buyers range from individuals seeking to boost their online dating profiles, to influencers gaming the system, to state-level actors with political motivations. Social media manipulation relies on inauthentic accounts that engage with other accounts online to influence public perception of trends and popularity. Some inauthentic accounts are simple, bot-controlled accounts without profile pictures or content, used only to view videos or retweet content as instructed by a computer program. Others are elaborate 'aged' accounts with long histories meant to be indistinguishable from genuine users.

Bots are a very cost-efficient way of generating artificial reach and creating a wave of 'social proof', as typical users are more likely to trust and share content that has been liked by many others.21 Bot-controlled accounts cost only a few cents each and are expected to be blocked relatively quickly. More elaborate inauthentic accounts require some direct human control. They can cost several hundred dollars to purchase and often remain online for years.

Developments in 2020

During the past year, most indicators have signalled that manipulation service providers are prospering. They are upgrading their technical capabilities and improving the range of their services. The larger providers are updating their frontend and backend systems as well as adding new services. There is evidence to suggest that the volume of engagement for sale is increasing. We have found manipulation service providers offering up to 10 million fake views on Twitter and Facebook, 50 million fake views on Instagram, and as many as 100 million fake views on IGTV. There is also a greater plurality of services on offer, with an increase in manually controlled audience-specific services. We have identified what we assess as being possible efforts to introduce AI text generation for fake comment delivery.

Despite increasing pushback from a number of social media platforms, all major manipulation providers identified by us have remained in business and several new actors have emerged. Together, the three manipulation providers we used for this experiment claim to have completed more than 17 million orders; one of them, with 27 employees, serves more than 190 000 regular customers.

Social media manipulation continues to be cheap and readily available through a multitude of professional social media manipulation service providers. In certain cases, simple automated manipulation is no longer available and only well-resourced manipulation service providers are able to manipulate all platforms. While we are seeing evidence that platform pushback is starting to have an effect, in 2020 manipulation service providers remain in the lead in the digital arms race.

Three insights

1. The scale of the industry is immense. The infrastructure for developing and maintaining social media manipulation software, generating fictitious accounts, and providing mobile proxies is vast. We have identified hundreds of providers. Several have many employees and generate significant revenues. It is clear that the problem of inauthentic activity is extensive and growing.

2. During the past year the manipulation industry has become increasingly global and interconnected. A European service provider will likely depend on Russian manipulation software and infrastructure providers who, in turn, will use contractors from Asia for much of the manual labour required. Social media manipulation is now a global industry with global implications.

3. The openness of this industry is striking. Rather than lurking in a shadowy underworld, it is an easily accessible marketplace that most web users can reach with little effort through any search engine. In fact, manipulation service providers still advertise openly on major social media platforms and search engines.

We spent 300 € and received 1 150 comments, 9 690 likes, 323 202 views, and 3 726 shares on Facebook, Instagram, YouTube, Twitter and TikTok, enabling us to identify 8 036 accounts being used for social media manipulation.

THE EXPERIMENT

The aim of our experiment was twofold: first, we aimed to further develop and test a methodology for assessing the ability of social media platforms to identify and counter manipulation using commercial manipulation service providers; second, we used our updated methodology to evaluate the performance of the social media companies in 2020 in comparison to their performance in 2019 as assessed by our initial experiment.

For this year's iteration of the experiment we used three reliable Russian social media manipulation service providers to buy engagement on Facebook, Instagram, Twitter, YouTube and, for the first time, on TikTok. We purchased engagement with accounts set up specifically for this experiment. All posts in our purpose-made accounts were of an apolitical nature to avoid any risk of actual impact beyond the framework of the experiment.

In contrast to the 2019 experiment, we did not seek to assess the performance of manipulation service providers; instead we selected reliable providers and monitored their service delivery to encourage them to do their utmost to deliver the services we purchased. The 2020 experiment was designed primarily to test the ability of social media companies to withstand manipulation from well-resourced commercial manipulation service providers. Setting up the experiment this way allowed us to better compare the relative performance of the social media platforms tested.

How is this relevant?

How is buying a thousand fake likes on a fake account relevant for the assessment of the overall ability of social media companies to counter manipulation? It could be argued that bought manipulation is more likely to be reported, and therefore detected, if it engages with current content that influences actual conversations online.

This time we were not testing the ability of social media managers or the public to detect and report inauthentic activity, or the ability of the companies to remove posts based on the content and context alone. Instead we wanted to test the ability of the platforms to detect and remove networks of inauthentic users actively manipulating public conversations online.

By using commercial manipulation service providers and identifying and tracking the accounts they were using to deliver manipulation, we could judge the ability of social media companies to identify bot networks and inauthentic coordinated behaviour.

To lay to rest any concerns about differences between buying engagement on real and on fake accounts, we conducted a separate experiment to test if there is any difference in the level of protection between verified accounts and ordinary accounts. For this experiment we partnered with US Senators Grassley (R) and Murphy (D) and bought manipulation on one apolitical, neutral post on the Facebook, Twitter, and Instagram accounts of both men.

The experiment was conducted the week before the US presidential election, during a time when social media companies were expected to be on special alert and to have implemented heightened detection mechanisms. This case study was intended to assess the maximum protection capabilities of the social media platforms studied.

One objection to our research design is that in selecting apolitical content, we may not have been testing platforms' ability to counter manipulation that might mislead platform users and impact the online conversation. However, conducting such an experiment would be hard to justify from an ethical perspective [see page 14]. In any case, the platforms claim to implement algorithms to detect platform misuse broadly, not just within the context of information operations. Facebook's community standards, for instance, prohibit specific behaviours (inauthentic behaviour, and especially coordinated inauthentic behaviour), rather than just specific content.22

[Figure: Timeline of the experiment, September to December 2020: buying social media manipulation, tracking delivery and the ability of the platforms to identify and remove it (Sep-Oct); reporting a random sample of the identified inauthentic accounts to the social media companies (Oct-Nov); tracking the ability of the companies to remove reported, confirmed inauthentic accounts (Nov); data analysis and verification (Nov-Dec).]

The scale and timeline of the experiment

For the 2019 version of our experiment, we bought engagement on 105 different posts on Facebook, Instagram, Twitter, and YouTube using 16 different manipulation service providers. In 2020 we focused on three reliable providers and increased the quantities of engagement purchased. Last year we spent 300 € to buy 3 530 comments, 25 750 likes, 20 000 views, and 5 100 followers, enabling us to identify 18 739 accounts being used for social media manipulation.

This year we spent 300 € and received 1 150 comments, 9 690 likes, 323 202 views, and 3 726 shares on Facebook, Instagram, YouTube, Twitter and TikTok, enabling us to identify 8 036 accounts being used for social media manipulation. Below, we compare the cost of our basket of manipulation services to assess whether prices are rising year by year. On average, services were a bit more expensive than in 2019, but still roughly the same as in 2018.

Our work was carried out during six weeks in September and October 2020. To assess the ability of the platforms to remove inauthentic engagement, we monitored our bought engagement from the moment of purchase to one month after it appeared online. We reported a sample of the inauthentic engagement to the social media companies and continued monitoring to measure the time it took for the platforms to react.

Five steps of the experiment

1. Buying likes, comments, views, and followers for neutral posts on our own inauthentic accounts
2. Buying likes, comments, views, and followers for neutral apolitical posts
3. Tracking how long inauthentic accounts stay on the platform and with what they engage
4. Tracking performance and response time of social media platforms in removing inauthentic activity
5. Tracking how long it takes to remove accounts after reporting a random sample

During the experiment, we recorded how quickly the manipulation service providers were able to deliver their services. We then collected data on how the five social media platforms responded to the manipulated content by periodically measuring whether it had been removed. The experiment was organised into the five steps visualised above.

The ethics of the experiment

An important part of this experiment was minimizing the risks involved, to ensure that private individuals were insulated from the experiment to the highest degree possible. While it would have been possible to design an experiment to assess whether commercially purchased manipulation can influence public conversations, such research would be unethical in that it would interfere with genuine discussions and undermine important values such as freedom of expression.

Our experiment was set up so that we could minimize risk and carefully monitor any inadvertent effects. In order to achieve this, we chose to buy the fake engagement—views, likes, comments, and follows—using our own fake accounts. We continuously monitored our accounts to ensure there would be no authentic human engagement on them. We also chose apolitical and trivial content to engage with, and all bought engagements were strictly designed and monitored to minimize risk to real online conversations.

For the case study in which we engaged with content on the social media profiles of the two senators, we agreed with them in advance which posts would be targeted and how. We made sure to engage only with apolitical, dated content without any ongoing authentic conversations. We had also prepared interventions to alert any real users of our experiment had we noted any authentic interactions with the targeted posts.

Throughout the experiment we did not observe any indication that our false engagement had been noticed by authentic online users. For this reason, we concluded that we successfully managed to conduct the experiment without causing any harm to genuine online conversations.

Furthermore, we acted in the spirit of the white hat programs of the social media companies themselves; these programs recognise the importance of external security researchers while emphasizing the importance of protecting the privacy, integrity, and security of users. We interacted with the two real, verified accounts only after having obtained explicit consent from the senators. We spared no effort to avoid privacy violations and disruptions to real users; we did not access any real user data nor did we in any way attempt to exploit identified weaknesses for any reason other than testing purposes.23

Finally, we made every effort to minimize the amount of bought engagement, to avoid unnecessarily supporting manipulation service providers. This year we reduced the number of posts engaged with and capped the amount spent at 300 €.

Surprisingly, 10 € will buy more than a thousand comments on Facebook-owned Instagram, but the same amount will only fetch 130 comments on Facebook.

ASSESSMENT OF THE PLATFORMS

Our assessment criteria

We assessed the performance of the five social media companies according to seven criteria measuring their ability to counter the malicious use of their services: 1) Blocking the creation of inauthentic accounts, 2) Removing inauthentic accounts, 3) Removing inauthentic activity, 4) Cost of services, 5) Speed and availability of manipulation, 6) Responsiveness, and 7) Transparency of efforts. We have further developed and refined our criteria since last year's experiment.

For this year's iteration we removed criteria assessing the ability of social media platforms to undo the activity of inauthentic accounts. While it is important that the activity of inauthentic accounts is removed together with the accounts themselves, this has become increasingly difficult to assess. Last year we determined that false likes bought on Instagram remained even after the account used to deliver them had been blocked.

The seventh criterion is new for this iteration. It was added to acknowledge the transparency of platform policy enforcement. We argue that increased transparency will reduce public harm by holding antagonists responsible and by improving users', researchers', and politicians' awareness of the scale and effects of social media manipulation.

These criteria can serve as general benchmarks for assessing the ability of platforms to counter social media manipulation.

1. Success in blocking the creation of inauthentic accounts — Inauthentic accounts are critical for the functioning of manipulation services, and platforms aim to prevent their creation. Blocking accounts raises the barrier for manipulation, making it more difficult and costly.

2. Ability to detect and remove inauthentic accounts — This ability is important to combat the spread, impact, and 'time-on-platform' of inauthentic activity.

3. Ability to detect and remove inauthentic activity — Given the speed of social media, timely detection is important for limiting the effects of social media manipulation.

4. Cost of purchasing manipulation — The more costly it is to buy manipulation, the less likely it is that large scale campaigns will be carried out.

5. Speed of delivery — Rapid and successful delivery of manipulation indicates that a platform has insufficient protection. Slow delivery indicates providers need to drip-feed interventions to avoid anti-manipulation efforts.

6. Responsiveness to reports of inauthentic activity — As a last resort, platforms turn to user moderation to detect fraudulent activity. The ability of the platforms to quickly assess and respond to reports is an important part of combating platform abuse.

7. Transparency of actions — The transparency of social media companies to communicate takedown efforts and results increases accountability and contributes to reduced public harm.

1. Blocking the creation of inauthentic accounts

Blocking the creation of fake or inauthentic accounts is perhaps the most important step social media platforms can take to prevent abuse. The easier and faster it is to create fake accounts, the easier it will be to manipulate social media activity.

In order to assess the ability of the social media companies to protect their users from manipulation, we attempted to create fake accounts using a variety of methods. We already knew that resellers of fake accounts provide detailed instructions on how to prevent them from being blocked; these typically include a set of preconfigured cookies to use with the account, dedicated proxies, specific IP locations that should be avoided, and other configuration descriptions required in order to run the account without getting banned. It was clear from these detailed instructions that significant improvements had been made in detecting suspicious account activity.

The biggest change this year was that Facebook had added a number of new features for blocking inauthentic accounts. It now requires significant effort to bypass Facebook's protection protocols; the account-generation industry must overcome these obstacles, and some of the larger suppliers now provide specific tools and guidelines to help their customers.

Ultimately, however, none of the protection measures currently in place are robust enough to stop persistent users or organisations from creating inauthentic accounts on any of the platforms we studied. The continued low cost and effectiveness of manipulation services is proof of this, as we will see in the coming chapters.

YouTube, Instagram, and TikTok are in dire need of improved protections. This is especially surprising since Instagram and Facebook are owned by the same company, yet their protection protocols are worlds apart.

2. Removing inauthentic accounts

The longer bots and inauthentic accounts used to deliver manipulation remain on a platform, the lower the cost for the manipulation service providers, as they don't have to spend time and money to replace blocked accounts.

Last year only 17 per cent of the inauthentic accounts identified on Instagram and YouTube had been removed after six weeks, making them the worst-performing platforms for blocking inauthentic accounts. Facebook ranked third, having removed 21 per cent. The least-poorly-performing platform was Twitter, which succeeded in removing 35 per cent.

This year, the figures remained roughly the same at 0.4 per cent account removal per day. However, in calculating the half-life of all accounts identified as being used to deliver fake engagement, we can see that it has decreased from 128 days to 118 days, so there is actually a slight increase in the platforms' ability to remove them. The mean lifespan for inauthentic accounts has decreased from 224 days to 203 days before removal; although marginal, this is an improvement.

While social media companies continue to report that they block millions of inauthentic accounts annually, we are unable to verify the effectiveness of these efforts. On the contrary, inauthentic accounts seem to be active for quite a long time before they are detected and deleted. While it has become more difficult to create a fake account, we see only marginal improvements in decreasing the lifespan and half-life of recognised inauthentic accounts.

While the average lifespan of an inauthentic account on Facebook, Twitter, and Instagram24 has decreased over the past year, Facebook stands out as the only platform with an increased half-life for inauthentic accounts. Twitter was the only platform that managed to significantly reduce half-life. This does not necessarily mean that the situation is worse on Facebook—these figures are calculated from small sample sizes, but they are sufficient to demonstrate there has been no measurable improvement.

We conclude that there are significant differences among platforms, and that likely more can be done to reduce the half-life of active inauthentic accounts, especially with regard to the cheap and simple kind of inauthentic accounts that we tracked for this experiment.

BOT HALF-LIFE BY PLATFORM (2019 → 2020)

Facebook: 124 days → 147 days
Twitter: 68 days → 41 days
Instagram: 193 days → 165 days
Average: 128 days → 118 days
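For readers who want to reproduce this kind of estimate, the following is a minimal sketch of how a half-life figure can be derived from per-account monitoring data, assuming a constant daily removal probability. All dates, counts, and variable names in the snippet are invented for illustration; they are not the study's actual dataset.

```python
import math
from datetime import date

# Illustrative per-account observations (not the study's actual data):
# the date an inauthentic account was first seen delivering engagement,
# and the date it disappeared from the platform (None = still active
# when monitoring ended).
observations = [
    (date(2020, 9, 14), date(2020, 10, 2)),
    (date(2020, 9, 14), None),
    (date(2020, 9, 21), date(2020, 11, 30)),
    # ... one row per identified inauthentic account
]

MONITORING_END = date(2020, 12, 15)

def daily_removal_rate(obs, end):
    """Share of observed account-days that ended in a removal."""
    removals = sum(1 for _, gone in obs if gone is not None)
    days_at_risk = sum(((gone or end) - seen).days for seen, gone in obs)
    return removals / days_at_risk

rate = daily_removal_rate(observations, MONITORING_END)

# Under a constant daily removal probability, the half-life is the time
# after which half of the accounts are expected to have disappeared.
half_life_days = math.log(2) / -math.log(1 - rate)
print(f"daily removal rate: {rate:.2%}")
print(f"implied half-life: {half_life_days:.0f} days")
```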

3. Removing inauthentic activity

Removing inauthentic activity is the process of identifying fake engagement and removing it after it has been delivered. The faster the activity is removed, the smaller the effect it will have on social media conversations.

Last year we showed that social media companies struggled to identify and remove fake activity, as the vast majority of all the fake engagement was still online four weeks after delivery. The platforms performed relatively well only with regard to fake followers—roughly half had been removed after four weeks.

This year's results were roughly the same for Instagram and YouTube, with few engagements removed. Our new subject, TikTok, did equally poorly. Of a total 337 768 delivered fake engagements across all platforms, more than 98 per cent remained online and active after four weeks, and even discounting fake views, more than 80 per cent of the 14 566 fake engagements delivered remained active after a month. Twitter and Facebook showed progress this year, demonstrating active efforts to counter manipulation during the four-week test cycle.

Facebook's removal of inauthentic accounts resulted in a U-shaped pattern of the false engagement being added, removed, and then replaced by the manipulation service providers. Fake engagement gradually began to be removed after five days and reached maximum removal after about two weeks. On Twitter, some manipulation service providers had their engagement removed after roughly 12 hours. To bypass the protection protocols on Twitter and Facebook, the providers now drip-feed engagement; this seems to work rather well. It does, however, reduce the speed of manipulation service delivery, making it harder to effectively manipulate 'live' conversations.

All platforms still face significant challenges in countering fake video views. Only YouTube has a somewhat effective system for countering fake views; all other platforms continue to be practically defenceless. Fake views continue to be delivered quickly to Facebook, Instagram, Twitter, and TikTok and seem never to be removed. As this form of manipulation is the cheapest, we are led to speculate that the manipulation service providers might be using a different method to manipulate the number of views than they use for delivering likes and comments—one that manages to exploit flaws in how the platforms count views instead of relying on fake accounts to engage.

4. Cost of services

The cost of manipulation is a good indicator of how effectively social media platforms are combating manipulation. If accounts used to perform manipulation are removed, manipulation service providers have to spend time and money to replace them. When social media platforms redesign their service to make the scripts used to seed manipulation obsolete, developers have to update their scripts. These costs are passed on to consumers.

We compared the price of a basket of manipulation consisting of 100 likes, 100 comments, 100 followers, and 1000 views from five Russian manipulation service providers to arrive at a mean price for 2020. The result is quite revealing. YouTube is the most expensive platform to manipulate at a cost of 11,12 € for the basket. Twitter joins YouTube at the high end with a price of 10,44 €. A Facebook manipulation basket is moderately priced at 8,41 €, while TikTok and Instagram are the cheapest at 2,73 € and 1,24 € respectively.

We found an increase of roughly 20 per cent in the price of a basket of manipulation services across all social media platforms from 2018 to 2020. However, a closer look at specific services reveals a tendency for comments to become more expensive, while the cost of likes remains the same or has got cheaper. Average price levels are roughly the same as in 2018, meaning that there has been no significant change in prices over the past two years. There has been a slight decrease in the cost of manipulation services for YouTube, and a slight increase for Twitter. Prices remain roughly the same for Instagram and Facebook manipulation services as in 2018; however, prices for Facebook manipulation had dropped significantly in 2019.

Fake views continue to be the cheapest form of manipulation—about ten times cheaper than fake likes and follows, and roughly one hundred times cheaper than fake comments. While this ratio remains roughly the same, there is a significant difference between how much manipulation can be bought for 10 € on the various social media platforms. While 10 euro will buy 90 000 views on Instagram, it will only buy you 7000 views on YouTube. Surprisingly, 10 € will buy more than a thousand comments on Facebook-owned Instagram, but the same amount will only fetch 130 comments on Facebook.

Over the past year, fake comments have become more expensive, which is a positive development, but social media manipulation in general continues to be cheap and readily available. There have been no significant price increases since last year and the differences in price between services and platforms remain relatively unchanged.
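To make the basket comparison concrete, here is a minimal sketch of the calculation. The unit prices and provider entries are hypothetical placeholders; only the basket composition (100 likes, 100 comments, 100 followers, 1000 views) is taken from the text above.

```python
# Hypothetical per-unit prices in € from three providers (the study
# used the actual quotes of five Russian manipulation service providers).
providers = [
    {"like": 0.012, "comment": 0.080, "follower": 0.020, "view": 0.0006},
    {"like": 0.010, "comment": 0.095, "follower": 0.025, "view": 0.0005},
    {"like": 0.015, "comment": 0.070, "follower": 0.018, "view": 0.0007},
]

# The standard basket used for the year-on-year price comparison.
BASKET = {"like": 100, "comment": 100, "follower": 100, "view": 1000}

def basket_price(unit_prices):
    """Cost of the standard basket at one provider's unit prices."""
    return sum(BASKET[service] * unit_prices[service] for service in BASKET)

mean_price = sum(basket_price(p) for p in providers) / len(providers)
print(f"mean basket price: {mean_price:.2f} €")
```

Computing one such mean per platform and per year yields the comparison plotted below.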

[Figure: Change in the cost of a basket of manipulation services, 2018-2020, in euro (scale € 0-16). The basket comprises 100 likes, 100 comments, 1000 views, and 100 followers (friends on Facebook, subscribers on YouTube); lines show YouTube, Twitter, Facebook, Instagram, TikTok, and the average.]

[Figure: How much manipulation can you buy for 10 euro? Bar chart comparing the number of likes, comments, views, and followers that 10 € buys on YouTube, Facebook, Twitter, Instagram, and TikTok. Buying comments is the most expensive form of manipulation; buying views on TikTok and Instagram is cheaper than on other platforms.]

[Figure: Price comparison for manipulation services on YouTube, Facebook, Twitter, Instagram, and TikTok.]

5. Speed and availability of manipulation

Last year we found that manipulation services for Instagram in general were the most reliable, while comment services for Facebook, Twitter, and YouTube were the least reliable. This has changed over the past year and there is enough evidence to conclude that platform efforts to improve security have had some effect. This is most notable on Facebook, where automated comment services are no longer widely available. Instead, manipulation service providers now must pay real humans to deliver fake comments. While this service has existed for quite some time, the fact that cheaper automated services are no longer available on Facebook is a testament to the effectiveness of Facebook's efforts.

Although it is important to recognise and encourage any improvements made by the platforms, social media manipulation continues to be widely accessible and professional. The providers we used are highly professional and customer oriented, offering discounts, money-back guarantees, and 24/7 support.

Platform pushback, primarily from Facebook and to some extent from Twitter, is challenging primarily automatic manipulation services, most notably comments. Another significant change is that comment services are now more often drip-fed, meaning that it now takes slightly longer for manipulation services to deliver purchased comments safely.

Comparing the speed of delivery of all manipulation services, excluding failed deliveries, across all social media companies studied, we found that roughly 60 per cent of all manipulation services were delivered after 24 hours. Manipulations on Twitter arrive most quickly but are also removed most quickly (on average). TikTok performs worst; manipulations are delivered almost instantaneously and remain over time.

Views and likes are still delivered relatively quickly. Views are usually delivered within an hour, with the exception of YouTube, where it took as long as a month for all the fake views we had purchased to be delivered. On Instagram, we managed to get 250 000 fake views delivered within an hour. Likes were usually also delivered within an hour; when it took longer we surmise that this was likely because the manipulation service providers were slow to launch their services, rather than a result of platform countermeasures.

There has been an overall reduction in the speed of delivery of manipulation services. In 2019, most services were delivered in full within an hour. The main difference in 2020 is a reduction in the availability and speed of comment services on Facebook and Twitter. YouTube manipulation delivery is roughly equal to that of 2019; manipulation services are rather slow, and large orders of more than ten thousand views require days or weeks to be delivered in full. For the other platforms views are delivered as quickly and effectively this year as they were last year.
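The delivery-speed comparison below is built from exactly this kind of monitoring: re-checking the engagement count on a manipulated post at fixed intervals after purchase. The following is a minimal sketch of that bookkeeping; the timestamps and counts are invented, whereas the real data came from periodically re-inspecting the posts.

```python
from datetime import datetime, timedelta

# One purchase of 1000 fake engagements, with the count observed on the
# post at each re-check (numbers invented for illustration).
purchase_time = datetime(2020, 9, 21, 12, 0)
ordered = 1000
observed_counts = {
    purchase_time + timedelta(hours=12): 310,
    purchase_time + timedelta(hours=24): 620,
    purchase_time + timedelta(hours=48): 890,
    purchase_time + timedelta(hours=72): 960,
}

# Cumulative share of the order delivered at each checkpoint, which is
# the quantity plotted per platform in the figure below.
for checkpoint in sorted(observed_counts):
    hours = (checkpoint - purchase_time).total_seconds() / 3600
    share = observed_counts[checkpoint] / ordered
    print(f"{hours:>4.0f}h after purchase: {share:.0%} delivered")
```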

[Figure: Comparison of speed of delivery of fake engagements: cumulative share of purchased engagement delivered 12, 24, 48, and 72 hours after purchase, for TikTok, YouTube, Instagram, Facebook, and Twitter.]

6. Responsiveness

After the end of phase one of the experiment, we reported 100 random accounts used for social media manipulation to each of the platforms in our study, and then monitored how many accounts the platforms were able to remove within five days. It is worth reiterating that the accounts we reported were the accounts that delivered the manipulation we bought, meaning that we were 100 per cent certain the accounts we reported were engaging in social media manipulation. We reported these accounts anonymously.

In 2019, removal rates left a lot to be desired. Facebook removed 12 per cent, Twitter and Instagram 3 per cent, and YouTube 0 per cent of the reported accounts after three weeks. Given the speed of online conversations, this year we shortened the assessment time to five days.

Our reporting produced better results in 2020, although the platforms' response is still woefully insufficient to combat manipulation effectively. Facebook was most successful, removing nine of the 100 accounts reported. Twitter followed close behind, removing seven accounts and temporarily suspending two others. Instagram removed just one of the reported accounts, and YouTube and TikTok didn't remove any.

To better compare this year's figures to last year's, we re-checked removals three weeks after the time of reporting. At that time Twitter was in first place, having removed twelve accounts and suspended eight; Facebook had removed thirteen accounts, Instagram—three, TikTok—two, and YouTube still hadn't removed a single account.

This marginal increase suggests that time is not the factor that determines whether a reported account is removed. It was possibly the natural rate at which fake accounts get removed, rather than our reporting, that led to the small increases in removals over time.
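A rough plausibility check of the 'natural rate' explanation above can be done with the background removal rate of roughly 0.4 per cent per day estimated earlier in this report. The sketch below is our own back-of-the-envelope arithmetic, not a calculation from the study itself.

```python
# If identified inauthentic accounts disappear at roughly 0.4 per cent
# per day regardless of our reports, how many extra removals per 100
# reported accounts would we expect between the 5-day and 3-week checks?
background_daily_rate = 0.004
extra_days = 21 - 5

expected_extra = 100 * (1 - (1 - background_daily_rate) ** extra_days)
print(f"expected extra removals per 100 accounts: {expected_extra:.1f}")
# ~6 per 100 -- the same order of magnitude as the increases observed
# between the two checks, consistent with reporting itself adding little.
```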

It is clear that reporting an account as engag- 7. Transparency of efforts ing in social media manipulation does not sig- All the major social media platforms studied nificantly contribute to any of the platforms for this report have increased their commu- removing or suspending that account, even nication efforts regarding measures under- if the validity of the report is confirmed. In- taken to counter abuse. Their efforts are in deed, in all cases where we received feedback alignment with the European Code of Practice about an account we had reported, it was the on Disinformation and the political pressure platform letting us know that the user in ques- the platforms were facing ahead of the US tion had not violated the platform’s terms of presidential election of 2020. Today TikTok,25 service. We suspect that platforms are using Twitter,26 Facebook,27 and Google28 all publish quantitative indicators rather than a manual transparency reports of various kinds, some assessment of reported accounts to deter- of which relate to fake accounts and other mine abuse. Manual vetting of the reported forms of inauthentic coordinated behaviour. accounts would have clearly demonstrated Facebook reported that in Q2 2020, 1,5 billion fake accounts were removed for violating the 9 out of 1 REMOVED ACCOUNTS community standards, a slight reduction from 3 WEEKS AFTER 2019, which was attributed to the platform’s 7 out of 1 REPORTING increased ability to block the creation of fake 29 1 out of 1 accounts. Twitter’s most recent report on fake account removal is from Q3/Q4 2019 out of 1 when the platform reported that it had sanc- out of 1 tioned 2,3 million accounts and suspended close to 800 thousand; Twitter also reported a 30 per cent increase in accounts sanctioned Facebook Instagram YouTube for platform manipulation.30 We have been Twitter TikTok unable to find detailed on fake

��������������������������������������������������������������������������� 25 account removal or on platform manipula- tinuous improvements by the Google Threat tion in general for Google and TikTok. Google Analysis Group is also a positive step, even does provide information on advanced threats if the content of its reports has yet to reach through its Threat Analysis Group.31 the level of detail and transparency that Face- book is delivering. Twitter’s safety blog,32 Facebook’s news ar- chives,33 and TikTok’s Newsroom reporting on While all of these efforts represent steps in safety-related matters34 provide further infor- the right direction, much more can be done mation pertaining to specific threats and ma- in terms of contextualizing information, con- nipulation activities, often focusing on specif- ducting more thorough audits and publishing ic content being taken down. the results, developing disclosure frameworks and collaborative transparency best practic- Facebook should be commended for the de- es jointly with other platforms, etc. From the tailed takedown reports it is now publishing information currently being provided by the on a monthly and case-by-case basis. Twitter platforms, it continues to be difficult to as- should be commended for publishing data sess how the threat landscape is developing that outside researchers can study. The con- and how well counter efforts are working.

26 ���������������������������������������������������������������������������� Twitter remains the industry leader in 2020 but Facebook is rapidly closing the gap with its impressive improvements. Instagram and YouTube are still struggling behind, but while Instagram is slowly moving in the right direction, YouTube has made very few improvements. TikTok is the defenceless newcomer with much to learn.

ASSESSMENT OF EACH PLATFORM’S RELATIVE STRENGTHS AND WEAKNESSES

The performance of each social media platform is assessed based on the criteria introduced above. To assess the relative performance of each of the social media platforms we have given a rating to each company for each category. These qualitative ratings are based on a joint assessment by the research team.

Facebook

Facebook is the most effective social media platform at blocking the creation of inauthentic accounts. It uses advanced analytics and techniques such as requesting users to upload videos to verify their accounts in order to identify them and keep fake accounts from making it onto the platform. In order to create a fake account on Facebook, advanced IP-address and cookie control is now necessary; this has prompted manipulation service providers to offer lengthy tutorials about how to prevent a fake account from being blocked. We conclude that creating fake accounts on Facebook has become significantly more difficult in the last year.

Despite being the most effective at blocking the creation of fake accounts, Facebook was the only platform studied that showed an increase in the half-life of active inauthentic accounts—currently an average of 147 days. In this regard Facebook is significantly behind Twitter.

However, Facebook has made important strides in removing inauthentic activity. A portion of the likes we had purchased were gradually removed over a period of two weeks. We also observed a reduction in the speed of manipulation delivery, especially for fake comments, indicating that manipulation service providers can no longer automate such delivery.

The price of Facebook manipulation decreased between 2018 and 2019, only to return to 2018 levels again in 2020. This suggests that an increase in protection protocols is starting to translate into higher manipulation costs.

Facebook has not improved its ability to remove reported inauthentic accounts—at 9 per cent after five days, the ratio is still far too low to make an impact. Facebook did, however, increase its transparency and is currently publishing regular updates on takedown activities. On 19 November 2020, Facebook further refined and updated its community standards page,35 and provided additional information regarding fake accounts.36 We are still missing important details about how Facebook compiles its statistics and would welcome independent assessment and auditing of the data and reports published by the company. Facebook's current research initiatives are a crucial step in the right direction, and we look forward to studying the results.37

Facebook has had a good year. We have noted improvements in every category of measurement, although some are only marginal. We commend the platform's improvements in blocking fake account creation and removing fake activity, and its success in stopping some forms of automatic manipulation. However, the fake accounts that are created still survive too long, and Facebook's responsiveness to reporting leaves much to be desired.

Instagram

Even though Instagram is owned by Facebook, it is far less effective at countering platform abuse. In fact, Instagram is outperformed by its parent company in every category we assess. Manipulating Instagram remains cheap and effective.

We observed no significant improvements in blocking or removing fake accounts. However, manipulation delivery is slower now, and we saw that manipulation providers encountered challenges resulting in only partial deliveries for specific kinds of manipulation, such as mentions. It remains to be seen whether these 'wins' for the platform are temporary.

Instagram's failure to counter manipulation is perhaps most evident in the cost of manipulation. Instagram continues to be the cheapest platform to manipulate, even cheaper than TikTok. Moreover, prices are currently falling, suggesting that manipulation services are highly available.

In assessing Instagram's performance in comparison to 2019, we observed only marginal improvements—primarily that fake likes and comments are now delivered significantly more slowly than last year. Fake views are still delivered instantaneously.

We conclude that Instagram remains very easy to manipulate and that whatever slight improvements it has introduced in the last year have made little difference. Instagram's parent company, Facebook, should share its growing expertise.

Twitter

Twitter continues to be the most effective platform overall at countering manipulation, outperforming Facebook in several areas but falling behind in others. Facebook is slowly gaining on Twitter's lead.

We observed less pushback against the creation of fake accounts by Twitter this year than in 2019. None of our own accounts were blocked this year, and we observed fewer countermeasures in 2020 than in 2019. Facebook's accomplishments show that more can be done by Twitter in this field.

We observed continued improvements by Twitter in all other categories. Most significantly, Twitter stands out for its ability to remove inauthentic accounts. Fake accounts are disappearing 40 per cent faster than in 2019, making Twitter three times faster than Facebook.

Twitter has also improved its ability to remove fake activity and to slow down the speed of delivery of inauthentic engagement. Although half of the bought likes were delivered within an hour, Twitter had effectively removed many within 12 hours. After 24 hours 80 per cent had been removed, and after two days virtually all of the fake likes had vanished. However, the drip-fed engagement proved more difficult. Twitter blocked a significant proportion (as much as 80 per cent), but the remaining 20 per cent remained four weeks after delivery.

Of the 100 random accounts we reported to Twitter, 7 per cent were removed and 2 per cent were suspended after five days. This is an improvement over last year, when only 3 per cent were blocked three weeks after reporting, but the overall performance is still far too low to make an actual impact.

Twitter was the only platform to show a year-by-year increase in the price of manipulation services; however, the rate of increase slowed from 15 per cent between 2018 and 2019 to only 4,5 per cent between 2019 and 2020.

In terms of transparency Twitter lags behind Facebook in updates and regular takedown disclosures but excels in making datasets available for external review by the scientific community. Independent audits of methodology would be useful for Twitter and for the other platforms.

Twitter improved in all categories except for blocking account creation and should be commended for the strides it has made in its ability to remove inauthentic accounts and activity.

We conclude that Twitter maintains its position as the most effective platform at countering manipulation, but it can do more to block the creation of fake accounts and to increase its ability to remove reported inauthentic accounts.

YouTube

YouTube remains mired in its position as the least effective of the four major platforms. When we added TikTok this year, YouTube lost the distinction of being the worst platform, but only because TikTok is worse.

We are unable to accurately assess the half-life of identified but active YouTube accounts used to deliver manipulation because of the platform's lack of transparency; it simply isn't possible for outside researchers to identify the accounts delivering the most inauthentic content.

YouTube

YouTube remains mired in its position as the least effective of the four major platforms. When we added TikTok this year, YouTube lost the distinction of being the worst platform, but only because TikTok is worse.

We are unable to accurately assess the half-life of identified but active YouTube accounts used to deliver manipulation because of the platform's lack of transparency; it simply isn't possible for outside researchers to identify the accounts delivering the most inauthentic content. There was virtually no reduction in fake comments or the accounts used to deliver them; only three of the 380 fake comments delivered were eventually removed. On the basis of this performance, we conclude that YouTube removed few or no inauthentic accounts on its platform. Furthermore, YouTube removed none of the accounts we reported for inauthentic activity, a result worse than last year's.

YouTube does a better job than the other platforms in removing inauthentic views; none of the other platforms managed to remove any of the inauthentic views we bought. We observed a ten per cent reduction after three to four weeks, and a 50 per cent reduction after eight weeks. However, fake views were removed for some videos and not for others, indicating an uneven distribution of YouTube's countermeasures. Interestingly, we experienced significant over-deliveries this year; manipulation service providers delivered 70–90 per cent more engagement for some YouTube services than we had bought.

The delivery of inauthentic likes is nearly instantaneous on YouTube, while inauthentic views and comments are either drip-fed or added in increments. The average speed of delivery across all services was equal for YouTube, Instagram, and Facebook after 12 hours, but after 72 hours we observed 85 per cent delivery on YouTube compared to only 45–60 per cent for the other platforms. This means that manipulation services can, on average, be delivered to YouTube much more quickly than to Twitter, Instagram, or Facebook. But, as mentioned above, YouTube is the only platform able to counter fake views. Currently, all the manipulation providers we tested drip-feed fake views onto YouTube at a rate of 500 to 4000 views per day, with significant fluctuations over time. It took the manipulation providers 6–12 hours to deliver the first 1000 views, and then another four weeks to reach 10 000 views. YouTube outperforms every other platform tested in this category but isn't able to stop fake views from appearing on its platform.

YouTube maintains its position as the most expensive platform to manipulate; however, it was the only platform for which the price of manipulation services had decreased since 2019. Although the 3,5 per cent reduction is smaller than the 25 per cent reduction from the year before, the trend is still negative: YouTube is getting cheaper to manipulate over time, and fewer inauthentic accounts and engagements were removed in 2020 than in 2019. The primary positive development was a continued, and somewhat enhanced, ability to slow down the delivery of fake views.

Although YouTube has made some advances in transparency this year, it still demonstrates a significant lack of transparency, providing much less in the way of facts, statistics, and takedown reports than Facebook and Twitter. It is clear that YouTube needs to prioritize combatting inauthentic behaviour. It is embarrassing that several large manipulation service providers maintain accounts on YouTube that publish promotional material and how-to videos for manipulating the platform. Furthermore, manipulation service providers continue to advertise their services successfully through Google Ads, a service of YouTube's parent company.
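The speed-of-delivery comparisons above reduce to reading a monitoring log at fixed checkpoints. A minimal sketch of that bookkeeping, assuming timestamped engagement counts collected while watching a post (the helper function and the log below are invented for illustration, not our actual tooling or data):

```python
from bisect import bisect_right

def percent_delivered(observations, purchased, checkpoints_h):
    """observations: time-sorted (hours_since_purchase, engagement_count) pairs.
    Returns the share of the purchased amount visible at each checkpoint."""
    times = [t for t, _ in observations]
    shares = {}
    for checkpoint in checkpoints_h:
        i = bisect_right(times, checkpoint)        # last observation <= checkpoint
        count = observations[i - 1][1] if i else 0
        shares[checkpoint] = 100 * count / purchased
    return shares

# Hypothetical monitoring log for one post where 1000 fake views were bought.
log = [(1, 120), (6, 310), (12, 500), (24, 640), (72, 850)]
print(percent_delivered(log, purchased=1000, checkpoints_h=[12, 72]))
# -> {12: 50.0, 72: 85.0}
```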

TikTok

The story of TikTok's ability to counter manipulation is a sad one, as manipulating TikTok is cheap, fast, and reliable. TikTok seems unable to identify and remove even the cheapest forms of manipulation from its platform and provides its users with only the most basic protections.

For the duration of our experiment, TikTok did not manage to remove any of our fake accounts or any of the fake engagements we had purchased. Engagements were delivered almost instantaneously; only comments required between 6 and 12 hours to be delivered, indicating that these may have been manually delivered onto the platform.

Given the ease and availability of manipulation, we were surprised to discover that manipulation services for TikTok are about 45 per cent more expensive than for Instagram on average (manipulating TikTok is in turn one tenth the price of manipulating Twitter on average). While TikTok comments are about twice the price of Instagram comments—roughly 2 EUR for 100 fake comments—fake TikTok views are as cheap as fake Instagram views, with a median price of 0,11 EUR for 1000 inauthentic views.

TikTok's countermeasures were largely non-existent. For the categories manipulation costs and speed of delivery we gave the platform the benefit of the doubt, noting some small possible responses. On a positive note, TikTok has started a programme to disclose information about its efforts to counter abuse on the platform.

We conclude that TikTok is far too easy to manipulate—so easy that perhaps the best way to describe the platform is simply as undefended. The only positive aspect of TikTok's ability to counter manipulation is that it casts the efforts of the other social media platforms in a much better light.
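The two price ratios quoted above also imply the relative cost of the extremes of the market. A quick consistency check (the inputs are the rounded figures from the text, so the result is approximate):

```python
# Average manipulation prices expressed relative to Instagram, using the
# rounded ratios quoted above.
tiktok_vs_instagram = 1.45   # TikTok ~45 per cent more expensive than Instagram
tiktok_vs_twitter = 0.10     # TikTok ~one tenth the price of Twitter

# If TikTok = 1.45 x Instagram and Twitter = TikTok / 0.10,
# then Twitter = 1.45 / 0.10 = 14.5 x Instagram.
twitter_vs_instagram = tiktok_vs_instagram / tiktok_vs_twitter
print(f"implied Twitter price: ~{twitter_vs_instagram:.1f}x Instagram")  # ~14.5x
```

This is consistent with Instagram being the cheapest of the five platforms to manipulate.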

Relative performance

The most important insight from this iteration of our study is that there continue to be significant differences among platforms in their ability to counter the manipulation of their services. From the near defenceless TikTok to industry leader Twitter, all platforms perform differently in different categories.

Twitter remains the industry leader in 2020, but Facebook is rapidly closing the gap with its impressive improvements. Instagram and YouTube are still struggling behind; while Instagram is slowly moving in the right direction, YouTube has made very few improvements. TikTok is the defenceless newcomer with much to learn.

Despite notable improvements by some, none of the five platforms we studied are doing enough to prevent the manipulation of their services. The manipulation service providers are still winning the digital arms race.

[Figure: illustration of the relative performance of Twitter (1st), Facebook (2nd), Instagram (3rd), YouTube (4th), and TikTok (5th), scored Poor/Improving/Good across seven categories, with relative change from 2019 shown in red: 1. Ability to block fake account creation; 2. Ability to identify and remove inauthentic accounts; 3. Ability to remove fake likes, views, etc. delivered by inauthentic accounts; 4. Manipulation costs (more expensive = harder to manipulate); 5. Speed of delivery (slower = harder to manipulate); 6. Responsiveness: fake account removal after being reported to the platform; 7. Transparency of efforts. Manipulation service providers are still winning.]

Errata: a previous version of this report showed the wrong relative figures by mistake. This year we have included a new category, 'Transparency of efforts', and removed the category assessing the ability to 'undo historic activity'.

What other content were the bots targeting?

Because of changes in the way in which social media platforms share data and allow for external data access, it is challenging to identify, monitor, and analyse which other types of accounts and content were targeted by the commercial bot networks we studied. On most of the platforms, we were unable to identify which accounts were being used to deliver views and likes; even when we could identify an account as being responsible for specific inauthentic efforts, we were often unable to see the other activities of that account.

This section provides some insight into the activity of manipulation service accounts by looking at the activity of accounts identified on Twitter, as Twitter alone provides the necessary transparency to allow us to discover which other coordinated activities the accounts engaged in.

We identified a network of inauthentic accounts that promoted 71 different topics over the course of a week, including financial services, politics, health, pornography, gambling, sports, and online influencers.

The financial services being promoted consisted mostly of spam-like cryptocurrency ads, while the goods and services promoted included chili sauce, makeup artists, cyber security specialists, and a monument installation service in St. Petersburg.

We identified bots promoting material relating to the US election. These efforts appeared to be bipartisan; we identified accounts promoting both pro-Biden and pro-Trump content. Other accounts pushed conspiracy material surrounding the Catalan independence struggle and the poisoning of Aleksej Navalny. We also identified accounts discussing the politics of Brazil and Ukraine, calling out alleged anti-Semitism in Lithuania, and expressing xenophobic content.

As noted in the 2019 report, influencers interested in inflating their follower numbers continue to be an important source of business for the commercial manipulation bot industry. This year we observed that sports and entertainment influencers had joined the usual online influencers using these services. Perhaps the most surprising observation was that bots were being used to inflate engagement on a post belonging to a major American artist.

Sadly, we also observed attempts to spread disinformation surrounding the COVID-19 pandemic, although with little apparent success, as the posts targeted contained fringe content that attracted few or no organic engagements.

Our conclusion from 2019 stands: manipulation services are still being used primarily for commercial purposes, but political actors are making minor forays into manipulating public discourse.

CASE STUDY: US PRESIDENTIAL ELECTIONS 2020

Are the platforms protecting verified accounts during elections?

Social media platforms have made numerous commitments to protect elections against manipulation such as the artificial inflation of likes, views, and comments.38, 39, 40 In light of the numerous recent attempts by hostile actors to interfere in democratic elections by weaponizing social media platforms, this is important and commendable work. Furthermore, it is vital that the work of the social media companies be independently assessed and audited to ensure that their countermeasures are robust and effective.

This experiment was conducted on the cusp of the US presidential election to assess the general ability of social media platforms to counter manipulation. In order to discover whether verified accounts on social media platforms are better protected than ordinary accounts, particularly during a US presidential election, we partnered with US Senators Grassley (R) and Murphy (D) to conduct a unique experiment.

Using commercial social media manipulation service providers, we bought fake shares, likes, and comments for social media posts by the two senators on Twitter, Facebook, and Instagram. These platforms were chosen because the two senators are active on them and their accounts contain content that met the stringent criteria we developed to reduce the risk of inadvertently influencing any current political discussions.

We chose six neutral, apolitical posts for our case study—one from each senator on each of the three platforms. The posts were at least one month old and didn't have any active engagement, such as comments or views. We then purchased the smallest baskets of manipulation services available: 1000 likes, 50 comments, and 50 shares on Facebook; 1000 likes and 50 comments on Instagram; and 1000 likes, 50 retweets, and 50 replies on Twitter.

For Facebook the comments were deployed as tasks, meaning that we gave instructions to the manipulation service provider, who in turn passed them on to the people performing the tasks. A higher degree of automation is possible on Twitter and Instagram, so here we provided pre-written comments that were posted by bots. Pre-written comments have the added benefit of being easier to trace after they are delivered (see the sketch below).

After having purchased the fake engagements, we monitored the posts closely for eight days before reporting a random sample of the accounts observed delivering the fake engagement. We then continued to monitor the posts for another five days to observe the proportion of reported accounts removed by the platforms and assess the enforcement capabilities of the social media companies.
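Because each pre-written comment is a unique string that we supplied ourselves, any account posting one of them identifies itself as a delivery account. A minimal sketch of this matching step (the handles and comment texts are invented for illustration):

```python
# Unique comment strings we supplied to the manipulation service provider.
prewritten = {
    "Great initiative, thank you!",
    "This is such an important topic.",
}

# (account_handle, comment_text) pairs collected while monitoring a post.
observed = [
    ("user_a81", "Great initiative, thank you!"),
    ("user_b17", "Love this."),                        # organic comment, not ours
    ("user_c42", "This is such an important topic."),
]

# Any account that posted one of our strings delivered bought engagement.
delivery_accounts = {handle for handle, text in observed if text in prewritten}
print(sorted(delivery_accounts))    # ['user_a81', 'user_c42']
```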

Instagram

We paid 7,3 USD for 1803 likes and 103 fake comments delivered successfully on Instagram.

The comments we bought on Instagram were delivered to the senators' posts within one hour and remained in place throughout the experiment; the bought likes started appearing after an hour. Ninety per cent of the inauthentic engagement we purchased was delivered within the eight-day observation period. The likes and comments were all delivered by relatively simple fake accounts that had few or no friends and no original content.

Facebook

We paid 26,3 USD for 2205 likes, 40 comments, and 110 shares delivered successfully on Facebook.

All of the fake likes were delivered within one hour, but over the subsequent hours and days the number of likes steadily declined. The manipulation service provider added more likes after four days to compensate for likes that had been removed, by which time roughly half of the original fake likes had been removed. At the end of the eight-day observation period roughly half the number of fake likes we had purchased remained.

The fake shares were delivered over a 24-hour period, and at the end of the observation period all the fake shares remained without any reductions. The manipulation provider encountered some challenges delivering the fake comments, as only 40 of 100 comments were delivered by the end of the observation period. All 40 comments were delivered between 6 and 24 hours after purchase.

We determined that most of the accounts delivering the inauthentic engagement were genuine accounts/users getting paid for delivering the engagement. However, some of the accounts looked suspiciously like simple bot accounts with little content and few friends. Almost all of the accounts seemed to originate in Asia. We determined this by looking at the names and languages used in other posts made by the accounts.

Twitter

We paid 28,4 USD for 220 likes, 75 replies, and 95 retweets delivered successfully on Twitter.

For one of the posts on Twitter, the likes were never delivered. For the other post, only about 20 per cent of the likes had been delivered after the eight-day observation period. However, at the end of the experiment roughly half of the likes had been delivered, which suggests that the likes were being drip-fed slowly to avoid being blocked by Twitter.

The retweets we purchased were delivered successfully to both posts. Numbers peaked within about six hours, after which a few of the retweets began to disappear. After eight days about ten per cent of the retweets had disappeared. Seventy-five per cent of the inauthentic replies were delivered within 24 hours, and none had been removed by the end of the experiment.

Analysis

Instagram was the easiest of the three platforms to manipulate. Manipulation services were delivered rapidly and nearly in full. Delivery of Facebook services was staggered; those that were delivered all in one go tended to disappear quickly as the platform's anti-manipulation algorithm rooted out the fake activity. Apparently, the manipulation service provider delivered the inauthentic likes slowly to circumvent prevention mechanisms.

The manipulation service provider we used did not offer automatically generated comments or comments delivered through automated accounts for Facebook; instead, the current model is to pay real people to perform specific tasks. Manipulation of this type is harder to detect, but also harder to scale.

This suggests that Facebook is winning against the manipulation providers in this area by denying them the ability to deliver fake comments automatically.

We observed no difference between the protection offered to senators and that provided to regular users on any of the three platforms in respect of inauthentic engagements delivered and removed. The speed and completeness of manipulation services was broadly the same for verified and regular accounts. However, we observed a significant difference in the way reported accounts were handled. At the end of the eight-day observation period we reported a sample of the accounts identified as delivering inauthentic engagement and then measured how many had been removed five days after reporting. Instagram removed 16 of 40 reported accounts (40%), Facebook removed 10 of 60 reported accounts (17%), and Twitter removed 5 of 50 reported accounts (10%). Rechecking removal rates after three weeks, we found that Facebook hadn't removed any more accounts, but both Instagram and Twitter had roughly doubled their removals, to 70% and 24% respectively. In removing reported accounts for the two senators, Facebook and Instagram significantly outperformed their results for regular users, while Twitter's performance remained constant. This suggests that reported accounts engaging with US political accounts may undergo additional scrutiny. Even so, given the simplicity of the reported accounts, manual assessment should have resulted in a much larger percentage of removals. Consequently, we are unable to rule out the possibility that reasons other than platform vigilance account for the difference.
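For reference, the five-day removal rates quoted above follow directly from the raw counts. A short recomputation (the counts are taken from this case study):

```python
# Counts from this case study: accounts reported per platform and
# removals observed five days after reporting.
reported = {"Instagram": 40, "Facebook": 60, "Twitter": 50}
removed_after_5_days = {"Instagram": 16, "Facebook": 10, "Twitter": 5}

for platform, total in reported.items():
    removed = removed_after_5_days[platform]
    print(f"{platform}: {removed}/{total} = {100 * removed / total:.0f}% removed")

# Instagram: 16/40 = 40% removed
# Facebook: 10/60 = 17% removed
# Twitter: 5/50 = 10% removed
```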

Case study conclusions

A significant proportion of the inauthentic engagements we purchased was successfully delivered by the manipulation service provider. With some important exceptions, a large proportion of the delivered engagements remained undetected for the duration of the experiment. The accounts delivering the manipulation also remained active throughout the experiment, and we observed that these accounts continued to deliver purchased engagements for other commercial clients throughout the monitoring period.

The results of this small study on verified accounts are comparable to our general findings for 2020. There is no clear evidence to suggest that the platforms added any safeguards for verified political accounts to counter this form of manipulation during the current US election cycle.

Despite the improved removal of reported accounts, the results of this case study are similar to those of last year's case study, when we bought engagement on posts by EU commissioners Dombrovskis, Jourová, Katainen, and Vestager. This indicates that the platforms have not made any major changes in how they protect verified accounts.

This leads us to conclude that verified institutional accounts are still no better protected against manipulation than any other accounts on the social media platforms. Our experiment also shows that at the height of the US presidential election it remained possible to buy fake engagements on the social media accounts of two US senators from a Russian manipulation service provider with relatively little pushback from the platforms in question. Efforts to protect official accounts and to counter bot networks and commercial manipulation must be significantly improved to protect online conversations.

CONCLUSIONS

Last year an important finding of our experiment was that the different platforms weren't equally bad—in fact, some were significantly better at identifying and removing manipulative accounts and activities than others. This year's iteration has made the differences even clearer, as we have seen great improvements by some platforms, while others have reached new lows. Investment, resources, and determination make a significant difference in the ability of social media companies to counter manipulation.

It has also become evident to us that social media companies have different strengths and weaknesses. This seems to indicate that they don't share information about lessons learned and best practices amongst themselves. Many of the challenges they face could be addressed more effectively if the companies improved communications, established forums, and chose to work jointly to combat the problem at hand. The need to work together has never been more evident—not only for social media companies but for our society as a whole. Telecom companies, online payment providers, web hosting services, search engines, and online advertising companies all need to come together to combat the digital manipulation industry.

It is still far too easy to manipulate social media platforms, even at the height of a US presidential election. Although we studied only commercial manipulation service providers, there is no doubt that other, more maliciously inclined actors could use the same techniques for political or security-related manipulation. Given the number of takedowns and the ample evidence of state-driven social media manipulation over the course of the past year, published and acknowledged by the platforms themselves, it is clear that antagonists continue to exploit social media loopholes to manipulate public discussions.

As antagonists find that the big social media platforms, primarily Twitter and Facebook, are slowly closing the door on them, it is reasonable to expect that they will look for other avenues of access, probably seeking to exploit and manipulate less defended platforms and online services. This risk underscores the continuing need to regulate digital services to ensure equal protections for online conversations regardless of where they occur.

Policy recommendations

The policy recommendations we presented in our initial report remain in force. The developments we have observed over the last year strengthen our conviction that our original recommendations are important and remain much needed.

1. Increase transparency and develop new safety standards

More transparency is needed to understand the scope and effect of available manipulation services. In particular, more detailed information is needed regarding the 'actors, vectors, targets, content, delivery mechanisms and propagation patterns of messages intended to manipulate'.41 In essence, it is important to know who is trying to manipulate social media platforms and to what effect. The current transparency reports tell us that billions of fake accounts are removed from the platforms every year, but we don't know what these accounts sought to achieve, or indeed what effects they delivered before being identified. To assess their impact on social media conversations, business, online advertising, and ultimately our democratic discourse, more transparency is needed. Furthermore, we need a common safety standard that allows watchdog agencies to compare reports from the different social media companies. Tech companies also need to be encouraged or forced to share technical data that would enable joint development of best practices and optimize capabilities for tracking and removing antagonists across platforms.

Finally, a system of independent auditing should be considered in order to build and maintain trust in the reports from the social media companies.

2. Establish independent, well-resourced oversight

Independent oversight would help provide the insight needed to better assess the progress of the social media companies in countering inauthentic activity on their platforms. Given the wide variation in the ability of social media platforms to counter manipulation, it is becoming ever clearer that impartial and independent assessment of the effectiveness of social media companies' attempts to counter platform manipulation is a necessity.

3. Deter social media manipulation

While we have focused on the ability of social media companies to protect their platforms, it is also important that we turn our attention to the industry that profits from developing the tools and methods that enable this interference. Lawmakers need to regulate the market for social media manipulation.

Political manipulation by domestic groups and foreign governments needs to be exposed, and perpetrators must be held accountable. Violators can be deterred by economic, diplomatic, and criminal penalties. The ongoing practice of widespread and relatively risk-free social media manipulation needs to stop.

4. Social media platforms need to do more to counter abuse of their services

Even though we have observed important improvements by social media companies over the past year, it is important that we continue to pressure them to do more to counter platform manipulation. Manipulation service providers continue to advertise and promote their services on the very platforms that they seek to undermine. It is embarrassing for the social media companies that they are unable to prevent the manipulation service providers from using their own platforms to market services designed to undermine platform integrity. It may well be that the incentives for platforms to tackle the problem are insufficiently strong—after all, fake clicks also generate advertising revenue.

5. A whole-of-industry solution is needed

Social media companies will not be able to combat social media manipulation as long as there isn't a whole-of-industry solution to the problem. Financial service providers such as PayPal, Visa, and Mastercard need to stop payments to the manipulation industry. Advertisers need to put sanctions on influencers who use social media manipulation to defraud advertisers, and on social media companies that allow it.

Implications for NATO

Developments in Ukraine, with its advanced antagonistic botfarms, have demonstrated the skill and determination of antagonists seeking to undermine and exploit social media conversations. Disclosures by social media companies underscore the intensity and determination of foreign states and other antagonists seeking to undermine the interests of the Alliance.

Social media manipulation continues to be a challenge for NATO; it is a potent tool for malicious actors seeking to undermine the interests of the Alliance. As the defences of the social media companies are still inadequate, we must continue to expect that antagonists will be able to exploit social media for malign purposes during times of peace, of crisis, and of war. Therefore, the Alliance must continue to develop and refine its strategies and its ability to communicate in a highly contested information environment.

The recent assessment published by the European Union underscores the fact that the ability of the social media companies to protect users in all countries and languages of the Union isn't evenly distributed. In fact, there is good reason to assume that some languages and regions of the Alliance are virtually unprotected by human moderators. NATO must demand full disclosure from social media companies about their intention and ability to ensure that all of the Alliance is protected against hostile foreign manipulation.42

ENDNOTES

1. cf. European Commission, 'Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions', 3 December 2020.
2. Facebook, 'Coordinated Inauthentic Behavior Archives'. [Accessed 30 October 2020].
3. Google, 'Threat Analysis Group (TAG)'. [Accessed 30 October 2020].
4. Twitter, 'Twitter Safety'. [Accessed 30 October 2020].
5. European Commission, 'Assessment of the Code of Practice on Disinformation', 10 September 2020.
6. European Commission, 'European Democracy Action Plan: Making EU Democracies Stronger'. [Accessed 3 December 2020].
7. SSU, 'СБУ блокувала діяльність потужної «ботоферми», яку координували куратори з Росії' [SSU blocked the activity of a powerful 'bot farm' coordinated by handlers from Russia]. [Accessed 30 October 2020].
8. SSU, 'У Бердянську СБУ блокувала роботу ботоферми, керованої з РФ' [In Berdiansk the SSU blocked the operation of a bot farm run from Russia]. [Accessed 30 October 2020].
9. SSU, 'СБУ викрила ботоферму, через яку поширювали фейки про COVID-19 та заклики до повалення конституційного ладу (відео)' [SSU exposed a bot farm used to spread fakes about COVID-19 and calls to overthrow the constitutional order (video)]. [Accessed 30 October 2020].
10. Graphika, 'IRA Again: Unlucky Thirteen'. [Accessed 30 October 2020].
11. Stanford Internet Observatory, 'Hacked and Hoaxed: Tactics of an Iran-Linked Operation to Influence Black Lives Matter Narratives on Twitter'. [Accessed 30 October 2020].
12. Facebook, 'Removing Coordinated Inauthentic Behavior', 8 October 2020.
13. Stanford Internet Observatory, 'Analysis of an October 2020 Facebook Takedown Linked to U.S. Political Consultancy Rally Forge'. [Accessed 30 October 2020].
14. Stanford Internet Observatory, 'Analysis of an October 2020 Facebook Takedown Linked to the Islamic Movement in Nigeria'. [Accessed 30 October 2020].
15. NATO StratCom COE, 'Robotrolling 2020/3'.
16. See, for example: Emilio Ferrara, Herbert Chang, Emily Chen, Goran Muric, and Jaimin Patel, 'Characterizing Social Media Manipulation in the 2020 U.S. Presidential Election', 19 October 2020.
17. Sebastian Bay and Rolf Fredheim, How Social Media Companies Are Failing to Combat Inauthentic Behaviour Online, 2019.
18. Statista, 'Most Used Social Media 2020'. [Accessed 21 November 2020].
19. NATO StratCom COE, The Black Market for Social Media Manipulation, December 2018.
20. Sebastian Bay and Rolf Fredheim, 'How Social Media Companies Are Failing to Combat Inauthentic Behaviour Online', 2019.
21. Jens Mattke, Christian Maier, Lea Reis, and Tim Weitzel, 'Herd Behavior in Social Media: The Role of Facebook Likes, Strength of Ties, and Expertise', Information & Management 57, no. 8 (December 2020).
22. Facebook, Community Standards: 'Inauthentic Behaviour'; Facebook, 'An Update on Information Operations on Facebook', 6 September 2017.
23. See, for example: Facebook, 'White Hat Programme'. [Accessed 21 November 2020].
24. At present we are unable to track bot survival rates on YouTube and TikTok.
25. TikTok, 'Transparency Center'. [Accessed 1 November 2020].

26. Twitter, 'Twitter Transparency Center'. [Accessed 1 November 2020].
27. Facebook, 'Facebook Transparency'. [Accessed 1 November 2020].
28. Google, 'Google Transparency Report'. [Accessed 1 November 2020].
29. Facebook, 'Facebook Transparency Report | Community Standards'. [Accessed 1 November 2020].
30. Twitter, 'Platform Manipulation—Twitter Transparency Center'. [Accessed 1 November 2020].
31. Google, 'Threat Analysis Group (TAG)', blog.google. [Accessed 1 November 2020].
32. Twitter, 'Twitter Safety'. [Accessed 30 October 2020].
33. Facebook, 'Coordinated Inauthentic Behavior Archives'. [Accessed 1 November 2020].
34. TikTok, 'Newsroom | Safety'. [Accessed 1 November 2020].
35. Facebook, 'Community Standards Enforcement Report, November 2020'. [Accessed 19 November 2020].
36. Facebook, 'Facebook Transparency Report | Community Standards'. [Accessed 22 November 2020].
37. Facebook, 'New Facebook and Instagram Research Initiative to Look at US 2020 Presidential Election'. [Accessed August 2020].
38. YouTube, 'YouTube Security & Election Policies—How YouTube Works'. [Accessed 10 November 2020].
39. Facebook, 'Helping to Protect the 2020 US Elections', 21 October 2019. [Accessed 10 November 2020].
40. Twitter, 'Additional Steps We're Taking Ahead of the 2020 US Election'. [Accessed 10 November 2020].
41. European Commission, 'Assessment of the Code of Practice on Disinformation', 10 September 2020.
42. Ibid.

Prepared and published by the NATO STRATEGIC COMMUNICATIONS CENTRE OF EXCELLENCE

The NATO Strategic Communications Centre of Excellence (NATO StratCom COE) is a NATO-accredited multinational organisation that conducts research, publishes studies, and provides strategic communications training for government and military personnel. Our mission is to make a positive contribution to the Alliance's understanding of strategic communications and to facilitate accurate, appropriate, and timely communication among its members as objectives and roles emerge and evolve in the rapidly changing information environment.

Operating since 2014, we have carried out significant research enhancing NATO nations' situational awareness of the information environment and have contributed to exercises and training with subject matter expertise.

www.stratcomcoe.org | @stratcomcoe | [email protected]
