978-9934-564-92-5
SOCIAL MEDIA MANIPULATION 2020 HOW SOCIAL MEDIA COMPANIES ARE FAILING TO COMBAT INAUTHENTIC BEHAVIOUR ONLINE
PREPARED BY THE NATO STRATEGIC COMMUNICATIONS CENTRE OF EXCELLENCE
Project Manager: Rolf Fredheim
Authors: Sebastian Bay, Anton Dek, Iryna Dek, Rolf Fredheim
Research: Singularex
Copy Editing: Anna Reynolds
Design: Kārlis Ulmanis
This report was completed in November 2020, based on an experiment that was conducted September to October 2020.
Singularex is a Social Media Intelligence and Analytics company based in Kharkiv, Ukraine. Website: www.singularex.com Email: [email protected]
NATO STRATCOM COE 11b Kalnciema Iela Riga LV1048, Latvia www.stratcomcoe.org Facebook/stratcomcoe Twitter @stratcomcoe
This publication does not represent the opinions or policies of NATO or NATO StratCom COE. © All rights reserved by the NATO StratCom COE. Reports may not be copied, reproduced, distributed or publicly displayed without reference to the NATO StratCom COE. The views expressed here do not represent the views of NATO.
EXECUTIVE SUMMARY
Introduction

Antagonists, from foreign governments to terror groups, anti-democratic groups, and commercial companies, continually seek to manipulate public debate through the use of coordinated social media manipulation campaigns. These groups rely on fake accounts and inauthentic behaviour to undermine online conversations, causing online and offline harm to both society and individuals.

As a testament to the continued interest of antagonists and opportunists alike in manipulating social media, a string of social media companies, researchers, intelligence services, and interest groups have detailed attempts to manipulate social media conversations during the past year. Therefore, it continues to be essential to evaluate whether the social media companies are living up to their commitments to counter misuse of their platforms.

In an attempt to contribute to the evaluation of social media platforms, we re-ran our ground-breaking experiment to assess their ability to counter the malicious use of their services. This year we spent significant effort to improve our methodology further, and we also added a fifth social media platform, TikTok, to our experiment.

The experiment

To test the ability of social media companies to identify and remove manipulation, we bought engagement on thirty-nine Facebook, Instagram, Twitter, YouTube, and TikTok posts, using three high-quality Russian social media manipulation service providers. For 300 € we received inauthentic engagement in the form of 1 150 comments, 9 690 likes, 323 202 views, and 3 726 shares on Facebook, Instagram, YouTube, Twitter, and TikTok, enabling us to identify 8 036 accounts being used for social media manipulation.

While measuring the ability of social media platforms to block fake account creation, to identify and remove fake activity, and to respond to user reports of inauthentic accounts, we noted that some of the platforms studied had made important improvements. However, other platforms exhibited a continued inability to combat manipulation. Of the 337 768 fake engagements purchased, more than 98 per cent remained online and active after four weeks, and even discounting fake views, more than 80 per cent of the 14 566 other fake engagements delivered remained active after a month.
While these numbers aren't much to celebrate, we did observe significant pushback, primarily from Facebook and Twitter. In some cases, Facebook managed to remove as much as 90 per cent of the fake views before the manipulation service providers restored their work. This is an important step in the right direction.

Yet another win this year was a significant improvement by Facebook in blocking the creation of inauthentic accounts. But despite being the most effective at this, Facebook was the only platform studied to show an increasing half-life for active inauthentic accounts. While none of the measures introduced by any of the platforms are robust enough to stop persistent users or organisations from manipulation, both Facebook and Twitter have made it significantly more challenging.

Twitter and Facebook are now relatively good at blocking simpler automatic manipulation services, most notably automated comments, pushing manipulation service providers to rely on human labour to a greater extent. While there isn't any clear indication that this is driving up prices yet, any human-based manipulation will be more expensive than an automatic solution.

YouTube maintains its position as the least effective of the four major platforms we tested last year, and we have not seen any meaningful improvements since then. In fact, since our first report, YouTube has been overtaken by both Facebook and Instagram.

TikTok, the latest platform added to our experiment, seems to be nearly defenceless against platform manipulation. None of the manipulation was prevented or removed by the platform, making it by a distance the easiest to manipulate. In hindsight, we now realise that the other platforms could be doing worse.

Only YouTube can counteract fake views in any significant way. On Facebook, Instagram, Twitter, and TikTok, fake views continue to be quickly delivered, and are seemingly never removed. On Instagram, we managed to get 250 000 fake views delivered within an hour. Antagonists use fake views to manipulate platform algorithms to influence what social media users see. This remains a significant challenge for all platforms.

Twitter stands out in its ability to remove inauthentic accounts from the platform; fake accounts disappear 40 per cent faster than in 2019, indicating that Twitter is three times faster than Facebook at removing accounts engaged in inauthentic activity.

Our investigation has also shown that reporting an account for confirmed social media manipulation does not induce a platform to block that account. Of the 499 accounts (100 to Facebook, 100 to Instagram, 100 to YouTube, 100 to Twitter, and 99 to TikTok) we reported as verified to have engaged in inauthentic behaviour, 482 accounts remained active five days after they were reported. Indeed, in all cases where we received feedback on the accounts we reported, it was to state that the user in question did not violate the platform's terms of service. We conclude that reporting and moderation mechanisms must be improved so that a larger share of inauthentic accounts reported to the platforms are removed, even if only a single user reports them. Reported inauthentic accounts must not regularly escape sanction.

Conclusions

The most important insight from this experiment is that platforms continue to vary in their ability to counter manipulation of their services. Facebook-owned Instagram shows how this variation exists even within companies. Instagram remains much easier to manipulate than Facebook and appears to lack serious safeguards. Tellingly, the cost of manipulating Instagram is roughly one tenth of the cost of targeting Facebook.

In 2020, Twitter is still the industry leader in combating manipulation, but Facebook is rapidly closing the gap with impressive improvements. Instagram and YouTube are still struggling behind, but while Instagram is slowly moving in the right direction, YouTube seems to have given up. TikTok is the defenceless newcomer with much to learn.

Despite significant improvements by some, none of the five platforms is doing enough to prevent the manipulation of their services. Manipulation service providers are still winning.

This ongoing and evolving threat has underscored the need for a whole-of-society approach to defining acceptable online behaviour, and to developing the frameworks necessary to impose economic, diplomatic, or legal penalties potent enough to deter governments, organisations, and companies from breaking the norms of online behaviour.

Based on our experiment, we recommend that governments introduce measures to:

1. Increase transparency and develop new safety standards for social media platforms
2. Establish independent and well-resourced oversight of social media platforms
3. Increase efforts to deter social media manipulation
4. Continue to pressure social media platforms to do more to counter the abuse of their services
INTRODUCTION
One year ago, the NATO StratCom Centre of Excellence carried out a ground-breaking experiment to assess the ability of social media companies to counter the malicious use of their services. We showed that an entire industry had developed around the manipulation of social media1, and concluded that social media companies were experiencing significant challenges in countering the manipulation of their platforms.

Since then the companies have committed to improving their defences, especially ahead of the 2020 US presidential elections. Facebook,2 Google,3 and Twitter4 have all updated their policies and increased transparency regarding the manipulation of their platforms during the past year, but it continues to be difficult, often impossible, to independently assess the effectiveness of their efforts.

In September 2020 the European Commission presented an assessment of the effectiveness of the Code of Practice on Disinformation one year after implementation. The Commission concludes that the social media platforms have put policies in place to counter the manipulation of their services, but the lack of transparency is still too great to enable a thorough evaluation of the impact of social media manipulation.5 As a response to this enduring challenge, the European Commission's recently launched Action Plan for Democracy outlines a commitment by the Commission 'to overhaul the Code of Practice on Disinformation into a co-regulatory framework of obligations and accountability of online platforms'.6

As antagonists and opportunists continue to ramp up their efforts to manipulate social media, researchers, intelligence services, independent interest groups, and the social media companies themselves have published numerous reports over the past year detailing attempts to manipulate conversations on these platforms. The Ukrainian Security Service (SSU) documented its recent discovery of a number of advanced bot farms,7,8,9 and many other agencies have reported on the continued activity of the now-notorious Internet Research Agency in Russia,10 and on a multitude of state actors and other organisations, ranging from Iran11 and China12 to lobby groups13 and terror organisations14, involved in social media manipulation. There are also many academic articles and studies detailing the systematic manipulation of conversations, including our own quarterly Robotrolling15 report and several recent studies on the US presidential election.16

The evidence is clear: schemers around the world take every opportunity to manipulate social media platforms for a variety of commercial, criminal, and political reasons. It will remain important to evaluate how well social media companies are living up to their commitments, and to independently verify their ability to counter the misuse of their platforms.

Building on our previous work, we decided to re-run the research17 we conducted in 2019, using experimental methods to assess the ability of social media companies to counter manipulation on their platforms. To further refine our methodology for this iteration of the experiment, we decided to focus on a smaller number of posts manipulated by a smaller number of manipulation service providers, which allowed us to track differences among the responses of the platforms studied, rather than the relative performance of the manipulators. Another change was the addition of the Chinese-owned social media platform TikTok to our experiment. TikTok has become one of the fastest-growing social media platforms, currently ranked as the seventh largest with almost 700 million active users.18

In the context of the US presidential election, we partnered with US Senators Chuck Grassley (Republican Party) and Chris Murphy (Democratic Party) to assess to what extent their social media accounts in particular, and verified social media accounts in general, are protected against manipulation.

Our experiment provides original, and much needed, insight into the ability of social media companies to counter the abuse of their platforms.
The Social Media Manipulation Industry
Many of the conclusions from our initial report, The Black Market for Social Media Manipulation,19 and from last year's20 iteration of this report still hold true—the manipulation market remains functional and most orders are delivered in a timely and accurate manner. Social media manipulation remains widely available, cheap, and efficient, and continues to be used by antagonists and spoilers seeking to influence elections, polarise public opinion, sidetrack legitimate political discussions, and manipulate commercial interests online.

The industry feeds the market for inauthentic comments, clicks, likes, and follows. Buyers range from individuals seeking to boost their popularity to influencers gaming the online advertising system to state-level actors with political motivations. Social media manipulation relies on inauthentic accounts that engage with other accounts online to influence public perception of trends and popularity. Some inauthentic accounts are simple bot-controlled accounts without profile pictures or content, used only to view videos or retweet content as instructed by a computer program. Others are elaborate 'aged' accounts with long histories meant to be indistinguishable from genuine users.

Bots are a very cost-efficient way of generating artificial reach and creating a wave of 'social proof', as typical users are more likely to trust and share content that has been liked by many others.21 Bot-controlled accounts cost only a few cents each and are expected to be blocked relatively quickly. More elaborate inauthentic accounts require some direct human control. They can cost several hundred dollars to purchase and often remain online for years.

Developments in 2020

During the past year, most indicators have signalled that manipulation service providers are prospering. They are upgrading their technical capabilities and improving the marketing of their services. The larger providers are updating their frontend and backend systems as well as adding new services. There is evidence to suggest that the volume of engagement for sale is increasing. We have found manipulation service providers offering up to 10 million fake views on Twitter and Facebook, 50 million fake views on Instagram, and as many as 100 million fake views on IGTV. There is also a greater plurality of services on offer, with an increase in manually controlled audience-specific services. We have identified what we assess as being possible efforts to introduce AI text generation for fake comment delivery.

Despite increasing pushback from a number of social media platforms, all major manipulation providers identified by us have remained in business and several new actors have emerged. Together, the three manipulation providers we used for this experiment claim to have completed more than 17 million orders; one of them, with 27 employees, serves more than 190 000 regular customers.
Social media manipulation continues to be cheap and readily available through a multitude of professional social manipulation service providers. In certain cases, simple automated manipulation is no longer available and only well-resourced manipulation service providers are able to manipulate all platforms. While we are seeing evidence that platform pushback is starting to have an effect, in 2020 manipulation service providers remain in the lead in the digital arms race.
Three insights
1. The scale of the industry is immense. The infrastructure for developing and maintaining social media manipulation software, generating fictitious accounts, and providing mobile proxies is vast. We have identified hundreds of providers. Several have many employees and generate significant revenues. It is clear that the problem of inauthentic activity is extensive and growing.
2. During the past year the manipulation industry has become increasingly global and interconnected. A European service provider will likely depend on Russian manipulation software and infrastructure providers who, in turn, will use contractors from Asia for much of the manual labour required. Social media manipulation is now a global industry with global implications.
3. The openness of this industry is striking. Rather than lurking in a shadowy underworld, it is an easily accessible marketplace that most web users can reach with little effort through any search engine. In fact, manipulation service providers still advertise openly on major social media platforms and search engines.
THE EXPERIMENT
The aim of our experiment was twofold: first, we aimed to further develop and test a methodology for assessing the ability of social media platforms to identify and counter manipulation using commercial manipulation service providers; second, we used our updated methodology to evaluate the performance of the social media companies in 2020 in comparison to their performance in 2019 as assessed by our initial experiment.

For this year's iteration of the experiment we used three reliable Russian social media manipulation service providers to buy engagement on Facebook, Instagram, Twitter, YouTube and, for the first time, on TikTok. We purchased engagement with accounts set up specifically for this experiment. All posts in our purpose-made accounts were of an apolitical nature to avoid any risk of actual impact beyond the framework of the experiment.

In contrast to the 2019 experiment, we did not seek to assess the performance of manipulation service providers; instead we selected reliable providers and monitored their service delivery to encourage them to do their utmost to deliver the services we purchased. The 2020 experiment was designed primarily to test the ability of social media companies to withstand manipulation from well-resourced commercial manipulation service providers. Setting up the experiment this way allowed us to better compare the relative performance of the social media platforms tested.
How is this relevant?
How is buying a thousand fake likes on a fake account relevant for assessing the overall ability of social media companies to counter manipulation? It could be argued that bought manipulation is more likely to be reported, and therefore detected, if it engages with current content that influences actual conversations online.

This time we were not testing the ability of social media managers or the public to detect and report inauthentic activity, or the ability of the companies to remove posts based on content and context alone. Instead we wanted to test the ability of the platforms to detect and remove networks of inauthentic users actively manipulating public conversations online.

By using commercial manipulation service providers and identifying and tracking the accounts they were using to deliver manipulation, we could judge the ability of social media companies to identify bot networks and inauthentic coordinated behaviour.

To lay to rest any question of a difference between buying engagement on real and on fake accounts, we conducted a separate experiment to test whether there is any difference in the level of protection between verified accounts and ordinary accounts. For this experiment we partnered with US Senators Grassley (R) and Murphy (D) and bought manipulation on one apolitical, neutral post on the Facebook, Twitter, and Instagram accounts of both men.
The experiment was conducted the week before the US Presidential election, during a time when social media companies were expected to be on special alert and to have implemented heightened detection mechanisms. This case study was intended to assess the maximum protection capabilities of the social media platforms studied.
One objection to our research design is that in selecting apolitical content, we may not have been testing platforms' ability to counter manipulation that might mislead platform users and impact the online conversation. However, conducting such an experiment would be hard to justify from an ethical perspective [see page 14]. In any case, the platforms claim to implement algorithms to detect platform misuse broadly, not just within the context of information operations. Facebook's community standards, for instance, prohibit specific behaviours (inauthentic behaviour, and especially coordinated inauthentic behaviour), rather than just specific content.22
[Timeline of the experiment, 2020]

Sep–Oct: Buying social media manipulation, tracking delivery, and tracking the ability of the social media platforms to identify and remove the manipulation
Oct–Nov: Data analysis and verification
Nov: Reporting a random sample of the identified inauthentic accounts to the social media companies
Nov–Dec: Tracking the ability of social media companies to remove reported, confirmed inauthentic accounts
The scale and timeline of the experiment

For the 2019 version of our experiment, we bought engagement on 105 different posts on Facebook, Instagram, Twitter, and YouTube using 16 different manipulation service providers. In 2020 we focused on three reliable providers and increased the quantities of engagement purchased. Last year we spent 300 € to buy 3 530 comments, 25 750 likes, 20 000 views, and 5 100 followers, enabling us to identify 18 739 accounts being used for social media manipulation.

This year we spent 300 € and received 1 150 comments, 9 690 likes, 323 202 views, and 3 726 shares on Facebook, Instagram, YouTube, Twitter, and TikTok, enabling us to identify 8 036 accounts being used for social media manipulation. Below, we compare the cost of our basket of manipulation services to assess whether prices are rising year by year. On average, services were a bit more expensive than in 2019, but still roughly the same as in 2018.

Our work was carried out during six weeks in September and October 2020. To assess the ability of the platforms to remove inauthentic engagement, we monitored our bought engagement from the moment of purchase to one month after it appeared online. We reported a sample of the inauthentic engagement to the social media companies and continued monitoring to measure the time it took for the platforms to react.
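The totals quoted throughout this report are internally consistent and can be cross-checked with a few lines of arithmetic. A minimal sketch (the dictionary labels are ours; the figures are taken directly from the text):

```python
# Engagement purchased in the 2020 experiment, as reported above.
engagement_2020 = {
    "comments": 1_150,
    "likes": 9_690,
    "views": 323_202,
    "shares": 3_726,
}

total = sum(engagement_2020.values())        # all fake engagements
non_view = total - engagement_2020["views"]  # engagements excluding views

print(total)     # 337 768, the total quoted in the executive summary
print(non_view)  # 14 566, the 'other fake engagements' figure
```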
Five steps of the experiment
1. Buying likes, comments, views, and followers for neutral posts on our own inauthentic accounts
2. Buying likes, comments, views, and followers for neutral apolitical posts
3. Tracking performance and response time of social media platforms in removing inauthentic activity
4. Tracking how long inauthentic accounts stay on the platform and with what they engage
5. Tracking how long it takes to remove accounts after reporting a random sample
During the experiment, we recorded how quickly the manipulation service providers were able to deliver their services. We then collected data on how the five social media platforms responded to the manipulated content by periodically measuring whether it had been removed. The experiment was organised into the five steps visualised above.

The ethics of the experiment

An important part of this experiment was minimizing the risks involved, to ensure that private individuals were insulated from the experiment to the highest degree possible. While it would have been possible to design an experiment to assess whether commercially purchased manipulation can influence public conversations, such research would be unethical in that it would interfere with genuine discussions and undermine important values such as freedom of speech.

Our experiment was set up so that we could minimize risk and carefully monitor any inadvertent effects. In order to achieve this, we chose to buy the fake engagement—views, likes, comments, and follows—using our own fake accounts. We continuously monitored our accounts to ensure there would be no authentic human engagement on them. We also chose apolitical and trivial content to engage with, and all bought engagements were strictly designed and monitored to minimize risk to real online conversations.

For the case study in which we engaged with content on the social media profiles of the two senators, we agreed with them in advance which posts would be targeted and how. We made sure to engage only with apolitical, dated content without any ongoing authentic conversations. We had also prepared interventions to alert any real users of our experiment had we noted any authentic interactions with the targeted posts.

Throughout the experiment we did not observe any indication that our false engagement had been noticed by authentic online users. For this reason, we concluded that we successfully managed to conduct the experiment without causing any harm to genuine online conversations.

Furthermore, we acted in the spirit of the white hat programs of the social media companies themselves; these programs recognise the importance of external security researchers while emphasizing the importance of protecting the privacy, integrity, and security of users. We interacted with the two real, verified accounts only after having obtained explicit consent from the senators. We spared no effort to avoid privacy violations and disruptions to real users; we did not access any real user data nor did we in any way attempt to exploit identified weaknesses for any reason other than testing purposes.23

Finally, we made every effort to minimize the amount of bought engagement, to avoid unnecessarily supporting manipulation service providers. This year we reduced the number of posts engaged with and capped the amount spent at 300 €.
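The monitoring routine described in this section (checking each piece of bought engagement periodically for a month and recording when it disappears) can be sketched as follows. This is a minimal illustration, not the actual tooling used in the experiment; `is_still_visible` is a hypothetical callback standing in for a platform-specific lookup:

```python
from datetime import datetime, timezone
import time

def monitor_engagement(engagement_ids, is_still_visible,
                       checks=30, interval_seconds=86_400):
    """Check each engagement `checks` times, `interval_seconds` apart,
    recording the first time it is observed to be gone (None = still live)."""
    removal_times = {eid: None for eid in engagement_ids}
    for i in range(checks):
        for eid, removed_at in removal_times.items():
            if removed_at is None and not is_still_visible(eid):
                removal_times[eid] = datetime.now(timezone.utc)
        if i < checks - 1:
            time.sleep(interval_seconds)  # e.g. one check per day for a month
    return removal_times
```

From the resulting timestamps one can compute the share of engagement still active after four weeks, which is the kind of figure quoted in the executive summary.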
Surprisingly, 10 € will buy more than a thousand comments on Facebook-owned Instagram, but the same amount will only fetch 130 comments on Facebook.
ASSESSMENT OF THE PLATFORMS
Our assessment criteria
We assessed the performance of the five social media companies according to seven criteria measuring their ability to counter the malicious use of their services: 1) Blocking the creation of inauthentic accounts, 2) Removing inauthentic accounts, 3) Removing inauthentic activity, 4) Cost of services, 5) Speed and availability of manipulation, 6) Responsiveness, and 7) Transparency of efforts. We have further developed and refined our criteria since last year's experiment.

For this year's iteration we removed the criterion assessing the ability of social media platforms to undo the activity of inauthentic accounts. While it is important that the activity of inauthentic accounts is removed together with the accounts themselves, this has become increasingly difficult to assess. Last year we determined that false likes bought on Instagram remained even after the account used to deliver them had been blocked.

The seventh criterion is new for this iteration. It was added to acknowledge the transparency of platform policy enforcement. We argue that increased transparency will reduce public harm by holding antagonists responsible and by improving users', researchers', and politicians' awareness of the scale and effects of social media manipulation.

These criteria can serve as general benchmarks for assessing the ability of platforms to counter social media manipulation.
1. Success in blocking the creation of inauthentic accounts: Inauthentic accounts are critical for the functioning of manipulation services, and platforms aim to prevent their creation. Blocking accounts raises the barrier for manipulation, making it more difficult and costly.
2. Ability to detect and remove inauthentic accounts: This ability is important to combat the spread, impact, and 'time-on-platform' of inauthentic activity.
3. Ability to detect and remove inauthentic activity: Given the speed of social media, timely detection is important for limiting the effects of social media manipulation.
4. Cost of purchasing manipulation: The more costly it is to buy manipulation, the less likely it is that large-scale campaigns will be carried out.
5. Speed of delivery: Rapid and successful delivery of manipulation indicates that a platform has insufficient protection. Slow delivery indicates providers need to drip-feed interventions to avoid anti-manipulation efforts.
6. Responsiveness to reports of inauthentic activity: As a last resort, platforms turn to user moderation to detect fraudulent activity. The ability of the platforms to quickly assess and respond to reports is an important part of combating platform abuse.
7. Transparency of actions: The transparency with which social media companies communicate takedown efforts and results increases accountability and contributes to reduced public harm.
1. Blocking the creation of inauthentic accounts

Blocking the creation of fake or inauthentic accounts is perhaps the most important step social media platforms can take to prevent abuse. The easier and faster it is to create fake accounts, the easier it will be to manipulate social media activity.

In order to assess the ability of the social media companies to protect their users from manipulation, we attempted to create fake accounts using a variety of methods. We already knew that resellers of fake accounts provide detailed instructions on how to prevent them from being blocked; these typically include a set of preconfigured cookies to use with the account, dedicated proxies, specific IP locations that should be avoided, and other configuration descriptions required in order to run the account without getting banned. It was clear from these detailed instructions that significant improvements had been made in detecting suspicious account activity.

The biggest change this year was that Facebook had added a number of new features for blocking inauthentic accounts. It now requires significant effort to bypass Facebook's protection protocols; the account-generation industry must overcome these obstacles, and some of the larger suppliers now provide specific tools and guidelines to help their customers.

Ultimately, however, none of the protection measures currently in place are robust enough to stop persistent users or organisations from creating inauthentic accounts on any of the platforms we studied. The continued low cost and effectiveness of manipulation services is proof of this, as we will see in the coming chapters. YouTube, Instagram, and TikTok are in dire need of improved protections. This is especially surprising since Instagram and Facebook are owned by the same company, yet their protection protocols are worlds apart.

2. Removing inauthentic accounts

The longer bots and inauthentic accounts used to deliver manipulation remain on a platform, the lower the cost for the manipulation service providers, as they don't have to spend time and money to replace blocked accounts.

Last year only 17 per cent of the inauthentic accounts identified on Instagram and YouTube had been removed after six weeks, making them the worst-performing platforms for blocking inauthentic accounts. Facebook ranked third, having removed 21 per cent. The least-poorly-performing platform was Twitter, which succeeded in removing 35 per cent.

This year, the figures remained roughly the same at 0.4 per cent account removal per day. However, in calculating the half-life of all accounts identified as being used to deliver fake engagement, we can see that it has decreased from 128 days to 118 days, so there is actually a slight increase in the platforms' ability to remove them. The mean lifespan for inauthentic accounts has decreased from 224 days to 203 days before removal; although marginal, this is progress.

While social media companies continue to report that they block millions of inauthentic accounts annually, we are unable to verify the effectiveness of these efforts. On the contrary, inauthentic accounts seem to be active for quite a long time before they are detected and deleted. While it has become more difficult to create a fake account, we see only marginal improvements in decreasing the lifespan and half-life of recognised inauthentic accounts.

While the average lifespan of an inauthentic account on Facebook, Twitter, and Instagram24 has decreased over the past year, Facebook stands out as the only platform with an increased half-life for inauthentic accounts. Twitter was the only platform that managed to significantly reduce half-life. This does not necessarily mean that the situation is worse on Facebook—these figures are calculated from small sample sizes, but they are sufficient to demonstrate there has been no measurable improvement.

We conclude that there are significant differences among platforms, and that likely more can be done to reduce the half-life of active inauthentic accounts, especially with regard to the cheap and simple kind of inauthentic accounts that we tracked for this experiment.

3. Removing inauthentic activity

Removing inauthentic activity is the process of identifying fake engagement and removing it after it has been delivered. The faster the activity is removed, the smaller the effect it will have on social media conversations.

Last year we showed that social media companies struggled to identify and remove fake activity, as the vast majority of all the fake engagement was still online four weeks after delivery.
BOT HALF-LIFE BY PLATFORM (2019 vs 2020)

Facebook: 124 days (2019), 147 days (2020)
Twitter: 68 days (2019), 41 days (2020)
Instagram: 193 days (2019), 165 days (2020)
Average: 128 days (2019), 118 days (2020)
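The report does not specify how its half-life figures are estimated. As a point of reference, a constant per-day removal probability can be translated into an implied half-life with a simple exponential-decay model; this is a sketch under that assumption, not a claim about the report's actual method:

```python
import math

def half_life_days(daily_removal_rate: float) -> float:
    """Half-life implied by a constant per-day removal probability.

    Assumes exponential decay: the share of accounts still alive after
    t days is (1 - daily_removal_rate) ** t; the half-life is the t at
    which that share falls to 0.5.
    """
    return math.log(0.5) / math.log(1.0 - daily_removal_rate)

# At the reported 0.4 per cent removal per day, a constant rate would
# imply a half-life of roughly 173 days. The observed 118-day average
# suggests removal rates vary across platforms and account batches
# rather than following a single constant rate.
print(round(half_life_days(0.004)))  # → 173
```

The gap between the constant-rate estimate and the observed average is itself informative: removals appear to arrive in bursts (for example, platform sweeps) rather than uniformly over time.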
The platforms performed relatively well only with regard to fake followers—roughly half had been removed after four weeks.

This year's results were roughly the same for Instagram and YouTube, with few engagements removed. Our new subject, TikTok, did equally poorly. Of a total 337 768 delivered fake engagements across all platforms, more than 98 per cent remained online and active after four weeks, and even discounting fake views, more than 80 per cent of the 14 566 fake engagements delivered remained active after a month. Twitter and Facebook showed progress this year, demonstrating active efforts to counter manipulation during the four-week test cycle.

Facebook's removal of inauthentic accounts resulted in a U-shaped pattern of the false engagement being added, removed, and then replaced by the manipulation service providers. Fake engagement gradually began to be removed after five days and reached maximum removal after about two weeks. On Twitter, some manipulation service providers had their engagement removed after roughly 12 hours. To bypass the protection protocols on Twitter and Facebook, the providers now drip-feed engagement; this seems to work rather well. It does, however, reduce the speed of manipulation service delivery, making it harder to effectively manipulate 'live' conversations.

All platforms still face significant challenges in countering fake video views. Only YouTube has a somewhat effective system for countering fake views; all other platforms continue to be practically defenceless. Fake views continue to be delivered quickly to Facebook, Instagram, Twitter, and TikTok and seem never to be removed. As this form of manipulation is the cheapest, we are led to speculate that the manipulation service providers might be using a different method to manipulate the number of views than they use for delivering likes and comments—one that manages to exploit flaws in how the platforms count views instead of relying on fake accounts to engage.

4. Cost of services

The cost of manipulation is a good indicator of how effectively social media platforms are combating manipulation. If accounts used to perform manipulation are removed, manipulation service providers have to spend time and money to replace them. When social media platforms redesign their service to make the scripts used to seed manipulation obsolete, developers have to update their scripts. These costs are passed on to consumers.

We compared the price of a basket of manipulation consisting of 100 likes, 100 comments, 100 followers, and 1000 views from five Russian manipulation service providers to arrive at a mean price for 2020. The result is quite revealing. YouTube is the most expensive platform to manipulate at a cost of 11,12 € for the basket. Twitter joins YouTube at the high end with a price of 10,44 €. A Facebook manipulation basket is moderately priced at 8,41 €, while TikTok and Instagram are the cheapest at 2,73 € and 1,24 € respectively.
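The basket comparison is mechanical: price each item, weight by the basket quantities, and average across providers. A minimal sketch of that calculation; the provider price lists below are invented for illustration and are not the report's raw data:

```python
from statistics import mean

# The report's comparison basket: 100 likes, 100 comments,
# 100 followers, and 1000 views.
BASKET = {"likes": 100, "comments": 100, "followers": 100, "views": 1000}

def basket_price(unit_prices: dict) -> float:
    """Cost of one basket given a provider's per-item prices in EUR."""
    return sum(qty * unit_prices[item] for item, qty in BASKET.items())

# Illustrative per-item prices from two hypothetical providers.
providers = [
    {"likes": 0.02, "comments": 0.08, "followers": 0.01, "views": 0.0001},
    {"likes": 0.03, "comments": 0.06, "followers": 0.02, "views": 0.0002},
]
mean_price = mean(basket_price(p) for p in providers)
print(f"Mean basket price: {mean_price:.2f} €")  # prints: Mean basket price: 11.15 €
```

Averaging over several providers, as the report does, smooths out individual pricing quirks and makes year-on-year comparisons more stable.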
We found an increase of roughly 20 per cent in the price of a basket of manipulation services across all social media platforms from 2018 to 2020. However, a closer look at specific services reveals a tendency for comments to become more expensive, while the cost of likes remains the same or has got cheaper. Average price levels are roughly the same as in 2018, meaning that there has been no significant change in prices over the past two years. There has been a slight decrease in the cost of manipulation services for YouTube, and a slight increase for Twitter. Prices for Instagram and Facebook manipulation services remain roughly the same as in 2018; however, prices for Facebook manipulation had dropped significantly in 2019.

Fake views continue to be the cheapest form of manipulation—about ten times cheaper than fake likes and follows, and roughly one hundred times cheaper than fake comments. While this ratio remains roughly the same, there is a significant difference between how much manipulation can be bought for 10 € on the various social media platforms. While 10 € will buy 90 000 views on Instagram, it will only buy you 7000 views on YouTube. Surprisingly, 10 € will buy more than a thousand comments on Facebook-owned Instagram, but the same amount will only fetch 130 comments on Facebook.

Over the past year, fake comments have become more expensive, which is a positive development, but social media manipulation in general continues to be cheap and readily available. There have been no significant price increases since last year, and the differences in price between services and platforms remain relatively unchanged.
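The "what does 10 € buy" comparison is a straight division of budget by unit price. A small sketch; the 0.11 € per 1000 views rate matches the median view price the report cites for the cheapest platforms, while any other figure would simply be substituted in:

```python
def units_for_budget(budget_eur: float, price_per_1000_eur: float) -> int:
    """How many engagements a budget buys at a given price per 1000 items."""
    return int(budget_eur / price_per_1000_eur * 1000)

# At roughly 0.11 € per 1000 views, 10 € buys on the order of 90 000
# views, consistent with the Instagram figure quoted above.
print(units_for_budget(10, 0.11))  # → 90909
```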
[Figure: Change in the cost of a basket of manipulation services, 2018–2020 (y-axis € 0–16), by platform (Twitter, Facebook, YouTube, Instagram, TikTok) and average. Each basket comprises 100 likes, 100 comments, and 1000 views, plus 100 followers (100 friends on Facebook, 100 subscribers on YouTube).]
[Figure: How much manipulation can you buy for 10 euro? Likes, comments, views, and followers obtainable for 10 € on YouTube, Facebook, Twitter, Instagram, and TikTok. Buying comments is the most expensive form of manipulation; buying views on TikTok and Instagram is cheaper than on other platforms.]
[Figure: Price comparison for manipulation services across YouTube, Facebook, Twitter, Instagram, and TikTok (prices per service, y-axis approximately € 0–2).]
5. Speed and availability of manipulation

Last year we found that manipulation services for Instagram in general were the most reliable, while comment services for Facebook, Twitter, and YouTube were the least reliable. This has changed over the past year, and there is enough evidence to conclude that platform efforts to improve security have had some effect. This is most notable on Facebook, where automated comment services are no longer widely available. Instead, manipulation service providers must now pay real humans to deliver fake comments. While this service has existed for quite some time, the fact that cheaper automated services are no longer available on Facebook is a testament to the effectiveness of Facebook's efforts.

Although it is important to recognise and encourage any improvements made by the platforms, social media manipulation continues to be widely accessible and professional. The providers we used are highly professional and customer-oriented, offering discounts, money-back guarantees, and 24/7 support.

Comparing the speed of delivery of all manipulation services, excluding failed deliveries, across all social media companies studied, we found that roughly 60 per cent of all manipulation services were delivered after 24 hours. Manipulations on Twitter arrive most quickly but are also removed most quickly (on average). TikTok performs worst; manipulations are delivered almost instantaneously and remain over time.

Views and likes are still delivered relatively quickly. Views are usually delivered within an hour, with the exception of YouTube, where it took as long as a month for all the fake views we had purchased to be delivered. On Instagram, we managed to get 250 000 fake views delivered within an hour. Likes were usually also delivered within an hour; when it took longer, we surmise that this was likely because the manipulation service providers were slow to launch their services, rather than a result of platform countermeasures.

There has been an overall reduction in the speed of delivery of manipulation services. In 2019, most services were delivered in full within an hour. The main difference in 2020 is a reduction in the availability and speed of comment services on Facebook and Twitter. YouTube manipulation delivery is roughly equal to that of 2019; manipulation services are rather slow, and large orders of more than ten thousand views require days or weeks to be delivered in full. For the other platforms, views are delivered as quickly and effectively this year as they were last year.

Platform pushback, primarily from Facebook and to some extent from Twitter, is challenging primarily automatic manipulation services, most notably comments. Another significant change is that comment services are now more often drip-fed, meaning that it now takes slightly longer for manipulation services to deliver purchased comments safely.
[Figure: Comparison of speed of delivery of fake engagements: cumulative share of purchased engagements (0–100%) delivered 12, 24, 48, and 72 hours after purchase, by platform (TikTok, YouTube, Instagram, Facebook, Twitter).]
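The delivery-speed measurement (for example, the finding that roughly 60 per cent of services were delivered within 24 hours) reduces to a cumulative count over per-engagement delivery times. A minimal sketch; the timing data below is illustrative, not the experiment's raw data:

```python
def delivery_curve(delivery_hours, checkpoints=(12, 24, 48, 72)):
    """Cumulative share of purchased engagements delivered by each checkpoint.

    `delivery_hours` holds, per engagement, the hours elapsed between
    purchase and observed delivery; failed deliveries are excluded,
    as in the report's comparison.
    """
    total = len(delivery_hours)
    return {h: sum(1 for t in delivery_hours if t <= h) / total
            for h in checkpoints}

# Illustrative timings for ten engagements (hours after purchase).
timings = [0.5, 1, 2, 6, 18, 20, 30, 40, 60, 70]
print(delivery_curve(timings))  # → {12: 0.4, 24: 0.6, 48: 0.8, 72: 1.0}
```

Plotting one such curve per platform yields exactly the kind of comparison shown in the figure above: faster-rising curves indicate weaker friction against manipulation delivery.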
6. Responsiveness

After the end of phase one of the experiment, we reported 100 random accounts used for social media manipulation to each of the platforms in our study, and then monitored how many accounts the platforms were able to remove within five days. It is worth reiterating that the accounts we reported were the accounts that delivered the manipulation we bought, meaning that we were 100 per cent certain the accounts we reported were engaging in social media manipulation. We reported these accounts anonymously.

In 2019, removal rates left a lot to be desired. Facebook removed 12 per cent, Twitter and Instagram 3 per cent, and YouTube 0 per cent of the reported accounts after three weeks. Given the speed of online conversations, this year we shortened the assessment time to five days.

Our reporting produced better results in 2020, although the platforms' response is still woefully insufficient to combat manipulation effectively. Facebook was most successful, removing nine of the 100 accounts reported. Twitter followed close behind, removing seven accounts and temporarily suspending two others. Instagram removed just one of the reported accounts, and YouTube and TikTok didn't remove any.

To better compare this year's figures to last year's, we re-checked removals three weeks after the time of reporting. At that time Twitter was in first place, having removed twelve accounts and suspended eight; Facebook had removed thirteen accounts, Instagram three, TikTok two, and YouTube still hadn't removed a single account.

This marginal increase suggests that time is not the factor that determines whether a reported account is removed. It was possibly the natural rate at which fake accounts get removed, rather than our reporting, that led to the small increases in removals over time.

It is clear that reporting an account as engaging in social media manipulation does not significantly contribute to any of the platforms removing or suspending that account, even if the validity of the report is confirmed. Indeed, in all cases where we received feedback about an account we had reported, it was the platform letting us know that the user in question had not violated the platform's terms of service. We suspect that platforms are using quantitative indicators rather than a manual assessment of reported accounts to determine abuse. Manual vetting of the reported accounts would have clearly demonstrated that, given the frequency and diversity of the engagement, the reported accounts were delivering paid services.

We conclude that reporting and moderation mechanisms must be improved so that a larger share of reported inauthentic accounts are removed, even if they are reported by a single user. It is not satisfactory that reported inauthentic accounts regularly escape sanction.

[Figure: Accounts removed after reporting, per 100 reported: Facebook 9, Twitter 7, Instagram 1, YouTube 0, TikTok 0.]

7. Transparency of efforts

All the major social media platforms studied for this report have increased their communication efforts regarding measures undertaken to counter abuse. Their efforts are in alignment with the European Code of Practice on Disinformation and respond to the political pressure the platforms were facing ahead of the US presidential election of 2020. Today TikTok,25 Twitter,26 Facebook,27 and Google28 all publish transparency reports of various kinds, some of which relate to fake accounts and other forms of inauthentic coordinated behaviour.

Facebook reported that in Q2 2020, 1,5 billion fake accounts were removed for violating the community standards, a slight reduction from 2019, which was attributed to the platform's increased ability to block the creation of fake accounts.29 Twitter's most recent report on fake account removal is from Q3/Q4 2019, when the platform reported that it had sanctioned 2,3 million accounts and suspended close to 800 thousand; Twitter also reported a 30 per cent increase in accounts sanctioned for platform manipulation.30 We have been unable to find detailed information on fake account removal or on platform manipulation in general for Google and TikTok. Google does provide information on advanced threats through its Threat Analysis Group.31

Twitter's safety blog,32 Facebook's news archives,33 and TikTok's Newsroom reporting on safety-related matters34 provide further information pertaining to specific threats and manipulation activities, often focusing on specific content being taken down.

Facebook should be commended for the detailed takedown reports it is now publishing on a monthly and case-by-case basis. Twitter should be commended for publishing data that outside researchers can study. The continuous improvements by the Google Threat Analysis Group are also a positive step, even if the content of its reports has yet to reach the level of detail and transparency that Facebook is delivering.

While all of these efforts represent steps in the right direction, much more can be done in terms of contextualizing information, conducting more thorough audits and publishing the results, and developing disclosure frameworks and collaborative transparency best practices jointly with other platforms. From the information currently being provided by the platforms, it continues to be difficult to assess how the threat landscape is developing and how well counter-efforts are working.
ASSESSMENT OF EACH PLATFORM’S RELATIVE STRENGTHS AND WEAKNESSES
The performance of each social media platform is assessed based on the criteria introduced above. To assess the relative performance of each of the social media platforms, we have given a rating to each company for each category. These qualitative ratings are based on a joint assessment by the research team.

Facebook

Facebook is the most effective social media platform at blocking the creation of inauthentic accounts. It uses advanced analytics and techniques, such as requesting users to upload videos to verify their accounts, to identify fake accounts and keep them from making it onto the platform. In order to create a fake account on Facebook, advanced IP-address and cookie control is now necessary; this has prompted manipulation service providers to offer lengthy tutorials about how to prevent a fake account from being blocked. We conclude that creating fake accounts on Facebook has become significantly more difficult in the last year.

Despite being the most effective at blocking the creation of fake accounts, Facebook was the only platform studied that showed an increase in the half-life of active inauthentic accounts—currently an average of 147 days. In this regard Facebook is significantly behind Twitter.

However, Facebook has made important strides in removing inauthentic activity. A portion of the likes we had purchased was gradually removed over a period of two weeks. We also observed a reduction in the speed of manipulation delivery, especially for fake comments, indicating that manipulation service providers can no longer automate such delivery.

The price of Facebook manipulation decreased between 2018 and 2019, only to return to 2018 levels again in 2020. This suggests that an increase in protection protocols is starting to translate into higher manipulation costs.

Facebook has not improved its ability to remove reported inauthentic accounts—at 9 per cent after five days, the ratio is still far too low to make an impact. Facebook did, however, increase its transparency and is currently publishing regular updates on takedown activities. On 19 November 2020, Facebook further refined and updated its community standards page,35 and provided additional information regarding fake accounts.36 We are still missing important details about how Facebook compiles its statistics and would welcome independent assessment and auditing of the data and reports published by the company. Facebook's current research initiatives are a crucial step in the right direction, and we look forward to studying the results.37

Facebook has had a good year. We have noted improvements in every category of measurement, although some are only marginal. We commend the platform's improvements in blocking fake account creation and removing fake activity, and its success in stopping some forms of automatic manipulation. However, the fake accounts that are created still survive too long, and Facebook's responsiveness to reporting leaves much to be desired.

Instagram

Even though Instagram is owned by Facebook, it is far less effective at countering platform abuse. In fact, Instagram is outperformed by its parent company in every category we assess. Manipulating Instagram remains cheap and effective.

We observed no significant improvements in blocking or removing fake accounts. However, manipulation delivery is slower now, and we saw that manipulation providers encountered challenges resulting in only partial deliveries for specific kinds of manipulation, such as mentions. It remains to be seen whether these 'wins' for the platform are temporary.

Instagram's failure to counter manipulation is perhaps most evident in the cost of manipulation. Instagram continues to be the cheapest platform to manipulate, even cheaper than TikTok. Moreover, prices are currently falling, suggesting that manipulation services are highly available.

In assessing Instagram's performance in comparison to 2019, we observed only marginal improvements—primarily that fake likes and comments are now delivered significantly more slowly than last year. Fake views are still delivered instantaneously.

We conclude that Instagram remains very easy to manipulate and that whatever slight improvements it has introduced in the last year have made little difference. Instagram's parent company, Facebook, should share its growing expertise.

Twitter

Twitter continues to be the most effective platform overall at countering manipulation, outperforming Facebook in several areas but falling behind in others. Facebook is slowly gaining on Twitter's lead.
We observed less pushback against the creation of fake accounts by Twitter this year than in 2019. None of our own accounts were blocked this year, and we observed fewer countermeasures in 2020 than in 2019. Facebook's accomplishments show that more can be done by Twitter in this field.

We observed continued improvements by Twitter in all other categories. Most significantly, Twitter stands out for its ability to remove inauthentic accounts. Fake accounts are disappearing 40 per cent faster than in 2019, making Twitter three times faster than Facebook.

Twitter has also improved its ability to remove fake activity and to slow down the speed of delivery of inauthentic engagement. Although half of the bought likes were delivered within an hour, Twitter had effectively removed many within 12 hours. After 24 hours 80 per cent had been removed, and after two days virtually all of the fake likes had vanished. However, the drip-fed engagement proved more difficult. Twitter blocked a significant proportion (as much as 80 per cent), but the remaining 20 per cent was still online four weeks after delivery.

Of the 100 random accounts we reported to Twitter, 7 per cent were removed and 2 per cent were suspended after five days. This is an improvement over last year, when only 3 per cent were blocked three weeks after reporting, but the overall performance is still far too low to make an actual impact.

Twitter was the only platform to show a year-on-year increase in the price of manipulation services; however, the rate of increase slowed from 15 per cent between 2018 and 2019 to only 4,5 per cent between 2019 and 2020.

In terms of transparency, Twitter lags behind Facebook in updates and regular takedown disclosures but excels in making datasets available for external review by the scientific community. Independent audits of methodology would be useful for Twitter and for the other platforms.

Twitter improved in all categories except for blocking account creation and should be commended for the strides it has made in its ability to remove inauthentic accounts and activity.

We conclude that Twitter maintains its position as the most effective platform at countering manipulation, but it can do more to block the creation of fake accounts and to increase its ability to remove reported inauthentic accounts.

YouTube

YouTube remains mired in its position as the least effective of the four major platforms. When we added TikTok this year, YouTube lost the distinction of being the worst platform, but only because TikTok is worse.

We are unable to accurately assess the half-life of identified but active YouTube accounts used to deliver manipulation because of the platform's lack of transparency; it simply isn't possible for outside researchers to identify the accounts delivering the most inauthentic content. There was virtually no reduction in fake comments or the accounts used to deliver them; only three of 380 fake comments delivered were eventually removed. On the basis of this performance, we conclude that YouTube removed few or no inauthentic accounts on its platform. Furthermore, YouTube removed none of the accounts we reported for inauthentic activity, a result worse than last year.

YouTube does a better job than the other platforms in removing inauthentic views, as none of the other platforms managed to remove any of the inauthentic views we bought. We observed a ten per cent reduction after three to four weeks, and a 50 per cent reduction after eight weeks. However, we observed that fake views were removed for some videos and not for others, indicating an uneven distribution of YouTube countermeasures. Interestingly, we experienced significant over-deliveries this year; manipulation service providers delivered 70–90 per cent more engagement for some YouTube services than we had bought.

The delivery of inauthentic likes is nearly instantaneous on YouTube, while inauthentic views and comments are either drip-fed or added in increments. The average speed of delivery, across all services, for YouTube was equal to that for Instagram and Facebook after 12 hours, but after 72 hours we observed 85 per cent delivery on YouTube compared to only 45–60 per cent for the other platforms. This means that manipulation services, on average, can be delivered to YouTube much more quickly than to Twitter, Instagram, or Facebook. But, as mentioned above, YouTube is the only platform able to counter fake views. Currently all the manipulation providers we tested drip-feed fake views onto YouTube at a rate of 500 to 4000 views per day, with significant fluctuations over time. It took the manipulation providers 6–12 hours to deliver the first 1000 views, and then another four weeks to reach 10 000 views. YouTube outperforms every other platform tested in this category but isn't able to stop fake views from appearing on its platform.

YouTube maintains its position as the most expensive platform to manipulate; however, it was the only platform for which the price of manipulation services had decreased since 2019. Although the 3,5 per cent reduction is smaller than the 25 per cent reduction from the year before, the trend is still negative; YouTube is getting cheaper to manipulate over time, and fewer inauthentic accounts and engagements were removed in 2020 than in 2019. The primary positive development was a continued, and somewhat enhanced, ability to slow down the delivery of fake views.

Although YouTube has made some advances in transparency this year, it still demonstrates a significant lack of transparency, providing much less in the way of facts, statistics, and takedown reports than Facebook and Twitter.

It is clear that YouTube needs to prioritize combatting inauthentic behaviour. It is embarrassing that several large manipulation service providers maintain accounts on YouTube that publish promotional material and how-to videos for manipulating the platform. Furthermore, manipulation service providers continue to advertise their services successfully through Google Ads, the advertising platform of YouTube's parent company, Google.
TikTok

The story of TikTok's ability to counter manipulation is a sad one, as manipulating TikTok is cheap, fast, and reliable. TikTok seems unable to identify and remove even the cheapest forms of manipulation from its platform and provides its users with only the most basic protections.

For the duration of our experiment, TikTok did not manage to remove any of our fake accounts or any of the fake engagements we had purchased. Engagements were delivered almost instantaneously; only comments required between 6 and 12 hours to be delivered, indicating that these may have been manually delivered onto the platform.

Given the ease and availability of manipulation, we were surprised to discover that manipulation services for TikTok are about 45 per cent more expensive than for Instagram on average (manipulating TikTok in turn is one tenth the price of manipulating Twitter on average). While TikTok comments are about twice the price of Instagram comments—roughly 2 EUR for 100 fake comments—fake TikTok views are as cheap as fake Instagram views, with a median price of 0,11 EUR for 1000 inauthentic views.

TikTok's countermeasures were largely non-existent. For the categories of manipulation cost and speed we gave the platform the benefit of the doubt, noting some small possible responses. On a positive note, TikTok has started a program to disclose information about its efforts to counter abuse on the platform.

We conclude that TikTok is far too easy to manipulate—so easy that perhaps the best way to describe the platform is simply as undefended. The only positive aspect of TikTok's ability to counter manipulation is that it casts the efforts of the other social media platforms in a much better light.

Relative performance

The most important insight from this iteration of our study is that there continue to be significant differences among platforms in their ability to counter the manipulation of their services. From the near-defenceless TikTok to industry leader Twitter, all platforms perform differently in different categories.

Twitter remains the industry leader in 2020, but Facebook is rapidly closing the gap with its impressive improvements. Instagram and YouTube are still struggling behind, but while Instagram is slowly moving in the right direction, YouTube has made very few improvements. TikTok is the defenceless newcomer with much to learn.

Despite notable improvements by some, none of the five platforms we studied are doing enough to prevent the manipulation of their services. The manipulation service providers are still winning the digital arms race.
[Assessment table: each platform rated Poor / Improving / Good in each category; figures in red indicate relative change from 2019.]