Social Media Manipulation 2020: How Social Media Companies Are Failing to Combat Inauthentic Behaviour Online
SOCIAL MEDIA MANIPULATION 2020
HOW SOCIAL MEDIA COMPANIES ARE FAILING TO COMBAT INAUTHENTIC BEHAVIOUR ONLINE

PREPARED BY THE NATO STRATEGIC COMMUNICATIONS CENTRE OF EXCELLENCE

ISBN: 978-9934-564-92-5
Project Manager: Rolf Fredheim
Authors: Sebastian Bay, Anton Dek, Iryna Dek, Rolf Fredheim
Research: Singularex
Copy Editing: Anna Reynolds
Design: Kārlis Ulmanis

This report was completed in November 2020, based on an experiment conducted from September to October 2020. Singularex is a Social Media Intelligence and Analytics company based in Kharkiv, Ukraine. Website: www.singularex.com Email: [email protected]

NATO STRATCOM COE
11b Kalnciema Iela
Riga LV1048, Latvia
www.stratcomcoe.org
Facebook/stratcomcoe
Twitter @stratcomcoe

This publication does not represent the opinions or policies of NATO or NATO StratCom COE. © All rights reserved by the NATO StratCom COE. Reports may not be copied, reproduced, distributed or publicly displayed without reference to the NATO StratCom COE. The views expressed here do not represent the views of NATO.

EXECUTIVE SUMMARY

Introduction

Antagonists, from foreign governments to terror groups, anti-democratic groups, and commercial companies, continually seek to manipulate public debate through coordinated social media manipulation campaigns. These groups rely on fake accounts and inauthentic behaviour to undermine online conversations, causing online and offline harm to both society and individuals.

As a testament to the continued interest of antagonists and opportunists alike in manipulating social media, a string of social media companies, researchers, intelligence services, and interest groups have detailed attempts to manipulate social media conversations during the past year. It therefore remains essential to evaluate whether the social media companies are living up to their commitments to counter misuse of their platforms.

To contribute to this evaluation, we re-ran our ground-breaking experiment to assess the platforms' ability to counter the malicious use of their services. This year we made a significant effort to further improve our methodology, and we also added a fifth social media platform, TikTok, to our experiment.

The experiment

To test the ability of social media companies to identify and remove manipulation, we bought engagement on thirty-nine Facebook, Instagram, Twitter, YouTube, and TikTok posts, using three high-quality Russian social media manipulation service providers. For 300 € we received inauthentic engagement in the form of 1 150 comments, 9 690 likes, 323 202 views, and 3 726 shares on Facebook, Instagram, YouTube, Twitter, and TikTok, enabling us to identify 8 036 accounts being used for social media manipulation.

While measuring the ability of social media platforms to block fake account creation, to identify and remove fake activity, and to respond to user reports of inauthentic accounts, we noted that some of the platforms studied had made important improvements. Other platforms, however, exhibited a continued inability to combat manipulation. Of the 337 768 fake engagements purchased, more than 98 per cent remained online and active after four weeks, and even discounting fake views, more than 80 per cent of the 14 566 other fake engagements delivered remained active after a month.
While these numbers aren't much to celebrate, we did observe significant pushback, primarily from Facebook and Twitter. In some cases, Facebook managed to remove as much as 90 per cent of the fake views before the manipulation service providers restored their work. This is an important step in the right direction.

Another win this year was a significant improvement by Facebook in blocking the creation of inauthentic accounts. Yet despite being the most effective at this, Facebook was the only platform studied to show an increasing half-life for active inauthentic accounts. While none of the measures introduced by any of the platforms are robust enough to stop persistent users or organisations from manipulation, both Facebook and Twitter have made it significantly more challenging.

Twitter stands out in its ability to remove inauthentic accounts from the platform; fake accounts disappear 40 per cent faster than in 2019, indicating that Twitter is three times faster than Facebook at removing accounts engaged in inauthentic activity.

Twitter and Facebook are now relatively good at blocking the simpler automatic manipulation services, most notably automated comments, pushing manipulation service providers to rely on human labour to a greater extent. While there isn't yet any clear indication that this is driving up prices, any human-based manipulation will be more expensive than an automated solution.

YouTube maintains its position as the least effective of the four major platforms we tested last year, and we have not seen any meaningful improvements since then. In fact, since our first report, YouTube has been overtaken by both Facebook and Instagram.

TikTok, the latest platform added to our experiment, seems to be nearly defenceless against platform manipulation. None of the manipulation was prevented or removed by the platform, making it by a distance the easiest to manipulate. In hindsight, we now realise that the other platforms could be doing worse.

Only YouTube can counteract fake views in any significant way. On Facebook, Instagram, Twitter, and TikTok, fake views continue to be quickly delivered and are seemingly never removed. On Instagram, we managed to get 250 000 fake views delivered within an hour. Antagonists use fake views to manipulate platform algorithms and thereby influence what social media users see. This remains a significant challenge for all platforms.

Our investigation has also shown that reporting an account for confirmed social media manipulation does not induce a platform to block that account. Of the 499 accounts (100 to Facebook, 100 to Instagram, 100 to YouTube, 100 to Twitter, and 99 to TikTok) we reported as verified to have engaged in inauthentic behaviour, 482 remained active five days after they were reported. Indeed, in all cases where we received feedback on the accounts we reported, it was to state that the user in question did not violate the platform's terms of service. We conclude that reporting and moderation mechanisms must be improved so that a larger share of the inauthentic accounts reported to the platforms are removed, even if only a single user reports them. Reported inauthentic accounts must not regularly escape sanction.

This ongoing and evolving threat underscores the need for a whole-of-society approach to defining acceptable online behaviour, and to developing the frameworks necessary to impose economic, diplomatic, or legal penalties potent enough to deter governments, organisations, and companies from breaking the norms of online behaviour.

Conclusions

The most important insight from this experiment is that platforms continue to vary in their ability to counter manipulation of their services. Facebook-owned Instagram shows how this variation exists even within companies. Instagram remains much easier to manipulate than Facebook and appears to lack serious safeguards. Tellingly, the cost of manipulating Instagram is roughly one tenth of the cost of targeting Facebook.

In 2020, Twitter is still the industry leader in combating manipulation, but Facebook is rapidly closing the gap with impressive improvements. Instagram and YouTube are still lagging behind, but while Instagram is slowly moving in the right direction, YouTube seems to have given up. TikTok is the defenceless newcomer with much to learn.

Despite significant improvements by some, none of the five platforms is doing enough to prevent the manipulation of their services. Manipulation service providers are still winning.

Based on our experiment, we recommend that governments introduce measures to:

1. Increase transparency and develop new safety standards for social media platforms
2. Establish independent and well-resourced oversight of social media platforms
3. Increase efforts to deter social media manipulation
4. Continue to pressure social media platforms to do more to counter the abuse of their services

The evidence is clear: schemers around the world take every opportunity to manipulate social media platforms for a variety of commercial, criminal, and political reasons.

INTRODUCTION

One year ago, the NATO StratCom Centre of Excellence carried out a ground-breaking experiment to assess the ability of social media companies to counter the malicious use of their services. We showed that an entire industry had developed around the manipulation of social media,¹ and concluded that social media companies were experiencing

In September 2020 the European Commission presented an assessment of the effectiveness of the Code of Practice on Disinformation one year after implementation. The Commission concludes that the social media platforms have put policies in place to counter the manipulation of their services, but lack of trans-