Towards a Deception Detection Framework for Social Media
25th ICCRTS, Virtual event, 2–6 and 9–13 November 2020, "The Future of Command and Control"
Paper number: 091
Topic 1: C2 in the Information Age

Bruce Forrester
Defence R&D Canada – Valcartier
2459 Pie-XI North, Quebec, QC, G3J 1X5
Tel.: (418) 844-4000 #4943
[email protected]

Friederike Von Franqué
Von Franqué Consulting
0176-83076104
[email protected]

Published as Defence Research and Development Canada External Literature (P), DRDC-RDDC-2020-P228, December 2020.

Abstract

The democratization of communication media has had significant consequences for military command and control. Never before have adversaries had such free and direct access to our local populations, allowing for influence through propaganda, disinformation and deception. In fact, social media platforms can help target messages to the exact demographic desired, while keeping attribution hidden. Commanders have been reluctant to embrace the new communication technologies, which has left them playing catch-up. Meanwhile, our opponents have infiltrated our communication spaces and 'recruited' thousands of followers who spread their messages unwittingly. This paper presents a new research framework for deception that helps overcome the problem of attributing messages to their originators.
Concentrating on uncovering narratives, methods and intent, rather than individuals, alleviates many of the ethical problems of social media analytics within western societies. The framework will help guide research on deception detection and increase assessment confidence for intelligence analysts and public affairs staff involved in the ongoing information-environment clashes of power and politics.

1. Introduction

In warfare, the consequences of deception can be severe and decisive. Not surprisingly, deceiving and influencing the enemy, or one's own population for that matter, is not new. It has existed for thousands of years; as Sun Tzu stated in The Art of War, "the supreme art of war is to subdue the enemy without fighting". Operational and tactical commanders use forms of deception on a regular basis to keep the actual or future adversary misinformed, or simply guessing, about military resources, processes or future plans. Influence campaigns designed to sow division, garner support, or simply create chaos in adversary countries have likewise been used before. However, this aspect of the art of war is easier than ever due to the affordances of cyberspace and the democratization of information and communications via social media.

A recent example is the suspected involvement of Russia in the 2016 US presidential election. It seems clear that outside forces were at play and trying to influence voters. For example, Timberg [1] states: "There is no way to know whether the Russian campaign proved decisive in electing Trump, but researchers portray it as part of a broadly effective strategy of sowing distrust in U.S. democracy and its leaders. The tactics included penetrating the computers of election officials in several states and releasing troves of hacked emails that embarrassed Clinton in the final months of her campaign."

Russian cyber and influence activities in Ukraine during the annexation of Crimea, which was accomplished with almost no fighting, have been well documented [2-4]. In fact, Berzin [3] states that "the Russian view of modern warfare is based on the idea that the main battle-space is the mind, and as a result, new-generation wars are to be dominated by information and psychological warfare…" (p. 5). Such influence activities point to a new focus of warfare: one conducted in cyberspace, that exploits deception and influence, and that broadcasts messages using social media. Within the cyber domain, Russia's tool kit includes the "weaponization of information, culture and money" [5] in an effort to inject disinformation and confusion and to proliferate falsehoods. This is accomplished by using all available information channels, such as TV news channels (e.g. RT, Sputnik), newspapers, YouTube channels (e.g. RT), blogs and websites [6], as well as state-sponsored trolls [7, 8] who are ever-present on many social media outlets. Most of these means are combined and intermingled to repeat alternative narratives in many places, effectively strengthening the perceived authenticity of the message.

One of the most important channels for disseminating information operations is social media, and its reach has become expansive. Social media provides access to literally billions of people at a very granular level, where even regionally targeted news can create viral explosions of a scale that has real effects in the physical world.
For example, Pizzagate is the case of Edgar Welch, who took his AR-15 rifle to the Comet Ping Pong pizzeria in Washington to save children from ritualistic child abuse [9]. A convoluted conspiracy theory that originated and spread via social media ended with Welch actually going to the pizzeria with a rifle.

Different deception techniques use social media as a platform or have been newly developed for this environment. Established propaganda techniques such as lies, misinformation, exaggeration, omission, bluffs and white lies meet newly available digital deception and propaganda techniques such as deep fakes, phishing and trolling, and each can lead to unique manifestations within social media. It is not easy to spot these deception techniques; on the contrary, it is easy to become overwhelmed and muddled in one's approach to detecting deception. Even fact-checking websites (e.g. Snopes.com, Google Fact Check), which are designed to help people differentiate fact from fiction, are being faked [10]. To help researchers and operators deal with this complexity, a comprehensive detection framework is required.

This paper presents an empirically based framework for deception detection in social media. The framework will allow for directed research, the production of indicators, and the development of algorithms. Single pieces of information are not usually considered sufficient to make solid recommendations and decisions. Once validated, the framework will allow triangulation between its parts in order to increase confidence in any declaration of deception detection (see the illustrative sketch following Section 1.1), thus improving a commander's situational awareness and decision-making capability.

1.1 What is Deception?

Many areas of expertise have developed definitions of deception and its close relatives: lies, fraud, trickery, delusion, misguidance, or misdirection. In military deception operations, the objective is often to make the opponent believe some falsehood about military strength or to obfuscate future actions. A famous example was Operation Fortitude, embedded in the planning of the invasion of Normandy in 1944: for a non-existent army, stage designers built dummy tanks made of rubber and dummy airports with wooden airplanes, while officers engaged in radio communication, ordered food supplies and discussed fictitious attack plans [11].

While there is no precise overlap in the definitions of deception from domain to domain, the definitions are close to each other and there are at least four characteristics common to most [12, 13]:

a. The intent is deliberate;
b. The intent is to mislead;
c. Targets are unaware; and
d. The aim is to transfer a false belief to another person.

The goal of a deception campaign is to get the target or population to do, or not do, something that the deceiver wants, and thus to give the deceiver a measure of control over the targets' actions or general behaviour. It can be to confuse, delay, or waste the resources of an opponent, to put the receiver at a disadvantage, or to hide one's own purpose. Deception can also occur to discredit and divide. Deception is an art form and, as such, will never remain static but will continue to evolve and change.
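To make the notion of triangulation introduced above concrete, the following minimal Python sketch shows one way that independent indicator scores produced by different parts of a detection framework could be fused into a single confidence value. It is purely illustrative: the component names, the example scores, and the choice of a naive-Bayes log-odds fusion rule are assumptions made for this sketch, not elements of the framework proposed in this paper.

from math import log, exp

def fuse_indicators(probabilities, prior=0.5):
    """Fuse independent deception-indicator probabilities into one overall
    confidence value using a naive-Bayes log-odds combination.

    Assumes the indicators are conditionally independent and that each
    probability was produced under a uniform (0.5) prior."""
    # Start from the prior odds that the content is deceptive.
    log_odds = log(prior / (1.0 - prior))
    for p in probabilities:
        # Clamp to avoid log(0) from over-confident indicators.
        p = min(max(p, 1e-6), 1.0 - 1e-6)
        log_odds += log(p / (1.0 - p))
    # Convert the accumulated log-odds back to a probability.
    return 1.0 / (1.0 + exp(-log_odds))

# Hypothetical scores from three independent framework components
# (the component names are assumptions for illustration only).
scores = {"narrative_analysis": 0.70, "account_behaviour": 0.65, "network_structure": 0.80}
fused = fuse_indicators(scores.values())
print(f"Fused deception confidence: {fused:.2f}")  # approximately 0.95

With these hypothetical scores, the fused confidence (about 0.95) exceeds that of any single indicator, which is the intuition behind triangulating between framework parts before declaring that deception has been detected; any operational framework would, of course, need to account for correlated indicators rather than assume independence.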