
This report was published on the Behavioural Sciences website

https://bscresearch.com.sg

Title: How to protect yourself from fake news? Five tips from professional fact-checkers

Author(s): Chen Xingyu, Senior Research Analyst, HTBSC; Neo Loo Seng, Principal Research Analyst, HTBSC

Citation: Chen, X., & Neo, L. S. (2020). How to protect yourself from fake news? Five tips from professional fact-checkers (HTBSC Research Report 36/2020). Home Team Behavioural Sciences Centre.

Copyright © 2020. All rights are reserved. Views expressed in this publication are the authors’ only and do not represent or imply any official position or view. This publication is intended to stimulate further discussion about the topic.

Uploaded in July 2020. Please direct any correspondence to [email protected]

How to protect yourself from fake news? Five tips from professional fact-checkers

Resilience, Safety and Security Branch Theme: Resilience to fake news, Fact-checking

Research Report 36/2020 Accurate as of 7 September 2020

AUTHORS Chen Xingyu, Senior Behavioural Sciences Research Analyst; Neo Loo Seng, Principal Behavioural Sciences Research Analyst

SYNOPSIS Cultivating good online behaviours and media literacy skills in the public is an essential component of fighting the spread of fake news. This report identified five tips by examining the behaviours of fact-checkers and their recommendations for countering fake news: (i) checking one’s emotions, (ii) checking one’s biases, (iii) checking for corroborating sources, (iv) checking the content for red flags, and (v) countering fake news by taking action against it.

THIS PROJECT IS SUPERVISED BY Dr Gabriel Ong, Senior Assistant Director Dr Majeed Khader, Director


PUBLIC

CONTENTS

FOREWORD

1. INTRODUCTION
1.1 Importance of behaviours and skills for individuals to spot fake news

2. TIP 1: CHECK IF YOU ARE EXPERIENCING STRONG EMOTIONS
2.1 Wait for your emotions to subside
2.2 Regulate your emotional response to the information
2.2.1 Cognitive reappraisal
2.2.2 Emotional labelling

3. TIP 2: DE-BIAS YOURSELF
3.1 Read the content first before sharing
3.2 Use a ‘consider the opposite’ strategy
3.3 Seek out information from diverse sources
3.4 Have a sceptical mindset when online — Be S.U.R.E first

4. TIP 3: CORROBORATE INFORMATION FROM OTHER SOURCES
4.1 Check the credentials of the news source through lateral reading
4.2 Check if fact-checkers and reputable outlets have reported on the information

5. TIP 4: CHECK THE CONTENT FOR RED FLAGS
5.1 Examine the quality of evidence for the claims
5.2 Check if the source is motivated to be accurate or is motivated by money
5.3 Look out for recycled/doctored images

6. TIP 5: COUNTER FAKE NEWS BY TAKING ACTION AGAINST IT
6.1 Sending fake news directly to fact-checkers or news agencies
6.2 Flag fake news found on social media
6.3 Affirm their good intentions when countering fake news from friends and family

7. CONCLUSION

REFERENCES

APPENDICES
Appendix A: List of fact-checking and other relevant organisations used in this research
Appendix B: Commonly identified biases that lead individuals to fall for fake news
Appendix C: Additional Resources

ABOUT THE HOME TEAM BEHAVIOURAL SCIENCES CENTRE

HOME TEAM BEHAVIOURAL SCIENCES CENTRE 1


1. INTRODUCTION

The Internet has dramatically eased the public’s access to information. However, it has also exposed the public to many forms of online information disorder, such as fake news. The spread of fake news can have negative consequences for society, such as wasting public resources to address false information, inciting public panic, increasing distrust towards institutions, and worsening social tensions (Chen & Goh, 2019; Chen & Neo, 2019; Chen et al., 2019).

In response to the proliferation of fake news1 online, there has been a boom in the number of fact-checking organisations worldwide – 188 organisations in more than 60 countries (Stencel, 2019). These organisations are staffed by professional fact-checkers, whose job is to spot and debunk fake news. On top of fact-checking stories, these fact-checking organisations have also shared tips with the public on how to spot and fact-check fake news.

Fact-checkers have been found to be more adept at spotting untrustworthy information online. In a study by the Stanford History Education Group, researchers compared historians, professional fact-checkers, and Stanford University undergraduates to understand which group would be better at evaluating the trustworthiness of sources of information (Wineburg & McGrew, 2017). All groups were given two sources of information on the same topic, one trustworthy and one not, and were asked to evaluate them within ten minutes. The authors found drastic differences between the three groups (see Figure 1). Compared with the historians and students, the fact-checkers performed the best and were able to judge the correct website as trustworthy. The ability of fact-checkers to avoid falling victim to untrustworthy online information has been attributed to the fact that they investigate the organisation that produced the information before evaluating its content (Wineburg & McGrew, 2017).

Figure 1. Percentage of participants in each group selecting the correct source of information as more reliable (Wineburg & McGrew, 2017):
• Fact-checkers: 100% correct selection
• Historians: 50% correct selection, 10% wrong selection, 40% judged both sources as accurate
• Students: 20% correct selection, 64% wrong selection, 16% judged both sources as accurate

1 For this report, fake news is defined as “content that contains inaccurate, misleading or fabricated information, and is being distributed through different channels of communication such as print, broadcast, text messaging or social media” (Chen & Goh, 2019, p. 121).


1.1 Importance of behaviours and skills for individuals to spot fake news

The Select Committee’s report on Deliberate Online Falsehoods highlighted the importance of public education “to build up the immunity of our citizenry against deliberate online falsehoods by equipping them with the knowledge and skills to discern truth from falsehood” (Report of the Select Committee on Deliberate Online Falsehoods – Causes, Consequences and Countermeasures, 2018, para. 258). This view is supported by evidence that one of the most effective means for a society to fight fake news is for its citizens to be equipped with good online behaviours and media literacy skills. For example, countries that teach media literacy skills to their students (e.g., Finland) have been found to be well equipped to resist fake news and its negative ramifications (Charlton, 2019; Lessenski, 2018).

Thus, this report identified five tips for individuals to protect themselves from fake news by examining the behaviours of fact-checkers and their recommendations for countering fake news (see Appendix A for the fact-checkers and digital literacy organisations reviewed in this report). The tips are: (i) checking one’s emotions, (ii) checking one’s biases, (iii) checking for corroborating sources, (iv) checking the content for red flags, and (v) countering fake news by taking action against it.

2. TIP 1: CHECK IF YOU ARE EXPERIENCING STRONG EMOTIONS

“The habit is simple. When you feel strong emotion–happiness, anger, pride, vindication– and that emotion pushes you to share a “fact” with others, STOP. Above all, these are the claims that you must fact-check.”

Michael A. Caulfield, head of the Digital Polarization Initiative of the American Democracy Project, in Web Literacy for Student Fact-Checkers

The virality of fake news is fuelled in part by its ability to elicit strong emotions, such as anger and fear, in the target audience (Berger & Milkman, 2012; Kramer et al., 2014; Stieglitz & Dang-Xuan, 2013).

One biological explanation for the virality of highly emotional content (whether fake news or real news) is that such content evokes heightened physiological and emotional arousal (Berger & Milkman, 2012), which in turn incites people to take action. Individuals in a high-arousal emotional state are more likely to act on the news in certain ways (such as sharing or commenting; see Lerner et al., 2015), which can accelerate the spread of information online. High-arousal emotional states such as anger, awe, and anxiety have been linked to an increased likelihood of sharing the information on social media (Fan et al., 2016; Guerini & Staiano, 2015).

Given that such content is likely to go viral, fake news creators are incentivised to create content aimed at eliciting these emotions (Bakir & McStay, 2018). This results in online spaces becoming increasingly emotionalised and fertile grounds for the spread of emotionally driven fake news.

Letting emotions influence one’s behaviour could have adverse consequences in the context of fake news. Hence, it is important to cultivate emotional scepticism amongst the populace as a form of defence against fake news (Chen, 2019; The Rise and Fall of Fake News, 2017).


Emotional scepticism stresses the importance of exercising caution when experiencing strong emotions such as anger, fear, or hope. Hence, individuals should stop and check the information if they are experiencing strong emotions after reading content online.

2.1 Wait for your emotions to subside

One simple way to reduce the effect of emotions on one’s actions is to wait until one feels calm. Research has revealed that the intensity of emotional arousal in humans reverts to baseline over time (Lerner et al., 2015). While there are individual differences in the time required, a ‘down-time’ of between 10 and 60 minutes (Gneezy & Imas, 2014; Verduyn et al., 2009) can help to reduce the effect and intensity of the emotions experienced.

2.2 Regulate your emotional response to the information

Another approach to reducing the effects of emotions on one’s behaviour is to regulate one’s emotional response to the information.

2.2.1 Cognitive reappraisal

Cognitive reappraisal 2 involves re-thinking or re-interpreting an emotional situation in a neutral or more positive manner (Ray et al., 2010; Troy et al., 2018). One way of applying cognitive reappraisal would be to respond to the information like a scientist or a journalist – by approaching it with an objective and analytical frame of mind (Halperin et al., 2013), in order to reduce the effect of emotions on one’s decision-making. Learning to see things from a different point of view can help reduce the intensity of emotions experienced.

2.2.2 Emotional labelling

Emotional contagion refers to how emotions can be passed from person to person, often without them realising it. There is abundant research showing that emotions such as awe, sadness, fear, and anger are easily transmitted this way (Goldenberg et al., 2019; Hatfield et al., 1993; Kramer et al., 2014). One reason emotional contagion occurs may be that it has some survival value for the group (Kelly et al., 2016). For example, being able to catch someone else’s fear or sense of alarm about contaminated food could alert people to the threat and keep them safe.

Emotional contagion can also occur online. For example, Ferrara and Yang (2015) found that Twitter users were more likely to post negative tweets following exposure to negative tweets, and vice versa.

One way to avoid being affected by emotional contagion is by labelling one’s emotions. In psychology research, labelling one’s current feelings has been linked to a reduced effect of emotions in the brain, the body, and behaviour (see Lieberman et al., 2007; Torre & Lieberman, 2018).

2 Applying cognitive reappraisal can be helpful in reducing the effects of emotions on one’s decision-making. Halperin and colleagues (2013) studied the responses of Israelis to a Palestinian bid for United Nations recognition and found that using reappraisal to regulate anger can alter people’s support for policies to escalate a political conflict. In fact, those who were trained in cognitive reappraisal techniques were more likely to support conciliatory policies and show less support for aggressive policies toward Palestinians compared with other respondents.

There are two methods for labelling such emotions. The first method is to trace the emotion to its original source (Colino, 2016). It involves individuals asking themselves whether the emotions they experience come from others or from themselves. Recognising whom the emotion comes from is a way to short-circuit the transmission of the emotional contagion (Colino, 2016).

The second method is to verbally label one’s current emotional state (e.g., sad, irritable, anxious). One important caveat is to avoid overusing emotional labelling for negative emotions. A study showed a curvilinear relationship between negative emotions felt and emotional labelling (Niles et al., 2016), suggesting that beyond a certain point, emotional labelling ceases to regulate emotions and instead makes them more intense.

3. TIP 2: DE-BIAS YOURSELF

“You just have to stop and think ... All of the data we have collected suggests that’s the real problem. It’s not that people are being super-biased and using their reasoning ability to trick themselves into believing crazy stuff. It’s just that people aren’t stopping. They’re rolling on.”

David Rand, MIT cognitive scientist, in an interview with TIME on How Your Brain Tricks You Into Believing Fake News

Biases are a form of mental shortcut that people develop and adapt over time to interpret the environment they live in so that they can make quicker decisions (Tversky & Kahneman, 1985). There is a wealth of research examining why people tend to believe in certain things and process information in a biased fashion (Fazio et al., 2015; Fessler et al., 2014; Lord et al., 1979; Pennycook et al., 2017).

Research indicates that anyone can fall prey to the effect of biases; they can affect people regardless of political leanings (Swire et al., 2017) or education levels (Wineburg & McGrew, 2017). Even the most critical thinkers are susceptible to their own cognitive biases if they do not check for them when evaluating information (Wineburg & McGrew, 2017).

Some commonly identified biases are relevant for understanding how individuals fall prey to fake news when they rely on these mental shortcuts to make a decision (see Appendix B for an in-depth explanation of these biases). They are as follows:
• Favouring what is recent (availability heuristic)
• Conforming with what the group thinks (bandwagon effect, echo-chamber effect)
• Accepting information that aligns with what you believe in (confirmation bias)
• Displaying overconfidence in one’s knowledge (Dunning-Kruger effect)

Hence, it is important to identify ways to de-bias oneself – i.e., to eliminate a bias or diminish its intensity or frequency (Colombo, 2018).

3.1 Read the content first before sharing


There is ample evidence showing that people are “cognitive misers” (Böckenholt, 2012; Stanovich, 2009), relying more on information that is immediate and easily accessible when making judgements (see Appendix B, availability heuristic, for detailed elaboration). This kind of bias can lead people to fall prey to false information: when people rely on their intuition to form quick and plausible judgements, they are likely to make mistakes, and these judgements are unlikely to be as accurate as those made after taking the time to understand the information (Tversky & Kahneman, 1985).

This cognitive miserliness can be seen at work in the reading habits of the general public. Although people are spending more time reading online content, the depth and concentration associated with reading have declined due to habits such as quick browsing or skimming (Liu, 2012).

In some cases, people may opt to read only the headlines instead of the whole article. The spread of fake news can be worsened when people share articles after reading only the headlines (Ecker et al., 2014), as headlines may be misleading or may not match the content of the story (Lee, 2017; Lockwood, 2016; Wardle, 2017). As such, it is important to read the content and comprehend it before sharing.

3.2 Use a ‘consider the opposite’ strategy

People are likely to fall prey to fake news due to confirmation bias (see Appendix B.2 for further elaboration). Confirmation bias refers to a type of cognitive bias that leads people to favour information that confirms their beliefs over information that does not (Confirmation Bias; Casad, 2007; Kiely & Robertson, 2016).

Confirmation bias can make it difficult for people to reject fake news, especially if the fake news validates what they believe in. For example, in the context of the 2016 U.S. elections, researchers found that people were more likely to believe in false news that supported their candidate (Allcott & Gentzkow, 2017) and people who supported a particular political candidate were pre-disposed to assume that information attributed to their candidate was accurate regardless of its accuracy (Swire et al., 2017).

One way to de-bias oneself is the “consider the opposite” strategy. It involves individuals asking themselves, “what are some reasons that my judgment might be wrong?”. There is evidence for the effectiveness of such a strategy. For example, Lord and colleagues (1984) asked participants with strong pro- or anti-death penalty views to read evidence that supported or opposed the death penalty. Those who employed the ‘consider the opposite’ strategy were found to be less biased against studies that disagreed with their views (Lord et al., 1984). This is also supported by a meta-analysis by Hart and colleagues (2009), which found that people are able to overcome their bias when they are motivated to form accurate judgments (Colombo, 2018; Hart et al., 2009; Passe et al., 2018).

Thus, one way for an individual to motivate themselves to form accurate judgements would be to ask themselves to “imagine if you would come to the same conclusion if the presented evidence was different”. This de-biasing process enables the individual to check and be aware if they are experiencing confirmation bias or they are being fair (i.e. willing to change their mind in the face of new evidence or information).


3.3 Seek out information from diverse sources

One reason people fall for online fake news could be their tendency to conform to group consensus. People can fall for false information when they rely on the group consensus as a substitute for making their own judgment. The bandwagon effect and the echo-chamber effect are two common effects seen when people exhibit this bias of conforming to the group’s consensus.

The bandwagon effect describes people’s bias towards news that is popular or shared by many people. It explains why fake news content can quickly rise to prominence like a fad. In the context of judging the credibility of information, the bandwagon effect can be seen when individuals use popularity cues to decide whether a piece of information is credible (see Appendix B.3 for further elaboration). Using popularity cues such as the number of likes and retweets is an unreliable way to determine credibility, because these cues can be manipulated by malicious actors who deploy bots to create the illusion that a piece of information is popular (Ferrara et al., 2016).

The echo chamber effect (see Appendix B.4 for further elaboration) refers to a phenomenon where increasingly polarised attitudes in society result from individuals seeking out like-minded communities that validate their beliefs (Barberá, 2014; Del Vicario et al., 2016; Schmidt et al., 2018). In the context of politics, belief in fake news that circulates within a group can be particularly resistant to corrections due to echo-chamber effects (Carden, 2017; Swire et al., 2017). Individuals in echo chambers are likely to see other viewpoints as less legitimate, have increased political intolerance, and are often more insulated from information that challenges the faulty information they hold (Boutyline & Willer, 2017).

As such, one method to de-bias individuals from the tendency to conform to the group consensus would be through exposure to information from diverse sources (Barberá, 2014; Mutz, 2002). In the context of highly debated issues, exposure to arguments from opponents can enable individuals to understand their opponents better (i.e., understand the legitimacy of their concerns, existing common ground). They can also be more aware of relevant information that could be discovered from thinking through multiple perspectives (Benhabib, 1996; Munson et al., 2013).

Thus, seeking information from diverse sources can help individuals avoid being biased by group consensus. Understanding diverse points of view enables them to see how certain criticisms might be valid, while also making them aware of new information that can arise from exploring different viewpoints.

3.4 Have a sceptical mindset when online — Be S.U.R.E first

Individuals may fall prey to fake news when they display excessive confidence in their abilities to distinguish accurate information from inaccurate information. This effect, known as the Dunning-Kruger effect, can be frequently observed in cases where people report that they are confident in their ability to spot fake news – however, when tested on their ability to spot fake news, they often do not perform as well (See Appendix B.5 for further elaboration).

In the Stanford study about fact-checkers, the researchers observed that the fact-checkers had a healthy dose of scepticism when navigating information online (Wineburg & McGrew, 2017). According to the authors, “[These fact-checkers] understood the web as a maze filled with trap doors and blind alleys, where things are not always what they seem. Their stance toward the unfamiliar was cautious: while things may be as they seem, in the words of Checker D, ‘I always want to make sure.’” (Wineburg & McGrew, 2017, p. 15)

Being in a state of healthy scepticism can help to counter the effects of misinformation in individuals (Chan et al., 2017; Lewandowsky et al., 2012), as they are then primed to expect false information and hence will be on the lookout for it. Accordingly, one way to maintain healthy scepticism online is to cultivate a habit of first checking information for its accuracy. A simple way to do this is to use the National Library Board’s S.U.R.E. framework (Source, Understand, Research, Evaluate; see Figure 2) as a mnemonic device for evaluating information online (Tan, Wan, & Teo, 2014).

Figure 2. S.U.R.E ways to do research by the National Library Board

4. TIP 3: CORROBORATE INFORMATION FROM OTHER SOURCES

“Be skeptical—verify before you share. Journalists assume they are wrong, and seek corroborating evidence”

Barbara Gray, Associate Professor, Chief Librarian, CUNY Graduate School of Journalism, in 10 Tips for Fighting Fake News: How to Fact Check Like a Pro

Fake news creators have been known to disguise their content as authentic information through numerous techniques. Some of the techniques that have been employed include:
• Using authentic-sounding organisation names (e.g., Associated Media Coverage).
• Faking statements made by the authorities (e.g., claiming that the information was from an inside source within the FBI; see Spencer, 2017).
• Using URLs that look similar to those of actual organisations (e.g., NewYorkTimesPolitics.com). There is also the need to be mindful of news websites that add a domain suffix after “.com” (e.g., “.com.to”, “.com.co”, “.com.la”), as they are often fake versions of real news sources.
• Having a polished web layout that resembles real news sites (Chen, 2019; Kiely & Robertson, 2016; Subramanian, 2017b; Wineburg & McGrew, 2017).


Even though fake news creators can employ these tactics to create an impression of trustworthiness, checking the claims made by suspicious news sources against other sources is a reliable approach to identifying fake news (Pennycook & Rand, 2017; Wineburg & McGrew, 2017). Hence, an individual’s immediate reaction when encountering an unfamiliar piece of information should be to check it against other sources.

4.1 Check the credentials of the news source through lateral reading

According to research by Stanford, fact-checkers judge the credibility of unfamiliar sites effectively by performing lateral reading (Wineburg & McGrew, 2017). Lateral reading describes a process of reading other sources of information to learn more about the credibility of the information source that an individual is investigating (See Figure 3).

For the fact-checkers, after a cursory scan of an unknown website, they open new browser tabs to search for information about it from other sources. For example, Wineburg and McGrew (2017) described how one fact-checker did this – by checking the ‘About Us’ section and opening tabs in Google to read what other sources had to say about the website.

Process of lateral reading:

Step 1: Look up the “About Us” section to understand the agenda the site may have (e.g., leadership team, funding, political affiliation of donors). Example: the “About Us” page of RealEarthResearchInstitute.com states that it is “an institute that aims to spread pathfinding ideas about the Earth”.

Step 2: Open another browser tab and look at what other reliable sources of information say about the website. Example: reliable sources report that RealEarthResearchInstitute.com is a front organisation for, and is funded by, flat earth societies.

Step 3: Evaluate whether the site is reliable or a low-quality source. Example: RealEarthResearchInstitute.com is likely a low-quality source.

Figure 3. The process of lateral reading


This process of lateral reading allows the fact-checkers to check the websites for potential biases (e.g., ideological stances, the funding received by the organisation). This, in turn, enables them to arrive at an accurate conclusion about whether the site is a reliable source or a low-quality source. When encountering an unfamiliar source of information, readers can engage in lateral reading to assess the reliability of the new source of information.

4.2 Check if fact-checkers and reputable outlets have reported on the information

Alternatively, readers can check if fact-checking sites have debunked the claim, or if the story is being reported by multiple reputable outlets (Caulfield, 2017; Kiely & Robertson, 2016). This can help them determine whether the information from a site has already been debunked.

4.2.1 Fact-checkers in Singapore

There are fact-checkers in Singapore who can provide credible fact-checking of fake news circulating locally (see Figure 4, Figure 5). For example, Facebook, in partnership with AFP, has been fact-checking fake news found in Singapore across several languages such as English, Mandarin, and Malay (Kwang, 2019). For the fact-checking of various government policies, there is Factually, a website set up by the Ministry of Communications and Information.

Figure 4. AFP Fact Check for fake news in Singapore

Figure 5. Factually, a website tackling fake news/misconceptions around government policies or statistics

4.2.2 International fact-checkers

For fact-checking information in specific countries, readers can consult the list of reliable fact-checkers that are part of the International Fact-Checking Network (Verified Signatories of the IFCN Code of Principles, n.d.).

Efficient fact-checking using Google’s search operators

This can be done by searching on Google for terms related to the fake news, restricted to a fact-checking site like PolitiFact, in the format {search terms} site:{fact-checker site} (see Figure 6).
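As a rough illustration, the same site-restricted query can be built programmatically. The sketch below is a minimal example, not part of the original report: it assumes Google's standard `site:` search operator and `q` query parameter, and the claim text and fact-checker domain are placeholders.

```python
from urllib.parse import urlencode

def fact_check_query(claim: str, site: str = "politifact.com") -> str:
    """Build a Google search URL restricted to one fact-checking site.

    Combines the claim text with Google's `site:` operator so that
    only pages from the given fact-checker are returned.
    """
    query = f"{claim} site:{site}"
    return "https://www.google.com/search?" + urlencode({"q": query})

# Example: look for PolitiFact coverage of a claim
print(fact_check_query("minimum wage increase"))
```

The same function works for any fact-checker in the IFCN list by passing a different `site` argument (e.g., `site="snopes.com"`).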


Figure 6. How to quickly search a claim on Google

5. TIP 4: CHECK THE CONTENT FOR RED FLAGS

“Lies tend to stick out as being weird or odd and do not fit in nicely with a truthful ecosystem”

Angie Holan, editor of PolitiFact (a fact-checking organisation), in an interview with Poynter on Why people lie and what you can do about it

Belief in fake news has been attributed to a whole host of factors: over-acceptance of weak claims, failing to examine whether the authors are incentivised to be accurate, and missing signs that the information has been manipulated (Caulfield, 2017; Fact-Checking Tips and Advice, n.d.; Gray, 2017; Pennycook & Rand, 2017).

As a result, fact-checkers have offered a wealth of advice on red flags to look out for so that the public can identify these low-quality sources easily (e.g., Holan, 2014; Simon, 2018). Additional readings on the different varieties of red flags to look out for can also be found in Appendix C.

5.1 Examine the quality of evidence for the claims

One method to overcome the problem of being over-accepting of weak claims is to identify if the evidence given is weak or strong. The table below details some ways to identify if a piece of news has strong evidence (see Caulfield, 2017; “Fact-checking tips and advice”, n.d.; Gray, 2017):

Strong evidence:
• Clear information about real authors, editors, publishers, and owners.
• All facts are verified by at least two reliable sources.
• Relevant expert sources used.
• Seeks to inform in an objective manner (e.g., reporting two sides of the argument, absence of personal opinion).
• Shares data, findings, or primary sources of information.
• Able to provide a citation, link, or reference to support claims.

Weak evidence:
• Unknown or anonymous authorship.
• Anonymous sources used.
• Cites irrelevant evidence.
• Written in a subjective manner (e.g., using emotionally driven language, exaggerations, and personal attacks).
• Shares unsubstantiated opinions.

5.2 Check if the source is motivated to be accurate or is motivated by money

One way to identify red flags in the content is to think about ‘what the authors would lose’ if they publish inaccurate information (Caulfield, 2017; Krueger, 2016). For example, media outlets such as the New York Times and Reuters rely on their reputation as a credible source of information as a part of their business model (Caulfield, 2017; Stovall, 2012b). This means that publishing inaccurate information is not in their interest as they are likely to be held accountable for doing so (Stovall, 2012a), and it affects their reputation in the eyes of their readers (Stovall, 2012a).

In contrast, fake news sites have been known to be motivated primarily by advertisement revenue, which creates an incentive for them to publish any information (true or false), so long as it grabs the attention of readers (Gillin, 2017; Holiday, 2013). Furthermore, these sites do not face the same pressure to be held accountable for publishing false information. This can be attributed to two reasons: (i) they can easily start a new site under a new name after their website has been banned or blocked (Subramanian, 2017a); and (ii) if the editors are anonymous or based overseas, it is difficult for local authorities to take legal action against them.

People can also fall for exaggerated claims when they fail to realise that the content is sponsored by a business or organisation. Such sponsored content is often biased in favour of the organisation that paid for it (Caulfield, 2017; Gillin, 2017). Moreover, this type of content is often labelled with euphemistic names, which can mislead people into believing that it is objective news when it is merely advertising or public relations. As a result, people can be fooled into thinking that the information is accurate and unbiased (Amazeen & Wojdynski, 2019).


5.3 Look out for recycled/doctored images

Fake news content frequently features ‘recycled’ images used in highly misleading ways. One example occurred during the Rohingya crisis in 2017, when there was a proliferation of recycled images used by anti-Rohingya elements for propaganda purposes. In one case, a photograph of Bangladeshi volunteers fighting in the 1971 Bangladesh Liberation War (see Figure 7) was passed off as an image of the Rohingya training as a dangerous militia. The word ‘Bengali’ was used as a slur against the Rohingya people, labelling them as foreigners.

Figure 7. Fake news during the 2017 Rohingya crisis which claimed that the Rohingya were training terrorists to fight Myanmar citizens (Ratcliffe, 2017; Soldiers Aiming Weapons on the Ground, 1971).

One way to check if an image is recycled from other articles is to use Google’s reverse image search (see Figure 8) or TinEye reverse image search. The steps are described below:

1. On a desktop, a Google reverse image search can be done by dragging and dropping an image onto the Google search bar, or by right-clicking an image and selecting the option to search for it on Google (currently available on the Chrome browser).
2. On mobile phones, the Chrome browser app for iOS and Android supports reverse image search: press and hold a finger on the image until the option pops up.
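For readers comfortable with scripting, a reverse image search can also be approximated by constructing the search URL directly for a publicly hosted image. The endpoints below are assumptions based on how these services are commonly accessed and may change over time; this is a sketch, not official API usage.

```python
import urllib.parse

def google_reverse_image_url(image_url: str) -> str:
    # Assumed Google endpoint for searching by a publicly hosted image URL.
    return ("https://www.google.com/searchbyimage?image_url="
            + urllib.parse.quote(image_url, safe=""))

def tineye_search_url(image_url: str) -> str:
    # Assumed TinEye search page with a 'url' query parameter.
    return ("https://tineye.com/search?url="
            + urllib.parse.quote(image_url, safe=""))

if __name__ == "__main__":
    url = google_reverse_image_url("https://example.com/photo.jpg")
    print(url)
    # To open the results: import webbrowser; webbrowser.open(url)
```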

This allows the user to see whether the image has appeared elsewhere and to assess whether it has been used in a misleading manner. This method can also work when images have been cropped or photoshopped, as similar-looking images are likely to be available online to check against.

Figure 8. Doing a Google reverse image search by dragging and dropping on the search bar (left) or simply searching an image on one’s phone (right)


6. TIP 5: COUNTER FAKE NEWS BY TAKING ACTION AGAINST IT

“Send suspect information to local journalists or fact-checkers you trust, and ask that they send you their conclusion. Tell them exactly where you received the information, from whom and if you believe it to be real or fake. Later, reshare their article in those same social media groups, or directly with friends and family.”

Daniel Funke, author and reporter at the International Fact-Checking Network, in “9 ways you can help fact-checkers during a crisis”

Most people would likely ignore fake news when they encounter it (Tandoc, 2017), or not share it with others (Tan, Neo, & Chen, 2019). However, there is a need for more people to counter fake news proactively. Actions, such as reporting fake news or publicly sharing debunked information, can be helpful in mitigating the risk of others falling for the fake news.

6.1 Sending fake news directly to fact-checkers or news agencies

If dubious content is spotted in private Facebook groups, on online sites, or in WhatsApp groups, one avenue is to take a screenshot of the content and forward it to news agencies or fact-checkers (Funke, 2018).

This can be helpful in preventing more people from falling for fake news, as fact-checkers and news agencies are able to follow up on the fake news and, if necessary, disseminate debunked information to a wider audience.

Fact-checkers and media outlets based in Singapore to which tip-offs about fake news can be sent:

Fact-checkers
• AFP Singapore Fact Check
• Factually

Media outlets
• Straits Times
• Mothership

6.2 Flag fake news found on social media

Currently, various social media platforms (e.g., Facebook, Twitter, and WhatsApp) offer tools to flag fake news (Crawford & Gillespie, 2016). On Facebook, users are able to flag posts which they evaluate to be false news stories (see Figure 9). This can help reduce the reach of fake posts online, as Facebook works with third-party fact-checkers to identify such content and then reduce its distribution on the platform (Lyons, 2018; Schaedel, 2017).


Figure 9. Flagging fake news on Facebook by clicking on the '...' sign next to the post (left, circled in red) and marking the post as a false news story (right)

6.3 Affirm their good intentions when countering fake news from friends and family

People are worried about harming relationships with friends and family members when they correct them for sharing fake news (Dixit, 2017; Silverman, 2019). To overcome this, a two-step approach can be adopted.

First, there is a need to understand where the person is coming from, in terms of their motivations for sharing the fake news, and their attitudes on the content of the fake news (Lewandowsky et al., 2012; Silverman, 2019).

Second, there is a need to affirm the person’s good intentions (Lewandowsky et al., 2012; Silverman, 2019) before debunking the fake news and directing them to relevant, trusted sources for the correct answers.

7. CONCLUSION

This report identified five tips by examining the behaviours of fact-checkers and their recommendations for countering fake news. The value of this research lies in providing these tips for individuals to resist fake news based on insights from a behavioural sciences perspective, as well as drawing on the expertise of professional fact-checkers.

Nevertheless, this should not lead the reader to conclude that these tips are an exhaustive list to avoid falling for fake news. Recent trends indicate that fake news creators are always coming up with new creative ways of manipulating content to achieve their malicious agenda.


REFERENCES

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211–236.

Amazeen, M. A., & Wojdynski, B. W. (2019). Reducing native advertising deception: Revisiting the antecedents and consequences of persuasion knowledge in digital news contexts. Mass Communication and Society, 22(2), 222–247.

Anagnostopoulos, A., Bessi, A., Caldarelli, G., Del Vicario, M., Petroni, F., Scala, A., Zollo, F., & Quattrociocchi, W. (2014). Viral Misinformation: The Role of Homophily and Polarization. ArXiv:1411.2893 [Physics]. http://arxiv.org/abs/1411.2893

Asch, S. E. (1955). Opinions and social pressure. Scientific American, 193(5), 31–35.

Bakir, V., & McStay, A. (2018). Fake news and the economy of emotions: Problems, causes, solutions. Digital Journalism, 6(2), 154–175.

Barberá, P. (2014). How social media reduces mass political polarization. Evidence from Germany, Spain, and the US. Job Market Paper, New York University, 46.

Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531–1542.

Benhabib, S. (1996). Toward a deliberative model of democratic legitimacy. In Democracy and difference: Contesting the boundaries of the political (pp. 67–94).

Berger, J., & Milkman, K. L. (2012). What makes online content viral? Journal of Marketing Research, 49(2), 192–205.

Berinsky, A. J. (2015). Rumors and health care reform: Experiments in political misinformation. British Journal of Political Science, 1–22.

Böckenholt, U. (2012). The Cognitive-Miser Response Model: Testing for Intuitive and Deliberate Reasoning. Psychometrika, 77(2), 388–399. https://doi.org/10.1007/s11336-012-9251-y

Borah, P., & Xiao, X. (2018). The Importance of ‘Likes’: The Interplay of Message Framing, Source, and Social Endorsement on Credibility Perceptions of Health Information on Facebook. Journal of Health Communication, 23(4), 399–411.

Boutyline, A., & Willer, R. (2017). The social structure of political echo chambers: Variation in ideological homophily in online networks. Political Psychology, 38(3), 551–569.

Carden, J. (2017, March 17). Trump-Putin connection: Liberals’ paranoid attempts to tie Trump to Russia are distracting them from their real problems. Quartz. https://qz.com/933735/trump-putin-connection-liberals-paranoid-attempts-to-tie-trump-to-russia-are-distracting-them-from-their-real-problems/

Casad, B. J. (2007). Confirmation bias. In R. F. Baumeister & K. D. Vohs (Eds.), Encyclopedia of Social Psychology. Thousand Oaks, CA: Sage.

Caulfield, M. A. (2017). Web literacy for student fact-checkers.

Chan, M. S., Jones, C. R., Hall Jamieson, K., & Albarracín, D. (2017). Debunking: A meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, 28(11), 1531–1546.

Charlton, E. (2019). How Finland is fighting fake news in the classroom. World Economic Forum. https://www.weforum.org/agenda/2019/05/how-finland-is-fighting-fake-news-in-the-classroom/

Chen, X., & Goh, P. (2019). Fake News: Five Key Things Home Team Officers Should Know. Home Team Journal, 8, 121–129.


Chen, X. K. (2019). Fake News After a Terror Attack: Psychological Vulnerabilities Exploited by Fake News Creators. In M. Khader, L. S. Neo, D. D. Cheong, & J. Chin (Eds.), Learning from Violent Extremist Attacks: Behavioural Sciences Insights for practitioners and policymakers (pp. 435–451). World Scientific Press.

Chen, X. K., & Neo, L. S. (2019). The Threat of Fake News in Singapore: Prevalence, Impact, and Methods of Transmission [HTBSC Research Report 14/2019]. Home Team Behavioural Sciences Centre.

Chen, X. K., Wong, Y., Honnavalli, V., Tan, A. Q., Justina, & Neo, L. S. (2019, July). A preliminary analysis on the social impact of fake news after terror attacks in SEA [Poster]. 4th Asian Conference of Criminal & Operations Psychology, Singapore.

Colino, S. (2016, February 29). Are You Catching Other People’s Emotions? HuffPost. https://www.huffpost.com/entry/are-you-catching-other-peoples-emotions_n_56b8feb6e4b08069c7a85b26

Colombo, C. (2018). Hearing the Other Side?– Political Opinions in the Case of the Scottish Independence Referendum. Political Studies, 66(1), 23–42.

Crawford, K., & Gillespie, T. (2016). What is a flag for? Social media reporting tools and the vocabulary of complaint. New Media & Society, 18(3), 410–428.

Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554– 559.

Dixit, P. (2017, December 20). Older Indians Drive Millennials Crazy On WhatsApp. This Is Why They’re Obsessed. BuzzFeed News. https://www.buzzfeednews.com/article/pranavdixit/older-indians-drive-millennials-crazy-on-whatsapp-this-is

Don’t spread “unsubstantiated” messages on terror threats at shopping areas: SPF. (2016, December 6). Channel NewsAsia. http://www.channelnewsasia.com/news/singapore/don-t-spread-unsubstantiated-messages-on-terror-threats-at-shopp-7647432

Dunning, D. (2011). The Dunning–Kruger effect: On being ignorant of one’s own ignorance. In Advances in experimental social psychology (Vol. 44, pp. 247–296). Elsevier.

Ecker, U. K., Lewandowsky, S., Chang, E. P., & Pillai, R. (2014). The effects of subtle misinformation in news headlines. Journal of Experimental Psychology: Applied, 20(4), 323.

Fact-checking tips and advice. (n.d.). Africa Check. https://africacheck.org/how-to-fact-check/tips-and-advice/

Fan, R., Xu, K., & Zhao, J. (2016). Higher contagion and weaker ties mean anger spreads faster than joy in social media. ArXiv Preprint ArXiv:1608.03656.

Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993–1002.

Ferrara, E., Varol, O., Davis, C., Menczer, F., & Flammini, A. (2016). The rise of social bots. Communications of the ACM, 59(7), 96–104.

Ferrara, E., & Yang, Z. (2015). Measuring emotional contagion in social media. PloS One, 10(11), e0142390.

Fessler, D. M., Pisor, A. C., & Navarrete, C. D. (2014). Negatively-biased credulity and the cultural evolution of beliefs. PloS One, 9(4), 1–8.

Festinger, L. (1962). A theory of cognitive dissonance (Vol. 2). Stanford University Press.

Funke, D. (2018, April 27). 9 ways you can help fact-checkers during a crisis. Poynter. https://www.poynter.org/fact-checking/2018/9-ways-you-can-help-fact-checkers-during-a-crisis/

Gillin, J. (2017, October 4). How clickbait ads make money for fake news sites. PunditFact. https://www.politifact.com/punditfact/article/2017/oct/04/more-outrageous-better-how-clickbait-ads-make-mone/


Gneezy, U., & Imas, A. (2014). Materazzi effect and the strategic use of anger in competitive interactions. Proceedings of the National Academy of Sciences, 111(4), 1334–1337.

Goldenberg, A., Garcia, D., Halperin, E., Zaki, J., Kong, D., Golarai, G., & Gross, J. J. (2019). Beyond emotional similarity: The role of situation-specific motives. Journal of Experimental Psychology: General.

Gray, B. (2017, April 11). 10 Tips for Fighting Fake News: How to Fact Check Like a Pro. LexisNexis. http://www.lexisnexis.com/pdf/nexis/Nexis-webinar-how-to-fact-check-like-a-pro.pdf

Guerini, M., & Staiano, J. (2015). Deep feelings: A massive cross-lingual study on the relation between emotions and virality. Proceedings of the 24th International Conference on World Wide Web, 299–305.

Halperin, E., Porat, R., Tamir, M., & Gross, J. J. (2013). Can emotion regulation change political attitudes in intractable conflicts? From the laboratory to the field. Psychological Science, 24(1), 106–111.

Halse, S. E., Binda, J., & Weirman, S. (2018). It’s what’s outside that counts: Finding credibility metrics through non-message related Twitter features. ISCRAM.

Hart, W., Albarracín, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: A meta-analysis of selective exposure to information. Psychological Bulletin, 135(4), 555.

Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1993). Emotional contagion. Current Directions in Psychological Science, 2(3), 96–100.

Holan, A. D. (2014, August 20). 7 steps to better fact-checking. PolitiFact. https://www.politifact.com/truth-o-meter/article/2014/aug/20/7-steps-better-fact-checking/

Holiday, R. (2013). Trust me, I’m lying: Confessions of a media manipulator. Penguin.

Kelly, J. R., Iannone, N. E., & McCarty, M. K. (2016). Emotional contagion of anger is automatic: An evolutionary explanation. British Journal of Social Psychology, 55(1), 182–191. https://doi.org/10.1111/bjso.12134

Kiely, E., & Robertson, L. (2016, November 18). How to Spot Fake News. FactCheck.Org. https://www.factcheck.org/2016/11/how-to-spot-fake-news/

Kramer, A. D., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 201320040.

Krueger, V. (2016, November 2). 9 questions to help you evaluate the credibility of news sources. Poynter. https://www.poynter.org/educators-students/2016/9-questions-to-help-you-evaluate-the-credibility-of-news-sources/

Kwang, K. (2019, May 2). Facebook expands fact-checking initiative to Singapore amid challenges in other markets. CNA. https://www.channelnewsasia.com/news/singapore/facebook-fact-checking-singapore-amid-challenges-other-markets-11496900

Lee, B. (2017, June 16). When Influencers Peddle Click-Bait And Fake News, Everyone Loses. Forbes. http://www.forbes.com/sites/forbestechcouncil/2017/06/16/when-influencers-peddle-click-bait-and-fake-news-everyone-loses/

Lerner, J. S., Li, Y., Valdesolo, P., & Kassam, K. S. (2015). Emotion and decision making. Annual Review of Psychology, 66.

Lessenski, M. (2018). Common sense wanted: Resilience to ‘post-truth’ and its predictors in the new media literacy index 2018. Open Society Institute – Sofia.

Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.

Lieberman, M. D., Eisenberger, N. I., Crockett, M. J., Tom, S. M., Pfeifer, J. H., & Way, B. M. (2007). Putting feelings into words: Affect labeling disrupts amygdala activity in response to affective stimuli. Psychological Science, 18(5), 421–428.


Lin, X., Spence, P. R., & Lachlan, K. A. (2016). Social media and credibility indicators: The effect of influence cues. Computers in Human Behavior, 63, 264–271.

Liu, Z. (2012). Digital reading. Chinese Journal of Library and Information Science (English Edition), 85.

Lockwood, G. (2016). Academic clickbait: Articles with positively-framed titles, interesting phrasing, and no wordplay get more attention online. The Winnower, 3.

Lord, C. G., Lepper, M. R., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality and Social Psychology, 47(6), 1231.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098.

Lyons, T. (2018, June 14). Hard Questions: How Is Facebook’s Fact-Checking Program Working? Facebook Newsroom. https://newsroom.fb.com/news/2018/06/hard-questions-fact-checking/

Munson, S. A., Lee, S. Y., & Resnick, P. (2013). Encouraging reading of diverse political viewpoints with a browser widget. Seventh International AAAI Conference on Weblogs and Social Media.

Mutz, D. C. (2002). Cross-cutting social networks: Testing democratic theory in practice. American Political Science Review, 96(1), 111–126.

Niles, A. N., Haltom, K. E. B., Lieberman, M. D., Hur, C., & Stanton, A. L. (2016). Writing content predicts benefit from written expressive disclosure: Evidence for repeated exposure and self-affirmation. Cognition and Emotion, 30(2), 258–274. https://doi.org/10.1080/02699931.2014.995598

Nygren, T., & Guath, M. (2019). Swedish teenagers’ difficulties and abilities to determine digital news credibility. Nordicom Review, 40(1), 23–42.

Passe, J., Drake, C., & Mayger, L. (2018). Homophily, echo chambers, & selective exposure in social networks: What should civic educators do? The Journal of Social Studies Research, 42(3), 261–271.

Pennycook, G., Cannon, T. D., & Rand, D. G. (2017). Prior exposure increases perceived accuracy of fake news. SSRN. https://ssrn.com/abstract=2958246

Pennycook, G., & Rand, D. G. (2017). Who falls for fake news? The roles of analytic thinking, motivated reasoning, political ideology, and bullshit receptivity. SSRN Electronic Journal.

Quattrociocchi, W., Scala, A., & Sunstein, C. R. (2016). Echo Chambers on Facebook (SSRN Scholarly Paper ID 2795110). Social Science Research Network. https://papers.ssrn.com/abstract=2795110

Ratcliffe, R. (2017, September 5). Fake news images add fuel to fire in Myanmar, after more than 400 deaths. http://www.theguardian.com/global-development/2017/sep/05/fake-news-images-add-fuel-to-fire-in-myanmar- after-more-than-400-deaths

Ray, R. D., McRae, K., Ochsner, K. N., & Gross, J. J. (2010). Cognitive reappraisal of negative affect: Converging evidence from EMG and self-report. Emotion, 10(4), 587.

Report Of The Select Committee On Deliberate Online Falsehoods – Causes, Consequences And Countermeasures. (2018). Select Committee on Deliberate Online Falsehoods.

Schaedel, S. (2017, July 6). How to Flag Fake News on Facebook. FactCheck.Org. https://www.factcheck.org/2017/07/flag-fake-news-facebook/

Schmidt, A. L., Zollo, F., Scala, A., Betsch, C., & Quattrociocchi, W. (2018). Polarization of the vaccination debate on Facebook. Vaccine, 36(25), 3606–3612.

Schmitt-Beck, R. (2015). Bandwagon effect. The International Encyclopedia of Political Communication, 1–5.

Silverman, C. (2019, July 23). What To Do If The Older People In Your Life Are Sharing False Or Extreme Content. BuzzFeed News. https://www.buzzfeednews.com/article/craigsilverman/young-people-worry-about-older-people-sharing-fake-news


Simon, M. (2018, April 3). Why people lie and what you can do about it. Poynter. https://www.poynter.org/fact-checking/2018/why-people-lie-and-what-you-can-do-about-it/

Soldiers Aiming Weapons on the Ground. (1971, November 22). Getty Images. https://www.gettyimages.com/detail/news-photo/syaldaa-nadi-east-pakistan-using-an-assortment-of-captured-news-photo/515398852

Spencer, S. H. (2017, October 5). No Evidence Linking Vegas Shooter to Antifa. FactCheck.Org. https://www.factcheck.org/2017/10/no-evidence-linking-vegas-shooter-antifa/

Stanovich, K. (2009). The cognitive miser: Ways to avoid thinking. In What Intelligence Tests Miss: The Psychology of Rational Thought (pp. 70–85).

Stencel, M. (2019, June 11). Number of fact-checking outlets surges to 188 in more than 60 countries. Poynter. https://www.poynter.org/fact-checking/2019/number-of-fact-checking-outlets-surges-to-188-in-more-than-60-countries/

Stieglitz, S., & Dang-Xuan, L. (2013). Emotions and information diffusion in social media—Sentiment of microblogs and sharing behavior. Journal of Management Information Systems, 29(4), 217–248.

Stovall, J. G. (2012a). The Writer and the Law. In Writing for the Mass Media (Eighth).

Stovall, J. G. (2012b). Writing in the Media Environment. In Writing for the Mass Media (Eighth).

Subramanian, S. (2017a, February 15). Inside the Macedonian Fake-News Complex. WIRED. https://www.wired.com/2017/02/veles-macedonia-fake-news/

Subramanian, S. (2017b, February 15). Meet the Macedonian Teens Who Mastered Fake News and Corrupted the US Election. WIRED. https://www.wired.com/2017/02/veles-macedonia-fake-news/

Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. (2017). Processing political misinformation: Comprehending the Trump phenomenon. Royal Society Open Science, 4(3), 160802.

Talwar, S., Dhir, A., Kaur, P., Zafar, N., & Alrasheedy, M. (2019). Why do people share fake news? Associations between the dark side of social media use and fake news sharing behavior. Journal of Retailing and Consumer Services, 51, 72–82.

Tan, G., Wan, W. P., & Teo, J. (2014). SURE campaign: Promoting information literacy awareness to Singaporeans. IFLA WLIC 2014 - Lyon - Libraries, Citizens, Societies: Confluence for Knowledge.

Tan, H. H., Neo, L. S., & Chen, X. K. (2019, July). Understanding why people do not intervene in the spread of fake news [Poster]. 4th Asian Conference of Criminal & Operations Psychology, Singapore.

Tandoc, E. C., Jr. (2017, May 27). It’s up to you, yes you, to stop fake news [Text]. The Straits Times. http://www.straitstimes.com/opinion/its-up-to-you-yes-you-to-stop-fake-news

The Rise and Fall of Fake News. (2017, January 27). WNYC. https://www.wnyc.org/story/rise-and-fall-fake-news/

The Susceptibility of Singaporeans Towards Fake News. (2018, September 28). Ipsos. https://www.ipsos.com/en-sg/susceptibility-singaporeans-towards-fake-news

Törnberg, P. (2018). Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLOS ONE, 13(9), e0203958. https://doi.org/10.1371/journal.pone.0203958

Torre, J. B., & Lieberman, M. D. (2018). Putting feelings into words: Affect labeling as implicit emotion regulation. Emotion Review, 10(2), 116–124.

Troy, A. S., Saquib, S., Thal, J., & Ciuk, D. J. (2018). The regulation of negative and positive affect in response to daily stressors. Emotion.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.


Tversky, A., & Kahneman, D. (1985). The framing of decisions and the psychology of choice. In Environmental Impact Assessment, Technology Assessment, and Risk Analysis (pp. 107–129). Springer.

Verduyn, P., Delvaux, E., Van Coillie, H., Tuerlinckx, F., & Van Mechelen, I. (2009). Predicting the duration of emotional experience: Two experience sampling studies. Emotion, 9(1), 83.

Verified signatories of the IFCN code of principles. (n.d.). Retrieved December 27, 2019, from https://ifcncodeofprinciples.poynter.org/signatories

Vicario, M. D., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2015). Echo chambers in the age of misinformation.

Vicario, M. D., Vivaldo, G., Bessi, A., Zollo, F., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2016). Echo Chambers: Emotional Contagion and Group Polarization on Facebook. Scientific Reports, 6(1), 1–12. https://doi.org/10.1038/srep37825

Wardle, C. (2017, February 16). Fake news. It’s complicated. First Draft News. https://firstdraftnews.com:443/fake-news-complicated/

Weedon, J., Nuland, W., & Stamos, A. (2017). Information operations and Facebook. Facebook Security.

Wineburg, S., & McGrew, S. (2017). Lateral reading: Reading less and learning more when evaluating digital information. Stanford History Education Group Working Paper No. 2017-A1. https://dx.doi.org/10.2139/ssrn.3048994


APPENDICES

Appendix A: List of fact-checking and other relevant organisations used in this research

Fact-checking Organisations (English-Language)
• Full Fact: https://fullfact.org/toolkit/
• FactCheck.org: https://www.factcheck.org/2016/11/how-to-spot-fake-news/
• PolitiFact: https://www.politifact.com/truth-o-meter/article/2014/aug/20/7-steps-better-fact-checking/
• Washington Post Fact Checker: https://www.washingtonpost.com/news/fact-checker/wp/2016/11/22/the-fact-checkers-guide-for-detecting-fake-news/?utm_term=.1d12c9f1f739
• Poynter Institute: https://www.poynter.org/educators-students/2016/9-questions-to-help-you-evaluate-the-credibility-of-news-sources/
• International Fact-Checking Network: https://www.poynter.org/fact-checking/2018/9-ways-you-can-help-fact-checkers-during-a-crisis/
• Africa Check: https://africacheck.org/how-to-fact-check/tips-and-advice/

Relevant Information Literacy Organisations
• National Library Board S.U.R.E. campaign: http://www.nlb.gov.sg/sure/wp-content/uploads/2018/07/SURE-Fake-news-tip-sheet.pdf
• The Craig Newmark Graduate School of Journalism at the City University of New York: https://researchguides.journalism.cuny.edu/c.php?g=547454&p=3756526
• Stanford History Education Group: https://sheg.stanford.edu/


Appendix B: Commonly identified biases that lead individuals to fall for fake news

There are five commonly identified biases which can impair one’s ability to detect fake news online. They are: availability heuristic, bandwagon effect, confirmation bias, Dunning–Kruger effect, and echo-chamber effect.

Realistically speaking, it is impossible for people to be completely free from the influence of their biases, but research has shown that their effect on an individual’s ability to detect fake news can be reduced if people are aware of these biases and challenge them (Lewandowsky et al., 2012).

B.1 Availability heuristic

The availability heuristic is a mental shortcut that relies on immediate examples that come to a person's mind (Tversky & Kahneman, 1973). When people prefer to attend to information that is immediate and easily accessible to them, this creates a weak spot that fake news creators can exploit. One of the ways fake news creators can do so is by making sure that the volume of fake news drowns out real news through the use of automated means like armies of bots and fake accounts (Ferrara et al., 2016).

People are likely to find fake news credible when they remember seeing similar content in the past. In the context of politics, researchers found that more than half of those who recalled seeing fake news stories believed that they were accurate (Allcott & Gentzkow, 2017). Others have noted that the repetition of information can increase the perceived accuracy of a statement even though it may be inaccurate (Berinsky, 2015; Pennycook et al., 2017). In the same vein, if there were reports of terrorist attacks in recent memory (“Don’t spread ‘unsubstantiated’ messages,” 2016), this could increase the perceived reliability of fake messages that warn of impending attacks (see Figure 10).

Figure 10. An example of fake news leveraging on the availability heuristic: a viral fake message in 2016 about possible attacks on locations in Singapore. One explanation for its spread is that it relied on recent media coverage of terror attacks, which made the message seem reliable.

B.2 Confirmation bias

Confirmation bias refers to a type of cognitive bias that leads people to favour information that confirms their beliefs over information that does not (Casad, 2007; Kiely & Robertson, 2016). One famous example of confirmation bias at work is a set of experiments showing that people rated evidence from a fictitious death-penalty study that supported their beliefs as reliable, while rating studies that did not support their beliefs as highly flawed (Lord et al., 1979). One explanation for confirmation bias is that people are motivated to avoid information that challenges their views because of the discomfort of holding inconsistent ideas (Festinger, 1962). Hence, it is easier to rationalise or ignore new evidence that goes against one’s beliefs than to recognise that one is mistaken.


Confirmation bias can make it difficult for people to reject fake news, especially if the fake news validates what they believe in. For example, in the context of the 2016 U.S. elections, researchers found that people were more likely to believe false news that supported their candidate (Allcott & Gentzkow, 2017), and that people who supported a particular political candidate were predisposed to assume that information attributed to their candidate was accurate, regardless of its actual accuracy (Swire et al., 2017).

B.3 Bandwagon effect

The bandwagon effect refers to the psychological phenomenon where people do something (e.g., adopt a new trend or go on a fad diet) primarily because other people are doing it. This pressure to conform with the group’s actions is best illustrated by a famous series of experiments3 by Asch (1955), who showed that people conformed with the wrong judgments made by the group even though they had initially made the correct judgment. There are a few explanations for this behaviour: thinking that the safe choice is the one made by the majority, or fear of isolation from the main group (Schmitt-Beck, 2015).

The bandwagon effect can amplify the spread of fake news. For example, research has shown that people use a large number of retweets or likes on a post as a cue to judge that the information is accurate (Borah & Xiao, 2018; Halse et al., 2016; Lin et al., 2016). Fake news creators have exploited this by using bots (Ferrara et al., 2016) to create a bandwagon effect artificially4. Additionally, Talwar et al. (2019) found that the fear of missing out5 (FoMO) is associated with the sharing of fake news among WhatsApp users in India, suggesting that FoMO may drive a sense of recklessness in which people share new information with their social group without checking it first.

B.4 Echo chamber effect

The echo chamber effect is a metaphorical description of a phenomenon where individuals expose themselves to information that validates and reinforces their views, just as individuals can hear echoes of their spoken words in a closed room (Barberá et al., 2015; Boutyline & Willer, 2017). One observation consistent with this effect is that social media users are likely to follow like-minded opinion leaders, receive news that promotes their preferred narratives, and form polarised groups (Quattrociocchi et al., 2016; Vicario et al., 2016).

Misinformation and rumours are likely to spread where echo chambers are present (Törnberg, 2018). There is empirical evidence showing that information containing

³ Asch’s experiments presented participants with a card showing a single vertical black line, together with a second card showing three lines of varying length labelled "A", "B", and "C". One line on the second card was the same length as the target line, while the other two were obviously longer or shorter. To test conformity, Asch placed a naive participant in a room with confederates who had agreed in advance on the (wrong) answers they would give when presented with the line task. The real participant, believing the others were also genuine participants, sat at the end of the row and answered last after hearing each person state aloud which comparison line (A, B or C) was most like the target line. Asch found that participants conformed with the group’s wrong judgments even though they had judged the lines correctly when asked to record their answers at the beginning.
⁴ Such a process is sometimes known as astroturfing: the attempt to create an impression of widespread grassroots support for a policy, politician, or business (Weedon et al., 2017). Fake news creators and political actors can systematically steer public opinion in their favour through an artificially boosted bandwagon effect, often achieved by deploying armies of bots or trolls.
⁵ The researchers of that study describe FoMO as an anxiety that grips individuals when they feel excluded from their social group (Talwar et al., 2019).

deliberately false claims is more readily accepted if it is consistent with the group’s beliefs (Anagnostopoulos et al., 2014; Vicario et al., 2015). For example, Vicario et al. (2015) showed that social homogeneity is a primary driver of the virality of scientific and conspiracy news, indicating that information is likely to be shared and spread among like-minded people regardless of its accuracy.

B.5 Dunning–Kruger effect

The Dunning–Kruger effect describes a cognitive bias in which people are overconfident in their assessment of their own ability, even when that ability is quite poor (see Figure 11). This effect is frequently observed with fake news: people commonly report that they are confident in their ability to spot fake news, yet often perform poorly when tested (Dunning, 2011; Pennycook & Rand, 2017; Nygren & Guath, 2019; The Susceptibility of Singaporeans, 2018).

This overconfidence despite a lack of knowledge or ability could be attributed to a few factors: lack of awareness of one’s own ignorance, reliance on general rather than domain-specific knowledge, or even narcissism (Dunning, 2011).

[Figure: a curve of confidence plotted against knowledge, peaking sharply when one knows almost nothing ("I know everything"), falling to a trough ("This is more complicated than I thought"), then rising steadily with growing expertise ("This is starting to make sense to me"; "I know that I know nothing"; "Trust me, this is complicated").]

Figure 11. Illustration of the Dunning-Kruger effect



Appendix C: Additional Resources

YouTube video series
• Crash Course: Navigating Digital Information by John Green, in partnership with MediaWise and the Stanford History Education Group.

Relevant reading materials
• Web Literacy for Student Fact-Checkers by Michael Caulfield
• Ten Questions for Fake News Detection by The News Literacy Project
• The Debunking Handbook by John Cook and Stephan Lewandowsky
• Journalism, ‘Fake News’ & Disinformation: Handbook for Journalism Education and Training by the United Nations Educational, Scientific and Cultural Organization (UNESCO)
• The S.U.R.E. Campaign website, which hosts resources about fake news in various languages: http://www.nlb.gov.sg/sure/sure-campaign/

Games
• Bad News is a free online browser game in which players strive to become a fake news tycoon. The game was created by the Dutch media organisation "DROG" in collaboration with the University of Cambridge to cultivate people’s ability to recognise and expose disinformation.
