KH Anglais Presse File 5 – American Elections 2020 – Social Media / Conspiracy Theories

A/ Who will choose the next US president – the American people, or Facebook?

Jonathan Freedland, The Guardian, Fri 31 Jul 2020

This week, in a hearing on Capitol Hill, you could gaze upon the men with the power to determine November’s presidential election and the future of American democracy – but the men in question were not politicians. Rather they were the four tech titans who appeared before a congressional committee. Even via video link, the power radiated from them: the heads of Facebook, Google, Amazon and Apple loomed from the monitors as veritable masters of the universe, their elected questioners mere earthlings.

That hardly exaggerates their might. Between them, and with their users numbered in the billions, Facebook and Google determine much of what the human race sees, reads and knows. Mark Zuckerberg’s writ runs across the planet, no single government is able to constrain him: he is an emperor of knowledge, a minister of information for the entire world. A mere tweak of an algorithm by Facebook can decide whether lies, hate and conspiracy theories spread or shrivel.

That’s been true for a while, but in 2020 it’s gained an extra urgency. We know the impact social media had in the US election in 2016 – when ever wilder fictions and fantasies were allowed to proliferate about Hillary Clinton and when, according to the Oxford scholar Philip Howard in a new book, Lie Machines: “There was a one-to-one ratio of junk news to professional news shared by voters over Twitter.” In fewer than 100 days Americans will choose a president, and there are no guarantees that the same thing will not happen again.

What’s more, it’s now clear that the online spread of falsehoods is a matter of life and death. (They knew that already in Myanmar, where the violence against the Rohingya people was incited on Facebook.) In the midst of a pandemic, solid, verified information is an essential tool of public health. If bogus claims and unhinged conspiracy theories – like those aired in a pseudo-documentary such as Plandemic – land in people’s news feeds, it’s as if the water supply has become contaminated. Eventually Facebook and YouTube took down Plandemic, with its evidence-free assertions that Covid-19 is the fault of Bill Gates and the World Health Organization, that vaccines are bad and that wearing a mask is dangerous, but not before millions had ingested that garbage on those platforms.

Of course, cranks and fantasists have been with us forever, but social media has given them a reach they could never have dreamed of. Armed with Facebook, the would-be propagandist can distribute messages globally and instantly and, at the same time, deliver them to a precisely selected audience, thanks to the copious data Facebook holds on its users, the use of which allows ads to be micro-targeted for a price. And remember, this data isn’t limited to the attitudes you might have expressed online, but could include the purchases you’ve made on your credit card, even the mundane details of your life, as recorded by the gadgets that comprise the internet of things.

Occasionally the social media behemoths are compelled to take at least the appearance of action, if only for the sake of managing their own reputations. It happened this week, with the eventual removal of grime artist Wiley from multiple platforms after he went on an extended, hate-filled rant against Jews: after a 48-hour “walkout” from Twitter, organised by an ad-hoc group of activists and celebrities, the network appeared to realise hosting high-profile racism isn’t a good look. Today, Twitter removed the account of white supremacist David Duke, which prompts the question: what on earth took you so long?

Make no mistake, the presence of lies and hate on these platforms is not some regrettable bug. It is a feature. The business model for social media requires attention – eyeballs – and the best way to get that is engagement. Messages that stir anger, fury and yes, hate, keep people online more effectively than content that is merely interesting or amusing. It’s why studies show that false news spreads faster than true news: the algorithms are designed to favour virality over veracity.

What can be done? There’s no shortage of ideas. Some start with the demand for fact-checking and, after the 2016 election, Facebook took steps in that direction. But when it emerged that one of its fact-checking partners was Daily Caller, a rightwing news website known for pushing misinformation, the scheme’s credibility plunged.

Or, more simply, Facebook, YouTube and Twitter could admit that of course they are publishers and they should therefore take the responsibility that goes with the mighty power they have. If that means hiring a million moderators to check their content, weeding out lies and hate, then so be it. They can hardly cry poverty: these are close to trillion-dollar companies.

If they don’t like the analogy with publishers, then perhaps they’d rather be treated like, say, car manufacturers, who, if found to be delivering a dangerously faulty product, have to recall and fix that product, regardless of the expense. At the moment, the social media giants enjoy legal protection from such liability in the US.

Politicians could change that, just as they could follow Howard’s demands in Lie Machines and break the big companies’ “monopolisation of information” by legislating a citizens’ right to donate their own data to smaller organisations: that way such groups would be more able to compete with the tech giants and those able to pay for their services.

But, as this week’s hearing proved, elected representatives are not powerful enough to do that alone. They’d have to work together, governments across the globe. They’d need the backing of advertisers, withdrawing their pounds and dollars from companies that give a platform to hate. And they’d need all of us to declare we’re sick of this poison in the information bloodstream, and we won’t rest till it’s drained away.

• Jonathan Freedland is a Guardian columnist

B/ Misinformation about Biden’s health spreads after debate
TikTok videos and Trump ads with false information got more than 700,000 views and clicks

USA Today, Oct. 1, 2020

False stories about Joe Biden’s health continued to spread on social platforms the day after the first presidential debate, including misleading Facebook ads by the Trump campaign and a viral video on TikTok. A false story about Biden wearing an earpiece that emerged on Tuesday continued to get traction on Facebook after the debate. The Trump campaign ad, which encouraged people to “Check Joe’s Ears” and asked “Why won’t Sleepy Joe commit to an earpiece inspection,” was viewed between 200,000 and 250,000 times and marketed primarily to people over 55 in Texas and Florida. The implication of the ad, the content of which originated from a tweet by a New York Post reporter who cited a single anonymous source, is that Biden needed the assistance of an earpiece so someone could pass him information during the debates. And on the video platform TikTok, four grainy videos alleging that Biden was wearing a wire to “cheat” during the debate racked up more than half a million combined views on Wednesday, according to research by the left-leaning media watchdog group Media Matters. One of the videos shows a still of Biden with his hand inside his suit, while another overlays an arrow over Biden’s tie, but neither video shows any visual evidence of Biden wearing an electronic device of any kind. Tech companies have long struggled with misinformation and are on high alert going into the election. Ahead of the debate, Twitter and Facebook executives reviewed hashtags, trends, and other accounts that may break the companies’ rules using a combination of software and human review. The companies are also pushing out accurate information about how to register to vote to millions of people. But the latest evidence shows that they continue to struggle, particularly when it comes to falsehoods spread by the president and his followers. On Wednesday night, Facebook said the falsehoods would undermine the legitimacy of the election, following the company’s previous announcement of a ban on new ads in the week before the election. Adding to concerns, Twitter said it acted on a tip from the FBI to remove 130 accounts that appeared to originate in Iran and were attempting to sow disinformation during the presidential debate. Twitter said the accounts had a minimal reach. TikTok said it would remove the Biden video after being contacted by USA TODAY. The company prohibits misinformation that “misleads community members about elections or other civic processes.” The campaign ad on Facebook focusing on Biden reveals a significant hole in the social media giant’s enforcement efforts. Though the company says it spends a huge amount of resources combating election-related misinformation — including on fact-checking posts and news articles — the social network does not fact-check political ads as a matter of policy. That makes paid speech an exploitable category for misinformation. Hundreds of Facebook employees have opposed the company’s policy to not fact-check political ads. (…) Facebook did take the rare step on Wednesday of removing Trump ads that made baseless claims that accepting more refugees would increase health risks related to the pandemic. There were more than 30 versions of the ad running on the social network, according to Facebook’s ad transparency library. It had gathered between 200,000 and 250,000 impressions.
“We rejected these ads because we don’t allow claims that people’s physical safety, health, or survival is threatened by people on the basis of their national origin or immigration status,” Facebook spokesman Andy Stone said in a statement.

The Long Read
C/ Do social media threaten democracy?

Facebook, Google and Twitter were supposed to save politics as good information drove out prejudice and falsehood. Something has gone very wrong

Leaders, The Economist, Nov 4th 2017

IN 1962 a British political scientist, Bernard Crick, published “In Defence of Politics”. He argued that the art of political horse-trading, far from being shabby, lets people of different beliefs live together in a peaceful, thriving society. In a liberal democracy, nobody gets exactly what he wants, but everyone broadly has the freedom to lead the life he chooses. However, without decent information, civility and conciliation, societies resolve their differences by resorting to coercion. How Crick would have been dismayed by the falsehood and partisanship on display in this week’s Senate committee hearings in Washington. Not long ago social media held out the promise of a more enlightened politics, as accurate information and effortless communication helped good people drive out corruption, bigotry and lies. Yet Facebook acknowledged that before and after last year’s American election, between January 2015 and August this year, 146m users may have seen Russian misinformation on its platform. Google’s YouTube admitted to 1,108 Russian-linked videos and Twitter to 36,746 accounts. Far from bringing enlightenment, social media have been spreading poison. Russia’s trouble-making is only the start. From South Africa to Spain, politics is getting uglier. Part of the reason is that, by spreading untruth and outrage, corroding voters’ judgment and aggravating partisanship, social media erode the conditions for the horse-trading that Crick thought fosters liberty.

A shorter attention spa...oh, look at that!

The use of social media does not cause division so much as amplify it. The financial crisis of 2007-08 stoked popular anger at a wealthy elite that had left everyone else behind. The culture wars have split voters by identity rather than class. Nor are social media alone in their power to polarise—just look at cable TV and talk radio. But, whereas Fox News is familiar, social-media platforms are new and still poorly understood. And, because of how they work, they wield extraordinary influence. They make their money by putting photos, personal posts, news stories and ads in front of you. Because they can measure how you react, they know just how to get under your skin. They collect data about you in order to have algorithms determine what will catch your eye, in an “attention economy” that keeps users scrolling, clicking and sharing—again and again and again. Anyone setting out to shape opinion can produce dozens of ads, analyse them and see which is hardest to resist. The result is compelling: one study found that users in rich countries touch their phones 2,600 times a day.

It would be wonderful if such a system helped wisdom and truth rise to the surface. But, whatever Keats said, truth is not beauty so much as it is hard work—especially when you disagree with it. Everyone who has scrolled through Facebook knows how, instead of imparting wisdom, the system dishes out compulsive stuff that tends to reinforce people’s biases. This aggravates the politics of contempt that took hold, in the US at least, in the 1990s. Because different sides see different facts, they share no empirical basis for reaching a compromise. Because each side hears time and again that the other lot are good for nothing but lying, bad faith and slander, the system has even less room for empathy. Because people are sucked into a maelstrom of pettiness, scandal and outrage, they lose sight of what matters for the society they share.
This tends to discredit the compromises and subtleties of liberal democracy, and to boost the politicians who feed off conspiracy and nativism. (…)

Social media, social responsibility

What is to be done? People will adapt, as they always do. A survey this week found that only 37% of Americans trust what they get from social media, half the share that trust printed newspapers and magazines. Yet in the time it takes to adapt, bad governments with bad politics could do a lot of harm. Society has created devices, such as libel and ownership laws, to rein in old media. Some are calling for social-media companies, like publishers, to be similarly accountable for what appears on their platforms; to be more transparent; and to be treated as monopolies that need breaking up. All these ideas have merit, but they come with trade-offs. When Facebook farms out items to independent outfits for fact-checking, the evidence that it moderates behaviour is mixed. Moreover, politics is not like other kinds of speech; it is dangerous to ask a handful of big firms to deem what is healthy for society. Congress wants transparency about who pays for political ads, but a lot of malign influence comes through people carelessly sharing barely credible news posts. Breaking up social-media giants might make sense in antitrust terms, but it would not help with political speech—indeed, by multiplying the number of platforms, it could make the industry harder to manage.

There are other remedies. The social-media companies should adjust their sites to make clearer if a post comes from a friend or a trusted source. They could accompany the sharing of posts with reminders of the harm from misinformation. Bots are often used to amplify political messages. Twitter could disallow the worst—or mark them as such. Most powerfully, they could adapt their algorithms to put clickbait lower down the feed. Because these changes cut against a business model designed to monopolise attention, they may well have to be imposed by law or by a regulator. Social media are being abused. But, with a will, society can harness them and revive that early dream of enlightenment. The stakes for liberal democracy could hardly be higher.

D/ Why Everyone Is Angry at Facebook Over Its Political Ads Policy

The New York Times, By Mike Isaac Published Nov. 22, 2019, Updated Sept. 4, 2020

SAN FRANCISCO — After Google announced restrictions on political advertising this week, campaign strategists in Washington quickly turned their attention to a different company: Facebook. Some strategists voiced concerns to Facebook about how Google’s decision would affect it, said two people who talked to the company. They told Facebook that if it followed Google by limiting how political campaigns target audiences, it would hurt their ability to reach unregistered voters and make it tougher for smaller organizations to collect donations online, the people said. The conversations added to the pressure on Facebook as it weighs how to handle political advertising. Mark Zuckerberg, the chief executive, has made it clear that Facebook will not fact check ads from politicians — even if they contain lies — in the interest of free speech. But the social network is discussing some ad changes, like restricting how precisely campaigns can reach specific groups, said three people briefed by the company. Facebook has made no final ruling on its political advertising guidelines, said the people, who declined to be identified because the discussions were confidential. But Facebook risks being whipsawed by its indecision. On the one hand, the company wants to curtail the spread of disinformation across its site. The practice of targeting specific groups with ads, known as “microtargeting,” can stoke disinformation because advertisers can inflame niche audiences who may be susceptible to tailored messages. At the same time, Facebook wants to avoid alienating the groups and candidates who depend on its platform for fund-raising and organizing. So in trying to find a way to please everyone on the issue, Facebook has managed to please no one. The social network has now become an outlier in how freely it lets political candidates and elected officials advertise on its platform. While Mr. Zuckerberg declared last month that Facebook would not police political ads, Twitter said it would ban all such ads because of their negative impact on civic discourse. On Wednesday, Google said it would no longer allow political ads to be directed to specific audiences based on people’s public voter records or political affiliations. The pressure on Facebook over what to do about political ad targeting has been unrelenting. Organizations on both sides of the political aisle — from as large as President Trump’s re-election campaign to smaller, grass-roots groups — have tried to persuade Facebook not to rein in the ad targeting. “Making major changes to platform ad targeting would severely disadvantage Democrats and progressives who rely on Facebook for fund-raising and currently have a much smaller organic audience and current database of supporters to engage on and off the platform than ,” said Tara McGowan, founder and chief executive of Acronym, a progressive nonprofit group. Some Republican strategists said they also feared losing the ability to raise significant campaign donations online if Facebook reduced ad targeting. On Friday, the Democratic National Committee, the Democratic Senatorial Campaign Committee and the Democratic Congressional Campaign Committee issued a joint statement condemning Google’s changes to its ad targeting policies. “Tech companies should not reduce the power of the grass roots just because it is easier than addressing abuse on their platforms,” the groups said in a memo to CNN. 
“We call on these tech companies, including Google, to reconsider their decision to bluntly limit political advertising on their platforms.”

E/ Americans Favor Protecting Information Freedoms Over Government Steps to Restrict False News Online

But 56% support steps from technology companies, even if it means some limits on publishing and accessing information

BY AMY MITCHELL, ELIZABETH GRIECO AND NAMI SUMIDA, THE PEW RESEARCH CENTER, APRIL 18, 2019

The widespread concerns over misinformation online have created a tension in the United States between taking steps to restrict that information – including possible government regulation – and protecting the long-held belief in the freedom to access and publish information. A new Pew Research Center survey finds that the majority of Americans are resistant to action by the U.S. government that might also limit those freedoms but are more open to action from technology companies.

When asked to choose between the U.S. government taking action to restrict false news online in ways that could also limit Americans’ information freedoms, or protecting those freedoms even if it means false information might be published, Americans fall firmly on the side of protecting freedom. Nearly six-in-ten Americans (58%) say they prefer to protect the public’s freedom to access and publish information online, including on social media, even if it means false information can also be published. Roughly four-in-ten (39%) fall the other way, preferring that the U.S. government take steps to restrict false information even if it limits those freedoms, according to a survey conducted Feb. 26-March 11, 2018, among 4,734 U.S. adults who are members of Pew Research Center’s nationally representative American Trends Panel.

When the same question is posed about technology companies taking those steps, however, the balance changes. More U.S. adults (56%) favor technology companies taking steps to restrict false information, even if it limits the public’s freedom to access and publish information. By comparison, 42% prefer to protect those freedoms rather than have tech companies take action, even if it means the presence of some misinformation online.

The resistance to U.S. government action cuts across nearly all demographic groups studied, with strong sentiments among young Americans, the college educated and men, as well as both Democrats and Republicans. The exceptions are those with a high school degree or less and those ages 50 and older, who are about evenly divided between the government taking steps and ensuring the protection of information freedoms.

Additionally, most demographic groups express more support for action by tech companies than by the U.S. government. Yet the degree of support for such companies taking steps varies across groups. Specifically, Democrats express more support for technology companies acting than do Republicans, even if it brings some broader limits on freedom to publish. Older Americans (ages 50 and older) are also more supportive of tech companies taking action than are younger adults.

[Chart: Younger Americans show greater resistance to action by both the government and tech companies to limit misinformation online]

[Chart: The less educated are more supportive of action by the U.S. government to restrict false news]
[Chart: Both men and women favor protecting freedom of information online over government intervention, but men do so at higher rates]

F/ The Conspiracy Theory QAnon

The rise of QAnon

The Morning, Newsletter from The New York Times, August 13 2020

Over three years, the far-right conspiracy theory QAnon has spread from the outskirts of the internet into the mainstream. And by next year, it will most likely make its way into Congress: A QAnon supporter is almost sure to be elected after winning a Republican House primary runoff in Georgia this week. It’s a remarkable rise for a group that believes in, among other things, a ring of Satan-worshiping, child-trafficking criminals led by prominent Democrats. We spoke with our colleague Kevin Roose, who has covered QAnon extensively, to get a sense of how the movement expanded its reach. Social media platforms are a big part of it. “It’s still very fringe in terms of its ideology, but not in terms of its scale,” Kevin said. “We’ve seen QAnon Facebook groups swell to hundreds of thousands of members, and they are routinely driving conversations on social media.” This week, NBC News reported that an internal Facebook investigation found thousands of QAnon-supporting groups and pages with millions of members and followers. Twitter permanently suspended thousands of accounts associated with the movement last month. And TikTok has blocked searches for QAnon-related hashtags. But these companies have “realized belatedly that this is a major problem,” Kevin said. “The horse has left the barn.” The pandemic, which led many people to spend more time online, has also bolstered the movement, Kevin said: “Our social interactions are mostly taking place online, and that means that the communities that have power online, including QAnon, are a much bigger part of the discourse.” Much of its growth also comes from its ability to attach itself to, and then absorb, both legitimate causes and “every major conspiracy theory of the past 50 years,” Kevin said. Its followers are “deliberately attempting to radicalize new groups of people,” he noted, by infiltrating Facebook groups focused on vaccine safety, parenting, and natural food and health. “These are people who might be skeptical of mainstream science or authorities,” Kevin said, “and they’re inserting their messages to those communities. So that’s what people need to be made aware of — these aren’t people hanging out in the dark corners of the internet anymore.”

G/ Debunked QAnon conspiracy theories are seeping into mainstream social media. Don't be fooled.

NATHAN BOMEY | USA TODAY, Sept 17, 2020

An emboldened community of believers known as QAnon is spreading a baseless patchwork of conspiracy theories that are fooling Americans who are looking for simple answers in a time of intense political polarization, social isolation and economic turmoil. Experts call QAnon a "digital cult" because of its pseudo-religious qualities and an extreme belief system that enthrones President Donald Trump as a savior figure crusading against evil. The core of QAnon is the false theory that Trump was elected to root out a secret child-sex trafficking ring run by Satanic, cannibalistic Democratic politicians and celebrities. Although it may sound absurd, it has nonetheless attracted devoted followers who have begun to perpetuate other theories that they suggest, imply or argue are somehow related to the main premise.

While many QAnon theories and content remain on fringe platforms like far-right message board 8kun, some have made their way into mainstream social media services like Facebook, Twitter and YouTube. On those platforms, the bogus or misleading material is gaining traction among people who have no idea they're dabbling in QAnon. While the major tech platforms have said they're cracking down on certain QAnon content, much of it continues to circulate. Here are some key elements to watch out for:

Pizzagate
This conspiracy theory was a precursor to QAnon, but it has recently regained momentum and become intertwined with QAnon. Originating during the 2016 presidential campaign, this falsehood claimed that emails exposed by Wikileaks showed Democrats with ties to Hillary Clinton had been running a child-sex ring from the basement of a Washington, D.C., pizzeria. None of it was true, but that didn't stop a North Carolina man from traveling to the district to investigate the matter for himself. He fired his rifle in the restaurant before he was arrested and later imprisoned. No one was injured.

The Deep State
A common claim among QAnon conspiracists is that a shadowy network of politicians and bureaucrats secretly collaborate to control the government behind the scenes. While this claim takes many forms, it generally centers on the suggestion that a cabal of powerful elites is manipulating the world. For some, the fantasy of a Deep State is a pillar of their belief system regarding government, business and entertainment. Trump himself has promoted the concept many times, including most recently to assert that a "deep state, or whoever," at the Food and Drug Administration was "making it very difficult for drug companies to get people in order to test the vaccines and therapeutics" for COVID-19. It's one thing to allege that government bureaucrats are posing an obstacle to progress, or that politicians make too many backroom deals. Those are common criticisms. It's another thing to suggest, without evidence, that a secret network of people are coordinating plans to disrupt the rule of law and democracy. In a letter published by USA TODAY following Trump's accusations about the FDA, eight agency officials defended their processes and scientific integrity. "When it comes to decisions to authorize or approve the products we regulate, or to take appropriate action when we uncover safety issues, we and our career staff do the best by public health when we are the decision-makers, arriving at those decisions based on our unbiased evaluation of the scientific evidence," the officials wrote.

COVID-19
Because of its massive effect on everyone's lives, COVID-19 has become the target of numerous conspiracy theories connected to QAnon. The World Health Organization has a term for the collision of the coronavirus and misinformation: "infodemic." Some QAnon followers have suggested that the pandemic was a Chinese bioweapon or that its eruption was designed in part by Democrats to derail Trump's reelection chances. Scientists who are studying where the virus originated have generally concluded it emerged in nature and was passed on to humans after passing through animals, starting likely in a bat or pangolin. Scientists and others, including defense and intelligence experts, have said there’s no basis for believing the virus was intentionally released as a bioweapon and that it's false to claim Democrats engineered the release to hurt the president. Some researchers have continued pressing the question of whether the virus was released accidentally from a lab in Wuhan, China, but the prevailing view among scientists is that the scenario is not supported by evidence and analysis. Others have suggested falsely that Bill Gates, his foundation, or both had planned the pandemic. More recently, QAnon followers have promoted a false interpretation of Centers for Disease Control and Prevention data, saying it proved that the pandemic was not as deadly as health officials have reported. Taken together, the pandemic-related theories tied to QAnon illustrate the movement's appeal, as it attempts to explain frustrating elements of life as developments that make sense in the broader scheme of things.

#SaveTheChildren
This hashtag, along with the related #SaveOurChildren, has circulated widely in recent months, posing as a harmless and, in fact, noble cause. Who can't get behind the idea of saving children? The problem is that the hashtag has been used to promote QAnon's false theory that a broad network of pedophiles is using their collective power to run a child-sex trafficking ring. The hashtag masquerades as a mainstream cause, drawing currency from the unrelated century-old humanitarian group with the same name. Experts say that QAnon has gained momentum in part through posts with hashtags like #SaveTheChildren because people who share them on social media often don't realize that they're amplifying an insidious network of theories that the FBI has called a domestic terror threat.

Signal decoding
QAnon started with a mysterious, anonymous person using the name "Q Clearance Patriot" and claiming to be a high-ranking intelligence officer with access to insidious secrets. The person, known to followers as "Q," has continued to publish mysterious posts with coded language that followers attempt to puzzle out. In general, the suggestion that there are secret signals, like clandestine acronyms and jumbled grammar, that can be decoded to reveal the truth about global mysteries bears the hallmarks of the QAnon community, experts say. It can also take on the form of suggestions that users do their own research or that they have taken the "red pill," a reference to the film "The Matrix," in which the main character suddenly understands how his world has been manipulated after he ingests the drug.

H/ As QAnon grew, Facebook and Twitter missed years of warning signs about the conspiracy theory’s violent nature
Fears of clamping down on authentic speech created a ‘Frankenstein’

The Washington Post, Oct. 1, 2020 https://www.washingtonpost.com/technology/2020/10/01/facebook-qanon-conspiracies-trump/ When an article on Facebook detailed a QAnon conspiracy theory about a “16-Year Plan to Destroy America,” commenters demanded death for those supposedly involved, including former president Barack Obama, former secretary of state Hillary Clinton and other Democrats. Some Facebook commenters were specific in their calls for justice: “Firing squad---by SHOTGUN!” Others just craved speed: “TREASON = FIRING SQAUD [sic] OR HANGING! DO IT NOW PLEASE THAT’S THE LAW! ! ! ! ! ! ! ! ! ! ! ! ! !” These posts — from January 2018, just months after QAnon flamed to life from the embers of Pizzagate, with its false claims of a child sex ring run by Democrats out of a Washington pizzeria — were among the many early warnings that the new conspiracy theory was fueling hatred and calls for violence on Facebook, Twitter and other social media. But it would be years before Facebook and Twitter would make major moves to curb QAnon’s presence on their platforms, despite serious cases of online harassment and offline violence that followed, and moves by other social media companies to limit the spread of QAnon’s lurid and false allegations of pedophilia and other crimes. One social media company, Reddit, closed forums devoted to the conspiracy theory in 2018 because of online harassment and calls for violence, and YouTube removed tens of thousands of QAnon videos and hundreds of related channels in June 2019 as part of a broader crackdown on content that violated its hate speech and other policies, YouTube spokesman Alex Joseph said. Still, it would be another year before Facebook and Twitter would initiate broad crackdowns against QAnon, waiting until this past summer to close or limit the reach of more than 20,000 QAnon-connected accounts and pages after two years of QAnon-fueled threats of violence and numerous real-world crimes. By then, FBI officials, in an intelligence briefing, had warned that QAnon was becoming a potential domestic terrorism threat, and the U.S. Military Academy’s Combating Terrorism Center had warned that “QAnon represents a public security threat with the potential in the future to become a more impactful domestic terror threat.” QAnon adherents made good use of the delay, using the power of those mainstream social media platforms to grow the movement into what many researchers consider the world’s largest and most virulent online conspiracy theory. Feverishly analyzing cryptic “drops” of information from the anonymous leader “Q,” followers spread misinformation about a host of seemingly unconnected issues, from the Sandy Hook, Conn., mass shooting to the supposed dangers of vaccines to the recent wildfires in the Pacific Northwest. Throughout, they traded in anti-Semitic tropes and other hateful content. “These accusations were so deranged,” said researcher Travis View, who co-hosts a podcast called “QAnon Anonymous” and has watched in growing horror as the conspiracy theory grew. “I always knew it would get to the point where people would ask: How did it get to this point? How did it get so bad?” One key answer, researchers who have studied QAnon say, was Silicon Valley’s fierce reluctance to act as “an arbiter of truth” even as disinformation with potentially dangerous consequences ran rampant across its platforms. 
Mainstream social media companies permitted the growth of the conspiracy theory in part because they considered it authentic domestic political speech at a time when President Trump and other Republicans were bashing the firms for alleged bias against conservatives, people familiar with internal debates at the companies say. Twitter’s head of site integrity, Yoel Roth, acknowledged his company had been slow. “Whenever we introduce a change to our policies, we can look back and wish that we’d introduced it earlier. And I think in the case of QAnon in particular, there were signals that I wish we and the entire industry and the world had responded to sooner,” he said in an interview. Facebook and Twitter did take actions against individual QAnon accounts and pages in the years before the recent crackdowns, including in April, when Facebook took down five pages and six QAnon-affiliated groups that had amassed more than 100,000 members and followers. But by the time of more systemic action this summer, more than 7,000 accounts affiliated with QAnon were spreading what Twitter called harmful disinformation on its service. Facebook removed nearly 800 groups and banned 300 hashtags when it acted in August, and placed restrictions on an additional 10,000 accounts across Facebook and Instagram. The company declined to say how many members the groups had, but researchers have said that millions of Facebook users were probably affected. Researchers say these moves curbed QAnon’s reach somewhat, but several asked: What took so long?

At QAnon’s core are baseless allegations that Democratic officials and Hollywood celebrities engaged in unconscionable crimes, including raping and eating children, while seeking to subvert the Constitution. Trump, the conspiracy theory holds, is quietly battling these evils. The “Q” of QAnon is supposedly a high-level government official privy to these secrets because of a top-secret security clearance. The shadowy figure speaks only on the site 8kun, a successor to the now-closed 8chan, but the information for years spread almost instantly across mainstream social media platforms, powered by those analyzing Q’s pronouncements. More than 70 Republican candidates have promoted or voiced support for at least some elements of the conspiracy theory this year, according to tracking by liberal research group Media Matters, and one open adherent, Marjorie Taylor Greene, is virtually guaranteed to win a seat in Congress in November’s election. Trump has praised Greene, defended QAnon supporters and retweeted content from QAnon accounts. QAnon T-shirts, slogans and posters have regularly appeared at Trump events since 2018 and, as his reelection effort intensified this year, in campaign ads as well. White House social media director Dan Scavino has posted QAnon-themed imagery. Vice President Pence had plans earlier this month to attend a Montana fundraiser hosted by a couple that has shared QAnon posts and memes on social media until the event was publicly reported. The list of violence inspired by QAnon is long, and the serious incidents date back to 2018, when an armed man touting the conspiracy theory was arrested after a standoff at the Hoover Dam. Another man fixated on QAnon fatally shot a New York crime family figure in 2019. And this past April, police arrested a woman armed with more than a dozen knives after she announced on Facebook that Clinton and former vice president Joe Biden “need to be taken out.” Even as the companies regarded QAnon posts as a largely protected class of free speech, there often were apparent violations of company policies against calls for violence and harassment of individuals. Though the targets often were public figures — including Obama, model Chrissy Teigen and Serbian artist Marina Abramovic — the intensity and hatefulness of the posts were as obvious in 2018 and 2019 as they were when Facebook and Twitter took action this summer, researchers said.

The political persuasion of most of QAnon’s supporters cooled interest within Facebook in cracking down on the conspiracy theory at a time when the company was working to refute Republican allegations of bias, said people familiar with internal conversations. Executives feared that punishing Trump supporters would compromise the perception of neutrality that the company hoped to achieve and would result in the censorship of genuine political speech.

One notable change that heightened calls for action this year: QAnon conspiracy theorists began touting false cures for covid-19, crossing a red line the social media companies had drawn during the early phases of the global pandemic. On this subject, Twitter, Facebook, YouTube and other companies have chosen to combat falsehoods more directly than ever before, arguing that untrue medical information was different from political speech. That applied even to posts by Trump and his top supporters, who for the first time faced sanctions from social media sites, including warning labels and the removal of some especially blatant misinformation. In May, a conspiratorial documentary called Plandemic, in which a discredited research scientist made false claims that wearing masks helps cause the coronavirus, went viral, becoming one of the top trending videos on YouTube. Social media researcher Erin Gallagher traced the aggressive promotion of the documentary back to a handful of Facebook groups with tens of thousands of members each.

Even more recently, researchers have documented QAnon accounts pushing false claims that members of antifa, a loosely organized, far-left political faction, had started wildfires in the Pacific Northwest. This prompted Stone, the Facebook spokesman, to tweet on Sept. 12 that the company was removing posts because police were having “to divert resources from fighting the fires and protecting the public.” The move was ridiculed in replies to Stone’s tweet as too little, too late. The FBI took to Twitter to debunk the allegation. “Reports that extremists are setting wildfires in Oregon are untrue,” the Portland office tweeted.

I/ Georgia Republican Marjorie Greene slams House vote to condemn QAnon

House overwhelmingly voted in favor of bipartisan resolution Friday: 371-18

Fox News, October 3, 2020 The House overwhelmingly voted in favor of a resolution to condemn QAnon conspiracy theories Friday, drawing the ire of an incoming Georgia congresswoman who has made headlines nationally for her past praise for Q. Marjorie Greene, the presumptive next congresswoman from Georgia's 14th congressional district, slammed the House for passing the resolution and said "Congress is failing" for not focusing on condemning Antifa and Black Lives Matter instead. "There's no reason for a resolution to condemn non-existent violence from folks talking about the VERY REAL Deep State and the attempted coup on President Trump on the internet," Greene told Fox News in a statement about QAnon.

Greene, a controversial candidate who has been previously censored on social media for posts deemed threatening, went on to say the real threats are Antifa, Black Lives Matter and the Democrats who support the groups. "Republicans should lead strongly by standing up for innocent Americans against this smear and rightfully designate Black Lives Matter and Antifa both domestic terror organizations," Greene said. "The fact that Democrats refuse to condemn the real domestic terrorists BLM and Antifa is what Americans should remember when they go to vote on Nov. 3." But most Republicans joined with Democrats in passing the resolution by a 371-18 vote Friday, marking the first official bipartisan condemnation of this conspiracy movement that critics say is actively working to undermine public trust in America’s democratic institutions. Forty members didn't vote on the resolution and one Republican, Rep. Andy Harris of Maryland, voted present. QAnon, thought to have been founded in 2017, has been tied to theories about a "deep state" attack against Trump, involving career bureaucrats who are hellbent on taking down the president. It’s unclear who "Q" actually is, and if he or she is just one person or multiple people. People supporting QAnon began appearing at Trump campaign rallies in 2018 and the president has retweeted QAnon-affiliated accounts dozens of times. The resolution was sponsored by Rep. Denver Riggleman, R-Va., and Rep. Tom Malinowski, D-N.J., who said he faced death threats after QAnon conspiracy theorists targeted him after a GOP ad falsely accused him of lobbying to "protect sexual predators." “Conspiracy theories that falsely blame secret cabals or marginalized groups for society’s ills have long fueled prejudice, violence and terrorism,” Malinowski said in a statement after the vote. “Today the House of Representatives came together across party lines to say that QAnon has no place in our nation’s political discourse.” The resolution condemns QAnon and rejects the conspiracy theories it promotes; condemns all other groups and ideologies, from the far left to the far right, that contribute to the spread of unfounded conspiracy theories and violence and destruction; and encourages the FBI to prevent criminal activity motivated by fringe political conspiracy theories. "QAnon initially alleged that prominent Americans are engaged in a secret plot to control the world, while using their power to exploit children, and has expanded to embrace virtually every popular conspiracy theory of the last several decades, from questioning the truth about the September 11th terrorist attacks, to believing in alien landings, to denying the safety of vaccines," the resolution states; it also lays out several examples of violence inspired by the conspiracy theories. During the debate about the resolution in the House Rules Committee earlier this week, Democrats said it's time to repudiate the extremist ideology. "QAnon is a sick cult -- plain and simple," said Rep. Jim McGovern, D-Mass. "Its supporters gather in the darkest corners of the internet to spread hate and conspiracy theories that lead to violence." Republicans didn't defend QAnon, with Rep. Tom Cole, R-Okla., saying "they're just crazy," but the GOP also wanted the resolution changed to specifically call out Antifa, the far-left movement. Rep. Debbie Lesko, R-Ariz., made an effort to amend the legislation, but it was voted down by the Democratic majority in the committee.
"Both groups are bad and I think you should add them in," Lesko said. Also this week, Facebook and Twitter promised to stop encouraging the growth of the baseless conspiracy theory QAnon, which fashions Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and government officials, after it reached an audience of millions on their platforms this year. The QAnon phenomenon sprawls across a patchwork of secret Facebook groups, Twitter accounts and YouTube videos. QAnon has been linked to real-world violence such as criminal reports of kidnapping and dangerous claims that the coronavirus is a hoax. But the conspiracy theory has also seeped into mainstream politics, with several GOP candidates like Greene who have been QAnon-friendly. Greene had previously said her "Q"-supporting videos are in her past and don't represent her priorities for Congress. But this week, as QAnon came to the House floor, she defended it as exposing the deep state efforts against Trump and protecting children.

More links and resources

The Role of social media

> This is what Facebook is saying they’re doing:
https://about.fb.com/actions/preparing-for-elections-on-facebook/?utm_source=newsletter&utm_medium=email&utm_campaign=wp_post_most&bb=true&wpisrc=nl_most

QAnon

> A very useful video from The Washington Post
https://www.washingtonpost.com/video/politics/how-qanon-the-bizarre-pro-trump-conspiracy-theory-took-hold-in-right-wing-circles-online/2019/02/20/13fc256f-d647-41cd-be69-fbeba3769579_video.html

> Another video (This time in French, from France Culture)
https://www.franceculture.fr/sciences/qanon-est-il-un-vrai-mouvement-populaire?utm_medium=Social&utm_source=Facebook&fbclid=IwAR30qahikgwsbvIOZm0prv46YGXJ1TnVwjG5x-U9nI1pCwFHaYL0eU2gR-0#Echobox=1601445976

> And their series of short radio programmes https://www.franceculture.fr/emissions/le-tour-du-monde-des-idees/qanon-ou-le-complotisme-dun-nouveau-genre

> The Long Read: a very long (because thorough) analysis and history of the QAnon phenomenon https://www.theatlantic.com/magazine/archive/2020/06/qanon-nothing-can-stop-what-is-coming/610567/

> Or an older article by the same reporter https://www.theatlantic.com/technology/archive/2017/06/the-normalization-of-conspiracy-culture/530688/