Entrepreneurial Solutions to Online Disinformation: Seeking Scale, Trying for Profit
ENTREPRENEURSHIP & POLICY WORKING PAPER SERIES
By Anya Schiffrin

In 2016, the Nasdaq Educational Foundation awarded the Columbia University School of International and Public Affairs (SIPA) a multi-year grant to support initiatives at the intersection of digital entrepreneurship and public policy. Over the past three years, SIPA has undertaken new research, introduced new pedagogy, launched student venture competitions, and convened policy forums that have engaged scholars across Columbia University as well as entrepreneurs and leaders from both the public and private sectors. New research has covered three broad areas: Cities & Innovation; Digital Innovation & Entrepreneurial Solutions; and Emerging Global Digital Policy. Specific topics have included global education technology; cryptocurrencies and the new technologies of money; the urban innovation environment, with a focus on New York City; government measures to support the digital economy in Brazil, Shenzhen, China, and India; and entrepreneurship focused on addressing misinformation. With special thanks to the Nasdaq Educational Foundation for its support of SIPA's Entrepreneurship and Policy Initiative.

Anya Schiffrin is the director of the Technology, Media, and Communications specialization at Columbia University's School of International and Public Affairs.

Executive summary

As worries about dis/misinformation online and the contamination of public discourse continue, and international regulatory standards have yet to be developed, a number of private sector companies have jumped into the breach to develop ways to address the problem. Most use a combination of people and natural language processing to try to identify false information online. Nascent and not yet widespread, the firms hope to survive by finding commercial applications for their products or by being adopted by the social media platforms. Most of the 13 startups we looked at fall into two categories: "bootstrap" operations with small amounts of funding, or slightly larger enterprises that have raised about one to two million dollars of capital. Whether any of these firms will be able to scale is not clear.

Introduction

Many solutions are being tried to combat the alarming spread of mis/disinformation online. To name just a few: governments around the world are imposing regulations on what can and cannot be said online, and foundations are funding fact-checking initiatives and efforts to build trust in media. Journalists are trying to build relationships with their audiences and strengthen local news reporting so as to provide societies with credible and useful information. Journalists also track the people and governments who create and spread false news and propaganda. Facebook, Google, and Twitter claim to be tackling the problem of removing dangerous and illegal content, but they are widely resented because they profit from the spread of mis/disinformation and have been slow to respond to warnings about the dangerous effects it is having on society. Facebook has blocked tools promoting advertising transparency on its site.1 Facebook has also consistently refused to release data to researchers seeking to measure the impact of exposure to disinformation.

In this fragmented universe of solutions, characterized by a lack of comprehensive platform regulation, a number of small private sector companies have proposed ways of solving the problem. The aim of this report is to understand what solutions they are developing and to outline some characteristics of the sector and points in common. We also tackle the question of whether their proposed solutions will be effective in the absence of widespread adoption by the giant platforms.

We carried out more than 20 in-depth interviews between December 2018 and February 2019, and this report includes short profiles of 13 firms. We identified the companies by asking experts in the field for suggestions, which we then cross-referenced, as well as by reading press reports. Most interviews were conducted on the phone, but some were done in person. We asked the companies about their technology and how it works, their business models, annual revenues, and their plans to scale, as well as their thoughts about government regulation.

Some general observations about the companies we studied:

• The companies more or less fall into three categories: a) detection and mapping, b) curation, and c) certification.
• There is disagreement as to whether natural language processing will be able to identify all forms of mis/disinformation online. Therefore, most of the outlets use a combination of people and natural language processing to try to identify false information online so it can be taken down (see the sketch after this list).
• Nascent and not yet widespread, these companies hope to survive by finding commercial applications for their products or by being adopted by the social media platforms. Brandwatch and Alto have existing businesses, and this new area, or new application of their technology, is not critical for their business continuity.
• Most of the 13 startups we looked at fall into two categories: bootstrap operations with small amounts of startup funding, or slightly larger firms that have raised about one to two million US dollars of funding. Some, like Brandwatch and Alto, are not startups. Alto has more than 100 employees, and Brandwatch, founded in 2006, has 64.7 million dollars in funding.3
• Mostly, the firms did not set out to crack the problem of dis/misinformation online. Many were involved in other activities, came across mis/disinformation, and decided to do something about it, notes Alejandro Romero of Alto Analytics, a data analysis firm based in Madrid with a presence in the UK, US, and Brazil.
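None of the firms profiled here publishes its pipeline, but the "people plus natural language processing" pattern the interviewees describe can be sketched generically. In the sketch below, everything is illustrative: the Post and ReviewQueue structures, the keyword-based score_post placeholder standing in for a real NLP model, and the two thresholds are inventions for exposition, not any company's actual system.

```python
# Minimal sketch of a "people + NLP" triage pipeline for flagging suspect posts.
# The scoring step is a trivial keyword heuristic standing in for a real natural
# language processing classifier; thresholds, labels, and markers are illustrative.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Post:
    post_id: str
    text: str
    score: float = 0.0          # model-estimated likelihood the post is false
    status: str = "unreviewed"  # "unreviewed" | "needs_human_review" | "cleared" | "taken_down"


def score_post(text: str) -> float:
    """Placeholder for an NLP model: returns a rough suspicion score in [0, 1]."""
    suspicious_markers = ["miracle cure", "they don't want you to know", "100% proof"]
    hits = sum(marker in text.lower() for marker in suspicious_markers)
    return min(1.0, hits / 2)


@dataclass
class ReviewQueue:
    """Posts the model is unsure about are routed to human fact-checkers."""
    pending: List[Post] = field(default_factory=list)

    def add(self, post: Post) -> None:
        post.status = "needs_human_review"
        self.pending.append(post)


def triage(posts: List[Post], queue: ReviewQueue,
           auto_takedown: float = 0.9, needs_review: float = 0.5) -> None:
    """Automate the clear-cut cases; send everything in the grey zone to people."""
    for post in posts:
        post.score = score_post(post.text)
        if post.score >= auto_takedown:
            post.status = "taken_down"
        elif post.score >= needs_review:
            queue.add(post)
        else:
            post.status = "cleared"


if __name__ == "__main__":
    queue = ReviewQueue()
    posts = [
        Post("1", "Local council approves new bike lanes."),
        Post("2", "Miracle cure THEY DON'T WANT YOU TO KNOW about, 100% proof!"),
        Post("3", "100% proof the moon landing was staged."),
    ]
    triage(posts, queue)
    for post in posts:
        print(post.post_id, round(post.score, 2), post.status)
```

The two thresholds make the division of labor explicit: software handles the obvious cases automatically, while ambiguous items go to human reviewers, which is the "combination of people and natural language processing" approach described above.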
Many have day jobs and core businesses. Many have ties to intelligence and government, cybersecurity, and fraud detection. Some are tiny (Eric Feinberg of GIPEC is just one person); some are large and well established.

"All the companies working in this space were working on something else and the disinformation they saw had an impact on what they were doing. They saw a business opportunity and thought this could be an interesting learning. I've not seen a company that started up just to fight disinformation," Romero said.

The startup executives we interviewed assume that widespread regulation of online hate speech and the platforms is not imminent, and that consumer and corporate demand for their products will continue. They note that platforms such as Facebook, YouTube, and Twitter have no incentive to change a business model that is based on generating outrage and engagement. In this scenario, without regulation the platforms will continue to allow, or even encourage, the circulation of falsehoods online.

Disinformation is false information spread deliberately to deceive. The English word disinformation is a loan translation of the Russian dezinformatsiya, derived from the title of a KGB black propaganda department.4

Claire Wardle, executive director of First Draft, has further broken down the definition into types of dis/misinformation, citing a number of scholars who have also tried to do so. They have provided a taxonomy of the different kinds of mis- and disinformation (including, for example, satire and false context as well as imposter content and fabricated content) and have also made taxonomies of who the different actors and targets are.

The platforms could do more to fix the problem but have no incentive to do so.

These firms exist because of a lack of action by social media companies that have no financial incentive to fix the problem themselves, and because government regulation addressing the problem has not yet been passed. Meanwhile, the regulation that does exist, namely Section 230 of the Communications Decency Act, protects the social media platforms from being held liable for what is posted on their platforms. Many of the companies we spoke to said that Facebook, YouTube, and Twitter have been extremely lax and irresponsible in allowing hate speech and disinformation to contaminate their networks. Many believe that, technically, it's not a hard problem to fix, but they note that there is no incentive for Facebook, Twitter, or YouTube to do so. "If the platforms try to tackle the problem internally it will be a huge revenue loss for them so they don't have an incentive to do it," predicted Sagar Kaul, one of the founders of Metafact, a fact-checking platform based in India.

"The online disinformation is a lot like the spam problem. And it could probably be solved the way we solved the spam problem. We solved spam with a combination of legal and technical measures. First, Congress passed a law (the CAN-SPAM Act)2 that imposed fines on the senders of spam. Faced with liability, the email industry then set up a 'black list' of known spammers that they all agreed to block.

Similarly, if there was some legal or financial cost to the platforms, they would likely set up a 'black list' of disinformation outlets to block.
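The technical half of that spam analogy, a shared "black list" that participants agree to enforce, is simple to express in code. The sketch below is a generic illustration under that assumption; the SharedBlocklist class, the helper names, and the example domains are made up for exposition and do not describe an existing industry system.

```python
# Minimal sketch of a shared "black list" filter, modeled on the email industry's
# spammer blocklists. Class names, domains, and rules here are illustrative.

from urllib.parse import urlparse


def _normalize(domain: str) -> str:
    """Lower-case a domain and drop a leading 'www.' prefix."""
    domain = domain.lower().strip()
    return domain[4:] if domain.startswith("www.") else domain


class SharedBlocklist:
    """A jointly maintained list of domains that participants agree to block."""

    def __init__(self, domains=None):
        self._domains = {_normalize(d) for d in (domains or [])}

    def add(self, domain: str) -> None:
        self._domains.add(_normalize(domain))

    def is_blocked(self, url: str) -> bool:
        host = _normalize(urlparse(url).hostname or "")
        # Block the listed domain itself and any of its subdomains.
        return any(host == d or host.endswith("." + d) for d in self._domains)


if __name__ == "__main__":
    blocklist = SharedBlocklist(["fake-news.example", "spam.example"])
    links = [
        "https://www.fake-news.example/miracle-cure",
        "https://news.fake-news.example/article",
        "https://reputable.example/report",
    ]
    for link in links:
        print("block" if blocklist.is_blocked(link) else "allow", link)
```

As the analogy implies, the filtering step is the easy part; what the email precedent supplied, and what is still missing for disinformation, is the legal or financial liability that made the industry willing to maintain and honor such a list.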
platforms and some other groups, like the United For News coalition, hope to persuade advertisers to push the platforms to do a better job of screening ads.

The idea of pressuring advertisers to pressure the platforms is an attractive one. Since most of the platforms' revenue comes from advertising, it seems that they would be most susceptible to pressure from advertisers. Journalists also hope to promote direct buying of advertising rather than programmatic buying, so that quality media can benefit from advertising revenue rather than see it go to random websites.

It is worth noting that advertisers would themselves be damaged by government bans on micro targeting or