Trustworthy misinformation mitigation with soft information nudging

Benjamin D. Horne, Maurício Gruppi, Sibel Adalı
Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY, USA
[email protected], [email protected], [email protected]

Abstract—Research in combating misinformation reports many negative results: facts may not change minds, especially if they come from sources that are not trusted. Individuals can disregard and justify lies told by trusted sources. This problem is made even worse by social recommendation algorithms, which help amplify conspiracy theories and information confirming one's own biases due to companies' efforts to optimize for clicks and watch time over individuals' own values and the public good. As a result, more nuanced voices and facts are drowned out by a continuous erosion of trust in better information sources. Most misinformation mitigation techniques assume that discrediting, filtering, or demoting low veracity information will help news consumers make better information decisions. However, these open up the information's correctness to debate, which can lead to suppression of minority voices and opinions. Furthermore, sophisticated misinformation campaigns mix correct and incorrect information, which can cause uncertainty and make discrediting information more difficult. The majority of technical solutions have focused on classifying these extremes (fake and real), leaving uncertain, mixed-veracity, and deeply contextual information difficult to assess automatically. A deeper problem left unaddressed in the technical research threads is what to do when information corrections, whether done by an algorithm or journalist, do not work.