Detecting Biased Statements in Wikipedia


Christoph Hube and Besnik Fetahu
L3S Research Center, Leibniz University of Hannover
Hannover, Germany
{hube, fetahu}@L3S.de

ABSTRACT

Quality in Wikipedia is enforced through a set of editing policies and guidelines recommended for Wikipedia editors. Neutral point of view (NPOV) is one of the main principles in Wikipedia, which ensures that for controversial information all possible points of view are represented proportionally. Furthermore, language used in Wikipedia should be neutral and not opinionated.

However, due to the large number of Wikipedia articles and an operating principle that relies on the voluntary work of Wikipedia editors, quality assurance and Wikipedia guidelines cannot always be enforced. Currently, more than 40,000 articles are flagged with NPOV or similar quality tags. Furthermore, these represent only the portion of articles for which such quality issues are explicitly flagged by Wikipedia editors; the real number may be higher, considering that only a small percentage of articles are categorized by Wikipedia as good or featured.

In this work, we focus on language bias at the sentence level in Wikipedia. Language bias is a hard problem: judging it is subjective, and the linguistic cues are usually subtle and can be determined only through context. We propose a supervised classification approach that relies on an automatically created lexicon of bias words and on other syntactic and semantic characteristics of biased statements.

We experimentally evaluate our approach on a dataset consisting of biased and unbiased statements, and show that we are able to detect biased statements with an accuracy of 74%. Furthermore, we show that competing approaches that determine bias words are not suitable for detecting biased statements; we outperform them with a relative improvement of over 20%.

KEYWORDS

Language Bias; Wikipedia Quality; NPOV

ACM Reference Format:
Christoph Hube and Besnik Fetahu. 2018. Detecting Biased Statements in Wikipedia. In WWW '18 Companion: The 2018 Web Conference Companion, April 23–27, 2018, Lyon, France. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3184558.3191640

1 INTRODUCTION

Wikipedia is one of the largest collaboratively created encyclopedias. Its community of editors consists of more than 32 million registered editors in the English Wikipedia alone. However, only a small minority, specifically 127,000 editors, are active¹. Due to the diverse demographics and interests of its editors, Wikipedia has a set of editing guidelines and policies to maintain the quality of the provided information.

One of the core policies is the Neutral Point of View (NPOV)². It requires that for controversial topics, Wikipedia editors should proportionally represent all points of view. The core guidelines in NPOV are to: (i) avoid stating opinions as facts, (ii) avoid stating seriously contested assertions as facts, (iii) avoid stating facts as opinions, (iv) prefer nonjudgemental language, and (v) indicate the relative prominence of opposing views.

Currently, there are approximately 40,000 Wikipedia pages that are flagged with NPOV (or similar) quality issues³. These represent explicit cases marked by Wikipedia editors, where specific Wikipedia pages or statements (sentences in Wikipedia articles) are deemed to be in violation of the NPOV policy. Recasens et al. [17] analyze cases that go against specific points of the NPOV guidelines. They find common linguistic cues, such as framing bias, where subjective words or phrases linked to a particular point of view are used (point (iv)), and epistemological bias, which focuses on the believability of a statement, thus violating points (i) and (ii). Similarly, Martin [11] shows cases of bias that violate all guidelines of NPOV, in an experimental study carried out on his personal Wikipedia page⁴.

¹ https://en.wikipedia.org/wiki/Wikipedia:Wikipedians#Number_of_editors
² https://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_view
³ This number may well be much higher, since it does not include cases that are not spotted by Wikipedia editors.
⁴ https://en.wikipedia.org/wiki/Brian_Martin_(social_scientist)

Ensuring that Wikipedia pages follow the core principles of Wikipedia is a hard task. Firstly, because editors provide and maintain Wikipedia pages on a voluntary basis, editor efforts are not always in line with the demand of the general viewership of Wikipedia [21], and as such they cannot simply be redirected to pages that have quality issues. Furthermore, there are documented cases where Wikipedia admins are responsible for policy violations and for pushing forward specific points of view on Wikipedia pages [2, 5], thus going directly against the NPOV policy.

In this work, we address quality issues that deal with language bias in Wikipedia statements, i.e. statements in violation of points (i)–(iv). We classify statements as being biased or unbiased; a statement in our case corresponds to a sentence in Wikipedia. We address one of the main deficiencies of related work [17], which focuses on detecting bias words. In our work, we show that, similar to [13], words that introduce bias or violate NPOV depend on the context in which they appear and, furthermore, on the topic at hand. Thus, our approach relies on an automatically generated lexicon of bias words for a given set of Wikipedia pages under consideration, in addition to semantic and syntactic features extracted from the classified statements.

As an example of language bias, consider the following statement:

• Sanders shocked his fellow liberals by putting up a Soviet Union flag in his Senate office.

The word shocked introduces bias in this statement, since it implies that "putting a Soviet Union flag in his office" is a shocking act.

To this end, we make the following contributions in this work:

• an automated approach for generating a lexicon of bias words from a set of Wikipedia articles under consideration,
• an automated approach for classifying Wikipedia statements as either biased or unbiased,
• a human-labelled dataset consisting of biased and unbiased statements.
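Returning to the example above, the claim that bias words depend on their context can be made concrete with a minimal toy sketch (invented here, not taken from the paper; the lexicon entries and the second example sentence are assumptions for illustration). A bare lexicon lookup flags "shocked" in both sentences, even though only the first uses it to frame an opinion, which is why a classifier also needs features describing the surrounding statement and topic:

# Toy sketch in Python: a bare bias-word lookup cannot distinguish an
# opinion-framing use of "shocked" from a neutral report of a reaction.
BIAS_LEXICON = {"shocked", "notorious", "so-called"}   # hypothetical entries

def lexicon_hits(sentence):
    """Return the bias-lexicon words that occur in a sentence."""
    tokens = [t.strip(".,") for t in sentence.lower().split()]
    return [t for t in tokens if t in BIAS_LEXICON]

framing = "Sanders shocked his fellow liberals by putting up a Soviet Union flag in his Senate office."
neutral = "Witnesses said they were shocked immediately after the explosion."

print(lexicon_hits(framing))   # ['shocked'] -> frames the act as shocking (biased)
print(lexicon_hits(neutral))   # ['shocked'] -> reports a reaction, not language bias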
2 RELATED WORK

Research on bias in Wikipedia has mostly focused on topics such as culture, gender, and politics [1, 8, 20], with some of the existing research referring to language bias.

Greenstein and Zhu [4] analyze political bias in Wikipedia with a focus on US politics. They use the approach introduced by Gentzkow and Shapiro [3], which was initially developed to determine newspaper slant. It relies on a list of 1,000 terms and phrases that are typically used by either Republican or Democratic members of Congress. Greenstein and Zhu search for these terms and phrases within Wikipedia articles about US politics to measure in which part of the spectrum (left- or right-leaning politics) these articles fall. They find that Wikipedia articles used to show a more liberal slant on average, but that this slant has decreased over time with the growth of Wikipedia and more editors working on the articles.
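The term-counting intuition behind this slant measure can be sketched as follows (a simplified illustration, not the original implementation: the phrase lists are tiny invented placeholders, and the actual method relies on roughly 1,000 phrases characteristic of Republican or Democratic members of Congress rather than this raw count):

# Simplified Python sketch of partisan-phrase counting to estimate article slant.
REPUBLICAN_PHRASES = ["death tax", "illegal aliens", "tax relief"]                    # placeholders
DEMOCRAT_PHRASES   = ["estate tax", "undocumented workers", "wealthiest americans"]   # placeholders

def slant_score(article_text):
    """> 0 leans toward the Republican phrase list, < 0 toward the Democratic one."""
    text = article_text.lower()
    rep = sum(text.count(p) for p in REPUBLICAN_PHRASES)
    dem = sum(text.count(p) for p in DEMOCRAT_PHRASES)
    total = rep + dem
    return 0.0 if total == 0 else (rep - dem) / total

print(slant_score("The bill provides tax relief and repeals the death tax."))   # 1.0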
For the seed extraction part of the approach we present in this paper, we also use articles related to US politics; however, instead of measuring political bias, our approach detects biased statements using features that are […]

[…] [15], and kill verbs. They also ask the workers for their political identification and find that conservative workers are more likely to label a statement as biased.

Wagner et al. [20] use lexical bias, i.e. the vocabulary that is typically used to describe women and men, as one dimension among others to analyze gender bias on Wikipedia.

Recasens et al. [17] tackle a language bias problem that is similar to ours. Given a sentence with known bias, they try to identify the most biased word using a machine learning approach based on logistic regression and mostly language features, i.e. word lists containing hedges, factive verbs, assertive verbs, implicative verbs, report verbs, entailments, and subjectives. They also use part-of-speech information and a bias lexicon with words that they extracted by comparing the before and after forms of Wikipedia articles for revisions that mention POV in their revision comment. The bias lexicon contains 654 words, including many words that do not directly introduce bias, such as america, person, and historical. In comparison, the approach for extracting bias words presented in this paper differs strongly from theirs, and our bias lexicon is more comprehensive, including almost 10,000 words. Recasens et al. report accuracies of 0.34 for finding the most biased word and 0.59 for having the most biased word among the top 3 words. They also use crowdsourcing to create a baseline for the given problem. The results show that the task of identifying a bias word in a given sentence is not trivial for human annotators: the human annotators achieve an accuracy of 30%.

Another important topic in the context of Wikipedia is vandalism detection [16]. While vandalism detection uses some methods that are also relevant for bias detection (e.g. blacklisting), it is important to note that bias detection and vandalism detection are two different problems. Vandalism refers to cases where editors deliberately lower the quality of an article, and such cases are typically more obvious. In the case of bias, editors might not even be aware that their contribution violates the NPOV.

3 LANGUAGE BIAS DETECTION APPROACH

In this section we introduce our approach for language bias detection.
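As a rough schematic of the setup outlined in the abstract and introduction (statements represented through bias-lexicon hits plus simple surface features, fed to a supervised classifier), consider the following sketch. It is invented for illustration and is not the authors' implementation: the lexicon entries, the feature set, the toy training sentences, and the choice of scikit-learn's LogisticRegression are placeholder assumptions, whereas the actual approach builds an automatically generated lexicon of almost 10,000 words and uses richer semantic and syntactic features.

# Schematic sketch: combine bias-lexicon hits with simple surface features and
# train a standard supervised classifier on labelled biased/unbiased statements.
from sklearn.linear_model import LogisticRegression

BIAS_LEXICON = {"shocked", "notorious", "infamous", "so-called"}   # placeholder entries

def features(sentence):
    tokens = [t.strip(".,") for t in sentence.lower().split()]
    hits = sum(1 for t in tokens if t in BIAS_LEXICON)
    return [hits,                        # number of bias-lexicon words
            hits / max(len(tokens), 1),  # their ratio to sentence length
            len(tokens)]                 # sentence length

train_sentences = [
    "Sanders shocked his fellow liberals by putting up a Soviet Union flag.",
    "The so-called experts were notorious for their misleading claims.",
    "The Senate passed the bill in 2014.",
    "The city has a population of 83,000.",
]
train_labels = [1, 1, 0, 0]   # 1 = biased, 0 = unbiased (toy labels)

clf = LogisticRegression()
clf.fit([features(s) for s in train_sentences], train_labels)
print(clf.predict([features("The infamous senator shocked the public.")]))   # e.g. [1]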
