Does Internet Censorship Reduce Crime Rate?


CommIT (Communication and Information Technology) Journal, Vol. 9 No. 1, pp. 35–44

DOES INTERNET CENSORSHIP REDUCE CRIME RATE?

Erwin Bramana Karnadi
Department of Economic Development, Faculty of Business Communication and Economics, Atma Jaya University, Jakarta 12390, Indonesia
Email: [email protected]

Received: March 18, 2015; received in revised form: April 21, 2015; accepted: May 12, 2015; available online: June 17, 2015.
Cite this article as: E. B. Karnadi, "Does internet censorship reduce crime rate?," CommIT Journal, vol. 9, no. 1, pp. 35–44, 2015.

Abstract—The study examines the relationship between Internet censorship and crime rate. Classical views of censorship suggest that filtering content perceived as negative, such as violence and pornography, can reduce crime; however, there is no evidence of any causal relationship between censorship and crime rate. The Internet has made it easier for individuals to access any information they want, and Internet censorship is therefore applied to filter content deemed negative. We discuss several approaches that may be used to estimate the relationship between Internet censorship and crime rate, and we decide to use OLS. We find that Conflict Internet Censorship is the only type of Internet censorship with a significant negative impact on crime rate, and that it significantly affects crime rate only in highly educated countries.

Keywords: Internet Censorship; Crime Rate; Education; Conflict Censorship

I. INTRODUCTION

The debate for and against censorship has been going on for decades. By removing content that is deemed inappropriate, it is reasonable to suggest that censorship may be able to control a person's behavior, including his or her likelihood of participating in crime. However, it is questionable whether the impact of censorship is significant enough, and censorship is not without its disadvantages. With the introduction of the Internet, everyone can access any information in an instant, and thus Internet censorship was born to control the information shared on the World Wide Web.

In Section II, we discuss the definition of Internet censorship and its advantages and disadvantages. Furthermore, we discuss the theoretical relationship between Internet censorship and crime rate. We argue that although it is reasonable to suggest that Internet censorship has a negative impact on crime rate, there is not sufficient evidence to support that claim.

In Section III, we discuss several approaches that can be used to estimate the relationship between Internet censorship and crime rate. We consider several methods, such as OLS, fixed effects, and 2SLS, but we decide to settle on OLS. In Section IV, we describe the data, their sources, and their measurements. Furthermore, we explain why we include these variables in our model.

In Section V, we estimate the relationship between the Internet censorship variables and the crime rate variable using OLS. We conclude that only the Conflict Internet Censorship variable is likely to have an impact on crime rate. Furthermore, we introduce interaction variables between each of the Internet censorship variables and a high-education dummy, and we find that the impacts of Internet penetration and censorship are more significant for highly educated countries. In Section VI, we summarize our results and state several limitations of our analysis.

II. LITERATURE REVIEW

A. What is Internet Censorship?

Internet censorship refers to the act of filtering and controlling what can be accessed on the Internet, usually done by a government to control the public. There are several approaches to Internet censoring (Ref. [1]):

• Technical Blocking. Technical blocking refers to blocking access to Internet sites through methods such as IP blocking, DNS tampering, and URL blocking using a proxy. It also includes keyword blocking, which blocks access to sites whose URLs contain specific words (a minimal illustrative sketch follows this list).
• Search Result Removals. Search result removal is a common censorship approach applied by Internet search engines such as Google and Yahoo. Instead of blocking access to specific sites, this approach makes finding them much more difficult. In many cases, it is used to make pirated content harder to find.
• Takedown. The takedown approach simply demands that specific sites be removed from the Internet.
• Induced Self-censorship. In many cases, individuals stay away from websites that may question the authorities, even when these sites are not restricted, e.g., porn sites that include child pornography.
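To make the blocking mechanisms above concrete, the following is a minimal sketch of a proxy-style filter that combines IP blocking, domain blocking, and keyword blocking. The blocklists, addresses, and domain names are hypothetical and are not taken from the paper; real national filtering is implemented in network infrastructure (DNS resolvers, proxies, deep packet inspection) rather than in a short script like this.

```python
from urllib.parse import urlparse

# Hypothetical, illustrative blocklists -- not taken from the paper or any real filter.
BLOCKED_IPS = {"203.0.113.7"}              # IP blocking
BLOCKED_DOMAINS = {"blocked.example"}      # domain blocking / DNS tampering target
BLOCKED_KEYWORDS = {"gambling", "piracy"}  # keyword blocking on the URL

def is_blocked(url: str, resolved_ip: str) -> bool:
    """Return True if a proxy following these illustrative rules would refuse the request."""
    parsed = urlparse(url)
    host = (parsed.hostname or "").lower()

    if resolved_ip in BLOCKED_IPS:   # technical blocking by IP address
        return True
    if host in BLOCKED_DOMAINS:      # blocking by domain name
        return True
    # keyword blocking: any blocked word appearing anywhere in the URL
    return any(word in url.lower() for word in BLOCKED_KEYWORDS)

if __name__ == "__main__":
    print(is_blocked("http://news.example/article", "198.51.100.1"))           # False
    print(is_blocked("http://blocked.example/home", "198.51.100.2"))           # True
    print(is_blocked("http://site.example/online-gambling", "198.51.100.3"))   # True
```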
Furthermore, Ref. [1] divided Internet censorship into four types based on their themes (a sketch of how these themes might be recorded as country-level variables follows this list):

• Political Internet Censorship. Political Internet Censorship covers content related to views that oppose the current political regime. Censorship of content related to human rights, minority rights, religious movements, and freedom of speech is also included in this group. Countries with high-level Political Internet Censorship include Vietnam and Uzbekistan.
• Social Internet Censorship. Social Internet Censorship covers content that is deemed inappropriate by certain religious or political groups, such as pornography, violent materials, gambling websites, and any other content that may not be suitable for children. Countries with high-level Social Internet Censorship include Saudi Arabia and the United Arab Emirates.
• Conflict (Security) Internet Censorship. Conflict Internet Censorship deals with content concerning separatist movements, border issues, and any other military movements that may jeopardize the security and safety of the country. Countries with high-level Conflict Internet Censorship include China and South Korea.
• Internet Tools Censorship. Internet Tools Censorship covers the blocking and censoring of tools used by Internet users, such as e-mail, Internet hosting, translation, search engines, and any other communication tools that use the Internet. Countries with high-level Internet Tools Censorship include Saudi Arabia and the United Arab Emirates.
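Purely as an illustration of how these four themes could enter an empirical model as explanatory variables, the sketch below records one score per theme for each country. The score scale, country names, and values are invented for this example and are not the paper's data; the paper's actual measurements are discussed in its Section IV.

```python
from dataclasses import dataclass

@dataclass
class CensorshipProfile:
    """Hypothetical per-country scores for the four censorship themes (illustrative only)."""
    country: str
    political: float  # 0 = no observed filtering, 1 = pervasive filtering (made-up scale)
    social: float
    conflict: float
    tools: float

# Made-up example values, for illustration only.
profiles = [
    CensorshipProfile("Country A", political=0.8, social=0.4, conflict=0.6, tools=0.5),
    CensorshipProfile("Country B", political=0.1, social=0.7, conflict=0.2, tools=0.6),
]

for p in profiles:
    print(p.country, p.political, p.social, p.conflict, p.tools)
```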
B. Advantages and Disadvantages of Internet Censorship

The debate over censorship has never been resolved, and it has only intensified since the introduction of the Internet. There are several advantages of Internet censorship:

• It protects individuals from incorrect and biased information. According to Ref. [2], there is not sufficient evidence to suggest that children can differentiate credible information from non-credible information. This can be dangerous, as biased and incorrect information can affect a child's perspective in a very significant way.
• It protects individuals from content that can be deemed inappropriate, such as pornography. Besides religious and ethical reasons, pornography may have negative impacts on young men and women, as they are more likely to engage in risky sexual behaviors when exposed to high levels of pornography (Ref. [3]).
• It battles piracy by blocking content that may violate intellectual property rights.
• It protects individuals from online scams.
• It can protect individuals from cyber-bullying, cyber-racism, cyber-homophobia, and cyber-sexism (Ref. [4]).

However, Internet censorship is not without disadvantages. Some of these disadvantages include:

• It limits freedom of speech. According to Ref. [5], the Computer Crime Act in Thailand has been criticized for its excessive Internet censorship.
• It gives the government too much power and control over information. This has been considered a major issue in China, as it may delay the radical change that China needs (Ref. [6]).

C. How Internet Censorship May Relate to Crime Rate?

The possible relationship between Internet censorship and crime rate stems from the relationship between crime rate and media censorship in general. The classical notion argued by most people is that violence and pornography in the media have significant impacts on an individual's violent behavior. However, a study by Ref. [7] concluded that there is not sufficient evidence to suggest any relationship between media violence and violent crime. Furthermore, Ref. [8] argued that pornography does not lead to an increase in violent sexual behavior.

The Internet takes information sharing to a different level, and its introduction to the public makes media censorship much more complicated. With the Internet, any individual can access any information they want; without restrictions, this includes child pornography, drug markets, and other content that is not only deemed inappropriate but is also expected to increase criminal intent. Along with Internet usage, Internet censorship has also been increasing, and it is gaining attention from scholars in disciplines such as media and communication, information technology, law, political science, and economics (Ref. [9]). In the next section, we consider several approaches that can be used to inspect the relationship between Internet censorship and crime rate.

III. EMPIRICAL STRATEGY

A. Using OLS: Possible Endogeneity Problem

Taking these variables into account, we consider the following equation:

crime_i = β0 + β1 · conflict_i + β2 · political_i + β3 · social_i + β4 · tools_i + β5 · internet penetration_i + β6
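As a minimal sketch of how a specification of this form could be estimated by OLS, the snippet below fits the equation on simulated data using statsmodels. The data are made up, the equation is cut off in the source after β6, and nothing here should be read as the paper's actual code or results; it only shows the mechanics of the estimation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 80  # hypothetical number of countries

# Made-up cross-country data; the real data sources are described in Section IV.
df = pd.DataFrame({
    "conflict": rng.uniform(0, 1, n),
    "political": rng.uniform(0, 1, n),
    "social": rng.uniform(0, 1, n),
    "tools": rng.uniform(0, 1, n),
    "internet_penetration": rng.uniform(0, 100, n),
})
# Simulated outcome so the example runs end to end.
df["crime"] = 50 - 8 * df["conflict"] + 0.05 * df["internet_penetration"] + rng.normal(0, 5, n)

X = sm.add_constant(df[["conflict", "political", "social", "tools", "internet_penetration"]])
ols_fit = sm.OLS(df["crime"], X).fit()
print(ols_fit.summary())  # the coefficient on "conflict" plays the role of beta_1
```

The interaction terms introduced in Section V would be added in the same way, e.g., as an extra regressor equal to the conflict score multiplied by a high-education dummy.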