Understanding User Beliefs About Algorithmic Curation in the Facebook News Feed
Emilee Rader and Rebecca Gray
Department of Media and Information
Michigan State University
{emilee, grayreb7}@msu.edu

ABSTRACT
People are becoming increasingly reliant on online socio-technical systems that employ algorithmic curation to organize, select and present information. We wanted to understand how individuals make sense of the influence of algorithms, and how awareness of algorithmic curation may impact their interaction with these systems. We investigated user understanding of algorithmic curation in Facebook's News Feed, by analyzing open-ended responses to a survey question about whether respondents believe their News Feeds show them every post their Facebook Friends create. Responses included a wide range of beliefs and causal inferences, with different potential consequences for user behavior in the system. Because user behavior is both input for algorithms and constrained by them, these patterns of belief may have tangible consequences for the system as a whole.

Author Keywords
algorithms; feedback loop; intuitive theories; Facebook News Feed

ACM Classification Keywords
H.5.m. Information Interfaces and Presentation (e.g. HCI): Miscellaneous

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]. CHI 2015, April 18–23, 2015, Seoul, Republic of Korea. Copyright is held by the owner/author(s). Publication rights licensed to ACM. ACM 978-1-4503-3145-6/15/04...$15.00. http://dx.doi.org/10.1145/2702123.2702174

INTRODUCTION
People are becoming increasingly reliant on online socio-technical systems that employ algorithmic curation: organizing, selecting, and presenting subsets of a corpus of information for consumption. An algorithm is "a finite, discrete series of instructions that receives an input and produces an output" [16], and systems like Facebook and Google (and many, many others) use algorithms as information intermediaries that determine what information should be displayed and what should be hidden [5].

Because algorithms are automated and usually poorly understood by end users, people often assume that they are objective or impartial [5]. However, just because a system is automated does not mean it is free from the potential for bias. Researchers are becoming increasingly concerned that algorithms enforce biases that are hard to detect, but have potentially negative outcomes [6, 14].

The Facebook News Feed is a socio-technical system composed of users, algorithms, and content. Users produce content items (posts) that become part of Facebook's corpus, which is continuously changing. At the same time, personalization algorithms select a subset of items from the corpus, rank or organize them according to a proprietary algorithm, and present them to users for consumption in their News Feeds. If a user were to contribute a post that the News Feed algorithm does not display near the top of others' News Feeds, that post becomes effectively invisible to those users. Bucher [6] calls this the "threat of invisibility," or the potential for one's contributions to go unseen by an unknown number of people. If users become aware of this possibility, avoidance of this "threat" can then guide their behaviors and choices as they engage with others via the system.

The potential for negative consequences like the threat of invisibility is presumably the result of feedback loops: situations where the output of a process becomes an input to that same process. This happens in social media because information consumers are also producers: both explicitly via choices to post, comment or "Like", and also implicitly via their behavioral traces recorded in system logs. What users learn about the system when they act as consumers can affect the choices they make as producers of content. Feedback loops make all users gatekeepers for each other, by using as input to the algorithm evidence collected from other users about the value (comments, Likes, shares) of the particular item in question [30]. It is extremely difficult to understand the complex, nonlinear interactions that take place in a socio-technical system like the Facebook News Feed, where the algorithm, the users, and the corpus of content itself are constantly interacting and evolving [27].

Because of the feedback loop characteristics of these systems, user beliefs about how content filtering algorithms work are an important component of shaping the overall system behavior. To better understand the interdependence between users and algorithms, we conducted a study investigating users' beliefs about what the Facebook News Feed chooses to display, and why. We describe the kinds of evidence of algorithm behavior that users notice and respond to and the beliefs they form about algorithm selection and ranking criteria, and we present implications of these beliefs for their own behavior and for the system overall.

RELATED WORK

Content Streams and Recommendations
Recommender systems researchers have explored ways to use algorithms to help users connect with the content they are most interested in on social media sites. Several research teams have proposed different ways to use information about network ties, topic preferences, and characteristics of posts in the system to make recommendations. For example, Sharma and Cosley [28] proposed using information about content preferences from the Facebook profiles of users and their Facebook Friends to help generate recommendations for movies, television shows, and books. Items recommended by the algorithm variant that used Facebook Friends' profile information received the highest number of views. In a follow-up study [29], they found that such social recommendations were more persuasive when they came from close friends whose interests were known to the user.

Much of this research has taken place using Twitter, which has been a more open platform to use for experimentation and evaluation of systems designed to help users with information overload in their content streams and feeds. Chen et al. [7] created a Twitter app to recommend "conversations" (a chain of @replies, or broadcasted messages directed to others via username) to users, to help them focus on the interactions that would be most interesting to them. The algorithm used conversation dimensions such as thread length, topic relevance, and tie strength to make its selections. The algorithm that performed best was a combination of topic and tie strength.

Bernstein et al. [4] also created a Twitter client to help users focus on information that would be most interesting to them. Their tool clustered tweets within a user's stream by topic, and allowed users to browse those topics. This was essentially a way to sort information in the user's feed into groups of similar items to help them focus on more relevant information. They tested a prototype with users (and users' own Twitter accounts) in the lab. They found that while users liked how it helped them read more relevant tweets when they were grouped by topic, they felt like it was harder to make sure they had seen everything.

In addition to relatedness, it is also important for the system to keep track of the timeliness of the items and not fill the user's content stream with outdated items or the same item over and over again. However, in a system where content items are turning over very quickly, like Twitter or Facebook, a greater degree of personalization means the system has fewer items it can possibly recommend to the user. This makes it difficult for the algorithm to select items that are both new to the user and relevant to the user's interests [21].

Finally, several researchers have attempted to evaluate whether filter bubbles (recommendations of decreasing diversity over time) are likely to exist in content recommendation systems. Nguyen et al. [25] analyzed a MovieLens dataset and found that both the diversity of items recommended by the system and items rated by users did indeed become less diverse over time, but only slightly. They concluded that this was weak evidence for a filter-bubble-like effect. In addition, in two separate modeling and simulation studies, independent research teams found that some recommendation algorithms are susceptible to a narrowing effect under particular conditions; however, neither paper presented empirical data to verify the simulation results [11, 17].

Research projects like these use various data about users and content available in social media systems as input to algorithms that are designed to prioritize content for display, and direct user attention to information they want to see. However, few projects have explored interaction with algorithmic curation systems with large numbers of users over long periods of time. It is difficult to evaluate algorithms in isolation from their context of use: without data from real users to act upon, one cannot get an accurate picture of performance [13, 21]. Therefore, bias that might be introduced by interdependence between user and algorithm behavior is largely unknown.

The Facebook News Feed
Facebook is the largest social network site (SNS) in the world, with over 1.28 billion monthly active users [9] and over 71% of American adults using the site [1]. The Facebook News Feed is a "constantly updating list of stories from people and Pages that you follow on Facebook" [10]. Maintaining relationships is the main reason most people use Facebook [22], and seeing updates and other shared content from Friends is
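The feedback loop described in the Introduction, where the output of the ranking process becomes its own input, can be sketched with a toy simulation. Everything here is invented for illustration (the scoring rule, the engagement probability, the feed size); it is not Facebook's actual algorithm, only a minimal demonstration of how engagement-driven ranking can make a few items dominate while the rest face the "threat of invisibility."

```python
# Toy simulation of an engagement feedback loop in a curated feed.
# Hypothetical parameters throughout; the point is the loop structure:
# items shown to users collect engagement, and accumulated engagement
# determines which items are shown in the next round.
import random

random.seed(42)

N_ITEMS = 20        # posts in the corpus
FEED_SIZE = 5       # items each consumer actually sees
N_USERS = 200       # consumers per round
N_ROUNDS = 10
ENGAGE_PROB = 0.1   # chance a viewer Likes/comments on a seen item

engagement = [0] * N_ITEMS  # accumulated Likes/comments per item

for _ in range(N_ROUNDS):
    # Rank by accumulated engagement (ties broken randomly) and show
    # only the top FEED_SIZE items; the rest are effectively invisible.
    ranked = sorted(range(N_ITEMS),
                    key=lambda i: (engagement[i], random.random()),
                    reverse=True)
    feed = ranked[:FEED_SIZE]
    # Consumer reactions become the input to the next round's ranking.
    for _ in range(N_USERS):
        for item in feed:
            if random.random() < ENGAGE_PROB:
                engagement[item] += 1

print("items that ever received engagement:",
      sum(e > 0 for e in engagement))
print("items never seen or engaged with:",
      sum(e == 0 for e in engagement))
```

Under this toy rule, whichever items happen to be shown in the first round accumulate engagement, keep their top ranking in every later round, and the remaining items never surface: a rich-get-richer dynamic consistent with the invisibility concern above.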
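The narrowing effect that Nguyen et al. [25] measured can be approximated as a drop in average pairwise distance among recommended items over time. The sketch below uses fabricated toy genre vectors and a hypothetical helper name (`avg_pairwise_distance`); it is not their method or data, only an illustration of what "recommendations of decreasing diversity" means operationally.

```python
# Illustrative diversity comparison: are later recommendations more
# similar to one another than earlier ones? Toy data, not MovieLens.
from itertools import combinations
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def avg_pairwise_distance(vectors):
    """Mean (1 - cosine similarity) over all item pairs; higher = more diverse."""
    pairs = list(combinations(vectors, 2))
    return sum(1 - cosine(a, b) for a, b in pairs) / len(pairs)

# Fabricated genre vectors for items recommended early vs. late
# in a user's history: the late set clusters around one genre.
early = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0)]
late = [(1, 0, 0), (1, 0.1, 0), (0.9, 0, 0.1), (1, 0.2, 0)]

print("early diversity:", round(avg_pairwise_distance(early), 3))
print("late diversity:", round(avg_pairwise_distance(late), 3))
```

A real analysis in this spirit would compute the same statistic over successive windows of a user's recommendation log and test whether the trend is significant, which is how a weak filter-bubble-like effect can be distinguished from noise.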