<<

Towards Value-Sensitive Learning Analytics

Bodong Chen Haiyi Zhu University of Minnesota University of Minnesota Minneapolis, MN, USA Minneapolis, MN, USA [email protected] [email protected]

ABSTRACT The increasing emphasis on ethical considerations in learning ana- To support ethical considerations and system integrity in learning lytics is situated within growing public concerns with algorithmic analytics, this paper introduces two cases of applying the Value opacity and bias, as well as awakening advocacy of algorithmic Sensitive Design methodology to learning analytics design. The transparency and accountability [1]. In the field of learning ana- first study applied two methods of Value Sensitive Design, namely lytics, a rich body of theoretical, practical, and policy work has stakeholder analysis and value analysis, to a conceptual investi- emerged to promote the ethical collection and use of learning data. gation of an existing learning analytics tool. This investigation In 2016, the Journal of Learning Analytics published a special section uncovered a number of values and value tensions, leading to design featuring work on ethics and privacy issues [13]. Ethical frame- trade-offs to be considered in future tool refinements. The second works, codes of ethical practice, and ethical consideration checklists study holistically applied Value Sensitive Design to the design of a have been developed [11, 25, 32]. New policy documents are also put recommendation system for the Wikipedia WikiProjects. To proac- forward by higher education institutions to enhance transparency 1 tively consider values among stakeholders, we derived a multi-stage with regard to data collection and data usage. design process that included literature analysis, empirical investi- In contrast with the private sector, education as a social func- gations, prototype development, community engagement, iterative tion also has a moral responsibility to promote student success. testing and refinement, and continuous evaluation. By reporting on Therefore, ethical considerations in learning analytics need to go these two cases, this paper responds to a need of practical means beyond issues that are foregrounding the current public discourse to support ethical considerations and human values in learning an- surrounding surveillance and privacy [13]. Educational institutions alytics systems. These two cases demonstrate that Value Sensitive have “an obligation to act”—to effectively allocate resources to fa- Design could be a viable approach for balancing a wide range of cilitate effective teaching and learning through ethical and timely human values, which tend to encompass and surpass ethical issues, interventions [26]. Ethical practice in learning analytics entails in learning analytics design. considerations of not only typical ethical issues but also factors and values pertinent to the larger education system. Recently, Buck- CCS CONCEPTS ingham Shum [2] proposes a Learning Analytics System Integrity framework that goes beyond algorithmic transparency and account- • Information systems → Data analytics; Social networking ability to consider multiple pillars of a learning analytics system sites; • Human-centered computing → Visual analytics; • Ap- including learning theory, , human–computer interaction, plied computing → Collaborative learning; pedagogy, and human-centered design. While learning analytics KEYWORDS as a field is rightfully giving more attention to ethical challenges with data analytics, to foster ethical practice we also need effective learning analytics, values, value sensitive design, data ethics, social methods to consider ethical challenges in a manner that would media engage perspectives from different stakeholders while allowing ACM Reference Format: education—as a complex system—to fulfill its societal function. Bodong Chen and Haiyi Zhu. 2019. Towards Value-Sensitive Learning An- To this end, this paper proposes to apply Value Sensitive De- alytics Design. In The 9th International Learning Analytics & Knowledge sign—a theory and methodology initially developed in the field Conference (LAK19), March 4–8, 2019, Tempe, AZ, USA. ACM, New York, NY, of Human–Computer Interaction (HCI)—to support the design of arXiv:1812.08335v2 [cs.HC] 28 Jan 2019 USA, 10 pages. https://doi.org/10.1145/3303772.3303798 value-sensitive learning analytics systems. We posit that Value Sen- sitive Design can (1) guide researchers and practitioners in the 1 INTRODUCTION process of evaluating and refining existing learning analytics ap- Over the past few years, considerations of the ethical implications plications, and (2) provide systematic guidance on developing new of learning analytics have become a central topic in the field [27]. learning analytics applications to more holistically address values pertinent to stakeholders, educators, and the society. Permission to make digital or hard copies of all or part of this work for personal or The remainder of this paper is structured as follows. We first classroom use is granted without fee provided that copies are not made or distributed briefly review work on ethics frameworks, algorithmic transparency, for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the and accountability in the field of learning analytics. Then we turn author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or to the Value Sensitive Design literature, with a focus on its appli- republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]. cation to the development of information systems. After outlining LAK19, March 4–8, 2019, Tempe, AZ, USA our research goals, we introduce two studies in Sections 3 and 4. © 2019 Copyright held by the owner/author(s). Publication rights licensed to ACM. ACM ISBN 978-1-4503-6256-6/19/03...$15.00 https://doi.org/10.1145/3303772.3303798 1See the SHEILA project for examples: http://sheilaproject.eu/la-policies/ LAK19, March 4–8, 2019, Tempe, AZ, USA Chen & Zhu

We conclude this paper by discussing the importance of consid- learning analytics, we need means to interrogate the values that are ering values and value tensions in learning analytics design and often in tension with each other. The learning analytics accountabil- development. ity analysis put forward by Buckingham Shum [2] considers system integrity from multiple angles and holds promise for eliciting values 2 BACKGROUND beyond current thinking on ethical issues. Take automated feedback 2.1 Learning Analytics Systems: Ethics and on academic writing for example. Values supported by an academic writing analytics tool include the accurate detection of rhetorical Values moves based on linguistic features and usable feedback interface Learning analytics is defined as “the measurement, collection, anal- that can facilitate student sense-making. However, the accurate ysis and reporting of data about learners and their contexts, for detection of rhetorical moves may demand more learner data and is purposes of understanding and optimising learning and the envi- hereby in tension with the value of privacy foregrounded in ethical ronments in which it occurs” [24, p. 34]. By harnessing increasingly considerations. To better support learning analytics design, the abundant educational data and data mining , learning field needs strategies for weighing competing values and deriving analytics aspires to build applications to personalize learning based design trade-offs to mitigate value tensions. on individual progress, predict student success for timely inter- vention, provide real-time feedback on academic writing, and so forth [33]. As learning analytics applications are increasingly in- 2.2 Value Sensitive Design tegrated in the existing academic technology infrastructure (e.g. Stemming from an orientation towards human values, Value Sensi- student information systems, learning management systems, stu- tive Design “represents a pioneering endeavor to proactively con- dent support systems), more educational institutions are becoming sider human values throughout the process of technology design” equipped with strong, connected analytics capacity to better under- [8, p. 1]. In Value Sensitive Design, value is operationally defined stand students and hopefully to also better support student success. as “what is important to people in their lives, with a focus on ethics Empirical evidence accumulated so far has indeed demonstrated and morality” [15, p. 68]. Value Sensitive Design offers a systemic promise of learning analytics in supporting student learning [21]. approach with specific strategies and methods to help researchers While learning analytics are garnering international interests and explicitly incorporate the consideration of human across different levels of education, ethical and privacy concerns values into design [8, 15]. It recognizes the complexity of social life are increasingly mentioned in both public and scholarly discourse. and attempts to illuminate value tensions among stakeholders. Slade and Prinsloo [34] listed a number of ethical challenges in Value Sensitive Design is a theory, a methodology, and an estab- learning analytics, including the location and interpretation of data; lished set of methods [8]. As a methodology, Value Sensitive Design informed consent, privacy, and de-identification of data; and the involves three types of investigations: conceptual, empirical, and classification and management of data. Scholars have organized a technical [8]. Together they could form an iterative process of iden- series of workshops at the Learning Analytics and Knowledge (LAK) tifying and interrogating values in technology design. According conference to collectively tackle ethical challenges [e.g. 12], leading to [8, 16], conceptual investigations are primarily concerned with to the development of ethical principles, frameworks, and checklists identifying direct and indirect stakeholders and eliciting values to guide ethical research and practice [11, 13, 29, 40]. Empirical work held by different stakeholders. A conceptual investigation can draw has also been done, for example, to examine student perceptions of from philosophical and theoretical texts in order to conceptualize a privacy issues with regard to learning analytics applications [22]. value at hand (e.g. trust, autonomy). Empirical investigations ven- Design work following the principles has shown ture further to collect empirical data from stakeholders about their promise in identifying solutions to addressing privacy concerns [37]. perceptions of a value and value tensions (e.g. privacy vs. utility) in While institution- code of ethics are being created, efforts are a specific context. Both qualitative and quantitative methods canbe also made to help individual practitioners apply abstract principles used in empirical investigations. Finally, technical investigations fo- and code of ethics in real-world scenarios [23]. cus on the designed technology itself and seek to either proactively Undergirding these discussions of ethics and privacy are values design new technological features to support values or examine in learning analytics systems. Generally speaking, value represents how a current design would support or undermine human values. either “something important, worthwhile, or useful” or “one’s judg- These three types of investigations can be integrated in an iterative ment of what is important in life” (Oxford English Dictionaries, design process to holistically address human values. 2018). For instance, privacy is a fundamental human right, which Fourteen different methods can be drawn to support Value Sensi- holds not only value for a person but also social and public values tive Design [15]. For example, direct and indirect stakeholder analysis for democratic societies [28]. So safeguarding privacy has naturally can lead to the identification of indirect stakeholders of transit infor- become an important ethical consideration. There are other values mation tools [38]; fictional value scenarios are useful for revealing that are also important in learning analytics but often go unstated. values and tensions when developing mobile apps [7]; value dams For instance, as educators we value student success, learner agency, and value flows could be a useful analytic method for resolving and equitable access to education [13], but these values, despite value tensions among design choices—by removing a their educational significance, are often neglected in the discussion based on strong objection from a small percentage of stakeholders of data ethics. In other words, various values are not equally recog- (the value dams) or accepting a design option based on favorable nized in current discourse, leaving some values and tensions among reaction from a certain portion of stakeholders (the value flow) values neglected or under-explored. To promote ethical practice in Towards Value-Sensitive Learning Analytics Design LAK19, March 4–8, 2019, Tempe, AZ, USA

[7]. These methods suit varied purposes of design and a project the university. In other words, this tool could well represent the adopting Value Sensitive Design can draw from multiple methods. state-of-the-art of learning analytics applications designed for this particular context. 2.3 Overview of the Paper As described by its project website, the development of the Yel- To examine how Value Sensitive Design—the theory, methodology, lowdig Tool was motivated by an interest in helping and methods—could be applied to learning analytics design, this students and faculty learn more about their online discussions by paper reports on two studies. accessing Yellowdig data. Prior research on educational use of Web The first study is a conceptual investigation of a learning ana- 2.0 tools and online discussion environments has demonstrated lytics application that has already been developed. Applying two the need to support learner participation and engagement from Value Sensitive —stakeholder analysis and value pedagogical, technological, and sociocultural angles [6, 18]. The analysis [15]—this study aimed to (1) identify key stakeholders Yellowdig Visualization Tool was one example of supporting stu- of the developed application, (2) elicit values and value tensions, dent use of Yellowdig through designing a new learning analytics and (3) derive design trade-offs to be considered in future design application, which is expected to be coupled with pedagogical inter- iterations. ventions. During the design process, the design team came up with Moving beyond conceptually investigating an existing system, an initial prototype and engaged students in to the second study introduces a case of applying Value Sensitive elicit their ideas about visualizations that could be useful for their Design to proactively consider human values throughout a design learning. The version of the tool analyzed in this study contained process. The introduced case is about developing a value-sensitive the following key features (see Figure 1): recommendation system for the Wikipedia WikiProjects. Applying • Multi-mode, multi-plex network visualization. The visualiza- ideas from Value Sensitive Design, we and colleagues (1) engaged tion tool represents one class’s Yellowdig discussion as a stakeholders in the early stages of the algorithm design and used multi-mode, multi-plex network that contains multiple types stakeholders’ insights to guide the algorithm development, (2) itera- of nodes (students, posts, and comments) and multiple types tively improved, adapted, and refined the algorithmic system based of relations (posting, commenting). In this network, each on the stakeholders’ feedback, and (3) evaluated the recommen- Yellowdig post (also called “pin”) is represented by a blue dation system not only based on accuracy but also stakeholders’ square filled either by solid blue color (if the post contains acceptance and their perceived impacts of the system. pure text) or a thumbnail (if the post contains a rich me- dia object); comments on posts are shown as light green 3 STUDY I: A CONCEPTUAL INVESTIGATION circles; users are represented by grey user icons. Following OF A DISCUSSION VISUALIZATION TOOL a typical network visualization heuristic, the size of a post 3.1 Study Context nodes is correlated with the number of connections it has. The whole network is plotted using a force-directed layout, The first study was a conceptual investigation of a learning analytics which keeps well connected nodes in the center and pushes tool developed to visualize online discussions in a learning plat- less connected nodes to the edge. form named Yellowdig.2 This study and the creation of Yellowdig • Node highlighting. When a node is chosen by the user, the were contextualized within higher education’s interests in fostering visualization would highlight the node’s “2.0 level ego net- social learning and peer interactions among students. Branded as work,” which comprises the chosen node (i.e. the ego) and a social media platform designed for education use, Yellowdig re- all other nodes that can be reached within two steps. The sembles many of social network sites such as Facebook and Twitter. content of the chosen node is also displayed in the top panel For example, it features a news feed similar to that of Facebook; of the tool (see Figure 1). learners can contribute text or multimedia posts to the news feed, • Visualization of temporal growth. A movie can be played where other learners can “like” or comment on each other’s posts to inspect both daily and weekly growth of the discussion . By providing social-media features students are already familiar network. The placement of all nodes and edges are fixed in with, this tool aims to facilitate social learning in online and blended advance. The temporal visualization adds nodes and edges classes without intruding into students’ personal social networks. into the network in a step-wise manner. Since its inception in 2014, Yellowdig has been adopted by a number • Filtering mechanisms for instructor use. The instructor view of higher education institutions. of this visualization tool provides additional mechanisms As one of the early adopters of Yellowdig, Northwestern Univer- for filtering the network by student names, post tags, and sity developed a to visualize students’ Yellowdig Visualization Tool named entities (such as Uber) recognized by text mining peer interactions and conceptual engagement on Yellowdig.3 The algorithms. These filters are displayed on the right panel of design of this tool was similar to many existing social learning the visualization for the instructor to explore the discussion analytics tools such as SNAPP [9], Threadz,4 and CavanNet [3]. network. The tool was chosen for this study because it represented a more recent design effort, has a fairly polished design, and has beenin- In comparison with earlier social learning analytics applications, formed by participatory design workshops attended by students at the Yellowdig Visualization Tool is novel in several ways. First of all, in comparison with one-mode networks adopted in most social net- 2See https://yellowdig.com/ 3See https://www.teachx.northwestern.edu/projects2016/gruber/ work analytics, the Yellowdig Visualization Tool turns discussion 4https://threadz.ewu.edu/ data into a multi-mode, multiplex, temporal network that provides LAK19, March 4–8, 2019, Tempe, AZ, USA Chen & Zhu

3.3 Findings 3.3.1 Stakeholders. In this context, the direct stakeholders included students and instructors accessing the Yellowdig Visualization Tool. It was expected that the tool would assist instructors in analyzing the data generated on Yellowdig to inform their instructional deci- sions. The tool design also aimed to enhance students’ mastery of the course content and provide them with another way of engaging with the course material and with each other (see project website). However, the adoption of such a learning analytics tool is often decided by the instructor and its acceptance among students may vary. Therefore, students may be further divided into those who choose to use the tool and those who reject the tool for various reasons. Because Yellowdig is a private social media platform, the visu- alization tool is based on data generated by the class and is not accessible beyond the class. The indirect stakeholders of this tool may include future cohorts of the class whose discussion perfor- mance may be compared (explicitly or implicitly) with the present Figure 1: Yellowdig Visualization Tool. (Note: Used with per- cohort. Academic technology staff members are also indirect stake- mission of the Yellowdig Visualization development team at holders as they are charged with supporting technology use and Northwestern University.) may be asked to provide additional technical support for this tool. Student support services, such as academic advisers and academic analytics teams, are also indirect stakeholders. For example, an aca- a more holistic picture of student discussions. Its node highlighting demic adviser may be contacted by the instructor when she finds feature allows students and the instructor to inspect local network from the tool that a student has been inactive for a few weeks. structures, and its filtering mechanisms support the instructor to 3.3.2 Values and tensions. The Yellowdig Visualization Tool aimed quickly identify discussion trends. To a great extent, the visual- at providing another way of engaging with the class discussion, ization tool provides a unique view of the Yellowdig discussion, enhancing student learning and social interaction, and assisting the which can facilitate sense-making activities by its users and serve instructor to grasp discussion content in order to make informed assessment needs of the instructor. instructional decisions. However, through the conceptual investiga- tion we found these values supported by the tool can be in tension 3.2 Methods with other values. To study values in such a nascent learning analytics tool, we con- For such a learning analytics tool that can be used for assess- ducted a conceptual investigation [8] to identify its stakeholders ment purposes, student autonomy is an important value that is in and elicit values and value tensions among identified stakeholders. tension with the tool’s utility for assessment. The network visu- This conceptual investigation had two components: alization and node highlighting features of the tool make it easy to identify posts and comments made by each individual student. (1) . The first component was to identify Stakeholder analysis Even though such information is also accessible via the original the direct and indirect stakeholders of this tool. According Yellowdig discussion interface, the visualization tool makes it more to [15], this analysis includes individuals who are in direct readily accessible to all members of the class. These design features contact with this tool ( ) and those who direct stakeholders are meant to support the values of student success, accountabil- may not interact with the tool directly but are likely to be ity, discussion engagement, and usability, but they can also impacted by tool usage ( ). indirect stakeholders press students to participate more actively and hereby hinder their (2) . The second component was to identify and Value analysis autonomy as discussion participants. define the benefits, harms, and values implicated bythis In a same vein, because of the aggregation of student participa- learning analytics tool and to illuminate value tensions em- tion data, student privacy as another important value is also at bedded in the tool and its larger education systems. Values at tension with these intended benefits. Similarly, even though the stake in this analysis include generic human values that rep- visualization tool does not collect new data from students, it reveals resent “what is important to people in their lives" [15, p. 68], a bird-eye view of each student’s participation patterns and could as well as values endorsed by the educational literature. appear intruding to some students who are less comfortable with This conceptual investigation was grounded in our knowledge of such aggregated reporting. higher-education classes in similar settings and aided by technical Social well-being, a sense of belonging and social inclusion, is inspection of this Yellowdig Visualization Tool. It is expected that another important value to be considered in this context. Students through this study we could come up with suggestions for consid- in the class—using the tool or not—can reasonably expect to be ering identified values and design trade-offs in future iterations of free from embarrassment caused by less active participation or less this particular tool as well as other similar applications. central status in the network. The force-directed layout adopted to Towards Value-Sensitive Learning Analytics Design LAK19, March 4–8, 2019, Tempe, AZ, USA generate the network visualization pushes less active participants tool. One premise of discussing the implications is that the accom- to the edge and make them more identifiable by members of the modation of these values should not be solely achieved through class. This design is intuitively understandable, is broadly applied technology design, but also the design of pedagogical interven- in network visualization, and supports pedagogical decisions by tions [41]. Nonetheless, based on findings from the value analysis, revealing to the instructor students “at risk” of dropping out of we suggest the following possible design trade-offs that could be Yellowdig discussion. Nonetheless, students on the edge may feel considered in future technology design: embarrassed among peers should their identities be revealed. This • Considering the tension between utility (e.g. revealing a layout can potentially contribute to the “rich-club” phenomenon student’s participation) and the values of students’ auton- in online discussions [36], potentially channeling more attention omy, privacy, social well-being, and self-image, one de- to students in the center and leading to the exclusion of less active sign idea is to give each student the option of not revealing students in future interactions. his/her name to their peers in the network visualization. In There are also values that are implicated by algorithmic decisions this way, students can choose to become less identifiable in embedded in the visualization tool, including freedom from bias the tool. Another possible idea is to make student nodes non- and self-image. For instance, the force-directed layout algorithm clickable so that the node highlighting feature is only limited inherently boost the status of students who are more connected to post and comment nodes. Both design ideas will sacrifice in terms of connection volumes and do not consider the quality of utility of the visualization tool to support other values such their posts. Displaying the thumbnails of discussion posts renders as privacy and social well-being. posts with multimedia objects more attractive in comparison with • To support the values of freedom from bias and self-image, posts with pure text, and can hereby direct varied attention to posts future implementation of the network visualization could with and without multimedia objects. Also notable is the choice consider adjusting parameters of the force-directed layout of scaling posts based on the number of comments, which directs or consider other graph drawing algorithms (such as the cir- more attention to posts that have already attracted more attention. cular, arc diagram, hive plot, and hierarchical graph layouts). These differentiated treatments of discussion posts embedded in the Here, design trade-offs are likely to involve interpretability algorithms are linked to students’ self-image. When a student’s of a network visualization and its complications concerning node gets highlighted in the network visualization, the highlighted student status in the given sociocultural context. Empirical ego network essentially represents what the student has produced investigations of student perceptions of different layouts and how his/her participation is linked with peers in the community. could be necessary. Being able to see all connections of a student at once can benefit • Considering the values of fairness, epistemic agency, and the sense of community especially when a student see herself ease of information seeking, the filtering mechanisms well connected. However, it may also pose questions to a student’s provided to the instructor could be also considered for stu- self-image if she is less connected with peers in the community. dent access. Given the tension between privacy and ease The visualization tool was designed to provide another way of information seeking, filtering by student names could to make sense of and engage in online discussions on Yellowdig. still be reserved to the instructor, while the other two filter- Values facilitated by the tool include utility, usability, and ease of ing mechanisms (i.e. by tags or named entities) can be made information seeking in Yellowdig. In particular, the tool provides available for students. views of both the overall discussion structure and micro, local The design ideas mentioned above are presented here to illustrate interaction patterns that are not revealed by the Yellowdig interface. the promise of Value Sensitive Design for the particular context, The instructor view provides rich filtering mechanisms that are instead of disproving the current visualization tool that is quite useful for filtering discussion posts by students, tags, and/or named well-designed. These design ideas are tentative and by no means entities. Being able to look at entity names, such as in a comprehensive. Further empirical and technical investigations are business class, allows the instructor to quickly identify popular necessary in order to advance these design ideas. topics in student discussions and make pedagogical decisions on, for example, whether to address a specific topic in class. However, the 4 STUDY II: A HOLISTIC APPLICATION OF fact that these advanced features are reserved for the instructor is VALUE SENSITIVE DESIGN TO AN at tension with the values of epistemic agency and fairness. One can argue that the advanced features can equally benefit students, ALGORITHMIC SYSTEM OF WIKIPROJECTS who also need to complete information seeking tasks to effectively 4.1 Study Context participate in Yellowdig discussions. Nevertheless, one may also While the first study focuses on social learning in formal classes, the argue that providing these additional features places unnecessary second study was situated in the informal context of Wikipedia edit- cognitive overload and burden on students. ing. Specifically, this study was concerned with designing recom- mendation algorithms for group formation in Wikipedia’s WikiPro- jects. A WikiProject organizes “a group of participants... to achieve 3.4 Discussion and Implications specific editing goals, or to achieve goals relating to a specific field With aid from technical analysis of the Yellowdig Visualization Tool, of knowledge.”5 Editing a Wikipedia article is itself a learning and the conceptual investigation has revealed a number of values and knowledge-creation process. Learning scientists have recognized value tensions that have implications for future refinements of the 5See https://en.wikipedia.org/wiki/WikiProject LAK19, March 4–8, 2019, Tempe, AZ, USA Chen & Zhu mass collaboration, on Wikipedia for instance, as an emerging par- tasks are also explored in the MOOC (massive open online course) adigm of education [5]. Computer-supported collaborative learn- and collaborative problem-solving contexts [10, 30]. In contrast ing (CSCL) researchers have studied the social process of building with Study I, this study was focused on the recommendation algo- collective knowledge on Wikipedia, leading to findings about the rithm, a key component of a learning analytics system [2]. By doing intricate relations between contributors and Wikipedia articles [19]. so, we hope to illustrate the possibility of attending values when Indeed, coordinated editing in a WikiProject involves a great deal building an algorithmic system’s sub-component that is crucial of sense-making, negotiation, synthesis, and knowledge transfor- but not observable from the user interface. Below, we explain the mation. For this reason, since its inception Wiki as a Web 2.0 tech- process of applying Value Sensitive Design to the development of nology has been widely adopted in formal classrooms to support the recommendation engine, and discuss the implications for the sophisticated collaborative learning experiences [18]. Therefore, design of learning analytics systems. facilitating knowledge processes on the Wiki technology—and on Wikipedia in particular—is of interest to learning analytics because 4.2 Methods Wikipedia editing entails learning, and intelligent supports built 4.2.1 Study Context. Retaining and socializing newcomers is a for Wikipedia are transferable to Wiki-based learning scenarios. crucial challenge for many volunteer-based knowledge creation Prior work has illuminated that Wikipedia, like other social me- community such as Wikipedia. The number of active contributors dia environments, now heavily depends on data-driven algorithmic in Wikipedia has been declining steadily since 2007, due at least in systems to support its massive-scale collaboration and governance part to a sharp decline in the retention of new volunteer editors [17]. On Wikipedia, algorithmic systems have been used to support [20]. WikiProjects, which are groups of contributors who work to- a variety of critical tasks such as counter-vandalism, task routing, gether to improve Wikipedia, serve as important socialization hubs and the Wikipedia education program. Many of these existing algo- within the Wikipedia community. Prior work [14, 42] suggests that rithmic systems are driven by “Big Data” and supervised machine WikiProjects provide three valuable support mechanisms for new learning algorithms. Take predicting vandalism for example. The members: (1) enabling them to locate suitable assistance and expert first step of the task is to define a prediction target, i.e. amalicious collaborators; (2) guiding and structuring their participation by edit. The second step in the process is to use historical data, often organizing tasks, to-do lists, and initiatives such as “Collaborations in large volumes, to train and validate machine learning models. of the Week;” and (3) offering new editors “protection” for their Finally, the validated models are applied to new edits in order to gen- work, by shielding it from unwarranted reverts and “edit wars.” erate predictive scores of being malicious edits. Such algorithmic However, recommending new editors to WikiProjects is not a systems play important roles in the Wikipedia ecosystem. trivial task. In the English version of Wikipedia alone there are However, the data-driven approach can lead to unintended bi- currently about 2,000 WikiProjects, with on average 38,628 new ases in algorithms and potentially negative impacts on the editor users/editors registered to participate each month. The goal of community. It is recognized that the data-driven approach relies the case study was to create algorithmic tools to recommend new largely on historical human judgments, which are subject to his- Wikipedia editors to existing WikiProjects. torical stereotypes, discrimination, and prejudices. Using historical data to inform the future runs the risk of reinforcing and repeating 4.2.2 Value Sensitive Design Process. Drawing on the principles historical mistakes and thus fails to capture and incorporate human of Value Sensitive Design [15], we derived the following five-stage insights on how the system can be improved for the future. Recent process to proactively accommodate values in the WikiProjects ethnographic work of Wikipedia has revealed critical issues with recommendation system. current algorithmic systems and calls for the study of sociocul- (1) Review literature and conduct empirical studies to under- tural processes, such as newcomer socialization, that are strongly stand relevant stakeholders’ values, motivations and goals, impacted by the automated systems [17]. and identify potential trade-offs. Researchers have called for the development of systematic meth- (2) Based on the results of the first step, identify algorithmic ods to detect and mitigate biases in existing algorithmic systems approaches and create prototype implementations. [31]. In contrast with a reactive approach that attempts to address (3) Engage and work closely with the stakeholders to identify bias when it gets exposed, we argue for an approach that would viable means to deploy the developed algorithms, recruit proactively consider human values throughout the process of al- participants, and gather user feedback. gorithm design in a principled and comprehensive manner. One (4) Deploy the algorithms, gather stakeholder feedback, and possible way of achieving this goal is to apply principles of Value refine and iterate as necessary. Sensitive Design [15] to the creation of each algorithmic system. (5) Evaluate stakeholders’ acceptance, algorithm accuracy, and This holistic approach, explicated in great detail elsewhere [43], impacts of the algorithms on relevant stakeholders. engages stakeholders in the early stages of algorithm development and incorporates stakeholders’ tacit knowledge and insights into In contrast with Study I that was focused on a conceptual investi- the creation of an algorithm system. By doing this, it is hoped we gation, this five-stage process incorporates all three types of Value could mitigate biases embedded in design choices and hence avoid Sensitive Design methods—conceptual, empirical, and technical— undermining important stakeholder values. to synergistically address human values. In the proposed process, In this section, we briefly introduce a case of applying Value each component itself should be familiar to HCI scholars. However, Sensitive Design to designing a recommendation engine for group the derived process represents an attempt to put multiple Values formation in Wikipedia’s WikiProjects. Similar recommendation Sensitive Design methods into action to collectively consider values. Towards Value-Sensitive Learning Analytics Design LAK19, March 4–8, 2019, Tempe, AZ, USA

Figure 2: The interface of the prototype newcomer recommendations for WikiProject organizers.

4.3 Results the editor’s interests and the project topic. To this end, we cre- Following the described five-stage process, we have developed the ated four different recommendation algorithms—two interest-based algorithmic system to recommend new Wikipedia contributors to algorithms and two relationship-based algorithms. existing Wikipedia projects. • Interest-based algorithms rank candidate editors based on how closely their editing history matches the topic of a 4.3.1 Step 1: Review literature and conduct empirical studies to WikiProject. Two types of interest-based algorithms were understand stakeholders’ values and goals of importance to them, considered. A rule-based algorithm ranks the match of an and potential trade-offs. To achieve the design goal, we reviewed editor to a WikiProject by counting the number of edits prior research and conducted a survey study with 59 members by that editor to articles within the scope of the project. from 17 WikiProjects in order to answer the following questions: A category-based algorithm ranks editors by computing a (1) Which factors would motivate new editors to participate in similarity score between an editor’s edit history and the a WikiProject? (2) Which factors would motivate WikiProjects topic of a WikiProject. organizers to recruit new members? (3) What collective outcomes • Relationship-based algorithms rank candidate editors based are important for WikiProjects and for Wikipedia in general? on relationships with current members of a WikiProject. Findings from the survey study included: Two types of relationship-based algorithms were developed. • Both newcomers and WikiProject organizers valued the in- A bonds-based algorithm ranks editors by the strength of terest match (i.e. the match between the editor’s interests “social connections” the editor has to current members of a and the project topic) and personal connections (i.e. the WikiProject. A co-edit-based algorithm is a version of collab- relationships between newcomers and current members of orative filtering and is inspired by the design of SuggestBot the community). [4]. Candidate editors are ranked by the similarity of their • WikiProject organizers valued productivity and wanted to edit histories to the edit histories of current members of a recruit newcomers who could produce high-quality work; WikiProject. • Although WikiProjects organizers valued editors’ prior ex- Second, the recommendation algorithms needed to balance the perience and were more interested in recruiting newcomers interests of WikiProjects and the collective goal of Wikipedia com- who already had some editing experience in Wikipedia, the munity by targeting both experienced editors and “-new” Wikipeida community as a whole was more concerned with editors. Specifically, the candidate editors included both the editors socializing “brand-new” editors. who were new to Wikipedia and editors who were moderately ex- • WikiProject organizers hoped to remain control of the re- perienced (but not highly experienced). This design instantiated cruitment process. Specifically, they objected to the idea of the tension we identified previously. Moreover, we also ranked the complete automation, i.e. to automatically identify and invite two types of newcomers separately, so that experienced ones did newcomers deemed relevant by a recommendation system. not overshadow the inexperienced ones. They wanted to have the final say on whom to be invited to Finally, the recommendation algorithms should satisfy the goals their projects. of WikiProjects and their organizers by excluding editors likely to produce low-quality work. 4.3.2 Step 2: Identify algorithmic approaches and develop system With these developed algorithms, we further designed a user prototypes. Drawing on the results of the first step, the team devel- interface to present top recommendations from each of the four oped recommendation algorithms that would meet the following algorithms to WikiProject organizers (see Figure 2). The decision criteria that were informed by identified stakeholders’ values. of presenting multiple results was based on a finding from Step First, the recommendation algorithms needed to satisfy the goals 1 that some project organizers strongly objected to the idea of of new editors and WikiProjects by considering the match between complete automation and wished to “remain in the loop” of inviting LAK19, March 4–8, 2019, Tempe, AZ, USA Chen & Zhu newcomers. With the prototype interface, WikiProject organizers that were category-based, bonds-based and co-edit based. could review the recommendations and decide who they would like The average rating for the rule-based algorithm was 3.24 in a to invite to their projects [43]. As such, their desire to remain in 5-point scale, significantly higher than the other three types the loop was accommodated. of algorithms (t = 3.51, p <.001). The invitation rate (which is analogous to the click-through rate) for the rule-based algo- 4.3.3 Step 3: Engage and work closely with the community. On- rithm was 47%, which was also higher than category-based line communities like Wikipedia constitute a rich laboratory for (16%), bonds-based (22%), and co-edit based (28%). There was research. Digital traces of collaboration, social interaction, and pro- no significant difference among the other three algorithms. duction are made visible, offering rich opportunities for empirical We also found that inexperienced editors and experienced ed- studies and experimentation. However, there is an unfortunate tra- itors were equally invited by the project organizers through dition of some researchers treating such communities merely as the recommendation system. data sources for research instead of “living organisms” with their (3) Assessing impacts. Our evaluation of the impacts on the own values, goals, and norms. Research studies of Wikipedia often community sought to understand what happened to new- encounter resistance from Wikipedia editors because community comers if WikiProject organizers invited them to the projects. values are sometimes violated by research activities. Initial results suggested that: (1) Only experienced editors To avoid these problems, and to ensure community values are who received invitations from project organizers had a sig- well attended to, the team worked with stakeholders to develop a 6 nificant increase in their within-project activities over the research protocol that was acceptable to the community. Essen- baseline group composed of equally competent editors who tially, we developed our research plan on the Wikipedia platform. did not receive invitations; and (2) the increase in the in- In other words, we were developing and deploying our algorithms vited experienced editors’ contributions did not come at the not just for, but with, the Wikipedia community. cost of fewer edits of Wikipedia articles beyond the joined 4.3.4 Step 4: Deploy the algorithms and iteratively refine them. We WikiProject [see 43]. iteratively tested and refined our algorithms. Over a duration ofsix months we sent weekly batches of recommendations to our pilot 4.4 Discussion and Implications participants, conducted short surveys to seek their views on these 4.4.1 Challenges and Future Directions. While this WikiProjects recommendations, and also interviewed project organizers and case demonstrated the promise of Value Sensitive Design and the newcomers. We used their feedback to make significant changes five-stage holistic approach, it also helped reveal several challenges. to the algorithms. Example revisions included refining algorithm First, it is challenging to explain the algorithms in ways that enable explanations and boosting the threshold of the matching algorithms. stakeholders to assess them and to provide sensible feedback to In particular, candidate editors will only be recommended by the improve them. This constitutes a barrier to full stakeholder engage- rule-based matching algorithm if they have made minimally five ment in the design process. One future direction is to explore the edits on project-related articles in the previous month. effectiveness and trade-offs of different strategies and interfaces to help non-expert stakeholders understand algorithmic decision- 4.3.5 Step 5: Evaluate algorithms based on stakeholder acceptance, making. algorithm accuracy, and impacts of the algorithm on the community. Second, there is a lack of holistic understanding of the mapping We conducted qualitative and quantitative studies to systematically between a wide range of human values and a variety of algorithmic examine the acceptance, accuracy, and impacts of the algorithms. design options. One promising direction for future work is to design The goal was to fully understand the influence of the algorithmic ways to translate human values into different algorithmic choices, tool on the community and stakeholders, and to monitor any unin- covering areas such as data pre-processing, model regularization tended consequences. during the learning process, model post-processing, and model Results from our initial evaluation included: selection. (1) Assessing acceptance among stakeholders. We conducted interviews with community stakeholders, including new- 4.4.2 Implications for Learning Analytics System Design. This study comers and WikiProject organizers who used our recom- has a number of implications for learning analytics design. First, the mendation systems. The feedback was positive. This gave designed algorithm products can be transferred to other learning us confidence that our system was acceptable to the stake- contexts. For instance, given the increasing popularity of virtual holders. For example, one organizer wrote: “This puts some collaboration in MOOCs [39], we could design similar recommen- science behind recommendations, and will be a great supple- dation algorithms to support the formation of productive virtual ment to the current processes.” Editors who were invited to learning groups in MOOCs. We have not seen work in this area join projects also reacted positively. For example, an editor attending to learner values yet, and the Value Sensitive Design who was invited to join WikiProject Africa wrote: “Thank methodology applied to the Wikipedia context could contribute to you for reaching out to me and thank you for informing me settings like MOOCs. about the WikiProject Africa ... I appreciate it.” Second, the five-stage design process is directly applicable toa (2) Assessing accuracy. Organizers rated the rule-based algo- learning analytics design project. Even though the study was pri- rithm higher compared to the other three types of algorithms marily focused on algorithms, the Value Sensitive Design process introduced in this study can be applied to the design of a learning 6See https://meta.wikimedia.org/wiki/Research:WikiProject_Recommendation analytics system, of which algorithm is an important component Towards Value-Sensitive Learning Analytics Design LAK19, March 4–8, 2019, Tempe, AZ, USA

[2]. While we could embrace the process when choosing or devel- emerged to advance this work. Besides the Privacy by Design ap- oping network visualization layout algorithms for a social network proach mentioned above [37], the Connected Intelligence Centre analytics tool, we can also apply the process to other parts of the at the University of Technology Sydney is working on an Ethical analytics system. For example, as demonstrated in this case study, Design Critique (EDC) approach to involve stakeholders to consider the consideration of human values is critical for the process of ethical and risk factors involved in new learning analytics tools. communicating analytics results to stakeholders. Other aspects of Even though this approach does not directly apply Value Sensitive the system, such as the technological infrastructure used to store Design, it considers competing values identified by different stake- and transit learning data can also benefit from the Value Sensitive holders and aims to reach resolutions. One future direction can be Design approach. purposefully exploring synergies among algorithm auditing, Pri- vacy by Design, Ethical Design Critique, and Value Sensitive Design 5 GENERAL DISCUSSION AND to comprehensively consider human values in learning analytics. CONCLUSIONS REFERENCES To support ethical considerations and system integrity in learning [1] ACM US Public Policy Council. 2017. Statement on Algorithmic Transparency analytics, this paper introduces the application of Value Sensitive and Accountability. Online. Design in learning analytics design processes. The first study was [2] Simon Buckingham Shum. 2017. Black box learning analytics? Beyond algorith- mic transparency. Keynote presented at the Learning Analytics Summer Institute, a conceptual investigation of an existing learning analytics tool. Ann Arbor, MI. Applying Value Sensitive Design methods—specifically stakeholder [3] Bodong Chen, Yu-Hui Chang, Fan Ouyang, and Wanying Zhou. 2018. Fostering analysis and value analysis—we uncovered a number of values student engagement in online discussion through social learning analytics. The Internet and Higher Education 37 (2018), 21–30. https://doi.org/10.1016/j.iheduc. (e.g. student autonomy, self-image) and value tensions with the 2017.12.002 studied tool. Based on these analyses, we proposed design trade- [4] Dan Cosley, Dan Frankowski, Loren Terveen, and John Riedl. 2007. SuggestBot: Using Intelligent Task Routing to Help People Find Work in Wikipedia. In Pro- offs that are subject to further empirical, technical, and conceptual ceedings of the 12th International Conference on Intelligent User Interfaces (IUI ’07). investigations. ACM, New York, NY, USA, 32–41. https://doi.org/10.1145/1216295.1216309 The second study went a step further. It proposed a holistic pro- [5] Ulrike Cress, Johannes Moskaliuk, and Heisawn Jeong. 2016. Mass Collaboration and Education. Springer. https://doi.org/10.1007/978-3-319-13536-6 cess of conducting Value Sensitive Design and demonstrated its [6] Betul C Czerkawski. 2016. Networked learning: design considerations for online application to building recommendation systems for WikiProjects— instructors. Interactive Learning Environments 24, 8 (Nov. 2016), 1850–1863. https://doi.org/10.1080/10494820.2015.1057744 an informal, mass collaboration scenario. The design process in- [7] Alexei Czeskis, Ivayla Dermendjieva, Hussein Yapit, Alan Borning, Batya Fried- cludes literature analysis, empirical investigations, prototype devel- man, Brian Gill, and Tadayoshi Kohno. 2010. Parenting from the pocket: value opment, community engagement, iterative testing and refinement, tensions and technical directions for secure and private parent-teen mobile safety. In Proceedings of the Sixth Symposium on Usable Privacy and Security. ACM, New and continuous evaluation. Even though this study was focused on York, NY, USA, 15. https://doi.org/10.1145/1837110.1837130 algorithms, the process can be readily applied to the design of a [8] Janet Davis and Lisa P Nathan. 2013. Value Sensitive Design: Applications, whole learning analytics system. Adaptations, and Critiques. In Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains, Jeroen van den Hoven, Overall, this paper is motivated by a need of practical means to Pieter E Vermaas, and Ibo van de Poel (Eds.). Springer Netherlands, Dordrecht, support ethical considerations and human values in learning ana- 1–26. https://doi.org/10.1007/978-94-007-6994-6_3-1 [9] Shane Dawson, Aneesha Bakharia, and Elizabeth Heathcote. 2010. SNAPP: lytics system. Learning analytics operates in the “middle ” be- Realising the affordances of real-time SNA within networked learning environ- tween learning and analytics and needs to consider a wide range of ments. In Proceedings of the 7th International Conference on Networked Learning, factors including learning theory, data integrity, human–computer Lone Dirckinck-Holmfeld, Vivien Hodgson, Chris Jones, Maarten de Laat, David McConnell, and Thomas Ryberg (Eds.). Lancaster University, Lancaster, UK, interaction, and sociocultural factors [35]. As a result, a wide range 125–133. of values, which go beyond ethical concerns and are often compet- [10] Nia M M Dowell, Tristan M Nixon, and Arthur C Graesser. 2018. Group commu- ing with each other, need to be considered. nication analysis: A computational linguistics approach for detecting sociocog- nitive roles in multiparty interactions. Behavior research methods (Aug. 2018). Value Sensitive Design is a promising approach to promoting https://doi.org/10.3758/s13428-018-1102-z ethical practice in learning analytics. Preliminary work in this area [11] Hendrik Drachsler and Wolfgang Greller. 2016. Privacy and analytics: It’s a has already been reported by colleagues [37], and more systematic DELICATE issue a checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge. ACM, New push for the application of Value Sensitive Design in learning ana- York, NY, USA, 89–98. https://doi.org/10.1145/2883851.2883893 lytics is needed. Two studies reported in this paper mitigate the gap. [12] Hendrik Drachsler, Tore Hoel, Adam Cooper, Gábor Kismihók, Alan Berg, Maren Scheffel, Weiqin Chen, and Rebecca Ferguson. 2016. Ethical and Privacy Issues They demonstrate two ways of adopting Value Sensitive Design in the Design of Learning Analytics Applications. In Proceedings of the Sixth in learning analytics projects: one was a conceptual investigation International Conference on Learning Analytics & Knowledge (LAK ’16). ACM, (a “lite” version), whereas the other was a detailed design process New York, NY, USA, 492–493. https://doi.org/10.1145/2883851.2883933 [13] Rebecca Ferguson, Tore Hoel, Maren Scheffel, and Hendrik Drachsler. 2016. Guest comprising multiple Value Sensitive Design methods (a “holistic” Editorial: Ethics and Privacy in Learning Analytics. Journal of Learning Analytics version). It is hoped both versions of using Value Sensitive Design 3, 1 (2016), 5–15. https://doi.org/10.18608/jla.2016.31.2 [14] Andrea Forte, Niki Kittur, Vanessa Larco, Haiyi Zhu, Amy Bruckman, and Robert E can strengthen future design of learning analytics systems. Kraut. 2012. Coordination and beyond: social functions of groups in open content To improve learning analytics systems (esp. its ethical dimen- production. In Proceedings of the ACM 2012 conference on Computer Supported sion), algorithmic transparency and accountability are not enough; Cooperative Work. ACM, 417–426. [15] Batya Friedman, David G Hendry, and Alan Borning. 2017. A Survey of Value Sen- rather, we should consider the system’s integrity holistically [2]. sitive Design Methods. Foundations and Trends® in Human–Computer Interaction This paper is only an initial step towards this direction. Much work 11, 2 (2017), 63–125. https://doi.org/10.1561/1100000015 remains to be done to infuse Value Sensitive Design more broadly [16] Batya Friedman, Peter H Kahn, Alan Borning, and Alina Huldtgren. 2013. Value Sensitive Design and Information Systems. In Early engagement and new tech- in the field of learning analytics. Interestingly, several projects have nologies: Opening up the laboratory, Neelke Doorn, Daan Schuurbiers, Ibo van de LAK19, March 4–8, 2019, Tempe, AZ, USA Chen & Zhu

Poel, and Michael E Gorman (Eds.). Springer Netherlands, Dordrecht, 55–95. American Society of Civil Engineers, Reston, VA, 296–308. https://doi.org/10. https://doi.org/10.1007/978-94-007-7844-3_4 1061/9780784413210.026 [17] R Stuart Geiger. 2017. Beyond opening up the black box: Investigating the role [39] Miaomiao Wen, Diyi Yang, and Carolyn Penstein Rosé. 2015. Virtual Teams of algorithmic systems in Wikipedian organizational culture. Big Data & Society in Massive Open Online Courses. In in Education (Lecture 4, 2 (Sept. 2017), 2053951717730735. https://doi.org/10.1177/2053951717730735 Notes in Computer Science). Springer, Cham, 820–824. https://doi.org/10.1007/ [18] Christine Greenhow and Emilia Askari. 2017. Learning and teaching with social 978-3-319-19773-9_124 network sites: A decade of research in K-12 related education. Education and [40] James E Willis, Sharon Slade, and Paul Prinsloo. 2016. Ethical oversight of student Information Technologies 22, 2 (March 2017), 623–645. https://doi.org/10.1007/ data in learning analytics: A typology derived from a cross-continental, cross- s10639-015-9446-9 institutional perspective. Educational Technology Research and Development 64, 5 [19] Iassen Halatchliyski, Johannes Moskaliuk, Joachim Kimmerle, and Ulrike Cress. (Oct. 2016), 881–901. https://doi.org/10.1007/s11423-016-9463-4 2014. Explaining authors’ contribution to pivotal artifacts during mass collab- [41] Alyssa Friend Wise and J Vytasek. 2017. Learning analytics implementation oration in the Wikipedia’s knowledge base. International Journal of Computer- design. In Handbook of learning analytics, Charles Lang, George Siemens, Supported Collaborative Learning 9, 1 (2014), 97–115. Alyssa Friend Wise, and Dragan Gašević (Eds.). Society for Learning Analytics [20] Aaron Halfaker, R Stuart Geiger, Jonathan T Morgan, and John Riedl. 2013. The Research, 151–160. Rise and Decline of an Open Collaboration System: How Wikipedia’s Reaction [42] Haiyi Zhu, Robert Kraut, and Aniket Kittur. 2012. Organizing without formal to Popularity Is Causing Its Decline. The American behavioral scientist 57, 5 (May organization: group identification, goal setting and social modeling in direct- 2013), 664–688. https://doi.org/10.1177/0002764212469365 ing online production. In Proceedings of the ACM 2012 conference on Computer [21] Madeline Huberth, Patricia Chen, Jared Tritz, and Timothy A McKay. 2015. Supported Cooperative Work. ACM, 935–944. Computer-tailored student support in introductory physics. PloS one 10, 9 (Sept. [43] Haiyi Zhu, Bowen Yu, Aaron Halfaker, and Loren Terveen. 2018. Value-Sensitive 2015), e0137001. https://doi.org/10.1371/journal.pone.0137001 Algorithm Design: Method, Case Study, and Lessons. Proceedings of the ACM [22] Dirk Ifenthaler and Monica W Tracey. 2016. Exploring the relationship of ethics on Human-Computer Interaction 2, CSCW (2018), 194. https://doi.org/10.1145/ and privacy in learning analytics and design: implications for the field of educa- 3274463 tional technology. Educational technology research and development: ETR & D 64, 5 (Oct. 2016), 877–880. https://doi.org/10.1007/s11423-016-9480-3 [23] Charles Lang, Leah P Macfadyen, Sharon Slade, Paul Prinsloo, and Niall Sclater. 2018. The Complexities of Developing a Personal Code of Ethics for Learning Analytics Practitioners: Implications for Institutions and the Field. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK ’18). ACM, New York, NY, USA, 436–440. https://doi.org/10.1145/3170358.3170396 [24] P Long and G Siemens. 2011. Penetrating the Fog: Analytics in Learning and Education. Educause Review 46, 5 (2011), 30–32. [25] Paul Prinsloo and Sharon Slade. 2016. Student Vulnerability, Agency and Learning Analytics: An Exploration. Journal of Learning Analytics 3, 1 (2016), 159–182. https://doi.org/10.18608/jla.2016.31.10 [26] Paul Prinsloo and Sharon Slade. 2017. An Elephant in the Learning Analytics Room: The Obligation to Act. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK ’17). ACM, New York, NY, USA, 46–55. https://doi.org/10.1145/3027385.3027406 [27] Paul Prinsloo and Sharon Slade. 2017. Ethics and Learning Analytics: Charting the (Un)Charted. In Handbook of Learning Analytics (first ed.), Charles Lang, George Siemens, Alyssa Wise, and Dragan Gasevic (Eds.). Society for Learning Analytics Research (SoLAR), 49–57. https://doi.org/10.18608/hla17.004 [28] Priscilla M Regan. 1995. Legislating Privacy: Technology, Social Values, and Public Policy. University of North Carolina Press, Chapel Hill, NC. [29] María Jesús Rodríguez-Triana, Alejandra Martínez-Monés, and Sara Villagrá- Sobrino. 2016. Learning Analytics in Small-Scale Teacher-Led Innovations: Ethical and Data Privacy Issues. Journal of Learning Analytics 3, 1 (2016), 43–65. [30] Carolyn Penstein Rosé, Oliver Ferschke, Gaurav Tomar, Diyi Yang, Iris Howley, Vincent Aleven, George Siemens, Matthew Crosslin, Dragan Gasevic, and Ryan Baker. 2015. Challenges and opportunities of dual-layer MOOCs: Reflections from an edX deployment study. In Proceedings of the 11th International Confer- ence on Computer Supported Collaborative Learning. (CSCL 2015). Vol. 2, Vol. 15. International Society of the Learning Sciences, Gothenburg, Sweden, 848–851. [31] Christian Sandvig, Kevin Hamilton, Karrie Karahalios, and Cedric Langbort. 2014. Auditing algorithms: Research methods for detecting discrimination on internet platforms. Data and discrimination: converting critical concerns into produc- tive inquiry, a pre-conference at the 64th Annual Meeting of the International Communication Association. [32] Niall Sclater. 2016. Developing a Code of Practice for Learning Analytics. Journal of Learning Analytics 3, 1 (2016), 16–42. https://doi.org/10.18608/jla.2016.31.3 [33] George Siemens. 2013. Learning analytics: The emergence of a discipline. The American behavioral scientist 57, 10 (2013), 1380–1400. [34] Sharon Slade and Paul Prinsloo. 2013. Learning analytics: Ethical issues and dilemmas. The American behavioral scientist 57, 10 (Oct. 2013), 1510–1529. https: //doi.org/10.1177/0002764213479366 [35] Dan Suthers and Katrien Verbert. 2013. Learning analytics as a “middle space”. In Proceedings of the Third International Conference on Learning Analytics and Knowledge - LAK ’13. ACM Press, New York, New York, USA, 1–4. https://doi. org/10.1145/2460296.2460298 [36] Luis M Vaquero and Manuel Cebrian. 2013. The rich club phenomenon in the classroom. Scientific reports 3 (2013), 1–8. https://doi.org/10.1038/srep01174 [37] Josine Verhagen, Lucie Dalibert, Federica Lucivero, Tjerk Timan, and Larissa Co. 2016. Designing values in an adaptive learning platform. In 2nd workshop on Ethics & Privacy in Learning Analytic at the 6th International Learning Analytics & Knowledge Conference. CEUR Proceedings, Edinburgh, UK, 1–5. [38] Kari Edison Watkins, Brian Ferris, Yegor Malinovskiy, and Alan Borning. 2013. Beyond Context-Sensitive Solutions: Using Value-Sensitive Design to Identify Needed Transit Information Tools. In Urban Public Transportation Systems 2013.