

Personalized Learning Technology and the New Behaviorism: Beyond Freedom and Dignity1

In 2016, Facebook founder Mark Zuckerberg announced that he would be joining the National

Science Foundation and the Gates Foundation in spending hundreds of millions of dollars to support the development of technologies for “personalized learning.” These technologies promise to provide learners or users with interactive content that is tailored with precision to their preferences and previous behaviors—much like Facebook’s newsfeed does. And also like

Facebook, they accomplish this by using “big data” and “analytics”: recordings of the clicking, scrolling and typing of many thousands of users. However, analytics experts, sociologists of technology and, most recently, several former Facebook and Google executives understand these techniques as ultimately representing a sophisticated and highly effective form of Skinnerian operant conditioning—one that customizes Skinner’s “schedules of reinforcement” precisely to learners’ behavioral patterns. This paper first explains how this “new behaviorism” is instantiated in personalized learning design and technology, and then argues that such an approach is antithetical to the most basic priorities and purposes of education: namely, to cultivate in students a sense of ownership of their own learning, and responsibility for their own behavior and its effects on others.

Introduction

Psychologist B.F. Skinner (1904-1990), inventor of “radical behaviorism,” the “operant conditioning chamber” and the “teaching machine,” has been making a comeback. His spirit is returning not as much to the laboratories and publications of academic psychologists as it is to

1 I owe a debt of gratitude to Leif Nelson of Boise State University and Ben Williamson for their careful reading and helpful feedback.


the corporate campuses of Silicon Valley and the world of Web platforms. Examples abound:

Psychology Today mentions Skinner in a 2014 article on addiction to Facebook and Instagram

(Muench 2014); a 2009 paper outlining the field of “Web analytics” (the analysis of users’ online activity) identifies Skinner’s behaviorism as the field’s sole “conceptual basis” (Jansen 2009, p.

7); educational technology company Dreambox proudly tells prospective customers that its

“Intelligent Adaptive Learning™” has its foundations in “the work of behaviorist B.F. Skinner in the 1950s” (Dreambox, 2018). Despite being widely spurned by educators, Skinner is now even being referenced by those developing and studying educational technologies. This is particularly the case for those working in the fields of learning analytics (using Web analytics to support

learning)2 or personalized learning (customizing instruction based on analytics). In surveying the

state of the cognate field of “educational data mining,” Bakhshinategh et al. (2018) highlight

Skinner and his teaching machines as being important precursors.

Skinner’s behaviorism, his operant conditioning and teaching machine were all attempts, as the

title of Skinner’s own bestseller reads, to get Beyond Freedom and Dignity (1971). In getting

“beyond” freedom and dignity, Skinner was not hoping to resolve the challenges and paradoxes

of human liberty, autonomy and responsibility. Instead, he simply wanted to bid “autonomous

man”—the individual who understands his or her actions in terms of freedom and dignity—

“good riddance” (1971, p. 191). The one thing that mattered for Skinner was behavior, and

above all, the environment that he knew could powerfully control it. Skinner’s box (or operant

conditioning chamber) and his teaching machine were, in this sense, “technologies of behavior,”

designed environments for controlling behavior.

2 “Learning analytics refers to the collection of large volume of data about students in an educational setting and to analyse the data to predict the students' future performance and identify risk” (Kavitha & Raj, p. 21).


In the minds of many developers, data scientists and Silicon Valley CEOs, we are now all part of

such environments wherever we go and anytime we look at a screen. These environments are

formed by Facebook or Instagram, by the ubiquitous pings, tweets and vibrations of our

smartphones. In education, these behavior modification environments take the form of the

designs of the latest personalized or adaptive learning platforms. For the purposes of this paper,

such technologies are defined simply as algorithms operating on multiple forms of users’

behavioral (and other) data—both personal and aggregate—to predict and shape their future

behavior.3 Systems of this kind can be used on a laptop or a tablet in a classroom, with

“AltSchool,” “Connections Learning” or the “Summit Learning” platform being among the most

popular of these platforms. But there is at least one difference separating these environments

from Skinner’s boxes and machines: They do not simply condition, reinforce and record a single

type of behavior in a single way. There is no lever to press, no food pellets or electric shocks to

receive. These systems are much more complex and responsive. Based on intricate patterns of

previous behavior—recorded and analyzed down to the minutest detail—these digital

environments will be able to intimately “personalize” the operant conditioning they perform.

They can then provide exactly the stimulus (or sequence of stimuli) shown to produce the precise response desired, whether this response is to click on micro-targeted advertising or to persist with a set of multiple choice questions. It is the capability of such systems to respond to and customize operant effects that most clearly separates these new techniques from Skinner’s radical behaviorism of the 1950s. This data-driven approach has been described by technophiles

3 Insofar as intelligent tutoring systems (e.g. ALEKS, Assessment and Learning in Knowledge Spaces system, formerly of UC Irvine) also use initial assessments or records of behavior to influence future behavior, they can be seen as also being underpinned by a behaviorist logic. However, given the particularized nature of the data used by ALEKS, it and similar systems are not specifically addressed in this paper.


as “behavior design” or “persuasive technology,” and by critics as the “new behaviorism”

(Watters, 2017)4 or simply “data behaviorism” (Rouvroy, 2015).
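The underlying logic can be sketched in a few lines of code. The following Python fragment is a deliberately simplified, hypothetical illustration—the stimuli, names and the epsilon-greedy selection rule are my own and are not drawn from any named platform—of how a system might “personalize” a schedule of reinforcement: it tallies which prompts have previously elicited the desired response from a given learner and then serves whichever prompt its records predict will work best, occasionally exploring the alternatives.

import random
from collections import defaultdict

class PersonalizedScheduler:
    """Illustrative epsilon-greedy 'schedule of reinforcement' adapted to one
    learner's recorded behavior. A hypothetical sketch, not a vendor's design."""

    def __init__(self, stimuli, explore_rate=0.1):
        self.stimuli = stimuli                # e.g. badge, hint, streak reminder
        self.explore_rate = explore_rate
        self.shown = defaultdict(int)         # times each stimulus was delivered
        self.reinforced = defaultdict(int)    # times it produced the desired response

    def record(self, stimulus, responded):
        """Log whether the desired response (a click, persistence, etc.) followed."""
        self.shown[stimulus] += 1
        if responded:
            self.reinforced[stimulus] += 1

    def next_stimulus(self):
        """Serve the stimulus whose history predicts the desired response,
        occasionally exploring the alternatives."""
        if random.random() < self.explore_rate:
            return random.choice(self.stimuli)
        def success_rate(s):
            return self.reinforced[s] / self.shown[s] if self.shown[s] else 0.0
        return max(self.stimuli, key=success_rate)

# Example: one learner's behavioral record drives what is delivered next.
scheduler = PersonalizedScheduler(["badge", "hint", "streak_reminder"])
scheduler.record("badge", responded=True)
scheduler.record("hint", responded=False)
print(scheduler.next_stimulus())   # most often "badge" for this learner

Commercial systems are of course vastly more elaborate, but the circuit is the same: record behavior, estimate which stimulus reinforces it, deliver that stimulus, and record again.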

It is precisely such a data-driven system of behavioral persuasion that Mark Zuckerberg (founder

and CEO of Facebook) has already realized at stunning scale and efficiency with his social

media network. And it is something very similar that he and his venture philanthropy, the Chan

Zuckerberg Education Initiative (CZI),5 are now planning for American schools and students— through investments of hundreds of millions of dollars (Reuters, 2018). “[O]ne major focus of the Chan Zuckerberg Education Initiative,” Zuckerberg himself explains, is to “bring

personalized learning to… schools…” to allow “[e]very student [to] learn in their own way at

their own speed in a way that maximizes their potential” (2016).

Other charitable organizations are aligning their funding similarly, while the US National

Science Foundation has likewise devoted hundreds of millions to related research. The broader

educational technology community—whether privately or publicly funded—has responded in

kind, issuing urgent reports and white papers, setting up research centers and testing and

developing prototypes: From Stanford to Sydney, and Singapore to Sweden,6 from the Open

4 Not to be confused with discussions of “new behaviorism” which is critical of Skinner, and arose from the publication of: Staddon, J. (2014). The New Behaviorism 2nd Ed. New York: Psychology Press.

5 Mark Zuckerberg pledged to give away 99% of his Facebook stock in 2015, and founded the CZI as an LLC, rather than as a charity. This allows him to invest these funds in for-profit companies, as well as in not-for-profit initiatives (e.g., those of many university researchers). It also allows the CZI to engage in government lobbying (see: Dolan, 2015).

6 Singapore and Sweden are included given that they are both home to research labs of Advanced Distributed Learning, a large and ongoing research and development initiative that reports to the US Department of Defense. It, too, has identified “personalized and adaptive learning” as one of its key goals (e.g., see: https://www.nist.gov/sites/default/files/documents/ineap/ADL_IITSEC_flyer_FINAL_22Nov2011-1.pdf)


University (UK) to Pearson Publishing (the world’s largest educational publisher and test

provider (Bookscouter, 2016))—all have set up centers devoted to the study of the use of big data

in education (Williamson, 2017). Researchers speculate “that education, as both a research field

and as a professional practice, is on the threshold of a data-intensive revolution” (Knight &

Buckingham Shum, 2017, p. 17). One foundational report in the field, supported by the Gates and MacArthur Foundations—known as the Learning Analytics Working Group or LAW

Report—criticizes educators and their organizations for “driving blind” (Pea, 2014, p. 16).7

Unlike savvy businesses and corporations, education uses the weakest of “feedback loops to

evaluate the impact of ongoing practices or changes that are implemented in their practices” (Pea, p. 16). While other researchers speak of “the learning analytics imperative and the policy challenge[s]” it presents (Macfadyen, Dawson, Pardo & Gašević 2014), the LAW report urges its

readers: “Failure to support this effort or delaying its initiation will hamper our country’s ability to provide personalized learning at scale to all students, with corresponding losses to the

intellectual diversity and value of our graduates to the workforce and society at large” (Pea,

2014, p. 12).

It goes without saying that such bold statements, plans and investments raise any number of

questions that extend well beyond matters of technology and scalability. These concerns include

widely discussed issues of privacy and data protection and they also include broad questions

about what education becomes when it is understood in terms of such technologies: What

happens to teachers’ responsibilities when they must work with technologies of much greater

7 Here and below, I take this early, bold and comprehensive report submitted by Roy Pea (Founding Co-Director of the Stanford LYTICS Lab) as broadly indicative of the concerns, ambitions and rationale of those involved in technology-enabled personalized learning as I have defined it above.

predictive power than they themselves possess? What happens to students’ ownership of their own learning when an automated process effectively tells them what, when and how they are to learn? Finally, what happens to the freedom and dignity of both teachers and students in such a context? This paper seeks to address these questions, and, with the aid of Skinner’s arguments in

Beyond Freedom and Dignity, to explore the broader implications of the use of these technologies for education—understanding education as a particular configuration of roles, values and responsibilities. It begins, however, with a closer examination of the nature of these technologies, the “new behaviorism” that underlies them, the unprecedented nature and volume of the data they generate, and the way they are being adapted for education.

The New Operant Environment: Personalized Learning Technologies

In January of 2012, Facebook undertook a bold experiment on almost 700,000 of its unsuspecting users. It decided to make the posts in some subscribers’ newsfeeds more positive overall, and the posts seen by others more negative. Based on words appearing in each post,

Facebook automatically excluded those of one or another “emotional valence,” and found that users responded in kind: Unwitting subjects’ own status updates mirrored the negative or positive coloring of their newsfeeds. “Online messages,” the Proceedings of the National

Academy of Sciences concluded, “influence our experience of emotions, which may effect [sic] a variety of offline behaviors” (Kramer, Guillory & Hancock, 2014, p. 8790). Despite the modest effect size produced in the study (d=0.001), the result is not negligible, as the report emphasizes.


Given Facebook’s scale, in “early 2013, this [effect size] would have corresponded to hundreds

of thousands of emotion expressions in status updates per day” (p. 8790).8
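Mechanically, the manipulation was simple. As a rough illustration only (the published study relied on the LIWC word lists and probabilistic omission, whereas the tiny word lists and rule below are invented stand-ins), a newsfeed can be skewed by scoring each post’s valence and withholding posts of one emotional coloring:

# Illustrative only: the lists and omission rule here are stand-ins for the
# LIWC-based procedure the actual experiment used.
POSITIVE = {"great", "happy", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def valence(post: str) -> str:
    words = set(post.lower().split())
    if words & POSITIVE and not words & NEGATIVE:
        return "positive"
    if words & NEGATIVE and not words & POSITIVE:
        return "negative"
    return "neutral"

def filtered_feed(posts, suppress="negative"):
    """Return a feed with posts of one emotional valence withheld."""
    return [p for p in posts if valence(p) != suppress]

feed = ["I love this!", "What an awful day", "Meeting at 3pm"]
print(filtered_feed(feed, suppress="negative"))   # the negative post is omitted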

We are naturally averse to thinking of ourselves as susceptible to manipulation of this kind. We

see ourselves as knowing more about the causes and effects of our own feelings and motivations

than some impersonal machine or algorithm. As Skinner would say, we would much rather

understand ourselves in terms of “purposes, deliberations, plans, decisions, theories, [in]tentions

and values” (1971, p. 8; emphasis in original)9 than in terms of the simple logic of conditioning

and response. However, like “freedom and dignity,” our values, decisions, self-understandings

and intentions are merely

the possessions of …autonomous man of traditional theory, and they are essential to practices in which a person is held responsible for his conduct and given credit for his achievements. A scientific analysis shifts both the responsibility and the achievement to the environment. (pp. 22-23)

What Skinner’s theory proposes, however, is not so much a shift in either “achievement” or

“responsibility” as simply their abrogation. Humans and other “organisms” for Skinner are

basically the same; they are actors whose behavior is to be predicted and controlled through the

environment:

We undertake to predict and control the behavior of the individual organism. This is our “dependent variable”—the effect for which we are to find the cause. Our “independent variables”—the causes of behavior—are the external conditions of which behavior is a function. (1965, p. 35)

8 Facebook justifies its manipulation of unsuspecting users as follows: The experiment “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research” (Kramer, Guillory & Hancock, 2014, p. 8789).

9 Skinner is quoting from the italicized text of Popper, K. (1973). Of Clouds and Clocks: An approach to the problem of rationality and the freedom of man. In Popper, K. Objective knowledge: An evolutionary approach. (Pp. 206-255.) Oxford: Oxford University Press.


As Felix Stalder explains, seeing the environment and its power to condition as so dominant

leads “to the conviction that someone observing another's activity always knows more than the

latter does about himself or herself.” The consequences of this conviction, Stalder continues, are

clear: “Unlike the person being observed, whose impressions can be inaccurate, the observer is in

command of objective and complete information”—especially about environmental causes of

said person’s behavior (2018, p. 123). Of course, the personalized learning systems that are the

focus of this paper differ from Facebook or Google in their specific features and appearance.

However, in using records of diverse behavior to predict and shape future behavior, they can be

said to fall within the definition of the new behaviorism provided above. This also means that

learning analytics and data mining designers who may see themselves as embracing cognitivist

or constructivist theories of learning must nonetheless be regarded as leveraging techniques of

the new behaviorism in their systems and designs.

The range of behavioral data gathered by these technologies—which today includes clicking and scrolling, pausing and hovering, typing and swiping—is expanding. Facebook has recently filed patents for systems that detect “typing speed, movement (using accelerometers), location and other factors… [in order to] predict emotion and change the font text, size and probably just add a winky face or poo emoji” (Silver, 2017, n.p.). Similar systems, which detect your emotions using cameras now built into most devices, could be used for much more than predicting emotion to enhance your latest update; they can track the details of your affective responses to posts and

of course, strategically placed advertising (Silver, 2017).
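A purely illustrative sketch of how such interaction data might be reduced to an affective signal follows; the feature names, weights and thresholds are invented here for exposition and are not taken from Facebook’s patents or from any shipping product.

from dataclasses import dataclass

@dataclass
class TypingSample:
    chars_per_second: float   # typing speed derived from keystroke timestamps
    backspace_ratio: float    # share of keystrokes that are deletions
    idle_seconds: float       # longest pause within the sample

def frustration_score(sample: TypingSample) -> float:
    """Toy heuristic: slower, more error-prone, more hesitant typing is read
    as 'frustration'. Weights and thresholds are invented for illustration."""
    speed_penalty = max(0.0, 1.0 - sample.chars_per_second / 5.0)
    pause_penalty = min(sample.idle_seconds / 30.0, 1.0)
    score = 0.5 * sample.backspace_ratio + 0.3 * speed_penalty + 0.2 * pause_penalty
    return round(min(score, 1.0), 2)

print(frustration_score(TypingSample(chars_per_second=1.5, backspace_ratio=0.4, idle_seconds=20)))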

Developers of personalized learning software are eager not to be outdone. Inspired by Angela

Duckworth’s (2016) popular notion of grit or Carol Dweck’s growth mindset construct (e.g.,

Haimovitz & Dweck, 2016; see also: Sisk et al., 2018), developers are integrating biometric

sensors in their systems in order to capture data-streams reflecting students’ physiological and emotional states and dispositions. One such online tutoring system is described in the 2013 “Grit

Report” as able to measure feelings and states ranging from “frustration,” “boredom” and

“fatigue” to “confidence,” “motivation” and “flow” (Shechtman et al., 2013, p. 44). This system uses a camera to monitor facial expressions, a “posture analysis seat” to detect one’s attitude, and a pressure-sensitive mouse and a skin conductance sensor for frustration, stress and arousal

(Roberts-Mahoney, Means & Garrison, 2016, p. 412).

On the basis of the combination of data collected by such sensors and inputs, we can not only be conditioned to feel good or bad, but can be encouraged to stay on task (whether the task is

Facebooking or completing an assignment), or be given just the right mixture of challenges and intermittent rewards. Analytics and personalized learning researchers see such interventions and inducements as relying on two particular “data models.” The first is a profile that can be built up of the emotional, demographic, academic and moment-by-moment interactions of an individual user. Serving as a kind of “big data” equivalent of the “permanent” student record, this particular collection of data is known as the “learner model”:

For personalized learning, a pre-eminent objective is creating a model of the learner. What characteristics are important as predictors for what is appropriate to support the learner’s personalized progress? What are the classes of variables and data sources for building a… of the knowledge, difficulties, and misconceptions of an individual? (Pea, p. 24)

The second model deals with a slice of “big data” that is somewhat more inscrutable: It has as its basis the aggregate traces and patterns, the behavioral “paths” followed by many thousands of users, each under recorded environmental conditions. Hundreds of thousands of students have already generated trillions of relevant “data points” in MOOCs (Massive Open Online Courses) and in analytics-driven projects around the world—in subject areas as diverse as mathematics,


science and second languages. Of course, the idea is not that these innumerable, intricate

moments and paths are to be directly analyzed by teachers—or by any other human beings, for

that matter. Instead, the literature of learning analytics and data mining speaks of the widest

range of automated techniques and constructs to be deployed and autonomously adjusted and

optimized based on the latest data and user responses (e.g., Macfadyen, Dawson, Pardo &

Gašević 2014). These range from “association analysis” and “cluster techniques,” through

“model building” and “model validation,” to “knowledge discovery” and “machine learning”

(Tan et al., 2019). These may sound exotic, but such technologies already underlie Google search

rankings and notifications, and the Netflix and Amazon recommendations that serve us every

day. For researchers in education, such techniques are used to construct multiple “learning models”—with the aspiration that such models are to be interconnected and

“networked across and within the formal and informal learning and educational activities in

which any learner participates” (Pea, p. 35). Researchers and developers envision that on the

basis of such models, teachers will be provided with simplified notifications, recommendations

and rankings—with a “dashboard” offering various data visualization and report functions (e.g.,

see: Liebowitz 2018).
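To give a concrete, if deliberately toy, picture of these two data models: the sketch below invents a handful of behavioral trace variables for a “learner model” and then uses one of the cluster techniques named above to group learners into the aggregate “paths” on which a dashboard might report. The field names, numbers and the choice of k-means are mine, not Pea’s specification or any vendor’s implementation.

import numpy as np
from sklearn.cluster import KMeans

# A minimal, hypothetical "learner model": one row per learner, one column per
# behavioral trace (minutes on task, hints requested, error rate, long pauses).
learner_ids = ["s01", "s02", "s03", "s04", "s05", "s06"]
traces = np.array([
    [32.0, 1, 0.10, 2],
    [35.0, 0, 0.08, 1],
    [12.0, 6, 0.40, 9],
    [15.0, 5, 0.35, 8],
    [24.0, 3, 0.20, 4],
    [22.0, 2, 0.22, 5],
])

# An aggregate "learning model" as a simple cluster structure over those traces:
# each learner is assigned to a behavioral "path" discovered from the data.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(traces)
for learner, cluster in zip(learner_ids, model.labels_):
    print(learner, "-> behavioral cluster", cluster)

# A dashboard layer would then translate cluster membership and distances into
# the notifications, rankings and visualizations offered to the teacher.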

The Annus Horribilis of High Tech

Just as the visions of researchers were coming together with the lavish support of venture

philanthropists, however, history intervened. In the wake of the Brexit vote and the election of

Donald Trump—and the dubious role that social media played in both—Silicon Valley and high-tech platforms have been subject to unprecedented suspicion, scrutiny and critique (e.g., Elgan,

2017). 2017 has been declared Silicon Valley’s “annus horribilis” (Solon, 2017). Meanwhile, the Cambridge Analytica scandal—which forced Zuckerberg to appear before the US Congress and

the European Parliament (e.g., see: Meredith 2018)—has so far made 2018 even worse. In this context, questions and concerns about any number of digital visions and realities have gained an unprecedented urgency. A central target of this widespread critique, unsurprisingly, has been the intentionally addictive behaviorist designs used by the likes of Facebook and Google. High-profile defectors like Tristan Harris (former Google Design Ethicist) or Chamath Palihapitiya

(former Facebook executive) started to speak up in late 2017 about the difficulties that they had earlier helped to create: Harris warns that “people’s minds are already jacked into [an] automated system…. steering [them] toward either personalized paid advertising or misinformation or conspiracy theories” (as quoted in: Thompson, 2017, n.p.). Palihapitiya is much more strident:

“The short-term, dopamine-driven feedback loops that we have created, are destroying how society works. It [sic] is eroding the core foundations of how people behave… [with] each other”

(as quoted in: Wong, 2017, n.p.). The solutions that these experts offer, however, are not particularly encouraging: Harris suggests that high-technology corporations voluntarily facilitate what he calls “Time Well Spent” for users (Thompson); Palihapitiya simply boasts: “I can control my decision, which is that I don’t use that sh** [sic]” (as quoted in Wong, n.p.).

Fortunately, educators are not yet in the position of needing to wean students from personalized learning apps or of asking corporations like Pearson to voluntarily change their business models.

We are thankfully still in a position to debate the terms for the use (or non-use) of personalized learning technologies. As mentioned before, some in the learning analytics community have suggested responsible policy prescriptions for privacy, data protection and ownership, asking, for example, that “transparent learner models” be made available to teachers, parents and learners themselves (Pea, p. 33; Hildebrandt 2017, pp. 18-19). As a further example, Prinsloo and Slade

(2017) have proposed that a type of “information justice” be brought to bear on big data

initiatives for learning: “Our collection methods, algorithms and decision-making structures should be responsive and sensitive to student contexts as dynamic and multidimensional” (p.

115). Like others (e.g., Scholes, 2016; Fritz, 2017), Prinsloo and Slade draw attention to questions of possible student discrimination, suggesting measures such as the possibility of redress in case of harm, the protection of students’ right to opt out, and the enforced expiration of certain types of data (Slade and Prinsloo, 2013). I believe, however, that it is important to go much further, to question the very fitness of these systems to the aspirations, values and purposes of education.

Teachers versus Algorithms

It is easy to assume, as policy and reform discussions often do, that education is principally if not solely focused on the preparation of students for the workforce. Personalized learning designers and advocates are no different. For example, one recent survey of reports, white papers and research publications on personalized learning concludes that “the main justification offered” in these documents “is to better prepare a diverse population of students for success in the twenty first century workforce” (Roberts-Mahoney, Means & Garrison, 2016, p. 417). While this kind of preparation is obviously important, focusing on this goal to the exclusion of the variegated, interpersonal processes and experiences that also constitute education results in a distorted, truncated and foreshortened view. One’s education is hardly a linear progression from early childhood to the workforce, or a seamless interlocking of means and ends. Instead, a person’s youth and education reflect the integration of and tension between the widest range of moments, interests and purposes. As a result, we hold our educational institutions to account for much more than the eventual employment of the children and young people placed in their hands.


Both parents and students, for example, are concerned about much more than effective and

efficient learning in classrooms and lecture halls. They hold educators to account not only for

students’ safety and well-being, but also for educators’ instructional and, above all, evaluative

decisions. Although advocates and designers insist that “teacher agency will be vital in a

personalized-learning-enabled educational system” (Pea, p. 40), they do not consider in any concrete terms how this still vital agency will be configured. Where will the line be drawn between teachers’ responsibility and the “objective” predictive power of systems? Indeed, given

that the behaviorist logic behind such systems claims knowledge of your behavior superior to what

you yourself possess, on what basis will teachers sustain claims to their own authority and

expertise? Will the teacher only be answerable for a diplomatic or euphemized account of a

system’s authoritative data visualization and reporting functions? And how will parents’ or

students’ questions about such functions themselves be addressed? Indeed, what answers would

there be for processes that are themselves automated and beyond the analytic capabilities of their

human creators? Although certainly not all instructional and assessment tasks are to be assigned

to personalized learning systems, these questions remain urgent to whatever extent such tasks are thus

outsourced. Clearly, a satisfying answer from the thousands of students’ aggregate behavioral

data—or from the developers of the algorithms processing these—would almost certainly not be

forthcoming. Instead, at best, the school would have to point to an organization, likely a

privately-owned corporation, with its own business goals and priorities, and of course, with its

own public relations and support apparatus.

Given their inscrutability to their human creators, there is the additional danger that these metrics

and visualizations will be “reified” or “hypostatized”—that is, given a weight and understood in

ways that do not recognize their highly complex, constructed and contingent nature. There is of


course the familiar problem of such metrics and indicators excluding or occluding things that cannot be measured—leading stakeholders to forget “that the information [made available] is at best a partial representation of what one wishes to know about” (Clayton & Halliday, 2017, p.

298). In recent years, reports and studies have documented no shortage of further problems produced by a similar forgetting or “reification” of the contingent nature of metrics and outcomes in conjunction with No Child Left Behind and similar policies of high-stakes testing

(Clayton & Halliday, 2017). These range from teachers or whole schools manipulating test scores

(e.g., Porter 2015) to students spending time in pep rallies to relieve stress and focus energies on test preparation (e.g., Richardson 2017). It is easy to envision similar energies and strategies being directed to the improvement of specific metrics and valuations appearing on teachers’ or

principals’ dashboards—metrics whose causal grounding is well beyond the scope of human explanation.

Still further challenges are presented by the “politics” of algorithms. Both researchers and the public have become increasingly aware that existing systems working with big data are no less liable to prejudice and bias than their human designers. For example, Google has been known to classify photos of black people as gorillas (Kasperkevic 2015), and to provide links to “big booty” and other explicit sites when queried about “black girls” (Noble 2018). Of course, many other possibilities for prejudice and bias are less egregious, but no less problematic, and perhaps more insidious. What weightings and correlations will data mining and machine learning technologies learn to give to students whose learner models contain racial indicators? For

example, how will machine learning techniques encode the correlations between certain ZIP

codes and student outcomes—perhaps then applying these to other data operations? What

recommendations and decisions will these weightings and correlations then contribute to? As


Clayton and Halliday point out, such “causal feedback loops” can readily occur in analytics

systems when “the use of an algorithm to process data… eventually shape[s] the kind of data that

the same algorithm continues to process later on… perhaps without anyone noticing” (p. 296).

Again, the inscrutable complexity of big data and its processing would make it extremely difficult, if not impossible, to understand how—or why—it may be disadvantaging certain students.
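A toy simulation makes the dynamic visible. In the sketch below—whose numbers, “neighborhoods” and flagging rule are wholly artificial—an at-risk flag keyed to a group’s past average depresses that group’s subsequent outcomes, which the system then reads as confirmation of the flag, widening the gap round after round without any human decision intervening.

import random

random.seed(0)
N = 500

# Wholly artificial data: two neighborhoods, B starting with a small score gap.
neighborhood = ["A"] * (N // 2) + ["B"] * (N // 2)
score = {i: random.gauss(70 if neighborhood[i] == "A" else 67, 10) for i in range(N)}

def group_mean(group):
    vals = [score[i] for i in range(N) if neighborhood[i] == group]
    return sum(vals) / len(vals)

for rnd in range(5):
    # "Model": flag as at-risk every student from a neighborhood whose
    # historical mean falls below a fixed threshold.
    means = {g: group_mean(g) for g in ("A", "B")}
    flagged = {i for i in range(N) if means[neighborhood[i]] < 68}
    # Feedback: flagged students are routed into lower-expectation tracks and,
    # in this toy world, lose a little ground each round as a result.
    for i in range(N):
        score[i] += random.gauss(0, 1) - (3 if i in flagged else 0)
    print(f"round {rnd}: mean A = {group_mean('A'):.1f}, mean B = {group_mean('B'):.1f}, flagged = {len(flagged)}")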

Students versus the Machine

Conceiving of education just as a set of means for attaining the ultimate end of student employment produces still further truncations and distortions when compared to complex educational realities. As young people move through K-12 schooling and enter into tertiary education, they accomplish much more than simply a preparation for successful employment.

Over these many years, what were once little kindergarteners reach the age of majority and become adults who are expected to be responsible. Graduation is not only the time that most students leave school; it is also the first time that they are able to vote, run for public office, enter into contracts, and initiate civil proceedings. It is no surprise, then, that one of the central concerns of education, particularly at high school and undergraduate levels, is the cultivation of

“personal” or “student responsibility.” There is a vast and growing literature on the subject (over

2300 articles according to eric.ed.gov), and it is closely tied to expanding areas of educational

psychology such as “self-determination” and “self-regulation.” How does this literature define

responsibility and what does it say about “teaching” it? First, according to one recent definition,

personal responsibility for students is defined by a small number of factors related to self-

awareness and self-control: “an awareness of, and control over, choices made regarding

behavior; a willingness to be accountable for the behavior enacted and the resulting outcome;


and an awareness of, and concern for, the impact of one’s behavior upon others” (Mergler, 2017,

pp. 256-257).

The ways that students gain such awareness and control over their behavior converge in the

following understanding: To become responsible, one must be treated by others as if one

actually is responsible. Students become responsible for their own learning, as the familiar

argument goes, when there are high expectations that they be thus responsible (e.g., Rutledge,

Cohen-Vogel, Osborne-Lampkin & Roberts, 2015; see also: Collier, 2005; Rutledge & Cannata,

2016). One landmark study investigating disciplinary methods in Australia, China, and Israel—

despite finding considerable variation—concludes that teachers should use what it calls

“inclusive techniques” of “Discussion, Involvement and Recognition” to engender student

responsibility (Romi, Lewis & Katz, 2009). Quoting Gary Fenstermacher (2001), this study says that teachers should provide “a model” of discussion, recognition and inclusion “that students will accept as ‘a standard for how things will be’” (p. 451; emphasis added). Behavioral contracts

(anticipating later legal contracts), individual conferences and careful follow-up are also prominent themes in other investigations of this type (e.g., Frank & Scharff, 2013; Gibney et al.,

2017). Finally, a number of recent studies even recommend a type of “personalization” in teaching that they see as critical to the development of personal responsibility. Of course, this is hardly the kind of algorithmic personalization advocated in personalized learning, but rather a kind of inter-personalization—cultivated by the teacher to help “students develop their own sense of efficacy, [by] giving students opportunities to make responsible choices” (Gibney et al

2017, p. 130). Personal responsibility emerges through a kind of self-perpetuating circle or cycle that is perhaps familiar from how young people learn about relationships, violence or care: It arises interpersonally, by placing the young person in a position to exercise a type of behavior


already visibly modeled by adults. We learn about love and anger, dignity and disgrace, responsibility and freedom not through abstract lessons, but by living through their enactment by others around us and by experiencing them directly ourselves.

Conclusion: Returning to Freedom and Dignity

This brings us back to the central concern of this paper: freedom and dignity. Personalized learning technologies represent on many points the rupture, corruption or subversion of the virtuous circle through which responsibility emerges. I have already shown in practical terms how these technologies threaten, disrupt and undermine the responsibility and authority of the teacher and the school as a whole, and how they result in the diremption of responsibility for any one decision, grade or policy to the inscrutable mechanisms of some far-away process or entity. An identical dynamic would apply at the level of individual students and their relation to their own technology-enabled personalized learning. We may well ask: In such a relation between student and system, is it the student or is it the machine that is responsible for learning and academic

advancement? Is a student’s inability to learn or advance academically their own fault or that of fallible algorithms and decision systems—regardless of the relative degree to which they are used? Given its power to know and control the behavior or feelings of students better than the

students themselves, what is the status of the student’s own self-awareness, self-knowledge and

self-control in relation to this machine?

Big data systems, with their demonstrated ability to “know” us better than ourselves, are

certainly not invisible or neutral. They do not simply deliver updates or customized learning

material and then disappear from awareness. By the simple fact of their ongoing operation, they

at once powerfully embody and demonstrate the authority and efficacy of the very means by

which they operate. Clearly this authority and efficacy has captured the minds of advocates of


personalized learning and their funders. Something similar would surely happen on the part of

students—in the same way that memories of teachers, pedagogical moments and other school experiences remain with us all today.

What message would these technologies and their powerful operation communicate to students?

At least in part, it would be the message that the calculated manipulation of operant conditioning

works. It would also be the message that any one student, like everyone else, is being powerfully—if

often unwittingly—determined by this conditioning. It is further the message that such

calculations hold the key to similarly affecting others. It is thus a message of depersonalization at

a level much deeper than curricular competency or workforce placement. It is the message, as

Skinner says, that our “purposes, deliberations, plans, decisions, theories, [in]tentions and

values” are to be dismissed along with the fiction of the “autonomous man.” It is the message

that, as Skinner believed, any one person cannot be “held responsible for his conduct and given

credit for his achievements” (1971, pp. 8, 23). But as Skinner says—and some advocates of

personalized learning seem to agree—technological, economic and other exigencies now make

such options untenable.

I am naturally not advocating against any and all use of data in an effort to administer resources

in schools and divisions more efficiently. Nor am I arguing against claims by experts in

personalized learning that their systems may enhance students’ achievement of learning

outcomes (as tentative as they may now be). My argument is not one about the outputs that

technologies may produce for learning, but simply about how, and at what expense, they may be

achieved. Ultimately, I am making a simple point: Education, in its better moments, has been a

human activity which grants to those not yet mature freedom and dignity in the hope that these

might take root. Such qualities are naturally valued in the workplace; but they are also the

foundation of our governments and legal systems, and they are integrated into many, if not all, aspects of everyday life in civil society. Any measures contrary to these values and priorities are ultimately contrary to the purposes of education itself. To encourage children to gradually become free and responsible adults is ultimately to “drive blind,” to use Pea’s expression. It is to go well beyond the cynicism inherent in any deterministic or probabilistic calculus, and to take significant risks on matters that, by their very nature, are beyond calculation and control.


References:

Arendt, H. (1954). The crisis in education. In Between Past and Future (pp. 173-196). London: Faber & Faber.

Bakhshinategh, B., Zaiane, O.R. & Elatia, S. (2018). Educational data mining applications and tasks: A survey of the last 10 years. Education and Information Technologies, 23(1), 537–553. https://doi.org/10.1007/s10639-017-9616-z

Bookscouter. (Jun 28, 2016). The Biggest Educational Publishers. Retrieved from: https://bookscouter.com/blog/2016/06/the-biggest-textbook-publishers

Clayton, M. & Halliday, D. (2017). Big data and the liberal conception of education. Theory and Research in Education, 15(3), 290-305.

Collier, M.D. (2005). An ethic of caring: The fuel for high teacher efficacy. The Urban Review, 37(4), 351–359. https://doi.org/10.1007/s11256-005-0012-4

Dolan, K.A. (Dec. 4, 2015). Mark Zuckerberg Explains Why The Chan Zuckerberg Initiative Isn't A Charitable Foundation. Forbes. https://www.forbes.com/sites/kerryadolan/2015/12/04/mark-zuckerberg-explains-why-the-chan-zuckerberg-initiative-isnt-a-charitable-foundation/#4fc0204670c5

Dreambox. (2018). Adaptive Learning. Accessed Feb. 26, 2018. http://www.dreambox.com/adaptive-learning

Duckworth, A. (2016). Grit: The power of passion and perseverance. New York: Simon and Schuster.

Elgan, M. (September 27, 2017). Why the public's love affair with Silicon Valley might be over. Fast Company. Retrieved from: https://www.fastcompany.com/40472189/why-the-publics-love-affair-with-silicon-valley-might-be-over

Fenstermacher, G.D. (2001). On the concept of manner and its visibility in teaching practice. Journal of Curriculum Studies, 33(6), 639-653.

Frank, T. & Scharff, L.F.V. (2013). Learning contracts in undergraduate courses: Impacts on student behaviors and academic performance. Journal of the Scholarship of Teaching and Learning, 13(4), 36-53.

Fritz, J. (2017). Using analytics to nudge student responsibility for learning. New Directions for Higher Education, 179, 65-75.

Gibney, D.T., Preston, C., Drake, T.A., Goldring, E. & Cannata, M. (2017). Bringing student responsibility to life: Avenues to personalizing high schools for student success. Journal of Education for Students Placed at Risk, 22(3), 129-145. DOI: 10.1080/10824669.2017.1337518


Haimovitz, K. & Dweck, C.S. (2016). Parents' views of failure predict children's fixed and growth intelligence mind-sets. Psychological Science, 27(6), 859-869. DOI: 10.1177/0956797616639727

Hildebrandt, M. (2017). Learning as a machine: Crossovers between humans and machines. Journal of Learning Analytics, 4(1), 6–23. http://dx.doi.org/10.18608/jla.2017.41.3

Jansen, B.J. (2009). Understanding user–web interactions via web analytics. Williston, VA: Morgan & Claypool.

Kasperkevic, J. (July 1, 2015). Google says sorry for racist auto-tag in photo app. The Guardian. Retrieved from: https://www.theguardian.com/technology/2015/jul/01/google-sorry-racist-auto-tag-photo-app

Knight, S. & Buckingham Shum, S. (2017). Chapter 1: Theory and learning analytics. In: Lang, C., Siemens, G., Wise, A. & Gašević, D. (Eds.), Handbook of Learning Analytics (pp. 17-22). Accessed Feb. 26, 2018 from: https://solaresearch.org/wp-content/uploads/2017/05/hla17.pdf

Koerrenz, R. (2017). Existentialism and Education: An Introduction to Otto Friedrich Bollnow. London: Palgrave.

Kohn, A. & Strauss, V. (February 24, 2015). Four reasons to really worry about "personalized learning." Washington Post. https://www.washingtonpost.com/news/answer-sheet/wp/2015/02/24/four-reasons-to-seriously-worry-about-personalized-learning/?utm_term=.60782b5a147b

Kramer, A.D.I., Guillory, J. & Hancock, J.T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788-8790. https://doi.org/10.1073/pnas.1320040111

Liebowitz, J. (2018). Thoughts on recent trends and future research perspectives in big data and analytics in higher education. In: B.K. Daniel (Ed.), Big data and learning analytics in higher education (pp. 7-17). New York: Springer.

Macfadyen, L.P., Dawson, S., Pardo, A. & Gašević, D. (2014). Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge. Research & Practice in Assessment, 9, 17-28.

Meredith, S. (April 10, 2018). Facebook-Cambridge Analytica: A timeline of the data hijacking scandal. CNBC. https://www.cnbc.com/2018/04/10/facebook-cambridge-analytica-a-timeline-of-the-data-hijacking-scandal.html

Mergler, A. (2017). Personal responsibility: An integrative review of conceptual and measurement issues of the construct. Research Papers in Education, 32(2), 254-267. DOI: 10.1080/02671522.2016.1225801


Muench, F. (March 18, 2014). The New Skinner Box: Web and Mobile Analytics. Psychology Today. https://www.psychologytoday.com/blog/more-tech-support/201403/the-new-skinner-box-web-and-mobile-analytics

Noble, S.U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.

Oremus, W. (November 10, 2017). Addiction for fun and profit. Slate Magazine. http://www.slate.com/articles/technology/technology/2017/11/facebook_was_designed_to_be_addictive_does_that_make_it_evil.html

Pea, R. (2014). The learning analytics workgroup: A report on building the field of learning analytics for personalized learning at scale. Palo Alto.

Porter, E. (March 24, 2015). Grading teachers by the test. The New York Times. https://www.nytimes.com/2015/03/25/business/economy/grading-teachers-by-the-test.html

Prinsloo, P. & Slade, S. (2017). Big data, higher education and learning analytics: Beyond justice, towards an ethics of care. In: B.K. Daniel (Ed.), Big Data and Learning Analytics in Higher Education (pp. 109-124). New York: Springer.

Reuters. (March 1, 2018). Mark Zuckerberg sold $500 million worth of Facebook stock in February to fund his philanthropy efforts. http://www.businessinsider.com/mark-zuckerberg-sells-500-million-stock-charity-2018-3

Richardson, W. (April 15, 2017). A pep rally for tests? What we need is a 'prep' rally. Huffington Post. https://www.huffingtonpost.com/will-richardson/standardized-testing-prep-rally_b_848662.html

Roberts-Mahoney, H., Means, A.J. & Garrison, M.J. (2016). Netflixing human capital development: Personalized learning technology and the corporatization of K-12 education. Journal of Education Policy, 31(4), 405–420. http://dx.doi.org/10.1080/02680939.2015.1132774

Romi, S., Lewis, R. & Katz, Y.J. (2009). Student responsibility and classroom discipline in Australia, China, and Israel. Compare, 39(4), 439-453. DOI: 10.1080/03057920802315916

Rouvroy, A. (2015). The end(s) of critique: Data behaviorism versus due process. In M. Hildebrandt & K. De Vries (Eds.), Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology. London: Routledge.

Rutledge, S.A. & Cannata, M. (2017). Identifying and understanding effective high school practices. Phi Delta Kappan, 97(6), 60-64.

Rutledge, S.A., Cohen-Vogel, L., Osborne-Lampkin, L. & Roberts, R.L. (2015). Evidence for personalization for academic and social emotional learning. American Educational Research Journal, 52(6), 1060–1092. DOI: 10.3102/0002831215602328


Scholes, V. (2016). The ethics of using learning analytics to categorize students on risk. Educational Technology Research and Development, 64(5), 939–955. DOI: 10.1007/s11423-016-9458-1

Shechtman, N., DeBarger, A.H., Dornsife, C., Rosier, S. & Yarnall, L. (2013). Promoting Grit, Tenacity, and Perseverance: Critical Factors for Success in the 21st Century. U.S. Department of Education, Office of Educational Technology / SRI International. http://www.ed.gov/edblogs/technology/files/2013/02/OET-Draft-Grit-Report-2-17-13.pdf

Silver, C. (June 8, 2017). Patents reveal how Facebook wants to capture your emotions, facial expressions and mood. Forbes Magazine. Retrieved from: https://www.forbes.com/sites/curtissilver/2017/06/08/how-facebook-wants-to-capture-your-emotions-facial-expressions-and-mood/#5b95d486014c

Sisk, V.F., Burgoyne, A.P., Sun, J., Butler, J.L. & Macnamara, B.N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 29(4), 549–571.

Skinner, B.F. (1965). Science and Human Behavior. New York: The Free Press.

Skinner, B.F. (1971). Beyond Freedom and Dignity. New York: Bantam/Vintage.

Slade, S. & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. DOI: 10.1177/0002764213479366

Solon, O. (December 23, 2017). Tech's terrible year: How the world turned on Silicon Valley in 2017. The Guardian. Retrieved from: https://www.theguardian.com/technology/2017/dec/22/tech-year-in-review-2017

Stalder, F. (2018). The digital condition. London: Polity.

Tan, P-N., Steinbach, M., Karpatne, A. & Kumar, V. (2019). Introduction to data mining. New York: Pearson. https://www-users.cs.umn.edu/~kumar001/dmbook/index.php

Thompson, N. (September 26, 2017). Our minds have been hijacked by our phones. Tristan Harris wants to rescue them. Wired Magazine. Retrieved from: https://www.wired.com/story/our-minds-have-been-hijacked-by-our-phones-tristan-harris-wants-to-rescue-them/

Watters, A. (2017). Education technology and the new behaviorism. Hack Education. Accessed Feb. 26, 2018 from: http://hackeducation.com/2017/12/23/top-ed-tech-trends-social-emotional-learning

Watters, A. (February 16, 2018). The Tech 'Regrets' Industry. Hack Education. Retrieved from: http://audreywatters.com/2018/02/16/the-regret-industry

Williamson, B. (2017). Who owns educational theory? Big data, algorithms and the expert power of education data science. E-Learning and Digital Media, 14(3), 105–122. DOI: 10.1177/2042753017731238


Wilson, A., Watson, C., Thompson, T.L., Drew, V. & Doyle, S. (2017). Learning analytics: Challenges and limitations. Teaching in Higher Education, 22(8), 991-1007.

Wong, J.C. (December 12, 2017). Former Facebook executive: social media is ripping society apart. The Guardian. Retrieved from:

Zuckerberg, M. (May 4, 2016). Post on Facebook. Retrieved from: https://de-de.facebook.com/zuck/posts/10102815508091011