Cold War Games: Operational Gaming and Interactive Programming in Historical and Contemporary Contexts

by

Matthew Jason Wells

A thesis submitted in conformity with the requirements for the degree of Doctor of Philosophy
Faculty of Information
University of Toronto

© Copyright by Matthew Wells 2017

Cold War Games: Operational Gaming and Interactive Programming in Historical and Contemporary Contexts

Matthew Wells

Doctor of Philosophy

Faculty of Information
University of Toronto

2017

Abstract

With the emergence of STEM-centric educational strategies, particularly those that promote what is called "computational thinking," computer programming is being marketed as a critical skill with a fervour not seen since the 1980s. Products such as the Raspberry Pi computer and Arduino circuit board are very deliberately marketed as DIY computing tools, inheritors of the legacy left by companies such as Commodore and Tandy Corporation. At the same time, new programming languages and systems have emerged that very much target new learners, with MIT's Scratch system being foremost among them.

The present study traces the historical roots of DIY computing by focusing on the concept of interactivity, and especially interactive forms of digital gaming. Interactivity may take a variety of forms, but due to certain forces discussed in this study, a particularly limiting type of interactivity was picked up and popularized via the earliest personal computers. This model sharply divides the roles of programmer and user, and reduces the user largely to a supplier of data for pre-existing programs and games.


This study will discuss and analyze a variant form of interactive computing that emerged in the late 1970s, and is typically known as the Interactive Programming System (IPS). In an IPS, developers and users occupy the same environment, thus giving users virtually full access to code. The IPS model, moreover, encourages all users to develop code "ecosystems," which may contain variables, functions, objects, and relationships between them all. For various reasons, IPSs only had limited success, but this study will advocate for their continued use by presenting a simple IPS developed by the author of this thesis called Hail-Workshop. Designed as a platform for text-based games, Hail-Workshop allows users to develop such games in piecemeal fashion, and thus treats game players the same as game developers.


Acknowledgments

This dissertation would simply not exist were it not for the support, generosity, and understanding of individuals too numerous to name here (though I will try). First and foremost would be my wife, Emily Wells. With a level of care and generosity that I strive to match, but could never equal, Emily has helped guide my way through academia, championing every success and consoling me through every setback. Though a doctoral degree requires hard work and sacrifice, I am here because she worked hard, made sacrifices, and unconditionally supported my efforts. Emily, I know I do not say this often enough, but your coming into my life has brought me a level of joy that is much more than I ever expected or deserved. I will strive to repay my debts to you for many years to come.

Almost five and a half years ago, moreover, our little family expanded with the arrival of our daughter, Riley. I began my PhD when Riley was not even one year old, and leave with her on the cusp of entering first grade. Her presence throughout has kept me grounded (as well as entertained), and reminded me of the importance of making time for family. Riley is not only a great kid, she is an admirable person, and has enriched our lives in ways that words cannot ever hope to describe.

My years at the Faculty of Information (a.k.a. the "iSchool") have been memorable, and overwhelmingly positive. I must begin here by thanking my committee – Professors Sara Grimes, Alan Galey, and Siobhan Stevenson – which I took to calling the "dream team" upon its formation, a sobriquet that was entirely appropriate from the start of this process to the finish. Each has helped out in so many ways, even before I entered into the PhD program.

Siobhan was my academic advisor when I started at the iSchool, and early on she recommended that I seriously consider going into the doctoral program. At a time when my academic career was uncertain, such encouragement was instrumental in keeping me motivated and focused. Alan helped me reconnect with older interests – specifically, coding and history – that I had largely given up, and opened pathways through which the two could be joined for both scholarly and hobbyist purposes. And Sara, who years ago responded so positively to a somewhat rambling email in which I outlined a potential game-related research project, opened my eyes to the vast libraries of critical scholarship that provided the foundation for much of my doctoral work. Her course on research methods, which I took at the Master's level, was a transformative event in my academic career. Sara has since guided my efforts with care and consideration, donating her time generously and providing reassurance at the most stressful of times. These are merely examples of how each of these distinguished scholars (and people) has shepherded me along this path. A proper elaboration of their efforts would require much more space than is provided here.

I would be remiss if I did not also thank Carleton University professor Marc Saurette. Marc was my first true academic mentor, and always pushed and challenged me to do better. Marc also impressed upon me the sanctity of academic research (and teaching), giving purpose to my initially haphazard academic efforts. Our in-class and in-office discussions remain some of my most cherished memories of student life. Though academically I ended up going in a different direction, the research (and life) skills Marc impressed upon me continue to resonate.

Returning to family, I have long been blessed with parents and siblings who unfailingly cared about and supported me even in the toughest of times. This includes my father, George Wells, his partner, Karen Arsenault, my brother Jamie Wells and sister-in-law Karen Chang, my sister Laura Wells and brother-in-law Brendan Warren, and of course my late mother, Geraldine Wells. My last in-person conversation with my mom was when I announced that I had been accepted into the iSchool PhD program. Though illness took her away from us far too early, in my mind and heart her love and support will never wane.

I have many friends and peers to thank for helping me along this path. This list includes (in no particular order): Chris Young, Dan Southwick, Hervé St-Louis and the rest of the 2012 PhD cohort, Andy Keenan, Matt Bouchard, Jenna Jacobson, Jack Jamieson and the rest of the Semaphore gang, Rhon Teruelle, Corinna Prior, Abraham Plunkett-Latimer, and Meagan Gilpin.

Finally, I would like to thank the Ministry of Advanced Education and Skills Development of the Government of Ontario, which, through the Ontario Graduate Scholarship Program, provided a significant amount of funding for my research. Thanks as well go to the Faculty of Information and the School of Graduate Studies (University of Toronto), which also provided pivotal support.


Table of Contents

Acknowledgments...... iv

Table of Contents ...... vi

List of Figures ...... x

List of Appendices ...... xii

Introduction ...... 1

Background ...... 8

The Current Study ...... 14

Chapter Overview ...... 19

1 Literature Review ...... 22

1.1 Game Studies ...... 23

1.1.1 Gaming Histories ...... 25

1.1.2 Constraints and Creativity...... 25

1.2 Amateur Computing: Hobbyist and Pedagogical Perspectives ...... 29

1.3 Pedagogical Computing ...... 31

1.4 The Cold War and Computing ...... 34

1.4.1 Technology-Centric Cold War Histories ...... 34

1.4.2 History of Interactive Computing ...... 37

1.4.3 Critical Approaches to Technology ...... 39

1.4.4 SCOT and ANT ...... 40

1.4.5 Software Studies and Critical Code Studies ...... 42

1.5 Technology as Text ...... 46

1.6 Applying Technology as Text ...... 48

1.6.1 Ethnographic Content Analysis/Qualitative Document Analysis ...... 48


1.6.2 Design Worlds ...... 50

1.6.3 Model Building ...... 52

1.6.4 Conclusion ...... 53

2 Origins: Computation, Mathematical Modeling, and the Emergence of Operational Gaming ...... 55

2.1 Human Computers, Skilled and Unskilled ...... 58

2.2 ENIAC and Whirlwind ...... 64

2.3 Systems Analysis and the RAND Corporation ...... 70

2.4 Systems Analysis and Monte Carlo ...... 72

2.5 Emergence of Operational Gaming ...... 75

2.6 The World in the Computer ...... 79

3 Antecedents: Interactive Computing, Decision Simulations, and the Spread of Operational Gaming ...... 81

3.1 Digital Operational Gaming and Network Computing ...... 83

3.2 JOSS and Real-Time Gaming ...... 88

3.3 Operational Gaming Outside RAND ...... 94

3.3.1 Early Work ...... 94

3.4 Time-Sharing and Command-Line Games ...... 99

4 Hobbyist Programming and Hobbyist Gaming ...... 107

4.1 Worlds of BASIC ...... 110

4.2 Beyond Hammurabi ...... 115

4.3 Arcades, Consoles, and COMPUTE!...... 123

4.4 The Limitations of BASIC and the End of the Type-In Era ...... 128

4.5 From BASIC to HTML5 ...... 130

5 Building Blocks: An Analysis of Interactive Programming Systems ...... 134

5.1 Programming Systems vs. Programming Languages ...... 137


5.2 Worlds of Lisp ...... 142

5.3 IPSs, in Theory...... 148

5.4 IPSs, in Practice: Xerox Star, Mesa, and Smalltalk ...... 152

5.5 Towards a Modern IPS ...... 157

6 Gaming the IPS Paradigm: Hail Workshop ...... 159

6.1 Games as Interactive Programming Systems ...... 161

6.2 Benefits of IPS Gaming ...... 167

6.2.1 Programming as Gaming ...... 167

6.2.2 Gaming as World Building ...... 170

6.3 The Hail Programming System ...... 172

6.3.1 Programming Languages and Tools ...... 174

6.3.2 The Hail Language ...... 175

6.3.3 The Hail Workshop Environment ...... 178

6.3.4 Hammurabi in Hail ...... 179

6.4 Beyond Hail-Workshop ...... 183

Conclusion ...... 185

Gaming, Graphics, and Operational Models ...... 194

Why RAND? ...... 197

Role-Playing Games...... 201

The Future of Hobbyist Programming ...... 205

References or Bibliography ...... 213

Appendix A: Using Hail-Workshop ...... 241

A.1 Hail ...... 241

A.1.1 Objects and Subroutines ...... 241

A.1.1 Architecture ...... 242


A.1.2 Licence ...... 242

A.2 Getting Started ...... 242

A.2.1 Focus ...... 243

A.2.2 Bring to Front ...... 243

A.2.3 Moving ...... 243

A.3 Commands and Code ...... 244

A.3.1 Entering Commands...... 244

A.3.2 Entering Code ...... 244

A.3.4 Running Code ...... 245

A.4 Saving and Loading Images ...... 246

A.5 The Hail Programming Language ...... 246

A.5.1 Creating Objects and Instances ...... 246

A.5.2 Creating and Executing Subroutines ...... 248

A.5.3 Language Reference ...... 250

Appendix B: Historical Timeline ...... 253


List of Figures

Figure 1. Depth Charge game, sample output...... 3

Figure 4-1. HMRABI, sample input and output (101 BASIC, 1973/1975, p. 128)...... 112

Figure 4-2. KING, sample input and output (101 BASIC, 1973/1975, p. 129)...... 113

Figure 4-3. CIVILW, sample input and output (101 BASIC, 1973/1975, p. 82)...... 116

Figure 4-4. CIVILW, choosing strategies (101 BASIC, 1973/1975, p. 82)...... 117

Figure 4-5. Grand Prix, sample input and output (Ahl, 1979, p. 66)...... 120

Figure 4-6. Deepspace, sample output (Ahl, 1979, p. 46)...... 121

Figure 4-7. Deepspace, more sample output (Ahl, 1979, p. 46)...... 121

Figure 4-8. CAVES 1, sample input and output (Kaufman, 1973, p. 4)...... 122

Figure 4-9. Hunt the Wumpus, sample input and output (Ahl, 1979, p. 179)...... 123

Figure 6-1. The Hail Workshop programming environment ...... 174

Figure A-1. The Hail Workshop programming environment...... 242

Figure A-2. Setting and printing a variable value...... 244

Figure A-3. A simple for-next loop ...... 245

Figure A-5. An object definition ...... 246

Figure A-6. Instantiating an object ...... 247

Figure A-7. Setting instance variables ...... 247

Figure A-8. Setting instance variables ...... 248

Figure A-9. A subroutine that accepts a generic object x ...... 248

Figure A-10. Subroutines with two and zero parameters ...... 249

Figure A-11. Passing objects as parameters ...... 249


List of Appendices

Appendix A: Using Hail-Workshop ...... 241

Appendix B: Historical Timeline ...... 253


List of Acronyms

BASIC ...... Beginner's All-purpose Symbolic Instruction Code

BBN ...... Bolt, Beranek and Newman

BOCES ...... Boards Of Cooperative Educational Services

BRL ...... Ballistic Research Laboratory

DEC...... Digital Equipment Corporation

DECUS ...... Digital Equipment Corporation Users' Society

EDVAC ...... Electronic Discrete Variable Automatic Computer

ENIAC ...... Electronic Numerical Integrator And Computer

FOCAL ...... Formulating On-Line Calculations in Algebraic Language

FORTRAN ...... Formula Translation

GUI ...... Graphical User Interface

IDE ...... Integrated Development Environment

IPS ...... Interactive Programming System

JOSS ...... JOHNNIAC Open Shop System

Lisp ...... List Processing

MS-DOS ...... Microsoft Disk Operating System

NLS ...... oN-Line System

OEG ...... Operations Evaluation Group

ONR ...... Office of Naval Research

OR ...... Operations Research

ORO ...... Operations Research Office

PARC [Xerox] ...... Palo Alto Research Center

PC-DOS ...... Personal Computer Disk Operating System

PDP ...... Programmed Data Processor


RAND [Corporation] ...... Research And Development [Corporation]

SAGE ...... Semi-Automatic Ground Environment

STEM ...... Science, Technology, Engineering and Mathematics


Introduction

In the fall of 1974, the first issue of a magazine titled Creative Computing was published. Conceived and created by David Ahl, a former employee of Digital Equipment Corporation (DEC), the magazine made a rather radical statement for the time with its very title. Could computing really be associated with creativity? To those who participated in the development of the first digital computers less than thirty years earlier, such a concept would have been wholly alien. These early machines, such as the Electronic Numerical Integrator and Computer (ENIAC) and the Electronic Discrete Variable Automatic Computer (EDVAC), both developed at the Moore School of Electrical Engineering at the University of Pennsylvania, were simply the latest in a long line of devices intended to automate particular mathematical operations. In his famous First Draft of a Report on the EDVAC, the document that first described many of the concepts that would go into the modern digital computer, John von Neumann noted that the proposed machine could, as an example, "solve a non-linear partial differential equation in 2 or 3 independent variables numerically" (Von Neumann, 1945, p. 1). The rapidity with which electronic computers could solve differential equations was the primary reason they were commissioned to be built, as it just so happened that the United States military used such equations to plot the trajectories of artillery shells (Stern, 1981; Polachek, 1997). By the 1970s, of course, computers were used in a variety of applications in both public and private sector organizations. They had even started appearing in schools. But, seemingly, they were still just tools, used to manipulate numbers and, more recently, alphanumeric data. How was such work creative?

A look inside the magazine offers some hints as to what constitutes creative computing. In particular, a column entitled "Problems for Creative Computing," written by Walter Koetke, a teacher at Lexington High School in Lexington, Massachusetts, sheds much light on the issue (Koetke, 1974, pp. 16-17). At the beginning of the article, Koetke presents the rules of a game called Tac Tix, invented by the Danish mathematician and poet Piet Hein:

Each game begins with 25 markers arranged in a 5x5 square formation…Two players then alternate turns. On each turn a player may take as many markers as he chooses from any single row or column, provided that the markers are next to each other…The player who removes the last marker is the winner (Koetke, 1974, p. 16).

Koetke then briefly discusses the optimal strategy for winning the game (from the perspective of the first player) before noting that "[s]ince Tac Tix is played on a small board, has only a few easily stated rules, and requires only a short time to play, it is a very good game to implement on a computer" (Koetke, 1974, p. 16). He then issues a challenge to readers to create a Tac Tix program, with the computer playing the role of either the first or second player (he suggests trying both to see the differences in play). Creative computing, then, seems to involve the creation and programming of simple games, at least in part.

Games, in fact, became an increasingly dominant presence in Creative Computing's pages. But these were not the sorts of games that are currently popular in the consumer marketplace – that is, they were not graphics-intensive, action-oriented titles costing millions of dollars to produce. Rather, they were simple, text-based games, written in the BASIC programming language and run within BASIC interpreters. A game called "Depth Charge," for example, which was featured in the first issue, operated via a simple premise: players would type in coordinates for a point on a three-dimensional grid, and the game would tell them whether or not they "hit" a hidden submarine. Players thus interacted with the game while it was running, via a command-line prompt that resembled those offered by operating systems such as DOS, and that still features in the "terminal" or "command prompt" applications found in most modern operating systems. Unlike in those systems, however, this prompt only allowed players to enter specific values that were then processed by the game code (see figure 1).

Along with this sample output, the game's entire source code – consisting of only 31 lines of code – is listed. This indicates that the magazine's readers were to have a very different relationship to this game than most contemporary game players have to their titles. These readers were expected to type the game into their own computers themselves, and only then to begin playing it. It could even be argued that entering the code was part of the "fun" of the game. Koetke's column shows us that programming was meant to be a creative challenge, and this first issue of Creative Computing is filled with other "puzzles" – mostly simple mathematical problems – in which the reader is challenged to develop solutions via programming. Being creative with the computer, then, meant programming the computer, and in particular playing games and solving puzzles via programming.

Figure 1. Depth Charge game, sample output.
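To make the mechanic concrete, the following is a minimal Python sketch of the Depth Charge premise. It is an illustrative reconstruction, not a transcription of the original 31-line BASIC listing; the grid size, number of turns, and hint wording are assumptions made for the example.

import random


def hint(guess, target, low, high):
    # Return a hint for a single coordinate of the player's guess.
    if guess < target:
        return low
    if guess > target:
        return high
    return "on target"


def play_depth_charge(size=10, max_turns=5):
    # A toy Depth Charge-style game: find the hidden submarine's (x, y, depth).
    sub = tuple(random.randint(0, size - 1) for _ in range(3))
    for turn in range(1, max_turns + 1):
        raw = input(f"Turn {turn} - enter x,y,depth (each 0-{size - 1}): ")
        try:
            x, y, d = (int(n) for n in raw.split(","))
        except ValueError:
            print("Please enter three numbers separated by commas.")
            continue
        if (x, y, d) == sub:
            print("BOOM! You sank the submarine.")
            return
        # Directional feedback lets the player adjust the next shot, turn by turn.
        print("Sonar report:",
              hint(x, sub[0], "move east", "move west"),
              "|", hint(y, sub[1], "move north", "move south"),
              "|", hint(d, sub[2], "go deeper", "go shallower"))
    print("Out of depth charges - the submarine was at", sub)


if __name__ == "__main__":
    play_depth_charge()

As in the BASIC original, the only information the player can supply while the program runs is the coordinate triple that the code explicitly requests, a constraint that becomes central to the argument below.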

Creative Computing emerged at a time when text-based, do-it-yourself interactive game programming was still a nascent trend in public school systems, and just before it would become a widespread hobbyist phenomenon. By the early 1980s, with the advent of personal "home" computers such as the Commodore VIC-20, Apple II, and Atari 400 and 800, the magazine would be but one of many that catered to hobbyist game programmers, working mostly in the BASIC language, who both typed out games they found in books and magazines and invented their own, which they would then submit to the publishers of these works. By the mid-1990s, with the rise of the commercial computer game industry and the increasing sophistication of graphics-based gaming, these publications disappeared. But hobbyist game programming has had a lasting impact, and continues to be relevant for two major reasons. The first is that many commercial game genres, including strategy games, role-playing games, and simulation games, have their roots in text-based hobbyist game programming. While most of these games are now graphics-based, the types of interactivity that they allow are fundamentally similar to those offered by hobbyist games. The second is that hobbyist game programming is going through something of a resurgence, and the practices that are emerging out of this new movement borrow heavily from the hobbyist era of the 1970s and 1980s. The focus is once again on simple interactive game development via "high-level" programming languages. Virtually all pedagogical, "how-to" programming texts, with titles such as Invent Your Own Computer Games with Python, JavaScript for Kids: A Playful Introduction to Programming, and Programming for Kids, focus on games as a means to learn how to program (Sweigart, 2012; Morgan, 2014; and Harbour, 2014). Some of these games are virtual recreations of earlier hobbyist programs; the game Sonar Treasure Hunt, featured in Sweigart's work, is conceptually the same type of game as Depth Charge (Sweigart, 2012). All are meant to be typed out and played by the same user/player, blending these roles just as they were blended in the earlier hobbyist era. The term "creative" has even been borrowed; Mitch Resnick, head designer of the Scratch programming system, argues that "thinking creatively, reasoning systematically, [and] working collaboratively" are vital skills that children (and others) can learn through programming (Resnick, 2012). The head of a programming training centre for teachers in the United Kingdom, moreover, recently claimed that "[w]e're not just trying to encourage people to become developers. We're trying to encourage children to become creative" (Dredge, 2014).

Because of the continued importance of hobbyist-level interactive game programming, it is concerning that it has been subject only intermittently to critical scrutiny. This neglect has meant that current research on interactive gaming, and on interactive computing in general, is incomplete. This thesis will demonstrate that text-based interactive gaming in fact played a major role in shaping how digital computing technologies have changed and evolved since the earliest mainframes were developed in the 1940s, particularly with respect to interactivity. Many of the earliest interactive programs were actually games, with much of the important work in this area happening at the RAND Corporation, the United States Air Force-funded research institution based in Santa Monica, California. RAND researchers developed a form of computer gaming called "operational gaming" that they used to study hypothetical wartime scenarios, with a particular focus on logistics. They also created their own interactive network computer system – one of the first of its kind – that they used to play these games. RAND's practices were later adopted by businesses and educational institutions for game-based learning exercises, and from there they spread to hobbyist programmers working on home computer systems. The hobbyist games described above thus inherited a framework for play that was originally designed for militaristic purposes.

Central to this framework is the concept of the computational, mathematical model. The use of models to determine mathematical outcomes is borrowed from classical physics, and in fact the first computers were developed in order to solve complex mechanical problems. Game models are capable of accepting inputs, processing them via a series of equations (written in a given programming language), and then providing output data. The user of such a model is expected to run it for a finite set of "turns". At each new turn, the output data from the previous turn is meant to influence the input data that the user subsequently provides. Note that in the Depth Charge game described above, the user adjusted the location of each "shot" based on the feedback given in the previous round. In an economic or resource management game – a genre that was among the most important and influential in operational and hobbyist gaming – quantities such as money and resources fluctuate based on the model's calculations at each turn, thus limiting the player to using whatever quantities are available. Importantly, the equations used in such models tend to be hidden from the player(s). The general idea is that the players are supposed to learn what works and what does not work in a given scenario, without having access to the underlying rules that govern that scenario. As Noah Wardrip-Fruin noted when discussing the model-based game SimCity, "Successful play requires understanding how initial expectation differs from system operation, incrementally building a model of the system's internal processes based on experimentation" (Wardrip-Fruin, 2009, p. 302). To "play" an operational, model-based game was to experiment with it, which is why such games were thought to be such effective teaching tools.
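The turn structure described here can be illustrated with a short sketch. The following Python fragment is a hypothetical, Hammurabi-style resource model rather than a transcription of any historical game; the starting quantities and the yield and consumption formulas are invented for the example.

import random


def run_economy(turns=5):
    # A minimal turn-based resource model: each turn's output constrains the next turn's input.
    grain, population = 1000, 50  # illustrative starting quantities
    for year in range(1, turns + 1):
        print(f"Year {year}: {population} people, {grain} bushels in store.")
        planted = max(0, min(int(input("Bushels to plant? ")), grain))
        grain -= planted
        harvest = planted * random.randint(1, 5)   # the hidden yield equation
        eaten = population * 15                    # the hidden consumption equation
        grain += harvest - eaten
        if grain < 0:
            print("Famine: the city starves. Game over.")
            return
        population += random.randint(0, 10)        # newcomers arrive while the city survives
    print(f"After {turns} years: {population} people and {grain} bushels remain.")


if __name__ == "__main__":
    run_economy()

The equations are plainly visible in the source, but a player who interacts only through the prompt experiences them solely as turn-by-turn feedback, which is precisely the experimental mode of "play" described by Wardrip-Fruin.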

The type of computing that was labeled "creative" in Ahl's publication was thus a very specific form of interactive programming largely based around interactions with fixed mathematical models. This thesis will therefore demonstrate that model-based, operational gaming offers only a very limited form of interactivity, particularly for those users who only "play" with models, as opposed to coding them. Such players are only allowed to pass data values to these games when specified in the game code, and only in the format(s) expected by the game. In the Depth Charge game, for example, players could, when prompted, enter three coordinate values, separated by commas, which the game would then use to determine whether a "hit" was made. Yet this is the only information that the player could provide while the game was running. They could not, for example, pass their own instructions to the game, or provide it with new code to execute. They could, of course, exit the program to perform these actions. But then the game was no longer running, and the player would probably no longer be considered a player, and would instead be cast as a programmer. I will argue that all of these divisions – player versus programmer, running versus non-running programs, limited interactivity versus more open-ended interactivity – are artificial, and exist only because of the specific characteristics of operational gaming. If operational gaming had not been as influential as it was in terms of shaping the forms of human-computer interactivity that first emerged in the 1960s and 1970s, such divisions might not exist.

There are, in fact, alternative forms of interactive programming, one of which will be looked at in detail here: the interactive programming system (IPS) paradigm, developed across several programming language research projects over the course of the 1970s and 1980s, with Smalltalk and Interlisp being the most prominent examples. IPS designers envisioned programming as a workshop-style design process in which users would create "components" out of code that they could configure and control in various ways in order to produce desired outputs. Rather than running individual programs, users would pass information to specific components, which would adjust variables from a global pool. The overall effect was to create an "ecosystem" of tools, components, and data that were always under the control of the user. Barriers to full interactivity were removed, so that roles such as "player" and "programmer" had no meaning. A consequence of this, of course, was that all IPS users needed some programming knowledge in order to use a given system. But given current efforts to revive hobbyist programming, this is in fact an ideal situation. In an IPS, a user can both develop and play games at the same time, creating new components, or modifying existing components, as needed. Playing a game, moreover, does not put the user in a restricted mode from which they can only behave as the running program dictates. Rather, the same skills a user builds while programming are also used when playing. As users develop and play, moreover, they create their own "universes" of components tailored to their needs, to use a term coined by Smalltalk designer Alan Kay (see Kay, 1969).
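A rough sketch can suggest what this looks like in practice. Here an ordinary Python session is used purely as a present-day analogy for an IPS; it is not Smalltalk, Interlisp, or Hail. Every definition lives in one shared environment that the user can inspect, call, and redefine at any moment, so there is no separate "running game" that locks the player out of the code.

# All of the following names sit in a single shared environment.
world = {"gold": 100, "troops": 20}          # state that every component can see


def recruit(n):
    # One small component: spend gold to recruit troops.
    cost = 5 * n
    if cost > world["gold"]:
        return "not enough gold"
    world["gold"] -= cost
    world["troops"] += n
    return world


def raid():
    # Another component; components freely build on one another.
    world["gold"] += 10 * world["troops"]
    return world

# "Playing" is simply calling components in whatever order the user likes:
#     >>> recruit(10)
#     >>> raid()
#     >>> world
# and "programming" is redefining a component in the middle of the same session:
#     >>> def raid():
#     ...     world["gold"] += 25 * world["troops"]   # the user changes the rules
#     ...     return world

Nothing in such an environment distinguishes a "player" from a "programmer"; both are simply users of the same pool of components, which is the arrangement the IPS paradigm sought to institutionalize.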

A major caveat to the IPS approach, however, is that none of the systems that were developed in the 1970s and 1980s became popular among programmers of the time. Few, in fact, became fully realized IPSs. Of those that did, only Smalltalk is still supported, though it only exists in a variety of obscure dialects such as Squeak and Pharo. Complexity was probably a major factor in the decline of IPSs, given that they were mostly based around highly complex languages. In this thesis, however, I will demonstrate how a modern IPS could be built that relies on a much more accessible underlying language. I will then describe a prototype IPS system, Hail Workshop, that I have been developing over the course of preparing this thesis. Hail Workshop is a windowed, text-based system that relies on an underlying language (simply called "Hail") that supports a simple syntax, similar to those of BASIC and Python, while also enabling the creation of code and variable-based components that are stored in a persistent memory. A session in Hail Workshop can be saved and later retrieved, restoring all components that were created earlier. Users come to take ownership of the Hail Workshop session they are interacting with, giving them a level of agency over their code that cannot be achieved with more familiar programming "studios" such as Visual Basic and Xcode. Hail Workshop is intended to serve as a framework for an IPS, and is designed to be extensible.
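The persistence described here, saving an entire session of components and restoring it later, can be approximated in Python as a point of comparison. The sketch below uses the standard pickle module purely as an analogy for Hail Workshop's saved sessions; it is not Hail syntax, which is documented in Appendix A.

import pickle


class Workspace:
    # A toy stand-in for a persistent session: a named pool of components.
    def __init__(self):
        self.components = {}

    def define(self, name, value):
        self.components[name] = value

    def save(self, path):
        # Persist every component so a later session resumes where this one ended.
        with open(path, "wb") as f:
            pickle.dump(self.components, f)

    def load(self, path):
        with open(path, "rb") as f:
            self.components = pickle.load(f)


if __name__ == "__main__":
    ws = Workspace()
    ws.define("world", {"gold": 100, "troops": 20})
    ws.save("session.pkl")        # end of one sitting...
    later = Workspace()
    later.load("session.pkl")     # ...restored at the start of the next
    print(later.components["world"])

The point of the analogy is only that the unit being saved is the user's whole working environment, not an individual program.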

The overall goals of the present study, then, are to demonstrate the limitations of interactive digital gaming by exploring relevant historical events in digital and pre-digital computing, and to describe a prototype system for interactive game development which accords equal agency to all users and allows such users to build systems, rather than just programs. The prototype system will draw from the historical component of my work, in that it is an attempt to address the major flaws in the style of gaming that emerged in the 1960s and was popularized in the hobbyist home computer era. This thesis thus challenges prevailing norms within interactive gaming and game development, and offers an approach that enhances and equalizes agency for all types of users. It encourages a shift in thinking in which the concept of creating programs is replaced with the concept of creating systems, thereby enabling users to incrementally build "parts" that they may use and reuse to both build and play games. A larger goal of this work is to promote the notion of studying the history of digital computing in order to better understand contemporary computing practices. Aspects of such practices that tend to be considered as "settled" or somehow inevitable are, as will be seen, deliberate constructions, with viable alternatives.

Background

The first American digital computers emerged in the wake of the Second World War as responses to wartime military needs. Both the ENIAC and EDVAC (see above) were commissioned by the United States Army to calculate the ranges of artillery weaponry, a complex problem that may be solved via numerical analysis – that is, by performing basic arithmetic calculations across a range of values (Stern, 1981; and Polachek, 1997). The digital computer was thus a specific type of problem-solving machine. It was flexible enough to accommodate a variety of uses, but only if a given task could be broken down into steps solvable by numerical methods. While digital computing changed drastically over the next several decades, the digital computer remained, and remains to this day, an automated numerical analysis engine.
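As a toy illustration of what "performing basic arithmetic calculations across a range of values" means here, the following Python sketch estimates a shell's range by stepping its motion forward in small time increments. The model is deliberately simplified (no air resistance, flat terrain) and is not the Army's actual ballistics mathematics; it is meant only to show the repetitive, step-by-step arithmetic involved.

import math


def range_of_shot(speed, angle_deg, dt=0.01, g=9.81):
    # Estimate a projectile's range with a simple Euler integration.
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    while True:
        # Each new state is computed arithmetically from the previous one.
        x += vx * dt
        y += vy * dt
        vy -= g * dt
        if y <= 0.0:
            return x


if __name__ == "__main__":
    # Tabulating many such cases, by hand, was the work of the human computers.
    for angle in (15, 30, 45, 60):
        print(f"{angle:2d} degrees -> {range_of_shot(300.0, angle):9.1f} m")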

These early computers could only be operated by one operator at a time, and programs were written and read via magnetic tape and punched cards. The early 1960s, however, witnessed the emergence of the first computer networks, allowing multiple users to connect with central servers via "terminals" that were composed of keyboards and printers (monitors would not become common until at least the mid-1970s). For the first time, multiple users could simultaneously interact with the same computer in real time, issuing commands and receiving almost immediate feedback. The first such networks emerged at MIT and the Cambridge-based high-tech firm Bolt, Beranek, and Newman (BBN), but quickly spread to other centres (Wildes and Lindgren, 1985). Key to the concept of time-sharing was the notion of having a "conversation" with a given computer, mediated through the terminal (Orr, 1968; and Fass, 1969). It was believed by some in the field that such conversations would allow users to perform tasks and solve problems much more quickly, and thus result in "man-computer symbiosis" as defined by J. C. R. Licklider (Licklider, 1960). Computers were thus seen by such advocates as being more than mere numerical calculating machines. Rather, they were tools to support and enhance one's ability to "think" – that is, to both develop and solve complex problems. Yet the underlying architectures of networked computers, apart from the functionality needed to maintain a network, hardly differed from those of the machines that preceded them. This was largely a perceptual shift, as opposed to a technical shift.

Along with time-sharing technologies, changes in how computers were programmed were making them more accessible to non-expert users. The introduction of "high-level" programming languages such as FORTRAN (developed in the mid-1950s) and COBOL (developed from 1959-1960) enabled users to forego complex machine code and phrase their problems using logical, formulaic syntaxes (Sammet, 1969). In the 1960s, new languages emerged that leveraged time-sharing technologies to become fully embedded within interactive systems. The Johnniac Open-Shop System (JOSS), developed at the RAND Corporation, was a time-sharing network that contained its own high-level language that users could interact with directly, or else use to build (and modify) programs (Baker, 1966; and Marks, 1982). The JOSS language was borrowed and adapted by a variety of private companies and academic institutions, extending its influence. In 1964, mathematicians John Kemeny and Thomas Kurtz, based at Dartmouth College, created BASIC, an interactive programming language that was embedded within their larger Dartmouth Time-Sharing System (DTSS) computer network (Kemeny and Kurtz, 1985; and Sammet, 1969). As with JOSS and its variants, users could interact with BASIC both by issuing direct commands, and by building and executing larger programs.
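The two modes of use described for JOSS and BASIC, issuing direct commands versus building and running stored programs, survive almost unchanged in modern interactive languages. A brief Python illustration follows; Python is used only as a present-day analogue, not as a claim about JOSS or DTSS syntax.

# Direct mode: at the interactive prompt, a statement runs the moment it is entered.
#
#     >>> 2 * (1100 + 350)
#     2900
#
# Program mode: statements are stored under a name and executed on demand,
# the equivalent of typing RUN against a stored BASIC listing.


def yearly_budget(rent, food, months=12):
    # A stored "program" that can be run repeatedly with new inputs.
    return months * (rent + food)


if __name__ == "__main__":
    print(yearly_budget(1100, 350))   # running the stored program prints 17400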

These accessible, high-level languages were being developed at a time when computers were just beginning to be used outside of major corporations and research institutions. The invention of the transistor allowed computers to become much smaller and less expensive, and therefore accessible to a wider range of users. Companies such as International Business Machines (IBM) and DEC began to market their "minicomputers" to institutions that could not or would not have supported their older mainframes. A major inroad was made into public school systems. DEC took their PDP-8 computer, packaged it with a BASIC interpreter and other applications and tools, and created the EduSystem series, a "complete line of computer systems tailored to the needs of schools and colleges" (EduSystem Handbook, 1973, p. v). David Ahl, in his time at DEC, began circulating BASIC games created by EduSystem users, particularly those written by students. Ahl went on to publish several volumes of such game programs, connecting a substantial library of BASIC games to a burgeoning market of home hobbyist programmers. The relationship between educational programming and hobbyist programming has always been close, so much so that their fortunes seem to rise and fall together.

In the mid-1970s, Microsoft co-founders Bill Gates and Paul Allen would develop their own version of BASIC, based in part on a DEC version of BASIC, for the Altair 8800 "microcomputer," a pioneering machine manufactured by Micro Instrumentation and Telemetry Systems (MITS) and sold by mail order (Allison, 1993). The Altair 8800 helped to catalyze interest in "personal" home computing, and would be followed by machines such as the Commodore PET, Apple I, and Tandy's TRS-80 (Ceruzzi, 2003). Gates and Allen generalized their product to create Microsoft BASIC, which they then licensed out to these manufacturers for use on their own systems (Allison, 1993). Microsoft BASIC, while customized somewhat on each new platform, thus became something of a lingua franca at the beginning of the hobbyist computer era. Most personal computer systems built from the mid-1970s to the mid-1980s ran a form of BASIC-based operating system that worked much like the DTSS described above. Users could type in direct commands, as well as create whole programs. Programs would be stored in memory, and would run when the user typed in the "RUN" command, or something similar. A burgeoning trade in hobbyist programs emerged out of these circumstances, and a variety of books and magazines were published that included program listings (see chapter four). As already noted, games were by far the most popular types of programs that were created and distributed. When game programs were run, users would generally play them by entering information when prompted, as in the Depth Charge program shown above. These text-based games flourished in the earliest years of the hobbyist era, but by the early to mid-1980s graphics-based games also became popular. Specific genres, such as the "resource management" game (to be described in later chapters), emerged, and thus specific code elements were often borrowed and adapted.

The decline of hobbyist BASIC programming occurred gradually over the course of the late 1980s and early 1990s. As computers increased in power and sophistication, it no longer made sense to outfit them with simple, BASIC-based operating systems. This shift was evident in part with the rise of "IBM PC compatible" or "IBM clone" computer systems manufactured by companies such as Compaq and Olivetti. Such systems leveraged the IBM PC's popularity in the business marketplace and gradually built a base of home consumers. They were helped in these efforts by Microsoft, which licensed the operating system it created for the IBM PC – called "Personal Computer Disk Operating System," or PC-DOS – to PC clone vendors under the name "Microsoft Disk Operating System," or MS-DOS (Anthony, 2011). MS-DOS offered a sophisticated command-line interface – borrowed from CP/M, an earlier operating system – that focused on file manipulation and system tools (Hunter, 1983). Programming was thus not built-in, and programming systems had to be called upon specifically in order to compile and/or run code. Concurrent with these events was the emergence of the Apple Macintosh and Commodore Amiga computers, which placed heavy emphasis on graphics, including "windowed" displays and graphical user interfaces (GUIs), thereby putting pressure on PC manufacturers to prioritize graphics as well. Given the limitations of high-level languages in this period, most graphics-based programs were built in low-level machine code, which was far more difficult for hobbyist users to learn and use. Programming periodicals such as COMPUTE! responded by listing machine code programs, but these were much more daunting than the older BASIC programs. As commercial software, particularly games, became more sophisticated, printing entire programs in books and magazines became infeasible. The impetus to teach programming in schools was also largely lost. In a study of participation in STEM (science, technology, engineering, and mathematics) courses at the high-school level conducted by the National Center for Education Statistics in the United States, computer science was the only subject that showed declining enrollment between the years 1990 and 2009 (Nord et al., 2011). Another study noted the decline in computer science education throughout all grades of schooling, and that most schools that did teach the subject focused on the use of existing software applications, not on programming (Wilson et al., 2010).

While these studies demonstrated that computer science education in schools was largely on the decline, the fact that such research was conducted reflects a renewed concern for such education. As in the past, the current revival in hobbyist programming is connected to a fresh sense of urgency to teach programming in schools as part of a broader push to promote STEM-based learning in general. Three related arguments motivate calls for an increased emphasis on STEM. The first is the perception that the West has witnessed a pronounced, and irreversible, "decline" of "traditional manufacturing industries in the face of competition from low-wage, low-skilled Asian economies," meaning that future economic growth will come almost exclusively from a "range of high-skilled fields including financial services, telecommunications, biotechnology and aerospace" (Orpwood, Schmidt, and Jun, 2012, p. 7). The second is the notion that "knowledge and proficiency in the areas of science, technology, engineering and mathematics…are closely related to a country's capacity to compete" in these emerging fields (Orpwood, Schmidt, and Jun, 2012, p. 8). Finally, it is argued, faulty or misguided curricula in schools and universities result in "relatively low rates of participation in STEM fields," particularly when compared to developing countries such as China and India (Orpwood, Schmidt, and Jun, 2012, p. 10). As a result, it is believed that "investments in STEM literacy are crucial for developing a skilled society that is prepared to respond to an uncertain future" (Expert Panel on STEM Skills for the Future, 2015, p. xiii).

As a result of these concerns, various educational institutions have made programming education a priority once more. In 2014, the United Kingdom became the first country to make programming a mandatory component of state school curricula (Dredge, 2014). A similar initiative is planned for Chicago's public schools, and other major North American cities have also expressed interest in making some programming or computer science education mandatory at the K-12 level of schooling (Rampell, 2014). Elsewhere, private training centres have emerged to teach programming to children and other learners, and individual teachers have introduced coding into their classrooms (Kohli, 2015; and Oliveira, 2014). Non-profit organizations such as Code.org also attempt to generate global interest in programming with events such as their "Hour of Code." In all of these contexts, programming is largely presented as a panacea for the supposed ills of traditional school pedagogies, and a means to better prepare children for employment in the service-based, "post-industrial" workforces of the future.

This resurgence in the teaching of programming in schools is mirrored by developments in hobbyist computing that suggest a return, at least in part, to the practices of the 1970s and 1980s. Among the most important of these developments was the creation of the Raspberry Pi (RPi) computer, which was first released in 2012. The RPi is a simple, single-board computer that generally retails for under fifty dollars (CAN). The project was conceived by Eben Upton, a former Director of Studies in Computer Science at the University of Cambridge. Noting that many incoming students lacked basic programming skills, Upton wished to design a cheap computer that could be used by hobbyists as a platform for experimentation and simple coding. It was thus a deliberate attempt to reproduce the technologies and practices of the old hobbyist era (Upton and Halfacree, 2014). The RPi has been a major success, selling over 5 million units as of early 2015 (Lomas, 2015). A second model, the Raspberry Pi 2, with enhanced processing power and additional peripheral ports, was released in February of 2015. A third model was released the following year.

Along with such hardware, hobbyist computing has benefited from the development of new programming languages and systems that cater in various ways to casual and novice programmers. The most powerful and versatile of these, arguably, is the Python language, originally designed by Guido van Rossum while at the Dutch research centre Centrum voor Wiskunde en Informatica (CWI). Van Rossum describes Python as "a high-level programming language that happens to be implemented in a way that emphasizes interactivity" (Venners, 2003). Its design was influenced by the language ABC, which was developed earlier at CWI and was "intended to be a programming language that could be taught to intelligent computer users who were not computer programmers or software developers in any sense" (Venners, 2003). Python is thus a highly accessible language, and lacks the complex syntactic constructions required in languages such as C++ and Java. It therefore serves as a vital entry point into programming for many aspiring new hobbyists, much as BASIC did in an earlier era. Python is in fact positioned as one of the primary programming languages for the Raspberry Pi; the "Pi" in its name is meant to refer to Python, and Sweigart's work, cited above, leads readers through the development of text-based games using the language.

The present context, then, is not identical to that of the older hobbyist era, but shares much in terms of practices and aspirations. The overall impetus is to make programming more accessible to novice users, and to present it as both a fun and rewarding endeavour. And, once again, a key strategy employed by contemporary teachers, authors, computer scientists, hardware and software manufacturers, and other vested interests is the promotion of interactive game programming. Such games are often printed in books or magazines, though online distribution is a major new outlet. They cannot compare to commercial game releases, but they are not meant to. Rather, they are meant to be exercises in basic interactivity, enabling players to engage with running code by setting values, making selections, and otherwise passing along simple data elements, including directional data in games with animated graphics. The types of interactions that are allowed are determined by the programmer, and players may not send any information other than what is specified in the code. This holds true for the programmer as well when their code is running. This type of dichotomization between programmer and player/user was developed on early time-sharing networks, was adopted readily by older hobbyist programmers, and remains largely unchallenged.

The Current Study

Historical digital gaming has lately received an increasing amount of scholarly attention. However, in research that focuses more generally on the history of digital computing technologies, or on science and technology in the Cold War and post-Cold War eras, gaming is generally overlooked. This is particularly striking given that researchers in such fields often come close to discussing the importance of gaming, but fail to truly broach the subject. Paul Edwards, for example, in his sweeping study of the role of the United States military in shaping computer science and engineering throughout the Cold War period, discusses the research conducted at the RAND Corporation in some detail. He mentions their work in game theory, simulation, strategic analysis, and other topics that relate very much to gaming, but never actually refers to the extensive war gaming that they also conducted; at one point he even makes reference to the "chess-style war games of previous eras," omitting the fact that RAND was very much playing such games in the Cold War era (Edwards, 1996). Ceruzzi, in his general history of digital computing, cites Stewart Brand's seminal Rolling Stone article on Spacewar at length, but treats gaming largely as a frivolous pursuit of little significance. When discussing some of the earliest personal computers, for example, he claims that "[t]hey were useful for playing games and for learning the rudiments of computing, but they were not good enough for serious applications" (Ceruzzi, 2003, p. 266). Slayton, meanwhile, details the creation in 1961 of the Office of Systems Analysis by United States Secretary of Defense Robert McNamara, but fails to discuss his creation of the Joint War Games Agency at the Pentagon two years later (Slayton, 2013; also Allen, 2012). In all of these cases, I would argue, game-related research would have supported the authors' overall theses, and would not have interfered with their respective narratives.

By not discussing gaming, in fact, such scholarship may provide only an incomplete understanding of the computing technologies under investigation. To return to Edwards, he attempts to connect RAND's interest in digital computing with its work in modeling, simulation, and analysis. Yet all he can really do is make vague assertions about how computers could allow for more detailed simulation models (Edwards, 1996). While this is not wrong, he does not even mention what was arguably RAND's most important contribution to early computer engineering: the JOSS network. As noted above, JOSS was one of the most powerful networks of its kind, and had lasting influence. Since it distributed computing power to multiple sites, moreover, it was also an ideal platform upon which to play war games, and such games came to be just as important to RAND as their more analytic approaches to war-related problems. Edwards, however, does not mention JOSS at all in his work, and, as such, his discussion of RAND's advancements in computer engineering is rather limited. If he had included information about RAND's war-gaming efforts, however, he almost inevitably would have had to bring in JOSS.

The present study will advocate for the importance of gaming with respect to digital interactivity by adopting Woolgar's concept of technology as text. Rejecting any sort of "rhetorical distancing between subject and object," Woolgar advocates for treating technological artefacts as being fully embedded within larger networks of experience, configured by their designers to express certain meanings, but malleable to the point that users can "read" and interpret them in ways that their designers did not intend or anticipate (Woolgar, 1991). He then offers three specific interpretations, or "responses", of the technology as text paradigm itself, each of which points towards specific research methodologies. The "instrumental" response entails the study of how a given technological artefact influences the larger contextual environment within which it is embedded. The "interpretivist" response involves the study of the writers and readers themselves, and the ways in which they influence the construction, interpretation, and reconstruction/reinterpretation of a given technology text. Finally, the "reflexive" response considers the role of the researcher, and how the study of a given technological text is necessarily influenced by the characteristics of the text itself, and the types of readings it offers. This involves the de-emphasis of scholarly authority such that academic research is considered as simply one more text (or multiple texts) in a network involving the artefact in question. While this approach problematizes traditional perspectives on scholarship, it also opens up the possibility of creating alternate forms of text that may stand as novel interpretations of an artefact.

This dissertation will involve all three of Woolgar's responses in a research methodology in which computer programs will be interpreted as technological texts. This does not simply entail close readings of source code, though source code will be looked at. In addition, however, the contexts within which programs were written and executed will be under investigation. This includes the study of programming languages, programming systems, operating systems, and interfaces, as well as the engineers, scientists, and organizations that designed and built specific systems, and a wide range of printed texts, including manuals, reports, reference works, and hobbyist programming books and magazines. Networks of technology texts, producers, and users will be delineated and mapped, so that connections may be drawn between artefacts that were developed in the earliest years of digital computing, those that were developed on personal computers in the 1980s, and those that are created in contemporary computing environments. In addition, in keeping with the reflexive interpretation, an alternative programming system, capable of producing new forms of program texts, will be proposed and discussed. This dissertation thus not only serves as a singular text, but also outlines a process through which additional texts – that is, programs – may be produced as continued responses to the same issues.

While Woolgar's technology as text model supports the basic framework of this study, the methodological approaches used at each stage are drawn from a variety of traditions. I root much of my work in the field of software studies, and the related field of critical code studies. While based in part on the work of Manovich, my perspective aligns more closely with the research conducted by Chun on digital neoliberal discursive spaces, as well as Kitchin and Dodge's work on code-based spatiality (Manovich, 2001; Chun, 2011; and Kitchin and Dodge, 2011). When performing documentary research and analysis, I will engage in what David Altheide calls "ethnographic content analysis," or "qualitative document analysis" (Altheide, 1987; and Altheide et al., 2008). When studying the trade in program code in the hobbyist era of the 1970s and 1980s, I will adapt ideas from Schön's concept of design worlds (Schön, 1988). When building a new programming system as described in chapter six, I was inspired in part by McCarty and Mahoney's work in digital humanities, and in particular their call for scholars to design their own digital models in order to better understand the problems they are researching. As Mahoney notes, "[t]he future of digital scholarship depends on whether we can now design computational models of the aspects of the world that most interest us" (Mahoney, 2005, p. 33).

This study will unfold across several stages. Each stage is approached differently, but all are linked by common themes and the general evolution of several related arguments. Specifically, I will advance the following claims:

• Digital computation as it exists today was not an inevitable or natural development. Rather, it is the product of crucial choices made by select individuals in the "human computer" era, a time when groups of people were organized to solve complex mathematical problems by hand. I will focus on one of these organizations – the Mathematical Tables Project – and show how it was designed around hierarchical, Fordist principles, meaning that individual workers had little to no agency over the rote arithmetic work they were assigned by managers. I will show how it was this form of computation that went on to inform the design of digital computers, as well as the consequences of this choice.

• The RAND Corporation played a major role in the early development of interactive computer games. RAND developed the concept of gaming against a mathematical model (see above) out of their efforts to quantitatively predict the outcomes of potential future wars against Cold War enemies. RAND actively promoted this form of gaming, which they called "operational gaming," to private-sector corporations, which adapted it for management-related tasks. From there, operational gaming spread into educational institutions, and eventually to home hobbyist programmers.

• Along with operational gaming, RAND developed a form of computer interactivity that has become dominant even in modern systems. The JOSS system included a command called "Demand," which allowed users to send data to a program while it was executing. This meant that users could interact with such programs without having to modify the original code. Importantly, such users did not even need to understand how a given program functioned, as long as they knew how to interact with it. This is a powerful paradigm that was echoed by later languages such as FOCAL and BASIC, and it lies at the foundation of most modern computing practices.

• The home hobbyist game programming movement of the 1970s and 1980s was a product of the continued spread of the operational gaming paradigm outside of RAND. Early games such as Hammurabi were adapted directly from operational games designed and developed by organizations that had inherited the RAND model, either directly or indirectly. The rise of BASIC, and in particular Microsoft's version of BASIC for personal computers, created something of a common language for hobbyists, and so text-based operational games were created and traded in print and on physical media such as cassette tapes and floppy disks. The nature of this trade, as well as the relatively simple structure of the BASIC programs themselves, meant that code was often adopted and adapted from one game to another, so that specific genres emerged, and program design became somewhat modular, resembling a larger design world as discussed by Schön.

• Because of its roots in BASIC-based interactive computing, hobbyist game programming reified the notion of interacting with running code (using the "INPUT" command instead of Demand). As a consequence, a divide emerged between the roles of game developer and player. While developers worked with game code, players simply interacted with running games wherever the developer had incorporated an INPUT statement. This "modal" model of interactive computing allows for certain efficiencies, but at the cost of severe constraints imposed on most users; a minimal sketch of the pattern appears immediately after this list. Contemporary computing practices echo this divide, so that most users never see, or have access to, the code of the programs that they run.

• An alternate form of interactive computing emerged in the 1970s and 1980s, in part as a challenge to modal forms of interactivity, that provided tools to build not just programs, but programming environments. These interactive programming systems (IPSs) enabled users to develop their own code "toolkits" by building modules and storing them within a larger system. Users could then configure or reconfigure networks of components, and call them in specific sequences to play a game or otherwise experiment with the available code. IPSs eliminate modal computing, and blend the roles of programmer and player.

• Learning from the difficulties encountered by the first IPSs, it is possible to build such a system for interactive, text-based games, thus allowing for a more equitable, yet also more powerful, form of interactive computing. The prototype system presented in this dissertation, called Hail Workshop, demonstrates one potential form of IPS for text-based games. Built in the JavaScript language, Hail Workshop provides a windowed, command-line interface that enables the construction of code modules in the Hail programming language. The modules are stored in memory, and may be called at any time by the user to perform necessary tasks. By dividing a single text-based game into such modules, it will become clear that an IPS allows for much more flexibility in terms of both designing and playing games. Hail Workshop takes one step towards rectifying the problems with interactivity that have emerged over the past several decades, doing away with the INPUT-style commands that funnel users into different roles in other interactive languages. While largely text-based, moreover, Hail Workshop presents features that can also be adapted for graphical environments.
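To make the modal pattern described above concrete, the following sketch is purely illustrative: it is written in JavaScript (the language used to build Hail Workshop) rather than in BASIC or JOSS, and its prompts, names, and numbers are invented for the example rather than drawn from any historical program. It shows how the developer fixes the points at which a running program pauses and demands data, in the manner of JOSS's Demand and BASIC's INPUT, while the player merely supplies values at those prompts.

    // A minimal sketch of "modal" interactivity: the program, not the player,
    // decides when and what to ask. Intended to be run under Node.js.
    const readline = require("readline");
    const rl = readline.createInterface({ input: process.stdin, output: process.stdout });

    // Wrap rl.question in a Promise so the turn logic can await each answer.
    function ask(prompt) {
      return new Promise((resolve) => rl.question(prompt, resolve));
    }

    async function playTurn(state) {
      // The player acts only where the developer placed a prompt.
      const acres = Number(await ask("How many acres do you wish to plant? "));
      state.grain -= acres;       // simplified bookkeeping, purely illustrative
      state.grain += acres * 3;   // fixed yield, purely illustrative
      console.log("You now have " + state.grain + " bushels of grain.");
    }

    async function main() {
      const state = { grain: 2800 };
      await playTurn(state);
      rl.close();
    }

    main();

The point of the sketch is not the arithmetic but the structure: the code that defines the game is closed to the person playing it, who can only feed it data at the openings the developer chose to expose.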

Chapter Overview

Chapter 1 will provide a review of prior literature related to both the subject matter I have chosen for this thesis and the theoretical framework(s) I have employed. In terms of subject selection, my work bridges several areas of research, including digital game studies, the history of digital games, hobbyist computing (historical and modern), and the history of Cold War-era technology. In order to bring all of these topics together, I built a theoretical framework that will also be discussed in this chapter. Informed by a wider body of literature in Science and Technology Studies (STS), my approach builds upon existing literature in the field of Software Studies by incorporating Woolgar's concept of technology as text, all of which will be explained in more detail below.

Chapter 2 will provide a history of interactive computing that elucidates the importance of gaming as a major factor in shaping how such interactivity evolved. It will begin in the 1930s with the creation of the Mathematical Tables Project, a "human computer" organization that did much to influence the design of later electronic computers. Employing Altheide's notion of ethnographic content analysis, I will then examine the emergence of interactive computing at institutions such as MIT and the RAND Corporation (Altheide, 1987; and Altheide et al., 2008). RAND will then become a major focus, particularly its development of operational gaming as performed on its JOSS interactive computer network. My overall argument in this chapter is that interactive digital computing was the product of key decisions made by individuals and organizations that had the power to enact change, and that RAND's focus on gaming pushed interactivity in a specific direction in which user agency was split across various designated roles.

Chapter 3 will pick up this narrative, focusing on the hobbyist game programming era from the mid-1970s to the mid-1980s. I will begin by discussing how RAND's operational gaming model spread first to private-sector businesses in the form of "management games," then to educational institutions under different guises, and finally to the growing market of personal computer users. I will then focus on how hobbyist games were developed and distributed by applying Schön's design world paradigm via the close reading of several games from the period. I will demonstrate how the BASIC programming environments implemented on most home computers served to foster the development of game programming while also propagating the splintered user agency that emerged during RAND's development of operational gaming. Hobbyist game programming thus flourished for a short period, but declined as the roles of user and developer became increasingly distinct.

Chapter 4 will look at the history and development of interactive programming systems in detail. It will first chronicle the major sources of inspiration for the IPS movement: Douglas Engelbart's oN-Line System (NLS), and Seymour Papert's concept of the programming "microworld." The development of Lisp, and its later, environment-based dialect Interlisp, will also be considered, as will the foundational literature related to IPS development. Finally, the IPSs developed for the Xerox Alto experimental computer will be examined, with a particular focus on Smalltalk. I will demonstrate how these IPS developers were able to conceive of an entirely different model of computing as compared to the typical programming environment, one in which users built "universes" of components tailored to their needs, and where the roles of programmer and user/player were conflated, so that agency could not be limited or otherwise splintered across roles. Yet I will also discuss the negative aspects of Smalltalk, and argue that it failed to become popular largely due to its complexity.

In chapter 5, I will call for the development of new interactive programming systems, and argue that IPSs that are built around gaming are of particular importance with respect to cultivating a new hobbyist programming culture. I will first discuss the importance of gaming for exploring the potential of programming systems. I will then consider how a particular text-based game from the hobbyist era, Hammurabi, a resource management and strategy game, would be organized if it were to be programmed and played within an IPS. Finally, after describing specific modern games and programming systems that offer IPS-like features, I will describe a prototype IPS that I designed specifically for text-based games. After delineating the major elements of this IPS, I will describe the process of creating a Hammurabi program based on the principles discussed earlier in the chapter, and how the game might be played and expanded upon.
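As a rough illustration of the kind of organization this chapter works towards, the sketch below shows how a single turn of a Hammurabi-style game might be decomposed into named modules held in a shared registry, so that the person playing the game can call, inspect, or replace any of its rules from within the same environment. It is written in plain JavaScript and should not be read as the Hail language or as the actual design of Hail Workshop; the module names and rules are invented for the example.

    // A hypothetical sketch of an IPS-style decomposition: the rules of the game
    // live in a registry of named modules that can be redefined at any time.
    const modules = new Map();

    function define(name, fn) {
      modules.set(name, fn);              // store (or replace) a module by name
    }

    function call(name, ...args) {
      return modules.get(name)(...args);  // invoke a stored module by name
    }

    // Each rule is itself a module, open to inspection and change.
    define("harvest", (acresPlanted) => acresPlanted * 3);
    define("feed", (population, bushels) => Math.min(population, Math.floor(bushels / 20)));
    define("turn", (state) => {
      state.grain += call("harvest", state.acresPlanted);
      state.fed = call("feed", state.population, state.grain);
      return state;
    });

    // Playing and programming happen in the same space: swapping in a new
    // harvest rule changes the game without touching any other module.
    define("harvest", (acresPlanted) => acresPlanted * 5);
    console.log(call("turn", { grain: 2800, population: 100, acresPlanted: 200, fed: 0 }));

In a sketch like this there is no privileged program text that only a developer may touch; there is only a growing collection of modules that any user of the environment may recombine, which is the quality the prototype described in this chapter aims to preserve.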

This thesis will end with a discussion of how the ideas presented here can be translated to graphics-based computing, and how the event-based model of interactivity prevalent in graphical systems compares with text-based, command-line interfaces. I will argue that graphical interactivity is in many ways even more limiting than command-line interactivity, and that a blended form of interactivity is preferable with respect to giving users agency over their systems. I will also discuss the importance and influence of pen-and-paper role-playing games in digital gaming, and how IPS-based role-playing games could incorporate the more improvisational aspects of pen-and-paper games that are difficult to capture with traditional digital game development tools.

1 Literature Review

Due to the nature of my research, this literature review will be divided across several disciplines and genres. The first will be history, and more specifically the history of computing technologies, with a focus on gaming. While I am not conducting a rigorous historical analysis in any specific area, I do need to discuss certain historical events, and documentation related to those events, in order to analyze themes key to my overall thesis. The period in question spans roughly from the 1930s to the 1980s, though I will be focusing more on postwar events. In the early years, the computing technologies that I will highlight were largely the products of governmental and research institutions. Private-sector interests enter the picture quite quickly, however, with firms such as Bolt, Beranek and Newman (BBN) and Xerox introducing pivotal innovations beginning in the late 1950s.1 As will be seen, the extant secondary materials for this era are an important foundation for my own research, but much work remains to be done in important areas. This is particularly true with respect to gaming. While certain games such as Pong and Spacewar have received scholarly attention, other foundational titles such as the original, text-based Lunar Lander, Hammurabi, and Rogue have not been the focus of any academic research. The roles played by institutions such as the RAND Corporation in shaping gaming, moreover, have only been lightly considered.

There are many issues, both evidential and methodological, to consider here. As such, this literature review will be divided into several parts. As games are the primary focus of my work, I will begin by looking at scholarly works that take historical and critical approaches to gaming that relate to my own research plans. From digital gaming, I will move into sources that deal with hobbyist computing more directly. Hobbyist computing today is quite different from what was happening in the 1980s, and I will make that clear as I work through the relevant literature. I will then include a discussion of works related to constructionist learning. Constructionism was a major influence on the hardware and software that was developed for hobbyists in the 1970s and 1980s, and it is worth investigating the subject directly to better understand its motivations. From there, I will look at works that deal more broadly with the role of computing technologies in the Cold War period. This will include a section on histories of interactive computing, a subject that deserves more direct focus than it often receives.

1 IBM, perhaps the most well-recognized of the early private-sector computing firms, will only play a tangential role in this historical narrative. This is not an oversight – rather, the innovations introduced by IBM were largely focused on the business market, and the research that they did perform was often in tandem with institutions such as MIT and the RAND Corporation. As such, they are an implicit presence in many of the projects discussed here, typically through the supply of computing power and data management.

At this point I will move into methodology. I will first consider Woolgar's notion of technology as text, and in particular the implications of his three-response model. I believe that this model will be highly useful for framing my own research; it is, however, incomplete, in the sense that the specific research methodologies that could be used in each area are not discussed. I will therefore introduce and discuss three such models that I believe will help here: Altheide's notion of ethnographic content analysis, Schön's design world model, and Engelbart's notion of bootstrapping, combined with other materials related to interactive programming systems. I will review relevant literature, and discuss how each model may apply in my own work.

1.1 Game Studies

While it is still (arguably) an emerging discipline, there is a substantial (and increasing) body of literature in which gaming is examined from a variety of disciplinary perspectives. As discussed by Wolf and Perron, major published works on gaming began to emerge in the late 1990s, in the wake of a period of nostalgia over early gaming that continues to this day (Wolf and Perron, 2003). They cite Aarseth's 1997 work Cybertext: Perspectives on Ergodic Literature as the moment when "serious academic writing on video games" emerged within academia (Wolf and Perron, 2003, p. 9; and Aarseth, 1997). Borrowing from "the Greek words ergon and hodos, meaning 'work' and 'path,'" Aarseth's main argument is that, "[i]n ergodic literature, nontrivial effort is required to allow the reader to traverse the text," though a reviewer at the time found the work to be overly polemical (Aarseth, 1997, p. 1; emphasis in original; and Hunter, 1998). This would be followed a year later by From Barbie to Mortal Kombat: Gender and Computer Games, edited by Cassell and Jenkins, which leveraged the "debates about gender and games" that were centered at the time around the "girls' games" movement (Cassell and Jenkins, 1998; and Wolf and Perron, 2003, p. 10). By "the turn of the millennium," they note, "video game theory, as a field of study, included a handful of books, several academic programs, the first online academic journal (Game Studies), and over half a dozen annual conferences" (Wolf and Perron, 2003, p. 11; emphasis in original).

Much of this scholarship focuses on commercial gaming. There are perhaps many reasons for this, but popularity and simple economics are arguably the most important factors in play. Commercial gaming has had an outsized influence on cultural, economic, and political currents in gaming, and scholars are naturally tapping into these currents when selecting topics for their own research. That is not to say that scholarly work on gaming focuses exclusively on commercial titles, however. In Persuasive Games: The Expressive Power of Videogames, Bogost discusses the political game September 12th, as well as a host of other political games, several of which he helped to design (Bogost, 2007). In Flash: Building the Interactive Web, Salter and Murray discuss the Flash language and platform, and how amateur and hobbyist developers used it to create new forms of online interactive experiences, including games (Salter and Murray, 2014). Montfort's Twisty Little Passages: An Approach to Interactive Fiction, a thorough resource for information on "interactive" fiction, or text-based "adventure" games, discusses the genre's hobbyist beginnings in detail (Montfort, 2003). Yet the games that are to be the focus in the present study – the BASIC games made by hobbyist programmers in the 1970s and 1980s – have yet to receive much attention.

With respect to works that are related to hobbyist game development, there are a few existing sources to turn to. Montfort and Bogost's Racing the Beam: The Atari Video Computer System contains a fair amount of material on the "homebrew" hobbyist development scene, noting, crucially, that the individuals involved "are doing more than recreating the glory days of the Atari VCS – they are continuing to discover previously unknown capabilities of the platform" (Montfort and Bogost, 2009, p. 9). While contemporary hobbyist work with older technologies is not my specific area of focus, I will be engaging in cognate practices with respect to the development aspect of my research. An article by Owens indicates that users of RPG Maker, a GUI-based tool for creating tile-based role-playing games, use online sites devoted to the system as "scaffolding" enabling "deeper understanding of digital production and the development of practical skills"; this sort of community-building will become an important focus when I discuss hobbyist practices (Owens, 2011, p. 52). Other materials use gaming as a pedagogical tool; I will discuss this subject in more detail in a later section.

1.1.1 Gaming Histories

Histories of gaming occupy a more specialized subject field within game studies, or, it could be argued, they form a wholly separate sub-field within the larger discipline of history. A number of works focus on computer gaming – as opposed to gaming on consoles or mobile devices – though hobbyist game development still remains largely unexplored territory. Donovan's work Replay: The History of Video Games offers much in the way of computer gaming-related discussion, largely from a British perspective (Donovan, 2010). Lowood offers what he calls a "biography" of "computer games", focusing on several specific games and consoles, though he tends to gravitate more towards the latter (Lowood, 2006). In addition, Lowood wrote an article on the history of Pong in which Spacewar – and its arcade game successor – are also brought into the discussion (Lowood, 2009). In Wolf's edited volume entitled Video Game Explosion: A History from Pong to PlayStation and Beyond, a chapter on the home computer, written by Rehak, is quite relevant here, but it only offers a rather brief discussion of PC gaming (Rehak, 2008). Wolf's chapter on adventure gaming, a genre that was (and perhaps still is) largely dominated by PC entries, is more thorough (Wolf, 2008). Other chapters include content ranging from the earliest arcade games and home consoles to modern genres like the first-person shooter. A later volume by Wolf (2012) on early digital game history offers longer treatments of a variety of subjects, most of them having to do with console gaming; a piece by Tucker on bulletin board systems (BBSs) and multi-user dungeons (MUDs) is an important exception (Tucker, 2012).

1.1.2 Constraints and Creativity

In later chapters, and particularly in chapter seven, issues around rules and creativity in games will become important. There are important links between these two concepts, as scholars both within and outside of game studies have argued that rules actually enhance creative output. Margaret Boden focuses on the concept of constraints to make such a point.

Constraints are imposed by the rules and tools associated with conceptual spaces, and are vital to the creative process. While creativity may be considered by some to be a form of free play, Boden demonstrates that the creative process is actually directed in ways that play is not. Far from being inhibiting, constraints are necessary for the creative process to function, as they define the very space and structure within which one is supposed to think and work. She notes that the negative connotations associated with constraints make this somewhat unclear:

People often claim that talk of 'rules' and 'constraints'…must be irrelevant to creativity, which is an expression of human freedom. But far from being the antithesis of creativity, constraints on thinking are what make it possible…Constraints map out a territory of structural possibilities which can then be explored, and perhaps transformed to give another one. Dickens could not have created his luxuriant description of Scrooge without accepting the grammatical rule about adjectives, and pushing it towards its limits (Boden, 2004, p. 95).

Boden notes elsewhere that "to throw away all constraints would be to destroy the capacity for creative thinking," as the conceptual space required for such thinking would cease to exist (Boden, 1995).

Game studies scholars have long noted the importance of rules, which are a type of constraint, in gaming. As Suits notes, "[i]n games I obey the rules just because such obedience is a necessary condition for my engaging in the activity such obedience makes possible" (Suits, 1978/2005, p. 45). Rules play a central role in Suits' overall conception of gaming:

To play a game is to attempt to achieve a specific state of affairs [prelusory goal], using only means permitted by rules [lusory means], where the rules prohibit use of more efficient in favour of less efficient means [constitutive rules], and where the rules are accepted just because they make possible such activity [lusory attitude]…[P]laying a game is the voluntary attempt to overcome unnecessary obstacles (Suits, 1978/2005, pp. 54-55).

Game rules thus limit player actions, while simultaneously encouraging players to develop creative ways to negotiate those limits. They also give games structure, and without them players would have no means to act. Rules, as well as other types of constraints, thus provide both the incentives and mechanisms for action.

Creativity, Boden argues, is also goal-based. As she notes, the creative process is focused on the discovery of new ideas, and potentially the production of new works based on these ideas. She cautions that creativity may be "open-ended," and not necessarily goal-directed in the sense of looking to solve a specific problem. But, at the very least, "its goal is a very general one: exploration – where the terrain explored is the mind itself" (Boden, 2009, p. 59; emphasis in original). The ideas that are discovered in such exploration may be thought of as the "proof" that creative thought has taken place; they are a "quantifiable outcome," to use a term employed by Salen and Zimmerman (Salen and Zimmerman, 2003, p. 80). Goals are similarly vital to gaming, as Greg Costikyan observes:

What does a player do in any game? Some things depend on the medium. In some games, he or she rolls dice. In some games, he chats with his friends. In some games, he whacks at a keyboard. In some games, he fidgets with the controller. But in every game, he responds in a fashion calculated to help him achieve his objectives (Costikyan, 2002, p. 12).

Yet goals are not always explicit. The game SimCity is often mentioned when this issue is discussed. Based on the premise of serving as a "software toy," SimCity lets users build cities block by block on their computers without giving them any specific goals to strive for. But that does not mean that goals are non-existent. As Salen and Zimmerman explain it, "Sim City does not have explicit goals, and in that way is more like a toy than a game. However, as its designer Will Wright has often stated, players can turn it into a game by constructing their own goals" (Salen and Zimmerman, 2003, p. 82). Costikyan elaborates on this:

SimCity…lets you choose what kind of city you want, and to struggle to make your city stable. You can try to build a suburban utopia, with commuters using cars and no big central district. You can try to build a centralized city with good mass transit and no heavy industry. You can try a million things – and it's always interesting to play again, because you can always try something new (Costikyan, 2002, p. 13).

Such objectives, then, drive all aspects of game play, and the same could be said for the work inherent in the creative process, or so Boden argues.

A similar notion of creation within constraints may be found in Mark Wolf's work on world building, particularly in his conception of world gestalten. Wolf borrows this notion from Gestalt psychology, which focuses on how "the human perceptual system organizes sensory input holistically, automatically filling in gaps, so that the whole contains percepts that are not present in the individual parts from which it is composed" (Wolf, 2012, p. 51). Wolf extends this concept to narrative through the notion of narrative gestalt, which he describes as follows:

Like visual gestalten, narrative gestalten occur automatically and seemingly without much conscious effort on the part of the viewer, provided the viewer is familiar with the cinematic storytelling conventions being used. Biographical films like Gandhi (1982) and The Last Emperor (1987) may cover several decades of someone's life in only a few hours, resulting in a staggering amount of omission and ellipsis, and yet such stories, if they are well constructed and include the right events, can seem complete and comprehensible (Wolf, 2012, pp. 51-52).

A larger story, in other words, can be imagined from the select details provided in a given film or other work of fiction. Similarly, a larger world may be perceived through the information provided in a given media artefact. As Wolf notes, "we can go one step further and suggest the idea of world gestalten, in which a structure or configuration of details together implies the existence of an imaginary world, and causes the audience to automatically fill in the missing pieces of that world, based on the details that are given" (Wolf, 2012, p. 52; emphasis in original). Wolf here borrows the notion of "syuzhet" from David Bordwell, a term from Russian formalism that refers to the methods used to present and depict a narrative story (i.e., the "fabula"). He argues that a narrative that may appear "overloaded" with world-based details actually serves to support a world-focused syuzhet:

If the world is considered instead of merely the narrative set in the world…the ideal syuzhet would have to provide enough information for the audience to be able to feel that an independent world appears to exist, and to have some sense of its infrastructure, cultures, geography, history, and so forth. Thus, what might appear to be "excess" from a narrative-oriented point of view, may prove to be necessary from a world-oriented point of view (Wolf, 2012, pp. 52-53).

1.2 Amateur Computing: Hobbyist and Pedagogical Perspectives

Much of the current focus in hobbyist computing is channeled towards the so-called "maker movement," in which "physical computing" takes precedence over straight-up programming on a pre-fabricated machine. To define physical computing, I will cite Arduino co-founder Massimo Banzi:

Physical Computing uses electronics to prototype new materials for designers and artists…It involves the design of interactive objects that can communicate with humans using sensors and actuators controlled by a behaviour implemented as software running inside a microcontroller (a small computer on a single chip) (Banzi, 2011, p. 3).

Dale Dougherty, founder of Maker Media and a major figure in the maker movement, equates modern makers with "those engaged in the early days of the computer industry in Silicon Valley," and explains their ethos as follows:

[T]hose makers in the early days of the computer industry were essentially playing with technology. They didn't know what they wanted computers to do and they didn’t have particular goals in mind. They learned by making things and taking them apart and putting them back together again, and by trying many different things (Dougherty, 2012, p. 12).

While hardware hobbyists have existed for decades, products like the Arduino board make it much easier for novices to engage in their own playing and "tinkering". Calling tinkering "undervalued" in "many educational settings," Resnick and Rosenbaum describe it as a practice "characterized by a playful, experimental, iterative style of engagement, in which makers are continually reassessing their goals, exploring new paths, and imagining new possibilities," and argue that "it is exactly what is needed to help young people prepare for life in today’s society" (Resnick and Rosenbaum, 2013, p. 164). Stangler and Maxwell go so far as to predict the emergence of a "do-it-yourself (DIY) producer society, driven by grassroots movements in tinkering, entrepreneurship, and small-scale manufacturing" (Stangler and Maxwell, 2012, p. 3).

In a similar vein, but related more closely to personal computing, a number of small-scale engineering firms and foundations have designed and built single-board, scaled-down personal computer systems, the most well-known of these being the Raspberry Pi line. What makes the RPi especially interesting here is the fact that it is very deliberately connected to the hobbyist computers of the 1980s. Its designers at the University of Cambridge "were concerned that the demise of cheap personal computers like the Commodore 64, the Amiga, and the Spectrum were adversely affecting young people's ability to program," and they therefore wanted to design a system that could provide the same experiences to young people today (Donat, 2014, p. 2). Obligingly, RPi programming guides aimed at children have been published by third-party vendors, with Philbin's work advertising the system's potential as follows:

The Raspberry Pi gives you the opportunity to build and control a device that does what you want it to do. For example, you can deploy your very own robot arm, controlled by a program that you have written. You can design and create your own roleplaying game, or produce beautiful computer art or music, all by using code (Philbin, 2014, p. 1).

This passage presents us with an interesting juxtaposition of programming and making practices, reflecting the RPi's close association with the maker movement.

Despite this movement towards new forms of hobbyist computers, certain scholars have taken an interest in historical hobbyist computing, though much work in the field remains to be done. For example, the experimental work 10 PRINT CHR$(205.5+RND(1)); : GOTO 10 was published by MIT Press as part of their Software Studies series (Montfort et al., 2013). 10 PRINT, while it has a narrow focus, does touch on some wider issues with respect to 1980s hobbyist computing. The chapters on portability – that is, the transfer of a program from one computer to another, or to a video game console – are particularly relevant, outlining the real differences between platforms. A recent work about the Commodore Amiga is also an important contribution, but the history of the Amiga, beginning late in the home computer era, is largely beyond the scope of this study (Maher, 2012). Beyond this, Campbell-Kelly, Aspray, Ensmenger, and Yost's general history of computing, now in its third edition, devotes one of four parts to personal computing (Campbell-Kelly et al., 2014). Ceruzzi's slightly older work also devotes a few chapters to the era (Ceruzzi, 2003).

Beyond these examples, it is valuable to consider non-academic releases, many of which are well-cited within scholarly circles. Levy's Hackers: Heroes of the Computer Revolution does not focus exclusively on the PC, but it provides a thorough introduction to the individuals and institutions that played pivotal roles in its development and evolution (and was recently released in a 25th anniversary edition; see Levy, 1984/2010). Freiberger and Swain's work, from the same era, is specifically about the PC, and otherwise covers similar ground (also re-released, as a second edition; see Freiberger and Swain, 2000). Hiltzik's work is essentially about Xerox PARC, but it discusses the many innovations made by PARC that influenced the design of later PCs (Hiltzik, 1999). There are also several self-published works that, if read critically, can fill in much of the detail that is lost in broader accounts. Bagnall's work on Commodore, for example, contains a wealth of interviews, first-hand accounts, and related source materials (Bagnall, 2010). Welsh and Welsh have much to say with respect to the Tandy TRS-80, the authors having been third-party application developers for the machine (Welsh and Welsh, 2011). Johnstone published a wide-ranging work about the use of computers in education, including a significant amount of information about the pre-PC era (Johnstone, 2003). This is all in addition to the articles, interviews, and primary source materials that are accessible online on personal and organizational websites. To cite one example, Jim Storer, creator of the original Lunar Lander program and now a professor at Brandeis University, has posted a variety of articles and documents related to the game on his personal website.2

2 http://www.cs.brandeis.edu/~storer/LunarLander/LunarLander.html

1.3 Pedagogical Computing

Turning now to educational and pedagogical approaches to games-based computing, I find several materials that overlap in important ways with my own work. There are, for example, several research and development projects, such as Scratch and Gamestar Mechanic, in which programming environments that are intended to be "child-friendly," or at least easy to use, are developed and tested. Much of the credit for current projects can be given to the earlier work of well-known MIT professor Papert, primary designer of the educational LOGO programming language and originator of the notion of constructionism that motivated its creation. The basic idea behind constructionism is that children learn better by exploring and "making" within well-defined systems, as opposed to rote methods, a position he argues as follows in his classic work Mindstorms:

In most contemporary educational situations where children come into contact with computers the computer is used to put children through their paces…The computer programming the child. In the LOGO environment the relationship is reversed: The child, even at preschool ages, is in control: The child programs the computer. And in teaching the computer how to think, children embark on an exploration about how they themselves think (Papert, 1980, p. 19).

Papert's ideas were an expansion of the concept of cognitive constructivism outlined by Piaget, which refers to a model of learning and knowledge-building in which engagement with one's external environment is paramount, as opposed, again, to more rote learning methods (see, for example, Wadsworth, 2003). In The Children's Machine, published in 1993, Papert expanded upon his earlier ideas by advocating for the creation of a "Knowledge Machine" fully dedicated to enabling constructionist learning for child users (a vision quite similar to Kay's Dynabook, which will be discussed in chapter 6). Papert's vision was bold – he briefly engaged in a French-led global effort to bring computing technologies to developing countries – and such ambitions were to be echoed by the One Laptop per Child (OLPC) project undertaken at MIT (Servan-Schreiber, 1980; and James, 2010).

Papert's successors would go on to expand upon his ideas on a number of fronts, and would feel justified in doing so; Kafai and Burke argue that there exists an increasing desire to teach computational thinking in the classroom. As they put it, "the premise is that by learning to think as a computer scientist, students can solve everyday problems, design systems that we all use in daily life, and progress and innovate in other disciplines" (Kafai and Burke, 2014, p. 4). This is a position that the authors generally adopt; Kafai is a particular proponent of this notion, arguing in another work that when "youth program games, animations, interactive art, or digital stories," they involve themselves "in many of the same critical, creative, and ethical considerations that new media literacy researchers consider relevant practices in more common forms of creative media production" (Kafai and Peppler, 2011, p. 90). The making of media is a crucial element to their approach; in an earlier article, they advocate for "creative production as a pathway for youth to participate in today’s new media culture, question its conventions, and integrate new media such as videogames into 'media mixes' of images, video and texts" (Kafai and Peppler, 2007, p. 150).

Other research initiatives reflect similar values, to varying degrees. Perhaps foremost among these is the Scratch system designed by MIT Media Lab's Lifelong Kindergarten group (Resnick et al., 2009; and Maloney et al., 2010; note that Resnick is the Director of Lifelong Kindergarten). Its creators credited Lego as an inspiration, noting the play practices of children that they wanted to emulate: "[a]s they play and build, plans and goals evolve organically, along with the structures and stories…We wanted the process of programming in Scratch to have a similar feel" (Resnick et al., 2009, p. 63). They also promote the idea of tinkerability in Scratch, arguing that "[t]inkerability encourages hands-on learning and supports a bottom-up approach to writing scripts where small chunks of code are assembled and tested, then combined into larger units" (Maloney et al., 2010, p. 4; see also Resnick and Rosenbaum, 2013). Scratch ostensibly "lets users create interactive, media-rich projects," while the Scratch website allows users to "share their Scratch projects, receive feedback and encouragement from their peers, and learn from the projects of others"; a particularly ambitious user "created and shared new Scratch projects on a regular basis, like episodes in a TV series" (Maloney et al., 2010, pp. 1, 3; and Resnick et al., 2010, p. 60; emphasis added).

In addition to Scratch, there are other pedagogical game development tools that each adopt a similar ethos. Gamestar Mechanic is a particularly well-known example. The end product of a "public-private partnership" between a game company (GameLab) and the University of Wisconsin-Madison, Gamestar is a highly structured, story-based game development system designed by scholars Salen and Gee ("Gamestar Mechanic Parent's Guide", n.d.). As with Scratch, Gamestar is designed ostensibly to enable users – primarily children – to "learn to think like designers," so that "players can learn not only to analyze designs articulated by others, but also to articulate their own versions of problems and solutions" (Games, 2010, pp. 35, 49). There is also Storytelling Alice, developed by Kelleher, Pausch, and Kiesler as a tool for "creating Pixar or Dreamworks-style animated 3D movies" (Kelleher, Pausch, and Kiesler, 2007, pp. 1455-1456). The focus is on creating digital "stories", which play out similarly to many non-game Scratch programs. Greenfoot also aims to be an easy-to-use system for creating simple interactive applications, albeit in the Java language (Kölling, 2010).

1.4 The Cold War and Computing

The history of computing is tied inextricably to the history of the Cold War; the computer, after all, was first designed to be a war machine (Polachek, 1997). Much of the institutional research on digital computing, moreover, was conducted at the behest of the United States military, with the RAND Corporation and MIT's Lincoln Lab standing out as exemplars (Leslie, 1993). Yet the practices researchers engaged in at these places were not wholly instrumental, and in fact a substantial amount of experimentation took place. Most projects needed to be described as potentially beneficial with respect to military and/or political matters, but this criterion appears to have been rather loosely applied (see, for example, Ghamari-Tabrizi, 2000). I believe that many current sources on this era largely fail to appreciate this aspect of Cold War-era computer engineering projects, particularly when it comes to game development. Instead, such work tends to focus on profiles of individuals with power and influence, rather than accounts of practices at the ground level.

1.4.1 Technology-Centric Cold War Histories

As already stated, the presence of the Cold War throughout the events I intend to cover serves as a major influence with respect to how these events unfolded.3 The perpetual presence of what was generally seen as a dangerous enemy state in the Soviet Union – one that threatened the United States' interests both at home and abroad – proved a major catalyst in shaping the research and development projects at the various academic institutions and private corporations that served to drive the development of digital computing technologies. As Leslie puts it in the opening line of his study on the period, "[f]or better and for worse, the Cold War redefined American science" (Leslie, 1993, p. 1). This is particularly true for the science and technology projects that were funded directly by the military, and served military interests, at least initially. Yet it is also true when it comes to the civilian computer industry; the home computers that people bought and installed in their homes starting in the 1970s used technologies that were derived in part from military-based research. To extend from Polachek, it might be said that all computers are war machines.

3 Events from the postwar/pre-Cold War years are also vital for my study. Fortunately, most of the works considered here include this earlier period, and tend to touch on the Second World War as well.

Edwards' work in this area examines the computer as a discursive object, thereby defining technology as "a product of complex interactions among scientists and engineers, funding agencies, government policies, ideologies, and cultural frames" (Edwards, 1996, p. xiii). This approach stands in contrast, he claims, to those which "ignore or downplay phenomena outside the laboratory and the mind of the scientist or engineer" (Edwards, 1996, p. xii). Context, in other words, matters just as much as technical detail, and the Cold War period reflects this point very clearly.

Leslie's The Cold War and American Science stands as another exemplar work that adopts the cultural approach. While focusing much of its attention on developments at MIT and Stanford, it still manages to provide a broad-based study on the theme of American academic institutional involvement in military research and development projects during the Cold War period. Leslie advances his argument as follows:

In the political economy of the Cold War, science was anything but academic, with the blueprint for significant aspects of the nation's industrial policy being drafted by the military…[o]nly the universities could both create and replicate knowledge, and in the process train the next generation of scientists and engineers. The universities provided most of the basic research and all the manpower for the defense industry (Leslie, 1993, pp. 1-2).

Leslie emphasizes that leaders of the scientific community were largely (but not entirely) complicit in this endeavour, such that, for example, "leading members of the scientific establishment such as Robert Millikan and George Ellery Hale reorganized science in the conviction, as Hale put it, that 'war should mean research'" (Leslie, 1993, p. 5). As such, ambitious university administrators positioned their schools to earn the sorts of contracts that would otherwise often go directly to private engineering firms, under the supposition that they would also benefit the more traditional research activities in these schools. This did in fact happen in certain respects, but, Leslie argues, it resulted in "our scientific community's diminished capacity to comprehend and manipulate the world for other than military ends" (Leslie, 1993, p. 9).

Also discussing Whirlwind within the context of SAGE is Hughes' Rescuing Prometheus: Four Monumental Projects That Changed Our World (Hughes, 1998). While not entirely scholarly, this work does build on Hughes' academic research, and reflects his opinions on his chosen field of study. A highly positive – and, arguably, positivist – account of technology in the postwar period, the book opens with the following reflection on twentieth-century American society:

Americans had transformed a natural world into a human-built one characterized by technological systems and unmatched complexity. In doing so, they demonstrated a technological prowess unequaled elsewhere in the world…The post-World War II period we shall consider witnessed the continuation of a technological transformation that can be seen as a second creation; the first was mythologized in the book of Genesis (Hughes, 1998, p. 3).

Clearly Hughes' opinion on such events is at stark odds with Leslie's. Yet when it comes to Whirlwind and SAGE, they share similar perspectives with respect to historical narrative. Similarly to Leslie, for example, Hughes claims that Whirlwind's developers very early on envisioned the machine purely as an instrument of military information processing. This position, I will argue, reflects only a small proportion of the available documentary evidence, and contradicts much of what is said elsewhere.

Other treatments are either more specialized or more general. MIT's work on its own history, Becoming MIT, has a good amount of relevant material; chapters by Douglas, Kaiser, and Leslie are probably the most important (Douglas, 2010; Kaiser, 2010; and Leslie, 2010). Lowen's work uses Stanford as an exemplar of the "Cold War university" model that emerged in this period (Lowen, 1997). Farish devotes a chapter to military-sponsored university research in his work on the Cold War (Farish, 2010). Finally, Slayton, in a recent work on early-warning and missile-defence systems, frames Whirlwind and other MIT projects within a larger discussion on the roles of science and technology in modern war (Slayton, 2013). All of these sources make contributions to larger Cold War-related themes, but technology is typically treated quite cursorily – that is to say, technology is never truly the focus. Rather, it is treated as an instrument that serves the needs of those with the power to wield it – in the case of the Cold War, it is the military that possesses much of that power. The problem with such an approach is that it makes it seem as if computing technologies could do no more than serve the needs of those in power who had them built. If that were the case, computers would barely have moved past the original ENIAC machine, at least with respect to how they are generally used. As already noted, however, computer designers and programmers used their respective machines for experimental and subversive tasks. The digital computer, therefore, needs to be recast as a machine that played a variety of roles.

1.4.2 History of Interactive Computing

The history of interactive computing is one element of the larger history of computing, for which Ceruzzi and Campbell-Kelly's aforementioned works are probably the best academic treatments. Interactivity, however, is arguably just as important a sub-topic of this history as are the histories of databases, digital media, and operating systems. It is not a generic term, moreover; interactivity means something very specific when it comes to digital computing. One of the goals of this project is to elucidate these ideas and make it clear that a particular notion of what it means for a computer to be interactive informs much about the design of computer hardware and software. Once this idea is grasped, one may begin to play with the concept and potentially create compelling new interactive forms.

Licklider's article entitled Man-Computer Symbiosis established an ethos for interactivity that, I believe, is still relevant today. Licklider was seeking a new paradigm by which to situate digital computing, believing that research into "time-sharing" network systems would allow for more efficient problem solving by enabling a timely exchange of ideas between user and computer (Licklider, 1960). Licklider was also heavily involved in implementing actual time-sharing networks, and his work is chronicled at least in part in several sources. Waldrop's biography of Licklider is very complete, and is a useful resource all-around, even though it is, I believe, not sufficiently critical (Waldrop, 2001). Akera also has a significant amount of material on Licklider and time-sharing in general, embedded within a history of early computing (Akera, 2007). Markoff also spends some time on Licklider and early time-sharing (Markoff, 2004). The issue with most of these works, however, is that they generally do not go beyond basic institutional history with respect to Licklider and his work. These materials are generally quite adept at explaining the relevant big events and backing them up with documentation and/or interviews with relevant figures, but they fall short in terms of discussing "off-label" practices – that is, computing tasks that were not what relevant interests had in mind when they designed and/or funded various projects, but that often played a significant role in determining how computing technologies expanded and evolved.

Scholarship on Engelbart presents more of the same. The materials in this area tend towards the biographical, buying into the notion that this area of research was led by solo visionaries (Steve Jobs being a good modern example of this phenomenon). Engelbart, then – designer of the oN-Line System (NLS) that has become famous for its HCI innovations – is the subject of his own thorough biography in which his work and his persona become fused (Bardini, 2000). Taken from such a perspective, Engelbart's work is generally lauded, as reflected in the following passage from Barnes:

Today, people do not need to know the name Thomas Alva Edison to turn on an electric light and read a book. Similarly, they do not need to know the name Douglas Carl Engelbart to turn on a personal computer, click on an icon with a mouse, and access the digital libraries of the world (Barnes, 1997, p. 16).

Barnes also describes Engelbart's aspirations in language that resonates clearly with Licklider: "Engelbart’s goal was to develop interactive computer systems that would match computational capabilities with human capabilities" (Barnes, 1997, p. 16). Yet much of the work is biographical, further perpetuating the fusion discussed above. Even Bardini and Friedewald's article on the "collapse" of Engelbart's research lab puts the blame mostly on differences in expectations that emerged due to uncontrollable outside sources (Bardini and Friedewald, 2002).4 Other works on Engelbart focus specifically on his demonstration film for the NLS, which has been nicknamed the "Mother of All Demos" (see, for example, Ju, 2008). Chun sees the film as a platform from which the viewer and Engelbart struggle for control. By controlling the action, Engelbart clearly has the advantage, though he has ceded much of his control to machines: the cameras that film him and the computer, and the computer itself, which is ultimately responsible for driving the overall narrative (Chun, 2011). These materials tend to deviate from my primary concerns, however.

4 Bardini's biographical work does, however, critique Engelbart's work in places, as will be discussed below.

1.4.3 Critical Approaches to Technology

This section will survey a wide variety of critical ontologies and methodologies that coalesce around the notion that scientific and technological knowledge, as well as all associated artefacts and texts, may be subject to critical inquiry. The research conducted within these fields strives to break away from the positivist, and even triumphalist, discourses on technology that tended to dominate when scientific and technological scholarship was developed almost exclusively by scientists and engineers themselves. Much of this work is now categorized under the field of science and technology studies (STS), though it is often not labelled as such. STS might better be understood as an umbrella term that covers a variety of critical approaches that break from positivist orthodoxy and challenge concepts such as progress, truth, and technological determinism (Sismondo, 2010). Thomas Kuhn's 1962 work The Structure of Scientific Revolutions is noted as a foundational antecedent to such efforts. Kuhn argued that scientific "truths" were considered as such because they accorded with the parameters of enclosed scientific "paradigms," or specific worldviews. Revolutions occurred when new paradigms were developed and adopted but, crucially, these new paradigms were not necessarily improvements over the old. Rather, they were simply alternative belief systems, with different philosophies, methodologies, and modes of observation. Kuhn thus countered the notion of perpetual scientific progress, providing a framework that was much more reflexive and contingent (Kuhn, 1962/1996).

Kuhn's ideas are not universally accepted, and the notion of total incompatibility between paradigms is heavily disputed (Sismondo, 2010). In the wake of Kuhn, however, there emerged a growing body of scholarship in which technology and scientific knowledge were treated as sociological phenomena. The field of sociology of scientific knowledge (SSK) was particularly important in terms of promoting these views. As H. M. Collins notes, SSK was concerned primarily with "what comes to count as scientific knowledge and how it comes so to count" (Collins, 1983, p. 267). Out of SSK grew the "strong programme" approach associated with David Bloor, as well as colleagues such as Barry Barnes. The strong programme is a framework for the study of scientific knowledge, and holds that any SSK research must adhere to four "tenets," listed below:

1. SSK work should be "causal", in that it should be "concerned with the conditions which bring about belief or states of knowledge."

2. It should be "impartial with respect to truth and falsity, rationality or irrationality, success or failure."

3. It should be "symmetrical" in its approach, in that "[t]he same types of cause would explain, say, true and false beliefs."

4. It should be "reflexive", in that "its patterns of explanation would have to be applicable to sociology itself" (Bloor, 1976/1991, p. 7).

As Sismondo notes, one of the strengths of the strong programme is that it encouraged STS research that focused on "showing how much of science and technology can be accounted for by the work done by scientists, engineers, and others" (Sismondo, 2010, p. 48; emphasis in original). As a consequence, the practices involved in both the development of science and of technologies are conflated. It is no longer the case that science involves the virtuous discovery of truths, while technologies are the applications of these truths. Rather, both fields involve the production of information. As will be discussed below, such information may appear in the form of "texts," particularly if the meaning of the term is expanded to include scientific and technological artefacts. It is this wider perspective that informs the various critical approaches surveyed here.

1.4.4 SCOT and ANT

By the 1980s, these sociological approaches to the study of science and technology had gathered substantial momentum. Pinch and Bijker's 1984 article in Social Studies of Science outlined a particular approach for the study of technology that they referred to as "Social Construction of Technology," or SCOT (Pinch and Bijker, 1984). From a SCOT perspective, individuals, communities, and societies that use technologies help shape their evolution based on the ways in which they approach each iteration of a given artifact. Such evolution unfolds as "problems" are identified in existing versions by "relevant" social groups. As the authors put it, "in deciding which problems are relevant, the social groups concerned with the artifact and the meanings that those groups give to the artifact play a crucial role: A problem is defined as such only when there is a social group for which it constitutes a 'problem'" (Pinch and Bijker, 1984, p. 30).

The SCOT approach has met with criticism over what seems to be a disregard for the politics involved in the formation of social groupings. As Winner asks, "Who says what are relevant social groups and social interests? What about groups that have no voice but that, nevertheless, will be affected by the results of technological change? What of groups that have been suppressed or deliberately excluded?" (Winner, 1993, p. 369). Winner's own approach to the study of technology adopts a highly political sensibility. Specifically, he argues for an approach that "identifies certain technologies as political phenomena in their own right," one in which scholars "pay attention to the characteristics of technical objects and the meaning of those characteristics" (Winner, 1980, p. 123). Winner's case study of the low overpasses along Long Island's parkways – which he argues were deliberately made too low for buses to pass, thereby blocking beach access to low-income populations – has also met with criticism, particularly from Joerges (Winner, 1980; and Joerges, 1999). Woolgar also objected to Winner's assumption that any technology could be designed to produce such predetermined, predictable consequences. As he argues, "[i]n order to present technology as either requiring or being compatible with a particular form of social organization, Winner advances a definitive version of the capacity or effects of that technology" (Woolgar, 1991, p. 34).

Actor-network theory (ANT) takes the notion of groupings and extends it beyond the social, treating all aspects of technoscientific practices as agents, or actors, within networks of influence. Actors are so designated because of their capacity for inciting actions that strengthen (or weaken) the connections within their own networks. This means that non-human actors, such as objects, ideas, and institutions, possess their own agency, which is perhaps the most controversial aspect of ANT. ANT emerged largely as a response to what was considered by its progenitors – Michael Callon, Bruno Latour, and John Law – to be serious flaws in existing sociological research, particularly in terms of situating the "social dimension" when studying scientific practices. According to these scholars, the social dimension is typically cast as an ethereal "other" that serves to falsely dichotomize activities that take place within scientific laboratories from actions that occur "outside" these boundaries (Latour, 2005; and Callon, 1986). The ANT solution to such dichotomies is inclusion; that is, anything and everything that could influence the scientific practice under investigation is considered to be a member of a network of agents and actors (or "actants", another term used by ANT scholars). As Law puts it, "the actor-network diagnosis of science" is "that it is a process of 'heterogeneous engineering' in which bits and pieces from the social, the technical, the conceptual, and the textual are fitted together, and so converted (or 'translated') into a set of equally heterogeneous scientific practices" (Law, 1992, p. 381).

Criticism of ANT is generally directed towards the issue of agency in non-human actants, with Collins and Yearley arguing that ANT scholars generally fail to possess the scientific and technological knowledge necessary to understand the ways in which object agency might function (Murdoch, 2001; and Collins and Yearley, 1992). Bloor has also countered Latour's own criticism of the strong programme, arguing that ANT essentially rephrases the same arguments (Bloor, 1999). Others have taken concepts associated with ANT and expressed them in new paradigms. Akrich, who has collaborated with Latour, developed the concept of the script as a means to understand the effects of affordances and constraints on users of "technical" objects (Akrich, 1992). The script is best understood as the imposition of specific attributes onto technical objects by those who create and develop them. A script provides both affordances and constraints, but it also defines the boundaries between users and objects. The "black box" effect inherent in technical objects is, according to Akrich, not a quality inherent in the object itself, but an imposition placed upon it by designers. As she describes it, "the boundary is turned into a line of demarcation traced, within a geography of delegation, between what is assumed by the technical object and the competences of other actants" (Akrich, 1992, p. 206).

1.4.5 Software Studies and Critical Code Studies

Beyond these more general works, other fields have emerged in which specific scientific and technological artefacts are subject to critical study. Computer software, and its associated code, have become the focus of many scholars in the humanities and social sciences who argue that such artefacts have become pervasive to the point of having a significant discursive impact on contemporary life with respect to influencing beliefs, perceptions, practices, and even languages and lexicons. As Matthew Fuller puts it in an introductory volume to the field of software studies, "[s]oftware structures and makes possible much of the contemporary world" (Fuller, 2008, p. 1). Montfort and Bogost, meanwhile, argue with respect to their platform studies series of works that "[w]e believe it is time for those of us in the humanities to seriously consider the lowest level of computing systems and to understand how these systems relate to culture and creativity" (Montfort and Bogost, 2009, p. vii). Kitchin and Dodge, whose book Code/Space is subtitled Software and Everyday Life, claim that "to varying degrees, software conditions our very existence" (Kitchin and Dodge, 2011, p. ix). These and other scholars have thus applied a variety of methodologies to study specific technological objects, processes, and texts within historical and contemporary contexts, typically to better understand their social, political, cultural, and economic influence.

Out of all of these fields, software studies is arguably the most developed in terms of published scholarship. In addition to the recent book series by the MIT Press (to be discussed in more detail below), earlier work by Lev Manovich has done much to shape the field. Manovich's The Language of New Media, while written before the term "software studies" was in use, very much applied an approach in keeping with the discipline. While discussing the nature of digital data and media at length, Manovich also devotes much of his time to analyzing the software used to create such media. As he states it, "[s]oftware programs enable new media designers and artists to create new media objects – and at the same time, they act as yet another filter which shapes their imagination of what is possible to do with a computer" (Manovich, 2001, pp. 117-118). These new media objects, moreover, according to Manovich, reflect the very nature of the societies and cultures within which they are made; the interactive nature of modern web applications, for example, "fit[s] perfectly with the logic of advanced industrial and post-industrial societies, where almost every practical act involves choosing from some menu, catalog, or database" (Manovich, 2001, p. 128).

Fuller provides a more expansive definition of the field in his work Software Studies: A Lexicon, an introductory work in MIT's Software Studies series:

Software Studies proposes that software can be seen as an object of study and an area of practice for kinds of thinking and areas of work that have not historically "owned" software, or indeed often had much of use to say about it. Such areas include those that are currently concerned with culture and media from the perspectives of politics, society, and systems of thought and aesthetics or those that renew themselves via criticism, speculation, and precise attention to events and to matter among others (Fuller, 2008, p. 2).

Other titles in the software studies series use such an approach on a variety of topics. Wendy Chun, in Programmed Visions: Software and Memory, argues that software itself is a product both of military-industrial "command and control" hierarchies and neoliberal economic thinking (Chun, 2011). Software thus disempowers users at the same time it discursively expresses a false sense of individuality and autonomy. This creates an illusory form of "sovereignty" over files that come to define our digital identities:

Computers embody a certain logic of governing or steering through the increasingly complex world around us. By individuating us and also integrating us into a totality, their interfaces offer us a form of mapping, of storing files central to our seemingly sovereign – empowered – subjectivity. By interacting with these interfaces, we are also mapped (Chun, 2011, p. 9).

Yet Chun also sees in computers the potential to create alternate experiences that defy the neoliberal model of market-based autonomy. Noting that "[c]omputers are mediums of power in the fullest senses of both words," she argues that they may be used to "pleasurably create visions that go elsewhere, specters that reveal the limitations and possibilities of user and programmer, choices that show how we can rework neoliberal formulations of freedom and flexibility" (Chun, 2011, p. xii). Thus creation becomes a mechanism by which users may come to gain new perspectives on computation. This is a theme that will be revisited in later chapters.

Kitchin and Dodge's Code/Space looks at similar issues, but from a different perspective. Rather than analyzing how software limits and conditions human agency, the authors examine the agency of code itself with respect to defining and shaping human-created spaces. As they argue, "Software is shaping societal relations and economic processes through the automatic production of space that generates new spatialities and the creation of software-sorted or machine-readable geographies that alter the nature of access and governmentality" (Kitchin and Dodge, 2011, p. x). For Chun, humans are the prime instigators of digitally-mediated experiences, even if such experiences limit their autonomy and agency as a consequence. For Kitchin and Dodge, the digital intrudes on our public and private spaces on its own (or at the behest of those agents responsible for its creation and dissemination), and therefore automatically mediates experiences within such spaces. As they note, "Software is thus actively shaping socio-spatial organization, processes, and economies, along with discursive and material cultures and individuals' construction of identities and personal meanings" (Kitchin and Dodge, 2011, p. xi; emphasis added).

A research discipline very closely associated with software studies is critical code studies. While also focused on software, CCS places more emphasis on the structures that define and shape software. CCS has been largely championed by Mark C. Marino, who states that "Critical Code Studies (CCS) is an approach that applies critical hermeneutics to the interpretation of computer code, program architecture, and documentation within a socio-historical context" (Marino, 2006). The larger goal of CCS is not simply to look at code – as Marino states, "The goal need not be code analysis for code's sake" – but rather to look at the effects code has on larger social and political movements and to "better understand programs and the networks of other programs and humans they interact with, organize, represent, manipulate, transform, and otherwise engage" (Marino, 2006).5 As Mackenzie notes, "[i]n code and coding, relations are assembled, dismantled, bundled and dispersed within and across contexts" (Mackenzie, 2006, p. 169; see also Chun, 2009, p. 3). Hayles similarly argues that "[c]ode has become an important actor in the contemporary world because it has the power to change the behavior of digital computers, which in turn permeate nearly every kind of advanced technology" (Hayles, 2005, p. 48). But Hayles also goes a step further, arguing for the existence of a "Regime of Computation," which may be defined as a particular, and increasingly pervasive, worldview founded upon the notion of logical computation. For those who subscribe to the Regime's beliefs, "code is understood as the discourse system that mirrors what happens in nature and that generates nature itself" (Hayles, 2005, p. 27).

1.5 Technology as Text

Certain digital game scholars have already advanced the argument that games should be treated as texts. Davidson, for example, equates games with literary texts in the sense that a player may become "well played" the same way that a reader may become "well read" – that is, the player will develop something of a literary sensibility with respect to gaming (see, for example, Davidson, 2008).

5 To this end, Marino introduces the notion of "implied code" – that is, a hypothetical, abstract programming language that can describe various computational processes when actual code itself is unavailable or is overly complex (Marino, 2013; see also Douglas, 2007). A form of implied code called pseudocode is used extensively in certain forms of computer science scholarship, particularly when dealing with the functioning of algorithms. Computer scientist Naomi Nishimura describes pseudocode as follows: "Pseudocode strikes a sometimes precarious balance between the understandability and informality of English and the precision of code. If we write an algorithm in English, the description may be at so high a level that it is difficult to analyze the algorithm and to transform it into code. If instead we write the algorithm in code, we have invested a lot of time in determining the details of an algorithm we may not choose to implement…The goal of writing pseudocode, then, is to provide a high-level description of an algorithm which facilitates analysis and eventual coding…but at the same time suppresses many of the details that vanish with asymptotic notation" (Nishimura, n.d., p. 1). Pseudocode will be reintroduced in chapter seven of the present study.

Kerr indicates that certain scholars classify games as "media texts" – putting them on equal footing with more familiar digital media such as MPEG-4 videos and e-books – as a means by which to highlight the unique aspects of digital games in comparison with older media forms (Kerr, 2006, p. 38). Aarseth carries these arguments a step further by developing the notion of the "cybertext." The cybertext paradigm is used to persuade researchers to focus on the mechanical organization of the text, by positing the intricacies of the medium as an integral part of the literary exchange. However, it also centers attention on the consumer, or user, of the text, as a more integrated figure than even reader-response theorists would claim (Aarseth, 1997, p. 1). Krzywinska echoes this position when she indicates that "[t]he analysis of a game as text takes into account all formal aspects of a game, including all those factors in play in the way that functionality operates in the games" (Krzywinska, 2006, p. 121).

Grint and Woolgar largely defined and developed the concept of treating technologies as if they were texts. According to them, such an approach "sets the frame for an examination of the processes of construction (writing) and use (reading)" of any given technology, and therefore puts the researcher in a better position to discover and describe the complex relationships that exist between individuals and groups and the technologies they build and implement (Grint & Woolgar, 1997, p. 70). Despite their introduction of the concept into technological discourses, however, the authors did not advocate for any specific methodological approach for the critical appraisal of technological texts. This may partly be by design: as they remark, "the point is to play against this metaphor, to see how far we can go with it," which is done by asking, "[w]hat happens to the structure of our discourse when we introduce the notion of machine as text?" (Grint and Woolgar, 1997, p. 70). As something of a response to this query, Woolgar in another work devises what he calls three "responses" to the technology-as-text metaphor. These are, essentially, methodological tools grounded in three different ontological perspectives. I have summarized each response below:

• The Instrumental Response: Building from the assumption that "the very content" of a given technology/text "can be said to be understood sociologically," the instrumental perspective "enables us to identify the process of the construction of the text and, in particular, its insinuation within a network of 'actors'" (Woolgar, 1991, p. 37). This response allows for an understanding of the historical factors that have led to the emergence of a given text, particularly with respect to relationships with external "actors", including other texts (an approach that resonates strongly with actor-network theory; see Latour, 1983; and Latour, 2005).

• The Interpretivist Response: Enables the "study of the ways in which technology texts are written and read" (Woolgar, 1991, p. 38). From this perspective, "the pressing analytic issue is to understand the production, organization, and interpretation of the textual character of technologies" (Woolgar, 1991, p. 41). While I believe a historical approach is also valid here, this response suggests that a more sociological perspective on these historical materials – that is, an understanding of how relevant communities of actors interacted with one another in meaningful ways – will yield important results.

• The Reflexive Response: Probably the most complex of the three responses, the reflexive response is built on the idea that "readings of the technology text are accomplished both by technologist subjects and by the analyst in the course of sociological argument" (Woolgar, 1991, p. 39). Researchers, then, cannot discern objective meanings in a given technology text; like all users, their readings are subjective and contextual. Pushing this argument a step further, Woolgar claims that no reading of a given technology/text may possess "greater authority than any other outcome of textual production and interpretation," including the texts produced by researchers. This perspective, however, actually offers researchers a significant amount of freedom to experiment with respect to their methodological approaches to a given technology/text.

In later sections of this work, I will elaborate and build upon this model, using it to situate various facets of my research.

1.6 Applying Technology as Text

Woolgar's response framework is highly useful for my purposes, but still needs to be expanded upon from a methodological perspective. I have therefore chosen to highlight three areas of research that will correspond to the instrumental, interpretivist, and reflexive responses, respectively: ethnographic content analysis, design world theory, and interactive programming system theory (while the latter two begin with theory, they go on to describe practices that reflect their theoretical foundations). I will review the literature for each below; more detailed explanations as to how I will incorporate each into my own research may be found in later sections.

1.6.1 Ethnographic Content Analysis/Qualitative Document Analysis

Much of Altheide's theory-building rests upon an expanded understanding of the concept of ethnography. Traditional ideas about ethnography, he notes, place heavy emphasis on physical presence – that is, the notion that the researcher must be in a physical setting, observing human activity as it happens, in order to conduct proper research (Altheide et al., 2008). While not dismissing the importance of "immersion and involvement" in a given setting, he argues that the notion of what constitutes setting shifts when "symbolic communication" is being studied. Such research would need to be performed in the same "spirit" as traditional ethnography, but need not employ the same methodologies:

If the key element involves human beings in a situation, then an ethnographic perspective entails being "there" with the people. However, if the research focus is not on human action per se but rather on symbolic meanings and perspectives within a different domain, then a research perspective and orientation can also be said to be "ethnographic" if it is oriented to emergence, discovery, and description (Altheide et al., 2008, p. 135).

From this perspective, ethnographic research is grounded in a value system, not a specific methodology. Altheide then goes on to explain how this approach would be implemented in a scenario where symbolic communication is emphasized:

Document analysis becomes ethnographic when the researcher immerses him or herself in the materials and asks key questions about the organization, production, relationships, and consequences of the content, including how it reflects communication formats grounded in media logic. The focus initially is on exploration, reading, looking, reflecting, and taking notes before more systematic and focused observations are undertaken (Altheide et al., 2008, p. 135).

The key point here is that documentary ethnographic researchers need to adopt a grounded, flexible approach in which they allow the evidence to lead them on a process of discovery and analysis. This means that the overall goals of a given research project cannot be overly rigid, as research findings may encourage a different tack with respect to a particular question. As Altheide puts it, "[a]lthough categories and 'variables' initially guide the study, others are allowed and expected to emerge throughout the study," leading to an overall process of "constant discovery and constant comparison of relevant situations, settings, styles, images, meanings, and nuances" (Altheide, 1987, p. 68; emphasis in original).

What Altheide is proposing, then, is an approach to engaging with documentary research that breaks from older methods in which (ostensibly) objective and systematic analysis was considered ideal (see Franzosi, 2008). This sort of work centered around the notion of the protocol, defined by Altheide as "a list of questions, items, categories, or variables that guide data collection from documents" (Altheide, 1996, p. 26). While QDA researchers also pose questions and deal with categories and variables, these are all subject to change as progress is made through the documentary evidence, whereas they are essentially fixed when using traditional methods. As Altheide puts it, "the investigator is continually central" in ECA/QDA research (Altheide, 1996, p. 16; as cited by Altheide et al., 2008, p. 128). And it is the impressions made by the documents on the researcher that drive an ECA-based research project forward:

Interpretation in QDA emerges as the researcher is immersed in a community of documents, as he or she converses with them by considering them together as a community that can speak, and as he or she tracks his or her emerging interpretations in this very community of documents (Altheide et al., 2008, p. 128; emphasis in original).

Altheide's approach will allow me to develop a historical narrative with respect to specific themes related to interactive computing. I will therefore engage with relevant documentary materials both as elements of historical evidence and as carriers of discourse and thematic emphasis. This will allow me to challenge current assumptions about interactive computing, and to chart potential alternate paths with respect to how interactive technologies are designed and implemented.

1.6.2 Design Worlds

In his work The Reflective Practitioner, Donald Schön was critical of the "Technical Rationality" ethos that at the time was dominant in contemporary professional research, noting that "[f]rom the perspective of Technical Rationality, professional practice is a process of problem solving…But with this emphasis on problem-solving, we ignore problem setting, the process by which we define the decision to be made, the ends to be achieved, the means which may be chosen" (Schön, 1983, pp. 39-40). For Schön, the instrumentality inherent in such research obscures the very mechanisms by which meaningful output is produced:

In real-world practice, problems do not present themselves to the practitioner as givens. They must be constructed from the materials of problematic situations which are puzzling, troubling, and uncertain. In order to convert a problematic situation into a problem, a practitioner must…make sense of an uncertain situation that initially makes no sense (Schön, 1983, p. 40).

In later works, Schön expands upon these ideas with his concept of the design world. As Waks explains it, for Schön, design practices and design knowledge are inextricably linked, to the point where the designer must engage in his or her own practices in order to tap into the knowledge that sustains and informs that work (Waks, 2001). And this work often involves the making of things, so that "design knowledge and reasoning are expressed in designers' transactions with materials, artifacts made, conditions under which they are made, and manner of making" (Schön, 1988, p. 182). Such are the core principles of the design world, as Schön explains it:

These are environments entered into and inhabited by designers when designing. They contain particular configurations of things, relations and qualities, and they act as holding environments for design knowledge…As a designer brings understandings, strategies and images to a particular design situation, conducts a dialogue with that situation, and constructs in it a version of a more or less familiar design world, he instantiates a particular set of things to think with (Schön, 1988, pp. 182-183; emphasis in original).

The design world paradigm has proven to be influential in several areas of design practice. Mitchell's work The Logic of Architecture devotes an entire chapter to design world theory. Snodgrass and Coyne bring up the topic, though they are critical of it, arguing that a "formal language – a rule-bound and artificial language made up of primary tokens – no more gives a true account of the language of design than it does of ordinary spoken language" (Snodgrass & Coyne, 2006, p. 53). McCullough discusses design worlds within the larger context of digital crafting:

Much as different languages cast expressions of basic concepts in unique ways, so different design worlds can cast consideration of design problems in particular orientations. Much as the traditional craftsman takes care to select the appropriate tools and materials, so anyone engaged in creative computing alertly chooses a suitable design world in which to work…Design worlds are more than interfaces and constructions: they are invitations to construct particular mental models of generative strategy. They are different takes on creative computing (McCullough, 1996, pp. 185-186).

McCullough touches here on a crucial characteristic of design worlds: unlike with more serious simulation practices, they are not always intended to accurately model a particular aspect of "reality". As McCullough notes, a design world may be built in a certain way in order to allow for a particular mode of creative expression. Instrumentality is not always the primary concern here. Experimenting with new design world models, moreover, can become a creative process on its own.

Returning to Schön, one of his most compelling arguments with respect to design world practices relates to what he calls the "seeing-drawing-seeing" process. Essentially, according to Schön, designers make progress on a particular problem by engaging in a two-step cyclical process. First, the designer sketches (in whatever medium meets their needs) a representation of their design problem; usually this involves breaking a problem down into its constituent elements. The designer then critically appraises their work in order to gain new insights with respect to the project they are working on. They then sketch again based on this new knowledge, and the process continues (see Schön, 1992). Schön is also a strong proponent of the notion of shared and/or common design worlds that would enable collaborative design work. As he puts it, "[d]esigning is primarily social…Hence, designing is a communicative activity in which individuals are called upon to decipher one another's design worlds" (Schön, 1992, p. 4). Hobbyist gaming communities were strongly engaged in this sort of work; certain games and gaming tropes circulated widely, with individual programmers customizing and/or adding to existing programs to create familiar, yet personalized, versions. I will attempt to recreate the connections between these works in order to understand how collective design worlds grew and evolved.

1.6.3 Model Building

Academics both analyze texts and produce new texts. These practices cannot be decoupled – that is, in order to analyze a text, one must produce a new text, and in order to have the knowledge needed to produce a text, scholars need to read and interpret previous texts. The texts that scholars produce may then be read and analyzed in turn. According to Woolgar, there is nothing special about such scholarly texts that separates them conceptually from other texts. Appeals to scholarly authority may be raised, but given the subjective nature of textual analysis, such arguments lack a strong foundation.

Humanities scholars Willard McCarty and Michael Mahoney take this line of thinking and apply it to the realm of digital modeling. Both focus their arguments around the notion of modeling and its effects on information, and on "the fundamental dependence of any computing system on an explicit, delimited conception of the world or 'model' of it" (McCarty, 2005, p. 21). Mahoney argues that modelling is the fundamental design act with respect to computer programming, elaborating on this point as follows:

Design is not primarily about computing as commonly understood, that is, about computers and programming. It is about modelling the world in the computer, about computational modelling, about translating a portion of the world into terms a computer can 'understand' (Mahoney, 2005, p. 128).

Given that models are designed artefacts, I argue that it is possible to treat them as texts in the same way that other "machines" are studied as texts. As McCarty and Mahoney note, models are subjective representations, and by reading them one is able to discern biases and larger contextual factors that shaped their design. McCarty argues that "[w]e need to see [modelling] as a form of craftsmanship set into the context of scholarship" (McCarty, 2005, p. 22).

These scholars both go one step further, however, and argue that scholars also need to make their own models. In the introduction to his work, McCarty claims that "[t]he aim of this book, then, is to…to demonstrate persuasively not only that constructing indefinitely many such machines is the way forward, but also that doing so is a new form of traditional scholarly practice" (McCarty, 2005, p. 6). As noted in the introduction, Mahoney states that "[t]he future of digital scholarship depends on whether we can now design computational models of the aspects of the world that most interest us" (Mahoney, 2005, p. 33). This is a highly reflexive approach to text analysis. By studying models, one is able to recognize their subjective character. This subjectivity may then be leveraged in the scholarly production of new models that do not encapsulate knowledge so much as they present information from specific perspectives. Since, according to the reflexive response, scholarly texts are not othered by their supposed authority and objectivity, scholars are free to produce texts that experiment with ideas, rather than stating ostensibly objective facts. This is an approach that will be adopted in chapter six of this dissertation via the description of a specific form of computer programming system.

1.6.4 Conclusion

While I have covered a variety of different topics here, this literature review as a whole is meant to delineate a process through which all of these disparate ideas are linked. Starting from a general context – the history of digital gaming – I moved to the more specific issue of hobbyist game programming, while also touching on related concepts such as constructionist learning and cold war computing. From there, I defined interactivity as a key concept for understanding hobbyist practices, and traced the various elements which make up the history of interactivity. Finally, I outlined both an overarching epistemology – the technology as text – as well as specific methodologies I will use in my research on these topics.

Now that this process has been defined, however, I believe that it is possible to shift immediately to the topic of interactivity with respect to the research and analysis sections that follow. This work will inevitably lead to hobbyist computing; unlike in this literature review, I will not stop my inquiry into interactivity before the hobbyist era, but will instead continue on into the 1970s and 1980s. I will, of course, bring in other themes such as the Cold War and constructionist learning. But interactivity is the notion that will drive the narrative, for reasons I will discuss in more detail in the following section.

2 Origins: Computation, Mathematical Modeling, and the Emergence of Operational Gaming

This chapter provides a history of the concept of computation as it relates to digital computing technologies, and the development of forms of gaming that conformed to this conceptual framework. It is a history of digital computing, but not the sort that is traditionally offered both in academic and non-academic contexts. Such accounts tend to focus on a relatively small cadre of men, women, and machines. What has been missing is a historical account of the idea of computing, and how this idea evolved over time. Such an approach will shed new light on events that may otherwise seem disparate, and will indicate how digital computing may be further developed to address needs that have been largely sidelined.

The earliest electronic computers reproduced a specific form of human computation which favoured step-by-step arithmetic operations over more sophisticated analytical methods to solve specific mathematical problems. It is a powerful approach, allowing for the rapid solution of complex problems without the need for human intervention, but it is also limited, closing off other approaches that might prove more insightful. With the development of interactive computing, some steps were taken to address this issue, but practices associated with concepts such as software engineering have largely erased such gains. The RAND Corporation played a key role in shaping such events at a critical moment, and the games that it designed at the time reflect this fact. Such games were categorized as "operational games" by RAND, and evolved out of mathematical modeling processes developed over the course of the Second World War. I will argue here that, despite the fact that the earliest operational games did not rely on computers, digital computation and operational gaming are in fact founded on the same premise – that is, they both rely extensively on the quantitative modelling of real-world systems, albeit from different perspectives – and it was these similarities that enabled RAND to become a major innovator in digital computing for a brief time in the 1950s and 1960s. The earliest digital computers were designed to model aspects of mechanical physics in order to more effectively wage war with ballistic weapons. RAND's models, however, were much wider in scope, and were intended to simulate aspects of future warfare at a strategic level. This included not only the simulation of battles, but also of issues such as procurement and logistics. These two forms of modeling – one digital, one derived from pen-and-paper exercises – would converge when operational games were migrated to digital platforms. This chapter will focus on the histories of digital computation and operational gaming before this convergence took place.

An overarching argument that I will make in the following three chapters is that the history of digital computing technologies, and of digital computation in general, was influenced by games to a significant degree – that is, existing scholarship on the history of digital computers may be enriched with a more concentrated focus on gaming. Or, taking a different perspective on the same issue, the problems that the earliest computers were meant to solve were easily adapted to gaming purposes, and were in essence very game-like. Key to this argument is the development of mathematical models. A model is, generally, a collection of variables, and a series of mathematical statements that incorporate these variables. Digital computers started by adapting a particular model that has existed for centuries: classical mechanics, first developed by Isaac Newton to describe the movement of objects through space. Any problem framed with the proper variable values may be "processed" by classical mechanics in such a way as to predict the future behavior of a given object, even if the object itself is purely hypothetical. That is not to say that classical mechanics will provide a complete answer – rather, it will produce results that are consistent with the parameters set out in the model itself.6 This is a crucial point: models are not full articulations of real-world systems. They are instead attempts to mathematically reflect certain aspects of such systems, at least initially. A key element of the narrative that will be built in this chapter, as well as the two that follow, is the surge in interest in mathematical models in the sciences, social sciences, and eventually in digital gaming.
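To make this notion of a model concrete, consider the following minimal sketch, written in Python purely for illustration (the variable values, and the choice of language itself, are my own and are not drawn from any of the sources discussed here). The model consists of nothing more than a handful of named variables and a single equation of classical mechanics; supplying values for those variables "processes" the model and yields a prediction about a hypothetical object, while everything the model does not encode – drag, wind, the shape of the object – simply does not exist for it:

# A "model" in the minimal sense used here: a few named variables plus
# one equation of classical mechanics. All values are illustrative only.
def predict_height(y0, v0, t, g=9.81):
    """Height of a projectile t seconds after launch, ignoring anything
    the model does not encode (air resistance, wind, spin, and so on)."""
    return y0 + v0 * t - 0.5 * g * t ** 2

# "Processing" the model: supply variable values, read off a prediction.
print(predict_height(y0=0.0, v0=50.0, t=3.0))  # roughly 105.9 metres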

War games – specifically, games played by militaries and military research institutions to examine and explore various war and battle-related scenarios – are another important element of this narrative. In particular, the development of war games at the RAND Corporation, including what came to be known as operational games, was a pivotal moment in the evolution of digital gaming. Perhaps because these were not consumer products, or perhaps because they were put to pragmatic uses, the importance of RAND's games has not been recognized in academic research up to this point. Yet the vast majority of games within specific genres, including strategy games, simulations, role-playing games, as well as any games that incorporate elements from these genres, were derived from games that ultimately had their roots at RAND.

6 Later developments in physics such as Einstein's relativity theories, as well as the emergence of quantum physics, reveal behaviours that are not captured by classical physics. Even so, mechanics produces usable results for most problems.

This chapter will begin, however, with a discussion of events that preceded the development of digital computers, with a focus on "human computers" – that is, human beings tasked to solve complex mathematical problems. In many ways, decisions that were made in this era about the nature and structure of human computation would go on to serve as a major influence in the design of the earliest digital computing devices. This is particularly true with the creation of the Mathematical Tables Project, a Depression-era governmental organization that hired unemployed "unskilled" workers to work on complex mathematical problems using the most basic of arithmetical operations. It is this particular template that would inform the design of the earliest digital machines. As such, digital computers were designed to solve similar problems via numerical analysis – that is, the breaking down of a problem into basic logical and arithmetical operations. This allowed for a form of automated computation that was quick and efficient, but also quite limited, allowing only for a rigid determinism. Programs had to be set up entirely in advance, and once executed users could only stand by and wait until the computer processed everything and output the results, assuming that the code was properly entered into the system. Users thus lost virtually all agency when a given program was executing, much like autoworkers of the same era working on automated assembly lines. There were other means to solve the sorts of problems computers were designed for that were more analytic and holistic, but on a digital platform these would require interventionist forms of computing that incorporated user input and analysis as a given problem was being solved. The choices made in this foundational era continue to reverberate, as most digital computer programs operate largely autonomously. Some workarounds were implemented that increased user agency, but only to a limited degree; these will be discussed in the next chapter.

The materials presented here were the result of a research methodology "oriented to emergence, discovery, and description," as delineated by Altheide (Altheide et al., 2008, p. 135). The hobbyist computer games described in later chapters were actually the starting point in a journey that took me back to the human computer era, and led me to focus in particular on the work happening at RAND in the postwar era. The documents I cite here led me to pose new questions and evolve my interpretative framework in light of new findings. Certain aspects of the narrative, such as the role played by MIT's Project Whirlwind computer, faded as new discoveries suggested alternate links between pre-digital computing, postwar war gaming, and hobbyist programming. The "community" of documents (to use Altheide's term) presented here is the end result of a flexible, adaptable approach to documentary research.

2.1 Human Computers, Skilled and Unskilled

Many of the fundamental concepts associated with digital computing were inherited or adapted from analog practices developed in the years before the Second World War, and notions of what it means for a computer to be interactive (or not) originate in this period. Such history is of more than antiquarian interest; with respect to the "histories" of computing, Mahoney notes the following:

We have the story of where the physical devices came from, how they have taken their current form, and what differences they have made. But we remain largely ignorant about the origins and development of the dynamic processes running on those devices, the processes that determine what we do with computers and how we think about what we do (Mahoney, 2005, p. 127).

Process, from this perspective, drives design, not the other way around. With respect to the origins of modern computation, the work of Grier is essential. Grier has argued that scholars need to look more closely at the "human computer" era which began in earnest in the 1930s (Grier, 1997; and Grier, 1998).7 This was a time in which groups of like-minded individuals would gather to generate tables and other mathematical instruments by which to solve complex problems (Grier, 1997; and Grier, 1998). Such tables were used in a variety of disciplines, but among the most important were those that described the solutions to differential equations. Such equations needed to be integrated, a procedure from basic calculus in which the area under a graph – i.e. an equation plotted as a graph – is determined. Such an operation is vital to solving a wide variety of problems in the field(s) of physics, but can become exceptionally complex when all but the most basic of equations are considered. These computing organizations thus dedicated themselves to working out such problems and disseminating the results.

7 In a later monograph, Grier traces the origins of computing back even further, to the seventeenth century. In terms of tracing direct antecedents, however, I believe that it is more productive to begin at this later date (see Grier, 2005).

Among the more prominent of these human computer organizations was the Mathematical Tables Project, based in New York City. It was created in 1938, in the later years of the Great Depression, by the Works Projects Administration (WPA), the American federal agency in charge of commissioning development projects in order to reduce unemployment (Grier, 1998). The fact that the Mathematical Tables Project was a government-sponsored "make-work" initiative is extremely important with respect to understanding the roots of modern computing. As Grier notes, most pre-existing "scientific computing organizations" were made up of individuals who were "at least semiskilled," meaning that they knew enough about the mathematical work they were engaged in to be able to use "logarithm tables, slide rules, mechanical desk calculators, and, in the most modern facilities, punched card equipment" (Grier, 1998, p. 33). Yet the Mathematical Tables Project, in keeping with its mandate of employing ostensibly "unskilled" workers, required that a significant proportion of its work be done by hand, using basic arithmetic. As Grier notes, "the manual computing unit employed…about 150 jobless office clerks and other out-of-work white-collar workers. This group computed by hand, with no tools other than paper and pencil" (Grier, 1997, pp. 20-21). It just so happens that complex differential equations may be integrated using such purely "numerical methods," as they are often called. This means breaking down the problem into a series – often a very large series – of simple sub-steps that, when taken together, produce an answer of acceptable accuracy (however that may be defined; generally, more sub-steps will lead to more accurate results).
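As an illustration of what such purely numerical integration involves, the short sketch below (my own, in Python; it is not drawn from the Project's actual worksheets or procedures) approximates the area under a curve by adding up the areas of many thin rectangles. Each sub-step requires only one multiplication and one addition – exactly the kind of operation that could be parcelled out, one at a time, to workers equipped with nothing but paper and pencil – and the approximation improves as the number of sub-steps grows:

# Numerical integration by simple arithmetic: approximate the area under
# f between a and b by summing n thin rectangles (a Riemann sum).
# Each sub-step involves only one multiplication and one addition.
def integrate(f, a, b, n):
    width = (b - a) / n
    total = 0.0
    for i in range(n):
        total += f(a + i * width) * width
    return total

# Example: area under f(x) = x squared between 0 and 1 (exact value: 1/3).
print(integrate(lambda x: x * x, 0.0, 1.0, 100))    # ~0.3284
print(integrate(lambda x: x * x, 0.0, 1.0, 10000))  # ~0.3333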

A major consequence of such a strategy – which was intentional – was that the practices employed at the Mathematical Tables Project reflected those that unskilled workers engaged in on the automated assembly lines that had been one of the major innovations of early-twentieth century industrial manufacturing. Since the tasks performed by each human computer were extremely simple, the overall effect was an assembly-line model of computation, as Grier vividly illustrates:

Like the killing floors of the meatpackers or the assembly lines of Henry Ford, the Computing Floor of the Math Tables Project divided complex tasks into their component pieces and assigned each piece to an unskilled worker…Project computers would usually perform only a single operation, such as addition or multiplication (Grier, 1998, p. 33; see also Smith, 2005).

In effect, then, the process of solving an equation and/or producing a mathematical table had been "deskilled", to the point where individual workers were only concerned with one simple element of a larger problem that they did not even need to know about.8 A small group of managers at the Project were in charge of devising the methods used to produce their tables, and then dividing up the work amongst the lower-level employees. Such workers thus had to rely wholly on the institution that employed them to set the agenda with respect to what work they were going to be engaged in at any given moment. They were, in essence, components of a larger process over which they had no control, subjects within a "command and control" structure that, according to Chun, pervades all computation. Command and control, she argues, encompasses the disciplinary and managerial practices that emerged within military organizations over the course of the Second World War, which according to Edwards (whom she cites) are exemplified by "personal leadership, decentralized battlefield command, and experience-based authority" (Chun, 2005, p. 33; and Edwards, 1996, p. 71). The Mathematical Tables Project is an example of such a system that existed before the war even began, and was the product of a society in which bureaucratization was becoming increasingly sophisticated in the wake of mass industrialization. The notion of computing as a form of bureaucracy has persisted; Joseph Weizenbaum, writing about the nature of computer programs, noted that "[p]rogram formulation is thus rather more like the creation of a bureaucracy than like the construction of a machine of the kind Lord Kelvin9 may have understood" (Weizenbaum, 1976, p. 234; as cited by Chun, 2011, p. 28). Chun also acknowledges this debt, declaring that the "bureaucracies within the machine…mirror the bureaucracies and hierarchies that historically made computing possible" (Chun, 2011, p. 28). In the case of the Mathematical Tables Project, then, it was a literal bureaucracy that behaved as a metaphoric "machine."

8 Akera disputes the idea that these workers were "unskilled", but this appears to be largely an argument about semantics. He acknowledges that mathematicians were only hired at the managerial level, and that the calculating work was performed largely by unemployed clerical workers (Akera, 2007).

9 The reference to Lord Kelvin is scoped within the larger context of early industrialization, an era when mechanistic thinking was paramount within scientific circles. As Weizenbaum argues, due to the "imaginative impact of the relatively simple machines that transformed life during the eighteenth and nineteenth centuries," it "became 'second nature' to virtually everyone living in the industrialized countries that to understand something was to understand it in mechanistic terms" (Weizenbaum, 1976, p. 233).

Yet the Mathematical Tables Project, and the specific form of top-down bureaucratic computing that it embodied, would actually fade into the background with the coming of war, at least initially. With the end of the Depression and the opening of the Second World War, the Project's make-work mandate was dropped, and its resources were assumed by various divisions within the United States military. This meant that arithmetic computation could be set aside in favour of more complex operations, performed by more qualified mathematical workers (Grier, 1997). The creation of mathematical tables had already assumed a prominent place in military research in large part due to the necessity of determining the firing ranges of the growing number of ballistic weapons that were used on the battlefield. Like many other problems that human computers were tasked to solve, ballistic ranges had to be worked out by integrating complex differential equations. Factors such as air resistance and air temperature made it quite difficult to calculate such ranges, and known values would be printed into "firing tables" for operators to use in combat (see Polachek, 1997). In the 1930s, concerns over this issue had led to the creation of the Ballistics Research Laboratory (BRL), which was operated by the United States Army Ordnance Center at the Aberdeen Proving Ground in Aberdeen, Maryland (see Stern, 1981, p. 10).

Unlike the Mathematical Tables Project, the BRL was not constrained by the need to employ unskilled workers, and it thus took a very different approach to its work. Jennifer Light discusses this in detail with respect to the women of the Women's Army Corps (WAC) who were recruited by the BRL to help devise its artillery tables during the Second World War (Light, 1999). Unlike the workers at the Mathematical Tables Project, these "computers" were given extensive training in the methodologies used to devise solutions for the tables they produced. Recruited women were rotated into classes in mathematics at the Moore School of Electrical Engineering at the University of Pennsylvania for periods of eight months at a time (the university had a close relationship with the BRL which will be discussed in more detail below). As Light notes, "[t]he mathematics ranged from elementary algebra to simple differential equations. In addition, a unit on the use of calculating machines covered computation and calculation-machine techniques, handling numerical data, organizing work for machine calculation, and using slide rules" (Light, 1999, pp. 466-467). These workers were almost exclusively college educated, some with degrees in mathematics, so they were well-prepared for this sort of training. They were thus given a level of empowerment over their work that the employees of the Mathematical Tables Project wholly lacked. Unlike their counterparts at the Project, the BRL's employees were given the information necessary to fully understand the nature of the problems they were dealing with, along with the methodologies used to obtain solutions.

Rather than engaging in the rote work of performing a single arithmetical operation repeatedly, as was the case at the Project, the BRL's workers would take on a given problem in its entirety, and perform all the steps necessary to develop the firing tables that were derived from these problems. Given the nature of the numerical methods used to obtain solutions, this did often mean that arithmetic was required, but they also engaged in more analytic work as well. Kathleen McNulty, one of the first women hired on by the BRL, noted that "when you finished the whole calculation" of a ballistic bullet's trajectory, "you interpolated the values to find out what was the very highest point and where it hit the ground" (Shurkin, 1984, p. 128). Such work was far from automatic, and required a full understanding of the nature of the problem that one was working with. McNulty was also extremely knowledgeable with respect to the nature of the problem of finding ballistics ranges beyond its mathematics, as she demonstrated when discussing the problem years later:

As the bullet travels through the air, before it reaches its highest points, it is constantly being pressed down by gravity. It is also being acted upon by air pressure, even by the temperature. As the bullet reached a certain muzzle velocity…when it got down to the point of 1,100 [feet per second], the speed of sound, it wobbled terribly (Shurkin, 1984, p. 127).

While such information was not necessary to solve the problem in purely mathematical terms, the fact that it was provided to the BRL's workers demonstrates that they were attempting to train knowledgeable workers who could think, and not merely act, and could understand the wider context of the work they were doing.
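The kind of hand calculation McNulty describes can be sketched in code. The following is my own simplified reconstruction in Python, not the BRL's actual procedure (the drag coefficient and other values are invented for illustration): the trajectory is advanced in small time-steps under gravity and a crude drag term, and the final step is interpolated to estimate where the shell "hit the ground", much as McNulty recalls interpolating the computed values by hand:

import math

# Simplified trajectory under gravity and a crude drag term; the drag
# coefficient k is illustrative, not a value from any historical firing table.
def range_of_shot(v0, angle_deg, k=0.0001, g=9.81, dt=0.01):
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x, y = 0.0, 0.0
    while True:
        speed = math.hypot(vx, vy)
        ax = -k * speed * vx            # drag opposes horizontal motion
        ay = -g - k * speed * vy        # gravity plus vertical drag
        x_new, y_new = x + vx * dt, y + vy * dt
        if y_new < 0.0:                 # the shot crossed the ground this step:
            frac = y / (y - y_new)      # interpolate to find the impact point
            return x + frac * (x_new - x)
        x, y = x_new, y_new
        vx, vy = vx + ax * dt, vy + ay * dt

# Example: estimated range (in metres) of a shot fired at 450 m/s and 45 degrees.
print(range_of_shot(v0=450.0, angle_deg=45.0))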

These projects – both the BRL, and the Mathematical Tables Project – reflect the centrality of modeling in pre-digital computation. As noted earlier, physics relies largely on models of various real-world phenomena, reduced to equations that may be leveraged to predict the "real-world" implications of specific processes (see Arfken, Weber, and Harris, 2012). As human computing organizations were employed largely to solve physics-related problems, the digital computers that were built on their foundation were well-positioned to play important roles in the increasingly popular model-based methods used in the social sciences at the time. As McCarty notes, the models first used in sociology were derived in form and function from physics models, with Comte originally calling for a "social physics" to support a "natural science of society" (McCarty, 2005, p. 146). While modern sociological approaches are much more nuanced, the spirit of Comte's words carried through the postwar period, as modeling, simulations, and certain forms of gaming were seen as positivist elements of change. Boocock, recounting this period many years later, notes the following:

There was…a feeling of being part of a larger process of reform in schools and society in general…Many social scientists believed that American society was on the threshold of a major societal transformation, and that they would play an important role…Orville Brim, then president of the Russell Sage Foundation, predicted that social science knowledge would change the world "as drastically as did nuclear weapons", and that in thirty years' time Americans would have the knowhow to produce the kind of individuals and societies that they chose (Boocock, 1996, p. 153).

The increasing use of digital computing in these early years, in both the public and private sectors, was due, to a significant degree, to the popularity of quantitative models in a wide variety of disciplines, from war gaming to urban planning to business management. Without such enthusiasm for quantitative methods, digital computation might have remained a niche pursuit, at least initially. And while models are no longer as popular in such fields as they were in this earlier era, they remain the core components in many forms of digital gaming.

2.2 ENIAC and Whirlwind

With the advent of the Second World War, and the entry of the United States into the war in 1941, ballistics tables became more important than ever, and the time required to complete a single table became an increasing concern. Given the rate at which new artillery pieces were being developed for the war effort, output speed became the most important factor with respect to the computational work needed to produce firing tables. The ENIAC, the first electronic digital computer to be designed and built, was commissioned in large part to tackle the linked issues of time and efficiency. John Mauchly, a professor at the Moore School, wrote a paper in 1942 that extolled the virtues of the (then-hypothetical) electronic computer in terms of its speed. He claimed that a problem made up of "10,000 steps" would take an electronic device only "100 seconds" to calculate, while human computers could take "at least several hours" to do the same (Mauchly, 1942/1982, p. 358; emphasis in original). It is worth reflecting on this point for a moment to gain more of an understanding of the processes he envisioned. In the introduction to this paper, Mauchly discusses his intentions as follows:

It is the purpose of this discussion to consider the speed of calculation and the advantages which may be obtained by the use of electronic circuits which are interconnected in such a way as to perform a number of multiplications, additions, subtractions or divisions in sequence, and which can therefore be used for the solution of difference equations (Mauchly, 1942/1982, p. 355).

Mauchly, then, was only talking about the savings in time one could gain with respect to basic arithmetical operations of the sort that the Mathematical Tables Project had employed. In fact, in a later report on the ENIAC, Brainerd and Sharpless noted that for particular sets of problems, the time required to set them up on the machine would be better spent on more sophisticated mathematical techniques, such as "quickly converging series", performed by hand (Brainerd and Sharpless, 1948, p. 168). But Mauchly had long been interested solely in automating simple arithmetic, and had even once set up his own version of the Mathematical Tables Project via the National Youth Administration, a federal program similar to the Works Projects Administration (Akera, 2007). Despite the advanced techniques employed by human computers at, for example, the BRL, Mauchly was focused on the rote work of assembly-line arithmetic.

The ENIAC, then, developed at the Moore School, was the first electronic computer built according to Mauchly's plans. His 1942 paper, cited above, attracted the attention of the BRL, which was desperate for ways to alleviate its backlog. In 1943, after liaising with Brainerd, the U.S. Army reached an agreement with the Moore School to develop a machine based on Mauchly's ideas, though it was intended specifically to address the ballistics table issue. Stern makes note of this contradiction:

The ENIAC was to be designed with a special application in view. That is, it would be designed expressly for the solution of ballistics problems and for the printing of range tables, though, as originally envisioned by Mauchly, the device could have had wider applicability (Stern, 1981, p. 15).

Brainerd and Sharpless, in fact, described the ENIAC as an "electronic large-scale general-purpose digital computing device" (Brainerd and Sharpless, 1948, p. 163). The term "general-purpose" as it is used here can be misleading, and refers simply to the task of solving certain mathematical problems via numerical methods – that is, iterating a finite series of values over a specific algorithm or algorithms.
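
To make concrete what iterating a finite series of values over a specific algorithm meant in the firing-table context, the following sketch, written here in Python for legibility, steps a projectile trajectory forward with a simple difference-equation scheme. It is a minimal illustration only; the drag constant, step size, and muzzle velocity are arbitrary assumptions, and this is not a reconstruction of the BRL's actual methods.

```python
# Minimal sketch (not the BRL's actual method): stepping a projectile
# trajectory with a simple difference-equation (Euler) scheme, the kind
# of rote, repetitive arithmetic that firing-table work required.
import math

def trajectory_range(v0, angle_deg, drag=0.0001, g=9.81, dt=0.1):
    """Return the approximate range of a projectile, integrated step by step."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        # One small time step: update velocities, then positions.
        vx -= drag * speed * vx * dt
        vy -= (g + drag * speed * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

# A firing table is, in this framing, the same calculation repeated over
# many elevations (and, in practice, many muzzle velocities and conditions).
for elevation in range(15, 76, 15):
    print(f"{elevation:2d} degrees: range ~ {trajectory_range(500, elevation):8.1f} m")
```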

Mauchly's preferences and biases thus had profound implications for the future of digital computing. It is important to remember that this rote, arithmetic sort of computation was initially a social construct implemented to support the hiring of unskilled and semiskilled labourers. Here then is an example of a "political" technology as Winner would define it, in the sense that particular "arrangements of power and authority in human associations" have been frozen in the design and implementation of digital computation (Winner, 1980, p. 123). Chun argues that, with respect to professional digital computing, "routinization or automation lies at the core of a profession that likes to believe it has successfully automated every profession but its own," but this is only the case because of several decades' worth of computer science research and development in which automation has been cast as a foundational principle (Chun, 2005, p. 34). The very concept of automation has thus been reified. Given the ubiquity of this model, automated computing is often seen as an objectively "pure" concept. As MacKenzie and Wajcman note, "[t]he development of computer technology…is often seen as following trajectories that are close to natural laws," standing at a remove from the ambiguities and uncertainties of the real world, when in fact it was directly derived from specific real-world practices (MacKenzie and Wajcman, 1999, p. 3).

ENIAC's direct successor was the EDVAC (Electronic Discrete Variable Automatic Computer), a more fully realized version of the stored-program computer concept described in von Neumann's report. By this point, however, other digital computer development projects were emerging at other institutions.10

10 The question of how influential von Neumann's report actually was in the development of the first Moore School computers was a major point of contention in the acrimonious patent dispute between von Neumann and Eckert and Mauchly. Continuing battles over patent rights eventually caused Eckert and Mauchly to leave the University of Pennsylvania entirely and form their own private company, the Electronic Control Company, which was quickly renamed the Eckert-Mauchly Computer Corporation (EMCC; see Stern, 1981). While the history of EMCC is interesting, other events are more relevant for the present discussion.

The Whirlwind computer, a product of MIT's Servomechanisms Laboratory, designed and built in the immediate postwar years, was based in part on the EDVAC but was initially commissioned for very different purposes.

Project Whirlwind, as the overarching program was called, resulted in the first digital computer designed for "real-time" interactivity – that is, it allowed the user to interact with a program while it was running. This is because the project initially called for the development of a general-purpose mechanical flight simulator. At the behest of U.S. Naval officer Luis de Florez, who had pioneered much of the mechanical flight simulation work engaged in by the Navy during the Second World War, the Office of Naval Research (ONR) commissioned the Servomechanisms Laboratory to "develop a protean, versatile, master ground trainer that could be adjusted to simulate the flying behaviour of any one of a number of warplanes" (Redmond and Smith, 1980, p. 2). Originally the calculations necessary were to be performed by an "analogue" computer, but, as indicated in an early report from the Laboratory, "[b]ecause the amount of computation required for the aircraft analyzer problem appears greater than that practical for analogue computers, it is desirable to consider carefully the use of high-speed electronic digital computing methods" (Servomechanisms Laboratory, 1946, p. 11). The head of the project, Jay Forrester, became increasingly invested in the construction of the digital computer component of the project, and very gradually he was able to diminish the importance of the simulator element until it disappeared completely, and Project Whirlwind became a computer engineering endeavour (Redmond and Smith, 1981).

A simulator would have required some means to rapidly send data back and forth from the computer to the mechanical "cockpit", where users would have been able to read and respond to changes reported by the instrument panel. This was a notion of "real-time" interactivity that was wholly new; since the user would have had the agency to manipulate variables within the simulated system, the programs the computer would have been expected to execute would not, from the outset, have all the data they needed. This was also very much a design plan built around a mathematical model; while the physics were more complicated than those of the ballistic weapons that ENIAC catered to, the principles of simulating a physical system via differential equations were essentially the same. Even when the simulator aspect of the project was dropped, and its funding was in peril, Forrester continued to tout the benefits of Whirlwind as a potential digital simulator for military purposes (Akera, 2007). At one point the Servomechanisms Laboratory proposed a simulation program for submarine warfare. From the language used to describe this simulation, it is clear that everything centered on the operation of a mathematical model:

Beginning at the point marked "start", the new position of the first ship is computed based on available data regarding its previous position and its speed and bearing during the previous time interval. After computation is complete for the first ship, the control orders are indexed to the position of the second ship and the same program of computation is repeated for the second ship. Repetitions of this computing program are continued until all ships have been calculated (Servomechanisms Laboratory, 1947, p. 5).

Real-time computation, then, was still very much what Forrester considered to be Whirlwind's most important selling point, even if he had pushed the project away from its original intentions. Despite such proposals, however, the ONR did in fact pull its funding. The U.S. Air Force, newly formed in 1947, stepped in, intending for Whirlwind to be the central piece of a vast, real-time radar system that would warn of Soviet incursions into American airspace. By the mid-1950s, however, Whirlwind and its proposed successor, Whirlwind II, were abandoned, and the Semi-Automatic Ground Environment (SAGE), as the new radar system would be called, ran on modified IBM computers (Redmond and Smith, 2000).

Despite this somewhat ignominious end, Whirlwind was a highly influential machine that informed many facets of interactive computing. A major reason for this is that Whirlwind was the first computer to come with a display screen, or, more specifically, a series of increasingly sophisticated oscilloscopes. At first, these displays were more or less ancillary devices, as a description in an early internal Whirlwind report indicates:

The display equipment now in use with WWI is intended primarily for demonstration purposes. It gives a qualitative picture of solutions to problems set up in test storage, and it illustrates a type of output device that can be used when data are desired in graphical rather than numerical form (Servomechanisms Laboratory, 1949).

It was not long before people began putting these displays to new uses, however. Given the nature of an oscilloscope display – where individual points of light fade quickly after they are fired onto a screen – it was relatively simple to devise what may have been the first experiments in computer animation. A prominent demonstration of this technique was delivered in a 1951 episode of the television news documentary series See It Now, co-created and hosted by Edward R. Murrow. Along with other elements of stagecraft – at one point it appears as if Whirlwind itself is talking to Murrow – Forrester, who was being interviewed, presented a display program in which the arc-shaped trajectory of a hypothetical rocket was traced from the left side of the screen to the right side. Represented by a single dot, the rocket's apparent movement was accomplished by plotting one point on the trajectory graph at a time. When a new point was plotted, the previous one would quickly fade away. Vertical sliders on either side of the screen indicated the amount of fuel remaining and the current velocity, and were animated in the same fashion (See It Now, 1951).

An in-house program called Bouncing Ball borrowed this animation style and applied it to what might be the first display-based computer game, depending on where one draws the boundary between games and interactive display programs. Norman Taylor, a former engineer on the Whirlwind project, speaking at the 1989 SIGGRAPH conference, described Bouncing Ball as follows:

Charlie Adams, the original programmer, decided that we'd better go beyond static curves. And he invented what we call the Bouncing Ball Program, the solution of three differential equations...You see that the bouncing ball finds a hole in the floor and the trick was to set the frequency such that you hit the hole in the floor. This kept a lot of people interested for quite a while and it was clear that man-machine interaction was here to stay. Anyone could turn the frequency-knobs (Hurst et al., 1989, p. 21).

Essentially, Bouncing Ball, like the rocket demonstration described above, leveraged the oscilloscope screen to produce an animated display of a trajectory plot, the major difference being that the ball/dot would "bounce" every time it hit the "floor", which was really just a plot of the x-axis. A "hole" in this floor was what the player was attempting to reach. Interactions with this problem were accomplished via the "knobs" described by Taylor, which altered the plot of the graph in various ways. It was this capability that made Bouncing Ball an interactive game and/or display program.
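
To give a sense of what such a display program involved, the following toy sketch advances a ball one small time step at a time, bouncing it off the "floor" unless it reaches a hole. The constants, the width and position of the hole, and the printed coordinates standing in for plotted dots are all assumptions made for illustration; the Whirlwind original solved differential equations and drew fading points on an oscilloscope, with the operator's knobs tuning the motion.

```python
# Toy sketch of a Bouncing Ball-style display program (assumed constants;
# not the Whirlwind original, which drew fading dots on an oscilloscope).
def bouncing_ball(g=9.81, damping=0.7, dt=0.05, hole=(7.5, 9.5), steps=400):
    x, y = 0.0, 10.0      # start at the left edge, above the "floor"
    vx, vy = 1.5, 0.0     # the knobs on the real machine effectively tuned this motion
    for step in range(steps):
        vy -= g * dt
        x += vx * dt
        y += vy * dt
        if y <= 0.0:                       # the ball reaches the floor (the x-axis)
            if hole[0] <= x <= hole[1]:    # ...and falls through the hole: success
                return x
            y, vy = 0.0, -vy * damping     # otherwise it bounces, losing some energy
        if step % 20 == 0:
            print(f"plot point ({x:5.2f}, {y:5.2f})")  # stand-in for one plotted dot
    return None

landing = bouncing_ball()
if landing is not None:
    print(f"ball fell through the hole at x = {landing:.2f}")
else:
    print("ball never reached the hole")
```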

In 1962, one of Whirlwind's successor machines, the PDP-1, was used by several MIT-based programmers to develop what was unambiguously a game: the science fiction shooter Spacewar. Designed and programmed by Martin Graetz, Steve Russell, and Wayne Witanen (though ultimately Russell did most of the coding), the game pitted two player-controlled spaceships against one another on the PDP-1 display screen (Graetz, 1981). With a "star" in the middle of the screen acting as a gravity well, players had to navigate the tricky environment while firing "torpedoes" at one another. While the gameplay would seem simplistic now, at the time it was quite unlike anything ever seen before on a digital screen. It was so impressive, and had such an impact on computer users, that Stewart Brand, writing for Rolling Stone magazine, chronicled a frenzied Spacewar tournament at Stanford University a full ten years after the game was created (Brand, 1972). Spacewar would go on to serve as a major influence in the development of some of the earliest arcade games; that story, however, is beyond the scope of the present study.

2.3 Systems Analysis and the RAND Corporation

The ENIAC and Whirlwind were physical manifestations of wartime thinking – they represented a desire to systematize thought processes in an age when the waging of war and related tasks (such as diplomacy and scientific research) had become distressingly complex. The political and cultural conditions that gave rise to a certain form of digital computing would continue in earnest in the postwar and early Cold War periods, with crucial consequences for the development of digital computing technologies. Much of the relevant research in this period was conducted by the RAND Corporation. The RAND Corporation grew out of Project RAND, a creation of the Douglas Aircraft Company that was "charged with investigating future weapons for the Army Air Forces" (Johnson, 2002, p. 32). This meant attempting to estimate "future technological and operational trends" with respect to waging war, and then developing policy recommendations to address these potential trends (Johnson, 2002, p. 32). Project RAND was also initiated as a means by which research on military projects could continue after the end of the Second World War.

As Smith notes, "[m]any people feared that the effective wartime partnership between scientists and the military could not be maintained in peacetime" (Smith, 1966, p. 38). Project RAND was detached from Douglas and spun off into the RAND Corporation in 1948. This meant that the militarization of scientific research would continue, which also meant that the tools used in such research, including the digital computer, would stay relevant. Given the enthusiasm that was building with respect to digital machines, research in computer-related fields likely would have continued even without RAND. But since RAND was in the picture, it was able to exert strong influence over the ways in which electronic computers developed. As Edwards notes, "computer science, besides being heavily funded by the military, was conceptually driven by the strategic and technological puzzles the military provided, especially in its early years" (Edwards, 1990, p. 115). As one of the military's primary puzzle "solvers", RAND played a pivotal role in this history.

As noted in the introduction, RAND is important to this discussion for a crucial reason: it was an early hub for computer gaming, and many, if not most, of the hobbyist computer games of the 1970s ultimately had their roots there. Unlike these later games, however, which were largely developed for recreational purposes, RAND's games were intended primarily as military-related teaching tools, and simulated various aspects of real-world combat.

RAND's interest in computer gaming grew out of its work in a field known as operations research. Operations research (OR) is the name generally given to an important set of research and analysis techniques that emerged over the course of the Second World War. As the nations involved in that conflict came to rely increasingly on the effective leveraging of scientific and technological advances, OR offered what appeared to be objective, scientific methods to inform decision making in matters such as force deployment and logistics. Such work generally involved the development of mathematical models, which are the fundamental components of OR work. As indicated by Cushen, modeling involves the isolation of key variables within a particular system or domain, as well as the relationships between those variables:

Among the many problems which arise in both industrial and military contexts is the optimal use of resources in operations which are designed to bring about the objectives toward which the organization is directed. The key to the solution of the problem, once recognized, is the preparation of a model by which the operation can be studied. It lies in isolating the elements which are important, and in describing the relations between those elements (Cushen, 1955, p. 309).

OR represented in many ways a break from traditional military orthodoxy in that it relied on the work of civilian analysts in order to inform wartime planning. Interestingly, it was the development of radar in the United Kingdom that encouraged military officers to reconsider the value of scientific and mathematical research with respect to strategic and tactical planning, and OR first flourished in the Royal Air Force (RAF). In the United States, OR was met with initial resistance by Vannevar Bush, director of the powerful Office of Scientific Research and Development (OSRD), but in the face of pressure from those who had observed its effectiveness overseas, Bush relented in 1943 and created the OR-focused Office of Field Research. Both the United States Army and Navy concurrently developed their own OR offices, with the Army Air Force taking a particular interest in the field, given its origins with the RAF (Shrader, 2006).

At the close of the Second World War, these same military branches established permanent OR groups to continue their work in the field. The United States Navy created the Operations Evaluation Group (OEG), while the Army, at a slightly later date, founded the Operations Research Office (ORO) (Shrader, 2006). The Air Force, newly separated from the Army to become its own independent branch, went a step further, both creating an in-house OR office – the Operations Analysis Division (OAD) – and tasking the newly-created RAND Corporation with developing its own OR research. Conceptually, the work performed by each group was meant to be complementary but distinct, as Shrader notes:

Day-to-day OR problems continued to be handled by OAD and the OA offices at major Air Force headquarters. The RAND research program constituted what the Air Force called "background research," that is, the application of scientific analysis of the weapons, equipment, methods, and organization of air warfare, including economic, political, and social factors (Shrader, 2006, p. 60).

RAND's OR models thus "often involved political, economic, and social considerations," with social scientists later playing an important role in their development (Shrader, 2006). It was this concern for factors not directly related to combat that would prove crucial in terms of how OR practices were disseminated in the years to come.

2.4 Systems Analysis and Monte Carlo

In their OR work, RAND researchers were particularly focused on high-level strategic and logistical planning for hypothetical future wars against Cold War enemies, with the use of nuclear weapons featuring prominently. It was this work in particular, which they labelled "systems analysis" to differentiate it from more familiar OR research, that proved to be the more influential in the long term. Systems analysis was a product of RAND's mandate to perform speculative "background research" on contemporary geopolitical events and potential future crises.

Hoag notes that systems analysis "typically deals with choices that concern operations farther ahead in time, and takes a somewhat broader look at problems of military choice" (Hoag, 1956, p. 1). Generally, this involves the wholesale quantification of every aspect of a given long-range scenario, with an emphasis on the relationships between these variables, as well as on the ways in which strategic decisions alter the larger system. Systems analysis was considered more "creative" than traditional operations research work, particularly because it was necessarily more speculative. As Kaplan notes, "during World War II, OR analysts were continuously working with real combat data…Yet there was, of course, no real combat data for World War III," so that "[t]he numbers that fed into the [systems analysis] equations came from speculation, theories, derivations of weapons tests results, sometimes from thin air – not from real war" (Kaplan, 1983, p. 87; emphasis in original).

A detailed report from RAND researchers Kahn and Mann serves as a good example of systems analysis work from the period. The authors, in typical fashion, use a hypothetical American aerial offensive as the focus of their work. They build an elaborate mathematical model that is meant to represent and reflect the major variables that could impact the prosecution of such an offensive, and the relationships between those variables. The model consists of a multi-stage process. First, both sides – that is, the Americans, and a hypothetical enemy – deploy their forces. Once both forces are set, the battle itself is modeled. As the authors note, "the [air] strike consists of a series of events most of which have probabilities associated with them" (Kahn and Mann, 1957, p. 16). The report thus contains a series of probability graphs reflecting the potential outcomes of battles between various configurations of forces. Again, as already noted, these graphs were built from speculative data. This means that the graphs themselves served as arguments, advocating for the importance of specific sets of variables in determining battle outcomes.

Once a model of a given system was built, then, it could be used to make recommendations, or "conclusions", with respect to specific strategic problems. Kahn and Mann illustrate this process by describing a fictional scenario in which "a General asks an Operations Analyst [i.e. systems analyst] how to increase the number of bombs dropped on the enemy in the next 30 days" (Kahn and Mann, 1957, p. 7). Working with the data provided in the report, this hypothetical analyst then builds a subset model which takes into account a wide variety of contingencies, including "the skill of pilots, bombardiers and navigators, the performance of the planes at various altitudes, the performance of the enemy's defenses (radars, fighters, missiles) at various altitudes," and "the deployment of the enemy's defenses" (Kahn and Mann, 1957, p. 7). In the end, this analyst comes up with two potential recommendations: "double (+/- 20%) the number of bombs on target at the cost of tripling (+/-20%) the number of planes shot down," or "triple (+/-) the number of bombs on target at the cost of increasing the attrition by a factor of 10 [around 30%]" (Kahn and Mann, 1957, p. 7). Note that unlike in a war game, these outcomes were not decided upon by actually simulating the prosecution of a given battle. Rather, the models were used to determine solely how such a battle would end, under varying circumstances.

Despite the perceived utility of such techniques, the systems analysis approach was fundamentally limited by the size and complexity of all but the simplest models. As Kahn and Mann note, the analysis of mathematical models was often excessively complex and time-consuming. They admit that "with the simple model we are considering here, we are able, but only barely able, to make the exact calculation," meaning that a more complex model – the sort that might actually be used in the field – would be almost unusable (Kahn and Mann, 1957, p. 49). As an alternative, however, they advocate for the use of a Monte Carlo approach to arrive at an "approximate" solution. Monte Carlo refers to a process of statistical sampling that builds up a body of knowledge with respect to the operations of a given system. Taking a fictional game of solitaire as an example, the authors describe how Monte Carlo techniques work in practice:

[T]here is a way to sidestep the combinatorial computation…We can simply play the game, say a thousand times. The relative number of wins then gives us an estimate of the probability of winning. The estimate, of course, is not exact but it is probably not far off (Kahn and Mann, 1957, p. 49).

While there are additional complications with respect to Monte Carlo gaming that will be addressed shortly, this is generally how such methods are intended to operate.
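
Kahn and Mann's logic is easy to see in a short sketch. The "game" below is a deliberately trivial stand-in for their solitaire example – whether a five-card hand contains at least one ace – chosen because its exact probability is known and can be compared against the sampled estimate; everything except the play-many-times-and-count-wins structure is my own illustrative choice.

```python
# Sketch of the Monte Carlo logic Kahn and Mann describe: rather than doing
# the combinatorial calculation, play the game many times and take the
# fraction of wins as an estimate of the probability of winning.
# The "game" is a toy stand-in, not their actual solitaire.
import random

def play_once():
    """Win if a five-card hand drawn from a shuffled deck contains at least one ace."""
    deck = ["ace"] * 4 + ["other"] * 48
    random.shuffle(deck)
    return "ace" in deck[:5]

def estimate(plays=1000):
    wins = sum(play_once() for _ in range(plays))
    return wins / plays            # the relative number of wins

print("estimated P(win) from 1,000 plays:", estimate(1000))
print("exact value for comparison:       ",
      1 - (48/52) * (47/51) * (46/50) * (45/49) * (44/48))
```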

Monte Carlo was conceived of initially by mathematician Stanislaw Ulam, and refined with the assistance of John von Neumann and Nicholas Metropolis, who were all working at Los Alamos Scientific Laboratory, a major hub of nuclear weapons research during the Second World War and beyond (Hurd, 1985; and Metropolis, 1987). From there, knowledge of the technique spread unevenly across the various postwar OR organizations. Among them, the ORO and RAND were the most committed to incorporating Monte Carlo into their work. Ulam appears to have consulted directly with the ORO about the method, as he is credited in an ORO report from Richard Zimmerman for contributing "basic and original thinking on Monte Carlo techniques" (Zimmerman, 1956, p. vii). Zimmerman himself would go on to become an important champion of Monte Carlo. At RAND, this role was played by Herman Kahn, cited above, who trained as a physicist but would go on to become perhaps the most well-known military strategist of his time (Ghamari-Tabrizi, 2005). Kahn joined RAND soon after its inception, and his earliest reports with respect to Monte Carlo were concerned with optimizing its procedures and applying them to problems in particle physics (Kahn, 1949; and Kahn, 1953). By the mid-1950s, however, working with fellow RAND researcher Irwin Mann (see above), he began experimenting with OR and systems analysis-related problems – that is, applying Monte Carlo to problems of military planning and logistics (Kahn and Mann, 1947a). This was an important sub-step on the way to gaming (see next section).

2.5 Emergence of Operational Gaming

As useful as Monte Carlo techniques were to these researchers, problems emerged when mathematical models became overly complex. As Zimmerman notes, "it is clear that a major restriction on the scope of the combat action to be simulated is the length of time that can be allowed for the computer to simulate a single battle" (Zimmerman, 1956, p. 16). Since this same battle often had to be simulated hundreds or thousands of times in order to generate an actionable amount of data, the time to complete one repetition was a crucial factor. Zimmerman calculated, for example, that a battle model that took an hour for a given computer to resolve – which was not unusual, given the computational power of the time – would take six months to process 1000 different input configurations, assuming a 40-hour workweek: 1,000 hours of machine time at 40 hours per week is 25 weeks (Zimmerman, 1956). While not all models were this complex, even simple models had to work quickly in order to generate timely results. Another, somewhat related issue concerned the human element. Since Monte Carlo generates successive sample data according to a specified rubric, there is never an opportunity to observe how a modeled scenario would play out if human beings were in control of making critical decisions. Such information might be considered useful, for example, when studying the psychological factors that go into strategic or tactical decision making, or to demonstrate fundamental principles of a model to others in a way that implicates them within it. For these and other reasons, certain groups of OR researchers took to turn-based war gaming as a means to engage with operational models. Such an approach would operate similarly to Monte Carlo, but with human players generating sample data instead of a deterministic rubric. Complex games would still take time to work through, but fewer runs would be needed, in theory at least, to generate useful results and prove or disprove assumptions and hypotheses.

A number of military forces had already been actively involved in war gaming. The German "Kriegsspiel" games of the nineteenth century ushered in the modern era of tabletop war gaming for military use, while H. G. Wells famously promoted war gaming as a hobbyist pursuit with his 1913 work Little Wars (Perla, 1990; Wells, 1913). The major forces of the Second World War all played war games to some degree, though Germany and Japan in particular came to rely on them to educate officers and inform their planning (Perla, 1990). As operations research emerged as a fundamental aspect of military planning in the United States and elsewhere, gaming gradually became recognized as a useful technique by which to engage with operational models, as Francis McHugh notes:

During the postwar years operations research personnel continued and expanded their wartime gaming and simulation experiments, and a number of hand-played games were devised by such organizations as the RAND Corporation, the Operations Evaluation Group, and the Operations Research Office (ORO) of Johns Hopkins University (McHugh, 1966, p. 2-40).

Writing in the journal Operations Research, Clayton Thomas and Walter Deemer of the United States Air Force's Operations Analysis Office (which worked closely with RAND's researchers) introduced readers to the concept of operational gaming. As they described it, operational gaming can be understood as a blend of Monte Carlo modeling and traditional, turn-based gaming.

To demonstrate how operational gaming works, the authors take a problem that was originally solved analytically and show how it may be converted into an operational game. The problem scenario is drawn from an earlier article by Morse, in which a submarine wishes to cross through a narrow channel without being detected by a patrol aircraft (Morse, 1948). Morse solves this problem by building a mathematical model of it, then solving that model so as to minimize the probability of the submarine being discovered. Thomas and Deemer, however, turn the problem into a game. First, they divide the channel navigated by the submarine into four "squares" labelled from A to D. They then propose that three "teams" be created: the "blue" team, which will control movement of the submarine, the "red" team, which will control the aircraft, and the "control" team, which acts as umpire and moderates the game. Red and blue work in different physical spaces, so each has knowledge only of the whereabouts of its own game piece. At each turn, the red team selects two squares to search, and communicates these to the control team. Depending on the submarine's position, the control team determines whether or not the red team detects it. Whatever the control team decides, it communicates the results to both teams, and play continues until the submarine is detected, or makes it safely across the channel.

In terms of calculating the "result of the play," Thomas and Deemer advocate for a randomness that would be instantly recognizable now to role-playing game players, and probably to all players of games of chance. This, as they put it, is in "the spirit of Monte Carlo methods," and goes as follows:

When at least one of the squares of search coincides with one of the squares of surfacing, there is a non-zero probability of detection. Random numbers may be used to decide whether or not detection does actually occur on that play of the game. The type of result communicated to the two teams, then, will be either "sighting in [for example] A" or "no sighting" (Thomas and Deemer, 1957, p. 16).

The scenario thus becomes something of a game of chance, though the randomness in the game is meant to reflect the uncertainties inherent in real-world combat. Regardless, this is a model that must be played in order to produce results.
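
The channel-crossing scenario can be rendered as a short turn-based program. The movement rule (one square per turn, A through D) and the detection probability below are assumptions made for illustration; the source specifies only the four squares, the two-square search, and the random-number resolution of coincident squares.

```python
# Sketch of the Thomas and Deemer channel game as a turn-based loop.
# Assumed rules: the submarine surfaces in one square per turn, moving
# A -> B -> C -> D, and a coincident search detects it 50% of the time.
import random

SQUARES = ["A", "B", "C", "D"]
P_DETECT = 0.5  # assumed probability of detection when search and surfacing coincide

def play_channel_game(rng=random):
    for square in SQUARES:                      # blue: surface in one square per turn
        searched = rng.sample(SQUARES, 2)       # red: pick two squares to search
        if square in searched and rng.random() < P_DETECT:
            return f"sighting in {square}"      # control reports a detection
        # otherwise control reports "no sighting" and play continues
    return "submarine crossed safely"

print(play_channel_game())
```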

The RAND Corporation was particularly enthusiastic about turning its operational models into games. An important early RAND game was "Monopologs", developed by the logistics department in the mid-1950s. A 1960 report describes it as follows:

The game is played in the context of a very simple simulation of the Air Force supply system, in which a player or group of players makes all the decisions of a spare-parts inventory manager. He selects a strategy for the life-cycle of the spare part he is managing, taking into account the given costs and lead-times for each of the actions necessary to supply management, as well as the cost of inaction – supply failure. His score consists of the total costs he incurs, added up at the end of the game (Renshaw and Heuston, 1960, p. 4).

The key variable that the player must contend with is demand for the given part, which fluctuates each turn. As the same report notes, "[d]emands are revealed month by month as the player goes through the life history of this particular aircraft" (Renshaw and Heuston, 1960, p. 9). The entire table of demand values that the player will work through is included in the game; it is hidden at first, and the "player then uncovers it, one month at a time" (Renshaw and Heuston, 1960, p. 12). This is thus a single-player game in which the player must react and adapt to changing conditions over which they have no control. This type of single-player game – pitting the player against a specific operational model – would become an extremely important archetype in the interactive gaming era.
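
The shape of such a game can be sketched as a loop over a hidden demand table. The demand figures, prices, and penalty below are invented for illustration; Renshaw and Heuston's report specifies the actual costs and lead times in far more detail.

```python
# Sketch of a Monopologs-style single-player inventory game. The demand
# table, prices, and penalty are invented; only the shape of the game --
# react month by month to demands you cannot control, and tally total
# cost at the end -- follows the RAND description.
DEMAND_TABLE = [3, 5, 2, 8, 4, 6]   # hidden from the player, revealed one month at a time
UNIT_COST, HOLDING_COST, FAILURE_COST = 10, 1, 50

def play(orders):
    """orders[i] = spare parts the player buys at the start of month i."""
    stock, total_cost = 0, 0
    for month, demand in enumerate(DEMAND_TABLE):
        stock += orders[month]
        total_cost += orders[month] * UNIT_COST
        if demand > stock:                       # supply failure: the costliest outcome
            total_cost += (demand - stock) * FAILURE_COST
            stock = 0
        else:
            stock -= demand
        total_cost += stock * HOLDING_COST       # pay to hold whatever is left over
    return total_cost                            # the player's score: lower is better

print("total cost:", play(orders=[5, 5, 5, 5, 5, 5]))
```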

In terms of other case studies, Helmer describes a game called SWAP in a RAND report on operational gaming aimed at "the operations analyst and strategic planner who may consider availing himself of gaming as a tool in his profession" (Helmer, 1960, p. 1). The game's focus is on "procurement strategy, deployment, and operational strategy for a strategic air war" that begins sometime within the five-year (i.e. five-turn) time span in which the game is played (Helmer, 1960, p. 2). The year in which the war begins is determined randomly, so the amount of time given to prepare for battle is initially uncertain. Helmer and Quade, in the report cited above, also speculate on how a national economy could be modelled for a hypothetical operational game (Helmer and Quade, 1963).

Operational gaming thus involves the use of an arithmetical model to define both the rules and mechanics of a turn-based game involving one or more players. Generally, each turn is made up of two phases. In the first phase, a player will generate data to be used in the game's underlying model. Such input is generally the actions a player wishes to take in a given turn. The second phase involves the application of such data to the mathematical model in order to produce a set of statistics that are output back to the users. These statistics generally have a direct impact on the resources the player can then marshal at the start of the next turn. Helmer refers to these as the "procurement phase" and the "operational phase", though, again, he was imagining a war-centric context in which procurement was arguably the fundamental issue at hand (Helmer, 1960, p. 2). Nevertheless, these labels illustrate the general ethos of an operational game, in which user decisions shape the data that is input into strategic/operational models, which then go on to produce new data that can either broaden or limit the potential decisions players can subsequently make, creating something of a feedback loop for the duration of the game.
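
Abstracted from any particular game, this turn structure amounts to a simple loop, sketched below with invented function names (they are mine, not Helmer's): a "procurement" step gathers player decisions, and an "operational" step runs them through the model, feeding the results into the next turn.

```python
# Skeleton of the two-phase operational game loop described above.
# The function names are illustrative; any concrete game would supply
# its own model, decision format, and stopping condition.
def run_operational_game(model_state, turns, get_decisions, apply_model):
    for turn in range(turns):
        # Phase 1 ("procurement"): players choose actions given current resources.
        decisions = get_decisions(turn, model_state)
        # Phase 2 ("operational"): the model turns those decisions into new
        # statistics, which constrain what players can do next turn.
        model_state = apply_model(model_state, decisions)
    return model_state

# Example with a trivial model: resources shrink with spending and partly recover.
final = run_operational_game(
    model_state={"resources": 100},
    turns=5,
    get_decisions=lambda turn, state: {"spend": min(20, state["resources"])},
    apply_model=lambda state, decisions: {"resources": state["resources"] - decisions["spend"] + 15},
)
print(final)
```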

2.6 The World in the Computer

It is worth re-emphasizing here the importance of mathematical modeling in operational gaming, even outside of digital computational environments. Scholars such as Mahoney connect modeling with digitization, and problematize modeling efforts with statements such as the following:

[P]utting a portion of the world into the computer means designing an operative representation of it that captures what we take to be its essential features. That has proved, as I say, no easy task; on the contrary it has proved difficult, frustrating and in some cases disastrous (Mahoney, 2005, p. 129).

Following this line of thinking, the very act of "computerizing" an aspect of the world means coding a potentially adverse digital representation of it. But Mahoney's argument is flawed in the sense that it assumes that there is a fixed definition of what constitutes a computer. He makes it sound as if researchers at RAND and elsewhere strove to alter their work so that it accorded with this supposedly fixed concept. But the research presented in this chapter, and in those to follow, reveals that notions of what constitutes digital computation change over time. Partly because RAND would go on to have significant influence over this process, I would argue that computers, in fact, were designed to process specific forms of "operative representations," to use Mahoney's term. These representations, in other words, came "first," and digital computers were then built to operate them. This later became a problem, as this singular concept of how a computer could be designed would crowd out alternative forms in which relationships between programmers and users were much different.

It is at this point, however, with the computerization of operational gaming fast approaching, that the relative lack of agency afforded to programmers and other users becomes especially problematic. RAND's gaming researchers would come to recognize the limitations of operational models. As early as 1963, Olaf Helmer and Edward Quade noted that "computer simulation is open to a number of objections," arguing that "certain human factors, of great importance in the real world and hence not negligible, are extremely difficult to quantify; pride, loyalty, resistance to change, religious prejudice, etc." (Helmer and Quade, 1963, p. 7). Yet operational models, particularly when digitized, would come to possess an authority that largely obscured such issues. Since computer programs run largely automatically, there are few, if any, tools available within a given system that would allow for real-time auditing and modification of a model. An important exception to this rule will be discussed in the next chapter, though it offers only a partial solution.

3 Antecedents: Interactive Computing, Decision Simulations, and the Spread of Operational Gaming

This chapter continues the narrative established in chapter two, focusing in particular on the emergence of interactive computing in the 1960s, as well as the spread of operational gaming outside of military research units into the private sector and public education, a process led largely by RAND Corporation researchers. What exactly "interactive" computing entails will be an important topic for discussion, as the term does not have one agreed-upon definition. As will be seen in this chapter, it is as much an ideological concept as it is a technological construct. J. C. R. Licklider, then an executive at the highly-influential technology firm Bolt, Beranek, and Newman, conceived of interactive computing as a means to effectively intermingle human intelligence and digital computation in order to solve complex problems, a concept he called "man-computer symbiosis" (Licklider, 1960). The concept of the computer network, familiar to most users today, was designed largely with this purpose in mind. This resulted in certain design choices – in particular, the development of increasingly complex user interfaces, coupled with a significant surrender of control over computation to the computer – that continue to resonate.

In keeping with a major theme from the previous chapter, the role of gaming in supporting and extending these ideas will be highlighted here. While the importance of Licklider's work is generally acknowledged, scant attention has been paid thus far to the influence that the RAND Corporation had on interactive computing. While hardware innovations tended to happen elsewhere, RAND focused largely on the development of "software" – that is, pre-built programs that would load into memory so that they were accessible to all users. They also had a pivotal use case that informed their work. As discussed in the previous chapter, RAND's research into operational gaming suddenly became very important in the early years of the Cold War. As RAND promoted its techniques to other outside interests, the systems developed to support operational gaming would go on to be a major influence in the development of other interactive computer systems. At the cusp of the home hobbyist programming era, the technologies used to build personal computers were thus designed in such a way as to facilitate interactive game programming.

A key event that will be discussed here is the creation of interactive programming language environments, beginning at RAND, and leading to the development of BASIC, the language of choice in the hobbyist programming era. Such systems enabled a transactional form of programming that was perfectly suited to turn-based gaming. This was particularly true with the introduction of commands that allowed running programs to query users/players for variable values. Again, this concept was first developed by RAND, but BASIC's INPUT command became the most well-known of such query commands.

When these commands were introduced, programmers could begin to develop their models apart from their intended users, and then turn their programs over to players who would interact with them via these query commands, without having to worry about the underlying code. Such mechanisms allowed interactive, model-based programs – including war games – to flourish, and, as will be discussed in the next chapter, become a foundational element of hobbyist computing in the early PC era. The type of interactivity offered by such systems, however, was flawed at a fundamental level: while allowing for a great deal of access to programmers, it shut out other users from all but a limited set of pre-determined actions. This "layered" form of interactivity has become normalized, so that, for example, game players still essentially just supply variables to models (via keyboard or game controller) but typically cannot access and/or modify game code.11

As with the previous chapter, the research presented here was gathered using Altheide's techniques for documentary ethnographic research. I will trace connections between RAND's operational gaming and the business and economic games that would follow in its wake. Most of the relevant materials in fact forge such links on their own, in the sense that authors tend to credit the work of prior researchers in order to frame their own findings.

This chapter will begin with a brief discussion of the migration of operational gaming to digital computing platforms. Interestingly, this first occurred before the creation of interactive systems, but only at a tepid pace; the transactional computing model developed for time-sharing networks accelerated the trend. At the same time, RAND catalyzed the spread of operational gaming

11 The emergence of game "mods" and "modding" – that is, changes to a game's assets and mechanics using tools that are provided either by the publisher or created by players – overcomes these limitations to a partial degree. The problem with modding, I would argue, is the fact that it serves as a patch to circumvent the restrictions imposed by a multi-layered interactive system. In fact, it essentially serves as yet another layer that further splinters a given game's community of users. Having said this, there are certainly aspects of modding culture and practices that could be adapted in a less hierarchical interactive model.

within the field of business management. These events will be discussed, as will the creation of an operational game by researchers working within the New York state public educational system that would go on to have a profound impact on hobbyist game development; the hobbyist era itself will be treated in the following chapter.

3.1 Digital Operational Gaming and Network Computing

Looking back on this earlier era from the late 1960s, a time when "games of strategy involving both men and machines operating in real-time" were more common, RAND researcher G. M. Northrop noted that "the difficulties in applying digital computers [to gaming] in the past derived in part from the lack of computational speed, the complexities of programming languages, and the batch operation feature (single input, single output) of most computers of that period" (Northrop, 1967, p. 1). Yet the transition from paper-based operational games to computer-based games was slow but seemingly inexorable. Despite the problems identified with respect to computer gaming, the speed advantage was simply too great to resist. Paxson, writing in the early 1960s, discussed at length the increasing importance of "man-machine games". This was a broad, emerging category of games in which one or more players employed a calculating machine – typically a digital computer – to perform the mathematical work required by a given operational model. Typically this would mean processing user input, applying it algorithmically to the model, and providing the user(s) with output data that represented the updated state of the model. This meant that much of the work that was required to conduct an operational game could be handled electronically. Game models could therefore be much richer without requiring any more work on the part of players. With respect to the sorts of air attack scenarios that RAND was studying, Paxson estimated that, with electronic models, "perhaps a hundred or more separate numerical planning factors" could be "estimated and compounded" (Paxson, 1963, pp. 14-15). This also meant that players did not even need to understand the models that they worked with, as long as they were programmed correctly by those with the required expertise. This resulted in certain issues that will be looked at later in this section.

Digital operational games progressed on numerous fronts. The U. S. Army's ORO group (see chapter 2) was an early adopter of digital computing technologies, and produced a war "simulation" game called Carmonette in the mid-1950s. Programmed for a Univac, Carmonette was capable of simulating a battle between two opposing armies (with up to 36 units per side) on a wide variety of terrain types (Harrison, Jr., 1964). Continually revised and expanded, Carmonette was used for well over a decade. At roughly the same time, the RAND Corporation developed a digital simulation game for an air war called the Strategic-Operations Model, which allowed for the "rapid replication of war games with variations in parameters" (Adams and Jenkins, 1960, p. 601). This program was later taken over by the Air Force's Air Battle Analysis Division and renamed the Air-Battle Model, and was also expanded upon over the next several years (Adams and Jenkins, 1960; and Harrison, Jr., 1964). RAND also experimented with the computerization of logistics-based systems analysis models of the Monopologs variety. The Logistics Simulation Laboratory (LSL) at RAND developed a series of game models that culminated in their Laboratory Problem – IV (LP-IV) "game simulation" program, which simulated the "basic operating cycle of an aircraft base," challenging the player to keep things running smoothly (Geisler and Ginsberg, 1965, p. 14). Such games reflected the importance of the high-level systems analysis research conducted at RAND. While games developed by groups such as ORO focused on lower-level battle tactics, RAND's games tended to deal in longer time frames and to focus more on resource management. This specialization would have crucial consequences for the future of digital gaming.

Despite the growing enthusiasm for digital games, the earliest computers struggled to facilitate them. The lack of a proper interface that could support turn-based gaming was the key problem. Advancements made in the mid-to-late 1960s, however, would result in the first interactive systems being implemented in a variety of institutions, though college campuses and research institutions such as RAND tended to be the earliest adopters of the required technologies. This was thus also the era in which interactive gaming emerged, in the sense that a user/player could engage with a given game program as it was actually running. While such capabilities were in some ways beneficial – the transfer of mathematical work to digital computers, as discussed above, was a key feature – they also led to complications with respect to interactivity that have arguably worsened over the years.

Those who devised the first so-called "interactive" systems had lofty ambitions with respect to their potential benefits to users. Foremost among these individuals was J. C. R. Licklider, who, while serving as a vice-president at Bolt, Beranek and Newman, published an article in 1960 entitled "Man-Computer Symbiosis" (Licklider, 1960). In this piece, Licklider envisioned a near-utopian future in which individuals and digital computers could mutually reinforce their respective skills and talents in order to solve problems that neither could handle on their own. He described what he meant by symbiosis as follows:

It will involve very close coupling between the human and the electronic members of the partnership. The main aims are 1) to let computers facilitate formulative thinking as they now facilitate the solution of formulated problems, and 2) to enable men and computers to cooperate in making decisions and controlling complex situations without inflexible dependence on predetermined programs (Licklider, 1960, p. 4; emphasis added).

This last point is significant. Licklider's primary issue with digital computing at the time was that computers could only solve problems that were fully articulated by program developers. This meant that both the problem itself and the values that were to be used in solving the problem had to be decided in advance. Licklider's symbiosis model is really about returning a significant amount of agency to the user, and making computing more interactive overall. Whether or not this goal was truly achieved is another matter entirely.

In order for his system to work, Licklider emphasized the importance of time-sharing computer networking (Bardini, 2000). The basic idea behind time-sharing is that a central computer governs the activities of several users, who operate the machine via "terminals". It does this by focusing on each user terminal for a short period of time – generally much less than one second – and performing whatever work it can do in this time frame before moving on to the next terminal, repeating the process indefinitely. Fernando Corbató, a computer scientist working at MIT, headed a team that went on to develop the Compatible Time-Sharing System (CTSS), one of the earliest and most influential time-sharing systems (see Corbató, Merwin-Daggett, and Daley, 1962; and Walden and Vleck, 2011). Users could interact with the central machine via an "electric typewriter" console. By doing so, users encountered another important innovation of the early time-sharing era: a command-line terminal interface. While command-line systems were not entirely new at this point, they were rare enough that the CTSS programmer's guide had to describe how to use one:

The commands are typed by the user to the time-sharing supervisor… Commands are composed of segments separated by blanks; the first segment is the command name, and the remaining segments are parameters pertinent to the command… A carriage return is the signal which initiates action on the command. Whenever a command is received by the supervisor, "WAIT" is typed back. When the command is completed, "READY" is typed back (Corbató et al., 1963, pp. 9-10).

To create a program, the user had to type in the input command, the functionality of which was described in the guide as follows:

Initiates an automatic mode of input. The supervisor types out line numbers which will be attached to the lines input by the user. The user types a card image per line, according to a format appropriate to the programming language…Each line is processed by the input program (Corbató et al., 1963, p. 70).12

This notion of an input "mode" is important in terms of understanding how interactivity begins to become splintered within a time-sharing system. A system such as CTSS needed to know whether a user was typing commands directly to the system – to load or save a program, perhaps – or was entering lines of code into a new or existing program. This was done via the mode system. When the user invoked the input command, CTSS would then redirect subsequent user input, adding each line to the program residing in memory. Once the user invoked the file command within input mode, the user would be brought back into the regular CTSS mode.
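
The mode mechanism can be sketched as a small command loop: in command mode, lines go to the supervisor as commands; after an input command, each line is numbered and stored until the user files the program. The sketch below is schematic only – the command names and responses gesture at the behaviour quoted above rather than reproducing CTSS's actual command set.

```python
# Schematic sketch of a CTSS-style mode switch: commands go to the
# "supervisor" in command mode; after `input`, lines are numbered and
# stored until `file` returns the user to command mode. Illustrative
# only -- not CTSS's actual command set. Run interactively.
def supervisor():
    mode, program, line_no = "command", [], 1
    while True:
        if mode == "input":
            line = input(f"{line_no:4d}  ")        # the supervisor types out the line number
            if line.strip() == "file":
                mode = "command"                   # filing returns the user to command mode
            else:
                program.append((line_no, line))    # the line is stored in the program
                line_no += 1
        else:
            line = input("> ")
            if line.startswith("input"):
                mode = "input"                     # subsequent lines go into the program
            elif line.startswith("logout"):
                return program
            else:
                print("WAIT")                      # the supervisor acts on the command...
                print("READY")                     # ...and reports when it is done

if __name__ == "__main__":
    print(supervisor())
```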

It is easy to see how this could all get rather confusing, especially when other modes are introduced, such as the printer/plotter mode which was invoked by the plot command. It is said that systems such as CTSS are interactive, in contrast to batch mode, in part because they provide "sophisticated query and response programs which are vitally necessary to allow effective interaction," to cite once more from the programmer's guide (Corbató et al., 1963, p. 1). Yet there were still limitations imposed upon the user. The first came in the form of these modes, which restricted the user to certain actions depending on what mode they happened to be in at any given time. The second limitation was that running programs could not be interacted with directly. As will be discussed below, such runtime interactivity has come to define modern computing, but CTSS did not support it. Finally, CTSS users were limited to the functionality offered by the CTSS system. This may seem obvious, but consider the fact that CTSS itself had

12 This should not be confused with BASIC’s INPUT command, which will be discussed momentarily.

to be developed in an environment in which sufficient agency was given to the developers to program in the functions that CTSS runs automatically for the user. Much is thus concealed whenever a system such as CTSS controls the computer at a lower level than that which is offered to the user. This sort of multi-layer agency now permeates modern computing systems, with barriers thrown up so that users cannot, for example, modify large sections of memory used by a given operating system. As will be seen in chapter five, languages such as Interlisp and Smalltalk that arrived in the 1970s-1980s removed these barriers – at least to a certain extent – thereby conflating all interactions into a single layer.

Chun further problematizes the time-sharing model by equating it with neoliberal ideology. As she argues, "[i]n a neoliberal society, the market has become an ethics: it has spread everywhere so that all human interactions, from motherhood to education, are discussed as economic 'transactions' that can be assessed in individual cost-benefit terms" (Chun, 2011, p. 8). These early time-sharing networks, of course, are inherently transactional, in that the typical user sends requests to a central server via a command-line operating system, and then receives output back from the server. Such interactions, moreover, are conducted largely by individuals, working at their own terminals. With respect to time-sharing operating systems, Chun notes the following:

Operating systems also create users…literally, for users are an OS construction. User logins emerged with time-sharing operating systems, such as UNIX, which encourage users to believe that the machines they are working on are their own machines (Chun, 2011, p. 67).

This focus on the individual has thus had a major influence on how users interact with their computers, to the point where programs themselves can be thought of as products of neoliberal thinking. A program is a textual assertion of authorship – that is, rather than being a purely mechanistic process, such as, for example, the act of starting a car via its ignition system, digital computation is understood as a creative act akin to painting or writing a novel. As Chun notes, such thinking has given rise to "the nostalgic fantasy of an all-powerful programmer, a sovereign neoliberal subject who magically transforms words into things" (Chun, 2011, p. 8). As interactive systems became more sophisticated, this "fantasy" became all the more powerful, and was thus reified within the larger culture of computer programming.

3.2 JOSS and Real-Time Gaming

CTSS pioneered the concept of time-sharing networking, but much more powerful systems emerged in the mid-1960s, and these had far more influence over the later development of interactive programming. Among the most important of these was the RAND Corporation's Johnniac Open Shop System (JOSS).13 RAND had been experimenting with digital computing since the early 1950s, leading to the development of the JOHNNIAC digital computer. While JOHNNIAC was built entirely in-house, its architecture was borrowed from an earlier machine developed at Princeton's Institute for Advanced Study (Ware, 1996; and Ware, 2008). In the broadest sense, JOSS operated much like CTSS, with a similar command-line interface. But JOSS was intended to be a simple, specialized system, or, to use Baker's phrasing, JOSS was "a personal service," complementary to other services and tools such as "a telephone, a desk calculator, or a slide rule" (Baker, 1966, p. 1). JOSS terminals were meant to be accessed by users straight from their office desks, with wall outlets connecting them to the central server (Baker, 1966). JOSS came embedded with its own programming language, also called JOSS. In fact, the language and the system were inextricably linked, which was an important innovation. Unlike CTSS, which required users to change modes in order to program, JOSS users could write their programs directly from the main command line. If a given command was preceded by a line number (e.g. "1.1 Set x = 10"), then it was stored in memory as a program line; otherwise, the command was executed immediately. This feature would also be implemented across a wide variety of home computer operating systems throughout the 1970s and 1980s.
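
The stored-versus-immediate convention can be illustrated with a small dispatch sketch. The regular expression, the storage structure, and the placeholder "execution" are my own illustrative choices, not a description of JOSS's internals.

```python
# Sketch of the JOSS (and later home-computer BASIC) convention: a line
# that begins with a number is stored as part of the program; anything
# else is executed immediately. Execution here is just a placeholder.
import re

program = {}  # stored statements, keyed by line number (e.g. 1.1)

def handle(line):
    match = re.match(r"^\s*(\d+(?:\.\d+)?)\s+(.*)$", line)
    if match:
        number, statement = float(match.group(1)), match.group(2)
        program[number] = statement                  # stored for later, in numeric order
        return f"stored line {number}"
    return f"executed immediately: {line!r}"          # placeholder for direct execution

print(handle("1.1 Set x = 10"))
print(handle("Type x"))
print(sorted(program.items()))
```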

The JOSS language was fairly straightforward, and designed expressly to assist in numerical analysis. Symbolic analysis – that is, the manipulation of non-numeric characters and strings of such characters – was not supported. Arguably its most crucial feature, however, was its Demand function. When embedded within a program, Demand prompted the user to supply a value for a given variable while the program was still running. So, for example, if I included the line "Demand x" in my program, then when the running program reaches that line, it stops what it is doing and waits for the user to supply a value for x. That value can then be used in later mathematical operations.

13 The name of the system is somewhat confusing. It was originally derived from the fact that JOSS was first implemented on a Johnniac machine, a digital computer built at RAND and named after John von Neumann. But JOSS was soon moved to more powerful systems to meet user demand; despite this, the name JOSS persisted.

It is difficult to overstate the impact that Demand and commands like it (see below) had on the nature of digital computing. Demand fundamentally altered the character of JOSS programs, which otherwise were structured along the same lines as FORTRAN and related languages. As emphasized in the previous chapter, the earliest computer programs could only be run automatically, barring virtually all forms of user intervention. When a program used the Demand command, however, its final output was not pre-determined, and in fact there would be no output unless the user complied with the Demand request and provided the necessary values. If modern personal computers still operated in that fully automatic fashion, none of the more popular tasks that they perform, from word processing to internet browsing, would be possible. Here, then, a new form of user interactivity was developed, as well as a new input mode. This mode, however, is different from those looked at earlier, in that it appears only in the middle of running code, and only at the behest of the original programmer. It is also a heavily restricted mode, in that its only purpose is to allow the user to supply a running program with a variable value. Users could answer a Demand prompt with data, but not with any sort of command.

Despite its usefulness, however, Demand is something of a problematic function. It might appear initially as if Demand solves the problem of the user having no agency over a running program. However, while Demand did allow the user to customize the output they would receive based on the variable values they supplied, it did not allow them to alter the program code itself, or to change how such code was executed in any way. This issue requires a bit more exploration in order to make the implications clear. To take the example of an older programming language, anybody who wanted to use a FORTRAN program essentially had to learn FORTRAN themselves. One had to be a programmer in order to use programs. With the Demand command in JOSS, however, this was no longer the case. If a JOSS user wanted to solve a given problem – for example, to find the area of a circle – they could hypothetically have used an existing JOSS program in which the Demand command was employed to query the user for the necessary parameters of their specific problem – in this case, the radius of their circle.
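
Since JOSS syntax is not reproduced here beyond the Demand example above, the hypothetical circle-area program can be sketched in line-numbered BASIC, whose INPUT statement (discussed below) is functionally identical to Demand. The listing is illustrative only, and is not drawn from any historical program.

    10 REM AREA OF A CIRCLE: THE RUNNING PROGRAM PAUSES AT LINE 30
    20 PRINT "RADIUS";
    30 INPUT R
    40 LET A = 3.14159 * R * R
    50 PRINT "THE AREA IS"; A
    60 END

The user's agency is confined to line 30: they may supply any radius they like, but they cannot inspect or alter the formula on line 40, which is precisely the limitation discussed above.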

This was not how JOSS was intended to be used, however; as already noted, one of the primary goals of creating JOSS was to make programming an "open shop" practice, giving individuals who were not computer scientists or engineers the ability to devise and solve mathematical problems. So the people running a given JOSS program were, presumably, quite often the same individuals who had created it. But as computing evolved over the next few decades, the roles of programmer and user became increasingly separate and distinct. Most users now work with software that they did not code in any way themselves, so their interactions are limited to whatever the developers allow. The agency given to users of such ostensibly interactive programs is, therefore, actually quite circumscribed.

Nonetheless, with the development of JOSS at RAND, war games became "real-time" games, and much more like the digital games played by consumers today. As defined by RAND researcher G. M. Northrop in a 1967 paper, the term "real time" refers to a form of gaming in which computers are used to speed up play and allow users to interact with a constantly evolving system (Northrop, 1967). Not only does this require "continuously available computer power," but also mechanisms through which multiple players could share and pool information, and otherwise communicate with one another. Northrop then argues that "large numbers of on-line, time-shared computer consoles" may be employed "to enter, recall, process, and display information typical of that used in command and control systems and the play of games" (Northrop, 1967, p. v). He then notes that the JOSS system allows such functionality, and that "it is presently being employed to simulate in real time elements of an automated tactical air control system and in the play of tactical games and games of global strategy" (Northrop, 1967, p. v).

Northrop describes two such JOSS-based war games in detail. The first involves two players, RED and BLUE, who played an automated JOSS-based game from two separate terminals. The game simulates a submarine duel, and at every turn, each player would "choose a new course heading, new speed, and fire up to three missiles at selected aim points" (Northrop, 1967, p. 15). After these decisions were made, according to Northrop, "the various parts of the program" were "activated and the results determined and presented to the players" (Northrop, 1967, p. 15). Essentially, this means that the model would calculate the results of each move, which it would then pass on to the players. This basic framework has carried through to the present day, so that virtually all modern strategy and war games operate under similar principles. No matter how many players are involved in a given game, the game program almost always operates the game, and enforces its rules.

In the second game, the basic setup is different, in that there is a third user, nicknamed "GREEN", who assists in managing and refereeing the game. Adding a human referee is significant, in that it allows RED and BLUE to engage in much more sophisticated interactions with the game, and with each other, than is feasible with code alone. In the following passage, Northrop explains how GREEN's presence could allow the players to test out the algorithms used in the game model before settling on a specific course of action:

In its role as referee, GREEN could monitor…exchanges of data, supplying to the expender of resources certain parameter ranges and/or distributions, so the expender of resources could use the computational algorithms to test the range of effectiveness of the proposed action before actually committing the resources. Then, when the expenditure finally takes place, GREEN would supply the recipient of the attack with the actual parameter values to be used (automatically, within the algorithm program) in determining the effectiveness of the attack (Northrop, 1967, p. 14).

Northrop then discusses more sophisticated operations that GREEN could potentially manage:

[O]nce the basic programs for the exchange of information have been made operable…they can be easily expanded to include many of the more desirable and esoteric features of game operation. One of the more entrancing features is that…the builders of the game can add increasingly sophisticated computational features to the total program structure and perform checkout tests of new additions in an on-line, real-time mode of operation (Northrop, 1967, p. 14).

What Northrop is saying here is that the referee could send elements of the game model itself to either player on an as-needed basis. What he is suggesting, in other words, is that new code – not just numerical data, but actual lines of JOSS code – could be swapped in and out of a game while it was being played. The idea of treating a section of code as an element of data will feature prominently in the discussion of interactive programming systems in chapters five and six.

While Northrop's example games played out at the tactical level, RAND also moved its higher-level strategic games to the JOSS platform. Richard Rochberg, working as a summer student in 1967, wrote a report detailing the JOSS implementation of the STROP nuclear warfare simulation game. Rochberg described the basic mechanics of STROP as follows:

STROP is a nuclear war game played by two teams (here designated Red and Blue). The players make decisions concerning R & D expenditures, weapons procurement, and offensive and defensive weapons targeting. A computer then evaluates the results of a war fought with these allocations and presents four itemized lists of losses (Rochberg, 1967, p. 1).

JOSS allowed the game itself to query users via the Demand command. After the players entered initial resource allocations for their respective forces, as well as allocations for research and development, the game would cycle through two rounds of battle, with each side (i.e. Red and Blue) getting a chance to attack their opponent. Each player would input information concerning their overall strategies, and the game would calculate the end results. It is for this two-round structure, in which information entered in the first round affects the second, that JOSS served as an ideal platform. With the Demand command, the players could enter new information for the model in mid-play without losing existing data. It is this model that would become exceedingly important as operational gaming grew popular outside of military research institutions and was eventually taken up by home computer hobbyists in the 1970s and 1980s.
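
Rochberg's listing is not reproduced in this study, but the two-round structure he describes can be suggested schematically, again in line-numbered BASIC with INPUT standing in for Demand. The force levels and the attrition rule below are placeholders rather than elements of the actual STROP model; the point is simply that the values entered in the first round persist and shape the second.

    10 REM SCHEMATIC TWO-ROUND EXCHANGE; NOT THE ACTUAL STROP MODEL
    20 LET B = 100
    30 LET R = 100
    40 FOR T = 1 TO 2
    50 PRINT "ROUND"; T
    60 PRINT "BLUE, MISSILES TO FIRE";
    70 INPUT M1
    80 PRINT "RED, MISSILES TO FIRE";
    90 INPUT M2
    100 REM PLACEHOLDER ATTRITION RULE: EACH MISSILE DESTROYS HALF A UNIT
    110 LET R = R - M1 / 2
    120 LET B = B - M2 / 2
    130 PRINT "BLUE FORCES"; B; "RED FORCES"; R
    140 NEXT T
    150 END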

Given its versatility and relative simplicity and clarity, JOSS was adapted for a variety of other systems and platforms. Arguably the most important of these adaptations was the FOCAL language.14 Developed at DEC, FOCAL was implemented on several of DEC's PDP computer systems, though the PDP-8 version was probably the most important and influential. The PDP-8 was a low-cost (for the time) machine that was particularly popular within educational institutions (Johnstone, 2003). The FOCAL language differed syntactically from JOSS to a limited extent, but otherwise retained its full feature set. JOSS's Demand command, for example, was renamed to Ask, but was otherwise functionally identical. At BBN, the TELCOMP language was developed, which was, in the words of one of its lead architects, "pretty much an exact copy of JOSS" (Walden and Nickerson, 2011, p. 63). Not surprisingly, then, TELCOMP preserved the Demand construct. The language CAL, developed at Berkeley, was also a JOSS clone. In its reference manual it is noted that "[o]ne of the most powerful features of CAL is that it allows the user to input numerical information while the program is in the execution mode" (CAL reference manual, 1967, p. 29). As with JOSS and TELCOMP, CAL employed the term Demand for such functionality.

14 Formulating On-Line Calculations in Algebraic Language; as with many of these programming language acronyms, however, the long-form is rarely used.

Interestingly, shortly after JOSS was released, a function virtually identical to Demand was incorporated into another, seemingly unrelated programming language. BASIC15, developed by Kemeny and Kurtz at Dartmouth College, would eventually supplant JOSS and FOCAL as the most important of the high-level time-sharing languages developed in the 1960s and 1970s (Kemeny and Kurtz, 1985). Similar to JOSS and its descendants, BASIC was one component of a larger interactive system called the Dartmouth Time-Sharing System, an experimental time-sharing network that also featured a command-line interface (Kemeny and Kurtz, 1985; and The Dartmouth Time-Sharing System, 1964). Beginning with the third version of the language, released in 1966, BASIC offered an INPUT command that functioned identically to Demand (Brig, 1988). It is possible that Kemeny and Kurtz had based INPUT directly on JOSS, or a JOSS variant, but there appears to be no evidence to confirm this. Reflecting on the language years after its invention, they recognized the importance of this addition, calling it "the most radical use of the interaction possible in time-sharing" (Kemeny and Kurtz, 1985). INPUT would go on to become a crucial tool in the development of text-based BASIC games, as is discussed in more detail in the following chapter.

15 Beginner's All-Purpose Symbolic Instruction Code.

3.3 Operational Gaming Outside RAND

Much of the research conducted at RAND was continued at other research centres because RAND's own staff did much to promote their work outside of their own facilities. The influence of JOSS on other interactive languages was already noted in the previous section. This section will detail the operational/strategic gaming movement as it spread to private corporations and educational organizations. This truly was a movement, in the sense that such games came to be seen in many circles as a panacea that could significantly improve the decision-making skills of corporate managers and those in similarly powerful positions within public and private institutions. This evangelical effort would spread knowledge of and enthusiasm for operational gaming to educators, and, eventually, to hobbyist programmers. Along the way, the spread of time-sharing enabled operational gaming to remain interactive, putting the player in a feedback loop of the sort that would become an important element in most hobbyist games. It is here, then, that the roots of recipe-based gaming as it spread outside of RAND can be traced. What is remarkable is the fact that the basic template for such games remains so constant throughout the years, as will be seen.

3.3.1 Early Work

In 1957, the New York City-based American Management Association (AMA), a management training organization16, published a book entitled Top Management Decision Simulation: The AMA Approach (Ricciardi et al., 1957). The book describes what is generally considered to be the first management "simulation" game. Built upon the "powerful and tremendously promising" practice of "model building", the Top Management Decision Simulation game (TMDS) was intended to provide an efficient and effective system by which to educate players on the principles of "sound business management" (Ricciardi et al., 1957, pp. 7, 30). Cohen and Rhenman offer a concise description of the game's essential mechanics:

The [TMDS] provides an environment in which teams of players, representing the officers of firms, make business decisions. The game consists of five teams of three to five persons each. These companies produce a single product which they sell in competition with each other in a common market…[P]articipants generally simulate five to ten years of company operation by playing the game from twenty to forty quarters. There are six types of decisions which each team must make every quarter. They must choose a selling price for their product, decide how much to spend for marketing activities, determine their research and development expenditures, select a rate of production, consider whether or not to change plant capacity, and decide whether marketing research information about competitors' behavior should be purchased (Cohen and Rhenman, 1961, p. 135).

16 See http://www.amanet.org/

This framework, which relies exclusively on the management of quantitative data, was deliberately patterned after the war-centric operational games developed by the RAND Corporation. The AMA explicitly credits military simulation games as the inspiration for its own work, and it received considerable assistance from the RAND Corporation itself in developing TMDS. Reflecting on this relationship, the authors of the TMDS study make the following arguments:

In the war games conducted by the Armed Forces, command officers of the Army, Navy, and Air Force have an opportunity to practice decision making creatively in a myriad of hypothetical yet true-to-life competitive situations…Why then, shouldn't businessmen have the same opportunity?...Why not a business "war game", in which teams of executives would make basic decisions of the kind that face every top management-and would see the results immediately?...From these questions grew AMA's Top Management Decision Simulation (Ricciardi et al., 1957, p. 59; also cited in Cohen and Rhenman, 1961, p. 135; see also Eilon, 1963).

Most secondary sources credit the game Monopologs, described in the previous chapter, as the direct inspiration for TMDS (see Faria and Nulsen, 1996; Keys, Fulmer, and Stumpf, 1996; and Woods, 2004). Monopologs was a pen-and-paper game, however, while TMDS was largely computer-driven.

The emergence of digital computers, in fact, was an important catalyst for the development of the game. As the authors of the work noted, "we take advantage of a revolution in science that has resulted from the invention of a device which is as far-reaching in its implications for the present-day world as was the wheel for more primitive civilizations…This device is the digital computer" (Ricciardi et al., 1957, p. 13). Cohen and Rhenman argue that "the use of computers has provided an opportunity for the designers of games to incorporate in them a great deal of realistic complexity while still keeping their administration relatively simple," while also claiming that "[a]n electronic computer also adds considerably to the drama of game play" (Cohen and Rhenman, 1961, p. 134). The internal RAND report on TMDS – which was written by many of the same people who produced the AMA work – also credits the computer with the capacity to quickly perform the necessary calculations to run the game (quickly being about five minutes per round), but it then mentions another noteworthy "benefit": the ability to conceal game mechanics from the player(s). Since the computer operates the mathematical model that governs gameplay, pen and paper calculations are not necessary. Noting that, in the real business world, "there are no precise relationships connecting allocations and monetary return," the RAND authors argue that the concealment of game mechanics allows for a level of necessary uncertainty, while also encouraging the players to "think in terms of general concepts and policies" (Bellman et al., 1957, p. 38). Note the similarities between this approach and the Demand and INPUT-based programming systems described earlier in this chapter. In both scenarios, the underlying code is intended to remain hidden, limiting (deliberately or not) the agency of the user running the program. The notion of "hidden" code has become normalized over the years, and users have generally come to expect that the computer reveals only a select amount of the work that it is doing; as Kitchin and Dodge note, "[a]lthough code in general is hidden, invisible inside the machine, it produces visible and tangible effects in the world" (Kitchin and Dodge, 2011, p. 4).

As with military-based operational games, a mathematical model lies at the core of TMDS, driving its mechanics and processing incoming data. As the RAND report notes, with respect to operational game design, "the mathematician plays an essential role in designing the games and interpreting the results" (Bellman et al., 1957, p. 8). As discussed in the previous chapter, this involves expanding on the scientific practice of developing "mathematical models of physical phenomena" and applying it to new contexts, such as the business world, though the authors are quick to claim that "[m]any more problems arise to plague us in the construction of these business models than ever confronted an engineer" (Bellman et al., 1957, pp. 6-8). In fact, they argue that such models require consistent input from the user or users in the form of "decisions", noting that "[t]he simulation processes…will involve the use of human beings and machines, rather than machines alone" (Bellman et al., 1957, p. 8). This combination of models and regular, though limited, interaction with such models defines the basic framework of an operational game, as discussed. TMDS thus represents a complete transfer of the operational gaming paradigm to an entirely new context, one in which it could more easily be copied and adapted by other organizations.

Despite its somewhat limited reach, then – TMDS was only played during specific AMA training programs – it catalyzed a host of similar management-based operational games. Such games were created, according to Cohen and Rhenman, "both to improve the game and to enable others to use it more conveniently and more cheaply" (Cohen and Rhenman, 1961, p. 136). Universities such as UCLA and Carnegie Tech (now Carnegie Mellon) developed their own games, as did a myriad of other institutions. In a 1961 article, William Dill – who had a hand in creating some of these games – notes that "there are more than a hundred management games" in existence (Dill, 1961, p. 55). Many of these followed the same format as the AMA game. Consider, for example, IBM's Management Decision-Making Laboratory, described as follows:

Briefly, play consists of three teams, each making six basic decisions concerning production, transportation, marketing, research and development, and plant investment expenditures, plus prices to be charged in both the home and outlying areas. Each team has a home area, two "opponent home areas," and a common area (Hoffman, 1965, p. 170).

In both games, groups of individuals were tasked with making management decisions in areas such as marketing, research and development, and overhead costs. The notion of making "decisions" governs gameplay in most cases, and the number of decisions required per turn in some games increased substantially over TMDS; in an experiment conducted at Harvard University to study the effectiveness of management games, a game was designed in which the participants were asked "[u]p to 76 questions" per turn "to define product quality, to budget marketing effort and production volume, to set prices and to determine the source and amount of outside financing" (McKenney & Dill, 1966, p. 29). The highly influential Carnegie Tech Management Game required players to make "between 100 and 300 decisions" per turn, its authors believing strongly that "if the realism of business games could be increased, a more effective educational and research tool than previously existed would be created" (Cohen and Rhenman, 1961, p. 139; and Cohen et al., 1960, p. 304).

It is worth dwelling on this idea of decision making for a moment. What exactly is a decision? In an early article on the subject, in which the authors go so far as to classify the management games they study as "decision" games, they describe their understanding of the term as follows: "[t]he players must consider the situation posed by the structure of the game and various forms of information related to it and then commit themselves to specific courses of action. Hence the designation 'decision game'" (Day and Lymberopolous, 1961, p. 251).

Starting from around this period, the notion that decisions are a fundamental element of operational gaming becomes increasingly emphasized. In an article on operational gaming published in 1966, Feldt notes that "a great number of games have been developed by firms and business administration schools for teaching some of the fundamentals of decision-making in a business organization" (Feldt, 1966, p. 18). Forsberg, writing about an operational gaming-related public school initiative supported by the University of Oklahoma, noted that "operational gaming is a sort of decision simulation where the players make decisions within the framework of a simulated operating system" (Forsberg, 1969, front cover). And Forbes, writing in 1965, offers a definition of operational gaming that resonates very much with the approaches to game development discussed in this chapter:

[An operational] game is a model or gross replica of the major decisions and environmental forces influencing decision-making that exist in the operation of some purposive enterprise. Individuals or teams of players compete against one another and against the environment in playing a game: (a) by making decisions regarding operations in the simulated enterprise; (b) by seeing the effects of these decisions upon the model; then, (c) by living with the results of their decisions in subsequent rounds of decision-making practice (Forbes, 1965, p. 15).

This makes it sound as if players typically have immense freedom in deciding on the "courses of action" that they wish to take. The reality of the situation, however, is more complex. The following passage, for example, illustrates the basic mechanics of TMDS: "[f]rom the given alternatives, shown on the data sheet as specific sums of money, each team selects expenditures it will make for production, marketing effort, research and development, and additional plant investment. It also selects the price it will charge for the product" (Bellman et al., 1957, p. 79). Clearly in this case, then, players are limited to whatever decision options are provided on the given data sheet. The situation was similar for other early management games as well. Both IBM and Carnegie Tech's games allowed players more flexibility than TMDS, but only in the sense that players could enter variable values. Even so, no means was provided in any early management game to explore, for example, the underlying mechanics, or to alter the decisions the player was to make. The games were, in a sense, frozen, reducing players to suppliers of variable values.

3.4 Time-Sharing and Command-Line Games

In the games described above, player input was fed into the computer via punched cards. This meant that all of the decisions required at a given point in time had to be submitted at once. With the advent of time-sharing networks, however, this situation began to change. As described above, such networks enabled users to interact with central servers via command-line terminal interfaces. At this early stage, such terminals were typically teletype machines that would print out both user commands (entered via typewriter-style keyboards) and the consequent output provided by the server. This provided for more immediate interactive engagement with programs such as management games, and enabled the rise of "command-line games", a category that was extremely important in the hobbyist game programming era. Interestingly, it was through public school systems in the United States that much of this knowledge was transferred from colleges and corporations to more casual users, enabled in large part by computer engineering firms such as DEC that targeted educational institutions in their marketing efforts.

As early as 1962, an article on MIT's early time-sharing efforts noted the potential utility of the technology with respect to both war and management games, though in a slightly-modified format:

In the design of the present system, great care went into making each user independent of the other users. However, it would be a useful extension of the system if this were not always the case. In particular, when several consoles are used in a computer controlled group such as in management or war games…it would be desirable to have all the consoles communicating with a single program (Corbató, Merwin-Daggett, and Daley, 1962, p. 340).

Shubik expands on such thinking by providing slightly more detail in terms of how such a network would operate when management games were played:

One feature of computer systems which may see considerable growth in the next few years is display devices and input-output implementation and instrumentation…Improvements and increases in the variety of peripheral equipment have made it possible to completely automate games to the extent that the player may enter his decisions directly into a console and receive messages back by machine-operated type-writer as soon as sufficient information has been obtained (Shubik, 1968, p. 640).

Shubik, however, then goes on to note that "major problems involved in time-sharing on large computers in full production have not been solved" (Shubik, 1968, p. 640). While the accuracy of this statement is perhaps questionable – systems such as JOSS and BASIC were already up and running by then – it may reflect the general thinking among game designers at the time, given Shubik's prominence in the field. Yet by 1967 there was at least one experimental management game operating on a time-sharing network, employing MIT technology (Ferguson and Jones, 1969).

While these early experiments in time-shared gaming were taking place, events of major significance were also unfolding in a public educational facility in Westchester County, New York. The county is home to the Center for Educational Services and Research, a unit within the Board of Cooperative Educational Services (BOCES), a statewide organization tasked with providing a variety of services to its school districts (Wing, 1966). Beginning in 1962, according to Richard Wing, Coordinator of Curriculum Research, the Center had been "experimenting with the use of computer-based games with simulated environments as an instructional methodology" (Wing, 1962, p. 31). From October 1965 to March 1966, sixth-grade students from a school in Yorktown Heights were brought to the Center to play two computerized "economics games": the "Sumerian Game", and the "Sierra Leone" game.17 Each of these games was intended to provide users with "simulated environments" within which to learn fundamental aspects of a given economic system, and each was played via a command-line teletype interface connected to a time-sharing network.

17 A third game, the "Free Enterprise" game, was designed, but not fully developed at the time these experiments were taking place.

Wing and the co-authors of a report on these experiments defined simulation as "a technique by which the essential features of some object or process are abstracted and recombined in a model which represents the functions of the original and can be manipulated for the purpose of study or instruction" (Wing et al., 1967, p. 6). Note the similarities between this definition and those given for operational games in this chapter and the one previous. Wing and his fellow researchers, in fact, cite management and business games as a direct antecedent of their own work, noting the following:

Business and management games are used by universities and corporations to train executives. Examples of this are the economic games of Dill, the IBM game [sic], and the Univac [sic] Game, in which competitive business situations are simulated and the decisions made by the players are analyzed by a computer to demonstrate to participants the organization, planning, information transfer, analysis, review, interaction, and dynamic nature of business (Wing, 1967, p. 6).

Each game, then, was structured around the notion of decisions, meaning that the computer would prompt the user for data that it would then process via an unseen economic model. Given the importance of the Sumerian Game in later events, it is worth citing a description of its operation at length here:

The first messages introduce the student to the proper use of the terminal and give him an idea as to what his game objectives should be. Following the introduction, the student receives a first Seasonal Population and Harvest Report. The reports provide facts about population, acres of land for planting grain, number of farm workers, grain recently harvested, and grain remaining in inventory from previous harvests. The last part of the report asks the student to allocate his resources (grain harvested plus grain in inventory) among three requirements: (1) food for the people, (2) seed for next season's planting, (3) inventory for future needs. Such seasonal reports…with their related allocation decisions comprise the basis of play of the entire game (Wing et al., 1967, p. 14).

The Sierra Leone game behaves in a similar manner, with additional components, such as mining and manufacturing, added to its economic system. Wing indicates that the Sumerian Game was the first game to be developed, so it seems reasonable to assume that the Sierra Leone game was built on its foundation (Wing, 1966).

The experiments at BOCES were quite extensive, involving lectures, audio recordings, and informational handouts that supplemented the learning that was meant to be derived from the games themselves. Results of the experiments, however, were decidedly mixed, and the value of operational gaming in an educational setting remained an open question. Yet none of this research was particularly influential in the emergence of hobbyist game programming. Rather, it was the Sumerian Game itself that was to have the most impact on future events. How this came to pass is somewhat unclear, at least in the early stages.18 The project, and especially the Sumerian Game, earned some national attention; an issue of LIFE Magazine from 1966 described the game in some detail, referring to it as a "technique of cybernetic make-believe," which actually was quite prescient given the role operational gaming was to eventually play in the computerization of strategy and role-playing games (Cory, 1966, p. R2). But Doug Dyment, a DEC employee based out of Carleton Place, Ontario, who is believed to be the first individual to have created a derivative version of the Sumerian Game in 1968, claims that he heard about the game and its basic mechanics from a conversation he had while at an academic conference (Monnens, personal communication, 2014). Dyment wrote his version in FOCAL, DEC's version of JOSS that was discussed earlier in this chapter. It was much simpler than the BOCES game, retaining only the basic mechanics around farming land and feeding the population, and due to limitations on file name sizes it was renamed Hamurabi, a reference to the Babylonian King Hammurabi that was both inaccurate and incorrectly spelled.19 Regardless, the game was discovered by the Digital Equipment Computer Users' Society (DECUS) and featured in its annual software catalogue (Monnens, personal communication, 2014).

18 Much of the credit for the research findings presented here on this issue goes to games scholar Devin Monnens, who has spent several years accumulating primary source materials on the Sumerian Game, and interviewing individuals who aided in the development of the game and its later incarnations.
19 I will refer to the game as "Hammurabi", as the correct spelling was generally used in later descriptions of the game.

Hammurabi, however, would not have become as popular as it did had there not been programming language environments through which it could be played and propagated. FOCAL, as was discussed, was a derivative of the RAND Corporation's JOSS language, and could therefore support the sort of command-line interactivity that the game required. But it was the BASIC version of Hammurabi that became the most well known and most widely adapted. To understand why this was the case, it is necessary to discuss in more detail the rise of the BASIC language in the 1970s. As noted earlier in this chapter, BASIC was one of several time-sharing, network-based interactive programming languages. Based out of Dartmouth, Kemeny and Kurtz wanted to create a language that "was ideal for introducing beginners to programming and yet could serve as a language for all applications, even for large and complex software systems" (Kemeny and Kurtz, 1985, p. vii). With a small vocabulary of reserved words, as well as a simple line numbering system that could be leveraged by branching statements, BASIC programs were simple to create and build upon. Such features alone help to explain its popularity. Yet similar concerns motivated the development of JOSS, and then of FOCAL, as has been discussed, and both languages are arguably just as easy to learn and use. The major difference between BASIC and its peers, however, was the fact that Dartmouth allowed its creation to be copied, modified, and distributed without any restrictions such as licensing fees. David Ahl, former DEC employee and founder of Creative Computing magazine (see introduction), noted the following in a recent article on BASIC:

DEC's FOCAL language was equal to BASIC in most aspects and even better in some, but it had one huge drawback: DEC was unwilling to license it to other computer manufacturers. FOCAL was fighting an uphill battle against BASIC, which was available on GE, Honeywell, HP, and other computers (Szczepaniak, 2012, p. 37).

Ahl, in fact, played a major role in the promotion of BASIC, both for DEC's machines and for the industry as a whole. While at DEC in the early 1970s, Ahl became involved in "marketing minicomputers to schools and colleges," and these institutions were more interested in BASIC than DEC's own FOCAL. As he noted, "because of the pioneering work at Dartmouth by John Kemeny and Tom Kurtz in developing BASIC and related educational applications, most colleges and high schools wanted a computer that spoke BASIC" (Szczepaniak, 2012, p. 37).

These educational institutions reciprocated, or at least their students did so, by sending Ahl source code from the various BASIC projects they worked on. Many of these programs happened to be games. There are several possible explanations for this. First, thanks to DECUS, games such as Hammurabi – which, as noted above, was reproduced by a DEC employee and later picked up by DECUS – were already circulating. Storer has noted the presence of Hammurabi on his school's computer, and even used it as the foundation for one of his post-Lunar Lander games (Storer, 2015; more on this shortly). Second, much of Ahl's outreach was taking place within high schools, and even junior high schools. It seems reasonable to assume that these are populations for which gaming would be especially popular. Third, Ahl was actively looking for games to use as demonstration programs specifically for DEC's PDP-8 computer. Since the BASIC language and environment on the PDP-8 took up 3.5 kilobytes of the machine's 4K memory, Ahl wanted to make sure users knew that they could create useful programs despite the small amount of remaining memory. Reflecting on this time, he noted that "[w]e needed to demonstrate that such a limited configuration could run real programs, so I started converting FOCAL demo programs to this low-end BASIC…We also encouraged users who wrote programs, especially games, to submit them" (Szczepaniak, 2012, p. 38). Finally, the BASIC programming environment, as discussed above, was conducive to the development of command-line games that made use of the INPUT command. It is worth re-emphasizing this point, as other high-level languages of the time, such as FORTRAN, did not include such a command, and Kemeny and Kurtz have even acknowledged that they did not foresee its eventual popularity, noting that they "did not place a high priority on being able to write programs that were in themselves interactive," and that their "first priority was to be able to 'interact' with the computer system in order to create, check out, and run small programs" (Kemeny and Kurtz, 1985, p. 25). What is interesting about these remarks is the fact that, as I argued earlier in this chapter, Kemeny and Kurtz were probably correct in their initial estimation that direct interaction with the "computer system" was a better approach than using INPUTs. The INPUT command enabled command-line games to flourish, but at considerable cost with respect to user agency.

By the mid-1970s, then, computer programming was becoming increasingly accessible, particularly within educational institutions. Yet it must be remembered that the concept of interactive programming within a custom-made coding environment was first developed at RAND, and caught on elsewhere via RAND's outreach efforts. These findings stand in contrast to prior scholarship that focused too heavily on Licklider's work, as well as on the BASIC language. It is also worth remembering that DEC's highly influential interactive BASIC programming environments were actually extensions of its FOCAL-based systems, providing the same functionality in a different, though quite similar, programming language. DEC's version of BASIC would then go on to become a major influence on the development of BASIC programming environments for personal computers (see next chapter).

It would only be a matter of a few years, in fact, before companies such as Commodore and Atari released the earliest viable "home" computer systems. Languages such as FOCAL and BASIC, moreover, were making coding much easier than it had been in the earliest years of digital computing technologies. Yet it must be stressed that, along with accessibility, came standardization. The widespread acceptance of time-sharing networking meant that most, or at least many, computer users were accustomed to command line-based interactive computing. Personal computers were not network terminals, but the earliest PC manufacturers retained the familiar command-line interface. Microsoft BASIC, arguably the most important of the PC interactive systems developed in this period (as will be discussed in more detail in the following chapter), was patterned after a version of BASIC used by DEC's PDP-10 machines. Such influence would thus reify a certain type of interactive computing that would resonate for years, and arguably continues to do so.

4 Hobbyist Programming and Hobbyist Gaming

In 1977, the foundation for the hobbyist programming movement was built in the form of three new personal computers, each from a different manufacturer: the Commodore PET, the Apple II, and Tandy Corporation's TRS-80. They came pre-assembled, unlike the "kit" computers that were being sold to home users earlier in the 1970s. Unlike such kits, which were meant to appeal to hardware enthusiasts, these pre-built PCs were designed for programmers. Users could jump right into coding once their computers were switched on. Due to certain events that will be discussed in this chapter, most of the earliest hobbyist computers came equipped with an interactive BASIC system – typically a variant of Microsoft BASIC – which behaved similarly to those programming systems described in the previous chapter. Just as with JOSS and FOCAL, BASIC systems on the PC allowed users to develop and run programs from the same command-line environment, while runtime interactivity was centered around the INPUT command. There was one important difference between the new PCs and their antecedents, however: rather than serving as a terminal for a central server, home computers were standalone, with their own memory and processing units. Machine manufacturers thus had to devise other ways of connecting users. Physical storage media such as floppy disks and cassette tapes were common, as were printers that let users print out their code. Print, in fact, was a particularly effective means to achieve widespread distribution of a given program. This chapter will thus look at the BASIC game programming movement of the 1970s and 1980s largely through the books and magazines from the period. All of these actants – the PCs, the BASIC language, command-line interfaces, printed materials, and, of course, the users themselves – combined to define a pivotal era in the history of digital computing.

This chapter will contextualize the hobbyist programming era within the larger narrative developed thus far – that is, the emergence of a specific form of computation in the postwar period, followed by a leveraging of interactive computing to produce operational games, primarily at RAND. As such, an emphasis will be placed on coding and game development practices. Rather than providing a general survey of such games, I will highlight those that had a major influence on hobbyist culture, or that reflect an important aspect of that culture. The roots of hobbyist game programming may be traced back to the postwar era in which RAND developed its concept of operational gaming and engineered the JOSS system that facilitated interactive gaming via a digital computer. The major difference between game development in the hobbyist era and the work which was carried out by RAND – and later picked up by a variety of public and private-sector interests – was that hobbyists were programming for their own enjoyment, not in order to achieve some larger goal such as war preparedness or student edification. It was through playful experimentation that familiar games were modified and expanded upon, and new game genres were essentially invented by dedicated individuals looking to push the limits of their machines.

The majority of the games that hobbyists created, distributed, and adapted were direct descendants of the operational games discussed in the previous chapter, at least initially. Resource management games such as Hammurabi were particularly popular in the early years, as were various types of war games. The operational gaming movement provided coders with a robust template upon which to build their own creations. Game programmers often borrowed heavily from exemplar listings featured in books and magazines, and they reflectively adapted these programs by adding, removing, or modifying various elements. Science-fiction author Jerry Pournelle, reflecting on the popularity of Hammurabi in an article for BYTE magazine, stated that "[h]alf the people I know wrote a Hammurabi program back in the 1970s; for many, it was the first program they'd ever written in their lives" (Pournelle, 1989, p. 115). Specific games and genres were particularly important sources for inspiration and information, and hobbyists were quick to adapt and expand on them. Hobbyist gamers, then, coded about as much as they played, if not more so – or, to put this differently, coding itself was, I would argue, a form of play. This was possible because the barriers to accessing code were so low. Games were played in the same environments in which they were developed, so access to code was fairly open. And, since so many programs were printed in books, and thus had to be typed out, many game players became quite familiar and comfortable with code.

Despite such creative adaptation and innovation, however, I will also discuss here how Microsoft BASIC and its variants were rather limited platforms that could not evolve or adapt as games grew more complex and users migrated to more sophisticated operating systems such as Microsoft's Windows and Apple's Mac OS. BASIC was not a sophisticated enough language to provide both the power and efficiency needed to serve as a useful development platform beyond the mid-1980s. As hobbyist games grew more complex, this emerged as a serious problem. Game programs printed in books and magazines began to stretch out to several pages, and "machine code" – that is, extremely low-level code consisting of nothing but hexadecimal numbers – was increasingly incorporated. Such complex code was much more difficult to reuse and reincorporate into one's own game designs. Gaming also increasingly became a separate activity from programming, particularly when games evolved into lucrative consumer products. These forces conspired to turn hobbyist programming into something of a niche pursuit, a situation that, as discussed in the introduction, may reverse itself in the coming years.

I will ground my arguments in this chapter in part on Donald Schön's design world paradigm. As discussed in chapter one, Schön argues that "design knowledge and reasoning are expressed in designers' transactions with materials, artifacts made, conditions under which they are made, and manner of making" (Schön, 1988, p. 182). The adaptive and innovative practices of hobbyist programmers were only possible, I will argue, because programming environments such as Microsoft BASIC allowed for freeform design practices reminiscent of those found within design worlds. Microsoft BASIC and its variants provided platforms upon which to engage in reflective design work – that is, the Microsoft BASIC environment served as a "holding environment" for "design knowledge", and enabled programmers to engage in a "dialogue" with such knowledge (Schön, 1988, pp. 182-183). In connection with these ideas, I will bring Chun back into the discussion in order to reflect upon the implications of the notion of the computer program, and what it means within a neoliberal context. I will focus particularly on the idea of a "finished" program – that is, a program that is presumably set in stone, and typically sold as "software". Finally, I will bring in Marino's ideas with respect to critical code; the focus here will be on the "implied code" that may be discerned from program output. As Marino warns us, code is created within specific "historical, material, [and] social context[s]" (Marino, 2006). The same may be said about programming languages. To cite Marino once more, critical code studies involves holding a "view of code as already having meaning beyond its functionality since it is a form of symbolic expression and interaction"; programming languages provide both the vocabulary and organizational principles upon which such symbolic expression is designed and deployed (Marino, 2006).

4.1 Worlds of BASIC

In 1973, Ahl, still working at DEC, published in-house a work entitled 101 BASIC Computer Games. The end product of several years of collecting games from the users of DEC's machines, this work is a valuable snapshot of the world of hobbyist game programming as it existed in the early 1970s. This is important because the games featured in this work would remain popular, and would be republished in various forms, throughout the hobbyist PC era. Ahl would accelerate this trend by republishing 101 BASIC as BASIC Computer Games: Microcomputer Edition, which was targeted directly at home computer users (Ahl, 1978). All of the games that will be discussed here were featured in the later work; I will focus on the 101 BASIC versions because they reveal the influence of operational gaming within interactive time-sharing environments at the cusp of the PC era. Personal computer manufacturers thus borrowed both the interactive environments used in such systems and the programs that were developed for them. Operational gaming as a construct was reified by this process. The books and magazines that published type-in programs generally did not situate them within a larger context, making text-based interactive gaming seem like a "natural" or inevitable aspect of digital computing.

According to the preface, 101 BASIC was "the only collection" of programs up to that point "that contain[ed] both a complete listing and a sample run of each game along with a descriptive write-up" (101 BASIC, 1973/1975, p. 7). Each game is given a rather awkward six-character name due to file name length restrictions – so a baseball simulator, for example, becomes BASBAL – and each entry includes every line of BASIC code, ready to type in, to get the game up and running. Jim Storer's influential Lunar Lander20 – here renamed ROCKET – included 35 lines of code, while the largest, such as the confusingly titled SPACWR, contained several hundred lines.21 A striking number of submissions, in fact, came from high school students, and even junior high school students. And amongst these entries, roughly halfway through the text, is the game HMRABI, or, to use its more familiar name, Hammurabi.

20 See Edwards, 2009.
21 SPACWR is actually a version of a Star Trek-themed game originally written by Mike Mayfield, and has nothing to do with the PDP-1 version of Spacewar discussed in an earlier chapter (Markowitz, n.d.). It will be described in more detail shortly.

This version of the game was coded by Ahl himself, who notes that it "is translated from the original FOCAL program which has been floating around DIGITAL for nine or more years" (101 BASIC, 1973/1975, p. 128). This ambiguous statement indicates that the history of the game, from its beginnings at BOCES through to Dyment's recreation, had already been lost. Nevertheless, the program that Ahl presents is still very much an operational game. Note the similarities in the description of the game provided by Ahl, as compared to that of the original BOCES game cited earlier:

In this game you direct the administrator of Sumeria, Hamurabi [sic], how to manage the city. The city initially has 1,000 acres, 100 people and 3,000 bushels of grain in storage. You may buy and sell land with your neighboring city-states for bushels of grain – the price will vary between 17 and 26 bushels per acre. You also must use grain to feed your people and as seed to plant the next year's crop (101 BASIC, 1973/1975, p. 128).

The program proceeds by asking the player to answer specific questions on these issues, in the manner that all command-line games operate (see figure 4-1).

The numbers that follow each question mark were typed in by the player, and represent the only means by which players could interact with the game's underlying mathematical model. After the final question was asked, these figures were processed by the model to produce data for the next game "year", which was reported in exactly the same format as above. This process is repeated for a maximum of ten turns, though a particularly poor showing would result in the player's impeachment before time expires.

Figure 4-1. HMRABI, sample input and output (101 BASIC, 1973/1975, p. 128).
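
Ahl's full listing is not reproduced in this study, but the turn structure visible in the sample run can be suggested in miniature. The constants and the crude harvest and starvation rules below are placeholders, not those of the published HMRABI program; the point is the cycle of report, INPUT-driven decisions, and recalculation that defines the game.

    10 REM MINIATURE HAMMURABI-STYLE TURN LOOP; NOT AHL'S LISTING
    20 LET P = 100
    30 LET A = 1000
    40 LET S = 3000
    50 FOR Y = 1 TO 10
    60 PRINT "YEAR"; Y; "POPULATION"; P; "ACRES"; A; "BUSHELS"; S
    70 PRINT "HOW MANY BUSHELS TO FEED THE PEOPLE";
    80 INPUT F
    90 PRINT "HOW MANY ACRES TO PLANT";
    100 INPUT L
    110 IF F + L <= S THEN 140
    120 PRINT "YOU HAVE ONLY"; S; "BUSHELS"
    130 GOTO 70
    140 REM PLACEHOLDER HARVEST RULE: EACH PLANTED ACRE RETURNS THREE BUSHELS
    150 LET S = S - F - L + 3 * L
    160 REM PLACEHOLDER STARVATION RULE: EACH PERSON NEEDS TWENTY BUSHELS
    170 IF F >= 20 * P THEN 190
    180 LET P = P - INT((20 * P - F) / 20)
    190 NEXT Y
    200 PRINT "YOUR TEN-YEAR REIGN IS OVER"
    210 END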

The entire HMRABI program is made up of roughly 120 lines of code, and many of these are simple operations such as PRINT commands, branching statements (GOTO and GOSUB), and variable assignments. This was not an overly complex program, which is likely one of the reasons why it was so readily adapted to new purposes. A good example of a game built on a Hammurabi foundation is KING, also known as the Pollution Game, designed by Storer, the creator of Lunar Lander. KING is described in 101 BASIC as "one of the more comprehensive, difficult, and interesting land and resource management games," and the book notes that "[i]f you've never played one of these games, start with HMRABI" (101 BASIC, 1973/1975, p. 138). These remarks already indicate that KING was similar to Hammurabi, and some sample output from the program illustrates this point even more clearly (see figure 4-2).

Figure 4-2. KING, sample input and output (101 BASIC, 1973/1975, p. 129).

There is certainly more printed text per turn compared to Hammurabi, and there are variables, such as pollution and foreign investment in industry, that are completely new. But the issue of feeding one's population persists, as does an overall concern with land management. More fundamentally, the game follows the same INPUT-based interface as Hammurabi, presenting summary data for each turn and prompting the user to make several decisions related to resource allocation. KING thus still functions as an operational game, bringing user input to a mathematical model and feeding the results back to the user. This statistical output is, in fact, one of the stronger indicators that a given computer game can be classified as operational. Mathematical models in operational games are identified largely by the data they produce, as well as the functions used to generate this data.

This process of adapting old code to create an entirely new game resonates strongly with Schön's design world paradigm. Schön even makes a point of emphasizing that such "reverse engineering" can be a fundamental step in the design process:

There is a two-way interaction between design types and design worlds. On the one hand, elements of a design world may be assembled to produce an artifact that comes to function – either in the practice of an individual designer or in a larger design culture – as a type; so it has been with Trinity Church. On the other hand, the direction of causality may be reversed. A vernacular building type – with its constituent things and relations, forms, materials, construction methods, ways of organizing space, and symbolic vocabularies – may "loosen up" to provide the furniture of a design world (Schön, 1988, p. 184).

It could be argued, then, that Storer was able to "loosen up" the "furniture" provided in Hammurabi, and then use it to build his own version of the game. But this is only one aspect of the design process. To re-emphasize a point made earlier, design worlds are not like building blocks or toolkits, in which component parts are discrete, atomic units. Rather, they represent subjective, reflective impressions of specific design situations. Storer's KING is clearly an adaptation of Hammurabi, but this adaptation emerged within a context that he created during the design process, in which he divided up the program according to his understanding of it, keeping and adapting certain elements while discarding others.

This line of argumentation gets at one of the most important aspects of the design world paradigm: the notion that designers do not simply construct solutions, but that they also construct the very problems that they solve. As Schön puts it, "it is clear that a 'problem space' is not given with the presentation of the design task; the designer constructs the design world within which he/she sets the dimensions of his/her problem space, and invents the moves by which he/she attempts to find solutions" (Schön, 1992, p. 11; emphasis in original). A design problem may set a general, overall objective, but the path to that objective – that is, the set of problems that need solving to meet it – is defined wholly by the designer. With respect to more freeform design work, this overall objective might not even exist, or at least may be very broad, as Schön notes:

It is worth noting that [the designer's] intention was not fully established at the beginning of her design process, but evolved through her appreciation of an intermediate design product. Her intention developed in 'conversation' with the process by which she transformed her design. An evolving intention is one of the outputs of her designing (Schön, 1992, p. 6).

Note that this practice of "learning by doing" resonates strongly with the constructionism model discussed in the literature review, which will be revisited in the following chapter.

4.2 Beyond Hammurabi

Despite its novel aspects, KING is still very much what might be called a "clone" of Hammurabi, in that it preserves much of its functionality and its basic ethos. A more interesting example from 101 BASIC of a variant program that uses similar design world elements is CIVILW, a "simulation" game of various battles from the American Civil War. CIVILW was also created at Storer's Lexington High School, so it may be assumed that its developers had influences and interests similar to Storer's, at least to a certain extent. The influence of Hammurabi is clearly visible with respect to the game's basic mechanics, even though it is quite different in other ways. There are 14 battles in total; they are presented to the player sequentially, one at a time, in the same manner that Hammurabi and KING advance one year at a time. In each round, the player answers a few basic questions about resource management, and is then asked to pick a strategy from a numbered list provided at the beginning of the game – examples include "ARTILLERY ATTACK," "FRONTAL ATTACK," and "ENCIRCLEMENT" (see figure 4-3).

Note that the majority of player interaction with the game involves the allocation of resources, just like in Hammurabi and KING. The question on strategy (i.e. "YOUR STEGY?"), however, does not have a cognate in these other games. As already noted, the user is required to select a battle strategy from a list of choices, which appear at the start of the game (see figure 4-4). Note that for the Battle of Chattanooga, shown in figure 4-3, the player was required to select an offensive strategy; the player is thus really working with two lists, but the basic functionality remains the same. The remaining core components in CIVILW are almost certainly adapted from Hammurabi, or perhaps KING.

Figure 4-3. CIVILW, sample input and output (101 BASIC, 1973/1975, p. 82).

Did CIVILW's programmers invent the very concept of menu-based choice? Probably not, and, in fact, it would be unwise to begin making such declarations with respect to any game element or mechanic. Consider the game SPACWR, which is the final program listed in 101 BASIC. SPACWR is something of an anomaly, and its presence in 101 BASIC requires some explanation.22 The original version of the game was designed and developed by Mike Mayfield in 1971 on a Scientific Data Systems (SDS) Sigma 7 mainframe. The original name of the game was actually Star Trek, and it incorporated the setting and lore of the original television series. With support from sales staff at Hewlett-Packard, Mayfield rewrote the game for the HP2000C time-sharing network computer. The game was then distributed by HP under the name STTR1, and became quite popular (Markowitz, n.d.).23

22 Credit for uncovering this history goes to Maury Markowitz and his website Games of Fame. Markowitz had written a history of SPACWR/Star Trek that was later supplemented with crucial information supplied by Mayfield himself, as well as Bob Leedom, who created an important later variant of the game.

Figure 4-4. CIVILW, choosing strategies (101 BASIC, 1973/1975, p. 82).

Recognizing its popularity, Ahl (and Mary Cole, whom he credits in the book) converted STTR1 from HP's version of BASIC to DEC's, and published this converted version in 101 BASIC. Like the original Spacewar, SPACWR/Star Trek was a sprawling, complex game written for a high-end computing system, and quite unlike the simpler programs that fill most of the pages of 101 BASIC. Its features included a map rendered in text, a wide range of rooms (i.e. "quadrants" in space) to explore, an in-game "computer" that provided vital statistics about the player's ship, and combat encounters with Klingon warships in which two types of weapons ("phasers" and "photon torpedoes") could be deployed. All of these features were accessed via an in-game menu system, similar to – though more complex than – the menu found in CIVILW.

Does this mean that CIVILW's programmers borrowed the concept from Star Trek? This also seems unlikely, given that the PDP-8 at Lexington High was probably the most powerful computer that any of the students at the time had ever encountered. Given that Ahl was circulating games and other programs to DECUS and other PDP users, however, it is very possible that a similar menu system appeared in a game that the students had encountered. But pedigree is not the major issue here. What is important is the fact that the developers of all these games engaged in typical design world practices, building both from elements of older games and, at times, from new ideas to create games that serve as reflections of their subjective experiences with their problem spaces. When an element is reused, this tells us that a given programmer discovered this element in an earlier work, and, while engaging in "conversation" with their own design domain – that is, when they began making their own computer games – retrieved this knowledge and used it to fashion one or more elements for their own design world. They may then copy the old code directly, or, like Dyment, they might build the same elements in an entirely new context. What is important is the design process itself, and by looking at games with similar elements one may see how certain ideas were adapted in different game contexts, each emerging within a unique design world. It is this process, and the way that it spread across the hobbyist community, that is of concern here, rather than issues of precedence.

23 The original source code may be found at http://www.dunnington.u-net.com/public/startrek/STTR1

Other games in 101 BASIC cast operational gaming in a variety of contexts. A game called FURS assigns you the role of "leader of a French fur trading expedition in 1776" who must sell goods and buy supplies at various outposts (101 BASIC, 1973/1975, p. 106). A game called WAR-2 tasks you with dividing up your troops into various military branches (i.e. Army, Navy, Air Force) to wage fictional battles with the computer. A number of important innovations in game mechanics and structure were in fact introduced throughout the 1970s. These innovations were borrowed and adapted to different game settings and contexts, and therefore propagated through the networks of schools and universities, user groups, and books and periodicals that connected users in what was becoming an increasingly popular hobby and pastime. This fluid movement, particularly with respect to home PCs, was due in part to the rise of Microsoft's version of the BASIC language and interactive environment. While emerging home computer companies such as Commodore and Apple had been working on their own versions of BASIC, Bill Gates and Paul Allen convinced most of them to license Microsoft's version of BASIC instead (Allison and Gates, 1993). While not quite a lingua franca – versions varied somewhat between models – Microsoft BASIC provided enough of a standard that authors like Ahl could market their books of programs to "microcomputer" owners, meaning that the code inside was written for Gates' and Allen's version of the language (Ahl, 1978).

Given the close reading of 101 BASIC above, it is more useful now to skip over BASIC Computer Games, as it featured much of the same material, and turn to Ahl's later works: More BASIC Computer Games, published in 1979, Big Computer Games, published in 1984, and BASIC Computer Adventures, published in 1986. This takes us straight into the heart of the home computer hobbyist boom of the early 1980s, when landmark "microcomputers" from the late 1970s, such as the Apple II and the Commodore PET, made way for the even more sophisticated Apple Macintosh, Commodore 64 (quickly supplanting Commodore's earlier VIC-20 model), and updated machines from companies such as Sinclair, Tandy, and Atari (see Ceruzzi, 2003). This would lead in turn to a mushrooming of publications catering to these home hobbyists, most of which contained the same sorts of type-in programs that Ahl had been marketing already for several years. While Ahl's books are still central, then, other actors began to come into the picture, influencing the direction of microcomputer gaming. As outside influences, including that of arcade gaming, grew more powerful, the costs in both time and money needed to develop games for increasingly capable computers grew rapidly, and hobbyist programming would largely fall by the wayside.

Getting back to the late 1970s, however, Ahl's second book of BASIC games reveals extensions of trends that were established in the first title, as well as the introduction of new game types. With Artillery 3, the user is presented with a physics-based puzzle that requires input for certain parameters, which are then used to calculate how far a given artillery shot travels, with the overall goal of hitting a specific "target" distance. While the game is purely text-based, graphical artillery games – which operate on very similar principles – would go on to become a subgenre of their own that is still very much relevant today, with casual games such as Angry Birds. The game Grand Prix is also a single-question, physics-based game, querying the player for an acceleration value as their car navigates a race track (displayed to the user via a text-based drawing, something that would become increasingly popular in certain genres). The game also, however, requires menu-based decisions with respect to what car the player wishes to drive, and what car they want to race against (see figure 4-5).
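The physics behind a game like Artillery 3 amounts to a single formula: ignoring air resistance, a projectile fired at velocity V and elevation A travels a horizontal distance of V²·sin(2A)/g. The round below is a hypothetical reconstruction of that style of play, not Ahl's listing; the prompts, the tolerance for a "hit", and the random target range are all invented for the example.

10 REM ONE ROUND OF A SIMPLE ARTILLERY GAME (ILLUSTRATIVE ONLY)
20 LET T=500+INT(RND(1)*500): REM TARGET DISTANCE IN METRES
30 PRINT "THE TARGET IS";T;"METRES AWAY"
40 INPUT "MUZZLE VELOCITY (M/S)";V
50 INPUT "ELEVATION (DEGREES)";A
60 LET R=V*V*SIN(2*A*3.14159/180)/9.81: REM DRAG-FREE RANGE FORMULA
70 PRINT "YOUR SHELL LANDED";INT(R);"METRES AWAY"
80 IF ABS(R-T)<10 THEN PRINT "A DIRECT HIT!": END
90 PRINT "A MISS -- ADJUST YOUR AIM": GOTO 40

The same exchange – a prompt, a formula, a comparison against a target value – is the operational-game transaction in miniature, which is part of why the genre slotted so easily into the BASIC type-in format.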

Such hybrid input-and-menu games feature prominently in More BASIC, as programmers brought design world elements together in new ways. The game Deepspace is a particularly interesting example. The main section of the game, in which you hunt down and destroy enemy spacecraft, is largely menu-based. The player is given options to fire various weapons, or to disengage entirely. But one of the options – "CHANGE VELOCITY" – requires the user to input a value, adding another layer of nuance to the game. Even before these options are given, however, the user must engage in the rather complex task of selecting their own ship, and outfitting it with weaponry. Each ship has different advantages and disadvantages in terms of speed, storage capacity, and defensive capabilities, and each weapon has similar trade-offs with respect to size and power (see figures 4-6 and 4-7).

Figure 4-5. Grand Prix, sample input and output (Ahl, 1979, p. 66).

This level of customization is rather striking when compared to more simplistic shooting games such as Artillery 3. It is also reminiscent of the military logistics games developed at RAND that, as discussed above, were the antecedents to hobbyist gaming. There are also traces of what will rapidly become a major influence on hobbyist gaming: pen-and-paper role-playing games. While this is a science-fiction setting, as opposed to the more popular fantasy, space combat is still a popular RPG trope, and there is a level of pre-game customization here that mirrors the character creation stages featured in most RPGs.

Another important sub-genre that needs to be mentioned here is one that I will refer to as "graph games" – that is, games in which players "travel" to various points in a network of nodes (i.e. a graph) to accomplish any number of goals. The earliest of these games play similarly to the board game Battleship – there is, in fact, a Battleship clone listed in 101 BASIC under the name SALVO. The player is tasked with locating ships hidden on a grid by entering coordinate values for that grid, with the computer then announcing whether or not the hidden ships were "hit".

Figure 4-6. Deepspace, sample output (Ahl, 1979, p. 46).

Figure 4-7. Deepspace, more sample output (Ahl, 1979, p. 46).

A major innovation in graph games emerged in the early to mid-1970s, when game programmers began placing the player "inside" the graph. In Battleship, the player is positioned "above" the game space, and is essentially capable of accessing every node at any given point in time. Battleship imitators such as Mugwump, in which the goal is "to find the four Mugwumps hiding on various squares of a 10 by 10 grid," operate similarly (101 BASIC, 1973/1975, p. 156). But in the May 1973 issue of People's Computer Company (PCC) – a journal published by a San Francisco Bay-area countercultural organization of the same name – there is a sample run of a game called CAVES 1, written by one Dave Kaufman.24 As he explains it, "Caves 1 is one of a series of games that let you explore tree structures and networks represented as 'caves'" (Kaufman, 1973, p. 4). The program works by allowing navigation only to those nodes in the graph that are connected to the node that the player happens to "be in" at any point in time (see figure 4-8).

24 CAVES 2 and CAVES 3 also existed, but were just variations on the same theme.

The object of the game, then, is for the player to navigate this space until they reach room 25, which leads to the exit. By treating the player as if they were physically navigating this space, CAVES 1 limits player choice in terms of exploring the larger graph, and therefore creates a new kind of game in which the player must cope with a mathematical model that provides only limited information at any point in time.

Figure 4-8. CAVES 1, sample input and output (Kaufman, 1973, p. 4).
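The essential trick of CAVES 1 – movement restricted to nodes adjacent to the player's current position – takes only a small adjacency table to reproduce. The five-room layout below is invented for the illustration and is far smaller than Kaufman's cave; only the principle matters here.

10 REM TINY ADJACENCY-LIMITED CAVE CRAWL (HYPOTHETICAL LAYOUT)
20 DIM C(5,3): FOR I=1 TO 5: FOR J=1 TO 3: READ C(I,J): NEXT J: NEXT I
30 DATA 2,3,0, 1,4,0, 1,4,0, 2,3,5, 0,0,0
40 LET R=1
50 PRINT "YOU ARE IN ROOM";R;". TUNNELS LEAD TO:";
60 FOR J=1 TO 3: IF C(R,J)>0 THEN PRINT C(R,J);
70 NEXT J: PRINT
80 INPUT "WHERE TO";N
90 LET F=0: FOR J=1 TO 3: IF C(R,J)=N THEN LET F=1
100 NEXT J
110 IF F=0 THEN PRINT "THERE IS NO TUNNEL THAT WAY": GOTO 50
120 LET R=N: IF R=5 THEN PRINT "YOU HAVE FOUND THE EXIT!": END
130 GOTO 50

Because the program reveals only the exits from the current room, the player experiences the graph from the inside, exactly the shift described above; Wumpus, discussed next, adds hazards to the same skeleton.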

In a later issue of PCC, and then in Creative Computing, and, finally, in More BASIC, there appeared a game, designed by one Gregory Yob, that built on the same navigational model as the CAVES series, but included an important new element: a creature that had to be both hunted down and avoided, depending on circumstances (Yob, 1975; Yob, 1979). The game was called Hunt the Wumpus, or simply Wumpus, and became one of the more recognized hobbyist games of the era. In addition to the Wumpus, Yob added other tricks and traps to the cave network, as the in-game instructions reveal (see figure 4-9).

With its cavernous setting, a monster that needed to be killed, hidden traps such as bottomless pits, and mischievous bats that could carry you randomly from room to room, Wumpus is very reminiscent of the fantasy-based games that were becoming popular at the time, particularly Dungeons & Dragons. Given its date of publication, it is unlikely that D&D directly influenced Wumpus, but Wumpus provided a template upon which such games could later be built.

Figure 4-9. Hunt the Wumpus, sample input and output (Ahl, 1979, p. 179).

4.3 Arcades, Consoles, and COMPUTE!

In 1972, a year before the first edition of 101 BASIC was published, a television-based game called Pong was developed by a small, unknown company called Atari, and became a sensation in the bars and amusement arcades where pinball had once ruled. In 1975, a home version of the game would be released, meeting with similar success. By the mid-1980s, shortly before the approximate endpoint of the hobbyist movement discussed in this chapter, video arcades were a mainstay in North American cities and suburbs, while home consoles had suffered two "boom and bust" cycles; a Japanese gaming company called Nintendo was attempting to revive American interest in home gaming. The many stories and subplots that make up the early history of home and arcade gaming have become an important focus of gaming historians, both academic and non-academic, as was discussed previously in the literature review. Yet the influence (or lack of influence) that these games had on hobbyist home PC gaming has not been fully considered. A final answer on the matter is not possible here, but a few important links between these two worlds will be outlined.

The earliest arcade games had been derived largely from one of two major templates: the Pong model, in which one or two players bat a ball around the screen with paddles, and the Computer Space model, in which players control their vehicles in simulated outer space while contending with some form of obstacle. The original Pong was created in 1972, as already noted, but it was inspired by earlier games designed for the Magnavox Odyssey, the first home digital game console (Wolf, 2008).25 Computer Space was a scaled-down version of Spacewar; while commercially unsuccessful, it would influence the development of later space-related games such as Asteroids (Wolf, 2008). These games offered very different experiences from the BASIC games of the day. Graphics dominated, text was largely absent, and players controlled the games via joysticks and buttons. These games would also run continuously, moving objects on the screen even when the player(s) were not using the controls – the term "twitch game" is now used to describe this model, highlighting the importance of fast reflexes for successful play.

With the release of the Atari VCS (later Atari 2600) gaming console in 1977, a wide range of arcade-style games could be played at home. Despite its extremely limited graphical capabilities, the 2600 was able to reasonably recreate games such as Breakout (a one-player Pong-style game) and Asteroids, though its version of Pac-Man – arguably the most successful arcade game of all time – was widely panned (Montfort and Bogost, 2009). Nonetheless, the 2600, along with later competitors such as Intellivision and ColecoVision, would fuel the home video gaming boom of the early 1980s (Wolf, 2008). The 1983 crash was partly a result of the rise of the home computer; the major home computer manufacturers targeted the console market directly in their advertising, arguing that computers, unlike consoles, were educational and productivity tools that could also play games (Oxford, 2011). One major consequence of this strategy was the emergence of arcade-style games on home PCs – despite the advertised added features of home computers, people still expected them to play games. The PC manufacturers themselves released their own games, while others branched out onto other platforms: in 1983, Atari launched its Atarisoft label for games published for Apple, Commodore, IBM, BBC Micro, and other computers (while also publishing games for its own PCs). These were largely converted from pre-existing Atari arcade and 2600 games, bridging the gap between arcade/console and computer gaming (Mace, 1984). Third-party arcade/console game developers such as Activision quickly followed suit, while emerging PC-only companies such as MicroProse and Electronic Arts offered their own arcade action-style games.

25 The Odyssey was designed almost entirely by one individual, Ralph Baer, an electronics engineer who had no prior experience in gaming. After Pong's success, Magnavox sued Atari and settled out of court. Magnavox would continue to sue other console manufacturers for several years thereafter, in one of the more litigious chapters of gaming history.

Such events led to the emergence of arcade-style type-in games that challenged the text-based ethos that was still dominant in home hobbyist culture. The launch of COMPUTE! Magazine in 1979 did much to accelerate this trend. COMPUTE may be the most well-known of the old type-in program periodicals, particularly as it outlasted most of its competitors – staying in circulation until 1994, albeit without the type-in programs, which were dropped in 1988 – and was complemented in its heyday with several specialty spinoffs that focused on specific platforms such as the Commodore 64, the Commodore Amiga, and IBM PC compatibles. Its writers and editors were often key figures in the hobbyist movement, including Fred D'Ignazio and science-fiction writer Orson Scott Card. The typical issue of COMPUTE from the early to mid-1980s would contain a wealth of games, educational programs, and utilities. But it was the games that were clearly the big draw, as COMPUTE's publishers released several companion books that contained nothing but games, with titles such as COMPUTE!'s First Book of VIC Games and COMPUTE!'s First Book of Commodore 64 Games (both published in 1983; a second volume would follow for each, plus a third for the 64). The main magazine's covers, moreover, were typically designed around the games inside, and games were generally the longest, most complex programs featured.

Many of these games, moreover, made use of the basic elements of the typical arcade or console games – that is, graphics dominated, as opposed to text, gameplay was continuous (i.e. the game did not wait for user input at each turn), and controllers such as the joystick were employed. The odd strategy game would also be published in COMPUTE's pages, but these were exceptions to the general rule. As evidence of this, many of these games employed what COMPUTE referred to as "machine language", which could issue instructions directly to the computer's CPU, without first having to be translated from a high-level language such as BASIC.

This machine language took the form of pairs of hexadecimal digits, with several pairs occupying each line of code, separated by spaces. Compared to BASIC, it was indecipherable to all but the most expert users, and extremely difficult to code with. Yet machine language could be used to draw and animate graphics at a level of sophistication that was not possible with straight BASIC, and it executed much more quickly, leading to faster-paced games. As an example, here is a description of the machine language routines used in the game Rats!, taken from COMPUTE's First Book of Commodore 64 Games:

There are five machine language routines in "Rats!" LINE, as its name implies, draws a line…PLOT sets the hi-res cursor to the position from which the next line is to be drawn, and plots that point on the screen. The COLOR routine fills the screen with color.

INIT removes everything that is not a letter or number from the screen…and sets all the variables used by the other routines (locations 826-837) to zero.

SCR either loads or saves something to or from the screen. This routine is used to save the screen to memory after the top view of the maze has been displayed the first time, and from then on is used to display the maze almost instantly, so you have to wait only once (Commodore 64 Games, 1983, p. 29).

This last function is particularly relevant, as it is used to display graphics "almost instantly" when invoked a certain way. Such speed is generally only required for games in which the game state changes quickly, and does not wait for the user to respond to an INPUT command, or something similar.

Machine code was heavily employed in COMPUTE games, particularly when more sophisticated systems such as the Commodore 64 emerged in the early 1980s. In COMPUTE's First Book of Commodore 64 Games, an entire chapter was devoted to games written purely in machine language. COMPUTE subsequently released two books for the Commodore 64 that contained nothing but machine language games (Machine Language Games, 1986; and More Machine Language Games, 1987). They even developed their own machine language editor called MLX, which could save programs in progress and provide checksum values to ensure that each line was entered correctly (Commodore 64 Games, 1983, pp. 137-143). Virtually all of these games were of the action/arcade variety; as the editor of one of the machine language volumes put it, "[f]or arcade-game speed on the Commodore 64, nothing beats machine language" (Keizer, 1987, p. 3).
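The overall shape of a machine language type-in – and of the checksumming that MLX automated on a line-by-line basis – can be suggested with a short BASIC loader. The example below is not taken from MLX or from any COMPUTE listing; it simply READs a nine-byte routine from DATA statements into free memory, verifies a single checksum over the whole block (a cruder scheme than MLX's per-line checks), and executes it. The routine itself just sets the Commodore 64's border and background colours to black and returns.

10 REM HYPOTHETICAL MACHINE LANGUAGE LOADER WITH A SIMPLE CHECKSUM
20 LET CK=0
30 FOR I=0 TO 8
40 READ B: POKE 49152+I,B: REM 49152 ($C000) IS FREE RAM ON THE 64
50 LET CK=CK+B
60 NEXT I
70 IF CK<>1028 THEN PRINT "DATA ERROR -- CHECK YOUR TYPING": END
80 SYS 49152: REM JUMP TO THE MACHINE CODE ROUTINE
90 DATA 169,0,141,32,208,141,33,208,96

For the reader typing in pages of such DATA statements, the checksum is the only defence against a single mistyped digit, which is precisely the tedium that made ever-longer machine language listings such an awkward fit for the printed page.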

What COMPUTE was able to do, then, was to carve out a space in hobbyist programming culture for arcade and console-style games. At the same time, however, it was planting the seeds of its own downfall. By aligning itself so closely with action games, it became enveloped in an industry in which the power and performance of both hardware and software was paramount; arcade and action games had to move fast, yet remain visually appealing, which typically called for complex graphical routines. By the late 1980s, it was readily apparent that such games were growing too complex for their source code to fit easily inside books and magazines, and it was not realistic to expect users to type in exceedingly large programs from such sources. COMPUTE would only rarely branch out into other game genres that relied less on graphics and speed. Perhaps inevitably, then, the magazine dropped type-in programs entirely in 1988 to focus more on commercial software offerings. Its editor at the time justified the decision to phase out printed code as follows:

As computers and software have grown more powerful, we've realized it's not possible to offer top quality type-in programs for all machines. And we also realize that you're less inclined to type in those programs. You're more interested in hands-on features, dependable and forthright product reviews, and insightful columns. We've changed because we saw you changing. It's that simple (Keizer, 1988, p. 4).

Implied in these remarks is the belief that rigid lines may be drawn between programmers and users. Games and other types of "software" are reified as consumer products, so that code itself becomes something of a commodity. Home computer users are thus assumed to be "less inclined" to program, as programming itself has been cast as a tool for professional manufacturers in the same way that the moving assembly line is a tool used to create many physical goods.

4.4 The Limitations of BASIC and the End of the Type-In Era

As already noted, Ahl's last programming book, BASIC Computer Adventures, was published in 1986. Over the course of the 1980s code was becoming more complex even for games that were entirely text-based. Ahl acknowledged this reality when he published his work Big Computer Games, as discussed above. While the code for Hammurabi (from BASIC Computer Games) could fit on one page, code for a resource management game like Lost & Forgotten Island could take up seven pages (Ahl, 1978, p. 79; and Ahl, 1984, pp. 26-32). Text adventure games could be even larger, given that each room required its own description, as did each potential user action. The increasing popularity of maps, menu systems, and elaborate game worlds all required more code. RPGs, of course, were the most complex of all, even without their graphics routines. Gamers were becoming more sophisticated, and required games that reflected their evolving tastes.

BASIC – and, more specifically, Microsoft BASIC – served as an ideal platform for the sort of design world-related programming that emerged in the earliest years of the PC era. The BASIC language is simple and reasonably straightforward, avoiding many of the syntactical complexities of C and other languages of the time. The Microsoft BASIC interface, moreover, is simple but powerful, allowing the user to execute statements immediately or build them up into a program. By the mid-1980s, however, the system was starting to show its age in a number of ways. As already noted, type-in programs at that time were starting to become excessively long, while the increasing availability of pre-made software on disks and cassettes lured users away from them. Moreover, the different capabilities of various PCs with respect to graphics created compatibility issues. These problems point to a larger issue, however: the extreme structural limitations of the BASIC language, as well as the limitations of the Microsoft BASIC environment. Such limitations reflected and emphasized the individualist ethos that, as has been noted several times already, supported the emergence and development of digital computing artefacts and practices that reflected, and continue to reflect, a wider neoliberal context.

While BASIC's simplicity is in many ways an asset, there are certain structural elements that are missing that could potentially have made it easier to assemble larger programs. The most glaring omission is support for well-structured subroutines, also known as functions or procedures. These constructs allow programmers to encapsulate lines of code that are frequently used so that they may be called upon at any point in the main program. Such subroutines may then be grouped together into libraries, which may be transported from program to program. BASIC does offer rudimentary subroutine functionality with the GOSUB command, but it is a relatively weak construct that does not support basic operations such as parameter passing, and it still requires the use of line numbers, making it extremely difficult to build libraries. BASIC also does not allow a program to link to other programs in any way, which is another major obstacle in terms of developing libraries. Many modern languages also provide support for object-oriented programming. In this context, objects are user-defined frameworks, made up generally of variables and subroutines (the latter usually called methods). Once defined, an object may be instantiated – that is, multiple copies may be produced within the same program, as long as each has a different label. Each instance will inherit the framework outlined in the object definition, and will thus contain its own values for each variable, and may be referenced as such. These constructs make it much easier for developers to piece together large programs by breaking them up into smaller, encapsulated routines. While BASIC is easier to learn because these constructs do not exist, their absence also means that it remains something of a "toy" language, not suitable for serious development (with a partial exception that will be discussed momentarily).
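The practical cost of GOSUB's limitations is easy to see in even a trivial case. In the hypothetical fragment below, the "subroutine" at line 1000 computes the area of a rectangle, but it has no parameters and no local variables: every caller must load the agreed-upon globals (W and H) before the call and read the result out of another global (AR) afterwards, and the routine is forever welded to line 1000. Lifting it into another program means renumbering it and auditing every variable name for collisions, which is why libraries of reusable BASIC subroutines never really materialized.

10 REM CALLERS MUST LOAD THE GLOBALS BEFORE EACH CALL
20 LET W=4: LET H=5: GOSUB 1000
30 PRINT "FIRST AREA IS";AR
40 LET W=7: LET H=2: GOSUB 1000
50 PRINT "SECOND AREA IS";AR
60 END
1000 REM "SUBROUTINE": NO PARAMETERS, NO LOCALS, A FIXED LINE NUMBER
1010 LET AR=W*H
1020 RETURN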

Another major limitation is related to the Microsoft BASIC system itself, but for reasons that are somewhat nuanced. As discussed above, Microsoft BASIC provides an environment that is conducive to improvisational design world development. But it was still a highly simplistic environment. As will be discussed in detail in the following chapter, the 1970s and 1980s witnessed the emergence of sophisticated interactive programming environments in which users could build up entire new systems of interlocking programs, instead of just working on one program at a time. These environments were also often "live", in the sense that code could be running in the background while the user was developing a new program in the same space. While Microsoft BASIC could be classified as an interactive programming environment, it only offered a fraction of the features found in more sophisticated environments such as Smalltalk and Interlisp. Such design choices allowed it to be more accessible, of course, but they also limited its potential in terms of more sophisticated program development. As will be discussed shortly, a more complex programming environment has the potential to make programming an immersive experience, allowing the programmer to develop a constellation of related programs and objects.

4.5 From BASIC to HTML5

As more powerful home computers such as the Commodore Amiga, Apple Macintosh, and IBM AT were launched, users gradually migrated to more sophisticated interactive environments. Operating systems such as Mac OS were developed to support graphical user interfaces, multitasking, advanced memory management, and other complex tasks and features. Software for such systems was built in more advanced languages such as C++, as well as in assembler. Various forms of BASIC were still popular in this era, but the BASIC-based hobbyist "market" – that is, the assemblages of books, magazines, and other media that supported home programming, particularly game programming – was rapidly dissipating by the early to mid-1990s.

That is not to say that hobbyist programming completely disappeared. The Internet – developed at research institutions, and later expanded to serve an emerging consumer marketplace – continues to serve as an effective platform for current hobbyists to interact with other like-minded users. There are a number of personal and commercial websites where the hobbyist games from the BASIC era are turned into interactive web applications.26 Some of these even run on BASIC code executing within an emulator of a vintage personal computer and/or operating system. The most ambitious efforts implement novel graphics and sound effects. From an antiquarian or nostalgic perspective, such games are exceedingly intriguing. Given the critical model developed in the present study, however, they are of limited importance. The main problem is that they do not have the influence on programming practices that they did in the BASIC era. Whereas early BASIC games served as an important influence on the development of commercial games, most online hobbyist games simply attempt to recapitulate the past. No new computer system or technology, moreover, is built entirely with hobbyist programmers in mind. Machines like the Commodore 64 (and VIC-20) and Apple II were made for amateur PC enthusiasts. Modern computers, however, are designed to support rich applications such as Microsoft Word and Excel, and feature GUI interfaces that do not immediately invite customized coding. The development tools that are available, moreover, are in some cases even more limiting than Microsoft BASIC, as I will argue in the next chapter.

However, another related area in which hobbyist programming is still quite active is "Web" programming – that is, the development of applications using the diverse set of languages that power the World Wide Web. The term "languages" is used loosely here, as HTML, the foundation of every website, is not, strictly speaking, a programming language. Rather, it is a markup language, and is used to format and organize text and related media. It is generally supported by Cascading Style Sheets (CSS) scripts, which provide more granular, as well as more sophisticated, formatting options. Javascript, a full-fledged programming language, is also often used to support user interactions. PHP is another important language, and is generally used for server-side processing. And Javascript libraries such as jQuery expand on all of this functionality.

26 For example, versions of Hammurabi may be found at http://www.hammurabigame.com/hammurabi-game.php, as well as http://www.apollowebworks.com/russell/samples/hamurabi.html.

A major impetus to hobbyist web programming came with the development of HTML5 over the past several years. HTML5 is the fifth version of the HTML markup language, and has been designed in part to support complex, interactive web applications. Audio and video files may be embedded into pages with relative ease. Crucially, HTML5 also features the "canvas" element, a designated space upon which graphics may be drawn and animated. HTML5 is supported by an expanded Javascript language, as well as the highly powerful CSS3 version of CSS. These technologies have encouraged a flourishing of HTML5 games, which in many ways have supplanted games made with Flash, Adobe's platform for the creation of interactive web applications and games.27 Such games are even beginning to migrate off of the web via the node.js runtime environment, which provides Javascript support without the need for a web browser.

Web programming is thus an area that offers substantial flexibility in terms of tools and approaches. Much of this vast potential, moreover, has yet to be explored. Web pages are still generally designed to keep most users away from any code. Playing an HTML5 game is no different from playing commercial games in this respect.28 Most website development practices thus continue to put up barriers between programmers and users, perpetuating many of the issues discussed here and in earlier chapters. It appears, however, that this is starting to change, and there are in fact a growing number of sites that offer rich, code-based interactive experiences.

27 There is much to discuss here with respect to the "rivalry" between HTML5 and Flash, but it mostly falls outside the focus of the present study. It is sufficient here to know that both technologies are used by hobbyists, though Flash's popularity has plummeted since HTML5 arrived.
28 Virtually all browsers do allow users to view the source code behind each web page, though this has to be specifically requested. Even when source code is displayed, moreover, it is generally not modifiable.

Many of these take the form of live coding "playgrounds" in which users may enter and execute code within their browsers. JSFiddle is probably the most well-known of these sites.29 As will be seen in chapter six, such affordances may be leveraged to create an entire programming environment that enables users to create and manage their own code. Before that, however, the basic framework for such an environment needs to be developed. This will be the primary topic of the next chapter.

While an immense variety of games and applications were introduced here and in earlier chapters, it is important to recognize the remarkable consistencies in the approaches taken to designing and developing these programs.30 Whether a given program was simulating in detail the activities of a specific type of business, or was running a simplified model of the material conditions of an ancient empire, or was parsing player commands from within an imaginary cave system, it relied on discrete exchanges of data with the user/player in order to move forward. Programming systems such as JOSS and BASIC facilitated this transactional mode of computing by continuously interacting with the user in real time, even when the user was developing a program. All of this is to say that the operational model paradigm developed at RAND, and reified via the JOSS language, was incredibly influential with respect to user-computer interactivity for several decades. It was not until the advent of GUI operating and programming systems that the transactional model of computing evolved into something new.

These languages, and, arguably, most traditional programming languages, share a common trait: they impose specific, and largely inflexible, worldviews on the programs written in them. As noted by Kitchin and Dodge, Fuller makes a similar point when he speaks of "digital subjectivity", arguing that "software constructs sensoriums," in the sense that "each piece of software constructs ways of seeing, knowing, and doing in the world that at once contain a model of that part of the world" (Fuller, 2003, p. 19; as cited in Kitchin and Dodge, 2011, p. 27). Kitchin and Dodge themselves build upon these arguments by developing the concept of "capta," which they define as "units that have been selected and harvested from the sum of all potential data," while arguing that "the phenomenal growth in software creation and use stems from its emergent and executable properties, that is, how it codifies the world into rules, routines, algorithms, and captabases…and then uses these to do work in the world" (Kitchin and Dodge, 2011, p. 5).31 These ideas can be extended to programming languages themselves. All of the constructs named by Kitchin and Dodge – "rules, routines, algorithms, and captabases" – are represented by and through specific languages. None of these entities are "natural" to the digital computer. Rather, they were constructs developed within programming languages by computer scientists and engineers in order to facilitate specific forms of computation.

29 https://jsfiddle.net
30 With the exception of the arcade and console games that were discussed.

Despite its utility as a simple, common language for hobbyist programmers, then, Microsoft BASIC was ultimately unable to extend its own "sensorium" far beyond text-based games, at least not without supplementary machine code. The JOSS/BASIC-style interactive programming environment was out-of-date with respect to the needs of programmers beyond, arguably, the mid-1980s. The movements from teletype to screen, from text-based to graphics-based programs, from command-line prompts to GUI systems, from operational gaming to recreational gaming, and from single-use programs to game engines and libraries of subroutines, could not be effectively accommodated. The next chapter will look at programming systems that did in fact respond to such trends, and discuss the advantages and disadvantages of each.

31 The authors contrast their notion of capta with that of data, which they argue is overly broad; as they argue, "where data are the total sum of facts in relation to an entity; in other words, with respect to a person, data is everything that it is possible to know about that person, capta is what is selectively captured through measurement" (Kitchin and Dodge, 2011, p. 5).

5 Building Blocks: An Analysis of Interactive Programming Systems

As the last three chapters have indicated, hobbyist programmers inherited a computational paradigm that had been in development even before the first digital computers were built. This paradigm relied heavily on a form of numerical analysis in which small, largely arithmetical operations, performed unreflectively, could be organized and manipulated to solve larger problems. This type of rote, procedural work was what digital computers were first designed to perform. The advent of time-sharing networks allowed users to interact more directly with their machines, and opened up new opportunities for intervention. Rather than developing and executing lengthy programs one at a time, time-sharing allowed for simpler commands that enabled users to customize their systems more carefully. The earliest PCs borrowed this style of interactivity, but at a more localized level. Microsoft BASIC was in many ways a culmination of these efforts, at least with respect to the consumer market. But, as was discussed in the previous chapter, the level of agency it afforded to users was constrained in ways that rendered it increasingly obsolete roughly a decade after its launch. However, at the same time that Microsoft BASIC was spreading out across PC platforms, research teams at places such as BBN and Xerox PARC were developing new types of systems that would give users even more control over their machines by adding a small but significant layer of functionality. Such systems employed cutting-edge technologies such as graphical user interfaces (GUIs) and mouse control over the screen. Perhaps more importantly, they also incorporated programming languages and programming systems that were radically different from what came before. Rather than treating programs as discrete, enclosed entities, they instead offered sophisticated "ecosystems" that accommodated fragments of code and data within the same space. It is these systems that are the subject of this chapter.

As noted above, the programming and operating systems that have been studied thus far operate solely via command-line interfaces – that is, they are purely text-based, solicit user commands via continuous prompts, and process such commands one at a time (one per user; on a time-sharing network, multiple users will be served this way). But it is rare to see such an interface on a modern computing device, particularly in the home consumer market. Command prompts are seemingly reserved for network administrators and dedicated users. So is the research that has been presented here with respect to interactive computing no longer relevant? On the contrary; I would argue, in fact, that the scarcity of command-line systems has resulted in the virtual disappearance of knowledge with respect to certain forms of interactive computing that may still prove useful.32 The progressive ethos in computer engineering, exhibited most strongly in the consumer marketplace and exacerbated by misunderstood concepts such as "Moore's Law", would have us believe in an arch-evolutionary form of change in the industry, so that new machines are automatically superior to those that came before, and trends represent evolution(s) away from defunct ideas and towards the more effective and/or more efficient.33

This chapter will not argue against GUI-based systems, however, but will rather look at one particular type of GUI system that emerged in the 1970s, but that has been largely lost: the interactive programming system (IPS). The IPS, from a contemporary perspective, might be viewed as something of a hybrid between a GUI operating system like Microsoft Windows and a modern programming environment such as Microsoft Visual Basic or Apple's Xcode34, or it might be described as a programming language that lets you build an operating system. Such impressions are not necessarily inaccurate, but they are misleading in the sense that those who developed the original IPSs did not model them after these other types of systems. In fact, both the GUI operating system and the GUI programming environment emerged more or less out of the IPS movement. That is to say, before there was Windows and Xcode, there were IPSs such as Smalltalk and Interlisp. The organizations where such IPSs were designed and developed – Xerox PARC for Smalltalk, Bolt, Beranek and Newman and PARC for Interlisp – were recognized leaders in research on advanced computing systems. Yet their close connections to the private sector meant that their innovations were quickly commercialized, and either thrived or failed on the open market. With respect to IPSs, their relative complexity meant that they only served niche interests, and this is partly why their constituent components – windows and other onscreen graphical elements, as well as mouse-driven interfaces – are more familiar to the typical user than the systems themselves.

32 This has also been argued elsewhere; see, for example, Stephenson, 1999.
33 With respect to Moore's Law, former Sun Microsystems chief technology officer Greg Papadopoulos noted the following in a 2005 blog post: "It is a prediction about the doubling of the number of transistors on an integrated circuit, about every 24 months. It isn't a prediction about the speed of computers. It isn't a prediction about their architecture. It isn't a prediction about their size. It isn't a prediction about their cost. It is a prediction about the number of transistors on a chip. Full stop. That's it" (Papadopoulos, 2005).
34 Briefly, modern programming environments typically blend tools that allow for coding with graphical tools that allow developers to specify the look and feel of their programs. The code can directly reference components developed with the graphical tools, changing their appearance and behaviour. It can also respond to "events" that are triggered when users interact with such components in the finished program. This is the basic framework for virtually all programs, including games, in modern operating system environments.

In this chapter, however, I will argue that interactive programming systems could and should be revived, with some caveats, in order to address the concerns outlined in the previous two chapters with respect to the limitations of modern programming languages and development environments. Rather than imposing multitudes of modes with commands such as Demand and INPUT, and sheltering code when programs are executed, IPSs are extremely open environments that allow all users total access to code in memory, and do away with modes almost entirely. Scientists in the field such as Warren Teitelman and Alan Kay drew inspiration from existing languages such as Lisp and Simula, and expanded upon the programming philosophies they deemed to be inherent in them. Such work resulted in languages that behaved as programming "universes", to use Kay's term (see below), in which users developed ecosystems of symbiotic components, not self-contained, self-propelled applications. At roughly the same time, academic sources emerged that listed off the requirements for the prototypical IPS, including such qualities as code that could be treated like data, and the ability to "bootstrap" – that is, to build on and expand an IPS using the same programming language that it supports in its development environment. Early IPS-like languages such as Teitelman's Interlisp pointed the way forward, but it was arguably Kay's Smalltalk – released in its first full form in 1980, though it had been under development for several years prior – that most fully fulfilled the IPS vision.

I will not, however, simply call for the revival of a language such as Smalltalk; rather, I will argue that the IPS paradigm must be modified so that the languages that IPSs incorporate are more easily accessible for novice users. Smalltalk itself is a very powerful system, but it is also complicated to a degree that non-IPS languages such as Visual Basic are not. Interlisp, moreover, while offering its own powerful features, was based on the Lisp language. While an extremely capable language, Lisp is also highly specialized, and Interlisp did not survive far past its initial release. I will therefore discuss ways in which IPSs could be made more accessible by drawing on IPS theory itself, and reinterpreting certain aspects of it. Specifically, I will argue that the code-as-data paradigm requires that code be simple enough that it can be effectively managed as data, and that IPS developers should design their systems in such a way as to direct users towards efficient and effective tasks. While such an approach may be slightly different from that advocated by the original proponents of IPSs, I believe that it can lead to more intuitive, and therefore more powerful, systems.

The research presented in this chapter is thus a form of response to concerns raised over Microsoft BASIC and other earlier programming environments, as outlined in previous chapters. Recall McCarty's argument that modeling should be considered as "a form of craftsmanship set into the context of scholarship" (McCarty, 2005, p. 22). For the purposes of this chapter, this means that the IPS model crafted in the 1970s and 1980s may serve as a form of argument in a scholarly context. Specifically, I will argue that it inherits the benefits of simpler programming environments such as Microsoft BASIC, but that it also builds on the functionality of such environments via several important new features, to be discussed in more detail below. This argument will provide a theoretical foundation for the work of developing a new form of IPS, to be discussed in chapter six. As noted earlier, Mahoney states that "[t]he future of digital scholarship depends on whether we can now design computational models of the aspects of the world that most interest us" (Mahoney, 2005, p. 33). This chapter, and the one that follows, put such ideas into practice.

5.1 Programming Systems vs. Programming Languages

Comparing modern computing practices to those of the hobbyist era is, admittedly, somewhat misleading. Though they differ in a number of ways, many of these differences have a shared root cause: while hobbyist machines booted users automatically into a programming environment – typically Microsoft BASIC – modern operating systems put the user at a more abstract "system" level from which they can browse their files and launch applications. In order to program in a given language, the development software for that language – for example, the Visual Studio IDE – has to be loaded into the system by the user, typically by clicking on an icon located in a menu or right on the "desktop" screen. Only then may coding begin.35 The differences between these two approaches may seem minor; for most modern users, calling up an application such as an IDE is a relatively trivial task. But there is more at stake here than simple convenience. Microsoft BASIC was not simply an application embedded within a larger operating system. Rather, it was the operating system, at least as far as the typical user was concerned. Files were loaded directly into Microsoft BASIC and accessed via its command-line prompt. External peripheral devices such as disk drives and printers were also accessed from the BASIC command line. And, of course, all program development took place within the BASIC environment, including machine code programming.

35 There are certain variants to this approach that are worth noting. Some users prefer to write their code in a text editor, as opposed to a full-fledged IDE. This is particularly true for Web programming (i.e. HTML, CSS, and Javascript), since such languages are never compiled. Certain compilers, moreover, may be accessed only via the command line. This is a holdover from the command-line OS era, and is encountered more frequently in UNIX and Linux-based systems.

Because it served as an OS, Microsoft BASIC was structured much differently from the modern IDE. Not only did MS BASIC allow users to build and execute programs; it also allowed them to define the global environment within which such programs were run. As an example, after loading up Microsoft BASIC on a Commodore VIC-20, I might type in the following:

LET X=10

Since there is no line number attached to this statement, it would execute immediately. A variable labelled 'X' would be created, and would be assigned the number 10. This might seem rather trivial, but something important is happening here: a global variable is being added to the MS BASIC environment. I may then create a one-line program:

10 PRINT X

On a Commodore machine, entering a program line (or typing RUN) clears any variables already in memory, so to run this example I would type LET X=10 once more and then execute the program with GOTO 10, which leaves variables intact; the computer would then print out the number 10. Again, this may seem like a simple enough operation, but here is what is interesting: most modern IDEs, including Visual Studio, Xcode, and Eclipse, do not allow for these sorts of operations to take place. This is because Microsoft BASIC is not only a language, but also a programming environment, as well as an operating system. This system runs the MS BASIC command-line interface, allowing for the development and execution of programs, but it also exists independently from this interface, and can be accessed and queried by the user. In this example, the variable 'X' that I created becomes an element within the larger system; it therefore also exists independently of any specific BASIC program, but can be accessed and altered by such programs.
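To make the persistence concrete, the same hypothetical session can continue in direct mode after the program has finished. Each of the lines below is typed without a line number, so it executes immediately against the very same environment the program just used; the REM comments are only annotations for the reader.

PRINT X: REM INSPECT THE GLOBAL; IT IS STILL 10
LET X=X*2: REM ALTER IT FROM THE COMMAND LINE; X IS NOW 20
GOTO 10: REM RE-RUN THE STORED PROGRAM WITHOUT CLEARING VARIABLES; IT PRINTS 20

An exchange of this kind is exactly what the IDEs named above do not provide, since they maintain no persistent, system-level store of variables between runs.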

This is important because it allows the user not only to create BASIC programs, but also to configure the larger environment(s) within which such programs are developed and executed. This power was leveraged by game designers who needed to manage the memory that their programs were occupying. With a computer such as the VIC-20, which contained only 3.5 kilobytes of memory, this was a particularly pressing concern, but even the Commodore 64's 64 kilobytes could be used up by particularly complex games. One means of handling this issue was to develop multi-stage programs. In the 1984 Usborne book Write Your Own Fantasy Games for your Microcomputer, discussed in the previous chapter, the main game Dungeon of Doom is actually three different programs. The first, the "dungeon creator", allows the user to design a setting for the game. The second, the "character creator", allows them to create the statistics needed for the player-controlled character. The third, the "game module", takes the dungeon and character information and supplies the game's mechanics (Howarth and Evans, 1984). Both the dungeon and character creators create system-level variables in which to store all of their associated data. The game module then accesses this data and uses it to play the game. When each program is loaded, the previous one is erased, but the variables stored within the system are retained, thus creating the necessary environment within which the game module will run.
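A schematic version of this chaining technique is sketched below. The program names, variables, and prompts are invented, and they are far simpler than the Usborne listings, but the pattern is the one Howarth and Evans rely on: on a Commodore machine, a LOAD issued from inside a running program starts the newly loaded program from its first line while leaving existing variables in memory (provided the new program is no larger than the old one), so the second program can simply assume that the first has already populated the environment.

100 REM "CREATOR" PROGRAM (HYPOTHETICAL): SET UP THE SHARED VARIABLES
110 INPUT "HERO'S STRENGTH";SR
120 INPUT "HERO'S LUCK";LK
130 PRINT "LOADING THE GAME MODULE..."
140 LOAD "GAME",8: REM REPLACES THIS PROGRAM BUT KEEPS SR AND LK

100 REM "GAME" PROGRAM (HYPOTHETICAL): USES VARIABLES IT NEVER DEFINED
110 PRINT "YOUR STRENGTH IS";SR;"AND YOUR LUCK IS";LK
120 REM ...THE GAME'S MECHANICS WOULD CONTINUE FROM HERE...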

Modern IDEs are presented as "development environments", but these are not the same sorts of environments that Microsoft BASIC offers. Rather, these programs allow for the development of individual programs that only come to life when executed by the user. Variable assignments in Visual Basic or Objective-C do not spread beyond the boundaries of the programs within which they were made. A running program is given an allotment of memory by the OS when it starts up, and all activity takes place within that temporary space, even when external files are accessed. Once a program stops running, all of its internal data is removed from memory, and the larger operating system behaves essentially as if the running program had never existed. There are no leftover variables embedded within the system, ready to be used by another program. The most pressing issue, however, at least for the purposes of this study, is the way in which modern IDEs inherently treat the act of programming itself. Programs that are developed within such systems behave as brittle and temporary phenomena; the slightest mistake in the code can easily crash the entire edifice, potentially wiping away an entire day's work. They are also solitary creations, existing apart from one another even when running concurrently. This

has to happen in order for the modern consumer market for software to operate as it does. Software programs are self-contained products that are meant to run identically on every user's own computer. As such, they cannot rely on specific contextual configurations in order to operate. As Kay et al. noted, "[s]eparate applications are an old 60s idea which force a 'stovepipe' mode of authoring that is limited to what the application can do, and this makes new ideas by end-users difficult to fit in" (Kay et al., 2006, p. 5). Each program is essentially an island unto itself.36

This notion of discrete, solitary programs can make program development seem like a rather daunting task. If an aspiring programmer loaded up an IDE for the first time, they would be presented with what is essentially a blank page. It is therefore up to them to decide what sort of program they would like to make, what external libraries to bring in, how to arrange the overall project, how to divide up the program itself into logical components, and how potential users will be able to interact with the finished product. The very notion of a "finished product" is in fact built into the system. Programmers, hypothetically speaking, have to invent a new product, and then create this product via their language of choice. This is partly the reason why hobbyist programming is less common, or at least less visible, than it was in the 1970s and 1980s: modern software engineering practices demand that programmers design and create end products that address specific needs in the most efficient manner possible. The space for experimentation within such a process is exceedingly narrow.37

36 An important exception worth noting here is "frameworks", which are essentially collections of functions and routines that plug into software programs so that they behave correctly within a given operating system. Microsoft in particular relies heavily on frameworks, such as its .NET framework for applications and its DirectX framework for games. These frameworks are not shipped with any piece of software, but rather reside within the file system of a given OS, ready to be accessed. Despite all this, such frameworks do not give rise to the sorts of shared systems found in Microsoft BASIC and other programming environments such as Smalltalk, as will be discussed below. 37 It might be argued that the "open-source" software movement counters this trend, in that it allows users access to the code they need to customize their programs. The open-source movement is a vital and important aspect of modern computing, and theoretically does away with the notion of the finished product, but it still adheres closely to the concept of discrete, solitary programs, as well as to the notion of programs as products (the use of version numbers with most open-source projects demonstrates this). The programming systems I will be discussing here and in the next chapter enable the user to play the role of developer themselves, and to focus on the development of tools and components within environments that may house interconnected collections of programs.

But what if there existed a programming system with a different philosophy, one that did not focus on the development of products? How would such a system operate? Adele Goldberg and Joan Ross, in a 1981 article in BYTE magazine on the educational value of the Smalltalk-80 programming system (to be discussed in more detail below), provide important insights on a potential alternate vision of programming:

Contrary to the idea that a computer is exciting because the programmer can create something from seemingly nothing, our users were shown that a computer is exciting because it can be a vast storehouse of already existing ideas (models) that can be retrieved and modified for the user's personal needs. Programming could be viewed and enjoyed as an evolutionary rather than a revolutionary act. The frustration of long hours of writing linear streams of code and then hoping to see some aspect of that code execute was replaced by incremental development. Emphasis was placed on learning how to make effective use of existing system components (Goldberg and Ross, 1981, p. 354)

Such an approach deals directly with many of the issues that have been discussed up to this point. Instead of behaving as discrete solitudes, all programs would co-exist within the same environment; the very notion of individual programs, in fact, is completely gone. Instead, what is being developed is a "vast storehouse" of all the components within the programming environment. Programming would therefore consist of arranging these components into workable configurations, and adding new components if need be. As with Microsoft BASIC, the user interacts with an entire system, rather than simply a programming language IDE, the difference being – and this point is crucial – that Goldberg and Ross's system allows for entire programming modules to be stored within the system, rather than just global variables. This makes for a much more powerful overall programming system. It also makes programs much more robust, in that they are built on foundations consisting of familiar and (presumably) well-tested components embedded within the same environment, as opposed to external libraries that must be configured to meet the needs of individual projects. And, as will be discussed in more detail below, arguably the most important difference is that both users and developers can and do inhabit the same system; there is no black-boxing of code, so users are not limited in terms of how they interact with the programs they use. Rather than reducing users to suppliers of specific

variable values, this type of system would inherently support the examination and modification of all aspects of a given program.

5.2 Worlds of Lisp

Lisp (short for "List Processing", though the full form is rarely used) was one of the first programming languages to reside within a larger runtime system/environment that it could examine and alter. It is also very different from the languages that have been looked at up to this point. James Noyes, in his work on the use of Lisp in the field of artificial intelligence, summarizes the salient points as follows:

How can a language with a seemingly mundane purpose as list processing be the primary research language for something as advanced as artificial intelligence? The answer is that a list is an extremely general data type and can consist not only of symbols (such as letters and words) but of other lists as well – without limit (Noyes, 1992, p. 24).

Lisp was first conceived of in the late 1950s by John McCarthy while working at MIT; McCarthy would go on to Stanford University in 1962, founding the Stanford Artificial Intelligence Laboratory (SAIL). McCarthy's research in AI, which began at MIT, led him to first develop the hypothetical "Advice Taker" program. The goal of this project was to find a way of "representing information about the world by sentences in a suitable formal language," and then develop a "reasoning program that would decide what to do by drawing logical consequences" (McCarthy, 1978, p. 217). Partly inspired by the RAND Corporation's Information Processing Language, which allowed for operations on lists of values and other symbolic expressions, McCarthy went on to develop an evolving prototype of the Lisp language, but Steve Russell – a student of McCarthy's, who would go on to become the primary programmer responsible for Spacewar, as discussed in a previous chapter – created the first fully-realized implementation of the language in 1958 (McCarthy, 1978; and Graham, 2004; see also Newell and Tonge, 1960). Version 1.5 of the language was particularly popular, and was implemented across a wide variety of computing platforms. There are now several modern versions of the language, each somewhat different from the others, with Scheme, Common Lisp, and Clojure being perhaps the most well-known.

Before these implementations were developed, however, there was Interlisp; designed and developed at both BBN and Xerox PARC, largely by Warren Teitelman, Interlisp is arguably one of the major reasons that the Lisp language has become as important as it has. That it does not receive as much credit as perhaps it should may be because it was initially largely, though not entirely, surpassed in popularity by Maclisp, a Lisp variant developed by MIT's Project MAC. Steele, Jr. and Gabriel, who wrote a detailed account of the evolution of the language following the 1.5 version's release, make it quite clear that these were the two dominant forms of Lisp until roughly the 1980s. The major difference was that Interlisp contained a global environment, as the authors note:

The primary differences came from different philosophical approaches to the problem of programming…MacLisp users…were willing to use a less integrated programming environment in exchange for a good optimizing compiler…Interlisp users preferred to concentrate on the task of coding by using a full, integrated development environment (Steele, Jr., and Gabriel, 1993, p. 19).

Teitelman, according to a paper he wrote on the history of Interlisp, had been interested in developing tools to allow for more intuitive development of programs since his graduate school years. Initially aiming to create a program that could dynamically learn optimal strategies for generic game types, he noted the following in an important passage:

At first, I wanted to develop a general game playing program, one that could be given the rules for a new, simple game, and devise a strategy…I quickly realized that I was going to be spending a significant amount of effort changing my program as I evaluated its behavior and identified shortcomings. I would not be able to work out a design and then code and debug it (Teitelman, 2008, p. 1).

What Teitelman expresses here is one of the major obstacles posed by traditional programming. In a punched-card system, or even in a batch-based command-line system such as the modern GNU Compiler Collection (GCC), programs are essentially expected to be prepared, designed, coded, and tested in their entirety – that is, a program is meant to be developed in a single development phase. In reality, of course, this rarely happens – when using a compiler such as GCC, a

developer will generally build up their code in multiple phases, testing as they go. But such compilers are not fundamentally designed to accommodate such a development process.38 What Teitelman was looking for was a method to develop a program in something of an organic fashion, building it up and testing it at the same time, with the results of these tests informing the overall direction of the project. It is this philosophy that will be applied in the next chapter not only to game development, but to game play itself.

Getting back to the present discussion, Teitelman's thinking was heavily influenced by the emergence of time-sharing networks, yet rather than taking up FOCAL or one of the other time-sharing languages of the day, he set out to make Lisp – a language that was popular among AI researchers, as already noted – into a time-sharing language by developing a programming environment for it. He began this work at BBN and, in 1972, moved to Xerox PARC, and with the assistance of other researchers at both institutions went on to develop Interlisp, so called because of the funding received for the project from ARPA. Yet rather than simply being another JOSS-like language with a rather simple global environment that allowed for the storage of variables, the nature of Lisp was such that Interlisp introduced something far more advanced: the ability to store code in the environment, not just variables. This was a pivotal step towards the development of full interactive programming systems.

It is worth illustrating this process in more detail in order to better understand its importance.39 As with JOSS and BASIC, Interlisp operated via a command-line interface. Unlike those languages, however, commands were always executed immediately; there is no notion in Interlisp of a multi-line stored program. This sort of interface would become known as a read-evaluate-print loop (REPL); it is now also commonly referred to as a "console" interface. As already noted, the list is the primary element of computation in Lisp. Lists are typically enclosed either in parentheses or square brackets, depending on the implementation, and appear generally as follows:

38 Modern GUI development systems help to compensate for this, but they do so only partially, as will be discussed. 39 The examples used here are modeled after those found in Chassell, 2009; and Fong, n.d. All examples were tested with GNU CLisp v2.49.

(4 10 7)

(eggs milk cheese)

Lists may also contain functions and operators, meaning that their contents will be evaluated and the end result displayed to the user. Such functions may be simple arithmetic operations, as follows:

(+ 4 8)

(* 2 4 6)

The first statement will return the value 12, while the second will return 48. These are simple examples, but such statements can become quite elaborate, as lists can be nested, and more complex functions included.

For the purposes of the present discussion, however, there are three commands in particular that are crucial: setq, defun, and defmacro.40 The setq command allows the user to define a variable, just as in BASIC. The major difference is that, in Lisp, variables can store entire lists, as follows:

(setq x '(2 4 8 16))

This assigns the list (2 4 8 16) to the variable x. In Interlisp, if a programmer were to then simply type 'x' as a follow-up command, the list would be returned and displayed. As with the BASIC command "LET X=10," this variable is not related to any specific program – rather, it is stored in the larger system environment, and may be freely accessed and modified in subsequent statements. However, what makes Interlisp and later variants truly powerful is the fact that they can store entire processes, and not just variable values, within the system. The defun command allows the user to define functions – that is, sets of operations that are to be performed when called upon, with the user supplying needed variable values. This is best demonstrated with an example. The following statement creates a function called "double" that will multiply any supplied value by two:

40 Lisp employs many commands with names that are difficult to decipher, which is one of its major drawbacks.

(defun double (x) (* x 2))

If the user were to then type in the statement "(double 7)", they would receive the value "14" in return. Note what is going on here. The initial defun statement created a function called "double", which was then stored within the Lisp environment. At this point, every time the user includes the term "double" in subsequent statements, this function will be called upon. As with the variables discussed above, defun functions belong to the system environment, not any one specific program. The defmacro command is an advanced topic, and will not be discussed here; it is similar to the defun command, but is much more powerful, allowing the user to change the very syntax and semantics of Lisp itself. Again, macros are stored within the larger system, and may be called upon at any time.
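
To illustrate how such stored definitions accumulate into the kind of "storehouse" described by Goldberg and Ross, consider the following continuation of the example above. This is a minimal sketch in the style of the GNU CLisp examples used throughout this chapter; the names quadruple and start are my own illustrative additions, not part of any standard library.

(defun quadruple (x) (double (double x)))  ; calls the 'double' already stored in the environment
(quadruple 5)                              ; returns 20
(setq start 3)                             ; a new global variable joins the environment
(quadruple start)                          ; returns 12

Neither quadruple nor start belongs to any particular program; both simply join the growing environment, ready to be reused or redefined by whatever the user types next.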

As might be imagined, Interlisp environments can get rather complex and sophisticated; in Sandewall's highly-important article on interactive programming systems – to be discussed in more detail below – he develops an entire Lisp-based environment that "administers the calendars of meetings, appointments, etc. for a number of users" (Sandewall, 1978, p. 41). When such a complex system is developed, the user will presumably want to preserve the environment for future use, much in the same way that programs can be saved and later retrieved in other languages. A copy of an environment at a specific point in time is generally referred to as an "image". Storing and retrieving images is not a particularly easy task; MIT professor Joel Moses, using the term "free variables" to describe environmental elements, described the situation as follows:

When a function uses free variables or when it calls functions which use them, then the value of this function is not completely determined by its arguments, since the value might depend on the values of the free variables. Thus in order to completely specify a computation, one has to specify a function and its arguments in addition to an environment in which the values of the free variables can be determined (Moses, 1970, p. 2).

Such computation images can in fact be stored in many modern implementations of Lisp. Interestingly, however, there is not one uniform command to do so. In Clozure Common Lisp, the rather awkward CCL:SAVE-APPLICATION command will save a system image. In GNU CLisp, the command is EXT:SAVEINITMEM. On the Symbolics Lisp machines, the rather striking command Save World was employed. Here the term "world" is used to describe the system environment. This is a powerful metaphor which treats the collection of variables, functions, and macros that make up a Lisp environment as a sort of digital ecosystem, connoting notions of connectivity and interdependence among elements that mirror those of biological and/or social systems. It also suggests that the potential to create new combinations of old elements, and to develop and integrate new elements, is virtually limitless. Similar metaphors will again arise when discussing Alan Kay's work, below.
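
To make the idea of an image more concrete, here is a brief sketch of how a small environment might be preserved in GNU CLisp, the implementation used for the examples in this chapter. The file name hammurabi-world.mem is an arbitrary choice of mine.

(setq acres 1000)                        ; build up a small "world"...
(defun harvest (yield) (* acres yield))  ; ...containing both variables and functions
(ext:saveinitmem "hammurabi-world.mem")  ; write the entire current environment to disk

Launching CLisp later with the option -M hammurabi-world.mem should restore acres and harvest exactly as they were saved, allowing the world to be picked up where it was left off.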

Despite its innovative environment, or perhaps because of it, Interlisp did not circulate as widely as Maclisp. Performance was something of a problem, given its bulky nature. Maclisp was more nimble, and therefore more capable of evolution; Steele, Jr. and Gabriel note that the "Interlisp programming style is heavily influenced by the programming environment…therefore the Lisp portion of Interlisp remains very similar to early Lisps," whereas "[i]n dialects like MacLisp…the Lisp part of the language itself was advanced" (Steele, Jr., and Gabriel, 1993, p. 18). What changed things was an increasing recognition of the importance of the environment, as well as the emergence of more powerful machines capable of better realizing such environments. An interesting yet somewhat fleeting manifestation of such thinking was the emergence of the "Lisp machine" – that is, digital computers that used a Lisp environment as an operating system, similar to Microsoft BASIC – in the 1970s and 1980s. Such machines did not meet with great success, but they advertised quite effectively the potential of Lisp environments, given the proper hardware. Steele Jr. and Gabriel explain this influence as follows:

With the advent of the MIT Lisp Machines, with their greater speed and much greater address space, the crowd that had once advocated a small, powerful execution environment with separate programming tools embraced the strategy of writing programming tools in Lisp and turning the Lisp environment into a complete programming environment. (Steele, Jr., and Gabriel, 1993, p. 13).

At this point, the history of Lisp splinters into several different variations, as already noted. Yet all the major new versions of the language – Scheme, Common Lisp, Clojure, and so forth – operate within their own environments, as newer computers are able to handle the required overhead. What began as something of a fringe idea became, once the proper technology became available, an expected and powerful feature.

5.3 IPSs, in Theory

As influential as Interlisp turned out to be, the impetus for more generalized development of interactive programming systems would come from elsewhere. The major push for the IPS paradigm in fact came from within academia, specifically from the research of a handful of scholars who laid the foundation for much of the IPS work that was to take place in the late 1970s and early 1980s. The focus here will be on two doctoral dissertations, with related articles brought in when relevant. Both works proposed development projects that reflected IPS concepts and ideas, yet both were only partially realized. Regardless, these research projects helped to shape in large part what later researchers came to think of as interactive programming systems. While Interlisp provided much of the technical foundation for IPSs, then, these works provided the intellectual support needed to fully realize their potential.

The earlier of these two works was Alan Kay's doctoral thesis entitled The Reactive Engine. Kay does not use the term interactive programming system in his work, but what he describes would serve as the intellectual foundation for his later work, and the work of others, on IPSs. Kay states at the outset that his primary interest is "[t]he design of machines which can participate in an interactive dialogue," and in particular "the design of a system to represent the form (syntax) and the modeling (semantics) that is necessary to represent a universe" (Kay, 1969, pp. 1, 11). A "universe" in the sense that Kay is using it is essentially the same as Papert's notion of the microworld – that is, a self-contained system of rules and concepts that can be used to piece together meaningful, expressive elements and entities that "inhabit" this system.

As he describes it, a major influence on Kay's approach to programming was the Simula programming language, developed by computer scientists Ole-Johan Dahl and Kristen Nygaard at the University of Oslo. The Simula language was, according to its creators, "a language designed to facilitate formal description of the layout and rules of operation of systems with discrete events," and was therefore one of the earliest languages designed specifically for simulation programs (Dahl and Nygaard, 1966, p. 671). As Jean Sammet describes it, the language was a "true extension of ALGOL 60," an early programming language, and the "basic idea" behind it was "to add to ALGOL the concept of a collection of programs called processes conceptually operating in parallel" (Sammet, 1969, p. 657). Markoff effectively explains the impact that Simula had on Kay, and his account is worth citing at length:

Previously, Kay had not fully understood what Sutherland had been doing inside his Sketchpad program…but as he looked at the Simula listing lying on the floor he realized that the two programs shared a basic approach.41 The insight came to him…when he saw that both programs were attempting to create something that was akin to a biological cell mechanism in which simple building blocks are used to create complex systems…Traditionally, computer programs have been divided into data structures and procedures…Now he had stumbled across an entirely new way of looking at computation in which all the components are modular, mimicking the intrinsically cellular structure of living systems (Markoff, 2005, p. 143).

Kay himself explained his follow-up thinking as follows:

For the first time I thought of the whole as the entire computer and wondered why anyone would want to divide it up into weaker things called data structures and procedures. Why not divide it up into little computers, as time sharing was starting to? But not in dozens. Why not thousands of them, each simulating a useful structure? (Kay, 1993, p. 71).

In Kay's dissertation, he describes the design of a hypothetical computer called the "FLEX Machine" that is often given credit as being one of the first descriptions of a personal computer, and, indeed, the first time the term "personal computer" was even used (Johnstone, 2003). The FLEX language, an essential component of his machine, echoes many of the ideas Kay encountered with Sketchpad and Simula. As Maxwell notes in his dissertation on Kay's later work, the FLEX Machine was an attempt to "generalize the master and instance architecture of Ivan Sutherland’s computer graphics research from the early 1960s, and to elaborate a vision of what might be the successor to the dominant computing paradigm of the day" (Maxwell, 2007, p. 107; emphasis added). Kay also had the notion that such a language could become a powerful means of expression; as he notes at one point in his work, "FLEX is an idea debugger and, as such, it is hoped that it is also an idea media [sic]" (Kay, 1969, p. 75). Yet FLEX was also

intended to be a work in progress, and more of a space for hypothetical experimentation than a fixed language to be implemented on an actual machine. As Kay explains it, FLEX "is really the author's own private vehicle for investigation into these areas and is (and will be) a project not considered closed" (Kay, 1969, p. 74). As such, the fact that FLEX was never developed does not mean that Kay failed in his ambitions.

41 In a favourite anecdote, Kay recounts how he and a fellow graduate student parsed the source code for Simula by laying it out in a hallway in the building where they studied.

A year after Kay completed his dissertation, James Mitchell, a doctoral student at Carnegie-Mellon University, finished his thesis entitled The Design and Construction of Flexible and Efficient Interactive Programming Systems (Mitchell, 1970). This was not the first time that the term "interactive programming system" had been used – Harold Borko of System Development Corporation, to take one example, had used it in a 1966 paper to refer to what were essentially time-sharing interactive systems, such as a document retrieval system (Borko, 1966). But Mitchell's usage is the first in which he describes a programming system that meets the criteria set out by later computer scientists such as Sandewall. Swinehart, in his own later dissertation, calls Mitchell's thesis "a significant contribution to Interactive Programming Systems" in and of itself (Swinehart, 1974, p. 5). Mitchell eventually went on to become one of the head designers of the Mesa programming language, which, along with Smalltalk and Interlisp, was one of three IPS-like languages developed at Xerox PARC, and became the operating system language used by the Xerox Star personal computer; as such, it plays very much the same role that Lisp played in the various Lisp machines that were created. Yet it is worth first exploring Mitchell's graduate work because he so comprehensively expresses many of the ideas that would go into later IPS research and development.

Mitchell notes that there are two important factors that inform many of the arguments he makes with respect to IPSs in his work: flexibility and efficiency. Efficiency is rather self-explanatory, but his explanation of flexibility is worth citing in full: "[f]lexibility is the ease with which the user of an interactive programming system (IPS) can manipulate the objects of interest to him; this universe of discourse includes programs, variables, data and control, in their various representations" (Mitchell, 1970, p. 1-1A). He explains that "the notion that the program is a changeable, fluid object" is what motivates this design feature, and notes that Lisp is one of the few pre-existing languages in which this property holds true (Mitchell, 1970, p. 1C-7). His explanation as to why this is advantageous, however, is somewhat inscrutable; I will therefore

cite from Sandewall, who makes the benefits of programs being represented as structured data (that is, data embedded within data structures) clear in the following passage:

[T]here should be a predefined, system-wide internal representation of programs which reflects their structure in as pure a form as possible, for example as a tree structure. This structure should be a data structure in the programming language, so that user-written programs may inspect the structure and generate new programs (Sandewall, 1978, p. 37).

The ability of user-written programs to examine and generate code in this manner is known as reflection. More specifically, the ability to examine code is known as introspection, while the ability to generate new code is known as intercession; together, these two capabilities constitute reflection. The notion of representing code as data, while it did not have a name when Mitchell wrote his thesis, is now known as reification (Kind and Padget, 1999).
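
The Lisp examples from earlier in this chapter can be used to show what reflection looks like in practice. In the following sketch (my own, again in the GNU CLisp style used above), a program is held as an ordinary list – reification – and is then inspected and extended before being run:

(setq prog '(+ 2 3))             ; a program, stored as plain list data
(first prog)                     ; introspection: returns the symbol +
(setq bigger (list '* 10 prog))  ; intercession: build a new program around the old one
(eval bigger)                    ; returns 50

Because the program is nothing more than a data structure, other programs can read it, rewrite it, and evaluate the result – precisely the property Sandewall calls for in the passage above.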

Mitchell's dissertation, then, provides the firmest foundation thus encountered for the development of IPSs, and he goes on in fact to develop the prototype system LC2, as noted above. LC2, however, did not truly advance beyond the prototype stage. As will be discussed in the following section, however, Mesa, a language Mitchell co-designed, was integrated into a short-lived but important personal computer system.

5.4 IPSs, in Practice: Xerox Star, Mesa, and Smalltalk

It is impossible to discuss IPSs without referencing the outsized influence of Xerox PARC. Along with the development of Interlisp (see above), PARC was also the site where two other major IPS projects were undertaken: Mesa, which became the development language of the highly-influential Xerox Star personal computer, and Smalltalk, which became arguably the most successful IPS of its day, given that it is the only one that is still used today. No other organization came even close to this sort of IPS output. It could, in fact, be argued that PARC was the only research laboratory that was truly interested in IPS development; prominent computer scientists interested in the subject tended to stay within the organization for several years, including Kay, Mitchell, and Swinehart. Despite the breakthroughs achieved there, however, PARC's work, particularly on the personal computer front, was eclipsed by that of companies such as IBM, Commodore, and Apple. Yet Apple would incorporate many of the design features of the Xerox Star PC, as would Microsoft when it began to develop its Windows system. Yet despite such imitative work, most of the foundational elements of interactive programming systems did not survive, with the result being the decoupling of system and programming language, an issue touched upon earlier.

As noted in the literature review, the story of the Xerox Star has been told many times, and there is no need to recapitulate all of the details here. Suffice it to say that the Star, introduced in 1981, was the successor to the Alto system developed several years earlier, which itself was inspired in part by the work of Engelbart on the NLS. Among the Alto's innovations was the development of one of the first windowing systems, a legacy which strongly resonates today. Yet the Alto, like the NLS, was meant to be a holistic problem-solving tool, and many of the Altos that were made ended up in the hands of computer scientists and engineers. The Mesa programming language was a key element of the Star's infrastructure; its principal architects were James Mitchell, as noted above, as well as Butler Lampson, Edwin Satterthwaite, Charles Geschke, and Richard Sweet (Mitchell, Maybury, and Sweet, 1979). Mesa was based in part on ALGOL, the same language that Simula was based on, though it developed into its own, separate language. For the purposes of this study, however, the details of the language are not overly important – what is important is the fact that the Xerox Star system was built using Mesa. This means that, given the right tools, Star users could use Mesa to control the very system that they worked with – and this indeed is what happened. A passage from a manual for the Xerox Development Environment (XDE), a key component of the Mesa system, makes this clear:

The XDE provides a standard set of tools, or applications programs, that support and simplify many common programming tasks. However, this tool kit is completely open-ended; you are not limited to the existing tools. Rather, the tools are built from a large library of extensively layered system routines and primitives. You have access to all the system routines, both simple and complex, and are free to use them to custom tailor the existing tools or to create your own specialized tools…you can modify the environment as much as you like; tools that you create have exactly the same status as the built-in ones. The software environment is thus layered, fully extensible, and extremely flexible (Office Systems Division, 1984, p. I-1).

The primary problem with Mesa was its total reliance on a specific computer system. The fact that it was linked inextricably with the Star was both a major asset and a serious hindrance when it came to the language's continued survival. It remained an in-house system, evolving eventually into the Cedar language, which expanded its functionality somewhat and added more tools for developing and debugging programs (see Lampson, 1983/1986).

Unlike Mesa and Cedar, the Smalltalk language survives today, albeit somewhat obscurely.42 Smalltalk was another product of Xerox PARC, with Alan Kay expanding on his ideas concerning the Flex Machine, and adding in new ideas formed when he conceived of another hypothetical device called the Dynabook. Bardini stated that "[t]he Dynabook enriched the early vision of Flex on the basis of the computer as a medium," while also noting that "Kay described it as a 'dynamic' medium 'for creative thought...a self-contained knowledge manipulator in a portable package the size and shape of an ordinary notebook'" (Bardini, 2000, p. 151; citing Kay and Goldberg, 1977/2003, p. 394; emphasis added). Kay was inspired to move beyond Flex in part by the work of Papert, and in particular his focus on children. As such, his major early article on the Dynabook concept was entitled, "A Personal Computer for Children of all Ages" (Kay, 1972). Yet, as the title also suggests, the machine was meant to be a means for creative expression for all users; Kay at one point argues that computing technology can be "like a piano…but one which can be a tool, a toy, a medium of expression, a source of unending pleasure and delight" (Kay, 1972, p. 1). In order to illustrate the basic functionality of his hypothetical device, Kay writes a fictional account of the Dynabook in use. It is worth citing this at length, since it reflects many of Kay's ideas with respect to how computing could function that will become important in the discussion to follow:

Zap! With a beautiful flash and appropriate noise, Jimmy's spaceship disintegrated; Beth had won Spacewar again. The nine-year-olds were lying on the grass of a park near their home, their DynaBooks hooked together to allow each of them a viewscreen into the space world where Beth's ship was now floating triumphantly alone.

"Y' wanna play again?" asked Jimmy.

42 Smalltalk's relative lack of contemporary popularity is discussed in more detail later in this section.

"Naw," said Beth, "It's too easy."

"Well, in real space you'd be in orbit around the sun. Betcha couldn't win then!"

"Oh yeah?" Beth was piqued into action. "How could we do the sun?"

"Here look." Her fingers started to fly on the DynaBook's keyboard, altering the program she had written several weeks before after she and the rest of her school group had "accidently" been exposed to Spacewar by Mr. Jacobsen. "You just act as though the ship is pointed towards the sun and add speed!" As she spoke her ship started to fall, but not towards the sun. "Oh no! It's going all over the place!"

Jimmy saw what was wrong. "You need to add speed in the direction of the sun no matter where your ship is."

"But how do we do that? Cripes!"

(Kay, 1972, p. 2).

One might critique the believability of the story, or the quality of the dialogue, but there is no missing its key points. Note, for example, how the children in the story both play and develop the game they are playing at the same time. Development is as much a part of the play as more familiar forms of game play. This is a crucial aspect of the scenario which will go on to inform much of the discussion of Smalltalk below, as well as the discussion in the following chapter on potential future directions for similar work. The collaborative aspect of the work is also something that will be discussed, though I would argue that Kay never strays too far in this direction, focused as he is on the "personal" aspect of the personal computer. Finally, given that the scenario involves children, it may be assumed that ease-of-use is implicit in the narrative, in that they are able to develop and modify the game seemingly without extensive technical knowledge, or the years of experience a professional programmer would have acquired.

While Kay never built his Dynabook – the Alto is sometimes nicknamed the "interim" Dynabook, which perhaps assigns too much credit for the design to Kay – the Smalltalk language

was a rather full realization of his vision from a software perspective. Smalltalk is many things: a spiritual successor to Lisp, arguably the first object-oriented language, the inspiration for Objective-C, and the first language to come with its own graphical development environment. But it is also a programming playground, or programming ecosystem, of the sort envisioned by Papert and later by Kay himself, within which users may immerse themselves and build program "universes", and not just single programs; it is, to repeat the citation of Goldberg and Ross above, "a vast storehouse of already existing ideas (models) that can be retrieved and modified for the user's personal needs," designed so that "[p]rogramming could be viewed and enjoyed as an evolutionary rather than a revolutionary act" (Goldberg and Ross, 1981, p. 354). Smalltalk-80, released in 1980, was the first truly complete version of the language to be developed. Drawing on Kay's thinking with respect to "cellular" system design, Smalltalk consisted of two fundamental programming elements: objects and messages. Objects are self-contained bundles of code that express specific qualities and behaviours, and may be instantiated as many times as memory will allow. Messages are the means by which objects communicate, and generally consist of data arrays that call procedures and pass parameters. Users may build and modify objects, or simply take advantage of those that come packaged with the language by passing messages between them. They do so within a graphical interface that is itself built out of the Smalltalk language, in keeping with IPS principles. Programs are never "executed" – message passing occurs in what other languages would consider to be "design" mode, meaning that all code is available and modifiable. Objects may also support multiple "applications", all existing and operating within the same environment.43

In Smalltalk, then, development does not follow the two-step process of coding and execution used by languages such as JOSS, FOCAL, and BASIC. Program "consumers" are not limited to the affordances allowed by commands such as INPUT and Demand – there is, in fact, no real concept of a program consumer. Rather, the entire system is always live, ready to be built upon and tweaked as any and all users see fit. When such users need to perform specific tasks, they

43 Stéphane Ducasse's website contains a host of freely-available Smalltalk books; see http://stephane.ducasse.free.fr/FreeBooks.html. Foundational texts include Goldberg and Robson, 1983; and Goldberg, 1983; the so-called "blue" and "orange" books, respectively.

pass messages to the appropriate objects, and let the ongoing processes unfold.44 In such an environment, the top-down programming paradigms discussed here in previous chapters begin to fade. Recall the computational paradigm developed at the Mathematical Tables Project, and continued on in the ENIAC, as designed by Mauchly, and in all other digital computers since. As Chun noted, numeric computation has been organized such that "bureaucracies within the machine…mirror the bureaucracies and hierarchies that historically made computing possible" (Chun, 2011, p. 28). Neither the managers nor the workers at the Mathematical Tables Project had much agency over the processes they designed and obeyed, respectively; rather, the processes themselves dictated the actions that were taken. Similarly, neither developers nor users have control over traditional programs once they are executed. Computer software runs according to the rules embedded within it, with no human agency involved to alter the processes in place; users do supply data, but typically have no say in how this data is used. Yet with a language like Smalltalk, software is no longer a "recipe", as Chun puts it. Rather, users guide processes from start to finish, passing messages, monitoring results, modifying components, and building new components as needed. Human agency is perpetually present, and users have control over all aspects of the programming system. Fordist algorithms are discarded in favour of a framework that empowers programmers to build digital objects that are open and fully interactive.

5.5 Towards a Modern IPS

The history of computing traced in chapters two and three demonstrated the power of the digital computational paradigm as it was established in the first half of the twentieth century. Even when variant forms of computing emerged, program development still adhered at least in part to its Fordist roots – that is, computation based on arithmetic algorithms that are the equivalent of the physical assembly lines of modern manufacturing plants. Such an approach, as discussed, subsumes human agency in favour of process, with managers/programmers devising rules that they then surrender to automation. There were attempts to expand the reach of more interactive forms of computing, from time-sharing networks to the development of command-line languages

such as JOSS. Such work provided the foundation for the BASIC hobbyist programming movement, chronicled in chapter four. It was at this moment that the freeform nature of game programming on the earliest personal computers emerged, as game elements were mixed and matched and exchanged between users via such media as books and magazines. Despite such innovative practices, however, BASIC still largely promoted the production and consumption model of computation. The notion of software reigned supreme. Finally, this chapter has discussed systems (actual and hypothetical) that attempted, with varying success, to break out of this model.

44 It is possible to build standalone applications out of Smalltalk that do resemble those produced by other languages. But this feature is something of an expedient measure so that Smalltalk developers can create the sorts of programs that modern users expect. The language is really meant to be experienced in its native environment.

At this point, it is worth considering how the best ideas drawn from this earlier work may be applied in the development of new systems. Much of the knowledge of IPSs discussed in this chapter has been lost, with only small groups of enthusiasts remaining to champion their respective causes. Yet systems such as Interlisp and Smalltalk were not without their faults, and therefore attempting to promote them might not be the best approach with respect to presenting alternate forms of computation. Based on the research presented here, I would argue that the best way forward is to prototype new systems based on more modern, flexible technologies such as HTML5 and Javascript. Such languages are more popular, more accessible, and are capable of supporting highly-sophisticated programming techniques such as reflection. The goal, then, will be to create something that bridges the gap between the IPS movement and more modern computing practices.

6 Gaming the IPS Paradigm: Hail-Workshop

In the previous chapter, when the interactive programming systems that were developed in the 1970s and 1980s were discussed, it was noted that games were largely absent from the discussion. The major exception was Alan Kay's use of Spacewar as an illustrative example of the power of his hypothetical Dynabook machine, yet even in this case Kay made sure to emphasize the educational value of playing with the mechanics of the game, at least for children. Nowhere is it expressed that games are meaningful and important simply as games, or that game development has value beyond the pedagogical. In this chapter I will move precisely in that direction, and argue that games could, and often should, form a central component of any IPS, and that, moreover, many games could be effectively developed within IPSs, thus leveraging the benefits of IPS program development outlined in chapter five. This entails a significant alteration in perspective with respect to what games are, how they function, and how they should be developed and otherwise interacted with. In particular, the roles of developer and player are going to be largely merged – in the sense that the user may both develop and play at the same time – and it will be argued that such a merger is mutually beneficial. Moving games into the IPS sphere turns them into "vast storehouses of ideas," to paraphrase the Goldberg and Ross quotation from the previous chapter (Goldberg and Ross, 1981, p. 354). This means moving beyond the traditional perspective of games as discrete bundles of rules and mechanics, supplemented by limited user input to keep things moving. Instead, games must be thought of as being composed of building blocks of various ideas, with each block having its own intrinsic value. So, for example, a variable that contains a value associated with a given game can be treated as its own block; it can then be used in ways that go beyond the mechanics of a traditional game system. A variable that describes the population of a city in a city-management game, for example, could form the basis of a different game in which an entire region is being managed; or it could be used to generate a textual history of the city that is updated whenever the value changes. The only true limit to the potential uses of such a variable is the imagination of the user/player working with it within a given IPS.

Given this merger of user and player into a single role, the issue of limited player agency discussed in previous chapters is, I would argue, largely rectified. No longer is the player limited to providing a prescribed list of variable values to a given game system. In an IPS, the user has

full control over not only every game variable, but over the game code itself. From a certain perspective, such agency may be seen as destructive; if the user has full access to the variables that determine whether or not they are successfully playing a given game, then what is to stop them from simply altering those values to whatever they please? There is, in fact, nothing stopping them from doing just that. It should be recalled, however, that digital games are rather unique with respect to the control they have over their own rules and mechanics. In a board game, for example, there is nothing stopping the players from breaking the rules and simply moving their counters to the finish line, hypothetically speaking. Yet this is not seen as a serious hindrance to game play; on the contrary, such freedom allows players to adapt game rules to suit their own styles of play, and even to invent new rules. Board games do not lock players into a certain mode of play; rather, they invite players to engage with them however they choose. Adhering to the game rules is a choice that players make, and while it could be argued that this is the most popular way to play most games, it still allows for a type of active engagement missing from most digital games. IPSs, then, provide similarly open environments; games thus "belong" to the players.

An important element of this process is language itself. As noted in previous chapters, modern programming languages are strictly utilitarian, and structured to enable top-down planning and hierarchical divisions of labour. BASIC was different because it enabled a more unstructured style of programming in which line numbers, not structural elements, bound code together. Yet BASIC was also something of a messy language; the more abstract components that made up a given program could be hard to discern, especially if they were spread out over multiple locations within a single program. What is needed, then, is a language that provides some structure to organize elements, without imposing any sort of rigid organizational principle. To this end, I have developed a prototype IPS language that borrows many features from BASIC and other teaching languages such as Pascal, and embedded it within a windowed IPS designed as a standalone, HTML and Javascript-based application. After discussing the benefits of IPS gaming from a hypothetical perspective, and then discussing existing programming languages and applications that support some IPS principles, this prototype system will be introduced.

6.1 Games as Interactive Programming Systems

What would an IPS game look like? Before getting into the specifics of code, it is worth looking at this issue from a higher level. For this reason, I have selected the game Hammurabi, one of the most important programs from the hobbyist BASIC era, to serve as an example of how a game might operate within an interactive programming system. Before getting into the details of this work, however, it is necessary to discuss the concept of "pseudocode," as it will be relied upon heavily to describe the ways in which IPS programs are designed and executed. The essential mechanics of Hammurabi were discussed in the chapter on hobbyist programming in order to show how it operated at a general level. But in order to understand the advantages of IPS-based development, these mechanics must be considered at a more detailed level – that is, it is necessary to examine the specific components that make up Hammurabi, and how they relate to one another. It is possible, however, to do so without having to be immersed in the code used to create the game. The step-by-step work performed by the program may be represented using common language; this is the essence of pseudocode (see chapter one).

In practice, pseudocode typically involves breaking a program into steps and sub-steps, and then describing each in informal language. The execution of the program may then be simulated by starting at the top of the list and proceeding downwards one step at a time, except at points where the reader is explicitly directed backwards or forwards to a specific line.

Given the structure and content of the original Hammurabi program, I have created a pseudocode version of Hammurabi which proceeds as follows:

1. Initialize variables
   a. 1000 acres of land
   b. 100 population
   c. 3000 bushels in storage
   d. Year 1 of rule

2. Present information to user

   a. Print year #
   b. Print # of people starved, # of people came to city
   c. Print population, acres of land
   d. Print # bushels harvested per acre
   e. Print # of bushels eaten by rats, total bushels
   f. Print cost of land in bushels/acre

3. Get user input
   a. Get # of acres to buy, sell
   b. Get # of bushels to feed people
   c. Get # of acres to plant with seed

4. Calculate results
   a. Calculate size of harvest
   b. Calculate rat infestation
   c. Calculate # new people
   d. Calculate # of people who starved
   e. Calculate new cost of land

5. Cycle
   a. Increase year by 1
   b. If less than 10 years, return to step 2
   c. If 10 years, print final results, evaluation

Some of the details of the game have been simplified, while others have been left out entirely – code, for example, that checks to see if the user has selected to sell more acres than they own has not been specifically referenced, but it may be assumed that this process is an element of step 3a. It is also important to remember that the actual BASIC code is not organized so neatly – as already noted, it includes a number of goto commands that divide up steps into different locations in the program. This, in fact, is what the game would look like if structured programming techniques were applied. As will be seen, IPS programming offers us many of the advantages of structured programming without also requiring us to adopt its more burdensome practices.

Now that a pseudocode version of Hammurabi has been developed, then, it is possible to conceptually create an IPS version of the game. This will require us, however, to view the pseudocode listed above from a different perspective. As noted, this code is meant to "read" in a rigid way, beginning with step 1a and working downwards unless otherwise indicated. This is the recipe aspect of programming that Chun emphasizes – this metaphor is even more visible with pseudocode, given its use of language instead of code. And this is how program code is executed in Microsoft BASIC, and in virtually all other programming environments outside of IPSs.45 But in IPSs, the situation is different, and it is thus more effective to take this

45 Modern languages are admittedly much more sophisticated than Microsoft BASIC. Code is typically broken up into a variety of functions, and multiple "threads" may execute at the same time. But such code is still

pseudocode and break it down into constituent components. It will then be up to the user to determine when and how such components are executed.

Recall our discussion from the previous chapter about programming systems and environments. In Microsoft BASIC, when the user types a line such as "LET X = 10," the variable is assigned and stored within the larger environment, and may be called upon by any other program running within the same environment. In languages such as Lisp and Smalltalk, code could also be added to the environment in the form of functions, macros, and objects. Such functionality allows for a greater degree of flexibility than that offered by most modern programming "studios," where programs are self-contained and larger environments are non-existent. The same variables and code may be put to a variety of different uses, creating networks of elements that users can link together in various ways in order to accomplish specific tasks, or simply to experiment and play with code. Importantly, code within a programming environment may lie "dormant," and will only run if called upon specifically. In this way, users may build up entire libraries of code within the same environment, and only use what is needed at any given time.
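
A brief sketch in the Lisp style of the previous chapter makes the point about dormancy concrete; the names used here are my own, chosen purely for illustration:

(setq bushels 3000)
(defun report () (list 'bushels 'in 'storage bushels))  ; stored in the environment, but nothing runs yet
(report)                                                ; only now does the code execute, returning (BUSHELS IN STORAGE 3000)

The definition of report sits quietly in the environment alongside the variable bushels, consuming nothing but memory, until the moment the user chooses to invoke it by name.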

IPSs are able to store code in this way using a variety of schemes. A particularly effective approach would be to create "procedures" out of the pseudocode listed above. Procedures are a construct used by languages such as Pascal to facilitate the modularization of structured code (see Dale and Weems, 1997), but they may also be used in an IPS environment in a non-structured manner. A procedure is simply a collection of programming commands that are executed one step at a time; they can be thought of as smaller versions of programs, and typically accomplish one task, or a set of related sub-tasks. Procedures are then given labels and, within a structured programming system, they are called upon within the main body of the program by references to these labels. In our hypothetical IPS, these procedures will be activated by the users themselves via these labels. So, for example, step 1 could be converted into a procedure called "INITIALIZE," which would look something like the following:

run in a recipe-like manner. Programs have specific entry points at which execution must begin, and subsequent code is always run in a pre-determined order. As we will see, in an IPS, the user determines the order in which code is executed. 160

INITIALIZE
1. Set variable ACRES to 1000
2. Set variable POPULATION to 100
3. Set variable BUSHELS to 3000
4. Set variable YEAR to 1

Note that I have made the sub-steps more specific than in the pseudocode version above by actually naming the variables that are created within this procedure. So, for example, in the first line, a variable called ACRES is created and set to an initial value of 1000. Whenever the user calls upon the INITIALIZE procedure, these variables are reset to the values indicated.46 In an IPS, the user has the freedom to call this procedure whenever they wish, so they may reset the game if they want to start over, or if they want to try out new code with these base values. And since this is taking place within an IPS, it must also be remembered that these variables are stored within the same environment as the INITIALIZE procedure itself; this means that they may be used, and reassigned, by any other procedure within the same system.

The procedure INITIALIZE, then, creates and defines some of the variables used within Hammurabi, but others are also required in order to calculate the user's performance from year to year. Recall that in step 3, the user was asked to input four values: the number of acres to buy and sell, the number of bushels to use to feed the population, and the number of acres to plant with seed. In the BASIC version of the game, these values were entered via the INPUT command, a problematic construct that has been discussed at length in the present study. In an IPS environment, anything that limits user agency should be avoided, so commands such as INPUT are generally not available. How, then, is the user supposed to enter these values? They may do so, in fact, by directly assigning the values they have chosen to the variables themselves. As with Microsoft BASIC, where the statement "LET X = 10" creates a variable X with that value, an IPS would allow the user to enter a statement such as "LET ACRESTOBUY = 10," thereby indicating how many acres they wished to buy in a given year. A more elegant way to do this, however, would be to create a procedure that sets the value for the user. To do this, it would be necessary to pass a parameter to the procedure to let it know what value to use. Parameters are essentially variable values that are used internally by functions to various ends. For example, a BUY procedure could set the ACRESTOBUY variable as follows:

46 We may assume that the variables are actually created the first time the assignments are made, and are simply reassigned these values whenever the INITIALIZE procedure is called upon again. In many languages – with BASIC and Javascript being notable exceptions – variables must be "declared" in a prior statement before any assignments may be made, but for the sake of simplicity this step has been omitted here.

BUY(X)
1. Set ACRESTOBUY to X

X, then, is the parameter value used in the BUY procedure. If the user were to type in BUY(10), then ACRESTOBUY would be assigned the value 10. Following this pattern, procedures may be made out of the rest of the sub-steps in step 3:

SELL(X)
1. Set ACRESTOSELL to X

FEED(X)
1. Set BUSHELSTOFEED to X

PLANT(X)
1. Set ACRESTOPLANT to X

Note that the user may call these procedures in any order that they wish, and at any time that they wish. This allows users to change values if they wish, or to go with the same values from year to year; such flexibility is one of the main advantages of enabling total user agency over code via an IPS.
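A player who changes their mind, for instance, can simply call the relevant procedure again with a new value, since each of these procedures does nothing more than assign its parameter to a variable; whichever call comes last is the one that counts when the turn is eventually processed. A hypothetical exchange might look like the following, with purely illustrative values:

BUY(15)
FEED(2000)
PLANT(800)
BUY(10)

Here the second call to BUY simply overwrites ACRESTOBUY with the value 10.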

It is now possible to take the remaining Hammurabi pseudocode and turn it into procedures. In keeping with the INITIALIZE procedure created out of step 1, each remaining numbered step, with the exception of step 3, may be converted into its own procedure. Note the use of the variables defined above in some of these steps, as well as the definition and use of new variables as needed:

INFORMATION (formerly step 2)
1. Print YEAR
2. Print STARVED and NEWCOMERS
3. Print POPULATION and ACRES
4. Print HARVESTPERACRE
5. Print EATENRATS and BUSHELS
6. Print COSTOFLAND

CALCULATE (formerly step 4)
1. Calculate HARVESTPERACRE
2. Calculate EATENRATS
3. Calculate NEWCOMERS
4. Calculate STARVED
5. Calculate COSTOFLAND

CYCLE (formerly step 5)
1. Increase YEAR by 1
2. If 10 years have passed, print final results and evaluation

Note that in the CYCLE procedure, step 5b, which returned the program to step 2, has been omitted. The reason for this is that the user should be in control of program execution as much as possible. The user, in fact, is in charge of keeping the game going; as will be discussed below, however, the advantages of such a model are substantial.

At this point, all the elements to play what might be described as a "normal" game of Hammurabi have been created. The user, if they wished to, could execute each procedure in the order given in the earlier pseudocode version of the game in order to recreate the Microsoft BASIC experience. This would mean first executing the INITIALIZE procedure, then running the INFORMATION procedure, then entering the input values via the parameterized procedures described above, and then running the CALCULATE and CYCLE procedures. They would then return to the INFORMATION procedure and begin again. This is a perfectly valid way of playing the game in an IPS environment, and demonstrates the increased agency the player has over the program. Rather than being beholden to an executing program, the player directs their play on their own, much as they would while playing a board game or pen-and-paper role-playing game. This added flexibility, however, also allows the player to engage with Hammurabi in new ways, to alter and expand the game by creating new variables and new procedures that all inhabit the same environment, allowing for numerous paths through the code that combine aspects of both game development and game play.
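Put concretely, one full turn played in this "recreated BASIC" style would amount to a sequence of procedure calls entered one at a time by the player, with parameter values of their choosing (the numbers below are purely illustrative):

INITIALIZE
INFORMATION
BUY(12)
SELL(0)
FEED(2000)
PLANT(850)
CALCULATE
CYCLE

Subsequent turns would simply repeat everything after INITIALIZE, and the player remains free to insert, omit, or reorder calls as they see fit.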

6.2 Benefits of IPS Gaming

Now that the process of translating a single game into a hypothetical IPS environment has been detailed, it is possible to observe the consequences of having done so. It is important to understand why this process is beneficial, because in some ways it radically alters more familiar perspectives on gaming and game development. But it is this change of perspective that allows players and developers to engage with their games in ways that are not possible in other contemporary development environments.

6.2.1 Programming as Gaming

In an introductory article in a Compute! book of game programs for the VIC-20, writer and editor Dan Carmichael makes an interesting observation about hobbyist game development. After noting that the programs listed in the book are "fun for their own sake," he goes on to argue that "one of the best things about typing in programs yourself is that you can see exactly how another programmer created the effects you want to use in your own games. You may soon find that the best computer game of all is programming games for other people to play!" (Carmichael, 1983, p. 8; emphasis added). Can game programming actually be thought of as a form of gaming? Certainly, there was a lot of adaptation of existing programs happening during this era, as was discussed at length in chapter four. Recall Jerry Pournelle's comments on Hammurabi from an issue of BYTE Magazine: "[h]alf the people I know wrote a Hammurabi program back in the 1970s; for many, it was the first program they'd ever written in their lives" (Pournelle, 1989, p. 115). And programming was incorporated into play practices; a series of Choose Your Own Adventure-style books from the era called Arcade Explorers made game programming a fundamental aspect of "playing" each book; as the introduction to the first title notes, "[y]ou are the hero of this book. You'll make choices that will reveal different parts of a computer program, which you'll use to create your own unique video game" (McEvoy and Smith, 1985, p. vii). In its game programming books Computer Space Games and Computer Battle Games, moreover, Usborne Press challenged readers to improve on the listed games in a section labeled "Puzzle Corner," where they would pose questions such as "There are at least four ways of making this game harder. Can you work out what they are?" (Isaaman and Tyler, 1982, p. 15). Game programming is thus presented as a component of gaming, and as a puzzle challenge, but does that mean it is its own form of gaming?

In chapter one of this study, the notions of creativity and constraints were introduced within the context of rule-based gaming. As noted, Boden argues that creativity must be directed and goal-based, and it is only within a constraint-based framework that creativity will emerge, while scholars such as Suits and Costikyan argue for the importance of rules and goals with respect to games. In a "sandbox" simulation game such as SimCity, however, the player is expected to devise their own goals. When all of these concepts are brought together, the argument that game programming is an actual game gains significant scholarly support.

Programming in any language is inherently a rule-based practice. Proper syntactical and grammatical constructions must be used just to get a given program up and running. Programmers, moreover, tend to be goal-directed, in the sense that they generally expect their programs to perform certain functions effectively and (usually) efficiently. But the presence of rules and goals does not automatically signal that programming is a form of gaming. It is here that the importance of the "lusory attitude" as devised by Suits becomes clear. Programming is so often depicted as a means to an end, so much so that concepts such as "software engineering" have been devised so that it may be imbued with the language and values of capitalist production processes. Within such a context, the adherence to the rules that structure a given programming language is hardly a voluntary practice. Even with respect to game modding (see previous chapters), programming is instrumental, in the sense that the end goal is a program that modifies the way a given game is played. Coding is still considered to be "work," and actually playing the game, modified or otherwise, is a completely separate practice.47

In an IPS environment, however, developing in-game components via code is an integral aspect of the overall play experience. Recall Costikyan's comments with respect to SimCity (see chapter one), which we can expand on here. Building a city in a game such as SimCity is not meant to be a chore. While players may have specific plans with respect to how they want their city to function, the design and construction of the city is an integral aspect of playing the game. If the game did not make building roads, zoning land, managing financials, etc. part of the fun, it is unlikely that it would have become such a successful franchise.48 These tools are not simply means to an end, regardless of the goals that players have set (or not set) for themselves. Rather, they are meant to support the lusory attitude assumed by the player.

47 There are certain contexts in which this separation between programming and play is perhaps not so rigid. For users that enjoy programming, coding may be considered a play-based practice. The present study, however, is focused on games and programs as texts, as well as on the discursive meanings that these texts express. User impressions could serve as a useful complement to this work, but would require methodologies that are beyond the present scope. In addition, there are certain games in which users are able to create and modify code as they play. Else Heart.Break(), for example, is an adventure game in which you have to reprogram elements of the game world in order to win, while Human Resource Machine is a puzzle game that requires players to master a simplified assembly-style programming language. In both of these examples, however, most of the game comes pre-built, with the user-supplied code affecting only pre-determined elements, and neither of them functions like an IPS, in the sense of having fully-accessible, modular code. They do, however, indicate an interest in user/player coding within the commercial gaming industry.

In an IPS environment, I argue, programming becomes very much a lusory practice. When players build (or modify) procedures and components within an IPS system, they follow the rule-based systems defined by IPS environments and their built-in programming languages. But such practices do not constitute work, at least outside of a professional context. Users might have in mind specific games that they would like to build, but they are also free to develop components that do not immediately fit into any specific game. As in SimCity, tools are provided, and are meant to be used in a lusory manner. Users may set their own objectives, and a "finished" game product is only one of many possible goals. A given user, for example, may want to experiment on the IPS system itself, or build new tools that will support future game development. That is not to say that every IPS tool is going to be "fun." Certainly, in SimCity, certain tasks, such as connecting power lines across building zones, are not particularly exciting. But the overall experience is positive from a game-playing perspective, at least for its many fans. IPSs make coding an essential element of the creative process, and challenge users to play within them.

6.2.2 Gaming as World Building

Orson Scott Card, while still writing for Compute! Magazine, noted that his favourite aspect of game programming was "creating something that never existed before." Interestingly, he compared his form of play to that of children:

Most of all…their playing time is spent making things or pretending things. They spend hours with wooden or plastic building blocks, making castles or spaceships or houses or anything they can imagine…In fact, they do exactly what I like to do with the computer: create their own small world that works just the way they want it to work (Card, 1983, p. 34).

48 Notwithstanding the intense criticism faced by the most recent release, which became mired in technical difficulties and faulty game logic.

Scholars such as T.L. Taylor have delineated the importance of player actions and input in shaping the character of the virtual game worlds within which they play. This is particularly true for massively multiplayer online games (MMOGs), ranging from role-playing games such as World of Warcraft to building games such as Minecraft. Players make such game worlds meaningful through their interactions with them, as well as within larger communities that form around them (such as the EverQuest convention-goers that Taylor profiles in her work; see Taylor, 2006). IPS games have the potential to deepen such interactions by allowing players to interact with the rules and logic that govern gameplay itself, both in MMOGs and single-player games. As already noted, game modders sometimes (and increasingly) have access to such mechanics, but the worlds they interact with are already largely pre-built. IPSs, however, could allow gamers to build new worlds – including all game mechanics – from the ground up, and to let those worlds grow and evolve as they see fit.

Consider the power a game master wields with respect to pen-and-paper role-playing games. Game masters are not only responsible for devising adventures for their players, but they are also generally responsible for building up the larger worlds within which these adventures take place. A series of adventures taking place within the same setting is generally referred to as a campaign, and, as the fifth edition of the Dungeons & Dragons Dungeon Master's Guide explains it, "[e]very DM is the creator of his or her campaign world. Whether you invent a world, adapt a world from a favorite movie, or use a published setting for the D&D game, you make that world your own over the course of a campaign" (Dungeon Master's Guide, 2014, p. 4). The guide then goes on to explain the process of creating a world, from mapping out its geography to developing its major religions to deciding on its basic economic and political systems. These are details that may be tangential to the adventure at hand, but they accomplish three major goals. First, they create a richer setting for the adventure, allowing for the insertion of details into the narrative that make the overall gaming experience more immersive. Second, they provide a framework within which to expand upon the adventure if need be; given that players can improvise many of their actions, a given GM needs to be able to accommodate such actions in a consistent manner, and having a well-built game world provides many of the contextual details needed to shape novel outcomes. Finally, similar to the previous point, game worlds provide frameworks within which to develop new adventures.

Given all this, I argue that IPS games would enhance the meaning-making potential for players because they support all three of these goals. A collection of components within an IPS system would serve as a base upon which to piece together a diverse range of new game worlds, as well as "adventures" within those worlds (however that term may apply in a given game system.) Games such as Minecraft, of course, already support the construction of digital worlds out of blocks. Where IPSs differ is that they would allow players to create the actual components themselves. Role-playing game systems are more than simply collections of rules and regulations. They are also mechanisms through which game worlds are expressed and explored, and are thus fundamental elements of such worlds. The concept of fighting in-game "monsters," for example, is contingent on a given RPG system having rules that explain how such fighting should unfold. The story elements of RPGs are linked to and shaped by the rules. Similarly, the IPS model conflates what might be labelled the "content" and the mechanics in more contemporary game development. Users/players thus have the means by which to shape not only game worlds, but also the architectures of such worlds.

As Wolf noted, however (see chapter one), even modest additions or alterations to a game world can be made meaningful by invested players. Recall that, in the BASIC version of Hammurabi, the phrase "HAMMURABI: I BEG TO REPORT TO YOU" is printed at the start of each turn. While it only consists of eight words, this text does much to set the proper atmosphere for the game, casting the player as a supreme ruler with assistants that use highly deferential language. This phrase is not at all necessary from a gaming point of view; the user only needs to know the statistical information that follows. But it serves as an entry point for players to impose their own understandings of the game world. Any additional mechanic or rule added to a game, moreover, expands upon its overall world, in that it reflects an aspect of how that world functions. The components of a game system, as well as the rules, mechanics, and data contained within such components, can be viewed as the infrastructure of a given game world, which can be configured in various ways to play games within that world. Play, moreover, can also be considered a form of world building, in that it creates additional data that may be used to describe a given world. This is particularly true if there is a mechanism by which game data is recorded and stored. In an IPS environment, it is relatively simple to add such a component to a given game system, as well as to create the variables necessary to store the data produced. Such variables could even record multiple plays through the game, presenting alternative game "histories" that collectively shed light on the ways in which events can unfold in a given game world, given the rules and mechanics that govern it at any given moment.
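To put this in the pseudocode register used earlier, a player-added record-keeping procedure might do nothing more than copy the current turn's statistics into a growing log; both the procedure and the HISTORY variable below are hypothetical additions, not part of the original game:

RECORD
1. Append YEAR, POPULATION, ACRES, and BUSHELS to HISTORY

Calling RECORD at the end of every turn would leave behind a turn-by-turn account of a given playthrough, stored in the same environment as the rest of the game's components and available to any other procedure the player cares to write.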

6.3 The Hail Programming System

In order to test out the ideas discussed here and in earlier chapters with respect to text-based gaming, I developed a prototype interactive programming system designed to play games of the sort that were popular in the BASIC hobbyist gaming era. This system comes with its own programming language that I call Hail, so named because it is a descendant of the IPSs that were popular in the 1970s and 1980s (i.e. it "hails" from such earlier systems.) The system itself is tentatively called Hail Workshop, implying that it is an environment within which one may "tinker" with the Hail language. Both language and environment are my own creations.

Hail Workshop offers a simple graphical user interface within which reside various windows that allow the user to write code, execute code via a console, and display output. As in BASIC, variables are stored in memory, and may be referenced by any single program, or by the user at the console. As in Lisp, moreover, functions are also stored in memory, and individual programs may reference any function that has been stored as such. System images may be saved and loaded, and contain all variables and functions stored in memory at a given moment (and also objects, which will be discussed below), as well as the contents of each window.

Figure 6-1 shows Hail Workshop with the user running a version of the Hammurabi game discussed earlier. Note that there are three main windows open within the system environment: a console window used to run game functions, a code window within which to build programs, and an output window where various in-game statistics are displayed. The system defaults to having one of each window open, sized, and positioned as shown, but, since the system itself is initialized in the Hail language, it may be reconfigured as the user sees fit. The user might wish, for example, to store various code libraries in different windows, or to use multiple output windows to display different types of information (multiple console windows are also possible.)

The functionality of each of these windows, and of the system as a whole, will be described in more detail below. First, however, I will discuss the programming tools that were used to develop the prototype version of Hail Workshop. For those readers not interested in such technical details, the following subsection may be skipped. I provide this information to show how such a system may be designed using modern tools. For those looking to learn more about the actual code used to produce Hail Workshop, my source code is available on Github.49

Figure 6-1. The Hail Workshop programming environment

6.3.1 Programming Languages and Tools

Hail Workshop was developed entirely using the Javascript programming language. While originally designed for websites and web applications, Javascript has recently begun to branch out into other application environments through the efforts of various individuals and organizations. The foundation for what has arguably been the most successful of these efforts – the Node.js runtime environment – is Google's V8 Javascript engine. While designed originally for its Chrome browser, the V8 engine can compile Javascript code on its own, just as, for example, GCC compiles C code, and as other compilers do for other languages. This has allowed Javascript to migrate outside the browser to support a myriad of other functions via third-party runtime environments, which typically also extend the language. Node.js first emerged in 2009 as one of these third-party systems, and was originally designed for server-side network applications.50 Node.js's flexibility, however, has enabled its users to build a variety of applications and application environments that support a wide range of uses. Such applications have become so numerous that a dedicated package manager, npm, has been created as a central host for users and developers.

49 https://github.com/Adaax/hail-workshop

Hail Workshop makes use of a Node.js-based framework called NW.js. NW.js combines a Node.js-related framework called IO.js with a variant, standalone web browser based on Google's Chromium framework. The browser operates similarly to a typical web browser, but is designed to read local HTML files, as opposed to those retrieved from the Internet. Features such as enhanced file input/output functionality are designed around this principle. Essentially, what NW.js allows one to do is develop standalone applications (and games) that run in HTML and Javascript. An NW.js application package would contain the NW.js engine, as well as all the HTML and Javascript files needed to run the application. This package may then be distributed and run on any supported platform.51 Hail Workshop, then, is essentially an HTML page in an application window, supported by a Javascript engine. The Javascript does much of the work, however; apart from the background space and CSS styles which are defined in the HTML file, the major elements of the program are all built and handled by Javascript code.

In order to parse and interpret the Hail language, Hail Workshop also makes use of the Jison parser generator.52 A parser generator is a tool that takes a programming language specification and turns it into a program that can parse any program written in that language. When a program is parsed, its contents are arranged in a way that makes it ready to be executed in a logical manner. In a compiler, this would typically mean that the contents were arranged in a tree-based data structure. For our purposes, however, the Hail parser is used to convert programs written in the Hail language into Javascript. These Javascript programs are then run using the "eval" command.53 This is the primary advantage of using the NW.js framework: since it contains a version of the V8 Javascript compiler (in an IO.js-based environment), it can interpret and execute new Javascript code at any time. This means that it is possible to parse and interpret another programming language via Jison without having to build a full compiler. While interpreted code generally runs slower than compiled code, this is not an issue for two major reasons. First, the speeds at which Hail Workshop interprets and executes code are still well within acceptable limits. Second, this is a prototype system, and if speed were really a concern, a later version could be outfitted with a true compiler.

50 See https://nodejs.org/
51 There are currently versions of NW.js for Windows, Mac OS, and Linux.
52 Jison functions similarly to a GNU utility called Bison, hence its name (i.e. it is a Javascript version of Bison.) See http://zaach.github.io/jison/

6.3.2 The Hail Language

An overview of the Hail language will be provided here. This is not meant to be a full language reference – that may be viewed in Appendix A – nor will Hail's IPS features be discussed, as that will happen in the following section. Rather, what will be explained here are the fundamentals of the Hail language itself, and how Hail programs are structured. Hail's syntax borrows much from the high-level teaching languages developed and refined in the 1960s and 1970s, particularly BASIC and Pascal. It is an imperative language in which a program's state is determined by user-defined variables, whose values may be altered via the assignment operator. The assignment operator is ":=", as in Pascal, as opposed to BASIC's '=' operator. This is done to distinguish it from conditional comparisons in "if" statements (i.e. "x := 5" assigns the value 5 to the variable x, while "x = 5" determines whether or not x already equals 5.) As in BASIC, variables do not have to be declared; rather, variables are initialized the first time they are assigned a value (a technique which also works in Javascript.)

Hail is also a procedural language, meaning that code is segmented into procedures, as was discussed extensively with respect to the Hammurabi example earlier. Procedures in Hail are called subroutines, a term borrowed from BASIC. Like Pascal's procedures, Hail subroutines may accept parameter values – that is, variable values that are assigned whenever a given subroutine is invoked – but unlike Pascal functions, subroutines do not return values. As with BASIC, all variables apart from parameter values are global. While global variables are generally avoided in structured programming, in an IPS environment they are vital in terms of establishing a shared environment. Variables defined in one subroutine, then, may be used in other subroutines.

53 The eval command is often cautioned against because of the potential security issues it raises when used in web pages; in certain circumstances, if connected to a text box or related user input mechanism, the potential is there for users to send malicious code back to the website's server. Since Hail Workshop is a standalone application, however, this is not an issue; the only computer that users could send code to is their own, and protections are in place so that the eval command in Hail Workshop only executes Hail-based Javascript code, which is not a security threat. Despite security concerns, moreover, eval is used in websites quite extensively; see Richards et al., 2011.

Despite these similarities with respect to procedures, there is one major difference between Hail and most high-level languages: all Hail code is contained in procedures, and must be in order to be syntactically correct. In languages such as Pascal and BASIC, code may exist outside of procedures; this is the code that is executed immediately when the program is run.54 It is important to have such external code in these languages so that the programs written in them have "scripts" to work through when they are executed, but in an IPS, the user creates such scripts themselves. The need to shift agency to the user in an IPS with respect to running code was discussed at length above; Hail is architecturally structured to enable such agency.

With respect to conditionals and loops, Hail works very similarly to BASIC and Pascal, and most other high-level languages. Hail makes use of for loops, while loops, and if-then conditionals, including elseif and else statements. The syntax for each, however, differs slightly from what is found in other languages. Please consult Appendix A for details.

One major addition to Hail that is not found in BASIC and Pascal, but is a valid construct (with differing syntax) in Javascript, is the object. Objects have been discussed at length with respect to Smalltalk, but other languages, such as C++ and Java, also support objects. Javascript object functionality is somewhat specialized, though one can accomplish anything related to objects that is possible in these other languages. Regardless, Hail borrows from Javascript to support limited, purely variable-based objects. Objects are defined within obj code blocks, and all variables associated with a given object are declared within this code block via the var command. So, for example, a "person" object with a set of associated variables may be created as follows:

54 Such code must be placed within a BEGIN…END block in Pascal. In BASIC, code inside and outside of subroutines may be placed anywhere within a given program.

obj person
var age
var height
end obj

Instances of this object may then be created via the terminal window, or within subroutines. Such instances are initialized as follows:

John := new person
Alice := new person

These statements would create two "person" objects named John and Alice. Variables may then be attached to these objects, as shown below:

John.age := 27
Alice.age := 28
John.height := 170
Alice.height := 140

Object variables are initialized at the point where they are first assigned, meaning that they do not have to be initialized beforehand.

Once an object is created, it may be passed as a parameter to any subroutine. The subroutine can then reference its variables. For example, if the subroutine process(x) accepted person objects as parameters, object variables would be referenced as x.age and x.height. Any changes made to variable values in such subroutines will be reflected in the values of the object passed to them.

6.3.3 The Hail Workshop Environment

All Hail programs are developed within the Hail Workshop environment. This environment provides the functionality necessary for the system to operate as an IPS. It consists of a single HTML page with a backing Javascript engine that handles most functionality. When Hail Workshop is loaded, NW.js is run to create an instance of this webpage housed within its stripped-down browser window; the page is full-screen by default, but may be toggled into a window by pressing the F11 key. The webpage serves as a space to house smaller windows of three varieties, each of which will be discussed in more detail below. Each window has a name, which is displayed in the window header. A window may be moved about the Workshop space by clicking on its header and dragging the mouse. Any window that is clicked on, moreover, will be brought to the "front" of the screen, meaning that it will overlap any other windows within the same space. As of yet, windows may not be resized using the mouse, but the "resize" command allows one to do so via the command line.

The first type of window is the terminal, or console, window. Like terminal windows in Linux and Mac OS, and like the "command prompt" window in Windows, Hail terminal windows allow one to type and send commands to the system via a prompt. Such commands may make use of the full Hail language, so variables may be created and defined right from the prompt, and will be stored globally just like variables defined in subroutines. Terminal windows allow the user to run code contained within code windows, which has the effect of storing the subroutines contained within a given code window into memory. Users may also save and load images via the terminal, create new windows, resize windows, and quit Hail Workshop entirely. Perhaps the most important function terminal windows support, however, is the execution of subroutines. Any subroutine may be run simply by typing in its name, and passing the appropriate parameter values. Terminal windows, then, are where users convert code into action, and build the components that will shape and define the IPS. If the user wishes, they may create more terminal windows using the terminal command.
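By way of illustration, a brief console exchange might look like the following, where maincode is the name of a code window and greet is a subroutine assumed to be stored within it (both names are invented for this example):

score := 0
run(maincode)
greet(score)

The first line defines a global variable at the prompt, the second parses and stores the subroutines contained in the maincode window, and the third executes one of those subroutines by name; everything defined along the way remains in memory for later use.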

Code windows are where users type in Hail subroutines and objects. Similar to the Notepad application in Windows, code windows function as simple, multi-line text editors. Unlike in terminal windows, code is not executed automatically. Rather, the run command must be used in a terminal window in order to execute code from a code window, with the name of the window passed as a parameter (e.g. "run(maincode)".) All subroutines and objects in a code window must be structured appropriately, as per Hail's grammatical rules. Multiple subroutines and objects, however, may be entered in the same window. When the run command is used, the subroutines and objects in the specified window are parsed and, if there are no syntax errors, stored in memory. Once they are stored in memory, they may be called upon at any time by the user via a terminal window. Multiple code windows may be created and used simultaneously; the user creates such windows via the code command.

The third type of window is the output window. Output windows are simple, multi-line text windows that programs can use to print any type of alphanumeric information. Such information may also be printed to terminal windows; output windows serve as another option for information that needs to be more organized or more enduring, as information printed to the terminal window tends to scroll away as the user enters more commands. The printtab command will print information in columns in an output window, meaning that the user can build a table as they repeatedly run the same (or related) subroutines. The output command is used to create new output windows.

All of these elements – the windows, as well as the variables, objects, and subroutines stored in memory – together make up the system image. It is these images that may be saved and loaded via the save and load commands. Image files are text files that contain all of the necessary information to rebuild the system windows and restore the memory to the state it was in when the save command was called. When an image is loaded, all existing windows, as well as the system memory, are erased, so that there are no additional elements that might interfere with the loaded image. The proper handling of these image files is key to Hail's functionality as an IPS.

6.3.4 Hammurabi in Hail

A prototype version of the Hammurabi game described in section 6.2 has been developed for the Hail system. All of the functions described in this section have been created, though some of the names have been altered, and the CYCLE function has been merged into the CALCULATE function in order to streamline play. All of the functions are anchored to an object called city, which holds each city's statistics as a set of object variables.
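A sketch of what this definition amounts to is given below; the field names are drawn from the function descriptions that follow, the population field is assumed, and the actual implementation likely tracks additional per-year statistics:

obj city
var year
var population
var acres
var store
var buyacres
var feedbushels
var seedacres
end obj

Because Hail objects are purely variable-based, the definition is nothing more than a list of the variables each city instance carries with it.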

This object allows the user to play Hammurabi with multiple cities at the same time, each one holding its own city-specific statistics (including the year, so play may be asynchronous). The functions themselves, including the parameters that each accepts, are described in more detail below:


• init(x): Takes the city object x and initializes all of its variables. This means that the city's statistics are set to the values provided at the outset in the original version of the game.

• info(x): Prints out information about city x. This information is very similar to that provided in the original version of the game (an actual game transcript will be provided below).

• buy(x, c_acres): Sets the number of acres of land purchased for city x in a given year. This value is stored in the city's buyacres variable. The variables acres and store are also updated to reflect the purchase.

• sell(x, c_acres): Sets the number of acres of land sold for city x in a given year. This value is stored as a negative in the city's buyacres variable. The variables acres and store are also updated to reflect the sale.

• feed(x, c_bushels): Sets the number of bushels to be used to feed the population of city x in a given year. This value is stored in the city's feedbushels variable. The store variable is also updated.

• seed(x, c_acres): Sets the number of acres to plant with seed for city x in a given year. This value is stored in the city's seedacres variable. The store variable is also updated.

• calculate(x): Processes the above inputs and generates statistics that inform how the city's population and holdings have changed over the course of the year. The year variable is also increased by one.

In order to play the game, then, the user must generate a city object, and then run the various functions as described in section 6.2. This process may be broken down into the following steps:

1. Create a city object. To create the city "sumer", for example, the user would enter the command "sumer := new city" into the console.

2. Run the init function on the new city. For example, the user might enter "init(sumer)" into the console.

3. Run the info function to print out the city's statistics.

4. Run the buy (or sell), feed, and seed functions for the city.

5. Run the calculate function for the city.

6. Return to step 3.

Note that steps 3 to 6 may be run as many times as the user wishes.
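Concretely, the first year of such a session consists of console commands along the following lines; the numeric values are arbitrary choices, and the statistics printed by info are omitted here:

sumer := new city
init(sumer)
info(sumer)
buy(sumer, 100)
feed(sumer, 2000)
seed(sumer, 900)
calculate(sumer)

Each subsequent year repeats the sequence from info onwards, with whatever values the player chooses.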

The following is a transcript of Hammurabi in play, beginning with step one as listed above, and continuing through two years of data entry and processing:

As discussed in section 6.2, this is a perfectly valid way of playing the game, but, given that Hail Workshop is an IPS, it is far from the only option. The user could easily create a second city, and play it alongside the first. Or they could modify or add to the game's codebase. There is no execution mode that locks a player into experiencing the game a certain way, and the code window is always accessible.

6.4 Beyond Hail-Workshop

This chapter could only provide a brief outline of the potential of IPS gaming. While Hammurabi was the only game discussed, a system such as Hail Workshop could support most or all of the other hobbyist games discussed in chapter four. Some effort would be needed, however, to translate such games correctly in order to provide users with a maximum amount of agency. Typically, queries such as those used in Hammurabi to determine the amount of land to purchase and seed, as well as the amount of food to feed the population, may be sectioned off into their own subroutines. The mathematical processing needed to advance to the game's next turn may then be contained in its own routine, like Hammurabi's calculate subroutine. This, however, is only one possible configuration. IPSs actually allow us the freedom to experiment with new forms of gaming in a much more flexible environment than that provided by typical programming language systems. As noted earlier, there is no reason multiple calculate routines could not be created, thus enabling branching paths between queries and processing, with the user directing the action. The task of coding new subroutines could even serve as part of the game itself. It is difficult to conceive of what such a game would look like, for the simple reason that this sort of format has never been tried out before. Even in modern games where some code can be modded, users cannot change how code is executed over the course of play, or generate new code while they are playing. Play instead consists of passing variable values to the game through various input means, as has been discussed at length.
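For instance, a player might keep an alternative, self-written processing routine alongside the original and choose between them at the console from one year to the next; the calculate_flood subroutine named here is purely hypothetical:

calculate(sumer)
calculate_flood(sumer)

Which routine runs in a given year, and what it does, is entirely up to the player, and writing such variants becomes part of the play itself.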

The major limitation of this discussion, of course, is that it has been focused solely on text-based gaming. What about modern, graphics-based games? Could they be translated into IPS systems? This will be addressed in more detail in the following section, but it is worth introducing some of the major concerns here. Modern games are somewhat problematic, in that many of them run in "real time" – that is, they are constantly running, even when the user is not feeding in any input. Consider how even in a game as old and simple as Pac-Man, the game's ghosts move around and chase the player even when he or she is not controlling the character. Such a scenario presents challenges in terms of allowing the user to browse and modify code, and otherwise direct onscreen action, as they should be allowed to do in an IPS system. There could, perhaps, be a mechanism by which the user could "pause" the action while they are not directly playing a given game. This would allow them the time to make changes to the code without having to worry about their in-game character. But this seems like a somewhat awkward workaround, and would make playing such a game something of a halting experience. In fact, I would argue that such real-time games are not ideal candidates for translation to IPS systems. Rather, turn-based games, including certain forms of strategy games, simulation games, and role-playing games, are better suited to IPSs. Such games provide spaces in which users are free to take their time to make decisions, and the actions taken by the game itself are discrete, as opposed to continuous, and therefore easier to organize into subroutines. Such games are closer in terms of how they play to pen-and-paper and board games, and it is not a coincidence that both formats are more suited to improvisation and creative alterations as compared to real-time games. This will be discussed in more detail in the concluding chapter to follow.

Conclusion

In the late 1970s, Philip Kraft, a sociologist at the State University of New York at Binghamton, wrote and published a short work entitled Programmers and Managers: The Routinization of Computer Programming in the United States (Kraft, 1977). Kraft's book describes an ethnographic research project he undertook as a series of interviews with professional computer programmers. He was motivated in part by the dearth of existing research materials on programmers, particularly since the work that had been published was written almost exclusively from the perspective of managers, and was intended to teach techniques on how to get programmers "to do what they are told, not simply to write better programs" (Kraft, 1977, p. 4). It had been nearly a decade since the term "software crisis" had first been used to describe what was perceived by many in the industry to be an inability to create functional computer programs to solve increasingly complex tasks (see Naur and Randell, 1969). In the years since the first digital computer mainframes were built on university campuses, and at pioneering firms such as IBM, the notion of computer programming had been supplanted, or at least diminished, by a grander vision labeled "software engineering." Kraft had a particular take on the engineering profession that reflects his more specific concerns with respect to programming:

Since their emergence at the end of the 19th century, modern engineers have been employed by owners and managers to redesign work processes in order to make them amenable to standardization. Their job was to break down other people's jobs into subtasks, each of which required less skill to perform than the original whole job which they collectively replaced. To the extent the engineers succeeded, much work formerly done by skilled labor could be transferred to less skilled workers or even to machines (Kraft, 1977, p. 19).

The end result, which Kraft expresses in rather dire terms, is that "[c]omputers are the most sophisticated instruments available to managers in their efforts to de-skill production workers and now they are being used against the very people who made it possible for managers to so use them" (Kraft, 1977, p. 22).

The primary weapon managers wielded against programmers, according to Kraft, was the imposition of "structured programming" techniques. Emerging in the late 1960s and early 1970s, the structured programming movement sought to rationalize programming and programming languages by implementing structural elements that compartmentalized and organized fundamental computational operations. As Earl Parsons notes, "[s]tructured programming is the stepwise process of parsing program functions into a hierarchical chain of modules," adding that "[e]ach of these modules perform a single task, are efficient and economical…and are used to produce provable, reliable software systems" (Parsons, 2002, p. 6). Structured programming principles have informed the design of most major languages used for application and game development, including C++ (as well as C), Visual Basic, C#, and Java.55 While structural elements within those languages are ostensibly intended to make programs more efficient and coherent, Kraft sees in them more sinister consequences:

Structured programming makes it possible to organize programming along the lines of industrial rather than craft production. In this case, what is standardized is not a material product like an automobile or a package of breakfast cereal or a bank statement. What is standardized is a mode of thought, a logic, a pattern of decision-making (Kraft, 1977, p. 99).

For Kraft, structured programming is a sort of digital Newspeak,56 confining and constraining programmers so that only those operations allowed by system and language designers are permitted. As he notes, "[s]tructured programming offered an entirely new way of writing programs…Briefly, programmers using structured programming would be limited to a handful of logical procedures which they could use – no others were permitted" (Kraft, 1977, p. 57). Prior to having such constricting principles imposed on them, according to Kraft, "[p]rogrammers (and analysts) followed a logic and procedures which were largely of their own making," and "[p]rograms made in this manner had distinct 'personalities' which reflected their creators" (Kraft, 1977, p. 56).

55 Also Javascript, but, as the previous chapter demonstrated, it is significantly more malleable than most other structured programming languages.

56 A fictional language from George Orwell's novel 1984, imposed by the governing "Ingsoc" party on its citizens. As Orwell notes, "The purpose of Newspeak was not only to provide a medium of expression for the world-view and mental habits proper to the devotees of Ingsoc, but to make all other modes of thought impossible. It was intended that when Newspeak had been adopted once and for all and Oldspeak forgotten, a heretical thought – that is, a thought diverging from the principles of Ingsoc – should be literally unthinkable, at least so far as thought is dependent on words" (Orwell, 1949/1989, p. 312).

Kraft's work merits the occasional citation within STS and computer science scholarship, but it would be inaccurate to say that it had anything approaching a transformative effect on professional programming practices. Contemporary writers and scholars, in fact, tend to be critical with respect to Kraft's conclusions, noting that his dire predictions did not pan out, and that programming is in fact recognized as a creative activity within professional circles, at least in part (Glass, 2005; Ensmenger, 2010). I would argue, however, that his work provides a useful vantage point from which to review the research and analysis conducted in the present study. When Kraft raises the issue of creativity versus conformity within professional programming environments, he is really speaking the language of agency, which has been a key theme of this work since the opening discussion on human computers. Recall that digital computation was founded on a model of computation in which agency was stripped entirely from workers, and even from managers, who had to trust in the efficacy of the procedures that they put in place. The Mathematical Tables Project placed a particular emphasis on deskilling mathematical workers, a model that Mauchly would later imitate in his design of the ENIAC computer. Contemporary software engineering practices, moreover, continue to severely inhibit user agency at a fundamental level. Programming systems and studios such as Visual Studio and Xcode treat programs as "recipes," to employ Chun's term, which describe specific sets of instructions to be carried out automatically by a digital computer. This sharp division between the act of programming and the act of running a program imposes a rigid logic on users with respect to how programs are conceived of and executed. Despite the apparent recognition of the more creative aspects of programming by certain organizations, this fragmented model of program development is fundamental to our understanding of what has come to be called computer software. Given its ubiquity in modern computing, it is important to recognize that the notion of mass-produced uniform software programs was wholly invented, and was not an inevitability in the early years of digital computing. Kraft pointed to several advantages gained by employers that bought into the software model:

It means being able to buy or rent a ready-made program which is…a tested and proved product, without the bother and expense of employing highly skilled software experts to design, write, test, modify, maintain, and update an ad hoc solution to a unique problem. Since the product has been in a sense mass-produced, while limited, it is also likely to be cheaper than writing and installing a one-of-a-kind program (Kraft, 1977, p. 55).

Yet, as with structured programming, Kraft sees in software – which he refers to pejoratively as "canned programs" – the imposition of standards that inhibit the very functionality of digital computing. Organizations that use software, according to Kraft, must conform to its needs. This may involve altering internal practices so that they may produce usable input for programs that, for example, calculate payroll or manage inventories. As Kraft puts it, "[i]n effect, the use of canned programs represents a joint decision by software sellers and software buyers to make the problems fit the solutions at hand" (Kraft, 1977, p. 55). I would argue that the modern programming languages and systems used to create such software – and which are, of course, software programs themselves – are a crucial component of such arrangements. Modern programming environments operate as specialist laboratories in which programmers concoct "projects" that are then rendered into compiled code and delivered to users as "executable" files. Such users never see the original source code of the programs that they work with, and are thus limited to whatever functionality was anticipated and accommodated by the developers.

It is for these reasons that the interactive programming system paradigm offers important advantages. In an IPS, all users are programmers, but there are no programs, at least not of the sort used in the creation of software. Rather, the concept of "systems" has been introduced here to describe the sorts of component-based toolkits that IPS users create. Such systems may then be used for a variety of purposes, and may be added to as needed. Software devotees might dismiss the IPS model as overly chaotic or unstable. But, as this study has demonstrated, the sharp divide between programmers and users has significant drawbacks. If Kraft is to be believed, such drawbacks were the price to pay to standardize programming practices and deskill developers. This would then be yet another example of political exigencies playing an important role in shaping the character of digital computation. These are important findings in part because they serve to contradict notions of programming as a pure, "logical" enterprise that, much like classical Newtonian mechanics, operates solely by its own internal rules and is unencumbered by political or philosophical bias. It is generally recognized now that Newton's physics presents and reflects an entirely rationalist conception of the universe and the objects within it, as if each were a component of a "vast machine" (McGrath, 2011, p. 67). This view would later be challenged with the emergence of quantum physics.57 Similarly, modern computing continues to adhere to an ideology in which software, operating according to specific mathematical and logical principles, regulates all user interactions. Users are only able to create documents because software such as Microsoft Word and Adobe Acrobat allow them to interact with their machines in particular ways. Such programs may be thought of as "vast machines," created by groups of developers, which users can operate, but not modify.

The incorporation of human agency into these software machines is an important outcome of IPS programming. At first glance, this shift in thinking may not be readily apparent. The Hail language described in chapter six draws from the principles and syntaxes of a variety of traditional, high-level programming languages. Yet, to return to the above example, quantum physics is built and described, however imperfectly, using the same mathematical "language" that is used in classical mechanics. What makes the language of quantum physics differ from the language of classical mechanics is the wider context within which the language is used, or expressed. Human agents employ both sets of languages to, for example, describe observations made in experimental settings, but the experiments conducted in both realms, as well as the tools used to observe such results, differ significantly from each other. Similarly, Hail may superficially resemble languages such as BASIC and Pascal, but it is fundamentally different because of the way it is used to make components that inhabit a wider system environment, rather than solitary programs or multiple programs within the same project grouping. This focus on "language in use" adheres closely to the tenets of the linguistic field of pragmatics (see Huang, 2014), and the wider application of pragmatics to programming and development environments is something that could be carried forward into future research in fields such as software studies. While works such as Kitchin and Dodge's Code/Space examine the contextual impact of software, such an approach could also be used to analyze the impact of language environments on software.

This shift to increased user agency is designed in part to challenge prevailing and longstanding perspectives on digital computing. The MTP/Mauchly vision of computing privileges automation, with human managers/operators setting up problems in such a manner that they can

57 Also, arguably, by relativity theory, though it also espoused a rational, mathematical understanding of reality.

be worked out by (human or digital) computers without having to make any changes to the original plan. The advent of interactive computing challenged this model, but only to a limited degree. Command-line interfaces allowed for the development of programs right on the computer. This was an important step, given that a command-line interface is really just a running program serving as a gateway to a variety of system tools. For the first time, computers had to respond to dynamic input, rather than just marching through a program step by step. But these interactive systems still served largely as platforms for the construction of such fully- automated programs. Using languages such as Fortran and COBOL, users would create the same sorts of programs favoured by Mauchly. The introduction of commands such as Demand and INPUT in later high-level languages (see chapter 3) introduced an additional level of interactivity, but only in a very limited form. Programs could ask for user input, but only in the form of variable values. Such programs were still digital automatons, impervious to change while running, and intended to execute from start to finish according to a plan frozen in their source code. Initially, when most computer users were also programmers, and when most programs were generally short and simple, users were still "close" to the code, and could modify it with relative ease. Programming environments such as Microsoft BASIC catered to such audiences in the sense that programs could be created, loaded into memory, modified, and executed all from the same command line. But as increasingly sophisticated machines were being sold to a burgeoning consumer market, "off-the-shelf" software, compiled so that its source code was inaccessible, became the norm. At this stage, programs such as Word behave not all that differently from the automated calculating programs developed at the Mathematical Tables Project. While users can create documents and other types of media files with such software, they cannot truly alter their functionality.

The IPS framework does not solve all the problems associated with automated digital computation. But what it does do is assign the user a central role in terms of controlling the flow of computational operations much more than the software model allows for. Recall the role of the "GREEN" player in the interactive digital war game outlined by RAND researcher G. M. Northrop (see chapter three). GREEN not only facilitated exchanges of data between the two players; they could also potentially swap new code in and out of the game. Here, then, is a clear example of a human agent directing and altering the flow of computation in the middle of a given task. While digital computation does much of the work in terms of calculating outcomes and relaying messages to each terminal, GREEN's decisions with respect to managing and dispatching new code ensure that the overall process of playing the game is not entirely deterministic. IPS programmers have this level of control over their code by default. As such, they may experiment with alternate forms of computation that incorporate human decision making, as compared to the fully-automated paradigm championed by Mauchly and adopted on most computer systems since. An IPS user is cast in a role similar to the human computers who worked at the Army's BRL, who understood the entire scope of the problems they were given, and were familiar with the factors that informed the design of such problems. While the BRL workers were not given leeway to experiment with the procedures they were using, they hypothetically had the knowledge necessary to be able to experiment. Similarly, an IPS programmer needs to understand how the components in their system function in order to be able to configure and use them. With this knowledge, they can embed their actions – that is, running, modifying, or creating new components – within a larger process that incorporates such actions with the execution of deterministic computer code.

This model of agent-driven computation would almost certainly require users to understand more about the code that they work with as compared to contemporary software consumers. As noted in chapter five, however, the amount of knowledge required would depend to a significant degree on the complexity of the IPS language. Smalltalk, it was argued, expected the user to work at too low a level with respect to controlling the programming environment. An ideal IPS would encapsulate functionality within a language that allowed the user to efficiently develop and modify components, but was not "user-friendly" to the point that the user had only very limited access to the underlying system. As was discussed, Microsoft BASIC declined in part because it did not have the expressive power to support intensive graphics and animation; such functionality had to be rendered in machine code, which required a level of expertise beyond many hobbyist programmers. But for text-based games, the Microsoft BASIC system was ideal. Consider the fact that its users did not have to worry about how their computers were going to render text onscreen. When they wished to display a line of text, they simply used the PRINT command. The processes involved in making the PRINT command function – that is, the steps needed to parse a given statement and render it correctly (and continuously) on a CRT screen – are relatively complex, particularly as each line of text has to scroll up and off of the screen.

Microsoft BASIC handles all the low-level communication with a given machine's CPU in order to make PRINT work properly.
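
To make concrete how much work is hidden behind a single command, the following sketch shows roughly what an interpreter must do to support PRINT. It is not Microsoft BASIC's actual implementation, which was written in machine code against specific video hardware, but a minimal, hypothetical JavaScript rendition of the append-scroll-redraw cycle described above; the buffer size and function names are illustrative assumptions only.

    // Hypothetical sketch of the work hidden behind a PRINT statement:
    // maintain a fixed-height screen buffer, append a line, scroll as needed.
    const SCREEN_ROWS = 25;        // illustrative terminal height
    const screenBuffer = [];       // one string per visible row

    function print(text) {
      screenBuffer.push(text);     // add the new line at the bottom
      while (screenBuffer.length > SCREEN_ROWS) {
        screenBuffer.shift();      // scroll: discard the topmost row
      }
      redraw();                    // repaint every visible row
    }

    function redraw() {
      // A real system would write to video memory or a canvas; here the
      // buffer is simply cleared and reprinted to the console.
      console.clear();
      screenBuffer.forEach(function (row) { console.log(row); });
    }

    print("HAMURABI: I BEG TO REPORT TO YOU...");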

An ideal IPS, then, would allow the user as much control over their system environment as possible without making it overly difficult to perform basic tasks. Yet I would still argue that the typical computer user would benefit from having more control over and a better understanding of the systems they work with. IPSs could and should be simplified as compared to Smalltalk, but only to a point. This issue may be considered further by focusing on operational gaming. In Olaf Helmer's paper on gaming (see chapter 3), he distinguishes between what he calls "rigid" games and "non-rigid" games. In a rigid game, "the rules are complete in the sense that at any stage of the game the options open to the players are clearly defined" (Helmer, 1960, p. 14). Non-rigid games, by contrast, "require umpire rulings to determine the admissibility or inadmissibility of proposed strategies as well as the outcome resulting from playing out opposing strategies that have been admitted" (Helmer, 1960, p. 14). Computer-based operational games – what Helmer refers to as "play by machine" games – are generally rigid. This is, of course, an issue that has already been discussed at length in previous chapters. What makes Helmer's perspective interesting, however, is that he raises and develops the notion of the "expert," or of "expertise." The expert is an individual, or individuals, that work to ensure that a game functions as intended – in the case of RAND's games, this generally meant that games had to be "realistic," however that was defined. In the case of a rigid game, it was up to the "game constructor" to apply the necessary expertise. By "enforcing realism through the rules," rigid games were designed so that players did not need to be experts themselves – that is, they only needed to know how to play the game, and did not have to concern themselves with its fundamental mechanics, or the game's underlying model (Helmer, 1960, p. 18). In non-rigid games, however, expertise shifts to the players, as well as the "umpire," if one is used. As Helmer notes, if realism is the goal, player expertise may be employed in two different ways:

The player's expertness may be directed either at strategy or at simulation; in the first case he would try to play as well as possible within the confines of the game, in the second he would try to mirror as closely as possible what he predicts would be the behavior…of the real-life decision maker whom he is representing in the game (Helmer, 1960, p. 18).

Regardless of approach, what is important here is the fact that non-rigid game players possess a level of knowledge that rigid game players do not require. Such knowledge is required in order to play the game properly, and to critique the game's model.

This notion of expertise may be translated into the typical IPS programming context by focusing on operational gaming and operational game development. In a typical text-based operational game, such as those that were prevalent in the hobbyist era, the players did not have to have any expertise within a given game's problem domain. As long as they knew how to play – which often meant keying in a few variable values per turn, via the INPUT command – the game would operate properly. The players remained non-experts as long as they did not explore the underlying code. In an IPS context, however, players often have to understand at least some aspects of the games they play. As was demonstrated in chapter six, the Hail version of Hammurabi requires players to work directly with the components that make up the game. They have to call functions directly in order to set values, to have them processed, and to have the results output back to the user. While such players do not necessarily need to master all aspects of the game, they have to have a reasonable level of expertise, to use Helmer's term, with respect to the IPS and its associated programming language. This expertise, however, would allow them to engage much more closely with the game's code, making it possible to modify specific components, add new components, and use these components in novel ways. To return to the issue at hand, then, while it is true that IPS users would likely need to have a better understanding of the programs they work with as compared to contemporary computer users, this would allow a shift in expertise, in the sense that both developers and users would gain a certain level of expertise with the software they use. I would argue that this is a worthwhile tradeoff.
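
By way of illustration, the following JavaScript sketch (deliberately not the actual Hail syntax, which is documented in chapter six) suggests the kind of component-level interaction described above: rather than answering INPUT prompts inside a closed program, the player calls functions on a persistent game object from the command line. All of the names and numbers here are hypothetical.

    // Hypothetical Hammurabi-style component living in a persistent
    // system environment rather than inside a closed, compiled program.
    const hammurabi = {
      year: 1,
      population: 100,
      bushels: 2800,
      acresPlanted: 0,

      plant(acres) {               // the player sets a value directly...
        this.acresPlanted = acres;
        this.bushels -= Math.floor(acres / 2);
      },

      endTurn() {                  // ...asks for it to be processed...
        const yieldPerAcre = 1 + Math.floor(Math.random() * 5);
        this.bushels += this.acresPlanted * yieldPerAcre;
        this.year += 1;
      },

      report() {                   // ...and asks for the results as output.
        console.log(`Year ${this.year}: ${this.population} people, ` +
                    `${this.bushels} bushels in store.`);
      }
    };

    // A player working at the command line might type:
    hammurabi.plant(300);
    hammurabi.endTurn();
    hammurabi.report();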

Gaming, Graphics, and Operational Models

The focus on digital games throughout this study has been acknowledged several times, and in chapter six this position was defended with respect to modern IPS development. The games that have been discussed, however, have been almost exclusively the text-only, INPUT-based strategy games that were developed at research institutions such as RAND and later passed on to hobbyists. Modern, graphics-based gaming has been mentioned on occasion, and links between modern games and hobbyist games have been made, however tentatively. But larger questions with respect to the level of interactivity offered by modern games, as well as the potential for

using such games as the foundation for interactive programming systems, remain. While a proper treatment of these issues would likely require an entirely new round of research and analysis, it is possible to touch on a few key points here.

It should be noted that the Hail Workshop program developed in the previous chapter is not purely a text-based system. While users were limited to text-based commands and output, every textual element was contained in a graphical window that could be moved around and minimized, just like windows in modern graphical operating systems. Importantly, users could create their own windows, both on the command line and from within stored code. In the Javascript prototype developed for this study, moreover, the command line is essentially a simulated version of a true command-line interface. All of this is to say that the divide between text-based systems and graphics-based systems is not entirely clear. I would even argue that all screen-based programming and operating systems are graphical in nature, in the sense that everything needs to be "drawn" onscreen, with text being a specific form of graphical output.
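
As an illustration of this hybrid character, the sketch below shows the general shape that window creation might take in a browser-based JavaScript prototype. It is not Hail Workshop's actual API; the createWindow helper, its parameters, and the styling are assumptions made only to show how a "text window" can be both textual output and a movable graphical object.

    // Hypothetical sketch of a movable text window created from code.
    function createWindow(title, x, y) {
      const win = document.createElement("div");
      win.style.position = "absolute";
      win.style.left = x + "px";
      win.style.top = y + "px";
      win.style.border = "1px solid black";
      win.style.fontFamily = "monospace";

      const bar = document.createElement("div");   // simple title bar
      bar.textContent = title;
      bar.style.background = "#ccc";
      win.appendChild(bar);

      const body = document.createElement("pre");  // text area
      win.appendChild(body);
      document.body.appendChild(win);

      return {
        print(text) { body.textContent += text + "\n"; },  // text as output
        moveTo(nx, ny) {                                   // window as graphic
          win.style.left = nx + "px";
          win.style.top = ny + "px";
        }
      };
    }

    // Either the command line or stored code could then call:
    const log = createWindow("Combat log", 40, 60);
    log.print("You enter the dungeon.");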

Yet there are fundamental qualities of fully graphics-based games and systems that set them apart from text-based systems such as JOSS and FOCAL.58 I will list two of them here. First, they often allow for forms of user interactivity beyond the typical computer keyboard, with the mouse and "gamepad" being among the most popular devices available to users. All input devices, moreover, interact with programs using events, as opposed to commands such as INPUT (see chapter five). Second, graphical systems, and in particular games, allow for changes to be made with respect to both internal and onscreen information in "real time" even when the user is not interacting with the system. Graphical animation is one of the more visible examples of this, as arcade-style games demonstrate clearly. If a user runs a version of Pac-Man on their home machine, the ghosts move around and chase the player character even if the player decides not to move their own character (i.e. Pac-Man) or otherwise interact with the game. But even small details like the blinking cursor in a Word document are a product of real-time animation. "Internal" changes are arguably even more important, as variables may be set and modified by the system without the user being aware of what is happening. With respect to game programs,

58 With Microsoft BASIC being something of a hybrid system, in that it allowed for machine-specific graphical routines that enabled the creation of "real-time" (see below) games.

such real-time code execution happens in what is called a "game loop." Hail Workshop in fact employs such a loop to handle various background operations.
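
A loop of this general sort can be expressed in a few lines of JavaScript. The sketch below is not Hail Workshop's actual loop, only a minimal illustration of the pattern: state is updated and redrawn on every tick, whether or not the user has supplied any input, while input arrives asynchronously as events.

    // Minimal game-loop sketch: the ghost keeps moving even if the
    // player never presses a key.
    let ghostX = 0;
    let lastKeyPressed = null;

    document.addEventListener("keydown", (e) => {  // event-driven input
      lastKeyPressed = e.key;
    });

    function tick() {
      ghostX += 1;                     // "real-time" internal change
      if (lastKeyPressed !== null) {
        // respond to the player's event, if there was one
        lastKeyPressed = null;
      }
      // (redraw the screen here)
      requestAnimationFrame(tick);     // schedule the next frame
    }

    requestAnimationFrame(tick);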

In an event-driven, real-time environment, are users empowered to a greater degree as compared to command-line systems? As discussed briefly in chapter five, I would argue that users of such systems have as little agency as those using text-based, command-line systems, and in certain circumstances they have even less agency. In the typical graphical software program, users are still limited to setting variable values within a prefabricated system. In a game, such information may be limited to directional movement, with perhaps one or two keys (or buttons on a gamepad) that perform actions such as firing an onscreen weapon. In a program such as Word, the document itself is the variable, or at least a collection of related variables. Documents may be created, modified, and closed, but there is no means by which to modify the code that controls the behaviour of such documents. Other media creation tools, such as Adobe Acrobat and Photoshop, operate under similar principles. It is only in interactive development environments such as Visual Studio that users may interact with code, and the limitations of such systems have been discussed extensively in earlier chapters. In all cases, the use of onscreen interactive elements such as menus and toolbars often serves to limit interactivity even more than command-line interfaces, in that they are (arguably) less flexible as compared to text-based commands (see Stephenson, 1999).

Having said that, do modern games qualify as operational games? This is a more difficult question to address, and there is probably no definitive answer. Everything depends on how flexible one wishes to be in terms of defining operational games. As discussed in chapter three, operational games rely on underlying mathematical models. In this respect, modern games are operational. Graphics-based games, in fact, require more sophisticated models as compared to text-based games. Games that incorporate "three-dimensional" geometry, such as first-person shooters, use models that are incredibly complex. But the games discussed in earlier chapters were also turn-based, and this quality was defined as essential to operational gaming. There are, of course, many turn-based graphical games, with the Civilization series being perhaps the most prominent. But there are also many games that operate purely in real time (see above), including virtually all "AAA" commercial games such as the Grand Theft Auto and Assassin's Creed series. In such cases, the situation is more complex. It could be argued that such games operate along similar principles, in that players are able to interact with game models using whatever

affordances are available. Real-time strategy games such as SimCity, which use resource management-based models that are similar in character (albeit much more complex) to those used in text-based games such as Hammurabi, can, I would argue, still be treated as operational games. Despite the real-time changes that are enacted by such games over the course of play, users still interact with their underlying models via variable inputs – such as, for example, setting a tax rate – that result in visible changes to the system being modeled. With respect to AAA action games, however, I would hesitate to classify them as being operational. While they still use models, such games have more in common with the arcade games of old, as opposed to operational games. They thus have a lineage that does not link back to RAND, or to the hobbyists, but rather to arcade and console game developers such as Atari and Activision.59 While I would argue that many of the issues I have associated with operational games are still applicable to these titles, counting them as operational games would broaden the category to the point where its significance would be diminished.
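
The claim that such games remain operational despite their real-time presentation can be illustrated with a toy resource model. The sketch below is not SimCity's model, merely a hypothetical JavaScript example of a system whose internal state evolves on its own while the player's interaction is still confined to variable inputs such as a tax rate.

    // Toy city model: the simulation ticks on its own; the player's only
    // lever is a variable value, much as in a text-based operational game.
    const city = { population: 1000, funds: 500, taxRate: 0.05 };

    function simulateMonth() {
      city.funds += Math.floor(city.population * city.taxRate * 10);
      // high taxes slowly drive residents away; low taxes attract them
      city.population += city.taxRate > 0.08 ? -20 : 15;
    }

    function setTaxRate(rate) {        // the player's variable input
      city.taxRate = rate;
    }

    setInterval(simulateMonth, 1000);  // "real time": one month per second
    setTaxRate(0.1);                   // the player adjusts the model's input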

What is needed, then, is further research into graphics-based gaming, and graphics-based computing in general, to make connections and recognize differences with respect to operational gaming and text-based computing. Work in this area could focus on a number of institutions and projects. A closer look at MIT's Whirlwind and TX-0 mainframe machines, as well as the PDP computers produced at DEC (which, as noted in chapter three, were built off of TX-0 technologies), would be warranted, as these were among the earliest computers that supported CRT display screens, and hosted important programs such as Sketchpad and Spacewar. Arcade and console computing would also need to be looked at. Intriguingly, such games, while electronic, were not computer-based, in the sense of having a central processing unit (CPU), until well into the 1970s. Yet these are the style of games that currently dominate the marketplace, even on home computers. Perhaps most importantly, however, the research carried out by Xerox PARC on graphics-based computing – the same research that led to the development of the Mesa and Smalltalk systems – served as the foundation for graphics-based computing as it is still practiced today. This work allowed for the development of game genres that did not exist on consoles, with the mouse becoming a particularly powerful tool for strategy and simulation

59 As was discussed in chapter four, however, many hobbyists eventually turned to arcade-style games, particularly towards the end of the era.

games. The history and origins of the event model of interactivity, developed to a significant degree at PARC, could also be explored in some detail. Finally, the emergence of purely (or almost purely) graphics-based development systems such as Visual Studio and Scratch could form an important component of such research, in that such systems mark a break from the text-based programming languages that continue to dominate even on graphics-based systems.

Why RAND?

The RAND Corporation featured prominently in this study with respect to the emergence of operational gaming. It is thus worth taking a moment to think about why it was that RAND, and not another research institution, assumed such a prominent role. As already noted, RAND was but one of several groups within the United States military that conducted operational games. The Operations Research Office (ORO) of the United States Army, in particular, was heavily invested in gaming. Yet it was RAND that brought operational gaming to the corporate sector, from which it was eventually picked up by hobbyist programmers. In order to gain an understanding of why RAND became the primary advocate of operational gaming, it is necessary to approach the issue at two levels. First, it is worth looking at the organization of RAND itself, and the freedoms it was given that other research groups lacked. Second, the rise to prominence of the United States Air Force in the postwar period needs to be emphasized, as it was within this larger context that RAND was able to grow and thrive.

RAND's primary advantage was that it was not considered an internal organization within the Air Force, but rather a separate entity that was sponsored (in part) by the Air Force. This allowed its researchers much leeway with respect to the work that they conducted, as well as how such work was disseminated. RAND's articles of incorporation reflected these aspirations, stating that the purpose of the organization was to "further and promote scientific, educational, and charitable purposes, all for the public welfare and security of the United States of America."60 Communication with extra-military organizations, then, was a central tenet right from the start. While RAND focused largely on tactical aspects of warfare in its earliest years, moreover, it shifted to more long-range strategic and logistical issues in the early years of the

60 http://www.rand.org/about/history/a-brief-history-of-rand.html

Cold War, and this required a new approach, as noted in an internal report on the history of the organization:

In peacetime there are a lot of things we don't know very well. It is not clear when – or whether – the war is going to occur, and if it does, where it will be or what kind of war it will be...[and] under what sort of political constraints and objectives it will have to be fought (Goldstein, 1961, p. 9).

To respond to these issues, RAND created a variety of divisions: Economics, Logistics, and, crucially, Social Science. RAND also gave its researchers a significant degree of freedom and flexibility with respect to where and what they researched, as Goldstein indicates:

How do…RAND studies start? A one-man project often starts with one man who has an idea, the usefulness of which he wants to explore. It is possible for someone to pick up the ball and run with it…We can and must afford small projects of the one-man-off-in-a-corner type with little or no administrative review in their early stages (Goldstein, 1961, p. 13; emphasis added).

This last point about the initial lack of "administrative review" is particularly crucial, and may be contrasted with the situation at the ORO, where, arguably, enthusiasm for operational gaming rivaled that at RAND. Whereas the Air Force appears to have fully supported RAND as it expanded its purview, the Army was deeply skeptical of the ORO's activities. As Shrader notes, there were "a number of Army officers who were unwilling to accept the meddling of operations analysts," and many "failed to see how Army interests were served by an organization such as ORO that operated with a good deal of independence and objectivity rather than being a good Army 'team player'" (Shrader, 2006, p. 120). Unlike at RAND, where research endeavours in new areas were supported and encouraged, the Army grew skeptical of "the ever-expanding scope of ORO interests" (Shrader, 2006, p. 120). As a result of such concerns, the Army shut ORO down entirely in 1961, replacing it with a (presumably more pliant) research group called the Research Analysis Corporation.

RAND, then, can be seen as something of a special case with respect to the various research organizations working for the United States military in the postwar years. Taking this wider perspective, however, RAND's uniqueness invites speculation as to why it existed at all. Or, to

reframe the question, why is it that the Air Force, which only became an independent branch of the military in 1947, was able to fund and support a sprawling, independent research institution such as RAND? Why not one of the more established military branches? In order to answer these questions, it is necessary to discuss the rise to prominence of the United States Air Force over the course of the Second World War and after.

The Second World War was a conflict won in large part by technological innovation. As Goldstein notes, many of the war's most important technologies were developed only after fighting began (Goldstein, 1961). Among the most prominent of these technologies were radar, jet-powered aircraft, self-propelled missiles, and, of course, the atomic bomb. All of these served to enhance (as well as protect against, in the case of radar) aerial power. The doctrine of strategic bombing, moreover, which caused much of the civilian suffering and physical destruction that was incurred over the course of the war, depended on control of the skies. In addition, the notion of developing a "man-made satellite," or "spaceship," a term employed in a landmark 1946 RAND report on the subject, was just beginning to emerge. While government-sponsored aerospace research and development in the United States would eventually be taken over (in large part) by the National Aeronautics and Space Administration (NASA), in the immediate postwar years the Air Force did the bulk of the research in this area, and operated Patrick Air Force Base near Cape Canaveral, the launch site for the United States' early attempts at suborbital and orbital manned spaceflights.61

All of this is to say that the Air Force had been at the forefront of military-based research and development since the latter years of the Second World War, and, arguably, continued in this role at least until the end of the Cold War.62 This was especially true after it became evident that the Soviet Union, with the launch of its Sputnik satellites in the late 1950s, had more advanced capabilities in this area as compared to the United States. The emerging threat of full-scale nuclear warfare with the Soviet Union, moreover, concentrated military minds on aerial attack scenarios (remembering that the nuclear weapons of the early Cold War era were still dropped from aircraft.) RAND's relationship with the Air Force thus gave it a position of privilege that

61 http://www.rand.org/about/history/a-brief-history-of-rand.html
62 The heavy use of airborne drones in modern warfare suggests that aerospace remains dominant.

organizations such as the ORO lacked. This became especially true given its early research into satellites and nuclear warfare. Yet RAND seems to have not lost any of its support as its researchers moved into other areas, as already noted. While more research on RAND and other postwar military research organizations would be needed to verify this, I would tentatively argue that a sprawling, largely independent research group such as RAND could not have emerged out of any other branch of the United States military, at least at the time. The Office of Naval Research (ONR) arguably comes the closest to RAND in terms of its power and scope, but it serves largely as a funding agency for universities and other research institutions, not as an autonomous corporation. Furthermore, ONR was founded in order to "plan, foster and encourage scientific research in recognition of its paramount importance as related to the maintenance of future naval power, and the preservation of national security."63 It thus did not have the broad mandate bestowed on RAND to promulgate its findings outside of military circles. RAND came into existence in the wake of a war that, at least in the case of the Pacific theatre, was ultimately won in the air, and in the early years of a Cold War in which aerospace technologies played an outsized role. Its power and influence were a reflection of the importance of military airpower within this context.

Role-Playing Games

Pen-and-paper RPG campaigns were discussed in some detail in the previous chapter. Compared to computerized role-playing game (CRPG) franchises such as Ultima (an early industry leader) and Dragon Age (a more recent creation,) pen-and-paper games are typically sprawling, "messy" affairs. Due to the importance of improvisation, the typical game is extremely flexible with respect to rules, settings, and events. House rules were discussed in chapter six as a means to change the way a game is played. But RPG developers themselves also frequently change the "official" rule set. Wizards of the Coast, the company that produces Dungeons & Dragons, recently adopted a "living rule set" philosophy through which they can experiment with new rules and changes to older rules by advertising them to players and DMs and collecting feedback, as they explain:

63 http://www.onr.navy.mil/en/About-ONR/History/Timeline.aspx

[T]he living rule set approach gives R&D the space to offer up suggested improvements and alterations to D&D. If we have an idea for improving the game, we can present it as an option for DMs, gather feedback, and make informed decisions for future products and potential changes to the core game. Those options can remain exactly that – options you can take or leave as you see fit (Mearls, 2014).

In addition to flexible mechanics, most RPG publishers produce supplements that may be altered and adapted as game masters see fit. Adventure books, or "modules" as they are sometimes called, are quite common, and delineate rich settings within which GMs may cast their players to go on various quests. Typically this involves guiding players through a network of discrete locales, such as rooms in a dungeon, and presenting them with whatever perils and/or rewards await them within these spaces. Modules may be rich in lore that GMs may leverage to fill out the details of the game world, but they almost always allow for GMs to invent their own contexts within which to situate their adventures. GMs, in fact, have a great deal of flexibility in terms of how they use such supplements. They may adhere closely to the information provided in a given title, but they can also alter and adapt these contents however they see fit. Other types of supplements include catalogues that describe new items, weapons, and monsters that GMs may incorporate into their campaigns. In such cases, the GM is required to provide the necessary setting and context outright, or else adapt an existing adventure module.

All of this is to say that pen-and-paper RPGs incorporate elements from a variety of sources, both in terms of official materials and those that are developed by a given GM on their own. These materials, moreover, may be reused in part or in whole each time players gather for a new adventure. As additional sources are incorporated over the course of a multi-stage campaign, a larger game world gradually emerges. Such fictional worlds are shaped by the GM, but are also shaped by players. A particularly diligent GM might interpret certain decisions made by their players as having far-reaching consequences that affect later adventures. The players, moreover, tend to use the same characters each time they play, allowing them to grow more powerful with each successful quest (such characters may also "die" in-game, of course). RPG groups essentially build up archives of used and usable materials. Such archives are highly personalized, and rarely, if ever, would any two be alike.

Despite such practices in pen-and-paper gaming, however, retail CRPGs – that is, games that are sold as consumer products, and make up the largest market share in the genre – are still produced and sold as largely unitary products, and each game must be played in strict adherence to the rules embedded within its code. Outside of modding, there is generally no way to add house rules to a CRPG. The same is true with game worlds. Though modern CRPGs tend to offer the player some freedom to explore their settings, these are presented "as-is," and cannot be reinterpreted or reimagined by the player. While CRPGs, moreover, may offer dozens of adventures tied to a larger campaign or campaigns, these are fixed and rigid, typically forcing the player to complete a series of adventures in a specific order, with each one unfolding exactly as planned in the code. Once these quests are completed, the overall narrative arc of the game is usually complete. At that point, the game is over, or else the player may only complete whatever side tasks are left on the map.64

While contemporary CRPGs can certainly be highly ambitious, critically acclaimed titles, I would argue that the CRPG genre would be better served if they were developed within interactive programming systems – or, at the very least, IPS games would complement typical CRPGs by offering a different framework for gameplay and game design. As discussed in detail in chapters five and six, IPSs allow users to develop components that fit within a larger environment, and then to configure such components to complete a variety of tasks. With the right IPS language and system, CRPG users and designers could hypothetically store all of the major components of a given RPG system in one place, and could add, alter, and remove components as needed. Such components could store the "data" needed for adventuring, including character data, maps, and narrative elements, as well as the mechanics and rules. Individual adventures could be assembled out of these pieces, and there would likely be many reusable components. If

64 A side task being a small-scale mission that may involve, for example, the retrieval of treasure or other items, the slaying of a group of monsters, or the exploration of a specific area. Most modern RPGs include such side quests to give the player more to do in the game world beyond the main narrative, though they still operate in the same rigid, deterministic manner as the main missions. Also note that modern CRPG developers produce small expansions for their games in the form of downloadable content (DLC). These are released in the weeks and months after the release of the initial game. DLC often includes additional adventures for the player. Both modding and DLC may thus expand the number of quests available in a given game, and I am not intending to minimize their importance. Rather, I will argue that interactive programming systems are a better-suited vehicle for expandable and alterable CRPGs.

implemented properly, such an IPS could allow, for example, developers to add monsters from a digital catalogue to a specific adventure, or borrow elements from one adventure to expand another. Players could bring characters from one adventure to another, or even maintain a group of adventurers. Game mechanics could be updated and revised so that the rule sets are as "living" as they are with pen-and-paper games.
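
A rough JavaScript sketch of what such component reuse might look like is given below. None of this reflects an existing system; the catalogue, party, rules, and adventure objects are hypothetical, intended only to suggest how monsters, characters, and mechanics could live as separate components that are assembled, revised, and carried forward between adventures.

    // Hypothetical component store for an IPS-based CRPG.
    const monsterCatalogue = {
      goblin: { hp: 7, attack: 2 },
      wight: { hp: 22, attack: 5 }
    };

    const party = [
      { name: "Astrid", class: "fighter", level: 3, hp: 28 }
    ];

    const rules = {
      // a "living" rule that the group can revise between sessions
      attackRoll: (attacker) => Math.ceil(Math.random() * 20) + attacker.level
    };

    function makeAdventure(title, monsterNames) {
      return {
        title,
        // borrow monsters from the shared catalogue
        encounters: monsterNames.map((m) => ({ ...monsterCatalogue[m] })),
        party      // the same characters carry over from earlier adventures
      };
    }

    const tonight = makeAdventure("The Barrow of the Wight", ["goblin", "wight"]);
    console.log(rules.attackRoll(tonight.party[0]));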

Such a system, moreover, would problematize the dichotomy between players and GMs, a blurring that is also emerging within pen-and-paper games. Several "GM-less" pen-and-paper games, in fact, have been released in recent years, to much acclaim. Titles such as Microscope and Fiasco break from the combat and quest-driven mechanics of traditional RPGs and encourage players to collaborate to craft narratives that fit within specified parameters. The Mythic role-playing system, moreover, eliminates the GM position and relies on randomness and player prompts (e.g. instructing players to imagine the most logical outcomes of in-game actions and decisions) to move adventures along (such adventures typically being of the more traditional RPG sort). Mythic even allows for solo play, as do systems such as Scarlet Heroes and The GameMaster's Apprentice. In all of these cases, players are assumed to be self-motivated to the point that they will abide by the rules without GM supervision, and will accept the consequences when unfavourable events arise. In this way GM-less RPG players are inheriting aspects of an older tradition of solo hobbyist war gaming in which the same player controls all sides in a given battle (see Featherstone, 1973). Rather than treating this freedom as an invitation to cheat, solo war gamers appear to be especially concerned about realism and accuracy. As Featherstone notes, "the solo-wargamer has had a completely free hand…unfettered by the possible whims and lack of understanding of another person who may not share his visions or enthusiasms" (Featherstone, 1973, p. 8). From this perspective, the entire "lifespan" of a given game – from inception to setup to execution – may be viewed as a single text, developed by the player.65 It is thus to the player's benefit that the game is played in such a way as to adhere to the larger goals of the project. For war gamers, realism is typically one of the most important goals. Game rules are intended to allow combat to

65 Featherstone was referring specifically to war games involving miniature figures on a hand-crafted battlefield, and not boxed board games. Miniature games tend to be much more personal than board games, in that the player(s) themselves typically devise the rules for a specific battle, decide on the configuration of forces for each side, and then play through the battle or series of battles. Unlike board games, these experiences are not meant to be repeatable. Rather, players tend to review and revise their systems continuously, so that no two battles are ever identical.

unfold in a logical and realistic manner, and game play is meant to reflect what would happen on a real battlefield. The end goal of such a project, then, is not to win a game, but rather to design and direct simulated warfare in a way that is compelling to the player, as well as to others who share in the experience.66

Similarly, I would argue that a well-designed CRPG-based interactive programming system would allow single players to design, execute, and report on individual adventures and campaigns. This would mean giving the player the freedom to change game rules, create new worlds, and develop their own quests. Again, it may sound as if such a tool would invite a significant degree of cheating, but the pen-and-paper GM-less games discussed above seem to suggest that at least a subgroup of dedicated players would be more concerned about creating interesting settings, characters, and adventures. The IPS would serve as a sort of toolbox with which to develop rules and components, as well as adventures and campaigns in which they are used. As with the war games described above, the goal would not be simply to win, but rather to produce compelling narratives in which player characters encounter challenges that match up with their capabilities. Thus the entire production of a game, from the development of rules to the execution of the adventure or campaign, could be thought of as a text, perhaps one of many in a series. Such texts could be recorded by, for example, keeping a log of events, and maintaining a living rule set.

While further research into IPS-based role-playing games would be required, I believe that the CRPG genre is one that would benefit immensely from the proper IPS environment(s). By allowing for a more improvisational form of play as compared to traditional CRPGs, IPS games would borrow many of the positive aspects of pen-and-paper gaming that are otherwise left out. Instead of just following a pre-programmed narrative, players would be able to craft their own stories as they go using whatever set of components they feel is appropriate to the context. The IPS thus becomes both warehouse and workbench for CRPG development and play.

66 Tabletop war gamers often record their battles in some form that may be shared with others. Online message boards are a popular contemporary venue for such practices.

The Future of Hobbyist Programming

This dissertation began with a discussion of certain trends that seem to suggest that a new hobbyist programming movement is starting to emerge or, arguably, has already emerged. The development of the Raspberry Pi computer and the increasing popularity of the Python programming language were cited as evidence of a burgeoning new wave of amateur game programmers. As I have argued, however, the programming paradigm that these tools adhere to is problematic, in that it is meant to recast hobbyist practices from the 1970s and 1980s within modern environments. It is, in short, an attempt to revive something that was seemingly lost. Given the immense technological gap that exists between current personal computers and those of this earlier era, the past cannot truly be recreated, nor, I would argue, should it be. The problems with Microsoft BASIC and similar systems have been detailed in earlier chapters, and the advantages of developing new interactive programming systems have been discussed at length. Having said all this, however, I recognize the fact that change will not come quickly, and that current systems and technologies will continue to be used for some time. In this final section, then, I will offer suggestions as to how a modern hobbyist movement might leverage existing tools and systems, and where attention should be paid with respect to incrementally improving on them.

It should be noted that pedagogy is not the focus here. Despite the links between hobbyist computing and educational computing, it is beyond the scope of the present study to offer specific policy recommendations to educators with respect to how programming should be taught, or what systems should be used to teach it. In terms of making recommendations to home hobbyists, however, I would argue that I am on firmer ground. This is due in part to the fact that I have been a hobbyist computer programmer myself since the mid-1980s. I first learned BASIC on a Commodore VIC-20. In the 1990s I learned Pascal, and worked largely with PC clones running Turbo Pascal. Several years later I began to use Visual Basic, as well as some C++. Right now my language of choice is Javascript, as discussed in chapter six. The vast majority of the programs I have written were pure hobbyist creations, in that I never profited from them. In fact, I have only rarely shared my code with others. The act of programming itself I often find more interesting than the actual games and other applications I create out of code. I am therefore deeply invested in hobbyist computing, and have formed opinions about it based on many years of programming experience. I will thus make recommendations with

respect to how hobbyist programming could grow and evolve that are informed both by scholarly research and hands-on knowledge.

Hobbyist computing is heavily dependent on tools, as the present study has made clear. As it happens, contemporary programming languages and systems largely complicate, rather than support, hobbyist programming. The Python language probably comes closest to meeting hobbyist needs. It avoids many of the syntactical complications of other languages, offers some console support, and is widely and freely available. It is also now supported by a vast library of guides and reference materials, both online and offline, much of which is geared towards novice users. Despite these advantages, however, Python is problematic in several ways. First, it exists in several different versions, with versions two and three of the language being largely incompatible. It is difficult to figure out which version should be prioritized when one first encounters the language. Second, while Python can be both powerful and flexible, its functionality is spread over an expansive collection of add-on libraries. Such libraries must be present both on the developer's system, and on the systems of all potential users. This leads into the third problem, which is that Python programs are interpreted, so that users are required to have the proper Python interpreter on their own computers in order to run them. This is especially confusing because there are so many different versions of the language available, and users must determine which version to call to execute a given program.67

Despite such drawbacks, Python provides an effective jumping-off point for a larger discussion of programming languages and systems. Among Python's most important features, for example, the IDLE environment stands out as an example of command-line interactivity within a larger development environment. As noted in chapter six, IDLE does not provide a persistent environment for Python programs. A variable declared in a program will not be set as such within the IDLE environment. In this respect, IDLE is weaker even than Microsoft BASIC in terms of providing a "universe" for programs to inhabit. Having said that, IDLE may serve as a template for more sophisticated command-line systems. Such systems, I would argue, are needed in order to enable more sophisticated operations than can be accommodated with GUI interfaces. A powerful feature of the standard Linux command-line console, for example, is

67 Tools such as py2exe do allow users to create standalone executable files, but these can be quite complex and cumbersome, particularly for new users.

support for "shell scripts," which are text files that list a series of commands to be executed in order. The amount of time saved via skillful use of shell scripting is usually significant. A programming language-based command-line console could serve as a means to interact directly with the underlying system that operates the programming environment, as well as a means to quickly create variables that programs in that environment can access.
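
Something of this kind can already be prototyped with existing tools. The sketch below uses Node.js's built-in repl module to start a command-line console whose environment persists between inputs and is shared with pre-loaded code, which is the property the earlier discussion found lacking in IDLE. The loadComponent helper and the component it loads are assumptions made for the sake of illustration.

    // A minimal persistent console built on Node's repl module.
    const repl = require("repl");

    const session = repl.start({ prompt: "hail> " });

    // "Stored code" loaded into the same environment the user types into.
    function loadComponent(name, component) {
      session.context[name] = component;   // persists across user inputs
    }

    loadComponent("granary", {
      bushels: 2800,
      add(n) { this.bushels += n; }
    });

    // At the prompt, the user can now type, for example:
    //   hail> granary.add(100)
    //   hail> granary.bushels
    // and the values remain set from one command to the next.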

Another advantage to Python programming is that it focuses almost exclusively on language itself. There is an emerging trend in contemporary digital computing to do away with text-based programming – as much as that is possible – and instead supply users with GUI tools. Visual Basic is one of the earliest systems to adopt this position, though it does in fact require language skills for most tasks, as the GUI tools are used largely to build user interfaces. Another important example is GameMaker: Studio, a development system for games that relies heavily on GUI components. While GameMaker does have its own scripting language, its supporters often claim that the typical user need not write a single line of code in order to create a game (see Rohde, 2014). Other game making systems such as Stencyl and Construct 2 adhere to the same philosophy. And then, of course, there is Scratch, discussed in chapter one, which is a wholly visual programming system. While these systems often earn praise for their de-emphasis on written code, I would argue that this is not the direction that developers should be heading towards. There are two main reasons I take this position. First, GUI tools cannot offer the sort of flexibility and control that language allows. There is a reason why GameMaker, for example, supports a scripting language: it is simply not possible for checklists, drop-down boxes, and other GUI elements to accommodate all the possible actions that developers may implement via text-based systems. Second, a user that has mastered a given GUI language will find it much harder to migrate to more "serious" languages such as C++ and Java as compared to a user that is more familiar with text-based languages. GUI systems thus inhibit novice users that may wish to further develop their skills in other languages and environments.

In terms of the programs that hobbyists produce, I would argue that a renewed emphasis on game development is required. This may seem like a redundant request, given that games are the focus of many of the reference books and educational materials that have been discussed throughout this study. But games are largely represented as means to supposedly more enlightened ends. This is most evident with the trend towards teaching "computational thinking," with organizations such as CS Unplugged using pen-and-paper games to teach students about

computer science concepts such as data and algorithms.68 Games are seen as a palatable vehicle by which to impart knowledge about more serious issues, but are not a goal in and of themselves. I would argue for a shift in thinking on this issue, so that games may be considered as the "core" products of computer programming. A theme running through this dissertation has been the notion that digital games reflect the fundamental character of the systems upon which they are built more than any other type of program. Whirlwind's Bouncing Ball game (see chapter three) incorporated essentially the same procedures that were used to determine the trajectories of ballistic weapons – the very problem that digital computers were first designed to handle. RAND's operational games borrowed the same methods, but with more ambition, as they attempted to simulate logistical systems and global warfare. Such ambition was motivated in part by the increasing speed and power of digital computers in the postwar period, yet it also evidenced the fact that digital computation was still built on the same principles that governed the earliest machines. Modern games continue in this tradition, as they are built on models that are both extremely elaborate and functionally similar to those used in the earliest digital games.
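
The kinship between a bouncing-ball display and a ballistics calculation can be made concrete in a few lines of code. The following is not Whirlwind's routine, which was written for very different hardware, but a hypothetical JavaScript illustration of the shared procedure: stepwise numerical integration of position and velocity under gravity.

    // Stepwise trajectory calculation: the core of both a ballistics
    // table and a bouncing-ball display.
    let x = 0, y = 100;        // position
    let vx = 5, vy = 0;        // velocity
    const g = -9.8, dt = 0.1;  // gravity and time step

    for (let step = 0; step < 200; step++) {
      vy += g * dt;            // gravity accelerates the ball downward
      x += vx * dt;
      y += vy * dt;
      if (y < 0) {             // the "bounce": reverse and damp vertical motion
        y = 0;
        vy = -vy * 0.7;
      }
      // a ballistics program would tabulate (x, y); a game would plot it
    }
    console.log(`Position after 20 seconds of simulated time: x = ${x.toFixed(1)}`);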

This is all to say that digital computers emerged as machines designed to solve specific sorts of mathematical problems, and that games are built on the same functionality. Hobbyist computing, therefore, which was already quite focused on game development, could and should continue on this path. What should shift, however, is the general understanding of what hobbyist game programming entails, and how it may be accommodated. In order to better understand this issue, consider the state of 1970s and 1980s-era hobbyist computing materials in the present day. Until recently, a large percentage of the books and magazines from that time were essentially discarded or ignored, while the games themselves remain largely forgotten. Even when the code for a given game is accessible, moreover, it is generally difficult to find a platform upon which to write and play it. Microsoft BASIC is long gone, and, given the tangled nature of BASIC code, porting such games into other languages can be tricky. There is, in short, a major infrastructural gap between modern computer systems and those of the hobbyist era, and bridging the gap between the two is not simple.

68 http://csunplugged.org/

While there are likely many reasons why such a gap exists, I would argue that one of the most important issues is the general lack of respect given to hobbyist games. As noted earlier, many of the most important and acclaimed commercial games make use of hobbyist game mechanics. Yet once such titles began to enter the marketplace in the late 1980s, they quickly eclipsed the largely text-based games that they borrowed from. It may seem now like this was an inevitability, given the relative flashiness and sophistication of the commercial titles, but I would argue that it was specific market forces that played a major role in shaping these events. The consumer market for digital computers is (or perhaps was; see below) built upon the premise that newer is better, and that older products quickly become obsolete. In terms of games, this means discarding seemingly archaic formats and systems and embracing the notion that hardware and software must be perpetually upgraded. From this perspective, text-based BASIC games look hopelessly obsolete.

Yet our perspective has been slowly changing in recent years, and older hardware and software is being studied with renewed interest. Perhaps the most visible evidence of this shift is the rise of "retro" gaming. Retro gaming involves two sets of practices: the playing of older games, particularly those from the "eight-bit" game console era, and the development of new games that imitate the look and feel of older titles. In both cases, modern technologies are being "underused" in the sense that they are not being pushed to the limits of their functionality. A crucial example of this approach is the emergence of "emulators" – that is, programs that reproduce the fundamental hardware and software elements of older systems. If programmed correctly, a given emulator is capable of reading in and executing all or most programs designed for the original system, including games. Emulation is only possible because modern computers are fast enough to handle the work that is required to operate a virtual system within a larger operating system.69 With respect to recent commercial releases such as Axiom Verge and Shovel Knight, they are generally developed using modern programming systems and languages, and thus do not require emulation to function. In all cases, high-performance graphics and complex mechanics are not prioritized. Rather, the focus is on the quality of the gaming experience itself.

69 Since emulation is a somewhat slow, complex process, it is much more difficult to emulate more recent computers and consoles. When emulating older, slower machines, the resulting lag is not noticeable.

Retro game players may find, for example, that the simple but effective mechanics of a game like Super Mario Brothers are preferable to those of modern first-person shooters.

Despite its seemingly populist ethos, however, retro gaming is still linked quite closely with the commercial game marketplace. As a result, non-commercial games have a minimal amount of cultural capital. While some of the more well-known text-based BASIC games are brought up occasionally on online discussion boards and other, related outlets, such mentions are few and far between as compared to the attention paid to older commercial console and computer games. Since text-based games were so quickly set aside with the advent of more powerful hardware, there was little to no chance of them being rescued for posterity, at least not right away. So while these games had a major influence on the emergence of entire genres, they did not have the commercial support that is generally needed to build a widespread retro movement. The runaway sales success of the first Super Mario titles all but ensured that Nintendo would keep them in the spotlight. The company has since released a wide variety of new Super Mario games, several of which use the same two-dimensional scrolling spaces and simple mechanics that helped make the originals so popular. Text-based BASIC games such as Hammurabi do not have such private-sector support. More than that, contemporary computer and console systems are built around graphics and GUI interfaces. Creating the proper interface for a text-based game in such environments is not a trivial task.70

That is not to say that modern hobbyist game programming should be entirely text-based. The Hail Workshop program discussed in chapter six, while text-based, incorporates graphical windows, making it something of a hybrid system. Graphics could and probably should play an important role in future programming environments. What is truly needed to facilitate hobbyist games – and this relates back to the question of what hobbyist programming actually entails – are the sorts of software and hardware infrastructures that enabled home users to program in BASIC (and machine code) in an earlier era. Products such as Raspberry Pi are useful to a certain degree, but they are not complete, in that they serve as only one piece of a larger puzzle. While Microsoft BASIC integrated both operating system and programming system into a single command-line interface, the RPi uses a powerful but complex Linux-based

70 This is particularly true with Internet browser-based games, an increasingly popular platform for retro gaming.

operating system, and on top of that are various disparate tools, with Python and Scratch being the two most focused on program development. Novice users thus have to figure out how to navigate the OS, where to go to launch the programming tools available, which tool is right for their needs, and where and how to begin coding. At that point, the user might wonder why they are doing all of this on the RPi at all, as opposed to just programming on their regular computer.

Microsoft BASIC was cleverly designed, in the sense that it provided exactly what was needed for home hobbyists to begin programming, and nothing more. This was to be its downfall, as was discussed, as games became more graphics-based and sophisticated. At the height of its popularity, however, it was an effective platform for coding on a variety of systems. This, then, is what I alluded to with respect to changing our thinking about hobbyist programming. Hobbyists, I would argue, are not software engineers. They do not, or perhaps should not, think of their programming efforts as a means to create standalone consumer products. Hobbyists may commercialize their work at some point, but then it becomes something else. What should be clear at this point is that hobbyist programmers, unlike private-sector developers, play with code.71 In the 1970s and 1980s, as was demonstrated in chapter four, hobbyists were continuously reworking the programs they found in books and magazines. By changing just a few lines of code, a game could be set in an entirely different context, or its underlying model could represent a different sort of system. The vast majority of such variant games were never published in any form, and it is likely that many of them were never even saved to any storage media. What mattered was the play sessions themselves, so that programming could in fact be a highly personal pastime, rich with moments of experimentation and experience that could not be recorded or saved for posterity.
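
A small, hypothetical example suggests how little code often separated one variant from another. If a game's setting and model parameters are gathered into a handful of lines, as in the sketch below, then changing only those lines turns a grain-trading game into one set on a Mars colony, while the rest of the program is untouched.

    // The handful of lines a hobbyist might change to re-theme a game.
    const SETTING = "the Mars colony of Bradbury";   // was: "ancient Sumeria"
    const RESOURCE = "oxygen canisters";             // was: "bushels of grain"
    const MAX_YIELD = 3;                             // was: 5

    function harvest(unitsCommitted) {
      const yieldPerUnit = 1 + Math.floor(Math.random() * MAX_YIELD);
      return unitsCommitted * yieldPerUnit;
    }

    console.log(`You govern ${SETTING}.`);
    console.log(`This cycle you produced ${harvest(300)} ${RESOURCE}.`);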

It is these sorts of highly-personal moments that modern advocates for hobbyist programming need to recognize and promote. Too often programming is discussed as a means to an end. Contemporary educational and "how-to" reference materials typically employ a project-based approach, so that, for example, each chapter in a given work focuses on the completion of a particular programming "product." Even in Scratch, where there is an emphasis on "remixing" existing materials, there is the expectation that end products will be produced, to be uploaded

71 At least from a professional perspective. There are certainly developers who also enjoy programming from a play-based standpoint.

and displayed online. The interactive programming system paradigm is so effective because it challenges the notion of finished and discrete programs at a fundamental level. Even without IPS tools at hand, however, the rhetoric surrounding modern hobbyist practices can start to shift. The concept of programming as play should be further emphasized, as well as the notion that programs are mutable objects. User agency can then be supported via command-line interfaces such as IDLE, as opposed to pure GUI development environments. Software systems may be built that are adapted to the needs of hobbyists, along with sites for hobbyists to share and circulate code.72 Tools and ideas from the past can help point the way towards meaningful change, but ultimately new tools and new ideas will need to be developed in order to sustain such change.

72 The website JSFiddle is a good example of an online code "playground" that allows users to experiment with and share HTML, CSS, and Javascript code. See https://jsfiddle.net/

References or Bibliography

101 BASIC computer games. 3rd edition. (1973/1975). Maynard, MA: Digital Equipment Corporation.

CAL reference manual for SDS-940 time-sharing computer systems. (1967). Santa Monica, CA: Scientific Data Systems.

COMPUTE!'s first book of Commodore 64 games (1983). Greensboro, NC: COMPUTE! Publications.

COMPUTE!'s first book of VIC games. (1983). Greensboro, NC: COMPUTE! Publications.

COMPUTE!'s machine language games for the Commodore 64. (1986). Greensboro, NC: COMPUTE! Publications.

COMPUTE!'s more machine language games for the Commodore 64. (1987). Greensboro, NC: COMPUTE! Publications.

The Dartmouth Time-Sharing System: a brief description. (1964). Hanover, NH: Computation Center, Dartmouth College.

EduSystem handbook. (1973). Maynard, MA: Digital Equipment Corporation.

Gamestar Mechanic parent's guide (n.d.) Gamestar Mechanic. Retrieved from http://gamestarmechanic.com/parents

See it now: Jay W. Forrester and the Whirlwind computer. (1951) [Video file]. Retrieved from http://video.mit.edu/watch/see-it-now-jay-w-forrester-and-the-whirlwind-computer-1951- 6629/


Aarseth, E. (1997). Cybertext: perspectives on ergodic literature. Baltimore: The Johns Hopkins University Press.

Adams, R. H., and Jenkins, J. L. (1960). Simulation of air operations with the Air-Battle Model. Operations Research, 8(5), 600-615.

Ahl, D. (1978). BASIC computer games: microcomputer edition. New York: Workman Publishing.

Ahl, D. (1979). More BASIC computer games. New York: Workman Publishing.

Ahl, D. (1984). Big computer games. Morris Plains, NJ: Creative Computing Press.

Ahl, D. (1986). BASIC Computer Adventures. Redmond, WA: Microsoft Press.

Akera, A. (2007). Calculating a natural world: scientists, engineers, and computers during the rise of U. S. Cold War research. Cambridge, MA: The MIT Press.

Akrich, M. (1992). The de-scription of technical objects. In W.E. Bijker & J. Law (Eds.), Shaping Technology/Building Society (pp. 205-24). Cambridge, MA: The MIT Press.

Allen, T. B. (2012). The evolution of wargaming: from chessboard to Marine Doom. In Cornell, T. J., and Allen, T. B. (Eds.), War and Games (pp. 231-261). Woodbridge, UK, and Rochester, NY: The Boydell Press.

Allison, D. (Interviewer) & Gates, B. (Interviewee). (1993). Bill Gates interview [Interview transcript]. Retrieved from http://americanhistory.si.edu/comphist/gates.htm

Altheide, D. L. (1987). Reflections: ethnographic content analysis. Qualitative Sociology, 10 (1), 65-77.

Altheide, D. L. (1996). Qualitative media analysis. Thousand Oaks, CA: SAGE Publications.

Anthony, S. (2011). MS-DOS is 30 years old today. ExtremeTech. Retrieved from http://www.extremetech.com/computing/91202-ms-dos-is-30-years-old-today

Arfken, G. B., Weber, H. J., and Harris, F. E. (2012). Mathematical methods for physicists. 7th edition. Burlington, MA: Elsevier Academic.

Bagnall, B. (2010). Commodore: a company on the edge. Variant Press.

Baker, C. L. (1966). JOSS: introduction to a helpful assistant. Santa Monica, CA: The RAND Corporation.

Bardini, T. (2000). Bootstrapped: Douglas Engelbart, coevolution, and the origins of personal computing. Stanford, CA: Stanford University Press.

Bardini, T., and Friedewald, M. (2002). Chronicle of the death of a laboratory: Douglas Engelbart and the failure of the knowledge workshop. History of Technology, 23, 191-212.

Barnes, S. B. (1997). Douglas Engelbart: developing the underlying concepts for contemporary computing. IEEE Annals of the History of Computing, 19(3), 16-26.

Banzi, M. (2011). Getting started with Arduino. 2nd edition. Sebastopol, CA: Make:Books.

Bellman, R., Clark, C., Craft, C., Malcolm, D. G., and Ricciardi, F. (1957). On the construction of a multi-stage, multi-person business game. Santa Monica, CA: The RAND Corporation.

Bloor, D. (1976/1991). Knowledge and social imagery. 2nd edition. Chicago and London: The University of Chicago Press.


Boden, M. (1995). Creativity and unpredictability. Stanford Humanities Review, 4(2). Retrieved from http://web.stanford.edu/group/SHR/4-2/text/boden.html

Boden, M. (2004). The creative mind: myths and mechanisms. London and New York: Routledge.

Bogost, I. (2007). Persuasive games: the expressive power of videogames. Cambridge, MA: The MIT Press.

Boocock, S. S. (1996). Games with simulated environments: educational innovation and applied sociological research. In J. Clark (Ed.), James S. Coleman (pp. 133-146). London: Falmer Press.

Borko, H. (1966). Utilization of on-line interactive displays. Retrieved from http://www.dtic.mil/docs/citations/AD0640652

Brainerd, J. G., and Sharpless, T. K. (1948). The ENIAC. Electrical Engineering, 67(2), 163- 172.

Brand, S. (1972, 7 December). Spacewar: fanatic life and symbolic death among the computer bums. Rolling Stone, 50-57. Reprinted at http://www.wheels.org/spacewar/stone/rolling_stone.html

Brig, E. (1988). BASIC. In Encyclopedia of Microcomputers. (Vol. 2, pp. 133-153). New York: Marcel Dekker.

Callon, M. (1986). The sociology of an actor-network: the case of the electric vehicle. In M. Callon, J. Law, & A. Rip (Eds), Mapping the Dynamics of Science and Technology (pp. 19-34). London: Macmillan Press.

Campbell-Kelly, M., Aspray, W., Ensmenger, N., and Yost, J. (2014). Computer: a history of the information machine. 3rd edition. Boulder, CO: Westview Press.

Cassell, J., and Jenkins, H. (Eds.). (1998). From Barbie to Mortal Kombat: gender and computer games. Cambridge, MA: The MIT Press.

Card, O. S. (1983, July). Constructing the ideal computer game, part I. Compute!, 5(7), 30-40.

Carmichael, D. (1983). VIC features: color, graphics, sound, etc. In Compute!'s first book of VIC games (pp. 3-8). Greensboro, NC: Compute! Publications.

Ceruzzi, P. E. (2003). A history of modern computing. 2nd edition. Cambridge, MA: The MIT Press.

Chassell, R. J. (2009). An introduction to programming in Emacs LISP. Rev. 3rd ed. Boston: GNU Press.

Chun, W. H. K. (2005). On software, or the persistence of visual knowledge. Grey Room, 18, 26- 51.

Chun, W. H. K. (2011). Programmed visions: software and memory. Cambridge, MA: The MIT Press.

Cohen, K. J., and Rhenman, E. (1961). The role of management games in education and research. Management Science, 7(2), 131-166.

Cohen, K. J., Cyert, R. M., Dill, W. R., Kuehn, A. A., Miller, M. H., Van Wormer, T. A., and Winters, P. R. (1960). The Carnegie Tech-Management Game. The Journal of Business, 33(4), 303-321.

Collins, H. M. (1983). The sociology of scientific knowledge: studies of contemporary science. Annual Review of Sociology, 9, 265-285.


Collins, H. M., & Yearley, S. (1992). Epistemological chicken. In A. Pickering (Ed.), Science as Practice and Culture (pp. 301-326). Chicago: University of Chicago Press.

Corbató, F. J., Merwin-Daggett, M., and Daley, R. C. (1962). An experimental time-sharing system. Proceedings of the May 1-3, 1962, Spring Joint Computer Conference (pp. 335- 344). New York: Association for Computing Machinery.

Corbató, F. J., Merwin-Daggett, M., Daley, R. C., Creasy, R. J., Hellwig, J. D., Orenstein, R. H., and Korn, L. K. (1963). The Compatible Time-Sharing System: a programmer's guide. Cambridge, MA: The MIT Press.

Cory, C. (1966, September 30). Kids with the problems of kings. LIFE Magazine, 61(14), R2, R6.

Costikyan, G. (2002). I have no words & I must design: toward a critical vocabulary for games. In F. Mäyrä (Ed.), Proceedings of Computer Games and Digital Cultures Conference (pp. 9-33). Tampere, Finland: Tampere University Press.

Cushen, W. E. (1955). War games and operations research. Philosophy of Science, 22(4), 309- 320.

Dahl, O.-J., and Nygaard, K. (1966). SIMULA – an ALGOL-based simulation language. Communications of the ACM, 9(9), 671-678.

Dale, N. B., and Weems, C. (1997). Introduction to Pascal and structured design. 4th edition. Sudbury, MA: Jones and Bartlett Publishers.

Davidson, D. (2008). Interpreting Prince of Persia: The Sands of Time. Games and Culture, 3(3- 4), 356-386.

Day, R. P., and Lymberopolous, P. J. (1961). The decision game: progress and prospects. The Southwestern Social Science Quarterly, 42(3), 250-258.

Dill, W. R. (1961). What management games do best. Business Horizons, 4(3), 55-64.

Donat, W. (2014). Learn Raspberry Pi programming with Python. New York City: Apress.

Donovan, T. (2010). Replay: the history of video games. Lewes, UK: Yellow Ant.

Dougherty, D. (2012). The maker movement. innovations, 7(3), 11-14.

Douglas, D. (2010). MIT and war. In D. Kaiser (Ed.), Becoming MIT: moments of decision (pp. 81-102). Cambridge, MA: The MIT Press.

Douglass, J. (2007). Command lines: aesthetics and technique in interactive fiction and new media (doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Global. (Order Number 3291315)

Dredge, S. (2014, September 4). Coding at school: a parent's guide to England's new computing curriculum. The Guardian. Retrieved from http://www.theguardian.com/technology/2014/sep/04/coding-school-computing- childrenprogramming

Edwards, B. (2009, July 19). Forty years of Lunar Lander. Technologizer. Retrieved from http://www.technologizer.com/2009/07/19/lunar-lander/

Edwards, P. N. (1990). The Army and the microworld: computers and the politics of gender identity. Signs, 16(1), 102-127.

Edwards, P. N. (1996). The closed world: computers and the politics of discourse in Cold War America. Cambridge, MA: The MIT Press.

Eilon, S. (1963). Management games. Operational Research Quarterly, 14(2), 137-149.


Engelbart, D. C. (1962). Augmenting human intellect: a conceptual framework. Menlo Park, CA: Stanford Research Institute.

Ensmenger, N. L. (2010). The computer boys take over: computers, programmers, and the politics of technical expertise. Cambridge, MA: The MIT Press.

Expert Panel on STEM Skills for the Future. (2015). Some assembly required: STEM skills and Canada's economic productivity. Retrieved from http://www.scienceadvice.ca/uploads/ENG/AssessmentsPublicationsNewsReleases/STE M/STEMFullReportEn.pdf

Faria, A. J., and Nulsen, R. (1996). Business simulation games: current usage levels; a ten year update. Developments in Business Simulation & Experiential Exercises, 23, 22-28.

Farish, M. (2010). The contours of America's Cold War. Minneapolis: University of Minnesota Press.

Fass, V. H. (1969). Conversational computing…how to achieve a satisfactory interactive relationship with a machine. Electronics & Power, 15(2), 46-51.

Featherstone, D. (1973). Solo-wargaming. London: Kaye & Ward.

Feldt, A. G. (1966). Operational gaming in planning education. Journal of the American Institute of Planners, 32(1), 17-23.

Ferguson, R. L., and Jones, C. H. (1969). A computer aided decision system. Management Science, 15(10), B-550 - B-561.

Fong, P. (n.d.). LISP tutorial 1: basic LISP programming. Simon Fraser University. Retrieved from http://www.cs.sfu.ca/CourseCentral/310/pwfong/Lisp/1/tutorial1.html


Forbes, J. (1965). Operational gaming and decision simulation. Journal of Educational Measurement, 2(1), 15-18.

Forsberg, E. (1969). Operational gaming for vocational awareness: a survey. Retrieved from http://eric.ed.gov/?id=ED030894

Franzosi, R. (2008). Content analysis. London: SAGE Publications.

Fuller, M. (2003). Behind the clip: essays on the culture of software. Brooklyn, NY: Autonomedia.

Fuller, M. (2008). Introduction: the stuff of software. In M. Fuller (Ed.), Software Studies: A Lexicon (pp. 1-13). Cambridge, MA: The MIT Press.

Games, I. A. (2010). Gamestar Mechanic: learning a designer mindset through communicational competence with the language of games. Learning, Media and Technology, 35(1), 31-52.

Geisler, M. A., and Ginsberg, A. S. (1965). Man-machine simulation experience. Santa Monica, CA: The RAND Corporation.

Ghamari-Tabrizi, S. (2000). Simulating the unthinkable: gaming future wars in the 1950s and 1960s. Social Studies of Science, 30(2), 163-223.

Glass, R. L. (2005). The plot to deskill software engineering. Communications of the ACM, 48(11), 21-24.

Goldberg, A. (1983). Smalltalk-80: the interactive programming environment. Reading, MA: Addison-Wesley Publishing Company.

Goldberg, A., and Robson, D. (1983). Smalltalk-80: the language and its implementation. Reading, MA: Addison-Wesley Publishing Company.


Goldberg, A., and Ross, J. (1981, August). Is the Smalltalk-80 system for children? BYTE, 6(8), 348-368.

Goldstein, J. R. (1961). RAND: the history, operations, and goals of a nonprofit organization. Santa Monica, CA: The RAND Corporation.

Graetz, J. M. (1981). The origin of Spacewar. Creative Computing, 7(8), 56-67.

Graham, P. (2004). Hackers & painters: big ideas from the computer age. Sebastopol, CA: O'Reilly Media.

Grier, D. A. (1997). Gertrude Blanch of the Mathematical Tables Project. IEEE Annals of the History of Computing, 17(4), 18-27.

Grier, D. A. (1998). The Math Tables Project of the Work Projects Administration: The reluctant start of the computing era. IEEE Annals of the History of Computing, 20(3), 33-50.

Grier, D. A. (2005). When computers were human. Princeton and Oxford: Princeton University Press.

Grint, K., & Woolgar, R. (1997). The machine at work: technology, work, and organization. Cambridge, UK: Polity Press.

Harbour, J. (2014). Video game programming for kids. 2nd edition. Boston: Course Technology.

Harrison, Jr., J. O. (1964). Computer-aided information systems for gaming. McLean, VA: Research Analysis Corporation.

Hayles, N. K. (2005). My mother was a computer: digital subjects and literary texts. Chicago and London: The University of Chicago Press.

Helmer, O. (1960). Strategic gaming. Santa Monica, CA: The RAND Corporation.

Helmer, O., and Quade, E. S. (1963). An approach to the study of a developing economy by operational gaming. Santa Monica, CA: The RAND Corporation.

Hiltzik, M. A. (1999). Dealers of lightning: Xerox PARC and the dawn of the computer age. New York: HarperBusiness.

Hoag, M. W. (1956). An introduction to systems analysis. Santa Monica, CA: The RAND Corporation.

Hoffman, T. R. (1965). Programmed heuristics and the concept of par in business games. Behavioral Sciences, 10(2), 169-172.

Howarth, L., and Evans, C. (1984). Write your own fantasy games for your microcomputer. London: Usborne Publishing.

Huang, Y. (2014). Pragmatics. 2nd edition. Oxford: Oxford University Press.

Hughes, T. (1998). Rescuing Prometheus: four monumental projects that changed our world. New York: Pantheon Books.

Hunter, D. (1983, March). The roots of DOS: Tim Paterson. Softalk for the IBM Personal Computer, 1(10), 32-41.

Hunter, L. (1998). Exploring the cybertext: literary criticism and the reader [Review of Cybertext: perspectives on ergodic literature, by E. Aarseth]. Convergence, 4(3), 100- 102.

Hurd, C. C. (1985). A note on early Monte Carlo computations and scientific meetings. IEEE Annals of the History of Computing, 7(2), 141-155.


Hurst, J., Mahoney, M. S., Taylor, N. H., Ross, D. T., and Fano, R. M. (1989). Retrospectives I: the early years in computer graphics at MIT, Lincoln Lab, and Harvard. In ACM SIGGRAPH (Association for Computing Machinery, Special Interest Group on GRAPHics and Interactive Techniques), SIGGRAPH '89, Boston, 31 July-4 August 1989.

Isaaman, D., and Tyler, J. (1982). Computer space games. London: Usborne Publishing.

James, J. (2010). New technology in developing countries: a critique of the One-Laptop-per- Child program. Social Science Computer Review, 28(3), 381-390.

Joerges, B. (1999). Do politics have artifacts? Social studies of science, 29(3), 411-431.

Johnson, S. B. (2002). The United States Air Force and the culture of innovation, 1945-1965. Washington, DC: Air Force History and Museums Program.

Johnstone, B. (2003). Never mind the laptops: kids, computers, and the transformation of learning. New York: iUniverse.

Jones, S. E. (2008). The meaning of video games: gaming and textual strategies. New York: Routledge.

Ju, W. (2008). The mouse, the demo, and the big idea. In T. Erickson, and D. W. McDonald (Eds.), HCI Remixed: essays and works that have influenced the HCI community (pp. 29- 33). Cambridge, MA: The MIT Press.

Kahn, H. (1949). Stochastic (Monte Carlo) attenuation analysis. Santa Monica, CA: The RAND Corporation.

Kahn, H. (1953). Methods of reducing sample size in Monte Carlo computations. Santa Monica, CA: The RAND Corporation.


Kahn, H, and Mann, I. (1957). Techniques of systems analysis. Santa Monica, CA: The RAND Corporation.

Kaiser, D. (2010). Elephant on the Charles: postwar growing pains. In D. Kaiser (Ed.), Becoming MIT: moments of decision (pp. 103-121). Cambridge, MA: The MIT Press.

Kaplan, F. (1983). The wizards of Armageddon. Stanford, CA: Stanford University Press.

Kaufman, D. (1973, May). Lost in the caves. People's Computer Company, 1(5), 4.

Kay, A. C. (1969). The reactive engine (doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Global. (Order Number 7003806)

Kay, A. C. (1972). A personal computer for children of all ages. Proceedings of the ACM Annual Conference (Article No. 1, pp. 1-11). New York: Association for Computing Machinery.

Kay, A. C. (1993). The early history of Smalltalk. Proceedings of the Second ACM SIGPLAN Conference on History of Programming Languages (pp. 69-95). New York: Association for Computing Machinery.

Kay, A., and Goldberg, A. (1977/2003). Personal dynamic media. In N. Montfort, and N. Wardrip-Fruin (Eds.), The New Media Reader (pp. 391-404). Cambridge, MA: The MIT Press.

Kay, A., Ingalls, D., Ohshima, Y., Piumarta, I., and Raab, A. (2006). Steps toward the reinvention of programming: a compact and practical model of personal computing as a self-exploratorium. Retrieved from http://worrydream.com/refs/Kay%20- %20NSF%20proposal.pdf

Keizer, G. (1987). More amazing games. In COMPUTE!'s More Machine Language games for the Commodore 64 (pp. 3-5). Greensboro, NC: COMPUTE! Publications.


Keizer, G. (1988, May). Editorial license. COMPUTE!, 10(5), 4.

Kelleher, C., Pausch, R., and Kiesler, S. (2007). Storytelling Alice motivates middle school girls to learn computer programming. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1455-1464). New York: Association for Computing Machinery.

Kemeny, J. G., and Kurtz, T. E. (1985). Back to BASIC: the history, corruption, and future of the language. Reading, MA: Addison-Wesley Publishing Company.

Kerr, A. (2006). The business and culture of digital games: gamework and gameplay. London: SAGE Publications.

Keys, J. B., Fulmer, R. M., and Stumpf, S. A. (1996). Microworlds and simuworlds: practice fields for the learning organization. Operational Dynamics, 24(4), 36-49.

Kind, A., and Padget, J. A. (1999). Towards meta-agent protocols. In J. A. Padget (Ed.), Collaboration Between Human and Artificial Societies: Coordination and Agent-Based Distributive Computing (pp. 30-42). Berlin and Heidelberg: Springer-Verlag.

Kitchin, R., and Dodge, M. (2011). Code/space: software and everyday life. Cambridge, MA: The MIT Press.

Kohli, S. (2015, May 14). The economic importance of teaching coding to teens. The Atlantic. Retrieved from http://www.theatlantic.com/education/archive/2015/05/the-economic- importance-of-teaching-coding-to-teens/393263/

Kölling, M. (2010). The Greenfoot programming environment. ACM Transactions on Computing Education, 10(4), article 14.

Kraft, P. (1977). Programmers and managers: the routinization of computer programming in the United States. New York: Springer-Verlag.

Krzywinska, T. (2006). The pleasures and dangers of the game: up close and personal. Games and Culture, 1(1), 119-122.

Kuhn, T. S. (1962/1996). The structure of scientific revolutions. 3rd edition. Chicago: The University of Chicago Press.

Lampson, B. (1983/1986). A description of the Cedar language. Retrieved from http://research.microsoft.com/en-us/um/people/blampson/32a-CedarLang/32a- CedarLangAbstract.htm

Latour, B. (1983). Give me a laboratory and I will move the world. In K. Knorr-Cetina, and M. Mulkay (Eds.), Science Observed: Perspectives on the Social Study of Science (pp. 141- 169). London: Sage.

Latour, B. (2005). Reassembling the social: an introduction to actor-network theory. Oxford: Clarendon Press.

Law, J. (1992). Notes on the theory of the actor-network: ordering, strategy, and heterogeneity. Systems practice, 5(4), 379-393.

Leslie, S. W. (1993). The Cold War and American science: the military-industrial-academic complex at MIT and Stanford. New York: Columbia University Press.

Leslie, S. W. (2010). "Time of troubles" for the special laboratory. In D. Kaiser (Ed.), Becoming MIT: moments of decision (pp. 123-143). Cambridge, MA: The MIT Press.

Levy, S. (1984/2010). Hackers: heroes of the computer revolution. Garden City, NY: Anchor Press/Doubleday.

Licklider, J. C. R. (1960). Man-computer symbiosis. IRE Transactions on Human Factors in Electronics, 1, 4-11.

Light, J. S. (1999). When computers were women. Technology and Culture, 40(3), 455-483.

Lomas, N. (2015, Feb. 7). Raspberry Pi sales pass 5 million. TechCrunch. Retrieved from http://techcrunch.com/2015/02/17/raspberry-pi-sales-pass-5-million/

Lowen, R. S. (1997). Creating the Cold War university: the transformation of Stanford. Berkeley and Los Angeles: University of California Press.

Lowood, H. (2006). A brief biography of computer games. In P. Vorderer and J. Bryant (Eds.), Playing Video Games: Motives, Responses and Consequences (pp. 25-41). Hillsdale, NJ: Lawrence Erlbaum Associates.

Lowood, H. (2009). Videogames in Computer Space: the complex history of Pong. Annals of the History of Computing, IEEE, 31(3), 5-19.

Mace, S. (1984, April 9). Atarisoft vs. Commodore. InfoWorld, 6(15), 50.

Mackenzie, A. (2006b). Cutting code: software and sociality. New York: Peter Lang.

MacKenzie, D., and Wajcman, J. (1999). Introductory essay: the social shaping of technology. In MacKenzie and Wajcman (Eds.), The Social Shaping of Technology, Second Edition (pp. 3-27). Buckingham, UK: Open University Press.

Maher, J. (2012). The future was here: the Commodore Amiga. Cambridge, MA: The MIT Press.

Mahoney, M. S. (2005). The histories of computing(s). Interdisciplinary Science Reviews, 30(2), 119-135.

Maloney, J., Resnick, M., Rusk, N., Silverman, B., and Eastmond, E. (2010). The Scratch programming language and environment. ACM Transactions on Computing Education, 10(4), Article 16, 1-15.

Manovich, L. (2001). The language of new media. Cambridge, MA: The MIT press.

Marino, M. C. (2006, December 4). Critical code studies. Electronic Book Review. Retrieved from http://www.electronicbookreview.com/thread/electropoetics/codology

Marino, M. C. (2013). Reading exquisite_code: critical code studies of literature. In Hayles, N. K., and Pressman, J. (Eds.), Comparative Textual Media: Transforming the Humanities in the Postprint Era (pp. 283-310). Minneapolis: University of Minnesota Press.

Markoff, J. (2005). What the dormouse said: how the sixties counterculture shaped the personal computer industry. New York: Penguin Books.

Marks, S. L. (1982). JOSS – conversational computing for the nonprogrammer. Annals of the History of Computing, 4(1), 35-52.

Mauchly, J. W. (1942/1982). The use of high speed vacuum tube devices for calculating. In D. Gries (Ed.), The Origins of Digital Computers: Selected Papers, 3rd edition (pp. 355- 358). Berlin and Heidelberg: Springer-Verlag.

Maxwell, J. W. (2007). Tracing the Dynabook: A study of technocultural transformations (doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Global. (Order Number NR26757)

McCarty, W. (2005). Humanities computing. London and New York: Palgrave.

McCarthy, J. (1978). History of LISP. ACM SIGPLAN Notices, 13(8), 217-223.

McCullough, M. (1996). Abstracting craft: the practiced digital hand. Cambridge, MA: The MIT Press.


McEvoy, S., and Smith, L. (1985). Save the Venturians! New York: Dell Publishing.

McGrath, A. E. (2011). Christian theology: an introduction. Chichester, UK: Wiley-Blackwell.

McKenney, J. L., and Dill, W. R. (1966). Influences on learning in simulation games. American Behavioral Scientist, 10, 28-32.

Mearls, M. (2014, June 26). A living rule set [Blog post]. Retrieved from http://dnd.wizards.com/articles/features/living-rule-set

Metropolis, N. (1987). The beginning of the Monte Carlo method. Los Alamos Science, 15, 125- 130.

Mitchell, J. G. (1970). The design and construction of flexible and efficient interactive programming systems (doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Global. (Order Number 7104538)

Mitchell, J. G., Maybury, W., and Sweet, R. (1979). Mesa language manual: version 5.0. Palo Alto, CA: Xerox Corporation.

Montfort, N. (2003). Twisty little passages: an approach to interactive fiction. Cambridge, MA: The MIT Press.

Montfort, N., and Bogost, I. (2009). Racing the beam: the Atari video computer system. Cambridge, MA: The MIT Press.

Montfort, N., Baudoin, P., Bell, J., Bogost, I., Douglass, J., Marino, M. C., Mateas, M., Reas, C., Sample, M., and Vawter, N. (2013). 10 PRINT CHR$(205.5+RND(1)); : GOTO 10. Cambridge, MA: The MIT Press.

Morgan, N. (2014). Javascript for kids: a playful introduction to programming. San Francisco: No Starch Press.

Morse, P. M. (1948). Mathematical problems in operations research. Bulletin of the American Mathematical Society, 54(7), 602-621.

Moses, J. (1970). The function of FUNCTION in LISP, or why the FUNARG problem should be called the environment problem. Cambridge, MA: Massachusetts Institute of Technology, Project MAC.

Murdoch, J. (2001). Ecologising sociology: actor-network theory, co-constructionism, and the problem of human exemptionalism. Sociology, 35(1), 111-133.

Naur, P., and Randell, B. (Eds.) (1969). Software engineering: report of a conference sponsored by the NATO Science Committee. Brussels: NATO Scientific Affairs Division.

Newell, A. (1954). The chess machine: an example of dealing with a complex task by adaptation. Santa Monica, CA: The RAND Corporation.

Newell, A., and Tonge, F. M. (1960). An introduction to Information Processing Language V. Santa Monica, CA: The RAND Corporation.

Nord, C., Roey, S., Perkins, S., Lyons, M., Lemanski, N., Schuknecht, J., and Brown, J. (2011). America's high school graduates: results of the 2009 NAEP high school transcript study. Retrieved from https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2011462

Northrop, G. M. (1967). Use of multiple on-line, time-shared computer consoles in simulation and gaming. Santa Monica, CA: The RAND Corporation.

Noyes, D. (1992). Artificial intelligence with Common Lisp: fundamentals of symbolic and numeric processing. Lexington, MA: D. C. Heath and Company.

Office Systems Division, Xerox Corporation (1984). Xerox Development Environment: concepts and principles. Palo Alto, CA: Xerox Corporation.

Oliveira, M. (2014, April 30). Code kiddies: Canadian elementary schools teach computer programming. The Globe and Mail. Retrieved from http://www.theglobeandmail.com/technology/tech-news/code-kiddies- canadianelementary-schools-teach-computer-programming/article18337552

Orpwood, G., Schmidt, B., and Jun, H. (2012). Competing in the 21st century skills race. Retrieved from http://www.ceocouncil.ca/wp-content/uploads/2012/07/Competing-in- the-21st-Century-Skills-Race-Orpwood-Schmidt-Hu-July-2012-FINAL.pdf

Orr, W. D. (1968). Conversational computers. New York: John Wiley & Sons.

Orwell, G. (1949/1989). 1984. Harmondsworth, UK: Penguin Books.

Owens, T. (2011). Social videogame creation: lessons from RPG Maker. On the Horizon, 19(1), 52-61.

Oxford, N. (2011). Ten facts about the great video games crash of '83. IGN. Retrieved from http://www.ign.com/articles/2011/09/21/ten-facts-about-the-great-video-game-crash-of- 83

Papadopoulos, G. (2005, November 11). Don't become Moore confused (or, the death of the microprocessor, not Moore's Law) [Blog post]. Retrieved from https://blogs.oracle.com/Gregp/entry/don_t_become_moore_confused

Papert, S. (1980). Mindstorms: children, computers, and powerful ideas. New York: Basic Books.

Papert, S. (1993). The children's machine: rethinking school in the age of the computer. New York: Basic Books.


Parsons, E. H. (2002). Structured programming with COBOL examples. Lincoln, NE: Writer's Showcase.

Paxson, E. W. (1963). War gaming. Santa Monica, CA: The RAND Corporation.

Peppler, K. A., and Kafai, Y. B. (2010). Gaming fluencies: pathways into a participatory culture in a community design studio. International Journal of Learning and Media, 1(4), 1-14.

Perla, P. P. (1990). The art of wargaming: a guide for professionals and hobbyists. Annapolis, MD: Naval Institute Press.

Philbin, C. A. (2014). Adventures in Raspberry Pi. Chichester, UK: John Wiley & Sons.

Pinch, T. J., and Bijker, W. E. (1984). The social construction of facts and artefacts: or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science, 14(3), 399-441.

Polachek, H. (1997). Before the ENIAC. IEEE Annals of the History of Computing, 19(2), 25-30.

Pournelle, J. (1989, January). To the stars. BYTE 14(1), 109-124.

Rampell, C. (2014, November 24). Chicago schools add computer science to the core curriculum. The Washington Post. Retrieved from https://www.washingtonpost.com/opinions/catherine-rampell-chicago-schools-add- computer-science-to-the-core-curriculum/2014/11/24/037c78f0-7417-11e4-a5b2- e1217af6b33d_story.html

Redmond, K. C., and Smith, T. H. (1980). Project Whirlwind: The History of a Pioneer Computer. Bedford, MA: Digital Press.

Redmond, K. C., and Smith, T. H. (2000). From Whirlwind to MITRE: The R&D story of the SAGE air defense computer. Cambridge, MA: The MIT Press.

Rehak, B. (2008). The rise of the home computer. In M. J. P. Wolf (Ed.), Video Game Explosion: A History from Pong to Playstation and Beyond (pp. 75-80). Westport, CT and London: Greenwood Press.

Renshaw, J. R., and Heuston, A. (1960). The game Monopologs. Santa Monica, CA: The RAND Corporation.

Resnick, M. (2012). Mitch Resnick: Let's teach kids to code [Video file]. Retrieved from http://www.ted.com/talks/mitch_resnick_let_s_teach_kids_to_code

Resnick, M., and Rosenbaum, E. (2013). Designing for tinkerability. In M. Honey, and D. E. Kanter (Eds.), Design, Make, Play: Growing the Next Generation of STEM Innovators (pp. 163-181). New York: Routledge.

Resnick, M, Maloney, J., Monroy-Hernández, A., Rusk, N., Eastmond, E., Brennan, K., Millner, A., Rosenbaum, E., Silver, J., Silverman, B., and Kafai, Y. (2009). Scratch: programming for all. Communications of the ACM, 52(11), 60-67.

Ricciardi, F. M., Craft, C. J., Malcolm, D. G., Bellman, R., Clark, C., Kibbee, J. M., & Rawdon, R. H. (1957). Top Management Decision Simulation: the AMA approach. New York: American Management Association.

Richards, G., Hammer, C., Burg, B., Vitek, J. (2011). The eval that men do: A large-scale study of the use of eval in Javascript applications. In ECOOP'11: Proceedings of the 25th European conference on Object-oriented programming (pp. 52-78). Berlin and Heidelberg: Springer-Verlag.

Rochberg, R. (1967). STROP: player's manual for JOSS version. Santa Monica, CA: The RAND Corporation.

Rohde, J. (2014). GameMaker: Studio for dummies. New York: John Wiley & Sons.

Salen, K., and Zimmerman, E. (2003). Rules of play: game design fundamentals. Cambridge, MA: The MIT Press.

Salter, A., and Murray, J. (2014). Flash: building the interactive web. Cambridge, MA: The MIT Press.

Sammet, J. E. (1969). Programming languages: history and fundamentals. Englewood Cliffs, NJ: Prentice-Hall.

Sandewall, E. (1978). Programming in an interactive environment: the LISP experience. Computing Surveys, 10(1), 35-71.

Schön, D. A. (1983). The reflective practitioner: how professionals think in action. New York: Basic Books.

Schön, D. A. (1988). Designing: rules, types and worlds. Design Studies, 9(3), 181-190.

Schön, D. A. (1992). Designing as reflective conversation with the materials of a design situation. Research in Engineering Design, 3(3), 131-147.

Servan-Schreiber, J.-J. (1980). The world challenge. New York: Simon and Schuster.

Servomechanisms Laboratory, Massachusetts Institute of Technology. (1946). Summary report no. 1. Cambridge, MA. Retrieved from http://dome.mit.edu/handle/1721.3/40719

Servomechanisms Laboratory, Massachusetts Institute of Technology. (1947). Digital computation for anti-submarine problem. Cambridge, MA. Retrieved from http://dome.mit.edu/handle/1721.3/45940


Servomechanisms Laboratory, Massachusetts Institute of Technology. (1949). Summary Report No. 20: Third Quarter, 1949. Cambridge, MA. Retrieved from http://dome.mit.edu/handle/1721.3/45940

Shrader, C. R. (2006). History of operations research in the United States Army: volume 1, 1942- 62. Washington, DC: Office of the Deputy Under Secretary of the Army for Operations Research, United States Army.

Shubik, M. (1968). Gaming: costs and facilities. Management Science, 14(11), 629-660.

Shurkin, J. (1984). Engines of the mind: a history of the computer. New York and London: W. W. Norton & Company.

Slayton, R. (2013). Arguments that count: physics, computing, and missile defense, 1949-2012. Cambridge, MA: The MIT Press.

Smith, B. L. R. (1966). The RAND Corporation: case study of a nonprofit advisory corporation. Cambridge, MA: Harvard University Press.

Smith, J. S. (2005). Building New Deal liberalism: the political economy of public works, 1933- 1956. Cambridge, MA: The MIT Press.

Snodgrass, A., & Coyne, R. (2006). Interpretation in architecture: design as a way of thinking. London: Routledge.

Stangler, D., and Maxwell, K. (2012). DIY producer society. Innovations, 7(3), 3-10.

Steele, Jr., G. L., and Gabriel, R. P. (1993). The evolution of Lisp. Retrieved from http://www.dreamsongs.com/Files

Stephenson, N. (1999). In the beginning…was the command line. New York: HarperCollins.


Stern, N. (1981). From ENIAC to UNIVAC: an appraisal of the Eckert-Mauchly Computers. Bedford, MA: Digital Press.

Sutherland, I. (1963). Sketchpad: a man-machine graphical communication system (doctoral dissertation). Retrieved from http://worrydream.com/refs

Sweigart, A. (2012). Invent your own computer games with Python. 2nd edition. Retrieved from http://inventwithpython.com/

Swinehart, D. C. (1974). Copilot: a multiple-process approach to interactive programming systems (doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Global. (Order Number 7420241)

Szczepaniak, J. (2012, August). A basic history of BASIC. Game Developer Magazine, 19(8), 35-39.

Teitelman, W. (2008). History of Interlisp. Proceedings of LISP 50: Celebrating the 50th Anniversary of Lisp (Article No. 5, pp. 1-5). New York: Association for Computing Machinery.

Tucker, S. (2012). Early online gaming: BBSs and MUDs. In M. J. P. Wolf (Ed.), Before the Crash: Early Video Game History (pp. 209-224). Detroit: Wayne State University Press.

Upton, E., and Halfacree, G. (2014). Raspberry Pi user guide. 2nd edition. Chichester, UK: John Wiley & Sons.

Venners, B. (2003, Jan. 13). The making of Python: a conversation with Guido van Rossum, part I. Retrieved from http://www.artima.com/intv/pythonP.html

Von Neumann, J. (1945). First draft of a report on the EDVAC. Philadelphia, PA: Moore School of Electrical Engineering, University of Pennsylvania.


Wadsworth, B. J. (2003). Piaget's theory of cognitive and affective development: foundations of constructivism, 5th ed. Upper Saddle River, NJ: Allyn & Bacon.

Waks, L. J. (2001). Donald Schön's philosophy of design and design education. International Journal of Technology and Design Education, 11(1), 37-51.

Walden, D. (2011). Early years of basic computer and software engineering. In D. Walden, and N. Nickerson (Eds.), A Culture of Innovation: Insider Accounts of Computing and Life at BBN (pp. 51-68). East Sandwich, MA: Waterside Publishing.

Walden, D., and Vleck, T. V. (Eds.) (2011). The Compatible Time-Sharing System (1961-1973): fiftieth anniversary commemorative overview. Washington, DC: IEEE Computer Society.

Waldrop, M. M. (2001). The dream machine: J. C. R. Licklider and the revolution that made computing personal. New York: Viking.

Wardrip-Fruin, N. (2009). Expressive processing: digital fictions, computer games and software. Cambridge, MA: The MIT Press.

Ware, W. H. (1966). JOHNNIAC eulogy. Santa Monica, CA: The RAND Corporation.

Ware, W. H. (2008). RAND and the information evolution: a history in essays and vignettes. Santa Monica, CA: The RAND Corporation.

Weizenbaum, J. (1976). Computer power and human reason: from judgment to calculation. New York and San Francisco: W. H. Freeman and Company

Wells, H. G. (1913). Little wars: a game for boys from twelve years of age to one hundred and fifty and for that more intelligent sort of girls who like boys' games and books. London: Frank Palmer.


Welsh, T., and Welsh, D. (2011). Priming the pump: how TRS-80 enthusiasts helped spark the PC revolution. Ferndale, MI: The Seeker Books.

Wildes, K. L., and Lindgren, N. A. (1985). A century of electrical engineering and computer science at MIT, 1882-1982. Cambridge, MA: The MIT Press.

Wilson, C., Sudol, L. A., Stephenson, C., and Stehlik, M. (2010). Running on empty: the failure to teach K–12 computer science in the digital age. New York: Association for Computing Machinery.

Wing, R. L. (1966). Two computer-based economics games for sixth graders. American Behavioral Scientist, 10(3), 31-35.

Wing, R. L., Addis, M., Goodman, W., Leonard, J., and McKay, W. (1967). The production and evaluation of three computer-based economics games for the sixth grade. Retrieved from http://eric.ed.gov/?id=ED014227

Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121-136.

Winner, L. (1993). Upon opening the black box and finding it empty: social constructivism and the philosophy of technology. Science, Technology, & Human Values, 18(3), 362-378.

Wolf, M. J. P. (2008). Genre profile: adventure games. In M. J. P. Wolf (Ed.), Video Game Explosion: A History from Pong to Playstation and Beyond (pp. 81-88). Westport, CT and London: Greenwood Press.

Wolf, M. J. P. (2012). Building imaginary worlds: the theory and history of subcreation. New York: Routledge.

Wolf, M. J. P., and Perron, B. (2003). Introduction. In M. J. P. Wolf, and B. Perron (Eds.), The Video Game Theory Reader. London and New York: Routledge.


Woods, S. (2004). Loading the dice: the challenge of serious videogames. Game Studies, 4(1). Retrieved from http://www.gamestudies.org/0401/woods/

Woolgar, S. (1991). The turn to technology in social studies of science. Science, Technology and Human Values, 16(1), 20-50.

Yob, G. (1975, Sept.-Oct.). Hunt the Wumpus. Creative Computing, 1(5), 51-54.

Yob, G. (1979). Wumpus 1, Wumpus 2. In D. Ahl (Ed.), More BASIC Computer Games (pp. 178-184). New York: Workman Publishing.

Zimmerman, R. E. (1956). Monte Carlo computer war gaming (U): a feasibility study. Chevy Chase, MD: Operations Research Office, The Johns Hopkins University.

Appendix A: Using Hail-Workshop

Hail Workshop is a desktop programming environment for the Hail language. The Hail system was inspired by Smalltalk, though it is much simpler (the word "hail" is a distant cognate of the term "small talk", hence the name). The user codes and runs commands from the same environment, and a persistent image may be saved and loaded. Hail is modeless, meaning that programs are never "executed": rather, the same prompt is used both to send commands to the system and to send requested information to programs. Hail Workshop runs on top of the Electron framework developed by GitHub.73 The language parser was built using the Jison utility developed by Zach Carter.74

A.1 Hail

Hail is a very simple and straightforward programming language that borrows elements from Pascal and BASIC, while also exhibiting unique structural elements. If you've ever coded in one of these languages - or a more complex language such as C or Java - you should have no problem at all picking up Hail. The language will be explained in more detail in subsequent sections.

A.1.1 Objects and Subroutines

The key components of the Hail Workshop system are objects and subroutines.

Objects are collections of pre-defined variables that may be instantiated in the subroutines or on the terminal command line. For more on objects, read through section 6.3.2 in the present study.

Subroutines are discrete units of code that are generally called to modify variable values, to modify object variable values, and to print information on the screen for the user. Once you have defined a set of objects, most of your work will involve the creation of subroutines.

73 http://electron.atom.io 74 http://zaa.ch/jison/


A.1.2 Architecture

As already noted, Hail Workshop runs on the Electron framework. This means that it was coded entirely in Javascript, using node.js routines where necessary. Electron also provides full file I/O dialog box support, which is highly useful. The Hail parser was generated using Jison, so a separate Javascript file ("hail.js") handles language parsing.

A.1.3 Licence

Hail Workshop, including all the code listed here, is licensed under an MIT License.

A.2 Getting Started

The Hail Workshop is a programming environment that contains both code and program output. When you build and execute programs, you never leave the Workshop environment. Variable values also persist globally, so that you can use and reuse the same variables in different code. The environment can also be saved and loaded as files called images, so that you can pick up where you left off whenever you start a new session.

When you start your first session of Hail Workshop, it will look something like figure A-1, below.

Figure A-1. The Hail Workshop programming environment.

Note that there are three "windows" on the screen. Specifically, there is one terminal window (on the upper left), one output window (on the lower left), and one code box window (on the right).

A terminal window is the primary vehicle for user interactivity. Similar to a command-line interface, each terminal provides a prompt where the user can type in and execute commands. The user has access to the entire Hail language from the prompt, and may create variables, draw new windows, run code, and perform many similar functions. The output generated by code from a code box window may also be displayed in the current terminal window.

An output window is a dedicated space for program output. While the terminal may be used for such purposes, the output window is more versatile and robust. Output in the terminal window may be lost as it scrolls upward and out of the window space, but in an output window, information is easier to retain and manage.

A code box window is where users build Hail programs. Unlike in a terminal, code is not executed directly in a code box. This allows for the development of long, complex programs that may then be run via a terminal window.

A.2.1 Focus

To use a specific window, you have to put it in focus by clicking on it. The main terminal window will have focus when Hail Workshop is opened, by default.

A.2.2 Bring to Front

If the window you want to use is behind others, you can click on its header to bring it to the "front" of the screen.

A.2.3 Moving

To move a window, click and hold on its header, drag it to where you want it to go on the screen, then release the mouse button.

A.3 Commands and Code

A.3.1 Entering Commands

All direct commands are executed via terminal prompts, depicted as '>' symbols. When a terminal has focus (see Getting Started), a flashing cursor will appear beside the prompt. To execute a command, type the command in at the prompt and press Enter.

In the example shown in figure A-2, a number value is assigned to 'a', and then is printed to the terminal:

Figure A-2. Setting and printing a variable value.
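In plain text, the session in figure A-2 amounts to something like the following sketch (the variable name and value are illustrative, and the assumption that print accepts a variable as its argument follows from the description above):

    > a := 5
    > print(a)
    5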

The terminal can be used to run any command in the Hail language, though it cannot process commands that require more than one line of code, such as if-end and for-next statements.

A.3.2 Entering Code

Longer programs may be typed into code boxes, to be executed later via the terminal. Pressing Enter while in a code box will make the cursor jump to a new line. Pressing Tab will indent the text on the current line.

In the example shown in figure A-3, a for-next loop is created to print the numbers 1 to 10:

Figure A-3. A simple for-next loop
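A plain-text sketch of the kind of code shown in figure A-3, wrapped in a subroutine named count to match the call made in figure A-4 (the indentation is illustrative; the sub, for, next, and print forms follow the language reference in section A.5.3):

    sub count()
        for (i from 1 to 10)
            print(i)
        next
    end sub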

Note that all code in a code box must be contained within either a subroutine or an object declaration (see below).

A.3.3 Running Code

Before code in a code box can be used, it has to be "run" in order to check for errors, and to put all objects and subroutines in memory. The run or runcode command may be used for such purposes. You must pass along the name of the code box itself as a parameter (see figure A-4).

If the system does not display any errors, you may now begin creating objects and running subroutines. In figure A-4, the code shown in figure A-3 is run, and then the subroutine count() is invoked:

Figure A-4. Running code.
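Assuming the code box is named maincode (the example name used in section A.5.1), and assuming the output is sent to the current terminal, the session in figure A-4 looks roughly like this:

    > run(maincode)
    > count()
    1
    2
    ...
    10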

A.4 Saving and Loading Images

When you invoke the save or saveimage command in Hail Workshop, you don't just save code: you save an image of your entire workspace, including code, of course, but also the names and positions of windows, the values of variables, and the transcripts of terminals. When you use the load or loadimage command, the workspace is adjusted to match the specifications set out in the image file.

The commands save and load open up a dialog box that lets you choose an existing file, or name a new one (save mode only). The commands saveimage and loadimage let you type in the name of a file within parentheses.

All Hail Workshop files end in a .hil extension. This will be added automatically, so do not type it in.
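For example, with a hypothetical image called sumergame, a session might be saved and later restored as follows (the unquoted filename mirrors the run(maincode) form used elsewhere in this appendix; both commands read and write the file sumergame.hil):

    > saveimage(sumergame)
    > loadimage(sumergame)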

A.5 The Hail Programming Language

A.5.1 Creating Objects and Instances

A specialized set of statements is used to define and instantiate objects. Objects must be created and defined in a code box. Instances can be created either from within a subroutine, or on the command line. Figure A-5, below, shows how an object is defined.

Figure A-5. An object definition

Objects are defined within the statements obj and end obj. The object is named after the initial obj statement, followed by a colon. All object names must be unique.

Within the obj and end obj statements are var statements that define an object's internal set of variables. Each variable will be bound to a unique object instance.
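As a sketch, an object definition of the sort shown in figure A-5 might look like the following (the object name city and its variables are hypothetical, and the placement of the colon follows the description of the obj statement above):

    obj city:
        var name
        var population
        var grain
    end obj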

All object code must be executed before defining any instances. Typically, "run([name of code box])" is used to do this (e.g. "run(maincode)"). If there are no errors in the code, objects may then be instantiated. This is done by using the new term within a variable assignment statement, as shown in figure A-6.

Figure A-6. Instantiating an object

Once an instance is defined, you may set any of the variable values within the original object definition. This is done via the typical variable assignment statement, with a period separating the name of the object from the name of the variable, as shown in figure A-7.

Figure A-7. Setting instance variables
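Continuing the hypothetical city object sketched above, an instance could be created and modified at the terminal roughly as follows (the exact form of the new expression is an assumption based on the description of the new term):

    > sumer := new city
    > sumer.population := 100
    > print(sumer.population)
    100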

Multiple instances of each object may be created and used simultaneously within the same Hail Workshop session, as shown in figure A-8.

Figure A-8. Setting instance variables

Instances may also be passed as parameters to subroutines, as will be discussed below.

A.5.2 Creating and Executing Subroutines

Subroutines perform actions on objects and variables via the imperative elements of the Hail language. When run, subroutines are stored in memory, and may then be called from the command line.

Subroutines are defined within the statements sub [name] and end sub, as shown in figure A-9. All subroutine names must be unique.

Figure A-9. A subroutine that accepts a generic object x

The name portion of the initial sub statement must also indicate the parameter(s) that must be set when the subroutine is called. These must be placed inside parentheses and separated by

commas if there is more than one, as shown in figure A-10. A subroutine with no parameters still requires parentheses to be added after the name, also as shown in figure A-10.

Figure A-10. Subroutines with two and zero parameters

Parameters that act as objects may reference variable values using the notation discussed earlier, with a period placed between the parameter name and variable (see figure A-7). So, for example, if the user entered "init(sumer)" on the command line (see figure A-9), then the "x" variable used in the init subroutine would stand in place for the sumer object. If the user entered "init(toronto)", the toronto object would be used, as shown in figure A-11.

Figure A-11. Passing objects as parameters
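A sketch of the init subroutine described above, followed by terminal calls that pass the sumer and toronto instances to it (the population variable is carried over from the hypothetical object sketch in section A.5.1, and the subroutine body itself is illustrative):

    sub init(x)
        x.population := 100
        print(x.population)
    end sub

    > run(maincode)
    > init(sumer)
    100
    > init(toronto)
    100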

Note that if standalone variables – that is, variables that are not contained within objects – are used as parameters, they are passed by value. In its current, Javascript-based implementation, this is an inherent limitation of the underlying language.

A.5.3 Language Reference

The following commands may be used within subroutines, and, to a lesser extent, via a terminal (constructions such as for-next and if-then, for example, often span several lines of code, making them unfeasible on the command line.)

System Commands

• terminal(x, y, width, height): Creates a new system terminal window at the specified location.

• codebox(x, y, width, height): Creates a new coding window at the specified location. This allows the user to enter multiple programs into the system.

• resize(name, width, height): Re-sizes the window name to the specified width and height.

• clear(terminal): Clears the specified terminal of text.

• run(codebox): Runs the code listed in the specified code window. This involves loading the listed subroutines into memory and directly executing the main subroutine.

• quit: Exits Hail Workshop.

Print Commands

• print(text): Prints the given text inside the terminal window currently in use.

• printat(terminal, text): Prints the given text inside the given terminal.

• printtab(tabsize, textarray): Prints the items listed in textarray as column headers, at a separation distance given by tabsize.

Variable Assignments

• All variable assignments are made using the := operator, not the = operator:

name := "Hail" valid code. name = "Hail" invalid code. 247

• Variable arrays are defined when the first index entry is made. Only one-dimensional arrays are supported:

name[0] = "Matthew" array "name" is initialized, with value stored at index 0. name[1] = "Wells" value added to index 1; array is already initialized from previous statement. name[0][0] = "Matthew" invalid code; only one-dimensional arrays are supported.

Variable Functions

• int(value): Converts value into an integer by rounding down.

• rand(max): Generates a random number between 1.0 and max.

Loops and Conditionals

• for (i from x1 to x2): A BASIC-style for statement, iterating i in the given range (see the example following this list).

• next: Closes off the most recent for loop (no variable required).

• if (condition) then: Standard BASIC-style if statement. The condition parameter supports AND and OR arguments.

• elseif (condition) then: Standard BASIC-style elseif statement. Must follow an if statement.

• else: Standard BASIC-style else statement. Must follow an if or elseif statement.

• end: Closes off the conditional statement(s).
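A sketch combining these constructions inside a subroutine (the subroutine name is hypothetical, and the conditions are left as placeholders, since the comparison operators themselves are not listed in this reference):

    sub branches()
        for (i from 1 to 2)
            if (condition) then
                print("first branch")
            elseif (condition) then
                print("second branch")
            else
                print("third branch")
            end
        next
    end sub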

Subroutines

• Subroutines are called by typing in the sub name at the command prompt, followed by parameters (if needed) in parentheses.

• sub name(): Declaration statement for subroutine with no parameters.

• sub name(parameter1): Declaration statement for subroutine with one parameter.

• end sub: End statement for all subroutines.

File I/O

• saveimage(filename): Saves an image of the working environment with the name filename.hil.

• save: Opens the save file dialog box, allowing you to save an image.

• loadimage(filename): Loads image of the working environment with the name filename.hil.

• load: Opens the open file dialog box, allowing you to load an image.

Appendix B: Historical Timeline

This timeline chronicles events related to the development of computing technologies, as well as relevant political events, from the years 1938-1988. Public-sector and private-sector events have been separated, though private-sector research initiatives blur the boundaries somewhat – this is particularly true with Xerox PARC. Judgment calls were made in each case.

Each entry below is listed under its year and labelled as Political and Economic, Research and Education, or Consumer and Corporate.

1938: Mathematical Tables Project initiated [Political and Economic]
1938: Ballistic Research Laboratory (BRL) established by U.S. Army [Research and Education]

1939 (Sept. 1): Germany invades Poland; start of Second World War [Political and Economic]

1941 (Dec.): United States enters Second World War [Political and Economic]

1943: U.S. Army commissions UPenn to build ENIAC [Research and Education]

1944: Development of Bretton Woods economic system [Political and Economic]
1944: Project Whirlwind begins, MIT [Research and Education]

1945 (Sept. 2): End of Second World War [Political and Economic]
1945 (Oct. 1): Creation of Project RAND [Research and Education]

1946 (Feb. 15): Completion of ENIAC computer, UPenn; U.S. Army commissions UPenn to build EDVAC [Research and Education]
1946 (March): Formation of Eckert-Mauchly Computer Corporation (EMCC) [Consumer and Corporate]

1947 (Sept. 18): Formation of CIA [Political and Economic]

1948 (June 24): Start of Berlin Blockade [Political and Economic]
1948 (Nov. 1): Foundation of RAND Corporation [Research and Education]
1948: Founding of Bolt, Beranek and Newman (BBN) [Consumer and Corporate]

1949: Formation of NATO; first atomic bomb test in USSR [Political and Economic]
1949 (Aug.): Completion of EDVAC computer, UPenn [Research and Education]

1950 (June 25): Start of Korean War [Political and Economic]
1950: "Bouncing Ball" program created for Whirlwind, MIT [Research and Education]
1950 (Feb. 15): Remington Rand acquires EMCC [Consumer and Corporate]

1951 (Apr. 20): Completion of Whirlwind computer, MIT [Research and Education]
1951 (Mar. 21): Development of UNIVAC I computer, Remington Rand [Consumer and Corporate]

1952 (Nov. 1): First U.S. test of hydrogen bomb [Political and Economic]
1952 (Apr. 29): IBM introduces 701, its first commercial digital computer [Consumer and Corporate]

1953: End of Korean War; formation of KGB; Nikita Khrushchev becomes leader of USSR [Political and Economic]
1953: RAND develops JOHNNIAC computer [Research and Education]

1954: Geneva Conference divides Vietnam into separate North and South states [Political and Economic]
1954: IBM develops 650 computer; initial development of FORTRAN language, IBM [Consumer and Corporate]

1955 (May 14): Establishment of Warsaw Pact; U.S. enters into Vietnam War [Political and Economic]
1955: RAND develops "Monopologs" game [Research and Education]

1956: Hungarian Revolution erupts; Suez Crisis erupts [Political and Economic]
1956: Completion of TX-0 computer, MIT [Research and Education]

1957 (Oct. 4): Launch of Sputnik 1 satellite, USSR [Political and Economic]
1957: Founding of DEC; American Management Association (AMA) develops Top Management Decision Simulation (TMDS) game [Consumer and Corporate]

1958: Foundation of NASA; foundation of ARPA [Political and Economic]
1958: Development of Lisp language, MIT [Research and Education]

1959: Development of Carnegie Tech Management Game; emergence of COBOL language [Research and Education]

1960 (May 1): USSR shoots down American U-2 spy plane [Political and Economic]
1960: Licklider publishes "Man-Computer Symbiosis" [Research and Education]
1960: DEC develops PDP-1 computer [Consumer and Corporate]

1961: Construction of Berlin Wall begins; Yuri Gagarin orbits Earth in Vostok spacecraft [Political and Economic]
1961: Development of CTSS time-sharing network, MIT [Research and Education]
1961: Foundation of Digital Equipment Computer Users' Society (DECUS) [Consumer and Corporate]

1962: Cuban Missile Crisis; John Glenn orbits Earth in Friendship 7 spacecraft [Political and Economic]
1962: Spacewar! developed at MIT [Research and Education]

1963 (Nov. 22): U.S. President John F. Kennedy assassinated [Political and Economic]
1963: Ivan Sutherland develops Sketchpad, MIT; Douglas Engelbart forms Augmentation Research Center (ARC); initiation of JOSS project, RAND [Research and Education]

1964: Civil Rights Act enacted in U.S.; Khrushchev removed from power, Leonid Brezhnev becomes leader of USSR [Political and Economic]
1964: Development of BASIC language, Dartmouth [Research and Education]

1965: The Sumerian Game developed and tested, BOCES [Research and Education]
1965 (Mar. 22): DEC develops PDP-8 computer [Consumer and Corporate]

1966: 3rd version of BASIC developed (with INPUT) [Research and Education]
1966: Hewlett-Packard develops HP2116a computer [Consumer and Corporate]

1967 (June): Six-Day War erupts in Middle East [Political and Economic]
1967: STROP war game implemented on JOSS, RAND; work on BBN Lisp (later Interlisp) initiated; development of Logo language, BBN [Research and Education]

1968: Tet Offensive launched against South Vietnam; Prague Spring in Czechoslovakia [Political and Economic]
1968 (Dec. 9): ARC demonstrates oN-Line System (NLS) [Research and Education]
1968: Development of FOCAL language, DEC [Consumer and Corporate]

1969: Apollo 11 lunar module lands on moon; first draft lottery held in U.S. in Vietnam era [Political and Economic]
1969 (Dec.): ARPANET network developed by BBN; Alan Kay completes dissertation "The Reactive Engine" [Research and Education]
1969: David Ahl joins DEC, focuses on educational outreach [Consumer and Corporate]

1970: Nuclear Non-Proliferation Treaty enters into force; Kent State shootings [Political and Economic]
1970: Xerox forms Palo Alto Research Center (PARC) [Consumer and Corporate]

1971: Alan Kay begins work on Smalltalk language, PARC [Research and Education]
1971 (Nov.): Nutting Associates (later Atari) creates Computer Space arcade game [Consumer and Corporate]

1972 (May 26): Richard Nixon and Leonid Brezhnev sign SALT I nuclear arms control treaty; Nixon visits China [Political and Economic]
1972: BBN Lisp developers move to PARC, create Interlisp; Ray Tomlinson of BBN develops first email system; development of C programming language; development of Prolog language [Research and Education]
1972: Magnavox Odyssey game console released; Atari creates Pong arcade game [Consumer and Corporate]

1973: Oil embargo enacted by OPEC nations; U.S. military withdraws from Vietnam [Political and Economic]
1973 (Mar. 1): PARC develops Alto computer; first "Lisp machine" developed, MIT [Research and Education]
1973 (July): DEC publishes 101 BASIC Computer Games [Consumer and Corporate]

1974 (Aug. 9): Nixon resigns from U.S. presidency amid Watergate scandal [Political and Economic]
1974: Robert Kahn and Vinton Cerf develop Transmission Control Program (TCP) networking protocol, ARPA [Research and Education]
1974: Ahl founds Creative Computing magazine [Consumer and Corporate]

1975 (July): Apollo-Soyuz Test Project launched [Political and Economic]
1975: Foundation of Microsoft; MITS releases Altair 8800 PC; Microsoft releases Altair BASIC (later Microsoft BASIC) [Consumer and Corporate]

1976: Kay and PARC develop Smalltalk-76 [Research and Education]
1976: Apple Computer Company releases Apple I kit computer [Consumer and Corporate]

1977: Commodore releases Commodore PET; Apple releases Apple II; Tandy releases TRS-80; Atari releases Atari VCS [Consumer and Corporate]

1978 (Feb.): U.S. Department of Defense launches first GPS satellite [Political and Economic]
1978: Sandewall publishes "Programming in an Interactive Environment" [Research and Education]
1978 (Nov.): Ahl publishes BASIC Computer Games: Microcomputer Edition [Consumer and Corporate]

1979 (June 18): Jimmy Carter and Brezhnev sign SALT II nuclear arms control treaty; USSR invades Afghanistan; Iranian Revolution occurs; second oil crisis begins [Political and Economic]
1979: Niklaus Wirth develops Pascal programming language [Research and Education]
1979: Compute! Magazine launched; Mattel Electronics releases Intellivision video game console [Consumer and Corporate]

1980: Iran-Iraq war begins; 1980: Kay and PARC develop severe oil shortage occurs; Smalltalk-80; UseNet created, Solidarity trade union founded, linked to ARPANET Poland 253

Political and Economic Research and Education Consumer and Corporate

1981: Attempted assassination 1981 (July): PARC develops Mesa 1981: Commodore releases of U.S. President Ronald Reagan; programming language Commodore VIC-20; IBM U.S. Congress passes Economic releases IBM PC; Xerox releases Recovery Tax Act Xerox Star; Microsoft releases MS-DOS operating system.

1982 (Jan.): Commodore releases Commodore 64

1983 (Mar. 23): Reagan 1983: Development of C++ 1983: North American video announces Strategic Defense programming language; Richard game crash; Compaq releases Initiative (SDI) program Stallman launches GNU Project, IBM PC-compatible Compaq MIT Portable computer

1984 (Jan. 24): Apple releases Macintosh computer; IBM releases IBM PCjr; IBM releases IBM AT; Nintendo releases Nintendo Entertainment System (NES) in North America

1985 (Mar. 11): Mikhail 1985 (Feb.): Stewart Brand 1985: Commodore releases Gorbachev becomes leader of launces Whole Earth 'Lectronic Amiga 1000; Microsoft USSR Link (WELL) virtual community. introduces Windows, a graphical shell for MS-DOS; Creative Computing ceases publication

1986 (Apr. 26): Chernobyl disaster, USSR

1987 (Dec. 8): Reagan and 1987 (Dec. 9): Microsoft releases Gorbachev agree to Windows 2.0 Intermediate-Range Nuclear Forces Treaty

1988 (Aug. 20): Iran-Iraq war 1988 (Oct. 12): NeXT Inc., ends founded by Steve Jobs, releases NeXT Computer