
The History of Unix in the History of Software

Haigh Thomas

To cite this version:

Haigh Thomas. The History of Unix in the History of Software. Cahiers d'histoire du Cnam, Cnam, 2017, La recherche sur les systèmes : des pivots dans l'histoire de l'informatique – II/II, 7-8 (7-8), pp. 77-90. hal-03027081

HAL Id: hal-03027081 https://hal.archives-ouvertes.fr/hal-03027081 Submitted on 9 Dec 2020


The History of Unix in the History of Software

Thomas Haigh, University of Wisconsin - Milwaukee & Siegen University.

You might wonder what I am doing here, at an event on the history of Unix. As I have not researched or written about the history of Unix I had the same question myself. But I have looked at many other things around the history of software, and this morning I will be talking about how some of those topics, including the 1968 NATO Conference on Software Engineering, Algol, IBM SHARE and mathematical software in the 1970s, connect to the origins of Unix. As I worked to pull this presentation together I realized that some of those connections are clearer and more interesting than I had previously assumed, to the extent that they challenge us to reinterpret some of what we think we know about software history during this era1.

The “software crisis” and the 1968 NATO Conference on Software Engineering

The topic of the “software crisis” has been written about a lot by professional historians, more than anything else in the entire history of software2. It is usually connected to the 1968 NATO Conference on Software Engineering, which is sometimes claimed to be this kind of very broadly based conference in which industrial managers, programmers, and academics came together. It is also sometimes suggested that this had enormous ramifications for how typical programming practice was conducted during the 1970s and later.

I have a more specific sense of why those things were important. Suggesting that there is a connection of the software engineering conference to the mainstream programming in corporations producing packages, accounting software and the like is, in my view, an exaggeration. I believe that the perceived crisis at the end of the 1960s was very specifically in the development of systems software, primarily operating systems, a category which in that period was still tightly bonded with compilers and languages.

So what was the software crisis then? Looking back at the proceedings, we see that there were indeed, at the NATO conference, many people expressing concern about what was going on within the development of operating systems. Two systems in particular were of concern: IBM OS/360 (although none of the thousand people developing it were actually at the conference) and Multics. Several people developing Multics were at the conference, as well as the three partners of the project, MIT, General Electric (GE), and Bell Labs, including Edward E. David, who was one of the three managers with overall responsibility for the project. He represented Bell Labs and he wrote in his position paper for the conference:

Among the many possible strategies for producing large software systems, only one has been widely used. It might be labeled “the human wave” approach, for typically hundreds of people become involved over a several years period… It is expensive, slow, inefficient, and the product is often larger in size and slower in execution than need be. The experience with this technique has led some people to opine that any software system that cannot be completed by some four or five people within a year can never be completed.

As Fred Brooks, who was overseeing the OS/360 development, eventually, and famously, concluded: “Adding manpower to a late software project makes it later” (1975). To put his argument in economic terms, the marginal benefit that you get from having an extra programmer is more than offset by the increased transaction costs of coordinating the work of a larger group of people. The high-profile problems of such projects increased interest in finding better ways, which was to be called “software engineering”, although nobody at the conference knew what that might be.

I have a tangential argument that I would like to share, from my paper “Dijkstra’s Crisis: The End of Algol and the Beginning of Software Engineering.”3 This was written back in 2010 as part of the Software for Europe project, for inclusion in a since abandoned book project. I noted that the specific phrase “software crisis” doesn’t actually appear in any of the quoted dialog or position papers for the conference, though it does appear in the introduction to the proceedings.

1 This text is the transcript of a keynote talk given at the International symposium “Unix in France and in the United States: innovation, diffusion and appropriation” organized at the Conservatoire national des arts et métiers, Paris, France, on October 19th, 2017.

2 The software crisis and the NATO conference appear prominently in Mahoney (1988), Campbell-Kelly & Aspray (1996), MacKenzie (2001) and several of the contributions to Hashagen & al. (2002).

3 See the draft version [URL: http://www.tomandmaria.com/Tom/Writing/DijkstrasCrisis_LeidenDRAFT.pdf].

My sense is that it spread and took its modern association with Dijkstra’s lecture


“The humble programmer” (1972), a title which can be read as a rebuke to the pretentiousness of the idea of being a “software engineer”. In a way, this cements retroactively the idea that the most important thing that happened at the conference was the declaration of a “software crisis” to which Dijkstra proposed some solutions.

You might wonder what Algol was doing in the title of my paper. The Algol effort launched in 1958, continued through the 1960s, and was formalized in 1962 by IFIP Working Group 2.1. My view is that the Algol effort did more than any other project in the history of computing to create an international research community. If you look at how the early Turing awards were given, then you see seven of them awarded to members of the original groups that defined Algol (in 1958 and 1960). But in the second half of the 1960s, things took a less happy turn. The group decided to go with a proposal (adopting a draft from Adriaan van Wijngaarden) for what became Algol 68, which was deeply controversial as it rejected a proposal from Wirth for what became Pascal. Tensions only got worse over the following few years, and when in 1968 the new language (Algol 68) was approved, the group essentially fell apart. Eleven working group members either resigned or signed minority reports, expressing differences with the direction that the work took.

If you look at the chronology there, 1968 is the same year as the NATO Software Engineering conference. Thus my argument is that the NATO conference was to a large extent an attempt by those Algol “refugees” – who quit or wrote minority reports – to create a new community that would preserve that collaboration but take on a broader focus. Dijkstra wrote a minority report, signed by seven members. The interesting thing about that is that there were many specific criticisms circulating of Algol 68 as a language, but Dijkstra’s dissent did not really address those at all. Instead it called for a philosophical kind of shift from traditional ideas about the purpose of a programming language towards the design of systems that would enforce good practice in the development of complex programs. There is a significant kind of overlap between the participants in the Algol efforts, particularly ones who were dissatisfied with Algol 68, and the group that were most active in the discussions of the 1968 conference and in editing the proceedings, which were enormously important in shaping how the conference was remembered4.

4 13 of the 28 members of IFIP WG 2.1 active in the design of Algol 68 attended the 1968 or 1969 conference. Naur and Randell edited the proceedings. Naur resigned from WG 2.1. Randell drafted a minority report opposing Algol 68 and signed Dijkstra's report.

In “Dijkstra’s Crisis...” I argued that, looking at the trajectories of some of those individuals, including Dijkstra, Hoare, Randell, Perlis, Naur, Wirth, etc., an interesting kind of shared career trajectory emerges. Trained in science or mathematics, they worked in the early 1960s developing systems software in teams either in marginal manufacturers (not IBM) or in research settings. They were very small groups, and they produced groundbreaking compilers and operating systems. By the time of the NATO conference, they were in a transition from writing the systems to working in corporate research labs or universities. So, among that community, the word had spread concerning the massive scale of IBM’s OS/360 and TSS efforts and their lack of success – what Dijkstra called the “Chinese army” approach and its failure. I think it was instinctive for them to try to formalize and to institutionalize mathematical and logical approaches to the development of system software, approaches that came naturally from their training and their own experience of being successful in developing such systems some years earlier.

After the NATO 1968 conference and its much less successful follow-up in Rome in 1969, my argument is that this core Algol dissident group essentially abandoned the “software engineering” project and instead found happiness in a different IFIP working group (2.3) on “Programming Methodologies”, which was focused on theory. That is where ideas such as structured programming come from. Randell wrote: “we regarded this group, WG 2.3, as having twin roots, the Algol Committee and the NATO Software Engineering Conference.” (Randell, 2003)

Others of course picked up the term “software engineering” that the dissidents had discarded, and repurposed it. Randell was critical of this and wrote:

It was with little surprise to any of the participants in the Rome conference that no attempt was made to continue the NATO conference series, but the software engineering bandwagon began to roll as many people started to use the term to describe their work, to my mind often with very little justification. Reacting to this situation, I made a particular point for many years of refusing to use the term or to be associated with any event which used it5.

5 “The 1968/69 NATO Software Engineering Reports” [URL: http://homepages.cs.ncl.ac.uk/brian.randell/NATO/NATOReports/].

I think that would capture the attitude of some Algol veterans as well. But I now suspect that there is another kind of response that we can describe to the problems discussed at the 1968 NATO Conference, beyond the two I have just described (the people who picked up “software engineering” as an identity and the group that went instead in the direction of IFIP Working Group 2.3 and “formal methods”).

I am going to illustrate this response by looking at Douglas McIlroy. His background had something in common with the other people I have described. Born in 1932, he had a PhD in applied mathematics, and spent his career (1958-1997) at Bell Labs, a research institution. He was head of the computing technology research group between 1965 and 1986. Bell Labs had been one of the core partners in the development of Multics from 1964 onward. At the beginning of that period McIlroy was deeply involved with


Multics, implementing the stopgap PL/1 compiler used in early Multics development work. The fact that the project had chosen a language for which no compiler yet existed gives you a sense of the challenges facing operating systems projects in the era. He was one of the more vocal participants in the 1968 NATO conference.

Looking at the chronology here, the year that Bell Labs quit Multics, in April 1969, is the same year Unix development begins. I am really interested in that early period of Unix history here. The first version of Unix (1971) was used internally for text processing. It was rewritten in C in 1973, and in 1975 it begins to spread widely outside Bell Labs, to eventual world domination.

One of the best-known works on this history, at least among the historical community, is The Computer Boys Take Over by Nathan Ensmenger (2010). The dissertation it was based on was called “From ‘black art’ to industrial discipline,” a title that gives you a good idea of the arc of the story. In Ensmenger’s telling, the software crisis and the 1968 conference were a crucial episode in the transformation of software from a “black art” practiced by eccentric and unmanageable virtuosos to a highly routinized kind of industrial production. He puts McIlroy’s paper at the conference at the center of that discussion, calling it “blatantly management oriented”. According to Ensmenger, “McIlroy rejected the idea that large software projects were inherently unmanageable. The imposition of engineering management methods had enabled efficient manufacturing in myriad other industries, and would not fail to do the same for programming.” (Ensmenger, 2010, p. 197) He also writes that McIlroy’s “vision of a software ‘components factory’ invokes familiar images of industrialization and proletarianization. According to his proposal, an elite corps of ‘software engineers’ would serve as the Frederick Taylors of the software industry, carefully orchestrating every action of a highly stratified programmer labor force.” (ibid.)

Unix as a Software Components Factory

McIlroy made a famous remark at the 1968 conference: “We undoubtedly produce software by backward techniques. We undoubtedly get the short end of the stick in confrontations with hardware people because they are the industrialists and we are the crofters. Software production today appears in the scale of industrialization somewhere below the more backward construction industries.” Crofters are more or less Scottish peasants, so McIlroy is opposing very small scale, inefficient domestic production to a factory. His solution to this was to be something called “Mass Produced Software Components”. Looking at the idea of the “software factory”, an idea that was floating around at the conference, he argued that the idea of subassemblies was the most transferable part of industrial production, one that corresponded with the idea of modularity6. He also said:

My thesis is that the software industry is weakly founded, and that one aspect of this weakness is the absence of a software components subindustry…. The most important characteristic of a software components industry is that it will offer families of routines for any given job. No user of a particular member of a family should pay a penalty, in unwanted generality, for the fact that he is employing a standard model routine….

6 “Certain ideas from industrial technique I claim are relevant. The idea of subassemblies carries over directly and is well exploited. The idea of interchangeable parts corresponds roughly to our term ‘modularity’, and is fitfully respected.”

So McIlroy has been cast by historians as the man who brought, or at least tried to bring, Taylorism into the heart of software production. Although I have been thinking about the NATO conference for years, until last week I had not looked closely at what McIlroy actually wrote or at how he managed his own software projects. Now that I have done so, I don’t see any Taylorism in McIlroy’s text. My reading is that his idea was to eliminate routine work with the aid of generalized routines, not to deskill programmers. His paper reads: “The personnel of a pilot plant [producing components] should look like the personnel on many big software projects, with the masses of coders removed… There will be perhaps more research flavor included than might be on an ordinary software project, because the level of programming here will be more abstract.” Among the development teams using these components to produce application systems, implicitly a much larger group, I believe that he thought that the computer itself takes over the routine parts of programming. During the discussion at the conference McIlroy quipped: “It would be immoral for programmers to automate everybody but themselves.”

Admittedly the metaphors of components, factories, and industrialization were used rather incoherently at the conference (Mahoney, 2004). My sense is that, when he talked about industrialization by software components, McIlroy’s idea was that it would allow the users of the components essentially to remain crofters, producing quick results in groups of no more than four or five people, which, as McIlroy’s former manager Edward E. David had noted, was the only team configuration proven to be effective in software production during that era.

In sum, my thesis is that McIlroy is not talking about a Taylorized vision of deskilled programming labor, but about something very much like the project he was about to manage: Unix. A world in which small teams of eccentric men with beards produced remarkable things, thanks to the benefits of reusable code and modularity. Indeed, McIlroy’s 1968 paper proposed Bell Labs as an organization that was well equipped to produce software components:


Bell Laboratories is not typical of computer users. As a research and development establishment, it must perforce spend more of its time sharpening its tools, and less using them, than does a production computing shop…. The market would consist of specialists in system building, who would be able to use tried parts for all the more commonplace parts of their systems.

McIlroy even identified text processing as an area where reusable tools were particularly important. He wrote: “Nobody uses anybody else’s general parsers or scanners today, partly because a routine general enough to fulfil any particular individual needs probably has so much generality as to be inefficient.” And that of course was the original application of Unix a few years later.

When he oversaw the Unix project, McIlroy had the perfect opportunity to practice his beliefs on the management of programmer teams. In an oral history interview with Michael Mahoney in 1993, McIlroy said that “one of the only places where I very nearly exerted managerial control over Unix was pushing for the inclusion of the pipe mechanism.” From time to time he would say “How about making something like this?” and then put up another proposal. So firstly, this quote is hard to square with the idea that he ran projects like Frederick Taylor. By his own estimation he did little more than exert control on a few occasions, giving his talented team great autonomy. He was also a hands-on contributor to the project, writing some programs himself.

Secondly, the quote made me wonder about the connection of pipes, his main technical contribution to Unix, to his earlier musings about configurable software components. McIlroy seems to have been thinking about this for years. I found a document on the web, purportedly from October 11th, 1964, in which McIlroy wrote: “We should have ways of coupling programs like garden hoses - screw in another segment when it becomes necessary to manage data in another way”7. This could be seen as an early statement of the idea realized with the pipe mechanism. In his oral history he said that from 1970 to 1972:

one day I came up with a syntax for the shell that went along with the piping and Ken said, “I’m gonna do it.” He was tired of hearing all this stuff…. He put pipes into Unix. He put this notation into the shell [Here McIlroy points to the board, where he had written f >g> c], all in one night…. [M]ost of the programs up until that time couldn’t take standard input, because, there wasn’t the real need. They all had file arguments…. Thompson saw that that wasn’t going to fit into this scheme of things, and he went in and changed all those programs in the same night. I don’t know how. And the next morning we had this orgy of “one liners.” Everybody had another one liner. Look at this, look at that.

7 See [URL: http://doc.cat-v.org/unix/pipes/]. This is based on an image of what appears to be a page excerpted from a larger document.

That’s McIlroy’s description of how pipes got into Unix. Mahoney asked: “was there a notion of […] a toolbox almost or a tool philosophy before pipes or did pipes create it?” According to McIlroy:

Pipes created it…. The philosophy that everybody started putting forth: “This is the Unix philosophy. Write programs that do one thing and do it well. Write programs to work together. Write programs that handle text streams because that is a universal interface.” All of those ideas, which add up to the tool approach, might have been there in some unformed way prior to pipes. But they really came in afterwards.

Mahoney had read McIlroy’s 1968 paper and asked if this realized his hopes for software components. McIlroy said: “if you stand back, it’s the same idea. But, it’s at a very different level, a higher level than I had in mind.”

As a historian I do, of course, tend to stand back. From that perspective I think the pipe mechanism was not the only thing that McIlroy contributed to the modularity of Unix. His choice not to impose heavy-handed managerial control on the project encouraged, or at least permitted, the decentralized style that we associate with Unix and which comprises: loosely coupled parts, each part small enough to be understood easily by an individual; a relatively small kernel; different commands notoriously written by different people in different ways, with different kinds of expectations for switches; the fact that you can choose your own shell; etc.

So McIlroy was a leading practitioner of the art of using technical rather than managerial mechanisms to coordinate the production of extremely complex software systems. One of the other things he is sometimes credited with is pushing the Unix team to develop the system of man pages8. Pipes allow interoperability without requiring common coding conventions or master-planned structures. Likewise, man pages were a flexible and minimalist alternative to creating thousands of pages of documentation in advance of the creation of the system, as the big projects of the 1960s had attempted.

8 For example, according to the Wikipedia page dedicated to the system of “man pages” [URL: https://en.wikipedia.org/wiki/Man_page]: “The first actual man pages were written by Dennis Ritchie and Ken Thompson at the insistence of their manager Doug McIlroy in 1971.” I do not vouch for the accuracy of this. McIlroy himself attributed the specific format of man pages to Dennis Ritchie, though that does not preclude the idea that he had prodded Ritchie to come up with something. He noted that “During the system’s first two years one literally had to work beside the originators to learn it. But the stock of necessary facts was growing and the gurus’ capacity to handle apprentices was limited.” (See [URL: http://www.cs.dartmouth.edu/~doug/reader.pdf].)

Parallels between mathematical software and Unix

Another interesting thing was that McIlroy was clearly inspired by mathematical software. When he is describing the idea of reusable software components in his 1968 paper, his example, perhaps surprisingly, is a generator for sine routines.
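The “orgy of one liners” that McIlroy describes is easy to recreate, since Thompson’s pipe notation survives essentially unchanged in every modern shell. As a minimal sketch (the sample text is my own), the following chains six standard Unix tools to list the most frequent words in a stream, each stage doing one thing in the spirit of the tool philosophy quoted above:

```shell
# Each stage is a small single-purpose tool, coupled "like garden hoses".
printf 'to be or not to be that is the question\n' |
tr -cs 'A-Za-z' '\n' |  # split the stream into one word per line
tr 'A-Z' 'a-z' |        # normalize case
sort |                  # bring identical words together
uniq -c |               # count each run of identical words
sort -rn |              # order by count, most frequent first
head -5                 # keep only the top five
```

McIlroy himself later made a pipeline much like this one famous, in his 1986 published critique of a monolithic word-counting program by Donald Knuth (in Jon Bentley’s “Programming Pearls” column in CACM), as a demonstration of building from stock components.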


He also mentions mathematical routines as the area, as of 1968, where the reuse of components was most advanced. He mentions: “Choice of algorithm. In numerical routines, as exemplified by those in the CACM, this choice is quite well catered for already. For nonnumerical routines, however, this choice must usually be decided on the basis of folklore.”

This must be put in the context I mentioned earlier, by going back to the 1950s as I promised I would do, at least briefly, in order to look at mathematical software and user groups in the 1950s. McIlroy talked about several sources and possible producers of software components: “What about the CACM collected algorithms? What about users’ groups? What about software houses? And what about manufacturers’ enormous software packages?” He said that “User’s groups I think can be dismissed summarily and I will spare you a harangue on their deficiencies.” That is all he had to say on the topic.

Why? Remember that McIlroy had been working with SHARE on the committee that was defining PL/1 back in 1964. SHARE was founded in 1956 as a cooperative group for “large” IBM scientific machine users. As its name suggests, it was intended to “share” programs, expertise, experiences and best practices, and it had some similarities with the open source model as later defined. SHARE was composed of ad hoc collaboration groups for specific projects. It had mechanisms for code in its library, to share and respond to bug reports. It set up standards for coding and configuration of machines that would facilitate collaboration between different user organizations. It had open circulation of proposals and design documents. It also had the idea that there was a specific SHARE culture into which new IBM users had to be indoctrinated before they could be effective participants in the group. The software library had routines contributed by user sites and relied on IBM to catalog and reproduce the code. It classified the routines contributed according to a scheme that permitted them to be organized. Contributors were supposed to be responsible for maintenance. On the standardization side you got a long list of standards that SHARE adopted: to share code and practices, to standardize machine configuration (setting of switches, control panels, etc.), and to standardize system software (assemblers and utility programs not supplied by IBM), which led to the big project to create the “SHARE Operating System”. As a well financed group they could afford to maintain the SSD – SHARE Secretarial Distribution, a mechanism for communication between meetings: the mailing of large bundles of assorted materials (committee reports, drafts for comments, letters, inquiries and responses, including bug reports; also microfilms of source code for programs). It was like the Internet RFCs or a newsgroup, but in the 1950s and on paper.

The installation representatives were senior figures, responsible for multimillion-dollar computing installations. Representatives often had an engineering or science background, and advanced degrees. They were responsible for the design and specification of software tools. SHARE representatives were able to commit employees to develop code, driven by the economies of scale in developing generic routines. One needed to accumulate a lot of library code and tools to make a computer usable, and there was no proprietary advantage in developing something like a cosine routine.

When talking about Unix and the closely intertwined history of free software, and mechanisms for coordinating and distributing software of this kind, it is interesting to mention that SHARE had something very like the open source model in the 1950s. But IBM, Bell Labs, and other software producers actually backed away from it. Historian Atsushi Akera, discussing the problems with the “SHARE Operating System” project, after which the design of system software migrated to IBM (rather than being done within SHARE itself), called this “The Limits of Voluntarism” (Akera, 2001). But in mathematical software the transition of development effort from users to IBM was much less pronounced than in operating systems. Here SHARE remained a primary distribution mechanism until the early 1970s.

From a history of science perspective, it is interesting to see arguments that the best way to improve the quality of the code was by peer reviewing it. This was tried in SHARE in the 1960s (the SHARE Numerical Analysis Project), although it ran into problems there. Peer reviewing a mathematical routine took a lot of time and expertise, so as a way of improving the SHARE library it didn’t work as well as people hoped. But peer review of software developed from there, thanks to a publication named ACM TOMS (Transactions on Mathematical Software), started in 1975 by John Rice. This was created specifically as a publication venue for peer reviewed mathematical software (program source code was distributed via microfiche, card and tape). This was an important step because academic culture gave little credit for writing programs, still less for doing the work needed to produce high quality portable and robust code. At least code distributed in TOMS could be considered a publication (Haigh, 2010).

In the 1970s, three models for packaging and distributing mathematical routines emerged. As McIlroy had mentioned, there were publications in journals, including CACM, supported by peer review, as well as the commercial sale of software libraries: in the early 1970s, two companies, IMSL and NAG, began to sell numerical libraries. The creation of specialized packages by small teams of experts, e.g. the LINPACK and EISPACK projects, was also key to this emergence. Those are not actually exclusive choices. The same code was sometimes made available in all three channels: published, distributed in non-commercial PACKs, and incorporated with other routines into commercial numerical libraries.

The PACK model is interesting to look at, beginning with EISPACK, released in 1972, a package that computes eigenvalues and eigenvectors of matrices and that provided the standard routines in this area for a decade. Initially it was a Fortran conversion of Algol routines by James H. Wilkinson and Christian Reinsch (who had collaboratively produced them a few years earlier), which in turn implemented new, dramatically improved methods. The government grant for EISPACK was given explicitly to test a new methodology in software production. There is an interesting parallel to make with the approaches used on the Unix project to produce and distribute software. The PACK model was widely adopted9, particularly with LINPACK, a sequel to EISPACK, which had a huge impact in supercomputing.

9 Dozens of specialized packages were produced within the labs during this era: FUNPACK, MUDPACK, FISHPACK, etc. (see Cowell, 1984).

As with Unix, modularization of platform-specific code, to give high performance and portability, was a crucial part of this approach.

What happened there can be compared to the Unix phenomenon: experts producing the best routines they can for a particular purpose, which can then be black boxed, distributed, and used by other people. When I talked to the people involved with those projects, they didn’t have the romanticized view of user driven involvement pushed by the likes of Eric Raymond. They didn’t expect the users to be contributing code or particular insights. They wanted them to test the routines. They didn’t expect them to fix bugs in the code.

The other aspect of mathematical software that should be explored more fully as a parallel with Unix is the mixture of academic and commercial involvement, in a way that is different from the way we think of open source. The people involved had a pragmatic interest in getting their code used by as many people as possible. Their salaries were already paid by a lab or university, so they were not concerned with copyright or licensing arrangements. That’s a parallel between the conditions at somewhere like Argonne National Laboratory and the specific conditions at Bell Labs in the 1970s. In many ways, it is an extension of the social norms and practices of science, according to which people are expected to share their work and data freely.

Conclusion

We can frame the history of Unix in different ways. From my viewpoint, the history of Unix is part of the history of operating systems, which is part of the history of software, which is part of the history of technology, which is part of… (insert additional levels here) … the history of the universe. You can approach it in different ways as an example of different things.

We can’t look at Unix in isolation. Specifically, here I have tried to sketch the mechanisms used for the production and distribution of Unix that were inspired by, and have interesting parallels with, those used in mathematical software from the 1950s to the 1970s. The historical range of development and distribution practices can’t be reduced to a simple “open vs. closed source” dichotomy.

I also think we should recognize that quasi-academic environments like Bell Labs and the US national labs were crucial for software development during this period. They are less focused on publication than universities; but on the other hand, they are less focused on products than regular corporate development groups.

In the era of the “software crisis”, in the late 1960s, it was acknowledged that operating systems and compilers were very complicated. They were becoming too complicated for one or two people to code. Even earlier in the 1960s it was very hard for one or two people to take on a full-size operating system and compiler. This is why software had a bad record at that time: the few who could do systems software well tended to become famous computer scientists. However, by 1968 even the geniuses couldn’t handle all the complexity involved in developing systems with all the functionality expected of operating systems that could deal with time-sharing and the requirements of commercial software. In the 1968 conference proceedings, the need for technical architecture to modularize the production of system software appears as an obvious solution to this issue, as opposed to the “Chinese army” approach Dijkstra described. But figuring out a workable modularization method is hard.

We are left with an open question: is Unix the first solution to the “software crisis” that actually worked? It had the technological and social mechanisms to modularize the operating system; it created chunks that one or two very smart people could design and code. We can see its success in various areas, as in producing generalized tools that embed the tacit knowledge to do the hardest stuff, like “lex” and “yacc”. In the mid-1960s developing a good compiler was a huge challenge, and development delays and disasters were much discussed in the computer industry trade press and at the 1968 NATO conference. Thanks to these Unix tools, and the new body of theory they embodied, by the end of the 1970s the creation of a compiler could be assigned as a class exercise to undergraduate students. Many attempts were later made to solve the same problem of creating the tools that people could use to produce complex software in modularized ways: design patterns, object oriented languages, application frameworks, source code repositories and version control systems.

Thus, we can understand Unix as a response to the broader software crisis, to the broader range of problems in the 1960s, and not just as an answer to the specific problems of Multics, as it is usually understood. We can also see the connection of the Unix project to the work
Even earlier in the 1960s tion of a compiler could be assigned as a it was very hard for one or two people to class exercise to undergraduate students. take on a full-size operating system and Many attempts are later made to solve compiler. This is why software had a bad the same problem of creating the tools record at that time: the few who could do that people could use to produce complex systems software well tended to become software in modularized ways: design pat- famous computer scientists. However, terns, object oriented languages, applica- by 1968 even the geniuses couldn’t tion frameworks, source code repositories handle all the complexity involved in de- and version control systems. veloping systems with all the functiona- lities which were expected for operating Thus, we can understand Unix as a systems that could deal with time-sha- response to the broader software crisis, ring and the requirements of commercial in the broader range of problems in the software. In the 1968 conference procee- 1960s, and not just as an answer to the dings, the need for technical architecture specific problems of Multics, as it is to modularize the production of system usually understood. We can also see the software appears as an obvious solution connection of the Unix project to the work

that was going on at Bell Labs around formal methods and compilers, which would be very interesting for historians to dig into. The same goes for other ideas circulating at around the same time, such as structured programming (the most famous IFIP WG 2.3 contribution), chief programmer teams, etc.

In that sense, Unix is an idea about the structuring and management of software creation, an idea that is realized by features embedded in the code itself, including pipes. We can also see Unix as a model for the distribution and packaging of software. The work of situating Unix within the broader history of software and computer science has barely begun.

Bibliography

Akera A. (2001). "Voluntarism and the fruits of collaboration: The IBM user group, Share." Technology and Culture, 42(4), pp.710-736.

Brooks F. (1975). "The mythical man month." Proceedings of the International Conference on Reliable Software, Los Angeles, California (April 21-23, 1975), ACM.

Campbell-Kelly M. & Aspray W. (1996). Computer: A History of the Information Machine. New York, NY: Basic Books.

Cowell W. (1984). Sources and Development of Mathematical Software. New York: Prentice-Hall.

Dijkstra E. W. (1972). "The humble programmer". Communications of the ACM, 15(10), pp.859-866.

Ensmenger N. (2010). The Computer Boys Take Over. Cambridge (Mass.): MIT Press.

Haigh T. (2010). "John R. Rice: Mathematical Software Pioneer." IEEE Annals of the History of Computing, 32, no. 4, pp.72-80.

Hashagen U., Keil-Slawik R. & Norberg A. L., eds. (2002). Mapping the History of Computing: Software Issues. New York: Springer-Verlag.

MacKenzie D. (2001). Mechanizing Proof. Cambridge, MA: MIT Press.

Mahoney M. S. (1988). "The History of Computing in the History of Technology". Annals of the History of Computing, 10, no. 2, pp.113-125.

Mahoney M. S. (2004). "Finding a History for Software Engineering". IEEE Annals of the History of Computing, 26, no. 1, pp.8-19.

Randell B. (2003). "Edsger Dijkstra". Ninth IEEE International Workshop on Object-Oriented Real-Time Dependable Systems (WORDS'03F), IEEE Computer Society.
