A Science of Operations: Machines, Logic and the Invention of Programming


A Science of Operations: Machines, Logic and the Invention of Programming

History of Computing

Series Editor: Martin Campbell-Kelly, University of Warwick, Coventry, UK

Advisory Board: Gerard Alberts, University of Amsterdam, Amsterdam, The Netherlands; Jack Copeland, University of Canterbury, Christchurch, New Zealand; Ulf Hashagen, Deutsches Museum, Munich, Germany; John V. Tucker, Swansea University, Swansea, UK; Jeffrey R. Yost, University of Minnesota, Minneapolis, USA

The History of Computing series publishes high-quality books which address the history of computing, with an emphasis on the 'externalist' view of this history, more accessible to a wider audience. The series examines content and history from four main quadrants: the history of relevant technologies, the history of the core science, the history of relevant business and economic developments, and the history of computing as it pertains to social history and societal developments. Titles can span a variety of product types, including but not exclusively, themed volumes, biographies, 'profile' books (with brief biographies of a number of key people), expansions of workshop proceedings, general readers, scholarly expositions, titles used as ancillary textbooks, revivals and new editions of previous worthy titles. These books will appeal, varyingly, to academics and students in computer science, history, mathematics, business and technology studies. Some titles will also directly appeal to professionals and practitioners of different backgrounds.

Author guidelines: springer.com > Authors > Author Guidelines
For other titles published in this series, go to www.springer.com/series/8442

Mark Priestley
A Science of Operations: Machines, Logic and the Invention of Programming

Dr. Mark Priestley, London, UK
[email protected]
url: http://www.markpriestley.net

ISSN 2190-6831    e-ISSN 2190-684X
ISBN 978-1-84882-554-3    e-ISBN 978-1-84882-555-0
DOI 10.1007/978-1-84882-555-0
Springer London Dordrecht Heidelberg New York

British Library Cataloguing in Publication Data: A catalogue record for this book is available from the British Library.
Library of Congress Control Number: 2011921403

© Springer-Verlag London Limited 2011

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms of licenses issued by the Copyright Licensing Agency. Enquiries concerning reproduction outside those terms should be sent to the publishers.

The use of registered names, trademarks, etc., in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant laws and regulations and therefore free for general use. The publisher makes no representation, express or implied, with regard to the accuracy of the information contained in this book and cannot accept any legal responsibility or liability for any errors or omissions that may be made.

Cover design: deblik
Printed on acid-free paper
Springer is part of Springer Science+Business Media (www.springer.com)

Preface

It has long been recognized that there is a relationship between logic and computer programming.
Since the 1940s many logicians, including Alan Turing and John von Neumann, and computer scientists such as Edsger Dijkstra and John McCarthy, have drawn attention to the connection between the two disciplines and commented on its importance. During the formal methods boom of the 1980s and 1990s it was common to read assertions to the effect that programming could be reduced to a quasi-logical process of theorem proving and formula derivation. However, despite having a background in both programming and logic, I found something elusive about such assertions. My practical experience of programming had led me to understand it as a very different type of activity from the formal manipulations of logical proof. With the intention of resolving this uncertainty, I embarked upon a Ph.D. to study the ways in which the development of programming languages had been influenced by logic. This book is based on the resulting thesis, although it has a wider focus and marks a further stage in my growing understanding of the connection between the two areas.

Broadly speaking, I have come to understand this connection not as a fact about the two disciplines, but as something more like a decision made by identifiable historical actors as to how the new discipline of programming should be understood and structured. A key event in this process was the creation of the Algol language in the years around 1960. The motivation for this decision has very deep roots, however, reaching back to ideas about machines and machinic processes that date from the very beginning of the scientific revolution.

From this starting point, the book gives an account of the history of what we now call programming. It is useful, however, to view this not simply as computer programming, but to think more generally of attempts to define the steps involved in computations and other information-processing activities in such a way that they could be performed by machines, or at least by humans mimicking the behaviour of machines. From this perspective, the history of programming is distinct from the history of the computer, despite the close relationship between the two in the twentieth century.

The story told in this book stops in the 1970s, at the point where object-oriented programming emerged as a successor, or alternative, to structured programming. This is not an arbitrary cut-off point; rather, I believe that structured programming marks the completion of a particular historical trajectory, and that object-orientation, despite its rich inheritance from existing practices, is something quite different, or at least something which has deep roots in approaches and disciplines other than logic. Untangling these roots is part of a different project, however, and the history of how programming and programming languages developed after the 1970s is part of a different story from the one told here.

There are several things this book is not, therefore. Although it contains much historical material, it does not pretend to be a complete history of programming languages, and still less a history of the computer. What it does try to do is to tell a particular story about these historical developments and to show one way in which the history of programming languages can be set in a wider context and seen as an integral part of a development which, ultimately, is central to the project started by the scientific revolution in the seventeenth century.
The history of computing has recently been expanding its focus beyond technical details and embracing wider narratives and historical contexts. These encouraging developments have been principally evident in studies of commercial and military applications of computing and their societal impacts. If I had a single hope for this book, it would be that it might contribute to a similar process in the more technical aspects of the subject and inspire computer scientists and historians of science and technology to see programming not as an isolated technical field, but as an interesting and important part of general intellectual history.

Acknowledgements: I would like particularly to thank my supervisor Donald Gillies and external examiner John Tucker, without whose enthusiastic advice, support and encouragement this book would never have seen the light of day.

London, UK
Mark Priestley

Contents

1 Introduction  1
  1.1 Minds, Method and Machines  3
  1.2 Language and Science  4
  1.3 The Age of Machinery  7
  1.4 The Mechanization of Mathematical Language  8
2 Babbage's Engines  17
  2.1 The Division of Mental Labour  18
  2.2 The Difference Engine  21
  2.3 The Meanings of the Difference Engine  25
  2.4 The Mechanical Notation  28
  2.5 The Analytical Engine  31
  2.6 The Science of Operations  41
  2.7 The Meanings of the Analytical Engine  44
  2.8 Conclusions  48
3 Semi-Automatic Computing  53
  3.1 The Census Problem  53
  3.2 The Hollerith Tabulating System of 1890  55
  3.3 Further Developments in Punched Card Machines  57
  3.4 Comrie and the Mechanization of Scientific Calculation  60
  3.5 Semi-Automatic Programming  65
4 Logic, Computability and Formal Systems  67
  4.1 Gödel's Construction  69
  4.2 Recursive Functions  72
  4.3 λ-definability  74
  4.4 Direct Approaches to Defining Effective Computability  75
  4.5 Turing's Machine Table Notation  77
  4.6 Universal Machines  89
  4.7 The Concept of a Formal Language  92
  4.8 The Relationship Between Turing's Work and Logic  96
5 Automating Control  99
  5.1 Konrad Zuse's Early Machines  100
  5.2 Mark I: The Automatic Sequence Controlled Calculator  102
  5.3 The ENIAC  107
  5.4 The Bell Labs Relay Machines
Recommended publications
  • Oral History Interview with Brian Randell, January 7, 2021, Via Zoom
    Oral History Interview with Brian Randell, January 7, 2021, via Zoom. Conducted by William Aspray, Charles Babbage Institute.

    Abstract: Brian Randell tells about his upbringing and his work at English Electric, IBM, and Newcastle University. The primary topic of the interview is his work in the history of computing. He discusses his discovery of the Irish computer pioneer Percy Ludgate, the preparation of his edited volume The Origins of Digital Computers, various lectures he has given on the history of computing, his PhD supervision of Martin Campbell-Kelly, the Computer History Museum, his contribution to the second edition of A Computer Perspective, and his involvement in making public the World War 2 Bletchley Park Colossus code-breaking machines, among other topics. This interview is part of a series of interviews on the early history of the history of computing.

    Keywords: English Electric, IBM, Newcastle University, Bletchley Park, Martin Campbell-Kelly, Computer History Museum, Jim Horning, Gwen Bell, Gordon Bell, Enigma machine, Curta (calculating device), Charles and Ray Eames, I. Bernard Cohen, Charles Babbage, Percy Ludgate.

    Aspray: This is an interview on the 7th of January 2021 with Brian Randell. The interviewer is William Aspray. We're doing this interview via Zoom. Brian, could you briefly talk about when and where you were born, a little bit about your growing up and your interests during that time, all the way through your formal education?

    Randell: Ok. I was born in 1936 in Cardiff, Wales. Went to school, high school, there. In retrospect, one of the things I missed out then was learning or being taught Welsh.
  • A Universal Modular ACTOR Formalism for Artificial Intelligence
    A Universal Modular ACTOR Formalism for Artificial Intelligence. Carl Hewitt, Peter Bishop, Richard Steiger.

    Abstract: This paper proposes a modular ACTOR architecture and definitional method for artificial intelligence that is conceptually based on a single kind of object: actors [or, if you will, virtual processors, activation frames, or streams]. The formalism makes no presuppositions about the representation of primitive data structures and control structures. Such structures can be programmed, micro-coded, or hard wired in a uniform modular fashion. In fact it is impossible to determine whether a given object is "really" represented as a list, a vector, a hash table, a function, or a process. The architecture will efficiently run the coming generation of PLANNER-like artificial intelligence languages including those requiring a high degree of parallelism. The efficiency is gained without loss of programming generality because it only makes certain actors more efficient; it does not change their behavioral characteristics. The architecture is general with respect to control structure and does not have or need goto, interrupt, or semaphore primitives. The formalism achieves the goals that the disallowed constructs are intended to achieve by other more structured methods.

    PLANNER Progress. "Programs should not only work, but they should appear to work as well." (PDP-1X Dogma.) The PLANNER project is continuing research in natural and effective means for embedding knowledge in procedures. In the course of this work we have succeeded in unifying the formalism around one fundamental concept: the ACTOR. Intuitively, an ACTOR is an active agent which plays a role on cue according to a script. We use the ACTOR metaphor to emphasize the inseparability of control and data flow in our model.
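The excerpt above claims that actors hide whether an object is "really" a list, a vector, a hash table, a function, or a process. A minimal sketch of that idea, written in Python and entirely separate from Hewitt, Bishop and Steiger's formalism (the class and message names below are invented for illustration), is two objects that answer the same put/get messages while using different internal representations; a client that only sends messages cannot tell them apart.

```python
# Illustrative sketch only: two "actors" that answer the same messages,
# 'put' and 'get', while hiding whether the underlying representation is
# a list or a hash table. All names here are invented for illustration.

class ListActor:
    """Answers put/get messages; internally a list of (key, value) pairs."""
    def __init__(self):
        self._items = []

    def receive(self, message, *args):
        if message == "put":
            key, value = args
            self._items.append((key, value))
        elif message == "get":
            key, = args
            for k, v in reversed(self._items):   # most recent binding wins
                if k == key:
                    return v
            return None

class TableActor:
    """Answers the same messages; internally a hash table."""
    def __init__(self):
        self._table = {}

    def receive(self, message, *args):
        if message == "put":
            key, value = args
            self._table[key] = value
        elif message == "get":
            return self._table.get(args[0])

# A client that only sends messages cannot distinguish the two representations.
for actor in (ListActor(), TableActor()):
    actor.receive("put", "x", 42)
    assert actor.receive("get", "x") == 42
```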
  • Specification of Agent Explicit Knowledge in Cryptographic Protocols
    Specification of Agent Explicit Knowledge in Cryptographic Protocols. Khair Eddin Sabri, Ridha Khedri, and Jason Jaskolka, Department of Computing and Software, Faculty of Engineering, McMaster University ({sabrike, khedri, jaskolj}@mcmaster.ca). International Journal of Electrical and Computer Engineering 4:2, 2009.

    Abstract: Cryptographic protocols are widely used in various applications to provide secure communications. They are usually represented as communicating agents that send and receive messages. These agents use their knowledge to exchange information and communicate with other agents involved in the protocol. An agent's knowledge can be partitioned into explicit knowledge and procedural knowledge. The explicit knowledge refers to the set of information which is either proper to the agent or directly obtained from other agents through communication. The procedural knowledge relates to the set of mechanisms used to get new information from what is already available to the agent. In this paper, we propose a mathematical framework which specifies the explicit knowledge of an agent involved in a cryptographic protocol.

    Introduction (excerpt): The procedural knowledge involves a set of mechanisms/functions that enables an agent to obtain new information from its explicit knowledge. For example, if the explicit knowledge of an agent contains an encrypted message as well as the key and the cipher used to decrypt the message, then by using the procedural knowledge, the secret can be obtained. In this paper, we focus only on the explicit knowledge. Both parts of the agent knowledge are necessary in analyzing cryptographic protocols. When we analyze the literature using this classification of agent knowledge, we find several representations of that knowledge.
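The worked example in the excerpt above (an agent that holds a ciphertext, a key and a cipher can derive the secret) can be sketched in a few lines of Python. This is only an illustration of the explicit/procedural distinction, not the paper's mathematical framework; the Agent class, the toy XOR cipher and every name below are invented assumptions.

```python
# Minimal sketch: an agent's *explicit* knowledge is the set of items it holds
# (here a ciphertext and a key), while its *procedural* knowledge is the set of
# functions it can apply to derive new items (here a toy XOR cipher).

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class Agent:
    def __init__(self, explicit_knowledge: dict, procedures: dict):
        self.explicit = explicit_knowledge    # facts/items the agent holds
        self.procedures = procedures          # mechanisms it can apply

    def derive(self):
        """Apply procedural knowledge to explicit knowledge to obtain new items."""
        if ("ciphertext" in self.explicit and "key" in self.explicit
                and "decrypt" in self.procedures):
            secret = self.procedures["decrypt"](
                self.explicit["ciphertext"], self.explicit["key"])
            self.explicit["secret"] = secret  # the derived item becomes explicit
        return self.explicit

key = b"k3y"
ciphertext = xor_cipher(b"launch code", key)

agent = Agent({"ciphertext": ciphertext, "key": key}, {"decrypt": xor_cipher})
print(agent.derive()["secret"])               # b'launch code'
```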
  • 195907-08.Pdf
    [Garbled OCR of what appears to be a 1959 advertisement for Telemeter Magnetics, Inc. The recoverable content describes magnetic-core memory products (cores, arrays, buffers and complete memory systems) built from solid-state circuitry on plug-in units, offered in a range of capacities and with cycle times of a few microseconds, together with a note about job opportunities at the company.]
  • Non-Locality, Contextuality and Valuation Algebras: A General Theory of Disagreement (Samson Abramsky, Giovanni Carù)
    Non-locality, contextuality and valuation algebras: a general theory of disagreement. Samson Abramsky and Giovanni Carù, Department of Computer Science, University of Oxford, Wolfson Building, Parks Road, Oxford OX1 3QD, U.K. (arXiv:1911.03521 [quant-ph], 8 Nov 2019.)

    Subject areas: Quantum information, quantum foundations, generic inference. Keywords: Contextuality, generic inference, valuation algebras, algorithms. Email address: Samson Abramsky, [email protected].

    Abstract: We establish a strong link between two apparently unrelated topics: the study of conflicting information in the formal framework of valuation algebras, and the phenomena of non-locality and contextuality. In particular, we show that these peculiar features of quantum theory are mathematically equivalent to a general notion of disagreement between information sources. This result vastly generalises previously observed connections between contextuality, relational databases, constraint satisfaction problems, and logical paradoxes, and gives further proof that contextual behaviour is not a phenomenon limited to quantum physics, but pervades various domains of mathematics and computer science. The connection allows us to translate theorems, methods and algorithms from one field to the other, and paves the way for the application of generic inference algorithms to study contextuality.

    1. Introduction: Non-locality and contextuality are characteristic features of quantum theory which have recently been proven to play a crucial role as fundamental resources for quantum information and computation [1,2]. In 2011, Abramsky and Brandenburger introduced an abstract mathematical framework based on sheaf theory to describe these phenomena in a unified treatment, thereby providing a common general theory for the study of non-locality and contextuality, which had until then been carried out in a rather concrete, example-driven fashion [3].
  • On Homomorphism of Valuation Algebras∗
    On Homomorphism of Valuation Algebras. Guan Xue-chong (1,2) and Li Yong-ming (3). (1) College of Mathematics and Information Science, Shaanxi Normal University, Xi'an, 710062; (2) School of Mathematical Science, Xuzhou Normal University, Xuzhou, Jiangsu, 221116; (3) College of Computer Science, Shaanxi Normal University, Xi'an, 710062. Communications in Mathematical Research 27(1) (2011), 181–192. Communicated by Du Xian-kun.

    Abstract: In this paper, firstly, a necessary condition and a sufficient condition for an isomorphism between two semiring-induced valuation algebras to exist are presented respectively. Then a general valuation homomorphism based on different domains is defined, and the corresponding homomorphism theorem of valuation algebra is proved.

    Key words: homomorphism, valuation algebra, semiring. 2000 MR subject classification: 16Y60, 94A15. Document code: A. Article ID: 1674-5647(2011)01-0181-12.

    1 Introduction: Valuation algebra is an abstract formalization from artificial intelligence including constraint systems (see [1]), Dempster-Shafer belief functions (see [2]), database theory, logic, etc. There are three operations, labeling, combination and marginalization, in a valuation algebra. With these operations on valuations, the system can combine information, and get information on a designated set of variables by marginalization. With further research and theoretical development, a new type of valuation algebra named domain-free valuation algebra has also been put forward. As valuation algebra is an algebraic structure, the concept of homomorphism between valuation algebras, which is derived from the classic study of universal algebra, has been defined naturally. Meanwhile, recent studies in [3], [4] have shown that valuation algebras induced by semirings play a very important role in applications.
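The introduction above names the three valuation-algebra operations: labeling, combination and marginalization. The Python sketch below shows how these operations fit together, using toy probability-style tables over binary variables as the valuations. It is an illustration only, not the semiring-induced construction studied in the paper, and every class and variable name is an invented assumption.

```python
# Toy valuation algebra: valuations are tables over assignments to binary
# variables; labeling returns a valuation's domain, combination multiplies
# tables on the union of domains, marginalization sums variables out.

from itertools import product

class Potential:
    def __init__(self, variables, table):
        self.variables = tuple(variables)      # the valuation's domain
        self.table = dict(table)               # {assignment tuple: number}

    def label(self):
        """Labeling: the set of variables this valuation is about."""
        return set(self.variables)

    def combine(self, other):
        """Combination: pointwise product on the union of the two domains."""
        variables = tuple(dict.fromkeys(self.variables + other.variables))
        table = {}
        for assignment in product((0, 1), repeat=len(variables)):
            env = dict(zip(variables, assignment))
            a = tuple(env[v] for v in self.variables)
            b = tuple(env[v] for v in other.variables)
            table[assignment] = self.table[a] * other.table[b]
        return Potential(variables, table)

    def marginalize(self, keep):
        """Marginalization: sum out every variable not in `keep`."""
        kept = tuple(v for v in self.variables if v in keep)
        table = {}
        for assignment, value in self.table.items():
            env = dict(zip(self.variables, assignment))
            key = tuple(env[v] for v in kept)
            table[key] = table.get(key, 0) + value
        return Potential(kept, table)

# Combine knowledge about (X, Y) with knowledge about (Y, Z), then ask about Z.
p_xy = Potential(("X", "Y"), {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3})
p_yz = Potential(("Y", "Z"), {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.9, (1, 1): 0.1})
answer = p_xy.combine(p_yz).marginalize({"Z"})
print(answer.label(), answer.table)
```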
  • The Roots of Software Engineering*
    THE ROOTS OF SOFTWARE ENGINEERING. Michael S. Mahoney, Princeton University. (CWI Quarterly 3,4 (1990), 325-334.)

    At the International Conference on the History of Computing held in Los Alamos in 1976, R.W. Hamming placed his proposed agenda in the title of his paper: "We Would Know What They Thought When They Did It." He pleaded for a history of computing that pursued the contextual development of ideas, rather than merely listing names, dates, and places of "firsts". Moreover, he exhorted historians to go beyond the documents to "informed speculation" about the results of undocumented practice. What people actually did and what they thought they were doing may well not be accurately reflected in what they wrote and what they said they were thinking. His own experience had taught him that.

    Historians of science recognize in Hamming's point what they learned from Thomas Kuhn's Structure of Scientific Revolutions some time ago, namely that the practice of science and the literature of science do not necessarily coincide. Paradigms (or, if you prefer with Kuhn, disciplinary matrices) direct not so much what scientists say as what they do. Hence, to determine the paradigms of past science historians must watch scientists at work practicing their science. We have to reconstruct what they thought from the evidence of what they did, and that work of reconstruction in the history of science has often involved a certain amount of speculation informed by historians' own experience of science. That is all the more the case in the history of technology, where up to the present century the inventor and engineer have, as Derek Price once put it, "thought with their fingertips", leaving the record of their thinking in the artefacts they have designed rather than in texts they have written.
  • Ontological Engineering for Public and Private Procurement
    Ontological Engineering for Public and Private Procurement. European Union / Brazil.

    Ministry of Government Secretariat: Minister of State, Geddel Vieira Lima; Secretary of Micro and Small Enterprises, José Ricardo da Veiga; Director of Markets and Innovation, Alexandre Monteiro e Silva. Technical support team: Fabio Medeiros de Souza (General-Coordinator of Markets Access), Carlos Veloso de Melo Jr. (Foreign Trade Analyst), Janio Moreira da Costa (Information Technology Analyst).

    European Union Delegation to Brazil: Head of the European Union Delegation, João Gomes Cravinho; Minister Counsellor, Head of Development and Cooperation Section, Thierry Dudermel; Cooperation Attaché, EU-Brazil Sector Dialogues Support Facility Coordinator, Asier Santillan Luzuriaga. Implementing consortium: CESO Development Consultants / FIIAPP / INA / CEPS.

    Ministry of Planning, Development and Management: Minister of State, Dyogo Oliveira; Secretary of Management, Gleisson Cardoso Rubin; Project National Director, Marcelo Mendes Barbosa.

    Contacts: Project Coordination Unit, European Union-Brazil Sector Dialogues Support Facility, Secretariat of Public Management, Ministry of Planning, Development and Management. Telephone: +55 61 2020.4645/4168/4785. [email protected] | www.sectordialogues.org

    Authors: Edilson Ferneda ([email protected]), Fernando William Cruz ([email protected]).

    © 2016 European Union. Brasília, March 2016. The content of this publication does not reflect the official opinion of the European Union and the Brazilian Government; responsibility for the information and views expressed therein lies entirely with the author.

    List of Figures and Charts: Figure 1.1, Some of the main knowledge organization systems.
  • Fiendish Designs
    Fiendish Designs: A Software Engineering Odyssey. © Tim Denvir 2011.

    Preface: These are notes, incomplete but extensive, for a book which I hope will give a personal view of the first forty years or so of Software Engineering. Whether the book will ever see the light of day, I am not sure. These notes have come, I realise, to be a memoir of my working life in SE. I want to capture not only the evolution of the technical discipline which is software engineering, but also the climate of social practice in the industry, which has changed hugely over time. To what extent, if at all, others will find this interesting, I have very little idea. I mention other, real people by name here and there. If anyone prefers me not to refer to them, or wishes to offer corrections on any item, they can email me (see Contact on Home Page).

    Introduction: Everybody today encounters computers. There are computers inside petrol pumps, in cash tills, behind the dashboard instruments in modern cars, and in libraries, doctors' surgeries and beside the dentist's chair. A large proportion of people have personal computers in their homes and may use them at work, without having to be specialists in computing. Most people have at least some idea that computers contain software, lists of instructions which drive the computer and enable it to perform different tasks. The term "software engineering" wasn't coined until 1968, at a NATO-funded conference, but the activity that it stands for had been carried out for at least ten years before that.
  • A Purdue University Course in the History of Computing
    Purdue University, Purdue e-Pubs, Department of Computer Science Technical Reports, Department of Computer Science, 1991.

    A Purdue University Course in the History of Computing. Saul Rosen. Report Number: 91-023. Rosen, Saul, "A Purdue University Course in the History of Computing" (1991). Department of Computer Science Technical Reports. Paper 872. https://docs.lib.purdue.edu/cstech/872. This document has been made available through Purdue e-Pubs, a service of the Purdue University Libraries. Please contact [email protected] for additional information.

    A Purdue University Course in the History of Computing. Saul Rosen. CSD-TR-91-023. March 1991.

    Abstract: University courses in the history of computing are still quite rare. This paper is a discussion of a one-semester course that I have taught during the past few years in the Department of Computer Sciences at Purdue University. The amount of material that could be included in such a course is overwhelming, and the literature in the subject has increased at a great rate in the past decade. A syllabus and list of major references are presented here as they are presented to the students. The course develops a few major themes, several of which are discussed briefly in the paper. The course has been an interesting and stimulating experience for me, and for some of the students who took the course. I do not foresee a rapid expansion in the number of universities that will offer courses in the history of computing.
  • Saskia Lourens, Writing History
    UvA-DARE (Digital Academic Repository). Writing history: national identity in André Brink's post-apartheid fiction. Lourens, S.T. Publication date: 2009. Document version: final published version. Link to publication.

    Citation for published version (APA): Lourens, S. T. (2009). Writing history: national identity in André Brink's post-apartheid fiction.

    General rights: It is not permitted to download or to forward/distribute the text or part of it without the consent of the author(s) and/or copyright holder(s), other than for strictly personal, individual use, unless the work is under an open content license (like Creative Commons).

    Disclaimer/Complaints regulations: If you believe that digital publication of certain material infringes any of your rights or (privacy) interests, please let the Library know, stating your reasons. In case of a legitimate complaint, the Library will make the material inaccessible and/or remove it from the website. Please Ask the Library: https://uba.uva.nl/en/contact, or a letter to: Library of the University of Amsterdam, Secretariat, Singel 425, 1012 WP Amsterdam, The Netherlands. You will be contacted as soon as possible. UvA-DARE is a service provided by the library of the University of Amsterdam (https://dare.uva.nl). Download date: 30 Sep 2021.

    Writing History: National Identity in André Brink's Post-Apartheid Fiction. Academic dissertation for the degree of doctor at the University of Amsterdam, under the authority of the Rector Magnificus, prof. dr. D. C. van den Boom, to be defended in public before a committee appointed by the Doctorate Board in the Aula of the University on Wednesday 28 October 2009 at 10:00, by Saskia Theodora Lourens, born in Port Elizabeth, South Africa. Doctoral committee. Supervisor: prof.
  • Actor Model of Computation
    Actor Model of Computation. Carl Hewitt (http://carlhewitt.info). Published in arXiv: http://arxiv.org/abs/1008.1459. November 7, 2010. This paper is dedicated to Alonzo Church and Dana Scott.

    The Actor model is a mathematical theory that treats "Actors" as the universal primitives of concurrent digital computation. The model has been used both as a framework for a theoretical understanding of concurrency, and as the theoretical basis for several practical implementations of concurrent systems. Unlike previous models of computation, the Actor model was inspired by physical laws. It was also influenced by the programming languages Lisp, Simula 67 and Smalltalk-72, as well as ideas for Petri Nets, capability-based systems and packet switching. The advent of massive concurrency through client-cloud computing and many-core computer architectures has galvanized interest in the Actor model.

    An Actor is a computational entity that, in response to a message it receives, can concurrently: send a finite number of messages to other Actors; create a finite number of new Actors; designate the behavior to be used for the next message it receives. There is no assumed order to the above actions and they could be carried out concurrently. In addition two messages sent concurrently can arrive in either order. Decoupling the sender from communications sent was a fundamental advance of the Actor model, enabling asynchronous communication and control structures as patterns of passing messages.

    Contents: Introduction (p. 3); Fundamental concepts (p. 3); Illustrations (p. 3); Modularity thru Direct communication and asynchrony (p. 3); Indeterminacy and Quasi-commutativity (p. 4); Locality and Security
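The three capabilities listed in the excerpt above (send messages, create new Actors, and designate the behaviour for the next message) can be made concrete with a toy sketch. The Python below is a deliberately simplified, single-threaded rendering, not a faithful concurrent Actor implementation and not Hewitt's formalism; the mailbox, send and Counter names are invented for illustration.

```python
# Toy, single-threaded rendering of the three Actor capabilities: sending
# messages, creating new actors, and designating ("becoming") the behaviour
# used for the next message. Messages are delivered one at a time from a queue.

from collections import deque

mailbox = deque()                         # pending (actor, message) pairs

def send(actor, message):
    mailbox.append((actor, message))      # capability 1: send a message

class Counter:
    """An actor whose behaviour is a function chosen per incoming message."""
    def __init__(self, count=0):
        self.behaviour = self.counting(count)

    def counting(self, count):
        def on_message(message):
            if message == "inc":
                # capability 3: designate the behaviour for the next message
                self.behaviour = self.counting(count + 1)
            elif message == "spawn":
                # capability 2: create a new actor and send it a message
                child = Counter(count)
                send(child, "inc")
            elif message == "show":
                print("count =", count)
        return on_message

# Drive the toy system: queue some messages, then deliver them in order.
root = Counter()
send(root, "inc"); send(root, "inc"); send(root, "spawn"); send(root, "show")
while mailbox:
    actor, message = mailbox.popleft()
    actor.behaviour(message)              # prints: count = 2
```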