Stanford University Artificial Intelligence Laboratory
Computer Science Department, Stanford, California

Name: Stephen Westfold    Programmer: SJW    File name: AIL2.XGP[AIH,SJW]
File last written: 22:07, 29 May 1979    Printed: 22:55, 29 May 1979

An Historical Overview of Artificial Intelligence Programming Languages

Stephen Westfold

Artificial Intelligence programming languages have had a central role in the history of Artificial Intelligence. They have two important functions. First, they allow programs demonstrating and testing AI ideas to be implemented conveniently. Second, they provide vehicles of thought: as with any high-level languages, they allow the user to concentrate on higher level concepts and avoid being distracted by low-level implementation details. In this article we distinguish between the general purpose languages widely used in AI, such as LISP and POP-2, and the higher level AI languages, such as PLANNER and QLISP. Using the concepts of the higher level languages imposes a structure on the way one thinks, which can be restrictive, but without some such structure it is very difficult to approach the problems of AI. Frequently, new ideas in AI are accompanied by a new AI language in which it is natural to apply these ideas. Usually, such a higher level language is built on an existing high-level AI language so that the desirable features of the host language do not have to be re-implemented in the new language. Figure 1 gives a rough indication of the directions in which AI languages developed and the major influences on the languages.

IPL was developed around 1956 by Newell, Shaw, and Simon [] as the first programming language specifically for AI. Its design was guided by ideas from psychology, especially the intuition of association. The primary elements of the language were symbols, as opposed to the numbers around which all other languages of the time were built. To form associations of these symbols, list processing was introduced. The objective was to enable programs to build data structures of unpredictable size and shape conveniently. The problem of unpredictable shape was solved by using data elements consisting of two fields, each of which could hold either a symbol or a pointer to another such data element. This simple arrangement allows arbitrary binary trees or list structure to be built. The problem of unpredictable size was handled by having a free list of data elements that are allocated to the various data structures as required. A major advantage of list structure is that new elements can be inserted into, and elements removed from, existing structure very simply. However, it is clearly desirable that elements deleted from all structures should be available for reuse in new structure. In IPL the user was responsible for returning cells to the free list when they were no longer required.
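To make the cell-and-free-list scheme concrete, here is a minimal sketch in modern Python rather than the notation of IPL or LISP; the class and function names are illustrative assumptions, not the vocabulary of any of the systems discussed. Each cell has two fields, either of which may hold a symbol or a pointer to another cell, and cells are taken from and manually returned to a free list, as in IPL.

```python
class Cell:
    """A two-field list element: each field holds a symbol or another Cell."""
    __slots__ = ("head", "tail")
    def __init__(self):
        self.head = None
        self.tail = None

# The free list: cells available for allocation, linked through their tail fields.
FREE_LIST_SIZE = 1000
free_list = None
for _ in range(FREE_LIST_SIZE):
    c = Cell()
    c.tail = free_list
    free_list = c

def allocate(head, tail):
    """Take a cell from the free list and fill in its two fields."""
    global free_list
    if free_list is None:
        raise MemoryError("free list exhausted")
    cell = free_list
    free_list = cell.tail
    cell.head, cell.tail = head, tail
    return cell

def release(cell):
    """Return a single cell to the free list (the IPL-style manual discipline)."""
    global free_list
    cell.head = None
    cell.tail = free_list
    free_list = cell

# Build the list (A B C), then return its cells by hand when no longer needed.
lst = allocate("A", allocate("B", allocate("C", None)))
while lst is not None:
    rest = lst.tail
    release(lst)
    lst = rest
```

Because either field may point to another cell, arbitrary trees and shared list structure can be built from the same pool, which is exactly what makes this manual release discipline error-prone once substructures are shared.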
For programs that build complex structures that include some sharing of sub-structures, however, it is difficult to determine in general, when an element is deleted from one structure, whether it, or any structure it points to, is part of any other structure. Later list processing systems have therefore taken responsibility for reclaiming elements no longer used. One method for doing this is to maintain a reference count for each element showing how many other elements point to it. Every primitive list operation that deletes a pointer must decrement the count of the element pointed to and, if the count reaches zero, reclaim the element and recursively delete any pointers from it. An alternative method for reclaiming elements is garbage collection. Periodically, such as when the free list is empty, the garbage collector traces all the pointers to elements accessible to the program. Tracing a pointer consists of marking the element pointed to and recursively tracing the pointers of this element. This process ensures that all elements still in use are marked, so the unmarked elements can be collected and added to the free list (a small illustrative sketch of such a marking pass is given below, following the discussion of LISP dialects). It turns out that there are some combinations of list operations where it is not feasible to maintain reference counts correctly [MOSE ?], whereas garbage collection can still be applied.

Another feature of IPL is the generator, a procedure for computing a series of values. It produces one value each time it is called and is then suspended, so that it starts from where it left off the next time it is called [see the Control Structures section for more information]. This idea was to turn up later in CONNIVER and similar languages.

Most of the programs of the early history of AI were written in IPL, in particular the version called IPL-V. These include the Logic Theorist (the first heuristic program), the General Problem Solver, the Newell-Shaw-Simon chess program, EPAM, Feldman's two-choice decision model, Lindsay's SAD SAM, the BASEBALL program of Green et al., and Tonge's assembly line balancing program.

[Figure 1. Historical chart of the development of AI languages, 1956-1975. The dotted lines indicate major influences; the bold lines indicate stronger relationships.]

Many of the ideas of IPL went into LISP. Unlike IPL, a rather low-level language with a sequence-of-instructions style (although it was considered quite a high-level language at the time of its invention), LISP is a high-level language with an algebraic style inspired by, but different from, the algebraic notation pioneered by FORTRAN. During the design phase of LISP some of the LISP ideas were implemented within FORTRAN, leading to the language FLPL [GELE]. FLPL was created so that some of the features of LISP would be available in a working system that could be used for writing a plane geometry program [GELE]. FLPL will not be discussed further.

The first dialect of LISP to be widely used was LISP 1.5. In the late sixties a number of new dialects of LISP were developed to meet the demand for sophisticated programming aids. The most important of these dialects are MACLISP and INTERLISP (derived from 940 LISP through BBN LISP). Another significantly used dialect is UCI LISP, an extension of STANFORD LISP 1.6 (itself an extended version of LISP 1.5) developed at the University of California at Irvine. UCI LISP will not be discussed further because its advanced features are basically a subset of those of INTERLISP.
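Returning to the storage-reclamation methods described above, the following is a minimal mark-and-sweep sketch in modern Python. The Cell class, the explicit mark bit, and the heap-as-a-list representation are assumptions made purely for illustration; they do not reflect the layout of IPL, LISP 1.5, or any later implementation. The collector marks every cell reachable from a set of roots and returns the unmarked cells to the free list.

```python
class Cell:
    """Two-field list element with a mark bit used during collection."""
    __slots__ = ("head", "tail", "marked")
    def __init__(self, head=None, tail=None):
        self.head, self.tail, self.marked = head, tail, False

def mark(obj):
    """Trace a pointer: mark the cell it reaches, then trace that cell's fields."""
    if isinstance(obj, Cell) and not obj.marked:
        obj.marked = True
        mark(obj.head)
        mark(obj.tail)

def collect(roots, heap):
    """Mark everything reachable from the roots, then sweep the rest to a free list."""
    for root in roots:
        mark(root)
    reclaimed = [cell for cell in heap if not cell.marked]
    for cell in heap:          # clear marks ready for the next collection
        cell.marked = False
    return reclaimed

# Example: three cells, of which only two remain reachable from the root.
a, b, c = Cell("A"), Cell("B"), Cell("C")
a.tail = b                     # root -> a -> b ; c is garbage
assert collect(roots=[a], heap=[a, b, c]) == [c]
```

A real collector would of course avoid unbounded recursion while marking and would sweep the heap in place rather than building a new list, but the reachability argument is the same: anything not marked from the roots cannot be in use and may be reclaimed.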
POP-2 was developed in the United Kingdom to have some of the features of LISP in an ALGOL-like syntax and to be suitable for implementation on medium-sized machines.

At the end of the sixties many AI researchers wished to have primitives in their languages to allow convenient associative retrieval of symbolic data. LEAP was one embodiment of these ideas, and it was added to SAIL, which was otherwise a version of ALGOL 60. In the LISP communities such ideas led to QA4 and PLANNER, higher level languages implemented in LISP. These languages were also designed to allow special purpose problem solving and deduction methods to be written conveniently. QLISP is a slightly revised version of QA4 made to fit cleanly into INTERLISP. CONNIVER embodied a reaction against PLANNER, giving the AI programmer lower level control while still maintaining many of the ideas of PLANNER.

A number of features first introduced in AI programming languages have been found to be useful in Computer Science more generally and have been included in many programming languages. It is difficult to trace the source of ideas with much certainty - frequently the same or similar ideas occur independently at different times and places - so some people probably would not agree with some of the items listed below. Also, the list is not intended to be complete.

Probably the most important idea so far has been list processing itself. This is one of the basic ideas of the field of data structures, with applications in many areas of Computer Science including complexity theory, operating systems and compilers. An early addition of list processing primitives to a more generally available language was SLIP, a subroutine package for FORTRAN [WEIZ 1963]. SLIP was used for some AI work but mainly in other fields. Since ALGOL 68, most general purpose programming languages, including PASCAL, ..., have had flexible data structures such as records, which owe much to the ideas of list processing. Garbage collection is typically used to manage the storage of such structures.

The designs of LISP and ALGOL overlapped in time, with McCarthy involved in both. He was influential in the decision to include both conditional expressions and recursion in ALGOL 60, having already decided to include them in LISP.

Symbol manipulation has been critical for the field of algebraic manipulation. Indeed, many algebraic manipulation systems, including MACSYMA [] and REDUCE [HEAR 19??], have used LISP as a base language. Some symbol manipulation facilities were included in the language COMIT, developed around 1960 []. COMIT is primarily a string handling language with some features specially designed to help in the analysis of natural language text. It was used for some AI work. It has been superseded to a large extent by SNOBOL [].

The applicative style of programming, pioneered in LISP (and also used in APL), has been suggested by a number of people, including Backus [BACK 1978], to be a more appropriate style than the von Neumann machine-oriented languages currently dominant.
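As a small illustration of the contrast drawn in the last paragraph, the sketch below writes the same computation once in an imperative, assignment-oriented style and once in an applicative style built from expressions; it is in modern Python rather than LISP or APL, purely for readability, and the function names are illustrative only.

```python
from functools import reduce

# Imperative, von Neumann style: a loop that repeatedly updates a variable.
def total_imperative(numbers):
    total = 0
    for n in numbers:
        total = total + n
    return total

# Applicative style: the result is a single expression formed by folding,
# with no assignment statements.
def total_applicative(numbers):
    return reduce(lambda acc, n: acc + n, numbers, 0)

# A recursive rendering of the same idea, closer in shape to a LISP definition.
def total_recursive(numbers):
    return 0 if not numbers else numbers[0] + total_recursive(numbers[1:])

assert total_imperative([1, 2, 3]) == total_applicative([1, 2, 3]) == total_recursive([1, 2, 3]) == 6
```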