IEEE EXPERT: Intelligent Systems & Their Applications — October 1996

Cover: A conversation with AI pioneer Oliver Selfridge • Using a knowledge-based architecture for a design automation application • Facing Web content distribution challenges • What makes a compelling empirical evaluation?

IEEE Computer Society, 50 years of service, 1946–1996. The Institute of Electrical and Electronics Engineers, Inc.

Oliver Selfridge — in from the start

Peter Selfridge
[email protected]

Driven by his curiosity about the nature of learning, Oliver Selfridge has spent over a half century enmeshed in the most exciting developments in artificial intelligence, communications, and computer science. A participant at the original conference at Dartmouth in 1956 (and at the Western Joint Computer Conference in Los Angeles the year before, which he considers the true start of AI), Selfridge formed working relationships and cemented friendships with AI's founding members—John McCarthy, Marvin Minsky, and Allen Newell, among others—as he went on to become a true AI pioneer himself.

Before retiring in 1993 after 10 years as Chief Scientist at GTE Laboratories' Computer and Information Systems Lab, he served as a member of the National Security Agency's Advisory Board for 20 years, chairing its Data Processing Panel for the last 15 of those. He also served on various advisory panels to the White House, as well as on the peer review committee for the National Institutes of Health (NIH), directed Project MAC and the Cambridge Project at MIT's Lincoln Labs, and was Staff Scientist at Bolt Beranek and Newman (BBN). He continues to write and speak on machine learning and AI, and especially on self-improving systems.

This profound, long-term familiarity with both the philosophical underpinnings of AI and the practical application of AI technology places him in a most advantageous position to comment on the growth of the field in this 40th anniversary year of the Dartmouth Conference. Peter Selfridge, Oliver's son and an AI luminary in his own right (AT&T Laboratories—Research, previously part of AT&T Bell Laboratories, member of IEEE Expert's Editorial Board), recently asked his father to assess AI's progress since its earliest days.

Peter Selfridge: How did you become interested in AI?

Oliver Selfridge: It was at MIT, a long time before the Dartmouth Conference, and I was studying mathematics under Norbert Wiener. By luck, of which I've had a great deal in my life, I was introduced to Walter Pitts, who was working with Warren McCulloch on a topic they called theoretical neurophysiology. I had studied logic, and through Walter, Warren, and Norbert got introduced to neural nets at that time. I went to the Pacific at the end of World War II with the US Navy and came back to graduate school, again at MIT. Norbert was then writing Cybernetics,1 and Walter and I were helping him with various aspects of it. As I studied mathematics (my original field) and interacted with Norbert, Warren, and Walter, I began to be interested in the specific processing that neural nets could do and even more interested in the general properties of learning.

At this point McCulloch and Pitts had written the first two AI papers (although it wasn't called that).2 The first showed that a neural net could work out certain kinds of problems, such as pattern recognition in the general cognitive sense, and the second discussed acquisition of patterns (how we know "universals"). These two works followed all the glorious mathematics that Turing and Gödel had done in the twenties and thirties about computability and Turing machines. This mathematics was, of course, the beginning of a formal description of what computability meant. Johnny Von Neumann visited us at MIT occasionally, so again by pure luck, before the age of twenty, I had been introduced to McCulloch, Pitts, Wiener, and Von Neumann.

Your original background is mathematics, but many of these ideas begin to verge into biology. Was this accidental?

I went through MIT, and so I got a very fair liberal arts training, including things such as biology. One of my roommates was Jerry Lettvin, a super neurophysiologist, and McCulloch had also been a neurophysiologist of some renown.3 When one looks at the early work on neural nets, you find yourself interested not just in the computing powers of simple processing units but also in the physiology of computing in the brain.

What was the next stage for you?

In 1951, I transferred from Fort Monmouth in New Jersey to MIT Lincoln Labs (Fort Monmouth had knuckled under to Senator Joe McCarthy, but Lincoln Labs had not, and therefore I moved). By another stroke of luck, I bumped into Marvin Minsky, who was fresh from a doctorate at Princeton and was a Fellow at Harvard. He was a very bright mathematician and thinker in general—his thesis was on a particular model of, again, neural nets. Marvin worked for me one summer and then got a job in the mathematics department at MIT, working next to RLE, the Research Laboratory for Electronics, where we met often with Warren, Walter, and the rest of the early AI community. (MIT's Radiation Laboratory played a crucial role during World War II. It invented and built all of the really powerful radar systems in the latter part of the war. It evolved into RLE, which is celebrating its 50th anniversary this fall.)

At that point, we were using the term artificial intelligence and thinking about many of the general ideas about intelligence that are still around today—the tough cognitive processing that people did, perception, games like chess. We were trying to tie in the modeling that could be done with neural nets to these more obvious intelligent behaviors of people. My stress was then, as it still is, to understand how learning takes place as the primary source of intelligence in people.

This was before computers were generally available—true? Was it before Von Neumann had invented the programmable computer? If so, how did you work—on paper?

We were essentially doing mathematical and algorithmic modeling on paper. This was in 1953, so we knew all about Von Neumann's work—he had visited us and understood what we were up to. And, while we weren't writing AI programs yet, we had had some experience with early computers. I had done some programming on Whirlwind, a very old computer put together by Jerry Wiesner at MIT to get money out of the Air Force (which it did superbly well). Lincoln Labs was a follow-on to that.

So, while Marvin and I were having these interesting thoughts, computer technology was exploding in the basement of Building C at Lincoln Labs, where they had the Memory Test computer, all vacuum tubes, and then the Transistor Test computer (TX-0). Jay Forrester was at MIT, and … interested—Wes Clark and Belmont Farley at Lincoln Labs come to mind. Belmont Farley wanted to be a wet neurophysiologist, and so he was interested in neural nets, too. By luck, at one point I made a trip out to the Rand Corporation—this was 1954—in Santa Monica, California. Rand was smaller then; I was talking to Willis Ware and a young, smart psychologist from Carnegie—it wasn't Carnegie Mellon then, it was Carnegie Institute of Technology—named Allen Newell. I talked about pattern recognition. And Allen and I hit it off, and this was really the trigger. I mean he was ready for it. He got the whole idea, he got turned on, and he went back to Carnegie and turned Herb Simon on. Those two went on to become, well, Newell and Simon of CMU, extraordinarily influential thinkers over the years.

[Photo: Oliver Selfridge, 1996.]

…and so yes, we were in our late twenties. There was a great deal of excitement.

Can you now talk about the Dartmouth Conference, which people think of as being the start of AI?

The Dartmouth Conference was funded by the Rockefeller Foundation, as I mentioned. It started, I think, on August 6, 1956, and lasted roughly four weeks. In that sense, it wasn't a conference like today—it went on much longer, and was much more loosely structured. Many people were invited and dropped by. I was not there the whole time. We would meet and give talks and argue, all those wonderful things. Many came who shared our goals and excitement: Roland Silver, Art Samuel, Leon Harmon. There were not many people doing research in computing, but there was lots of interest among those who were. John Backus, who invented Fortran, turned up there. And it was a John McCarthy show, so to speak. At this stage, he was already beginning to think about symbol processing. Allen Newell really pushed us all into thinking symbol processing rather than bit processing—well, not quite rather than, but as well as bit processing.
Recommended publications
  • Historical Perspective and Further Reading 162.E1
2.21 Historical Perspective and Further Reading (162.e1)

This section surveys the history of instruction set architectures over time, and we give a short history of programming languages and compilers. ISAs include accumulator architectures, general-purpose register architectures, stack architectures, and a brief history of ARMv7 and the x86. We also review the controversial subjects of high-level-language computer architectures and reduced instruction set computer architectures. The history of programming languages includes Fortran, Lisp, Algol, C, Cobol, Pascal, Simula, Smalltalk, C++, and Java, and the history of compilers includes the key milestones and the pioneers who achieved them.

Accumulator Architectures

Hardware was precious in the earliest stored-program computers. Consequently, computer pioneers could not afford the number of registers found in today's architectures. In fact, these architectures had a single register for arithmetic instructions. Since all operations would accumulate in one register, it was called the accumulator, and this style of instruction set is given the same name. For example, EDSAC in 1949 had a single accumulator. (Accumulator: archaic term for register. Online use of it as a synonym for "register" is a fairly reliable indication that the user has been around quite a while.)

The three-operand format of RISC-V suggests that a single register is at least two registers shy of our needs. Having the accumulator as both a source operand and the destination of the operation fills part of the shortfall, but it still leaves us one operand short. That final operand is found in memory.
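As a purely illustrative aside (not part of the excerpt above), here is a minimal Python sketch of the accumulator style just described: a single accumulator register, with the second operand always fetched from memory. The instruction names (LOAD, ADD, STORE), the memory layout, and the tiny interpreter are assumptions made for the example, not any real ISA.

```python
# Minimal sketch: computing c = a + b on a hypothetical one-accumulator machine.
# Instruction names and memory layout are illustrative assumptions.

memory = {"a": 3, "b": 4, "c": 0}   # named memory cells for readability
acc = 0                              # the single accumulator register

program = [
    ("LOAD", "a"),    # acc <- memory[a]
    ("ADD", "b"),     # acc <- acc + memory[b]  (second operand comes from memory)
    ("STORE", "c"),   # memory[c] <- acc
]

for op, addr in program:
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc

print(memory["c"])  # 7
```

A three-operand register machine in the RISC-V style would instead express the same computation as a single instruction (for example, add x5, x6, x7), with all three operands held in registers.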
  • John McCarthy
JOHN MCCARTHY: the uncommon logician of common sense. Excerpt from Out of their Minds: the lives and discoveries of 15 great computer scientists by Dennis Shasha and Cathy Lazere, Copernicus Press. August 23, 2004. "If you want the computer to have general intelligence, the outer structure has to be common sense knowledge and reasoning." — John McCarthy. When a five-year-old receives a plastic toy car, she soon pushes it and beeps the horn. She realizes that she shouldn't roll it on the dining room table or bounce it on the floor or land it on her little brother's head. When she returns from school, she expects to find her car in more or less the same place she last put it, because she put it outside her baby brother's reach. The reasoning is so simple that any five-year-old child can understand it, yet most computers can't. Part of the computer's problem has to do with its lack of knowledge about day-to-day social conventions that the five-year-old has learned from her parents, such as don't scratch the furniture and don't injure little brothers. Another part of the problem has to do with a computer's inability to reason as we do daily, a type of reasoning that's foreign to conventional logic and therefore to the thinking of the average computer programmer. Conventional logic uses a form of reasoning known as deduction. Deduction permits us to conclude from statements such as "All unemployed actors are waiters" and "Sebastian is an unemployed actor" the new statement that "Sebastian is a waiter." The main virtue of deduction is that it is "sound" — if the premises hold, then so will the conclusions.
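As a purely illustrative aside (not part of the excerpt), the syllogism above is easy to mechanize: the following minimal Python sketch encodes the two premises and derives the conclusion by forward chaining. The predicate names and data layout are made up for this example.

```python
# Minimal sketch of the deduction described above (illustrative only):
# from "all unemployed actors are waiters" and "Sebastian is an unemployed actor",
# conclude "Sebastian is a waiter".

facts = {("unemployed_actor", "Sebastian")}

# Each rule says: for every x, if premise(x) holds then conclusion(x) holds.
rules = [("unemployed_actor", "waiter")]

def deduce(facts, rules):
    """Apply each rule to every matching fact until nothing new can be added."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            for pred, subject in list(derived):
                if pred == premise and (conclusion, subject) not in derived:
                    derived.add((conclusion, subject))
                    changed = True
    return derived

print(deduce(facts, rules))
# {('unemployed_actor', 'Sebastian'), ('waiter', 'Sebastian')}
```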
  • John McCarthy – Father of Artificial Intelligence
Asia Pacific Mathematics Newsletter

John McCarthy – Father of Artificial Intelligence
V Rajaraman

[Photo: John McCarthy]

(Summary: In this article we summarise the contributions of John McCarthy to Computer Science. Among his contributions are: suggesting that the best method of using computers is in an interactive mode, a mode in which computers become partners of users enabling them to solve problems. This logically led to the idea of time-sharing of large computers by many users and computing becoming a utility — much like a power utility.)

Introduction

I first met John McCarthy when he visited IIT, Kanpur, in 1968. During his visit he saw that our computer centre, which I was heading, had two batch processing second generation computers — an IBM 7044/1401 and an IBM 1620, both of them were being used for "production jobs". IBM 1620 was used primarily to teach programming to all students of IIT and IBM 7044/1401 was used by research students and faculty besides a large number of guest users from several neighbouring universities and research laboratories. There was no interactive computer available for computer science and electrical engineering students to do hardware and software research. McCarthy was a great believer in the power of time-sharing computers. In fact one of his first important contributions was a memo he wrote in 1957 urging the Director of the MIT Computer Centre to modify the IBM 704 into a time-sharing machine [1]. He later persuaded Digital Equipment Corporation (who made the first mini computers and the PDP series of computers) to design a mini computer with a time-sharing operating system.
  • FPGAs as Components in Heterogeneous HPC Systems: Raising the Abstraction Level of Heterogeneous Programming
FPGAs as Components in Heterogeneous HPC Systems: Raising the Abstraction Level of Heterogeneous Programming
Wim Vanderbauwhede, School of Computing Science, University of Glasgow

A trip down memory lane

80 years ago: The Theory
Turing, Alan Mathison. "On computable numbers, with an application to the Entscheidungsproblem." J. of Math 58, no. 345-363 (1936): 5.
Church, Alonzo. "A set of postulates for the foundation of logic." Annals of Mathematics (1932): 346-366.
1936: Universal machine (Alan Turing)
1936: Lambda calculus (Alonzo Church)
1936: Stored-program concept (Konrad Zuse)
1937: Church-Turing thesis
1945: The von Neumann architecture

60-40 years ago: The Foundations
[Figure: The first working integrated circuit, 1958. © Texas Instruments.]
1957: Fortran, John Backus, IBM
1958: First IC, Jack Kilby, Texas Instruments
1965: Moore's law
1971: First microprocessor, Texas Instruments
1972: C, Dennis Ritchie, Bell Labs
1977: Fortran-77
1977: von Neumann bottleneck, John Backus

30 years ago: HDLs and FPGAs
[Figure: Algotronix CAL1024 FPGA, 1989. © Algotronix]
1984: Verilog
1984: First reprogrammable logic device, Altera
1985: First FPGA, Xilinx
1987: VHDL Standard IEEE 1076-1987
1989: Algotronix CAL1024, the first FPGA to offer random access to its control memory

20 years ago: High-level Synthesis
Page, Ian. "Closing the gap between hardware and software: hardware-software cosynthesis at Oxford." (1996): 2-2.
1996: Handel-C, Oxford University
2001: Mitrion-C, Mitrionics
2003: Bluespec, MIT
2003: MaxJ, Maxeler Technologies
2003: Impulse-C, Impulse Accelerated
  • The Computational Attitude in Music Theory
    The Computational Attitude in Music Theory Eamonn Bell Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate School of Arts and Sciences COLUMBIA UNIVERSITY 2019 © 2019 Eamonn Bell All rights reserved ABSTRACT The Computational Attitude in Music Theory Eamonn Bell Music studies’s turn to computation during the twentieth century has engendered particular habits of thought about music, habits that remain in operation long after the music scholar has stepped away from the computer. The computational attitude is a way of thinking about music that is learned at the computer but can be applied away from it. It may be manifest in actual computer use, or in invocations of computationalism, a theory of mind whose influence on twentieth-century music theory is palpable. It may also be manifest in more informal discussions about music, which make liberal use of computational metaphors. In Chapter 1, I describe this attitude, the stakes for considering the computer as one of its instruments, and the kinds of historical sources and methodologies we might draw on to chart its ascendance. The remainder of this dissertation considers distinct and varied cases from the mid-twentieth century in which computers or computationalist musical ideas were used to pursue new musical objects, to quantify and classify musical scores as data, and to instantiate a generally music-structuralist mode of analysis. I present an account of the decades-long effort to prepare an exhaustive and accurate catalog of the all-interval twelve-tone series (Chapter 2). This problem was first posed in the 1920s but was not solved until 1959, when the composer Hanns Jelinek collaborated with the computer engineer Heinz Zemanek to jointly develop and run a computer program.
  • 1. with Examples of Different Programming Languages Show How Programming Languages Are Organized Along the Given Rubrics: I
AGBOOLA ABIOLA, CSC302, 17/SCI01/007, COMPUTER SCIENCE ASSIGNMENT

1. With examples of different programming languages show how programming languages are organized along the given rubrics:
i. Unstructured, structured, modular, object oriented, aspect oriented, activity oriented and event oriented programming requirement.
ii. Based on domain requirements.
iii. Based on requirements i and ii above.
2. Give brief preview of the evolution of programming languages in a chronological order.
3. Vividly distinguish between modular programming paradigm and object oriented programming paradigm.

Answer 1i).

UNSTRUCTURED (Language / Developer / Date)
Assembly Language / — / 1949
FORTRAN / John Backus / 1957
COBOL / CODASYL, ANSI, ISO / 1959
JOSS / Cliff Shaw, RAND / 1963
BASIC / John G. Kemeny, Thomas E. Kurtz / 1964
TELCOMP / BBN / 1965
MUMPS / Neil Pappalardo / 1966
FOCAL / Richard Merrill, DEC / 1968

STRUCTURED (Language / Developer / Date)
ALGOL 58 / Friedrich L. Bauer and co. / 1958
ALGOL 60 / Backus, Bauer and co. / 1960
ABC / CWI / 1980
Ada / United States Department of Defence / 1980
Accent R / NIS / 1980
Action! / Optimized Systems Software / 1983
Alef / Phil Winterbottom / 1992
DASL / Sun Microsystems Laboratories / 1999-2003

MODULAR (Language / Developer / Date)
ALGOL W / Niklaus Wirth, Tony Hoare / 1966
APL / Larry Breed, Dick Lathwell and co. / 1966
ALGOL 68 / A. van Wijngaarden and co. / 1968
AMOS BASIC / François Lionet and Constantin Stiropoulos / 1990
Alice ML / Saarland University / 2000
Agda / Ulf Norell; Catarina Coquand (1.0) / 2007
Arc / Paul Graham, Robert Morris and co. / 2008
Bosque / Mark Marron / 2019

OBJECT-ORIENTED (Language / Developer / Date)
C* / Thinking Machines / 1987
Actor / Charles Duff / 1988
Aldor / Thomas J. Watson Research Center / 1990
Amiga E / Wouter van Oortmerssen / 1993
ActionScript / Macromedia / 1998
BeanShell / JCP / 1999
AngelScript / Andreas Jönsson / 2003
Boo / Rodrigo B.
  • Pioneers of Computing
Pioneers of Computing

In 1980 the IEEE Computer Society established the "Computer Pioneer" Gold Medal (in bronze). The founding pioneers were 32 members of the IEEE Computer Society associated with work in informatics and the computing sciences.

Pioneers of Computing
1. Howard H. Aiken (Harvard Mark I)
2. John V. Atanasoff
3. Charles Babbage (Analytical Engine)
4. John Backus
5. Gordon Bell (Digital)
6. Vannevar Bush
7. Edsger W. Dijkstra
8. John Presper Eckert
9. Douglas C. Engelbart
10. Andrei P. Ershov (theoretical programming)
11. Tommy Flowers (Colossus engineer)
12. Robert W. Floyd
13. Kurt Gödel
14. William R. Hewlett
15. Herman Hollerith
16. Grace M. Hopper
17. Tom Kilburn (Manchester)

Pioneers of Computing (continued)
1. Donald E. Knuth (TeX)
2. Sergei A. Lebedev
3. Augusta Ada Lovelace
4. Aleksey A. Lyapunov
5. Benoit Mandelbrot
6. John W. Mauchly
7. David Packard
8. Blaise Pascal
9. P. Georg and Edvard Scheutz (Difference Engine, Sweden)
10. C. E. Shannon (information theory)
11. George R. Stibitz
12. Alan M. Turing (Colossus and code-breaking)
13. John von Neumann
14. Maurice V. Wilkes (EDSAC)
15. J. H. Wilkinson (numerical analysis)
16. Freddie C. Williams
17. Niklaus Wirth
18. Stephen Wolfram (Mathematica)
19. Konrad Zuse

Pioneers of Computing - 2
Howard H. Aiken (Harvard Mark I) – USA. Creator of the first computer, 1943.
Gene M. Amdahl (IBM 360 computer architecture, including pipelining, instruction look-ahead, and cache memory) – USA (1964). The mainframe ideology: a system for mass data processing.
John W. Backus (Fortran) – the first high-level language, 1956.

Pioneers of Computing - 3
Robert S. Barton. For his outstanding contributions in basing the design of computing systems on the hierarchical nature of programs and their data.
  • Core Magazine, 2007
CORE — A publication of the Computer History Museum // Spring–Summer 2007

Remarkable People. Rescued Treasures: a collection saved by SAP. Extraordinary Images: computers through the lens of Mark Richards. Focus on Robert Noyce.

"The best way to see the future of computing is to browse its past."

Publisher & Editor-in-Chief: Karen M. Tucker. Executive Editor: Leonard J. Shustek. Managing Editor: Robert S. Stetson. Associate Editor: Kirsten Tashev. Technical Editor: Dag Spicer. Editor: Laurie Putnam. Contributors: Leslie Berlin, Chris Garcia, Paula Jabloner, Luanne Johnson, Len Shustek, Dag Spicer, Kirsten Tashev. Design: Kerry Conboy. Production Manager: Robert S. Stetson. Website Manager: Bob Sanguedolce. Website Design: Dana Chrisler, Ton Luong.

The computer. In all of human history, rarely has one invention done so much to change the world in such a short time. The Computer History Museum is home to the world's largest collection of computing artifacts and offers a variety of exhibits, programs, and … computerhistory.org/core © 2007 Computer History Museum.
  • Creativity in Computer Science. in J
Creativity in Computer Science. Daniel Saunders and Paul Thagard, University of Waterloo. Saunders, D., & Thagard, P. (forthcoming). Creativity in computer science. In J. C. Kaufman & J. Baer (Eds.), Creativity across domains: Faces of the muse. Mahwah, NJ: Lawrence Erlbaum Associates. 1. Introduction. Computer science only became established as a field in the 1950s, growing out of theoretical and practical research begun in the previous two decades. The field has exhibited immense creativity, ranging from innovative hardware such as the early mainframes to software breakthroughs such as programming languages and the Internet. Martin Gardner worried that "it would be a sad day if human beings, adjusting to the Computer Revolution, became so intellectually lazy that they lost their power of creative thinking" (Gardner, 1978, p. vi-viii). On the contrary, computers and the theory of computation have provided great opportunities for creative work. This chapter examines several key aspects of creativity in computer science, beginning with the question of how problems arise in computer science. We then discuss the use of analogies in solving key problems in the history of computer science. Our discussion in these sections is based on historical examples, but the following sections discuss the nature of creativity using information from a contemporary source, a set of interviews with practicing computer scientists collected by the Association for Computing Machinery's on-line student magazine, Crossroads. We then provide a general comparison of creativity in computer science and in the natural sciences. 2. Nature and Origins of Problems in Computer Science. Computer science is closely related to both mathematics and engineering.
  • Turing Award elites revisited: patterns of productivity, collaboration, authorship and impact (arXiv:2106.11534v1 [cs.DL], 22 Jun 2021)
Turing Award elites revisited: patterns of productivity, collaboration, authorship and impact. Yinyu Jin1 · Sha Yuan1∗ · Zhou Shao2, 4 · Wendy Hall3 · Jie Tang4. Received: date / Accepted: date. Abstract: The Turing Award is recognized as the most influential and prestigious award in the field of computer science (CS). With the rise of the science of science (SciSci), a large amount of bibliographic data has been analyzed in an attempt to understand the hidden mechanism of scientific evolution. These include the analysis of the Nobel Prize, including physics, chemistry, medicine, etc. In this article, we extract and analyze the data of 72 Turing Award laureates from the complete bibliographic data, fill the gap in the lack of Turing Award analysis, and discover the development characteristics of computer science as an independent discipline. First, we show most Turing Award laureates have long-term and high-quality educational backgrounds, and more than 61% of them have a degree in mathematics, which indicates that mathematics has played a significant role in the development of computer science. Secondly, the data shows that not all scholars have high productivity and high h-index; that is, the number of publications and h-index is not the leading indicator for evaluating the Turing Award. Third, the average age of awardees has increased from 40 to around 70 in recent years. This may be because new breakthroughs take longer, and some new technologies need time to prove their influence. Besides, we have also found that in the past ten years, international collaboration has experienced explosive growth, showing a new paradigm in the form of collaboration.
  • Introduction to the Literature on Programming Language Design Gary T
Computer Science Technical Reports, Computer Science, 7-1999

Introduction to the Literature On Programming Language Design
Gary T. Leavens, Iowa State University

Follow this and additional works at: http://lib.dr.iastate.edu/cs_techreports
Part of the Programming Languages and Compilers Commons

Recommended Citation: Leavens, Gary T., "Introduction to the Literature On Programming Language Design" (1999). Computer Science Technical Reports. 59. http://lib.dr.iastate.edu/cs_techreports/59

This Article is brought to you for free and open access by the Computer Science at Iowa State University Digital Repository. It has been accepted for inclusion in Computer Science Technical Reports by an authorized administrator of Iowa State University Digital Repository. For more information, please contact [email protected].

Abstract: This is an introduction to the literature on programming language design and related topics. It is intended to cite the most important work, and to provide a place for students to start a literature search.

Keywords: programming languages, semantics, type systems, polymorphism, type theory, data abstraction, functional programming, object-oriented programming, logic programming, declarative programming, parallel and distributed programming languages

Disciplines: Programming Languages and Compilers

This article is available at Iowa State University Digital Repository: http://lib.dr.iastate.edu/cs_techreports/59

Introduction to the Literature On Programming Language Design. Gary T. Leavens. TR 93-01c. Jan. 1993, revised Jan. 1994, Feb. 1996, and July 1999. Keywords: programming languages, semantics, type systems, polymorphism, type theory, data abstraction, functional programming, object-oriented programming, logic programming, declarative programming, parallel and distributed programming languages.
  • An Interview with Oliver Selfridge1
    An Interview with Oliver Selfridge1 Oliver Selfridge was born in 1926 in London. He studied Mathematics at MIT under Norbert Wiener and went on to write important early papers on pattern recognition and machine learning. His 1958 paper on the Pandemonium system is regarded as one of the classics of machine intelligence. He has worked at MIT Lincoln Laboratory, BBN and GTE Laboratories where he was a Chief Scientist. He has served on various advisory panels to the White House and numerous national committees. As well as his scientific writings, he has authored several books for children2. This is an edited transcript of an interview conducted on the 8th May 2006. Phil Husbands: Could you start by saying a little about your early years? Were there any particular influences from home or school that put you on the road to a career in science and engineering? Oliver Selfridge: Well, an important part of my education was my father. Without knowing any mathematics himself, he was wildly enthusiastic about my interest in it, which started at quite an early age: seven or eight. As was usual in England back then, I went away to school when I was ten. At the age of thirteen, I entered Malvern College, one of the (so-called) public schools. I remember we spent the year of 1940 in Blenheim Palace because the Royal Radar Establishment (RRE) had taken over the school. While at Malvern I covered calculus to the standard you’d reach after the first two years of a degree at MIT. One of the great things about education back then, and I am not sure that it’s true any more, is that if you were good in one subject they’d move you ahead in that subject.