Programming in America in the 1950s - Some Personal Impressions
Total pages: 16. File type: PDF, size: 1020 KB.
Recommended publications
Advancing Automatic Programming: Regeneration, Meta-Circularity, and Two-Sided Interfaces
SOFTNET 2019. Advancing Automatic Programming: Regeneration, Meta-circularity, and Two-Sided Interfaces. Herwig Mannaert, November 27, 2019.

Overview
• Automatic Programming: Concept and Trends; Remaining Challenges
• Toward Automatic Regeneration
• Creating Meta-Circular Regeneration
• On Meta-Programming Interfaces
• Concluding Facts and Thoughts
• Questions and Discussion

Automatic Programming
• Automatic programming: the act of automatically generating source code from a model or template
• Sometimes distinguished from code generation, as performed by a compiler
• Has always been a euphemism for programming in a higher-level language than was then available to the programmer [David Parnas]
• Also referred to as generative or meta-programming: to manufacture software components in an automated way (a sketch follows this excerpt)
• It is as old as programming itself:
  System.out.println("Hello world.");
  System.out.println("System.out.println(\"Hello world.\");");

The Need for Automatic Programming
• The goal is and has always been to improve programmer productivity
• In general, manufacturing software components in an automated way, in the same way as automation in the industrial revolution, would:
  • Increase programming productivity
  • Consolidate programming knowledge
  • Eliminate human programming errors
• Such …
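As a minimal sketch of the kind of automatic programming the slides describe (not taken from the talk), the following Java program treats a tiny record description as a "model" and emits the source text of a Java class from it; the type and field names are hypothetical, chosen only for illustration.

// Minimal sketch of template-based code generation: a tiny "model"
// (a type name plus fields) is expanded into Java source text.
import java.util.LinkedHashMap;
import java.util.Map;

public class TinyGenerator {

    // Generate the source of a class with private fields and getters.
    static String generate(String typeName, Map<String, String> fields) {
        StringBuilder src = new StringBuilder();
        src.append("public class ").append(typeName).append(" {\n");
        for (Map.Entry<String, String> f : fields.entrySet()) {
            src.append("    private ").append(f.getValue())
               .append(" ").append(f.getKey()).append(";\n");
        }
        for (Map.Entry<String, String> f : fields.entrySet()) {
            String name = f.getKey();
            String cap = Character.toUpperCase(name.charAt(0)) + name.substring(1);
            src.append("    public ").append(f.getValue()).append(" get").append(cap)
               .append("() { return ").append(name).append("; }\n");
        }
        src.append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("id", "long");
        fields.put("name", "String");
        // The generator writes source code, just as the println example above
        // prints the statement that printed it.
        System.out.println(generate("Customer", fields));
    }
}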
Turing's Influence on Programming — Book Extract from “The Dawn of Software Engineering: from Turing to Dijkstra”
Turing's Influence on Programming | Book extract from "The Dawn of Software Engineering: from Turing to Dijkstra". Edgar G. Daylight, Eindhoven University of Technology, The Netherlands. [email protected]

Abstract: Turing's involvement with computer building was popularized in the 1970s and later. Most notable are the books by Brian Randell (1973), Andrew Hodges (1983), and Martin Davis (2000). A central question is whether John von Neumann was influenced by Turing's 1936 paper when he helped build the EDVAC machine, even though he never cited Turing's work. This question remains unsettled up till this day. As remarked by Charles Petzold, one standard history barely mentions Turing, while the other, written by a logician, makes Turing a key player. Contrast these observations then with the fact that Turing's 1936 paper was cited and heavily discussed in 1959 among computer programmers. In 1966, the first Turing Award was given to a programmer, not a computer builder, as were several subsequent Turing Awards. A historical investigation of Turing's influence on computing, presented here, shows that Turing's 1936 notion of universality became increasingly relevant among programmers during the 1950s. The central thesis of this paper states that Turing's influence was felt more in programming after his death than in computer building during the 1940s.

1 Introduction
Many people today are led to believe that Turing is the father of the computer, the father of our digital society, as also the following praise for Martin Davis's bestseller The Universal Computer: The Road from Leibniz to Turing suggests: "At last, a book about the origin of the computer that goes to the heart of the story: the human struggle for logic and truth." …
Historical Perspective and Further Reading
2.21 Historical Perspective and Further Reading

This section surveys the history of instruction set architectures over time, and we give a short history of programming languages and compilers. ISAs include accumulator architectures, general-purpose register architectures, stack architectures, and a brief history of ARMv7 and the x86. We also review the controversial subjects of high-level-language computer architectures and reduced instruction set computer architectures. The history of programming languages includes Fortran, Lisp, Algol, C, Cobol, Pascal, Simula, Smalltalk, C++, and Java, and the history of compilers includes the key milestones and the pioneers who achieved them.

Accumulator Architectures
Hardware was precious in the earliest stored-program computers. Consequently, computer pioneers could not afford the number of registers found in today's architectures. In fact, these architectures had a single register for arithmetic instructions. Since all operations would accumulate in one register, it was called the accumulator, and this style of instruction set is given the same name. For example, EDSAC in 1949 had a single accumulator. (Margin note: accumulator, an archaic term for register; on-line use of it as a synonym for "register" is a fairly reliable indication that the user has been around quite a while.) The three-operand format of RISC-V suggests that a single register is at least two registers shy of our needs. Having the accumulator as both a source operand and the destination of the operation fills part of the shortfall, but it still leaves us one operand short. That final operand is found in memory.
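To make the single-register trade-off concrete, here is a minimal sketch (not from the book) of a hypothetical accumulator-style machine in Java: every arithmetic instruction names only one memory operand, because the accumulator is implicitly both a source and the destination.

// A minimal accumulator-machine sketch: one register, one memory operand
// per instruction; the accumulator is both source and destination.
public class AccumulatorMachine {
    private long acc;                       // the single accumulator register
    private final long[] memory = new long[16];

    void load(int addr)  { acc = memory[addr]; }        // acc <- M[addr]
    void add(int addr)   { acc = acc + memory[addr]; }  // acc <- acc + M[addr]
    void store(int addr) { memory[addr] = acc; }        // M[addr] <- acc

    public static void main(String[] args) {
        AccumulatorMachine m = new AccumulatorMachine();
        m.memory[0] = 3;
        m.memory[1] = 4;
        // Computing M[2] = M[0] + M[1] takes three one-operand instructions,
        // where a three-operand ISA like RISC-V would use a single add.
        m.load(0);
        m.add(1);
        m.store(2);
        System.out.println(m.memory[2]);    // prints 7
    }
}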
John McCarthy
JOHN MCCARTHY: the uncommon logician of common sense
Excerpt from Out of their Minds: The Lives and Discoveries of 15 Great Computer Scientists by Dennis Shasha and Cathy Lazere, Copernicus Press. August 23, 2004

"If you want the computer to have general intelligence, the outer structure has to be common sense knowledge and reasoning." — John McCarthy

When a five-year-old receives a plastic toy car, she soon pushes it and beeps the horn. She realizes that she shouldn't roll it on the dining room table or bounce it on the floor or land it on her little brother's head. When she returns from school, she expects to find her car in more or less the same place she last put it, because she put it outside her baby brother's reach. The reasoning is so simple that any five-year-old child can understand it, yet most computers can't.

Part of the computer's problem has to do with its lack of knowledge about day-to-day social conventions that the five-year-old has learned from her parents, such as don't scratch the furniture and don't injure little brothers. Another part of the problem has to do with a computer's inability to reason as we do daily, a type of reasoning that's foreign to conventional logic and therefore to the thinking of the average computer programmer.

Conventional logic uses a form of reasoning known as deduction. Deduction permits us to conclude, from statements such as "All unemployed actors are waiters" and "Sebastian is an unemployed actor", the new statement that "Sebastian is a waiter". The main virtue of deduction is that it is "sound": if the premises hold, then so will the conclusions.
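As a toy illustration of the deduction the excerpt describes (mine, not from the book), the sketch below encodes the two premises as data and derives the conclusion mechanically; "unemployed actors" and "waiters" are simply named sets here.

// Toy deduction: a universally quantified rule plus a fact yields a new fact.
import java.util.HashSet;
import java.util.Set;

public class ToyDeduction {
    public static void main(String[] args) {
        // Premise 1 (rule): every unemployed actor is a waiter.
        // Premise 2 (fact): Sebastian is an unemployed actor.
        Set<String> unemployedActors = new HashSet<>();
        unemployedActors.add("Sebastian");

        Set<String> waiters = new HashSet<>();
        // Apply the rule: for every unemployed actor, conclude "is a waiter".
        for (String person : unemployedActors) {
            waiters.add(person);
        }

        // Sound conclusion: if the premises hold, so does this.
        System.out.println(waiters.contains("Sebastian")
                ? "Sebastian is a waiter"
                : "Cannot conclude");
    }
}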
Model Based Software Development: Issues & Challenges
Model Based Software Development: Issues & Challenges
N Md Jubair Basha (1), Salman Abdul Moiz (2) & Mohammed Rizwanullah (3)
(1, 3) IT Department, Muffakham Jah College of Engineering & Technology, Hyderabad, India
(2) IT Department, MVSR Engineering College, Hyderabad, India
E-mail: [email protected] (1), [email protected] (2), [email protected] (3)

Abstract - One of the goals of software design is to model a system in such a way that it is easily understandable. Nowadays the tendency for software development is changing from manual coding to automatic code generation; it is becoming model-based. This is a response to the software crisis, in which the cost of hardware has decreased and conversely the cost of software development has increased sharply. The methodologies that allowed this change are model-based, thus relieving the human from detailed coding. There is still a long way to go to achieve this goal, but work is being done worldwide to achieve this objective. This paper presents the drastic changes related to modeling and the important challenging issues and techniques that recur in MBSD.

Keywords - model based software; domain engineering; domain specificity; transformations.

I. INTRODUCTION
A model is an abstraction of some aspect of a system. Model-based software and system design is based on the end-to-end use of formal, composable and manipulable models in the product life-cycle. […] domain engineering and application. The system described by a model may or may not exist at the time the model is created. Models are created to serve particular purposes, for example, to present a human-understandable description of some aspect of a system …
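The introduction's claim that model-based design rests on formal, manipulable models can be made concrete with a small sketch (my illustration, not from the paper): one aspect of a system, here a door's behaviour, is captured as a state-machine model that a generic engine executes directly, so changing the model changes the system without hand-written control code. The states, events, and class name are all hypothetical.

// A behaviour "model" held as data and executed by a generic engine.
import java.util.HashMap;
import java.util.Map;

public class ModelDrivenDoor {
    // The model: (state, event) -> next state. This table is the abstraction
    // of the system; no door-specific control code is written by hand.
    static final Map<String, String> MODEL = new HashMap<>();
    static {
        MODEL.put("closed|open",   "opened");
        MODEL.put("opened|close",  "closed");
        MODEL.put("closed|lock",   "locked");
        MODEL.put("locked|unlock", "closed");
    }

    // The engine just interprets the model; unknown events leave the state unchanged.
    static String step(String state, String event) {
        return MODEL.getOrDefault(state + "|" + event, state);
    }

    public static void main(String[] args) {
        String state = "closed";
        for (String event : new String[] {"open", "close", "lock", "open", "unlock"}) {
            state = step(state, event);
            System.out.println("after '" + event + "': " + state);
        }
    }
}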
John McCarthy – Father of Artificial Intelligence
Asia Pacific Mathematics Newsletter
John McCarthy – Father of Artificial Intelligence
V Rajaraman

In this article we summarise the contributions of John McCarthy to Computer Science. Among his contributions are: suggesting that the best method of using computers is in an interactive mode, a mode in which computers become partners of users enabling them to solve problems. This logically led to the idea of time-sharing of large computers by many users and computing becoming a utility — much like a power utility. [Photo: John McCarthy]

Introduction
I first met John McCarthy when he visited IIT, Kanpur, in 1968. During his visit he saw that our computer centre, which I was heading, had two batch-processing second generation computers — an IBM 7044/1401 and an IBM 1620; both of them were being used for "production jobs". The IBM 1620 was used primarily to teach programming to all students of IIT, and the IBM 7044/1401 was used by research students and faculty besides a large number of guest users from several neighbouring universities and research laboratories. There was no interactive computer available for computer science and electrical engineering students to do hardware and software research. McCarthy was a great believer in the power of time-sharing computers. In fact, one of his first important contributions was a memo he wrote in 1957 urging the Director of the MIT Computer Centre to modify the IBM 704 into a time-sharing machine [1]. He later persuaded Digital Equipment Corporation (who made the first minicomputers and the PDP series of computers) to design a minicomputer with a time-sharing operating system.
FPGAs as Components in Heterogeneous HPC Systems: Raising the Abstraction Level of Heterogeneous Programming
FPGAs as Components in Heterogeneous HPC Systems: Raising the Abstraction Level of Heterogeneous Programming
Wim Vanderbauwhede, School of Computing Science, University of Glasgow

A trip down memory lane

80 years ago: The Theory
Turing, Alan Mathison. "On computable numbers, with an application to the Entscheidungsproblem." J. of Math 58, no. 345-363 (1936): 5.
• 1936: Universal machine (Alan Turing)
• 1936: Lambda calculus (Alonzo Church)
• 1936: Stored-program concept (Konrad Zuse)
• 1937: Church-Turing thesis
• 1945: The von Neumann architecture
Church, Alonzo. "A set of postulates for the foundation of logic." Annals of Mathematics (1932): 346-366.

60-40 years ago: The Foundations
[Image: The first working integrated circuit, 1958. © Texas Instruments.]
• 1957: Fortran, John Backus, IBM
• 1958: First IC, Jack Kilby, Texas Instruments
• 1965: Moore's law
• 1971: First microprocessor, Texas Instruments
• 1972: C, Dennis Ritchie, Bell Labs
• 1977: Fortran-77
• 1977: von Neumann bottleneck, John Backus

30 years ago: HDLs and FPGAs
[Image: Algotronix CAL1024 FPGA, 1989. © Algotronix]
• 1984: Verilog
• 1984: First reprogrammable logic device, Altera
• 1985: First FPGA, Xilinx
• 1987: VHDL Standard IEEE 1076-1987
• 1989: Algotronix CAL1024, the first FPGA to offer random access to its control memory

20 years ago: High-level Synthesis
Page, Ian. "Closing the gap between hardware and software: hardware-software cosynthesis at Oxford." (1996): 2-2.
• 1996: Handel-C, Oxford University
• 2001: Mitrion-C, Mitrionics
• 2003: Bluespec, MIT
• 2003: MaxJ, Maxeler Technologies
• 2003: Impulse-C, Impulse Accelerated …
The History of Computer Language Selection
The History of Computer Language Selection
Kevin R. Parker, College of Business, Idaho State University, Pocatello, Idaho, USA. [email protected]
Bill Davey, School of Business Information Technology, RMIT University, Melbourne, Australia. [email protected]

Abstract: This study examines the history of computer language choice for both industry use and university programming courses. The study considers events in two developed countries and reveals themes that may be common in the language selection history of other developed nations. History shows a set of recurring problems for those involved in choosing languages. This study shows that those involved in the selection process can be informed by history when making those decisions.

Keywords: selection of programming languages, pragmatic approach to selection, pedagogical approach to selection.

1. Introduction
The history of computing is often expressed in terms of significant hardware developments. Both the United States and Australia made early contributions in computing. Many trace the dawn of the history of programmable computers to Eckert and Mauchly's departure from the ENIAC project to start the Eckert-Mauchly Computer Corporation. In Australia, the history of programmable computers starts with CSIRAC, the fourth programmable computer in the world, which ran its first test program in 1949. This computer, manufactured by the government science organization (CSIRO), was used into the 1960s as a working machine at the University of Melbourne and still exists as a complete unit at the Museum of Victoria in Melbourne. Australia's early entry into computing makes a comparison with the United States interesting. These early computers needed programmers, that is, people with the expertise to convert a problem into a mathematical representation directly executable by the computer. …
The Computational Attitude in Music Theory
The Computational Attitude in Music Theory
Eamonn Bell
Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate School of Arts and Sciences, Columbia University, 2019. © 2019 Eamonn Bell. All rights reserved.

ABSTRACT
Music studies's turn to computation during the twentieth century has engendered particular habits of thought about music, habits that remain in operation long after the music scholar has stepped away from the computer. The computational attitude is a way of thinking about music that is learned at the computer but can be applied away from it. It may be manifest in actual computer use, or in invocations of computationalism, a theory of mind whose influence on twentieth-century music theory is palpable. It may also be manifest in more informal discussions about music, which make liberal use of computational metaphors. In Chapter 1, I describe this attitude, the stakes for considering the computer as one of its instruments, and the kinds of historical sources and methodologies we might draw on to chart its ascendance. The remainder of this dissertation considers distinct and varied cases from the mid-twentieth century in which computers or computationalist musical ideas were used to pursue new musical objects, to quantify and classify musical scores as data, and to instantiate a generally music-structuralist mode of analysis. I present an account of the decades-long effort to prepare an exhaustive and accurate catalog of the all-interval twelve-tone series (Chapter 2). This problem was first posed in the 1920s but was not solved until 1959, when the composer Hanns Jelinek collaborated with the computer engineer Heinz Zemanek to jointly develop and run a computer program. …
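The catalogue Jelinek and Zemanek computed can be described, in modern terms, as a backtracking search. The sketch below is my reconstruction of the underlying combinatorial problem, not their 1959 program: it enumerates twelve-tone rows whose eleven successive intervals (mod 12) are all distinct, fixing the first note so the count is per starting pitch.

// Backtracking enumeration of all-interval twelve-tone rows.
public class AllIntervalRows {
    static int count = 0;

    public static void main(String[] args) {
        int[] row = new int[12];
        boolean[] pitchUsed = new boolean[12];
        boolean[] intervalUsed = new boolean[12];   // indices 1..11 are used
        row[0] = 0;                                 // fix the first note to C
        pitchUsed[0] = true;
        extend(row, 1, pitchUsed, intervalUsed);
        System.out.println("All-interval rows starting on C: " + count);
    }

    static void extend(int[] row, int pos, boolean[] pitchUsed, boolean[] intervalUsed) {
        if (pos == 12) { count++; return; }
        for (int pc = 1; pc < 12; pc++) {
            if (pitchUsed[pc]) continue;
            int interval = ((pc - row[pos - 1]) % 12 + 12) % 12;  // in 1..11
            if (intervalUsed[interval]) continue;
            row[pos] = pc;
            pitchUsed[pc] = true;
            intervalUsed[interval] = true;
            extend(row, pos + 1, pitchUsed, intervalUsed);
            pitchUsed[pc] = false;                  // undo and try the next pitch
            intervalUsed[interval] = false;
        }
    }
}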
CSC 272 - Software II: Principles of Programming Languages
CSC 272 - Software II: Principles of Programming Languages
Lecture 1 - An Introduction

What is a Programming Language?
• A programming language is a notational system for describing computation in machine-readable and human-readable form.
• Most of these forms are high-level languages, which are the subject of the course.
• Assembly languages and other languages that are designed to more closely resemble the computer's instruction set than anything that is human-readable are low-level languages.

Why Study Programming Languages?
• In 1969, Sammet listed 120 programming languages in common use – now there are many more!
• Most programmers never use more than a few.
  – Some limit their careers to just one or two.
• The gain is in learning about their underlying design concepts and how this affects their implementation.

The Six Primary Reasons
• Increased ability to express ideas
• Improved background for choosing appropriate languages
• Increased ability to learn new languages
• Better understanding of the significance of implementation
• Better use of languages that are already known
• Overall advancement of computing

Reason #1 - Increased ability to express ideas
• The depth at which people can think is heavily influenced by the expressive power of their language.
• It is difficult for people to conceptualize structures that they cannot describe, verbally or in writing.

Expressing Ideas as Algorithms
• This includes a programmer's ability to develop effective algorithms.
• Many languages provide features that can waste computer time or lead programmers to logic errors if used improperly (a sketch follows this excerpt).
  – E.g., recursion in Pascal, C, etc.
  – E.g., GoTos in FORTRAN, etc.

Reason #2 - Improved background for choosing appropriate languages
• Many professional programmers have a limited formal education in computer science, limited to a small number of programming languages.
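As a minimal sketch of the "wasted computer time" bullet above (not from the lecture notes), the following Java program compares a naively recursive Fibonacci, which recomputes overlapping subproblems exponentially often, with a simple loop that computes the same value in linear time.

// Recursion used improperly versus a straightforward loop.
public class RecursionCost {

    // Exponential time: fib(n) calls fib(n-1) and fib(n-2), whose work overlaps.
    static long fibRecursive(int n) {
        return (n < 2) ? n : fibRecursive(n - 1) + fibRecursive(n - 2);
    }

    // Linear time: the same result with one pass and two variables.
    static long fibIterative(int n) {
        long prev = 0, curr = 1;
        for (int i = 0; i < n; i++) {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return prev;
    }

    public static void main(String[] args) {
        int n = 40;
        long start = System.nanoTime();
        long a = fibRecursive(n);
        long recursiveMs = (System.nanoTime() - start) / 1_000_000;

        start = System.nanoTime();
        long b = fibIterative(n);
        long iterativeMs = (System.nanoTime() - start) / 1_000_000;

        System.out.printf("fib(%d) = %d = %d; recursive %d ms, iterative %d ms%n",
                n, a, b, recursiveMs, iterativeMs);
    }
}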
E.W. Dijkstra Archive: on the Cruelty of Really Teaching Computing Science
On the cruelty of really teaching computing science
Edsger W. Dijkstra (EWD1036)
http://www.cs.utexas.edu/users/EWD/ewd10xx/EWD1036.PDF

The second part of this talk pursues some of the scientific and educational consequences of the assumption that computers represent a radical novelty. In order to give this assumption clear contents, we have to be much more precise as to what we mean in this context by the adjective "radical". We shall do so in the first part of this talk, in which we shall furthermore supply evidence in support of our assumption.

The usual way in which we plan today for tomorrow is in yesterday's vocabulary. We do so, because we try to get away with the concepts we are familiar with and that have acquired their meanings in our past experience. Of course, the words and the concepts don't quite fit because our future differs from our past, but then we stretch them a little bit. Linguists are quite familiar with the phenomenon that the meanings of words evolve over time, but also know that this is a slow and gradual process. It is the most common way of trying to cope with novelty: by means of metaphors and analogies we try to link the new to the old, the novel to the familiar. Under sufficiently slow and gradual change, it works reasonably well; in the case of a sharp discontinuity, however, the method breaks down: though we may glorify it with the name "common sense", our past experience is no longer relevant, the analogies become too shallow, and the metaphors become more misleading than illuminating. …
Evolution of the Major Programming Languages
COS 301 Programming Languages
Evolution of the Major Programming Languages
UMaine School of Computing and Information Science, COS 301 - 2018

Topics
• Zuse's Plankalkül
• Minimal Hardware Programming: Pseudocodes
• The IBM 704 and Fortran
• Functional Programming: LISP
• ALGOL 60
• COBOL
• BASIC
• PL/I
• APL and SNOBOL
• SIMULA 67
• Orthogonal Design: ALGOL 68

Topics (continued)
• Some Early Descendants of the ALGOLs
• Prolog
• Ada
• Object-Oriented Programming: Smalltalk
• Combining Imperative and Object-Oriented Features: C++
• Imperative-Based Object-Oriented Language: Java
• Scripting Languages
• A C-Based Language for the New Millennium: C#
• Markup/Programming Hybrid Languages

Genealogy of Common Languages
Alternate View

Zuse's Plankalkül
• Designed in 1945
• For computers based on electromechanical relays
• Not published until 1972, implemented in 2000 [Rojas et al.]
• Advanced data structures:
  – Two's complement integers, floating point with hidden bit, arrays, records
  – Basic data type: arrays, tuples of arrays
• Included algorithms for playing chess
• Odd: 2D language
• Functions, but no recursion
• Loops ("while") and guarded conditionals [Dijkstra, 1975]

Plankalkül Syntax
• 3 lines for a statement:
  – Operation
  – Subscripts
  – Types
• An assignment …