
THE HISTORY OF FORTRAN I, II, AND III

John Backus
IBM Research Laboratory
San Jose, California

© 1978 Association for Computing Machinery, Inc. ACM SIGPLAN Notices, Vol. 13, No. 8, August 1978.

I. Early background and environment.

1.1 Attitudes about automatic programming in the 1950's.

Before 1954 almost all programming was done in machine language or assembly language. Programmers rightly regarded their work as a complex, creative art that required human inventiveness to produce an efficient program. Much of their effort was devoted to overcoming the difficulties created by the computers of that era: the lack of index registers, the lack of built-in floating point operations, restricted instruction sets (which might have AND but not OR, for example), and primitive input-output arrangements. Given the nature of computers, the services which "automatic programming" performed for the programmer were concerned with overcoming the machine's shortcomings. Thus the primary concern of some "automatic programming" systems was to allow the use of symbolic addresses and decimal numbers (e.g., the MIDAC Input Translation Program [Brown and Carr 1954]).

But most of the larger "automatic programming" systems (with the exception of Laning and Zierler's algebraic system [Laning and Zierler 1954] and the A-2 compiler [Remington Rand 1953; Moser 1954]) simply provided a synthetic "computer" with an order code different from that of the real machine. This synthetic computer usually had floating point instructions and index registers and had improved input-output commands; it was therefore much easier to program than its real counterpart.

The A-2 compiler also came to be a synthetic computer sometime after early 1954. But in early 1954 its input had a much cruder form; instead of "pseudo-instructions" its input was then a complex sequence of "compiling instructions" that could take a variety of forms ranging from machine code itself, to lengthy groups of words constituting rather clumsy calling sequences for the desired floating point subroutine, to "abbreviated form" instructions that were converted by a "Translator" into ordinary "compiling instructions" [Moser 1954].

After May 1954 the A-2 compiler acquired a "pseudocode" which was similar to the order codes for many floating point interpretive systems that were already in operation in 1953: e.g., the Los Alamos systems, DUAL and SHACO [Bouricius 1953; Schlesinger 1953], the MIT "Summer Session Computer" [Adams and Laning 1954], a system for the ILLIAC designed by D. J. Wheeler [Muller 1954], and the SPEEDCODING system for the IBM 701 [Backus 1954].

The Laning and Zierler system was quite a different story: it was the world's first operating algebraic compiler, a rather elegant but simple one. Knuth and Pardo [1977] assign this honor to Alick Glennie's AUTOCODE, but I, for one, am unable to recognize the sample AUTOCODE program they give as "algebraic", especially when it is compared to the corresponding Laning and Zierler program.

All of the early "automatic programming" systems were costly to use, since they slowed the machine down by a factor of five or ten. The most common reason for the slowdown was that these systems were spending most of their time in floating point subroutines. Simulated indexing and other "housekeeping" operations could be done with simple inefficient techniques, since, slow as they were, they took far less time than the floating point work.

Experience with slow "automatic programming" systems, plus their own experience with the problems of organizing loops and address modification, had convinced programmers that efficient programming was something that could not be automated. Another reason that "automatic programming" was not taken seriously by the computing community came from the energetic public relations efforts of some visionaries to spread the word that their "automatic programming" systems had almost human abilities to understand the language and needs of the user; whereas closer inspection of these same systems would often reveal a complex, exception-ridden performer of clerical tasks which was both difficult to use and inefficient. Whatever the reasons, it is difficult to convey to a reader in the late seventies the strength of the skepticism about "automatic programming" in general and about its ability to produce efficient programs in particular, as it existed in 1954.

(In the above discussion of attitudes about "automatic programming" in 1954 I have mentioned only those actual systems of which my colleagues and I were aware at the time. For a comprehensive treatment of early programming systems and languages I recommend the article by Knuth and Pardo [1977] and Sammet [1969].)

1.2 The economics of programming.

Another factor which influenced the development of FORTRAN was the economics of programming in 1954. The cost of programmers associated with a computer center was usually at least as great as the cost of the computer itself. (This fact follows from the average salary-plus-overhead and number of programmers at each center and from the computer rental figures.) In addition, from one quarter to one half of the computer's time was spent in debugging. Thus programming and debugging accounted for as much as three quarters of the cost of operating a computer; and obviously, as computers got cheaper, this situation would get worse.

This economic factor was one of the prime motivations which led me to propose the FORTRAN project in a letter to my boss, Cuthbert Hurd, in late 1953 (the exact date is not known, but other facts suggest December 1953 as a likely date). I believe that the economic need for a system like FORTRAN was one reason why IBM and my successive bosses, Hurd, Charles DeCarlo, and John McPherson, provided for our constantly expanding needs over the next five years without ever asking us to project or justify those needs in a formal budget.

1.3 Programming systems in 1954.

It is difficult for a programmer of today to comprehend what "automatic programming" meant to programmers in 1954. To many it then meant simply providing mnemonic operation codes and symbolic addresses; to others it meant the simple process of obtaining subroutines from a library and inserting the addresses of operands into each subroutine. Most "automatic programming" systems were either assembly programs, or subroutine-fixing programs, or, most popularly, interpretive systems to provide floating point and indexing operations. My friends and I were aware of a number of assembly programs and interpretive systems, some of which have been mentioned above; besides these there were primarily two other systems of significance: the A-2 compiler [Remington Rand 1953; Moser 1954] and the Laning and Zierler [1954] algebraic compiler at MIT. As noted above, the A-2 compiler was at that time largely a subroutine fixer; from the standpoint of its input "programs" it provided fewer conveniences than most of the then current interpretive systems mentioned earlier; it later adopted a "pseudocode" as input which was similar to the input codes of these interpretive systems.

The Laning and Zierler system accepted as input an elegant but rather simple algebraic language. It permitted single-letter variables (identifiers) which could have a single constant or variable subscript. The repertoire of functions one could use were denoted by "F" with an integer superscript to indicate the "catalog number" of the desired function. Algebraic expressions were compiled into closed subroutines and placed on a magnetic drum for subsequent use. The system was originally designed for the Whirlwind computer when it had 1,024 storage cells, with the result that it caused a slowdown in execution speed by a factor of about ten [Adams and Laning 1954].

The effect of the Laning and Zierler system on the development of FORTRAN is a question which has been muddled by many misstatements on my part. For many years I believed that we had gotten the idea for using algebraic notation in FORTRAN from seeing a demonstration of the Laning and Zierler system at MIT. In preparing a paper [Backus 1976] for the International Research Conference on the History of Computing at Los Alamos (June 10-15, 1976), I reviewed the matter with Irving Ziller and obtained a copy of a 1954 letter [Backus 1954a] (which Dr. Laning kindly sent to me). As a result the facts of the matter have become clear. The letter in question is one I sent to Dr. Laning asking for a demonstration of his system. It makes clear that we had learned of his work at the Office of Naval Research Symposium on Automatic Programming for Digital Computers, May 13-14, 1954, and that the demonstration took place on June 2, 1954. The letter also makes clear that the FORTRAN project was well under way when the letter was sent (May 21, 1954) and included Harlan Herrick, Robert A. Nelson, and Irving Ziller as well as myself. Furthermore, an article in the proceedings of that same ONR Symposium by Herrick and myself [Backus and Herrick 1954] shows clearly that we were already considering input expressions like "Σ aij·bjk" and "X÷Y". We went on to raise the question "...can a machine translate a sufficiently rich mathematical language into a sufficiently economical program at a sufficiently low cost to make the whole affair feasible?"

These and other remarks in our paper presented at the Symposium in May 1954 make it clear that we were already considering algebraic input considerably more sophisticated than that of Laning and Zierler's system when we first heard of their pioneering work.
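[Editorial illustration] To make that 1954 question concrete for a present-day reader, the short sketch below shows how a summation such as Σ aij·bjk is expressed once algebraic input and automatic indexing exist. It is written in later fixed-form FORTRAN (FORTRAN 77 style), which of course did not exist in 1954; the program name, array sizes, and values are invented for the example, and it is not the notation of the Backus and Herrick paper.

C     Illustrative sketch only (FORTRAN 77 syntax, long after 1954).
C     Computes C(I,K) as the sum over J of A(I,J)*B(J,K), the kind of
C     expression written algebraically as a summation in 1954.
      PROGRAM SUMEX
      INTEGER I, J, K
      REAL A(2,2), B(2,2), C(2,2)
      DATA A /1.0, 2.0, 3.0, 4.0/
      DATA B /5.0, 6.0, 7.0, 8.0/
      DO 30 I = 1, 2
         DO 20 K = 1, 2
            C(I,K) = 0.0
            DO 10 J = 1, 2
               C(I,K) = C(I,K) + A(I,J)*B(J,K)
   10       CONTINUE
   20    CONTINUE
   30 CONTINUE
      PRINT *, C
      END

The point of the question Backus and Herrick raised was whether a translator could turn this kind of source into machine code whose loop control and indexing were economical enough to compete with hand coding.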