John Backus, IBM Research Laboratory, San Jose, California
Recommended publications
- Historical Perspective and Further Reading
2.21 Historical Perspective and Further Reading. This section surveys the history of instruction set architectures over time, and we give a short history of programming languages and compilers. ISAs include accumulator architectures, general-purpose register architectures, stack architectures, and a brief history of ARMv7 and the x86. We also review the controversial subjects of high-level-language computer architectures and reduced instruction set computer architectures. The history of programming languages includes Fortran, Lisp, Algol, C, Cobol, Pascal, Simula, Smalltalk, C++, and Java, and the history of compilers includes the key milestones and the pioneers who achieved them.

Accumulator Architectures. Hardware was precious in the earliest stored-program computers. Consequently, computer pioneers could not afford the number of registers found in today's architectures. In fact, these architectures had a single register for arithmetic instructions. Since all operations would accumulate in one register, it was called the accumulator, and this style of instruction set is given the same name. (Accumulator: archaic term for register. On-line use of it as a synonym for "register" is a fairly reliable indication that the user has been around quite a while.) For example, EDSAC in 1949 had a single accumulator. The three-operand format of RISC-V suggests that a single register is at least two registers shy of our needs. Having the accumulator as both a source operand and the destination of the operation fills part of the shortfall, but it still leaves us one operand short. That final operand is found in memory.
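The contrast the excerpt draws between a one-operand accumulator machine and RISC-V's three-operand format can be made concrete with a small simulation. The sketch below is illustrative only: the mnemonics, register names, and the two tiny interpreters are hypothetical teaching devices, not EDSAC or RISC-V encodings.

```python
# Illustrative only: hypothetical mnemonics, not real EDSAC or RISC-V encodings.
# Compute f = (g + h) - (i + j) two ways and compare how operands are named.

mem = {"g": 2, "h": 3, "i": 4, "j": 5}   # data lives in memory

# Accumulator style: every instruction implicitly uses the single accumulator,
# so each names at most one memory operand.
acc_program = [
    ("LOAD", "g"), ("ADD", "h"), ("STORE", "t0"),   # t0 = g + h
    ("LOAD", "i"), ("ADD", "j"), ("STORE", "t1"),   # t1 = i + j
    ("LOAD", "t0"), ("SUB", "t1"), ("STORE", "f"),  # f = t0 - t1
]

def run_accumulator(program, mem):
    acc = 0                                  # the one and only register
    for op, addr in program:
        if op == "LOAD":    acc = mem[addr]
        elif op == "ADD":   acc += mem[addr]
        elif op == "SUB":   acc -= mem[addr]
        elif op == "STORE": mem[addr] = acc
    return mem

# Three-operand register style: destination and both sources are named registers.
regs = {"x5": 2, "x6": 3, "x7": 4, "x28": 5}
reg_program = [
    ("add", "x29", "x5", "x6"),    # x29 = g + h
    ("add", "x30", "x7", "x28"),   # x30 = i + j
    ("sub", "x31", "x29", "x30"),  # x31 = f
]

def run_registers(program, regs):
    for op, rd, rs1, rs2 in program:
        a, b = regs[rs1], regs[rs2]
        regs[rd] = a + b if op == "add" else a - b
    return regs

print(run_accumulator(acc_program, dict(mem))["f"])    # -> -4
print(run_registers(reg_program, dict(regs))["x31"])   # -> -4
```

Counting instructions makes the point: the accumulator version needs nine instructions and two memory temporaries, while the three-operand version needs three instructions once the values already sit in registers.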
- John McCarthy
John McCarthy: The Uncommon Logician of Common Sense. Excerpt from Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists by Dennis Shasha and Cathy Lazere, Copernicus Press, August 23, 2004. "If you want the computer to have general intelligence, the outer structure has to be common sense knowledge and reasoning." — John McCarthy. When a five-year-old receives a plastic toy car, she soon pushes it and beeps the horn. She realizes that she shouldn't roll it on the dining room table or bounce it on the floor or land it on her little brother's head. When she returns from school, she expects to find her car in more or less the same place she last put it, because she put it outside her baby brother's reach. The reasoning is so simple that any five-year-old child can understand it, yet most computers can't. Part of the computer's problem has to do with its lack of knowledge about day-to-day social conventions that the five-year-old has learned from her parents, such as don't scratch the furniture and don't injure little brothers. Another part of the problem has to do with a computer's inability to reason as we do daily, a type of reasoning that's foreign to conventional logic and therefore to the thinking of the average computer programmer. Conventional logic uses a form of reasoning known as deduction. Deduction permits us to conclude from statements such as "All unemployed actors are waiters" and "Sebastian is an unemployed actor" the new statement that "Sebastian is a waiter." The main virtue of deduction is that it is "sound" — if the premises hold, then so will the conclusions.
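The deduction the authors describe is mechanical enough to run. The few lines below are a hypothetical forward-chaining sketch that encodes the waiter example directly; they are not a real theorem prover, and they illustrate exactly the limitation the excerpt points at: the program can only derive what its rules and premises license.

```python
# Minimal forward-chaining deduction over the excerpt's example.
# Hypothetical illustration; not a real theorem prover or McCarthy's formalism.

facts = {("unemployed_actor", "Sebastian")}

# Rule: for all x, unemployed_actor(x) -> waiter(x)
rules = [("unemployed_actor", "waiter")]

def deduce(facts, rules):
    derived = set(facts)
    changed = True
    while changed:                      # keep applying rules until nothing new appears
        changed = False
        for premise, conclusion in rules:
            for pred, subj in list(derived):
                if pred == premise and (conclusion, subj) not in derived:
                    derived.add((conclusion, subj))
                    changed = True
    return derived

print(deduce(facts, rules))   # derived facts now include ('waiter', 'Sebastian')
```

Soundness is the property the excerpt names: nothing enters the derived set unless a rule and existing facts justify it, but nothing in the loop supplies the unstated common-sense premises a five-year-old brings to the toy car.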
- The Evolution of IBM Research: Looking Back at 50 Years of Scientific Achievements and Innovations
Chris Sciacca and Christophe Rossel, IBM Research – Zurich, Switzerland. DOI: 10.1051/epn/2014201. By the mid-1950s IBM had established laboratories in New York City and in San Jose, California, with San Jose being the first one apart from headquarters. This provided considerable freedom to the scientists, and with its success IBM executives gained the confidence they needed to look beyond the United States for a third lab. The choice wasn't easy, but Switzerland was eventually selected based on the same blend of talent, skills and academia that IBM uses today — most recently for its decision to open new labs in Ireland, Brazil and Australia. The Computing-Tabulating-Recording Company (C-T-R), the precursor to IBM, was founded on 16 June 1911. It was initially a merger of three manufacturing businesses, which were eventually molded into the $100 billion innovator in technology, science, management and culture known as IBM. With the success of C-T-R after World War I came […] sorting and disseminating information was going to be a big business, requiring investment in research and development. He began hiring the country's top engineers, led by one of the world's most prolific inventors at the time: James Wares Bryce. Bryce was given the task to invent and build the best tabulating, sorting and keypunch machines.
- IBM Research AI Residency Program
IBM Research AI Residency Program The IBM Research™ AI Residency Program is a 13-month program Topics of focus include: that provides opportunity for scientists, engineers, domain experts – Trust in AI (Causal modeling, fairness, explainability, and entrepreneurs to conduct innovative research and development robustness, transparency, AI ethics) on important and emerging topics in Artificial Intelligence (AI). The program aims at creating and investigating novel approaches – Natural Language Processing and Understanding in AI that progress capabilities towards significant technical (Question and answering, reading comprehension, and real-world challenges. AI Residents work closely with IBM language embeddings, dialog, multi-lingual NLP) Research scientists and are expected to fully complete a project – Knowledge and Reasoning (Knowledge/graph embeddings, within the 13-month residency. The results of the project may neuro-symbolic reasoning) include publications in top AI conferences and journals, development of prototypes demonstrating important new AI functionality – AI Automation, Scaling, and Management (Automated data and fielding of working AI systems. science, neural architecture search, AutoML, transfer learning, few-shot/one-shot/meta learning, active learning, AI planning, As part of the selection process, candidates must submit parallel and distributed learning) a 500-word statement of research interest and goals. – AI and Software Engineering (Big code analysis and understanding, software life cycle including modernize, build, debug, test and manage, software synthesis including refactoring and automated programming) – Human-Centered AI (HCI of AI, human-AI collaboration, AI interaction models and modalities, conversational AI, novel AI experiences, visual AI and data visualization) Deadline to apply: January 31, 2021 Earliest start date: June 1, 2021 Duration: 13 months Locations: IBM Thomas J. -
- John McCarthy – Father of Artificial Intelligence
Asia Pacific Mathematics Newsletter. V Rajaraman. Introduction: I first met John McCarthy when he visited IIT, Kanpur, in 1968. During his visit he saw that our computer centre, which I was heading, had two batch-processing second-generation computers — an IBM 7044/1401 and an IBM 1620; both of them were being used for "production jobs". The IBM 1620 was used primarily to teach programming to all students of IIT, and the IBM 7044/1401 was used by research students and faculty besides a large number of guest users from several neighbouring universities and research laboratories. There was no interactive computer available for computer science and electrical engineering students to do hardware and software research. McCarthy was a great believer in the power of time-sharing computers. In fact one of his first important contributions was a memo he wrote in 1957 urging the Director of the MIT Computer Centre to modify the IBM 704 into a time-sharing machine [1]. He later persuaded Digital Equipment Corporation (who made the first mini computers and the PDP series of computers) to design a mini computer with a time-sharing operating system. In this article we summarise the contributions of John McCarthy to Computer Science. Among his contributions are: suggesting that the best method of using computers is in an interactive mode, a mode in which computers become partners of users, enabling them to solve problems. This logically led to the idea of time-sharing of large computers by many users and computing becoming a utility — much like a power utility.
- FPGAs as Components in Heterogeneous HPC Systems: Raising the Abstraction Level of Heterogeneous Programming
Wim Vanderbauwhede, School of Computing Science, University of Glasgow. A trip down memory lane.
80 Years ago: The Theory. 1936: Universal machine (Alan Turing); 1936: Lambda calculus (Alonzo Church); 1936: Stored-program concept (Konrad Zuse); 1937: Church-Turing thesis; 1945: The von Neumann architecture. (Turing, Alan Mathison. "On computable numbers, with an application to the Entscheidungsproblem." J. of Math 58, no. 345-363 (1936): 5. Church, Alonzo. "A set of postulates for the foundation of logic." Annals of Mathematics (1932): 346-366.)
60-40 Years ago: The Foundations. 1957: Fortran, John Backus, IBM; 1958: First IC, Jack Kilby, Texas Instruments (figure: the first working integrated circuit, 1958, © Texas Instruments); 1965: Moore's law; 1971: First microprocessor, Texas Instruments; 1972: C, Dennis Ritchie, Bell Labs; 1977: Fortran-77; 1977: von Neumann bottleneck, John Backus.
30 Years ago: HDLs and FPGAs (figure: Algotronix CAL1024 FPGA, 1989, © Algotronix). 1984: Verilog; 1984: First reprogrammable logic device, Altera; 1985: First FPGA, Xilinx; 1987: VHDL Standard IEEE 1076-1987; 1989: Algotronix CAL1024, the first FPGA to offer random access to its control memory.
20 Years ago: High-level Synthesis. 1996: Handel-C, Oxford University; 2001: Mitrion-C, Mitrionics; 2003: Bluespec, MIT; 2003: MaxJ, Maxeler Technologies; 2003: Impulse-C, Impulse Accelerated. (Page, Ian. "Closing the gap between hardware and software: hardware-software cosynthesis at Oxford." (1996): 2-2.)
- The History of Computer Language Selection
Kevin R. Parker, College of Business, Idaho State University, Pocatello, Idaho, USA, [email protected]; Bill Davey, School of Business Information Technology, RMIT University, Melbourne, Australia, [email protected]. Abstract: This paper examines the history of computer language choice for both industry use and university programming courses. The study considers events in two developed countries and reveals themes that may be common in the language selection history of other developed nations. History shows a set of recurring problems for those involved in choosing languages. This study shows that those involved in the selection process can be informed by history when making those decisions. Keywords: selection of programming languages, pragmatic approach to selection, pedagogical approach to selection. 1. Introduction. The history of computing is often expressed in terms of significant hardware developments. Both the United States and Australia made early contributions in computing. Many trace the dawn of the history of programmable computers to Eckert and Mauchly's departure from the ENIAC project to start the Eckert-Mauchly Computer Corporation. In Australia, the history of programmable computers starts with CSIRAC, the fourth programmable computer in the world, which ran its first test program in 1949. This computer, manufactured by the government science organization (CSIRO), was used into the 1960s as a working machine at the University of Melbourne and still exists as a complete unit at the Museum of Victoria in Melbourne. Australia's early entry into computing makes a comparison with the United States interesting. These early computers needed programmers, that is, people with the expertise to convert a problem into a mathematical representation directly executable by the computer.
- Treatment and Differential Diagnosis Insights for the Physician's Consideration in the Moments That Matter Most
The role of medical imaging in global health systems is literally fundamental. Like labs, medical images are used at one point or another in almost every high cost, high value episode of care. Echocardiograms, CT scans, mammograms, and x-rays, for example, "atlas" the body and help chart a course forward for a patient's care team. Imaging precision has improved as a result of technological advancements and breakthroughs in related medical research. Those advancements also bring with them exponential growth in medical imaging data. The capabilities referenced throughout this document are in the research and development phase and are not available for any use, commercial or non-commercial. Any statements and claims related to the capabilities referenced are aspirational only. There were roughly 800 million multi-slice exams performed in the United States in 2015 alone. Those studies generated approximately 60 billion medical images. At those volumes, each of the roughly 31,000 radiologists in the U.S. would have to view an image every two seconds of every working day for an entire year in order to extract potentially life-saving information from a handful of images hidden in a sea of data (31K radiologists; 800MM exams; 60B medical images). What's worse, medical images remain largely disconnected from the rest of the relevant data (lab results, patient-similar cases, medical research) inside medical records (and beyond them), making it difficult for physicians to place medical imaging in the context of patient histories that may unlock clues to previously unconsidered treatment pathways.
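The "image every two seconds" figure is a back-of-envelope claim that can be checked directly. The working-hours number below is an assumption added here (roughly 2,000 hours per radiologist per year), not a figure from the text, so treat the result as an order-of-magnitude check only.

```python
# Back-of-envelope check of the imaging workload claim in the excerpt.
# Assumption (not from the text): ~2,000 working hours per radiologist per year.

images = 60e9                  # medical images generated in 2015
radiologists = 31_000          # practicing U.S. radiologists
work_seconds = 2_000 * 3_600   # assumed working year, in seconds

per_radiologist = images / radiologists            # ~1.9 million images each
seconds_per_image = work_seconds / per_radiologist

print(f"{per_radiologist:,.0f} images per radiologist")
print(f"{seconds_per_image:.1f} seconds available per image")
# ~1,935,484 images each, leaving on the order of a few seconds per image.
```

Under that assumption each radiologist has only a few seconds per image, the same scale as the claim in the text, which is the point: the volume rules out unaided human review.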
- The Impetus to Creativity in Technology
Alan G. Konheim, Professor Emeritus, Department of Computer Science, University of California, Santa Barbara, California 93106, [email protected] [email protected]. Abstract: We describe the technical developments ensuing from two well-known publications in the 20th century containing significant and seminal results: a paper by Claude Shannon in 1948 and a patent by Horst Feistel in 1971. Near the beginning, Shannon's paper sets the tone with the statement "the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected [sent] at another point." Shannon's Coding Theorem established the relationship between the probability of error and the rate measuring the transmission efficiency. Shannon proved the existence of codes achieving optimal performance, but it required forty-five years to exhibit an actual code achieving it. These Shannon optimal-efficient codes are responsible for a wide range of communication technology we enjoy today, from GPS, to the NASA rovers Spirit and Opportunity on Mars, and lastly to worldwide communication over the Internet. The US Patent #3798539A, filed by the IBM Corporation in 1971, described Horst Feistel's Block Cipher Cryptographic System, a new paradigm for encryption systems. It was largely a departure from the current technology based on shift-register stream encryption for voice and many of the electro-mechanical cipher machines introduced nearly fifty years before. Horst's vision was directed to its application to securing the privacy of computer files. It was invented at a propitious moment in time and implemented by IBM in automated teller machines for the Lloyds Bank Cashpoint System.
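The structural idea behind Feistel's block-cipher patent, building an invertible cipher out of a round function that need not itself be invertible by repeatedly swapping halves, fits in a few lines. The sketch below is a generic textbook Feistel network with a toy round function; it is not the construction claimed in Feistel's patent, not DES, and not secure.

```python
import hashlib

# Generic Feistel network sketch with a toy round function.
# Not Feistel's patented cipher or DES; for illustration only.
# Encryption and decryption share one routine: decryption simply
# applies the round keys in reverse order.

def round_fn(half: bytes, key: bytes) -> bytes:
    """Toy round function; it need not be invertible."""
    return hashlib.sha256(half + key).digest()[: len(half)]

def feistel(block: bytes, keys) -> bytes:
    half = len(block) // 2
    left, right = block[:half], block[half:]
    for k in keys:
        # L_{i+1} = R_i,  R_{i+1} = L_i XOR F(R_i, k)
        left, right = right, bytes(a ^ b for a, b in zip(left, round_fn(right, k)))
    return right + left          # final swap keeps encrypt/decrypt symmetric

keys = [b"k1", b"k2", b"k3", b"k4"]
plain = b"8bytes!!"              # one 64-bit block, as in early block ciphers
cipher = feistel(plain, keys)
assert feistel(cipher, list(reversed(keys))) == plain   # same routine decrypts
print(cipher.hex())
```

The design choice worth noticing is the symmetry: because the round function sits inside an XOR-and-swap skeleton, decryption reuses the encryption routine with the round keys reversed, which is part of what makes the structure convenient to implement in hardware.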
- Details: Von Neumann/Turing, Structure of Von Neumann Machine
168 420 Computer Architecture, Chapter 2: Computer Evolution and Performance.
ENIAC - background: Electronic Numerical Integrator And Computer; Eckert and Mauchly; University of Pennsylvania; trajectory tables for weapons; started 1943; finished 1946 (too late for the war effort); used until 1955.
ENIAC - details: decimal (not binary); 20 accumulators of 10 digits; programmed manually by switches; 18,000 vacuum tubes; 30 tons; 15,000 square feet; 140 kW power consumption; 5,000 additions per second.
von Neumann/Turing: stored-program concept; main memory storing programs and data; ALU operating on binary data; control unit interpreting instructions from memory and executing them; input and output equipment operated by the control unit; Princeton Institute for Advanced Studies (IAS); completed 1952.
Structure of von Neumann machine (figure): Main Memory, Arithmetic and Logic Unit, Program Control Unit, Input/Output Equipment.
IAS - details: 1,000 x 40-bit words (a binary number, or 2 x 20-bit instructions); set of registers (storage in CPU): Memory Buffer Register (MBR), Memory Address Register (MAR), Instruction Register (IR), Instruction Buffer Register (IBR), Program Counter (PC), Accumulator (AC), Multiplier Quotient (MQ) (see the sketch below).
Structure of IAS - detail (figure): Central Processing Unit containing the Arithmetic and Logic Unit (Accumulator, MQ, arithmetic and logic circuits, MBR) and the Program Control Unit (IBR, PC, MAR, IR, control circuits, address lines), connected to Main Memory and Input/Output Equipment carrying instructions and data.
Commercial Computers: 1947 - Eckert-Mauchly Computer Corporation; UNIVAC I (Universal Automatic Computer).
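The slide's main ideas, a single memory holding both instructions and data, 40-bit words packing two 20-bit instructions, and a small register set (PC, IR, IBR, MAR, MBR, AC, MQ), can be sketched as a toy fetch-decode-execute loop. The word layout and opcode numbers below are invented for illustration and are not the real IAS encoding.

```python
# Toy stored-program machine loosely modeled on the IAS description above.
# Invented opcodes and word layout -- not the real IAS instruction encoding.
# Each 40-bit word holds either a number or two 20-bit instructions
# (8-bit opcode + 12-bit address), fetched via PC/MAR/MBR/IR/IBR into AC.

LOAD, ADD, STORE, HALT = 1, 2, 3, 0

def word(op1, addr1, op2=HALT, addr2=0):
    """Pack two 20-bit instructions into one 40-bit word."""
    return ((op1 << 12 | addr1) << 20) | (op2 << 12 | addr2)

# Program at words 0-1, data at words 10-12: the same memory holds both.
memory = [0] * 16
memory[0] = word(LOAD, 10, ADD, 11)    # AC = M[10]; AC += M[11]
memory[1] = word(STORE, 12, HALT, 0)   # M[12] = AC; stop
memory[10], memory[11] = 7, 35

pc, ac = 0, 0
running = True
while running:
    mar = pc                      # address of the next instruction word
    mbr = memory[mar]             # fetch the full 40-bit word
    ir, ibr = mbr >> 20, mbr & ((1 << 20) - 1)   # left instruction, buffered right one
    pc += 1
    for instr in (ir, ibr):       # execute the left instruction, then the buffered one
        op, addr = instr >> 12, instr & 0xFFF
        if op == LOAD:    ac = memory[addr]
        elif op == ADD:   ac += memory[addr]
        elif op == STORE: memory[addr] = ac
        elif op == HALT:  running = False; break

print(memory[12])   # -> 42
```

The point of the sketch is the stored-program concept itself: the loop cannot tell from a memory word alone whether it holds code or data; only the program counter decides what gets interpreted as an instruction.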
- The Largest Project in IT History (New England Db2 Users Group Meeting, Old Sturbridge Village, 1 Old Sturbridge Village Road, Sturbridge, MA 01566, USA)
The Largest Project in IT History New England Db2 Users Group Meeting Old Sturbridge Village, 1 Old Sturbridge Village Road, Sturbridge, MA 01566, USA Milan Babiak Client Technical Professional + Mainframe Evangelist IBM Canada [email protected] Thursday September 26, AD2019 © 2019 IBM Corporation The Goal 2 © 2019 IBM Corporation AGENDA 1. IBM mainframe history overview 1952-2019 2. The Largest Project in IT History: IBM System/360 3. Key Design Principles 4. Key Operational Principles 5. Dedication 3 © 2019 IBM Corporation The Mainframe Family tree – 1952 to 1964 • Several mainframe families announced, designed for different applications • Every family had a different, incompatible architecture • Within families, moving from one generation to the next was a migration Common compilers made migration easier: FORTRAN (1957) 62 nd anniversary in 2019 COBOL (1959) 60 th anniversary in 2019 4 © 2019 IBM Corporation The Mainframe Family tree – 1952 to 1964 1st generation IBM 701 – 1952 IBM 305 RAMAC – 1956 2nd generation IBM 1401 – 1959 IBM 1440 – 1962 IBM 7094 – 1962 5 © 2019 IBM Corporation The April 1964 Revolution - System 360 3rd generation • IBM decided to implement a wholly new architecture • Specifically designed for data processing • IBM invested $5B to develop System 360 announced on April 7, 1964 IBM Revenue in 1964: $3.23B 6 © 2019 IBM Corporation Data Processing Software in Historical Perspective System 360 1964 IMS 1968 VSAM 1970s ADABAS by Software AG 1971 Oracle database by Oracle Corporation 1977 SMF/RMF system data 1980s -
- CSC 272 - Software II: Principles of Programming Languages
CSC 272 - Software II: Principles of Programming Languages Lecture 1 - An Introduction What is a Programming Language? • A programming language is a notational system for describing computation in machine-readable and human-readable form. • Most of these forms are high-level languages , which is the subject of the course. • Assembly languages and other languages that are designed to more closely resemble the computer’s instruction set than anything that is human- readable are low-level languages . Why Study Programming Languages? • In 1969, Sammet listed 120 programming languages in common use – now there are many more! • Most programmers never use more than a few. – Some limit their career’s to just one or two. • The gain is in learning about their underlying design concepts and how this affects their implementation. The Six Primary Reasons • Increased ability to express ideas • Improved background for choosing appropriate languages • Increased ability to learn new languages • Better understanding of significance of implementation • Better use of languages that are already known • Overall advancement of computing Reason #1 - Increased ability to express ideas • The depth at which people can think is heavily influenced by the expressive power of their language. • It is difficult for people to conceptualize structures that they cannot describe, verbally or in writing. Expressing Ideas as Algorithms • This includes a programmer’s to develop effective algorithms • Many languages provide features that can waste computer time or lead programmers to logic errors if used improperly – E. g., recursion in Pascal, C, etc. – E. g., GoTos in FORTRAN, etc. Reason #2 - Improved background for choosing appropriate languages • Many professional programmers have a limited formal education in computer science, limited to a small number of programming languages.