A Large Routine
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
-
May 21st: Noughts and Crosses on the EDSAC, the IBM 726, and Börje Langefors
… it played tic-tac-toe (known as Noughts and Crosses in the UK). At the time, the EDSAC used three small cathode ray tube screens to display the state of its memory. Each one could draw a grid of 35 x 16 dots. Douglas re-purposed one of them for his game, and obtained input (i.e. where to place a nought or cross) via EDSAC's rotary controller. [Image: EDSAC CRT Tubes. Computer Lab, Univ. of Cambridge. CC BY]

IBM 726, announced May 21, 1952. The IBM 726 was the company's first magnetic tape unit, intended for use with the recently announced IBM 701 [April 7], the company's first electronic computer. The 726 utilized half-inch tapes with seven tracks. Six were for the data and the seventh was employed as a parity track. Some tapes were 1,200 feet long, could store 2.3 MB of data, and IBM claimed that just one could replace 12,500 punch cards. The drive could write 100 characters per inch on a tape and read 75 inches per second. To withstand the system's fast …

Börje Langefors, born May 21, 1915, Ystad, Sweden; died Dec. 13, 2009. Langefors developed the 'infological equation' in 1980, which describes the difference between information and data in terms of additional semantic background and a communication time interval. Langefors joined SAAB, the Swedish aerospace and defense company, in 1949, where he utilized analog devices for calculating wing stresses. The need for more powerful tools became evident, and the only Swedish computer of the time, the BESK [April 1], was insufficient for the task.
-
An Early Program Proof by Alan Turing
An Early Program Proof by Alan Turing
F. L. MORRIS AND C. B. JONES

The paper reproduces, with typographical corrections and comments, a 1949 paper by Alan Turing that foreshadows much subsequent work in program proving.

Categories and Subject Descriptors: D.2.4 [Software Engineering]—correctness proofs; F.3.1 [Logics and Meanings of Programs]—assertions; K.2 [History of Computing]—software
General Terms: Verification
Additional Key Words and Phrases: A. M. Turing

Introduction. The standard references for work on program proofs attribute the early statement of direction to John McCarthy (e.g., McCarthy 1963); the first workable methods to Peter Naur (1966) and Robert Floyd (1967); and the provision of more formal systems to C. A. R. Hoare (1969) and Edsger Dijkstra (1976). The early papers of some of the computing pioneers, however, show an awareness of the need for proofs of program correctness and even present workable methods (e.g., Goldstine and von Neumann 1947; Turing 1949). The 1949 paper by Alan M. …

… b) have been omitted in the commentary, and ten other identifiers are written incorrectly. It would appear to be worth correcting these errors and commenting on the proof from the viewpoint of subsequent work on program proofs. Turing delivered this paper in June 1949, at the inaugural conference of the EDSAC, the computer at Cambridge University built under the direction of Maurice V. Wilkes. Turing had been writing programs for an electronic computer since the end of 1945—at first for the proposed ACE, the computer project at the …
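The routine Turing examined in that 1949 paper computes a factorial using repeated addition rather than multiplication. As a rough illustration only (modern Python rather than Turing's notation, with variable names chosen here and not taken from the paper), the comments below mark the assertions a checker would attach to each point of the routine:

```python
import math

def factorial_by_addition(n):
    r = 1                        # assertion: r = 0!
    for k in range(1, n + 1):    # assertion on entry: r = (k - 1)!
        s = 0
        for _ in range(k):       # multiply r by k using repeated addition
            s += r
        r = s                    # assertion: r = k!
    return r                     # final assertion: r = n!

# Quick sanity check of the sketch against the library factorial.
assert factorial_by_addition(6) == math.factorial(6)
```

-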
The 1-Bit Instrument: The Fundamentals of 1-Bit Synthesis
BLAKE TROISE
The 1-Bit Instrument: The Fundamentals of 1-Bit Synthesis, Their Implementational Implications, and Instrumental Possibilities

ABSTRACT: The 1-bit sonic environment (perhaps most famously musically employed on the ZX Spectrum) is defined by extreme limitation. Yet, belying these restrictions, there is a surprisingly expressive instrumental versatility. This article explores the theory behind the primary, idiosyncratically 1-bit techniques available to the composer-programmer, those that are essential when designing “instruments” in 1-bit environments. These techniques include pulse width modulation for timbral manipulation and means of generating virtual polyphony in software, such as the pin pulse and pulse interleaving techniques. These methodologies are considered in respect to their compositional implications and instrumental applications.

KEYWORDS: chiptune, 1-bit, one-bit, ZX Spectrum, pulse pin method, pulse interleaving, timbre, polyphony, history

INTRODUCTION: As unquestionably evident from the chipmusic scene, it is an understatement to say that there is a lot one can do with simple square waves. One-bit music, generally considered a subdivision of chipmusic, takes this one step further: it is the music of a single square wave. The only operation possible in a 1-bit environment is the variation of amplitude over time, where amplitude is quantized to two states: high or low, on or off. As such, it may seem intuitively impossible to achieve traditionally simple musical operations such as polyphony and dynamic control within a 1-bit environment. Despite these restrictions, the unique techniques and auditory tricks of contemporary 1-bit practice exploit the limits of human perception.
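A minimal sketch of the basic 1-bit situation described above (an illustration assumed here, not code from the article): the only control is when the single output flips between its two states, so pitch comes from the flip rate and timbre from the pulse width. The 44.1 kHz rate, file name, and function names are arbitrary choices for the example.

```python
import wave

SAMPLE_RATE = 44100  # assumed output rate for this sketch

def one_bit_pulse(freq_hz, duty, seconds):
    """Return 1-bit samples (0 or 1) for a pulse wave at freq_hz with the
    given duty cycle (0.0-1.0); pulse width is the only timbral control."""
    period = SAMPLE_RATE / freq_hz
    n = int(seconds * SAMPLE_RATE)
    return [1 if (i % period) < duty * period else 0 for i in range(n)]

# The same pitch at two different pulse widths: PWM used timbrally.
samples = one_bit_pulse(440.0, 0.50, 0.5) + one_bit_pulse(440.0, 0.125, 0.5)

# Render as an 8-bit mono WAV by mapping the two states to extreme levels.
with wave.open("one_bit_demo.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(1)            # 8-bit unsigned samples
    f.setframerate(SAMPLE_RATE)
    f.writeframes(bytes(255 if s else 0 for s in samples))
```

-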
The History of Computer Language Selection
The History of Computer Language Selection

Kevin R. Parker, College of Business, Idaho State University, Pocatello, Idaho USA, [email protected]
Bill Davey, School of Business Information Technology, RMIT University, Melbourne, Australia, [email protected]

Abstract: This paper examines the history of computer language choice for both industry use and university programming courses. The study considers events in two developed countries and reveals themes that may be common in the language selection history of other developed nations. History shows a set of recurring problems for those involved in choosing languages. This study shows that those involved in the selection process can be informed by history when making those decisions.

Keywords: selection of programming languages, pragmatic approach to selection, pedagogical approach to selection.

1. Introduction. The history of computing is often expressed in terms of significant hardware developments. Both the United States and Australia made early contributions in computing. Many trace the dawn of the history of programmable computers to Eckert and Mauchly’s departure from the ENIAC project to start the Eckert-Mauchly Computer Corporation. In Australia, the history of programmable computers starts with CSIRAC, the fourth programmable computer in the world, which ran its first test program in 1949. This computer, manufactured by the government science organization (CSIRO), was used into the 1960s as a working machine at the University of Melbourne and still exists as a complete unit at the Museum of Victoria in Melbourne. Australia’s early entry into computing makes a comparison with the United States interesting. These early computers needed programmers, that is, people with the expertise to convert a problem into a mathematical representation directly executable by the computer.
-
Part I Background
Transitions and Trees: An Introduction to Structural Operational Semantics, Hans Hüttel (Cambridge University Press, 978-0-521-19746-5), excerpt.

PART I: BACKGROUND

1 A question of semantics. The goal of this chapter is to give the reader a glimpse of the applications and problem areas that have motivated and to this day continue to inspire research in the important area of computer science known as programming language semantics.

1.1 Semantics is the study of meaning. Programming language semantics is the study of mathematical models of and methods for describing and reasoning about the behaviour of programs. The word semantics has Greek roots and was first used in linguistics. Here, one distinguishes among syntax, the study of the structure of languages, semantics, the study of meaning, and pragmatics, the study of the use of language. In computer science we make a similar distinction between syntax and semantics. The languages that we are interested in are programming languages in a very general sense. The ‘meaning’ of a program is its behaviour, and for this reason programming language semantics is the part of programming language theory devoted to the study of program behaviour. Programming language semantics is concerned only with purely internal aspects of program behaviour, namely what happens within a running program. Program semantics does not claim to be able to address other aspects of program behaviour – e.g. …
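To give a flavour of the 'transitions' in the book's title (a generic toy example assumed here, not text from the excerpt), a structural operational semantics specifies behaviour by inference rules over a transition relation; for a small language of numerals and sums the small-step rules might read:

```latex
% Hypothetical small-step rules for addition expressions; the notation is
% generic and the book's own presentation may differ.
\[
\frac{e_1 \to e_1'}{e_1 + e_2 \;\to\; e_1' + e_2}
\qquad
\frac{e_2 \to e_2'}{n_1 + e_2 \;\to\; n_1 + e_2'}
\qquad
\frac{}{\,n_1 + n_2 \;\to\; n\,}\;\;\text{where } n \text{ is the sum of } n_1 \text{ and } n_2
\]
```

-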
The Standard Model for Programming Languages: The Birth of a Mathematical Theory of Computation
The Standard Model for Programming Languages: The Birth of a Mathematical Theory of Computation

Simone Martini, Department of Computer Science and Engineering, University of Bologna, Italy; INRIA, Sophia-Antipolis, Valbonne, France. http://www.cs.unibo.it/~martini, [email protected]

Abstract: Despite the insight of some of the pioneers (Turing, von Neumann, Curry, Böhm), programming the early computers was a matter of fiddling with small architecture-dependent details. Only in the sixties did some form of “mathematical program development” appear on the agenda of some of the most influential players of that time. A “Mathematical Theory of Computation” is the name chosen by John McCarthy for his approach, which uses a class of recursively computable functions as an (extensional) model of a class of programs. It is the beginning of that grand endeavour to present programming as a mathematical activity, and reasoning on programs as a form of mathematical logic. An important part of this process is the standard model of programming languages – the informal assumption that the meaning of programs should be understood on an abstract machine with unbounded resources, and with true arithmetic. We present some crucial moments of this story, concluding with the emergence, in the seventies, of the need for more “intensional” semantics, like the sequential algorithms on concrete data structures. The paper is a small step in a larger project – reflecting and tracing the interaction between mathematical logic and programming (languages), identifying some …
-
The London University Atlas
The London University Atlas: A talk given at the Atlas 50th Anniversary Symposium on 5th December 2012. Dik Leatherdale.

London >> Manchester

London University was, indeed still is, a HUGE organisation even by the standards of this place. It’s a collegiate university, the individual colleges spread across the capital from Queen Mary College in the east end to the baroque splendour of Royal Holloway in leafy Egham, built, you may be interested to learn, upon the proceeds of Victorian patent medicines, hardly any of which had any beneficial effect whatsoever. In the “build it yourself” era, two of the colleges took the plunge. At Imperial College, Keith Tocher, Sid Michaelson and Tony Brooker constructed the “Imperial College Computing Engine” using relays. With the benefit of hindsight, this looks like a dead end in development. But Imperial weren’t the only team to put their faith in relay technology. Neither were they the last. Yet amazingly, the legacy of ICCE lived on: because of a chance lunchtime conversation years later between Tom Kilburn and Tony Brooker, the Atlas adder logical design was derived from that of ICCE. Over in the unlikely surroundings of Birkbeck College, Andrew Booth was building a series of low-cost, low-performance machines. His ambition was to build computers cheap enough for every college to be able to afford one. You might say he invented the minicomputer a decade before anybody noticed. His designs were taken up by the British Tabulating Machine company and became the ICT 1200 series; for a while the most popular computer in the UK.
-
Who Did What? Engineers vs Mathematicians, America vs Europe, Ideas vs Implementations
Who did what? Engineers vs Mathematicians; America vs Europe; Ideas vs Implementations. Babbage, Lovelace, Turing, Flowers, Atanasoff, Berry, Zuse, Chandler, Broadhurst, Mauchley & Von Neumann, Wilkes, Eckert, Williams, Kilburn.

A decade of real progress: A number of different pioneers had during the 1930s and 40s begun to make significant progress in developing automatic calculators (computers). Atanasoff had pioneered the use of electronics in calculation. Zuse had made a number of theoretical breakthroughs. Eckert and Mauchley (with JvN) had produced ENIAC and produced the EDVAC report – spreading the word. Newman and Flowers had developed the Colossus.

What remained to be done? Memory! Thanks (almost entirely) to A.M. Turing there was a complete theoretical understanding of the elements necessary to produce a stored program computer – or Universal Turing Machine. Not for the first time, ideas had outstripped technology. No-one had, by the end of the war, developed a practical store or memory. It should be noted that not too many people were even working on the problem.

The players: USA: Eckert, Mauchley & von Neumann – EDVAC. UK: Maurice Wilkes – EDSAC. UK: Newman – SSEM (what was the role of Colossus?). UK: Turing – ACE. We will concentrate for a little while on the SSEM – the winner!

Controversy: Fall out over the EDVAC report. Takeover of the SSEM project by Williams (Kilburn). Discord on the ACE project and the departure of Turing. EDSAC alone was harmonious.

Newman & technology transfer: the claim. Secret technology transfer? Donald Michie, cryptanalyst: An enormous amount was transferred in an extremely seminal way to the post war developments of computers in Britain.
-
Current Issue of FACS FACTS
Issue 2021-2, July 2021. FACS FACTS: The Newsletter of the Formal Aspects of Computing Science (FACS) Specialist Group. ISSN 0950-1231.

About FACS FACTS. FACS FACTS (ISSN: 0950-1231) is the newsletter of the BCS Specialist Group on Formal Aspects of Computing Science (FACS). FACS FACTS is distributed in electronic form to all FACS members. Submissions to FACS FACTS are always welcome. Please visit the newsletter area of the BCS FACS website for further details at: https://www.bcs.org/membership/member-communities/facs-formal-aspects-of-computing-science-group/newsletters/ Back issues of FACS FACTS are available for download from: https://www.bcs.org/membership/member-communities/facs-formal-aspects-of-computing-science-group/newsletters/back-issues-of-facs-facts/

The FACS FACTS Team. Newsletter Editors: Tim Denvir [email protected], Brian Monahan [email protected]. Editorial Team: Jonathan Bowen, John Cooke, Tim Denvir, Brian Monahan, Margaret West. Contributors to this issue: Jonathan Bowen, Andrew Johnstone, Keith Lines, Brian Monahan, John Tucker, Glynn Winskel.

BCS-FACS websites. BCS: http://www.bcs-facs.org LinkedIn: https://www.linkedin.com/groups/2427579/ Facebook: http://www.facebook.com/pages/BCS-FACS/120243984688255 Wikipedia: http://en.wikipedia.org/wiki/BCS-FACS If you have any questions about BCS-FACS, please send these to Jonathan Bowen at [email protected].

Editorial. Dear readers, Welcome to the 2021-2 issue of the FACS FACTS Newsletter. A theme for this issue is suggested by the thought that it is just over 50 years since the birth of Domain Theory.
-
Hereby the Screen Stands in For, and Thereby Occludes, the Deeper Workings of the Computer Itself
[Figure: John Warnock and an IDI graphical display unit, University of Utah, 1968. Courtesy Salt Lake City Deseret News.]

The Random-Access Image: Memory and the History of the Computer Screen. JACOB GABOURY. doi:10.1162/GREY_a_00233

A memory is a means for displacing in time various events which depend upon the same information. —J. Presper Eckert Jr.

When we speak of graphics, we think of images. Be it the windowed interface of a personal computer, the tactile swipe of icons across a mobile device, or the surreal effects of computer-enhanced film and video games—all are graphics. Understandably, then, computer graphics are most often understood as the images displayed on a computer screen. This pairing of the image and the screen is so natural that we rarely theorize the screen as a medium itself, one with a heterogeneous history that develops in parallel with other visual and computational forms. What then, of the screen? To be sure, the computer screen follows in the tradition of the visual frame that delimits, contains, and produces the image. It is also the skin of the interface that allows us to engage with, augment, and relate to technical things. But the computer screen was also a cathode ray tube (CRT) phosphorescing in response to an electron beam, modified by a grid of randomly accessible memory that stores, maps, and transforms thousands of bits in real time. The screen is not simply an enduring technique or evocative metaphor; it is a hardware object whose transformations have shaped the material conditions of our visual culture.
-
Alan Turing's Automatic Computing Engine
5 Turing and the computer. B. Jack Copeland and Diane Proudfoot

The Turing machine. In his first major publication, ‘On computable numbers, with an application to the Entscheidungsproblem’ (1936), Turing introduced his abstract Turing machines. (Turing referred to these simply as ‘computing machines’—the American logician Alonzo Church dubbed them ‘Turing machines’.) ‘On Computable Numbers’ pioneered the idea essential to the modern computer—the concept of controlling a computing machine’s operations by means of a program of coded instructions stored in the machine’s memory. This work had a profound influence on the development in the 1940s of the electronic stored-program digital computer—an influence often neglected or denied by historians of the computer.

A Turing machine is an abstract conceptual model. It consists of a scanner and a limitless memory-tape. The tape is divided into squares, each of which may be blank or may bear a single symbol (‘0’ or ‘1’, for example, or some other symbol taken from a finite alphabet). The scanner moves back and forth through the memory, examining one square at a time (the ‘scanned square’). It reads the symbols on the tape and writes further symbols. The tape is both the memory and the vehicle for input and output. The tape may also contain a program of instructions. (Although the tape itself is limitless—Turing’s aim was to show that there are tasks that Turing machines cannot perform, even given unlimited working memory and unlimited time—any input inscribed on the tape must consist of a finite number of symbols.) A Turing machine has a small repertoire of basic operations: move left one square, move right one square, print, and change state.
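A compact way to see how little machinery this model needs (a sketch assumed here, not code from the chapter) is a table-driven simulator: the program maps each (state, scanned symbol) pair to what to print, which way to move, and the next state, and the machine halts when no entry applies. The state names and the bit-flipping example are invented for illustration.

```python
def run(program, tape, state, max_steps=1000):
    """Simulate a Turing machine given as a dict:
    (state, symbol) -> (symbol_to_print, 'L' or 'R', next_state)."""
    cells = dict(enumerate(tape))          # the tape, one symbol per square
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, '_')      # '_' stands for a blank square
        if (state, symbol) not in program:
            break                          # no applicable instruction: halt
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == 'R' else -1
    return ''.join(cells[i] for i in sorted(cells))

# A one-state table that flips every bit it scans, halting at the first blank.
flip = {('s', '0'): ('1', 'R', 's'),
        ('s', '1'): ('0', 'R', 's')}
print(run(flip, '1011', 's'))              # prints '0100'
```

-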
Social Construction and the British Computer Industry in the Post-World War II Period
The rhetoric of Americanisation: Social construction and the British computer industry in the Post-World War II period. By Robert James Kirkwood Reid. Submitted to the University of Glasgow for the degree of Doctor of Philosophy in Economic History, Department of Economic and Social History, September 2007.

Abstract: This research seeks to understand the process of technological development in the UK and the specific role of a ‘rhetoric of Americanisation’ in that process. The concept of a ‘rhetoric of Americanisation’ will be developed throughout the thesis through a study into the computer industry in the UK in the post-war period. Specifically, the thesis discusses the threat of America, or how actors in the network of innovation within the British computer industry perceived it as a threat, and the effect that this perception had on actors operating in the networks of construction in the British computer industry. However, the reaction to this threat was not a simple one. Rather, this story is marked by sectional interests and technopolitical machination attempting to capture this rhetoric of ‘threat’ and ‘falling behind’. In this thesis the concept of ‘threat’ and ‘falling behind’, or more simply the ‘rhetoric of Americanisation’, will be explored in detail, as will the effect this had on the development of the British computer industry. What form did the process of capture and modification by sectional interests within government and industry take, and what impact did this have on the British computer industry? In answering these questions, the thesis will first develop a concept of a British culture of computing which acts as the surface of emergence for various ideologies of innovation within the social networks that made up the computer industry in the UK.