The Making of the MCM/70 Microcomputer
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
Wang Laboratories 2200 Series
C21-908-101 Programmable Terminals: Wang Laboratories 2200 Series

MANAGEMENT SUMMARY

A series of five minicomputer systems that offer both batch and interactive communications capabilities. Emulation packages are provided for standard Teletype communications; IBM 2741 emulation; IBM 2780, 3780, and 3741 batch terminal communications; and IBM 3270 interactive communications. A basic 2200 PCS-III with 32K bytes of user memory, a CRT display, and a minidiskette drive is priced at $6,500. Prices range upward to $21,000 for a 2200MVP processor with 256K bytes of user memory, excluding peripherals. Leases and rental plans are also available.

Wang Laboratories' 2200 Series product line currently consists of the 2200VP, 2200MVP, 2200LVP and 2200SVP processors, all of which are multi-user or multi-user upgradeable systems. The 2200 Series product line also includes the 2200 PCS-III, a packaged system which is single-user oriented.

The CPU and floppy disk drive are packaged together with appropriate cables and a cabinet to produce an integrated system. The 2200 PCS-III includes the CPU, CRT, keyboard and single-sided double-density minidiskette drive in a standard CRT-size enclosure, as well as an optional disk multiplexer controller. The number of peripherals that can be attached to a given system depends on the number of I/O slots that are available, with most peripherals requiring one slot. In all 2200 Series CPUs, highly efficient use of memory is achieved through the use of separate control memory to store the BASIC-2 interpreter and operating system.

CHARACTERISTICS

VENDOR: Wang Laboratories, Inc., One Industrial Avenue.

Wang began marketing the original 2200 Series in the Spring of 1973.
An Array-Oriented Language with Static Rank Polymorphism
An array-oriented language with static rank polymorphism

Justin Slepak, Olin Shivers, and Panagiotis Manolios
Northeastern University

Abstract. The array-computational model pioneered by Iverson's languages APL and J offers a simple and expressive solution to the "von Neumann bottleneck." It includes a form of rank, or dimensional, polymorphism, which renders much of a program's control structure implicit by lifting base operators to higher-dimensional array structures. We present the first formal semantics for this model, along with the first static type system that captures the full power of the core language. The formal dynamic semantics of our core language, Remora, illuminates several of the murkier corners of the model. This allows us to resolve some of the model's ad hoc elements in more general, regular ways. Among these, we can generalise the model from SIMD to MIMD computations, by extending the semantics to permit functions to be lifted to higher-dimensional arrays in the same way as their arguments. Our static semantics, a dependent type system of carefully restricted power, is capable of describing array computations whose dimensions cannot be determined statically. The type-checking problem is decidable and the type system is accompanied by the usual soundness theorems. Our type system's principal contribution is that it serves to extract the implicit control structure that provides so much of the language's expressive power, making this structure explicitly apparent at compile time.

1 The Promise of Rank Polymorphism

Behind every interesting programming language is an interesting model of computation.
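As a rough illustration of the lifting behavior the abstract describes, here is a sketch using NumPy broadcasting in Python as a stand-in; this is not Remora's own notation, only an analogous mechanism:

```python
import numpy as np

# The same "+" operator applies whether its arguments are scalars, vectors,
# or matrices; the control structure (looping over cells) stays implicit.
scalar = 10
vector = np.array([1, 2, 3])
matrix = np.array([[1, 2, 3],
                   [4, 5, 6]])

print(scalar + vector)   # [11 12 13]
print(vector + matrix)   # the vector is lifted across the matrix's rows:
                         # [[2 4 6]
                         #  [5 7 9]]
```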
Compendium of Technical White Papers
COMPENDIUM OF TECHNICAL WHITE PAPERS

Compendium of Technical White Papers from Kx

Contents

Machine Learning
1. Machine Learning in kdb+: kNN classification and pattern recognition with q
2. An Introduction to Neural Networks with kdb+

Development Insight
3. Compression in kdb+
4. Kdb+ and Websockets
5. C API for kdb+
6. Efficient Use of Adverbs

Optimization Techniques
7. Multi-threading in kdb+: Performance Optimizations and Use Cases
8. Kdb+tick Profiling for Throughput Optimization
9. Columnar Database and Query Optimization

Solutions
10. Multi-Partitioned kdb+ Databases: An Equity Options Case Study
11. Surveillance Technologies to Effectively Monitor Algo and High Frequency Trading
Microprocessors in the 1970'S
Part II: 1970's -- The Altair/Apple Era

Figure 3.1: A graphical history of personal computers in the 1970's, the MITS Altair and Apple Computer era.
Figure 3.2: Andrew S. Grove, Robert N. Noyce and Gordon E. Moore.
Figure 3.3: Marcian E. "Ted" Hoff. (Photographs are courtesy of Intel Corporation.)
Figure 3.4: The Intel MCS-4 (Micro Computer System 4) basic system.
Figure 3.5: A photomicrograph of the Intel 4004 microprocessor. (Photographs are courtesy of Intel Corporation.)

Chapter 3: Microprocessors in the 1970's

The creation of the transistor in 1947 and the development of the integrated circuit in 1958/59 are the technologies that formed the basis for the microprocessor. Initially the technology enabled only a restricted number of components on a single chip. However, this changed significantly in the following years. The technology evolved from Small Scale Integration (SSI) in the early 1960's to Medium Scale Integration (MSI), with a few hundred components per chip, in the mid 1960's. By the late 1960's, LSI (Large Scale Integration) chips with thousands of components had appeared. This rapid increase in the number of components in an integrated circuit led to what became known as Moore's Law. The concept of this law was described by Gordon Moore in an article entitled "Cramming More Components Onto Integrated Circuits" in the April 1965 issue of Electronics magazine [338].
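As a hedged numerical sketch of the exponential trend Moore described (the doubling period and starting count below are illustrative assumptions, not figures taken from the 1965 article):

```python
# Project exponential growth in components per chip from an assumed baseline.
def components(year, base_year=1965, base_count=64, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1965, 1970, 1975, 1980):
    print(year, round(components(year)))
# 1965: 64, 1970: 362, 1975: 2048, 1980: 11585. The constants are made up;
# the point is the order-of-magnitude growth per decade, not exact counts.
```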
Application and Interpretation
Programming Languages: Application and Interpretation

Shriram Krishnamurthi, Brown University

Copyright © 2003, Shriram Krishnamurthi. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States License. If you create a derivative work, please include the version information below in your attribution. This book is available free-of-cost from the author's Web site. This version was generated on 2007-04-26.

Preface

The book is the textbook for the programming languages course at Brown University, which is taken primarily by third and fourth year undergraduates and beginning graduate (both MS and PhD) students. It seems very accessible to smart second year students too, and indeed those are some of my most successful students. The book has been used at over a dozen other universities as a primary or secondary text. The book's material is worth one undergraduate course's worth of credit.

This book is the fruit of a vision for teaching programming languages by integrating the "two cultures" that have evolved in its pedagogy. One culture is based on interpreters, while the other emphasizes a survey of languages. Each approach has significant advantages but also huge drawbacks. The interpreter method writes programs to learn concepts, and has at its heart the fundamental belief that by teaching the computer to execute a concept we more thoroughly learn it ourselves. While this reasoning is internally consistent, it fails to recognize that understanding definitions does not imply we understand the consequences of those definitions. For instance, the difference between strict and lazy evaluation, or between static and dynamic scope, is only a few lines of interpreter code, but the consequences of these choices are enormous.
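To make that last point concrete, here is a small sketch (mine, not the book's code) of how a single choice in a toy evaluator flips between static and dynamic scope, with very different results:

```python
# Evaluate the body "x + y" of a function whose free variable x was bound
# where the function was defined. Static scope looks x up in the definition
# environment; dynamic scope looks it up in the caller's environment.
def call_adder(y, defn_env, call_env, static=True):
    env = defn_env if static else call_env
    return env["x"] + y

defn_env = {"x": 1}     # environment where the function was written
call_env = {"x": 100}   # environment at the call site

print(call_adder(5, defn_env, call_env, static=True))   # 6, static scope
print(call_adder(5, defn_env, call_env, static=False))  # 105, dynamic scope
```

The entire difference is the one environment-selection line, yet it changes which binding of x the program sees.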
The Origins of Word Processing and Office Automation
Remembering the Office of the Future: The Origins of Word Processing and Office Automation

Thomas Haigh, University of Wisconsin

Word processing entered the American office in 1970 as an idea about reorganizing typists, but its meaning soon shifted to describe computerized text editing. The designers of word processing systems combined existing technologies to exploit the falling costs of interactive computing, creating a new business quite separate from the emerging world of the personal computer.

Most people first experienced word processing using a word processor, a software package such as Microsoft Word; we think of it as an application of the personal computer. However, in the early 1970s, when the idea of word processing first gained prominence, it referred to a new way of organizing work: an ideal of centralizing typing and transcription in the hands of specialists equipped with technologies such as automatic typewriters. The word processing concept was promoted by IBM to present its typewriter and dictating machine division as a complement to its "data processing" business. Within the word processing center, automatic typewriters and dictating machines were rechristened word processing machines, to be operated by word processing operators rather than secretaries or typists.

During the 1980s, word processing rivaled and eventually overtook spreadsheet creation as the most widespread business application for personal computers. By the end of that decade, the typewriter had been banished to the corner of most offices, used only to fill out forms and address envelopes. By the early 1990s, high-quality printers and powerful personal computers were a fixture in middle-class American households. Email, which emerged as another key application for personal computers with the spread of the Internet in the mid-1990s, essentially extended word processing technology to electronic message transmission.
Timeline of Computer History
Timeline of Computer History

1937: Bell Laboratories scientist George Stibitz uses relays for a demonstration adder. Called the "Model K" Adder because he built it on his "Kitchen" table, this simple demonstration circuit provides proof of concept for applying Boolean logic to the design of computers, resulting in construction of the relay-based Model I Complex Calculator in 1939. That same year in Germany, engineer Konrad Zuse built his Z2 computer, also using telephone company relays.

1939: Hewlett-Packard is founded. David Packard and Bill Hewlett found their company in a Palo Alto, California garage. Their first product, the HP 200A Audio Oscillator, rapidly became a popular piece of test equipment for engineers. Walt Disney Pictures ordered eight of the 200B model to test recording equipment and speaker systems for the 12 specially equipped theatres that showed the movie "Fantasia" in 1940. [Image: Hewlett and Packard in their garage workshop]

1940: The Complex Number Calculator (CNC) is completed. [Image: Operator at the Complex Number Calculator (CNC)]

1941: Konrad Zuse finishes the Z3 Computer. The Z3, an early computer built by German engineer Konrad Zuse working in complete isolation from developments elsewhere, uses 2,300 relays, performs floating point binary arithmetic, and has a 22-bit word length. The Z3 was used for aerodynamic calculations but was destroyed in a bombing raid on Berlin in late 1943. Zuse later supervised a reconstruction of the Z3 in the 1960s, which is currently on display at the Deutsches Museum in Munich. [Image: The Zuse Z3 Computer]
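As an informal sketch of the "Boolean logic applied to computing" idea the Model K demonstrated with relays (this Python version is only an analogy for the relay circuit, not a description of Stibitz's hardware):

```python
# One-bit binary addition expressed as Boolean operations:
# the sum bit is XOR, the carry bit is AND.
def half_adder(a, b):
    return a ^ b, a & b            # (sum, carry)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2             # (sum, carry_out)

for bits in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
    print(bits, "->", full_adder(*bits))
# (1, 1, 1) -> (1, 1), i.e. 1 + 1 + 1 = binary 11
```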
Museum Monthly Reports
EXHIBITS AND ARCHIVES DEPARTMENT -- OCTOBER '83 REPORT

STAFFING:
Meredith Stelling, Coordinator
Gregory Welch, Operations Manager/Research
Bill Wisheart, Registrar/Photo and Video Archives
Beth Parkhurst, Research

RECENT ARTIFACT ACQUISITIONS (since October 1, 1983):
X239.83 Monroe High Speed Adding Calculator, gift of Lee Swanson.
X240.83 Vari-typer, gift of Lee Swanson.
X241.83 HP-65 Programmable Calculator, gift of Stephen and Barbara Gross.
X242.83 BIAX memory cores, gift of G.B. Westrom.
X243.83 - X259.83 The University of Illinois Department of Computer Science Collection of Drawing Instruments, Slide Rules, Calculators and Circuit Boards.
X243.83 Smith's Improved Protractor.
X246.83 ILLIAC III Circuit Boards.
X247.83 ILLIAC II Circuit Board.
X250.83 Keuffel & Esser Cylindrical Slide Rule.
X260.83 - X274.83 The SAGE AN/FSQ-7 computer. Gift of The National Museum of Science and Technology, Ontario.
X260.83 1/2 Master console.
X261.83 Magnetic Drum Unit.
X262.83 IBM 718 printer.
X263.83 A - E: 5 Radar Operator's Consoles.
X264.83 A - E: 5 Auxiliary Consoles.
X265.83 A - E: 5 Operator's Chairs.
X266.83 IBM 26 Card Punch.
X267.83 IBM 723 Card Recorder.
Related Links History of the Radio Shack Computers
Radio Shack TRS-80 Model II

Catalog: 26-4002
Announced: May 1979
Released: October 1979
Price: $3450 (32K RAM); $3899 (64K RAM)
CPU: Zilog Z-80A, 4 MHz
RAM: 32K or 64K
Ports: Two serial ports, one parallel port
Display: Built-in 12" monochrome monitor, 40 x 24 or 80 x 24 text
Storage: One 500K 8-inch built-in floppy drive; external expansion with 3 floppy bays
OS: TRS-DOS, BASIC

The TRS-80 Model II microcomputer system, designed and manufactured by Radio Shack in Fort Worth, TX, was not intended to replace or obsolete the Model I. It was designed to take up where the Model I left off - a machine with increased capacity and speed in every respect, targeted directly at the small-business application market.

The Model II contains a single-sided full-height Shugart 8-inch floppy drive, which holds 500K bytes of data, compared to only 87K bytes on the 5-1/4 inch drives of the Model I.
The Restoration of the Honeywell Questar M Written by Sergio Gervasini for ESOCOP – the European Society for Computer Preservation
European SOciety for COmputer Preservation

The restoration of the Honeywell Questar M
Written by Sergio Gervasini for ESOCOP – The European Society for Computer Preservation
http://www.esocop.org
Published in 2018 by the European Society for Computer Preservation.

Table of Contents
License
References
History
My history
Visual Check
Power supply and motherboard
Monitor
Keyboard
Chapter 1 Basic Principles of Programming Languages
Chapter 1: Basic Principles of Programming Languages

Although there exist many programming languages, the differences among them are insignificant compared to the differences among natural languages. In this chapter, we discuss the common aspects shared among different programming languages. These aspects include: programming paradigms that define how computation is expressed; the main features of programming languages and their impact on the performance of programs written in the languages; a brief review of the history and development of programming languages; the lexical, syntactic, and semantic structures of programming languages; data and data types; program processing and preprocessing; and the life cycles of program development.

At the end of the chapter, you should have learned:
- what programming paradigms are;
- an overview of different programming languages and the background knowledge of these languages;
- the structures of programming languages and how programming languages are defined at the syntactic level;
- data types, strong versus weak checking;
- the relationship between language features and their performances;
- the processing and preprocessing of programming languages, compilation versus interpretation, and different execution models of macros, procedures, and inline procedures;
- the steps used for program development: requirement, specification, design, implementation, testing, and the correctness proof of programs.

The chapter is organized as follows. Section 1.1 introduces the programming paradigms, performance, features, and the development of programming languages. Section 1.2 outlines the structures and design issues of programming languages. Section 1.3 discusses the typing systems, including types of variables, type equivalence, type conversion, and type checking during the compilation. Section 1.4 presents the preprocessing and processing of programming languages, including macro processing, interpretation, and compilation.
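As a brief, hypothetical illustration of one distinction listed above, macro expansion versus a procedure call (the string-based "macro expander" here is invented for this sketch, not anything from the chapter):

```python
# A macro substitutes its argument's text into the body, so the argument can
# be evaluated more than once; a procedure evaluates the argument once and
# passes the resulting value.
SQUARE_MACRO = "(({x}) * ({x}))"

def expand(macro, arg_text):
    return macro.format(x=arg_text)

def square_proc(x):
    return x * x

counter = iter(range(10))
expanded = expand(SQUARE_MACRO, "next(counter)")
print(expanded)                     # ((next(counter)) * (next(counter)))
print(eval(expanded))               # 0 * 1 = 0, the argument ran twice
print(square_proc(next(counter)))   # 2 * 2 = 4, the argument ran once
```

In a language with real macros the expansion happens at compile time; the eval call here only mimics that textual substitution at run time.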
A Modern Reversible Programming Language April 10, 2015
Arrow: A Modern Reversible Programming Language

Author: Eli Rose
Advisor: Bob Geitz
April 10, 2015

Abstract
Reversible programming languages are those whose programs can be run backwards as well as forwards. This condition impacts even the most basic constructs, such as =, if and while. I discuss Janus, the first imperative reversible programming language, and its limitations. I then introduce Arrow, a reversible language with modern features, including functions. Example programs are provided.

Introduction: Many processes in the world have the property of reversibility. To start washing your hands, you turn the knob to the right, and the water starts to flow; the process can be undone, and the water turned off, by turning the knob to the left. To turn your computer on, you press the power button; this too can be undone, by again pressing the power button. In each situation, we had a process (turning the knob, pressing the power button) and a rule that told us how to "undo" that process (turning the knob the other way, and pressing the power button again). Call the second two the inverses of the first two. By a reversible process, I mean a process that has an inverse. Consider two billiard balls, with certain positions and velocities such that they are about to collide. The collision is produced by moving the balls according to the laws of physics for a few seconds. Take that as our process. It turns out that we can find an inverse for this process: a set of rules to follow which will undo the collision and restore the balls to their original states.
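A minimal sketch of that pairing of steps with inverses, written in plain Python rather than Arrow or Janus syntax; the step encoding is invented for illustration:

```python
# Each forward step is a reversible update; running the steps in reverse
# order with the inverse operation restores the starting state.
forward = [("x", "+=", 3), ("y", "+=", "x"), ("x", "-=", 2)]

def run(steps, state, reverse=False):
    apply_op = {"+=": lambda a, b: a + b, "-=": lambda a, b: a - b}
    inverse = {"+=": "-=", "-=": "+="}
    for var, op, rhs in (reversed(steps) if reverse else steps):
        if reverse:
            op = inverse[op]
        value = state[rhs] if isinstance(rhs, str) else rhs
        state[var] = apply_op[op](state[var], value)
    return state

state = run(forward, {"x": 0, "y": 10})
print(state)                              # {'x': 1, 'y': 13}
print(run(forward, state, reverse=True))  # {'x': 0, 'y': 10}, original restored
```

The sketch only works because each update's right-hand side never mentions the variable being updated; that is the same kind of restriction a reversible language such as Janus places on assignment so that every statement has an inverse.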