Computer Architecture
A. M. Sergiyenko

Computer Architecture
Part 1. Central Processing Unit and Peripherals

Kyiv 2016

UDC 004.383
BBK 32.973-018

Reviewers:
A. M. Romankevich, Doctor of Technical Sciences, Professor (Department of Specialized Computer Systems, National Technical University of Ukraine "KPI");
V. P. Simonenko, Doctor of Technical Sciences, Professor (Department of Computer Engineering, National Technical University of Ukraine "KPI")

Sergiyenko A. M. Computer Architecture. Part 1. Central Processing Unit and Peripherals: Lecture notes. In English. Kyiv: NTUU "KPI", 2016. 182 p.

The lecture notes present theoretical information about the architectures of modern computers. Intended for students, postgraduates, and university lecturers in the fields of electronics, measuring and computing technology, communications, and information security.

© A. M. Sergiyenko, 2016

CONTENTS

FOREWORD
1. CENTRAL PROCESSING UNIT ARCHITECTURE
1.1. Introduction. Basic definitions and principles of computers
1.2. Element basis of computers
1.3. Von Neumann computer architecture
1.4. Instruction set of the computer
1.5. Data formats and operations
1.6. Addressing modes
1.7. Intel 8051 architecture
1.8. RISC and CISC processors
1.9. Interrupt system
2. PROCESSOR ENVIRONMENT ARCHITECTURE
2.1. Interface basics
2.2. Memory in computers
2.3. External storage devices
2.4. Computer console
2.5. Structure of a single-processor computer
BIBLIOGRAPHY
ANNEX 1

FOREWORD

Computer architecture is the basic information about a computer that is necessary both for its design and for its programming. The development of a new program is the creative process of implementing a given algorithm on a computer with a specific architecture. A programmer creating a new program must clearly understand which processes are executed, and in what order, in the computer that implements that program.
Otherwise, the calculation process planned by the programmer may differ from the computational process actually running in a specific computer, and as a result the purpose of the program development may be unachievable.

Every year more complicated computer architectures come into operation, with an increasing degree of inherent parallelism and level of information security. At the same time, many programmers perceive the computer architecture as one that has remained unchanged for decades. On the one hand, this means that without knowledge of the new architectures programmers cannot achieve the most effective implementation of their programs on modern computers. On the other hand, the particular organization of computing processes in these computers can lead to inefficient or even incorrect execution of programs written for legacy architectures. Moreover, without architectural knowledge it is impossible to develop, tune, and analyse many complex and mission-critical applications that have special requirements for speed, reliability, information protection, or energy consumption.

In the "Computer Architecture" discipline, students learn theoretical information about the architectural elements of modern computers and the communication between them, and gain skills in applying this knowledge to the selection and programming of microprocessors. Therefore, this discipline is central to the education of specialists in the fields of "Information Control Systems and Technologies" and "Computer Systems and Networks".

1. CENTRAL PROCESSING UNIT ARCHITECTURE

1.1. Introduction. Basic definitions and principles of computers

To understand what architecture is, and why it is studied and investigated, we should recall the history of computers.

1.1.1. The history of computers and their architectures

The most significant impetus for the invention of the electronic computing machine was the need to improve weapons during the Second World War.
The problem of calculating the trajectories of shells and bombs was important for the preparation of firing and bombing tables, and it was desirable to calculate such data in real time. To speed up the solution of this problem, in 1943 the University of Pennsylvania began to develop the first electronic computer, ENIAC (Electronic Numerical Integrator And Computer), by order of the US Army. It started operation in 1946 and worked until 1955.

The ENIAC computer had 18,000 vacuum tubes and multiplied 10-digit decimal numbers in 2.8 ms. It consisted of several integrator-accumulators, registers, and tables of constants and input-output data, and it performed iterative algorithms for solving differential equations. Programming such a computer consisted of connecting those registers, accumulators, and other units into a network via data lines in accordance with the algorithm graph. At first this network was set up manually, using wires with connectors and a switching field that resembled a telephone exchange, so changing the algorithm required several days. To accelerate this process, a special patch panel holding the network configuration was invented; the algorithm was then changed manually by replacing the "programmed" patch panel. So, ENIAC was a programmable computer, but both the set of algorithms and the data types were very limited.

In 1943 in England, the first of ten Colossus computers was developed and manufactured under the leadership of T. Flowers. These machines were used to decipher German radio messages. By its principle of operation this computer was a symbolic correlator consisting of 1,500 tubes. However, its structure also reflected the structure of the algorithm, and this algorithm was fixed.

Even earlier, from 1939 to 1942, J. Atanasoff at Iowa State College developed the ABC electronic computer.
It performed a single algorithm: the solution of systems of linear equations. It had a parallel structure of thirty sequential arithmetic logic units (ALUs) with parallel data storage based on a capacitor drum.

Thus, the first electronic computers were able to perform only one or a few similar algorithms, which were encoded in their control devices or interconnect structure. Their ideology was borrowed from the mechanical and electro-mechanical calculators that were widespread until the middle of the twentieth century. They were very costly, and a computer had to become a versatile installation to be much more popular and effective.

In 1936, Alan Turing proposed a computational model in which an arbitrary algorithm is coded as a table in the control unit and as the initial state of an endless storage tape. This tape was the model of the computer memory. Thus the idea of a computer that can perform any hypothetical algorithm was born.

It is believed that the idea of implementing a universal programmable computer first appeared in the USA, at the Moore School of Electrical Engineering of the University of Pennsylvania. There, a team of scientists led by von Neumann drew up a prominent report in 1946. In this report, the design principles of the programmable computer were proposed. Among them were the ideas to store both data and program in a single computer memory, to process data in binary representation using the apparatus of Boolean algebra, and to fetch instructions using an instruction counter. Thus, the memory of a computer was considered the basis of its functionality and versatility.

In those days, the computer structure depended significantly on the memory design. Register memory based on vacuum tubes was fast but too expensive and unreliable. Memory based on mercury acoustic delay lines was very popular.
This memory was of the sequential type, and therefore it dictated sequential, bit-wise processing of the data. The Williams cathode ray tube (CRT) recorded data bits in the form of spots on the tube screen and provided random access to the data, but the volume of such a memory did not exceed thousands of bits.

In June 1948, T. Kilburn and F. Williams were the first to run a program on the SSEM (Small-Scale Experimental Machine) computer at Manchester University. It was the world's first stored-program computer, and it successfully demonstrated the practicality of the stored-program approach and the effectiveness of the Williams tube memory. The SSEM was redesigned into the Mark 1 computer, which ran its first program in April 1949.

The Cambridge professor M. Wilkes took up the idea of von Neumann's group and ran his first program on the EDSAC (Electronic Delay Storage Automatic Computer) in 1949. A year later, the EDVAC (Electronic Discrete Variable Automatic Computer) went into operation at the Moore School; in fact, the report of the von Neumann group had been compiled for the development of this very computer. Both of these computers were built on delay lines; therefore, they operated synchronously and serially, that is, words were processed bit by bit.

In the same period in Kiev, S. A. Lebedev developed the MESM computer (Malaja Elektronno-Schetnaja Mashina), which incorporated many new ideas. One of them was parallel data processing in the ALU (such as was first used in the Whirlwind computer in the USA in 1951). Another was the microprogramming principle (which was published in 1951 and implemented in 1957 by M. Wilkes). MESM ran for the first time in 1950 and was put into operation in late 1951. Its successful use for complex calculations in defense projects contributed to the emergence and expansion of the computer industry in the USSR.
The Univac-1 machine, produced from 1951 by the Univac division of the Remington Rand company, gave a big push to the commercial distribution of computers. It contained 5,000 vacuum tubes and hundreds of delay lines storing a thousand 12-digit decimal numbers. Program and data input-output was performed by switches, an electric typewriter, and a magnetic tape device. In 1953, this company became an early adopter of random access memory (RAM) based on magnetic cores; this memory invention improved computer performance by a factor of 50.