Introduction to Computers
1. The History of Computers

If I asked when the history of computers began, I am relatively certain that most answers would be between 1940 and 1980. Few people would expect that the first mechanical computer was designed in 1823. So, our story begins with Charles Babbage.

1823 – Charles Babbage

Charles Babbage (26 December 1791 – 18 October 1871) was a mathematician, philosopher, inventor and mechanical engineer. He is widely regarded as the "father of the computer" as he invented the first mechanical computer. His Analytical Engine, first described in 1837, could perform arithmetic calculations, evaluate true and false statements, and execute loops and conditional branching. It even contained integrated memory. All this, while Queen Victoria was on the throne.

His Difference Engine was the forerunner to the Analytical Engine. It was not programmable but could do arithmetic calculations. In 1823 the British government gave Babbage £1 700 (± £211 000 or R3 900 000 today) to start work on the project. The actual implementation proved to be extremely expensive, as the metalworking techniques of the era could not produce the precision and quality required. The government abandoned the project in 1842, after Babbage had received and spent £17 000 (± R39 000 000 today).

His Difference Engine was finally constructed by the British Science Museum between 1989 and 1991 to celebrate the bicentennial of Babbage's birth. They used his original plans, and the success of the finished engine proved that Babbage's machine would have worked. The Difference Engine can be seen today in the Science Museum in London.

1890 – Herman Hollerith

Herman Hollerith designed a punched card system to tabulate the 1890 census and saved the government $5 million (± $138 million or R2 billion today). Anyone who has done a multiple-choice exam on special cards will appreciate how a punched card system works: a hole or a black mark can be read by a punched card reader and stored on a computer. Hollerith established a company called the Computing-Tabulating-Recording Company (CTR), which would go on to become the International Business Machines Corporation, or IBM. IBM is the sixth largest IT company in the world; Apple, Samsung, Microsoft, Google and Intel occupy the top five spots.

1936 – Alan Turing

Alan Turing (23 June 1912 – 7 June 1954) was a mathematician, computer scientist, logician, cryptanalyst, philosopher and theoretical biologist. It is estimated that his cryptanalytic work during the Second World War shortened the war by two years, saving approximately fourteen million lives. He was awarded the Order of the British Empire (OBE) in secret in 1946, as much of his work was covered under the Official Secrets Act. He was convicted of indecency in 1952 because he was homosexual, and was chemically castrated. He committed suicide two years later, and we lost one of the greatest minds of the twentieth century. There are many who dispute the finding of suicide, but as the apple that supposedly contained the cyanide was never tested, we may never know. Queen Elizabeth II posthumously pardoned Alan Turing in 2013. Nevertheless, he is regarded as the founder of theoretical computer science as well as artificial intelligence. He designed the Turing Test for artificial intelligence, which to date has not been passed.

In 1936, at the age of 24, he presented the notion of a universal machine that would be capable of computing anything that is computable. This was the first idea of a general-purpose computer. The machine was later called the Turing Machine and is still studied in Computer Science degrees today.
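To make the idea concrete, below is a minimal sketch of a single-tape Turing machine: a tape of symbols, a read/write head, and a table of rules. The language (Python) and the rule set (which simply overwrites a run of ones with zeros) are my own choices for illustration, not something the original text specifies.

    # A minimal Turing machine: a tape, a head position, a current state,
    # and rules mapping (state, symbol) -> (symbol to write, move, next state).
    def run_turing_machine(input_tape, rules, state="start", halt="halt"):
        tape = dict(enumerate(input_tape))  # sparse tape: position -> symbol
        head = 0
        while state != halt:
            symbol = tape.get(head, "_")    # "_" is the blank symbol
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
        return "".join(tape[i] for i in sorted(tape))

    # Illustrative rule set: move right, replacing each 1 with 0; halt at a blank.
    rules = {
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run_turing_machine("111", rules))  # prints 000_

Despite its simplicity, a rule table like this is, in principle, all that any computer executes, which is exactly why the model is still taught today.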
FIRST GENERATION – Vacuum Tubes

The first generation lasted from approximately 1942 to 1954 and made use of vacuum tubes. It used batch processing, punched cards and magnetic tapes.

The simplest vacuum tube, the diode, was invented in 1904. Vacuum tubes were a key component of electrical circuits for the first half of the twentieth century. They were found in radios, televisions, radar, recording devices, telephone networks and first-generation computers. They contain a heated electron-emitting cathode and an anode. Current could only flow in one direction through the device, so it operated like a switch. Computers store and transmit data using binary, which is a series of ones and zeros: if the switch was on it represented a one, and if off, a zero. This simple vacuum tube was therefore critical to the processing of a computer.

Vacuum tubes generated a lot of heat and, like light bulbs, had to be replaced often. They would last anywhere from forty-one days to a little over two years, depending on the manufacturer. They were very expensive, especially the long-life vacuum tubes, which meant that computers could only be afforded by very large companies. The use of vacuum tubes also resulted in extremely bulky computers, which could take up entire floors of buildings.
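To show how a bank of on/off switches carries data, here is a small sketch of a character stored as the ones and zeros described above. Python and the eight-bit ASCII encoding are my own illustrative choices, not something the original text specifies.

    # Each character is stored as a pattern of eight switches (bits).
    # ord() gives the character's numeric code; format() writes it in binary.
    for ch in "HI":
        bits = format(ord(ch), "08b")  # e.g. 'H' (code 72) -> "01001000"
        switches = ", ".join("on" if b == "1" else "off" for b in bits)
        print(ch, "=", bits, "=", switches)

A pattern of eight such switches holds a single character, which gives a feel for why a useful machine needed thousands of tubes.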
1943 – Electronic Numerical Integrator and Computer (ENIAC)

Two University of Pennsylvania professors, John Mauchly and J Presper Eckert, built the Electronic Numerical Integrator and Computer (ENIAC). It filled a 6 m x 12 m room and contained 18 000 vacuum tubes. The ENIAC experienced a vacuum tube failure on average once every two days, and it would take about fifteen minutes to locate the faulty tube. It is considered the grandfather of modern-day digital computers. Only one ENIAC was ever built, at a cost of about $500 000 (several million dollars today).

The ENIAC was primarily designed to calculate artillery firing tables for the United States Army. It could calculate a trajectory in 30 seconds that took a human 20 hours. It was in continuous operation for the US Army until 2 October 1955. It would take the ENIAC almost two years to perform the number of operations an iPad performs in one second.

1946 – Universal Automatic Computer (UNIVAC)

Mauchly and Eckert left the University of Pennsylvania to build the UNIVAC, the first commercial computer for business and government. In 1952 the UNIVAC predicted the outcome of that year's presidential election, forecasting an Eisenhower landslide when the traditional pollsters expected Adlai Stevenson to win.

1947 – The Transistor

Bell Laboratories invented the transistor, an electric switch made from a semiconductor. This eliminated the need for vacuum tubes, radically reducing the size of computers and making them far more reliable. This invention led to the second generation of computers.

SECOND GENERATION – Transistors

The second generation lasted from approximately 1952 to 1964 and made use of transistors. It made use of high-level programming languages like FORTRAN and COBOL.

1958 – Integrated Circuit

Jack Kilby and Robert Noyce developed the Integrated Circuit, known as the computer chip. This single chip contained multiple transistors on a semiconductor material, usually silicon. Integrated circuits used less power, were smaller and were far more reliable than circuits built from individual transistors. Kilby was awarded the Nobel Prize for Physics in 2000 for his work. Unfortunately, Robert Noyce had died of a heart attack in 1990 at the age of 62, and the Nobel Prize is not awarded posthumously. Integrated circuits form the basis of modern computers and led to the third generation of computers.

THIRD GENERATION – Integrated Circuits

The third generation lasted from approximately 1964 to 1972 and made use of integrated circuits. It made use of high-level programming languages like FORTRAN II, COBOL, PASCAL and BASIC.

1965 – Moore's Law

Gordon E Moore, co-founder of Intel, made an observation now called Moore's Law: the number of transistors in a dense integrated circuit doubles approximately every two years, which in turn roughly doubles the power of the processor. Intel's processors in 1978 had 29 000 transistors; current i9 CPUs, forty years later, have an estimated 6.5 billion. Current central processing units have transistors that are only about ten atoms thick.

Moore's observation held up well, and CPU clock speeds roughly doubled every two years until about the year 2000; since then the gains in speed have slowed. The Pentium 4 reached 3.4 GHz in 2004, yet the fastest Intel CPUs in 2019 run at only about 5 GHz, slightly less than a 50% increase in fifteen years. This is largely because we are reaching the limits of the current technology.

Very Large-Scale Integrated Circuits (VLSI)

Very large-scale integration was made possible by the wide adoption of the MOS (metal–oxide–semiconductor) transistor. MOS integrated circuit technology allowed the integration of more than 10 000 transistors in a single chip. Later hundreds of thousands, then millions, and now billions of transistors could be included on a single microchip.

FOURTH GENERATION – Very Large-Scale Integrated Circuits

The fourth generation lasted from approximately 1972 to 1990 and made use of Very Large-Scale Integrated Circuits (VLSI), which packed many thousands of transistors and other circuit elements into a single chip. It made use of high-level programming languages like C, C++ and dBASE. Computers became cheaper, smaller and portable; they ran cooler and did not require air-conditioning. The concept of the Internet was introduced, and networks were developed.

1973 – Xerox Alto

The Xerox Alto was a milestone in computing. It was the first computer to support an operating system based on a graphical user interface (GUI), a decade before mass-market GUI systems became available.
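As a closing illustration of the Moore's Law section above, here is a short sketch of the doubling arithmetic. Python, the 29 000-transistor starting point from the text, and the strict two-year doubling period are assumptions for illustration; the law is an observed trend, not an exact formula.

    # Moore's Law as arithmetic: transistor count doubles every two years.
    def moores_law(base_count, base_year, target_year, period=2):
        doublings = (target_year - base_year) / period
        return base_count * 2 ** doublings

    # Projecting forward from 29 000 transistors in 1978:
    for year in (1978, 1988, 1998, 2008, 2018):
        print(year, f"{moores_law(29_000, 1978, year):,.0f}")
    # 2018 -> about 30 billion. The i9's estimated 6.5 billion suggests the
    # doubling period has stretched in recent years, as the text notes.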