Introduction to Computers

1. The History of Computers

If I asked when the history of computers began, I am relatively certain that most answers would be between 1940 and 1980. Few people would expect that the first mechanical computer was built in 1823. So, our story begins with Charles Babbage.

1823 – Charles Babbage

Charles Babbage was a mathematician, philosopher, inventor and a mechanical engineer. He is widely regarded as the “father of the computer” as he invented the first mechanical computer.

He designed the Analytical Engine, first described in 1837, which could perform arithmetic calculations, evaluate true and false statements, and support loops and conditional branching. It even contained integrated memory. All this while Queen Victoria was on the throne.

His Difference Engine was the forerunner to the Analytical Engine. It was not programmable but could perform arithmetic calculations.

Charles Babbage (26 December 1791 – 18 October 1871)

In 1823 the British government gave Babbage £1 700 (± £211 000 or R3 900 000 today) to start work on the project. The actual implementation proved to be extremely expensive as the metalworking techniques of the era could not produce the precision and quality required.

The government abandoned the project in 1842 after Babbage had received and spent £17 000 (± R39 000 000 today).

His Difference Engine was finally constructed by the British Science Museum between 1989 and 1991 to celebrate the bicentennial of Babbage’s birth. They used his original plans, and the success of the finished engine proved that Babbage’s machine would have worked. The Difference Engine can be seen today in the Science Museum in London.

The Difference Engine on display in the Science Museum


1890 – Herman Hollerith

Herman Hollerith designed a punch card system to tabulate the 1890 census and saved the US government $5 million (± $138 million or R2 billion today). Anyone who has done a multiple-choice exam on special cards will appreciate how a punched card system works. A hole or a black mark can be read by a punch card reader and stored on a computer.

Hollerith established a company called the Computing-Tabulating-Recording Company (CTR), which would go on to become the International Business Machines Corporation, or IBM. IBM is the sixth largest IT company in the world. Apple, Samsung, Microsoft, Google and Intel occupy the top five spots.

1936 – Alan Turing

Alan Turing (23 June 1912 – 7 June 1954) was a mathematician, computer scientist, logician, cryptanalyst, philosopher and theoretical biologist.

It is estimated that his work in cryptography during the Second World War shortened the war by two years, saving approximately fourteen million lives. He was awarded the Order of the British Empire (OBE) in secret in 1946 as much of his work was covered under the Official Secrets Act.

He was convicted of gross indecency in 1952 because he was homosexual, and was chemically castrated. He committed suicide two years later, and we lost one of the greatest minds of the twentieth century. There are many who dispute the finding of suicide, but as the apple that supposedly contained the cyanide was never tested, we may never know. Queen Elizabeth II posthumously pardoned Alan Turing in 2013.

Nevertheless, he is regarded as the founder of theoretical computer science as well as artificial intelligence. He designed the Turing Test for Artificial Intelligence, which to date has not been passed.

In 1936, at the age of 24, he presented the notion of a universal machine which would be capable of computing anything that is computable. This was the first idea of a general-purpose computer. The machine was later called the Turing Machine and is still studied in Computer Science degrees today.
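To make the idea a little more concrete, here is a minimal Python sketch of how a Turing machine operates: a tape of symbols, a read/write head, and a table of rules that says what to write, which way to move and which state to enter next. The function name and the bit-flipping rule table are invented purely for illustration; this is not any particular historical machine.

# A toy Turing machine: a tape, a read/write head, and a rule table of the
# form (state, symbol) -> (symbol to write, move direction, next state).
def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Rules for a machine that inverts a string of bits, then halts on the blank
# marking the end of the input.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", rules))   # prints 0100_ (the input inverted)

Changing the rule table changes what the machine computes, which is exactly Turing's point: one simple mechanism, given the right rules, can compute anything that is computable.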

FIRST GENERATION – Vacuum Tubes

The first generation lasted from approximately 1942 to 1954 and made use of vacuum tubes.

It used batch processing, punched cards and magnetic tapes.

The simplest vacuum tube, the diode, was invented in 1904. Vacuum tubes were a key component of electrical circuits for the first half of the twentieth century. They were found in radios, televisions, radar, recording devices, telephone networks and first-generation computers.

They contain a heated electron-emitting cathode and an anode. Current could only flow in one direction through the device, so it operated like a switch. Computers store and transmit data using binary, which is a series of ones and zeros. If the switch was on, it represented a one; if off, a zero. This simple vacuum tube was therefore critical to the processing of a computer.
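As a rough illustration of that idea, here is a short Python sketch (the function names are made up for this example): a row of on/off switches is enough to represent any whole number.

# Each "switch" (a vacuum tube, later a transistor) is either on (1) or off (0).
# A row of such switches represents a number in binary.
def to_switches(value, width=8):
    """Return the on/off states that represent an integer."""
    return [(value >> bit) & 1 for bit in reversed(range(width))]

def from_switches(bits):
    """Rebuild the integer from its on/off states."""
    result = 0
    for bit in bits:
        result = (result << 1) | bit
    return result

states = to_switches(42)                 # [0, 0, 1, 0, 1, 0, 1, 0]
print(states, from_switches(states))     # the eight switches encode 42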

Vacuum tubes generated a lot of heat and, like light bulbs, had to be replaced often. They would last anywhere from forty-one days to a little over two years depending on the manufacturer. They were very expensive, especially the long-life vacuum tubes, which meant that computers could only be afforded by very large companies.

The use of vacuum tubes resulted in extremely bulky computers. Computers using vacuum tubes could take up entire floors of buildings.

1943 – Electronic Numerical Integrator and Calculator (ENIAC)

Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, built the Electronic Numerical Integrator and Calculator (ENIAC). It filled a 6m x 12m room and had 18 000 vacuum tubes. The ENIAC experienced a vacuum tube failure on average once every two days, and it would take about fifteen minutes to locate the faulty tube. It is considered the grandfather of modern-day digital computers.

Forty-six ENIACs were made at a cost of $750 000 each (± $11 150 000 or R161 000 000 today).

The ENIAC was primarily designed to calculate artillery firing tables for the United States Army. It could calculate a trajectory in 30 seconds that took a human 20 hours.

It was in continuous operation for the US Army until 2 October 1955.

It would take the ENIAC almost two years to perform the number of operations the iPad does in one second.

1946 – Universal Automatic Computer (UNIVAC)

Mauchly and Eckert left the University of Pennsylvania to build the UNIVAC, which was the first commercial computer for business and government.

In 1952 the UNIVAC predicted the outcome of the presidential election. It predicted an Eisenhower landslide when the traditional pollsters expected Adlai Stevenson to win.


1947 – The Transistor

Bell Laboratories invented the transistor, an electronic switch made from a semiconductor. This eliminated the need for vacuum tubes, which radically reduced the size of computers and made them far more reliable.

This invention led to the second generation of computers.

SECOND GENERATION – Transistors

The second generation lasted from approximately 1952 to 1964 and made use of transistors.

It made use of high-level programming languages like FORTRAN and COBOL.

1958 – Integrated Circuit

Jack Kilby and Robert Noyce developed the integrated circuit, also known as the computer chip. This single chip contained multiple transistors on a semiconductor material, usually silicon. Integrated circuits used less power, were smaller and were far more reliable than circuits built from individual transistors.

Kilby was awarded the Nobel Prize for Physics in 2000 for his work. Unfortunately, Robert Noyce had died of a heart attack in 1990 at the age of 62, and the Nobel Prize is not awarded posthumously.

Integrated Circuits form the basis of modern computers and led to the third generation of computers.

THIRD GENERATION – Integrated Circuits

The third generation lasted from approximately 1964 to 1972 and made use of integrated circuits.

It made use of high-level programming languages like FORTRAN II, COBOL, PASCAL and BASIC.

1965 – Moore’s Law

Gordon E. Moore was the co-founder of Intel, and he made an observation now called Moore’s Law: the number of transistors in a dense integrated circuit doubles approximately every two years, which in turn roughly doubles the speed of the processor.

The Intel 8086 processor of 1978 had 29 000 transistors; forty years later, current i9 CPUs have an estimated 6.5 billion transistors. Current Central Processing Units have transistors that are as small as ten atoms thick.

Moore’s Law held up to about the year 2000, with CPU speeds doubling every two years. Since then the growth in speed has slowed down. The Pentium 4 in 2000 eventually reached 3.4 GHz, while the fastest Intel desktop chips in 2019 run at about 5 GHz, which is slightly less than a 50% increase in nineteen years.
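As a back-of-the-envelope check of the figures above, here is a small Python sketch using only the numbers quoted in this section (not official Intel data):

# Project transistor counts under a strict two-year doubling and compare the
# clock-speed growth quoted above.
def moores_law_projection(start_count, years, doubling_period=2):
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

projected = moores_law_projection(29_000, 40)               # 29 000 transistors in 1978
print(f"Projected count after 40 years: {projected:,.0f}")  # ~30 billion
print("Reported i9 count:              6,500,000,000")      # the doubling has clearly slowed

speed_increase = (5.0 - 3.4) / 3.4 * 100                    # 3.4 GHz (2000) vs 5 GHz (2019)
print(f"Clock-speed increase: {speed_increase:.0f}%")       # ~47%, slightly under 50%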

This is largely because we are reaching the limit of the current technology.


Very Large-Scale Integrated Circuits (VLSI)

Very large-scale integration was made possible by the wide adoption of the MOS (metal-oxide-semiconductor) transistor. MOS integrated circuit technology allowed the integration of more than 10 000 transistors in a single chip. Later hundreds of thousands, then millions and now billions of transistors could be included on a single microchip.

FOURTH GENERATION – Very Large-Scale Integrated Circuits

The fourth generation lasted from approximately 1972 to 1990 and made use of Very Large-Scale Integrated Circuits (VLSI).

These circuits contained about 5 000 transistors and other circuit elements in a single chip.

It made use of high-level programming languages like C, C++ and DBASE.

Computers became cheaper, smaller and portable.

They ran cooler and didn’t require air-conditioning.

The concept of the Internet was introduced, and networks were developed.

1973 – The Xerox Alto

The Xerox Alto was a milestone in computing. It was the first computer designed to support an operating system based on a graphical user interface (GUI), a decade before mass-market GUI systems became available.

Up to this point every command had to be manually typed into the computer. This required a great deal of technical skill.

It was also the first time a mouse was used.

The computer screen was set out in a portrait orientation.

It had a CPU speed of 5.88 MHz or 5.88 million instructions per second, which was incredibly fast for its time. The first IBM PC launched in 1981 couldn’t match its speed.

The Alto cost $32 000 in 1979, the equivalent of ± $110 500 or R1 600 000 today.

In December 1979, Apple engineers led by Steve Jobs attended a demonstration of the Alto. They used concepts they learnt there to develop the Lisa and the Macintosh. Xerox was paid in Apple shares for the demonstration. Xerox only produced around two thousand units and possibly didn’t realise the scope of what they had created.

It took a visionary like Steve Jobs to see the potential.

When leaving the demonstration, Steve Jobs apparently instructed one of his engineers to create a mouse, which was used for the Lisa.

1976 – Apple Inc

Steve Jobs and Steve Wozniak founded Apple Inc in 1976 and released the Apple I, which was the first computer with a single circuit board.


The users were required to provide their own keyboards and televisions. It was aimed at a technical and hobbyist market.

It had a CPU speed of 1 MHz, or 1 million instructions per second.

The Apple I was hand-built by Wozniak and cost $666.66 (± $2 950 or R42 500 in today’s terms) because he liked repeating digits and because it represented a 33% markup on the $500 cost price.

Two hundred units were produced and one hundred and seventy-five were sold within ten months. It was officially discontinued in October 1977.

If a customer bought an Apple II, they could return the Apple I for a credit. The returned Apple I’s were stripped for parts, and as a result a working Apple I is extremely rare today and could sell for between $400 000 and $650 000 (R5.8 to R9.4 million).

1977 – Apple II

The launch of the Apple II marked the first time Apple aimed its computer at the consumer market. It was branded towards households rather than businessmen or computer hobbyists.

Like the Apple I, it was built around a 6502 processor running at roughly 1 MHz, or about 1 million instructions per second.

It cost $1 298 (± $5 400 or R77 600 in today’s terms).

One of the defining features of the Apple II was its ability to display colour graphics. It was able to display a mere 16 colours, compared with 16 million that modern computers display. This was the reason the Apple logo was redesigned to have a spectrum of colours.

A computer game called “Mystery House” was developed for the Apple II and was the first home computer game to feature graphics. Released in 1980, it sold over 10 000 units at $24.95 a copy (approximately R1 200 today).


1981 – The First IBM PC

IBM released their first personal computer, code-named ‘Acorn’. It used Microsoft’s MS-DOS 1.0 operating system and had an Intel 8088 chip running at 4.77 MHz with 16 KB of memory. It also had two floppy disk drives and an optional colour screen.

This was the first time a Microsoft operating system was used. Microsoft would go on to release new versions of MS-DOS on average once a year, requiring customers to upgrade and resulting in Bill Gates becoming the richest man in the world at that time.

Also, the PC only had 16 KB of memory. That would barely be enough memory to display the picture above. There are one million kilobytes in a single gigabyte. A typical mobile phone today has a million times more memory than this PC had.

It was the first time computers were sold by outside distributors. Prices started at $1 565, the equivalent of approximately $4 300 or R62 000 today.

This computer popularized the term PC.

1981 – The Osborne 1

The Osborne 1 was the first mass-produced portable computer. It weighed 24 pounds or 11 kg and cost $1 795 (± $5 000 or R71 500 today).

It came bundled with a word processor, spreadsheets and other software worth $1 500. Its price was its main marketing feature, with Osborne describing the performance as “merely adequate”.

Its design was largely based upon the Xerox NoteTaker developed in 1978. The NoteTaker never entered production, and only ten prototypes were made. Once again Xerox had developed something amazing and possibly did not see the potential in it.

The company sold 11,000 units in the first eight months of sales, and sales at their peak reached 10,000 units per month.

Unfortunately, the company prematurely announced the launch of the Osborne Executive, which was a far better machine. Consumers cancelled their orders, preferring to wait for the better model, and the company went bankrupt. This phenomenon was later called the Osborne effect.

1982 – Commodore 64

The Commodore 64 was better known as the C64 and sold for $595 (± $1 600 or R22 500 today). It had impressive graphics and by 1993 had sold over 22 million units. The 2006 Guinness Book of World Records recognises the C64 as the greatest-selling single computer model of all time.

The C64 had 64 KB of memory, which is where it got its name. It could display 16 colours at a 320 x 200 resolution. Its processor ran at 1 MHz (a million instructions per second).

The C64 dominated the low end of the computer market for much of the 1980s, outselling IBM, Apple and Atari.

It could be used as a computer or a video game console.


1983 – Apple’s Lisa

Apple released the Lisa, which was a pet project of Steve Jobs and was named after his first daughter. Although officially the name stands for "Locally Integrated Software Architecture", I’ll let you be the judge.

Development began in 1978, but many changes were made after the Apple team visited Xerox and saw a demonstration of the Xerox Alto.

The Lisa was the first personal computer with a GUI. It featured drop-down menus and icons, together with the mouse they had seen at Xerox.

Steve Jobs was forced off the project in 1982 and went on to develop the Macintosh which was launched in 1984.

The Lisa cost a staggering $9 999 (equivalent to approximately $25 250 or R365 000 today).

Although the Lisa was technically brilliant for its time, it was a complete flop, largely due to the price tag and the fact that the Macintosh was cheaper and more usable. The Lisa was later renamed as the Macintosh XL.

1986 – Intel 80386 Microprocessor

Compaq beat IBM to market when it released the Deskpro 386, the first computer to use Intel’s 80386 chip. This chip was a 32-bit microprocessor with 275 000 transistors, an improvement on the 16-bit architecture of previous processors. It could execute 3 million instructions per second. The 80386 gave PCs as much speed and power as older mainframes and minicomputers.

This meant that a small to medium sized company that previously could not afford the cost of a mainframe or minicomputer could enter the computer age. Computer sales responded accordingly.

1989 – Intel 80486 Microprocessor

Intel released the 80486 microprocessor, which contained more than a million transistors and had a 32-bit arithmetic and logic unit as part of the CPU. It was capable of 33 million instructions per second.

Ultra-Large-Scale Integrated Circuits (ULSI)

Once technology allowed for millions of transistors to be included in a single chip, the fifth generation of computers was launched.

FIFTH GENERATION – Ultra Large-Scale Integrated Circuits

The fifth generation runs from approximately 1990 to date and makes use of Ultra Large-Scale Integrated Circuits (ULSI).

These circuits contain tens of millions of transistors and other circuit elements on a single chip.

It makes use of high-level programming languages like C, C++, JAVA and .NET.

Concepts like robotics, neural networks, gaming, expert systems and natural language processing were introduced.

It also saw the introduction of laptops and mobile computing, as well as the Internet reaching the general public in 1994.


1993 – Intel Pentium

The Pentium chip allowed programs to run faster and introduced parallel processing, which meant that the processor could execute several instructions at the same time. The Pentium CPU also added support for graphics and music. It contained 3.3 million transistors and ran at speeds of up to 200 MHz, or 200 million instructions per second.

1997 – Intel Pentium MMX

The MMX chip added multimedia instructions and increased the transistor count to 4.5 million. It could run at 300 MHz, or 300 million instructions per second.

1997 – Intel Pentium II

The Pentium II had 7.5 million transistors and could run at 450 MHz, or 450 million instructions per second.

1998 – Intel Celeron

The Intel Celeron was produced by Intel in an effort to compete with the cheaper clone chips on the market. It had fewer features and started at speeds of 266 MHz, or 266 million instructions per second.

1999 – Pentium III

The Pentium III added seventy instructions to the Pentium II, had up to twenty-eight million transistors and could run at a maximum of 1.13 GHz, or 1.13 billion instructions per second.

2000 – Pentium 4

The Pentium 4 had forty-two million transistors and could run at a maximum of 3.4 GHz, or 3.4 billion instructions per second. Modern Intel processors are developed as dual-core and quad-core chips, which essentially means that two or four processors are included in a single chip. The latest i9 processor can run at 5 GHz, or 5 billion instructions per second. It would take a stopwatch 158.5 years to reach five billion seconds.
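As a quick sanity check of that stopwatch comparison, here is a small Python sketch based only on the figures above:

# How long is five billion seconds, and how many instructions does a 5 GHz
# processor execute in a single second of stopwatch time?
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60             # ~31.6 million seconds

years = 5_000_000_000 / SECONDS_PER_YEAR
print(f"Five billion seconds is about {years:.1f} years")   # ~158.4 years

instructions_per_second = 5 * 10**9                   # 5 GHz = 5 billion per second
print(f"{instructions_per_second:,} instructions every second")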

2014 – IBM Summit

Summit was built for the US Department of Energy by IBM at a cost of $325 million. As of November 2018, it is the fastest supercomputer in the world with a peak speed of 200 petaflops, or two hundred quadrillion floating-point calculations per second.

A floating-point calculation is similar to a calculation involving exponents. It is complicated and requires a large amount of processing power, which is why it is used as a measure of speed in supercomputers.

Summit is capable of doing 200 000 000 000 000 000 FLOPS.

It has approximately 1 400 GB of memory and 250 petabytes of storage (250 million gigabytes).

It contains 9 216 central processing units (CPUs) and 27 648 graphics processing units (GPUs).

Summit will be used for cosmology, medical and climate research.
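As a rough check of the Summit figures quoted above, here is a Python sketch using the document's own numbers (not official benchmark data):

PETA = 10 ** 15
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60

peak_flops = 200 * PETA                              # 200 petaflops
print(f"Peak speed: {peak_flops:.0e} floating-point operations per second")  # 2e+17

storage_gb = 250 * PETA / 10 ** 9                    # 250 petabytes in gigabytes
print(f"Storage: {storage_gb:,.0f} GB")              # 250,000,000 GB

# Time for a one-gigaflop (10^9 FLOPS) machine to match one second of Summit
seconds = peak_flops / 10 ** 9
print(f"About {seconds / SECONDS_PER_YEAR:.1f} years of work for a 1 GFLOPS machine")  # ~6.3 years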


Theoretical Computers

As we move into the Fourth Industrial Revolution and the age of robotics and artificial intelligence, the need for faster computers is growing.

As demonstrated through Moore’s Law, our current technology is reaching the limit of its capacity. We require a new type of computer if we are to move forward into an exciting new age of computing.

Thankfully research is being done into many different types of computers, including:

• Quantum Computers make use of the energy of atoms and subatomic particles. The interaction of these atoms and particles is predictable and can be used to do calculations.
• Chemical Computers, also called reaction-diffusion computers, make use of a semi-solid chemical “soup”. Data is represented by varying concentrations of the chemicals within this soup.
• DNA Computers use DNA enzymes to do calculations. DNA forms the basis for all life on earth and as such can process data. When combinations of DNA strands are unfurled, calculations can be performed.
• Optical Computers use photons produced by lasers and diodes for computation. Photons promise to allow a higher bandwidth than the electrons used in conventional computers.
• Spintronics-Based Computers use the spin and the associated magnetic moment of electrons. In addition to the charge state, the electron spin acts as an additional degree of freedom. This makes data storage and transfer far more efficient.

In October 2019 Google researchers claimed to have achieved quantum supremacy when a quantum computer took 200 seconds to perform a calculation that the researchers estimated would take a supercomputer 10 000 years to compute. IBM, which owns said supercomputer, claimed that it would take just two and a half days to perform the calculation. Regardless of who is correct, it at least demonstrates that progress is being made.
