Chapter Two: Rise of Modern Computers, Singularity or Skynet
"There is no reason for any individual to have a computer in his home." Ken Olsen (1926-2011), President, Digital Equipment

Introduction

How do you feel about the Ken Olsen quote found above? Why do you feel that way?
What forces demanded the development of computers?

Read the quote above and carefully consider the statement made by Ken Olsen. Do you agree or disagree? Why? Hold onto your thoughts and reasons as we go through this chapter. About halfway into the chapter we will return to this quotation.

Last chapter saw us examining developments leading toward modern computers. We looked at Roman numerals and their replacement by the Hindu-Arabic number system. We also saw how devices such as the abacus and slide rule came to be replaced with computers. The push behind computer development came from the demands of science, business, and war. The first real computer was Charles Babbage's Analytical Engine, and for it Ada Lovelace wrote what is considered to be the world's first computer program.

Computer development accelerated during the twentieth century. The usual suspects were involved in this acceleration: science, business, and war. Herman Hollerith, who solved the Census Office's problems with a machine using punched cards, started the company that would become IBM (International Business Machines). The world wars of the twentieth century demanded help in breaking the codes and ciphers of the Axis. Computers were also used in predicting the weather and calculating gunnery tables. Developments during the war, and advances in both electronics and manufacturing, placed us in the position to make great advances. We left Chapter One with the development of COLOSSUS in England and ENIAC in the United States. This chapter will pick up the story.

Important Words in this Chapter

Adventure, Altair 8800, Apple Computer, Bill Gates, Gordon E. Moore, Grace Hopper, IBM PC, Ken Olsen, Mainframe, Micro Computer, Mini Computer, Moore's Law, PONG, Steve Jobs, Steve Wozniak, VisiCalc, WordStar

Mainframes

How did the IBM (International Business Machines) company come to be formed?
The first commercial computers were mainframes. What were some of the tasks for which they were used?
Did users buy or lease mainframe computers? Why?
What is Moore's Law?
What did Grace Hopper do during WW II? In which branch of the military did she serve?
What modern programming language was created by Grace Hopper? For what use was the language created?
What is the story behind the creation of the term "computer bug"?

If you think back to our earlier lesson, Herman Hollerith developed tabulating machines using punched cards to help with the census. It was so successful that he formed a company to lease these machines to other companies. This company mutated over time and through several mergers to become IBM (International Business Machines).

Figure 1 IBM 360 Mainframe

IBM was one of the main manufacturers of mainframes. (In fact, the group of mainframe manufacturers was sometimes jokingly called "IBM and the Seven Dwarfs." The Seven Dwarfs generally were Burroughs, UNIVAC, NCR, Control Data, Honeywell, General Electric, and RCA. This list did change as some manufacturers ceased manufacturing mainframes or merged with other companies.)

The first commercial computers were mainframes. They ruled the roost until the early 1970s and were used by large organizations for applications like the census, industry and consumer statistics, allocation of resources in an enterprise, and transaction processing. If you see an old science fiction movie, the computers shown are mainframes. They took up entire rooms and were usually leased to the company or government using them. This leasing followed the model created by Hollerith's first company.
Leasing computer services, instead of selling a computer to the customer and letting them do it all, is coming back today with cloud services. Companies give up some control over their data to save costs. They don't have to buy the computer doing the processing and storage, nor do they have to maintain it. They pay a fee for someone else to take care of it. The regular stream of money coming in works well for the provider too. It is not like selling a product, where you might sell one this month and not sell another until next year.

We don't have space or time to discuss all the important people, but two notable people to remember with respect to mainframes are Gordon E. Moore and Grace Hopper.

Gordon Moore was the Director of Research & Development for Fairchild Semiconductor. As with any good company, they were looking ahead, predicting where their industry was going and deciding what they needed to do to be part of that future. Gordon Moore was one of the persons doing this at Fairchild. He examined the data for 1959 to 1964, looking at the number of components on each integrated circuit. The number had increased each year. He extended this trend out ten years, predicting that the number of components on an integrated chip would double every 12 months. An edited version of his research was published in the April 19, 1965, issue of "Electronics" magazine as "Cramming More Components onto Integrated Circuits". His conclusions became known as Moore's Law.

Figure 2 Gordon E. Moore w/ Moore's Law Graphic

Working for Intel in 1975, he revisited the data, concluding that the rate of growth would slow to a doubling of capacity every two years. (In effect, the number of electronic components that could be placed on an integrated chip would double every two years.) The massive growth of computers to the present is the result of this doubling of capacity.
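The doubling arithmetic behind Moore's Law is simple compound growth, and it can be sketched in a few lines of code. The starting figure below (2,300 components, roughly the Intel 4004 chip of 1971) is an illustrative assumption, not a number from this chapter:

```python
# Moore's Law (1975 revision): component counts double every two years.
# Illustrative sketch only; the starting count of 2,300 is an assumed
# example, roughly the transistor count of the 1971 Intel 4004.
def moores_law(start_count, years, doubling_period=2):
    """Project a component count forward by `years` years."""
    return start_count * 2 ** (years / doubling_period)

# Twenty years is ten doublings: 2,300 * 2**10 = 2,355,200 components.
print(moores_law(2300, 20))  # -> 2355200.0
```

Note how quickly the exponent dominates: each extra doubling period multiplies the count by two, which is why the chapter can speak of "massive growth" from such a small rule.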
This became a self-fulfilling prophecy as companies strove to make technological breakthroughs to meet the prediction of Moore's Law. There is some speculation that we are currently reaching the limit of the number of components that can be crammed onto an integrated chip. If that is the case, Moore's Law will cease to be true. This would have grave implications for both computer manufacturers and users.

Figure 3 DDRAM Memory Cards Crammed w/ Integrated Chips

Grace Brewster Murray Hopper was an American computer scientist and a United States Navy Rear Admiral. If we have any NJROTC persons in this class, the U.S. Navy guided-missile destroyer USS Hopper is named for her. She was extremely curious as a child, earned a Ph.D. in mathematics from Yale, and taught at Vassar College.

Figure 4 Grace Hopper

During WW II (World War II), she obtained a leave of absence from her teaching position and joined the U.S. Navy Reserve. Following her graduation from the Naval Reserve Midshipmen's School, she was assigned to the Bureau of Ships Computation Project at Harvard University. There she served on the Mark I computer programming staff. Her request to transfer into the regular Navy following the end of WW II was denied. She remained in the Navy Reserve, working at the Harvard Computation Lab and declining the offer of a full professorship from Vassar.

Grace Hopper developed COBOL (Common Business-Oriented Language), a computer language designed for business use. This and the computer languages that followed were designed as an interface between machine language, a computer's native language, and the language of humans. You can think of a computer language as a high-level language we understand. A compiler or interpreter translates that high-level language into a low-level language that the computer can understand.

Figure 5 Ironic Comment on "Bug" being found in Computer
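As a loose modern illustration of that translation step, Python's standard `dis` module can display the low-level bytecode instructions that the Python interpreter actually executes for a piece of high-level code. This is a sketch of the general idea, not the technology of Hopper's era:

```python
import dis

# High-level code a human can read and write.
def add(a, b):
    return a + b

# dis prints the low-level instructions the interpreter runs for
# this function, ending with an instruction such as RETURN_VALUE.
dis.dis(add)
```

The exact instruction names vary between Python versions, but the idea matches the chapter: humans write the high-level form, and a compiler or interpreter produces the low-level form the machine carries out.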
Grace Hopper is also often given credit for the creation of the term "bug" to refer to a computer programming error or problem that is difficult to find. Engineers had previously referred to such gremlins as bugs. When the computer experienced problems on September 9, 1947, a search was made through the electronics making up the computer. Between the points in Relay 70, the technical team found a moth. They removed it, taped it into the log book, and someone with a sense of humor labeled it, "First actual case of a bug being found." As a result of this incident she is often credited with the creation of the term "bug". It is a good story, but the term predates this incident. This is also where we find the roots of the term "debug": to find and remove the faults from a computer program or electronic device.

Mini Computers

Why were Mini Computers developed?
When were Mini Computers developed?
What were common uses or applications for Mini Computers?

At some point it was realized that technology developments had made smaller computers feasible. Moreover, there were markets available that had applications for a smaller class of computers. Mini computers were developed in the mid-1960s, using a teleprinter as an input/output device and running programs in a high-level language. They were used for control, instrumentation, human interaction, and communication. Applications included power plant control and instrumentation, telephone switching, the control of laboratory equipment, and CAD (Computer Aided Design) when this field began in the early 1970s. Manufacturers included DEC, Data General, Apollo Computer, Prime Computer, IBM, and Hewlett-Packard. An interesting read about mini computers is "The Soul of a New Machine" by Tracy Kidder. It is the story of a group of Data General designers rushing to create a next-generation mini computer to compete with the products of the DEC company.

Micro Computers

What led to the development of Micro Computers?
What was the first Micro Computer sold to the public? Was it sold as a kit or preassembled?
What was the first highly successful, mass-produced Micro Computer? When did it go on sale?
What software programs made the Apple computers popular in business settings?
What simulation of pioneers "going west" made Apple computers popular in education settings?
What did Ken Olsen mean with his statement about individuals and computers?
What happened to IBM's share of the total computer market?
How, and when, did IBM develop a Micro Computer?

Technology marched on.