
Social and Professional Issues in IT

Pioneers of Computing

Many scientists and engineers have contributed to the development of computing science. This chapter covers the major scientists who have played a significant role in the development of computing. Some of the major pioneers of computing and their contributions are described here.

Aristotle (384 BC, Stagira, Greece - 322 BC, Athens, Greece)

He was one of the greatest ancient Greek philosophers of his time. Many of his thoughts have become the backbone of computing and artificial intelligence. His work in the natural and social sciences greatly influenced virtually every area of modern thinking.

Biography Aristotle was born in 384 BC in Stagira, on the northwest coast of the Aegean Sea. His father was a friend and the physician of the king of Macedonia, and the lad spent most of his boyhood at the court. At 17, he went to Athens to study. He enrolled at the famous Academy directed by the philosopher Plato. Aristotle threw himself wholeheartedly into Plato's pursuit of truth and goodness. Plato was soon calling him the "mind of the school." Aristotle stayed at the Academy for 20 years, leaving only when his beloved master died in 347 BC. In later years he renounced some of Plato's theories and went far beyond him in breadth of knowledge. Aristotle became a teacher in a school on the coast of Asia Minor. He spent two years studying marine biology on Lesbos. In 342 BC, Philip II invited Aristotle to return to the Macedonian court and teach his 13-year-old son Alexander. This was the boy who was to become conqueror of the world (see Alexander the Great). No one knows how much influence the philosopher had on the headstrong youth. After Alexander became king, at 20, he gave his teacher a large sum of money to set up a school in Athens.

The Peripatetic School In Athens Aristotle taught brilliantly at his school in the Lyceum. He collected the first great library and established a museum. In the mornings he strolled in the Lyceum gardens, discussing problems with his advanced students. Because he walked about while teaching, Athenians called his school the Peripatetic (which means "to walk about") school. He led his pupils in research in every existing field of knowledge. They dissected animals and studied the habits of insects. The science of observation was new to the Greeks. Hampered by lack of instruments, they were not always correct in their conclusions. One of Aristotle's most important contributions was defining and classifying the various branches of knowledge. He sorted them into separate disciplines, among them psychology, rhetoric, and poetics, and thus laid the foundation of most of the sciences of today.

Aristotle's Works After his death, Aristotle's writings were scattered or lost. In the early Middle Ages the only works of his known in Western Europe were parts of his writings on logic. They became the basis of one of the three subjects of the medieval trivium: logic, grammar, and rhetoric. Early in the 13th century other books reached the West. Some came from Constantinople; others were brought by the Arabs to Spain. Medieval scholars translated them into Latin.

Compiled by Deepanjal Shrestha, Social and Professional Issues in IT: Pioneers of Computing

Muhammad ibn Musa al-Khwarizmi (800 - 847, Baghdad, Iraq)

Muhammad ibn Musa al-Khwarizmi was born sometime before 800 A.D. in an area not far from Baghdad and lived at least until 847. He wrote his Al-jabr wa'l muqabala (from which our modern word "algebra" comes) while working as a scholar at the House of Wisdom in Baghdad. In addition to this treatise, al-Khwarizmi wrote works on astronomy, on the Jewish calendar, and on the Hindu numeration system. The English word "algorithm" derives from the Latin form of al-Khwarizmi's name.

Henry Briggs (Feb 1561, Worley Wood, Yorkshire - Jan 26, 1630, Oxford, England)

Briggs is especially known for his publication of tables of logarithms to the base 10, first in Logarithmorum chilias prima (1617) and later in Arithmetica logarithmica (1624). He also composed a work on trigonometry (basically tables, both of the functions and of the logs of sines and tangents) that was left unfinished at his death; Gellibrand completed and published it. He also left quite a few mathematical manuscripts that remained unpublished.

Charles Babbage (Dec 26, 1791, London, UK - Oct 18, 1871, London, UK)

Charles Babbage, the "Father of Computing," was a great scientist who lived in the 19th century. He was the designer of the Difference Engine and the Analytical Engine.

Biography Charles Babbage was born in London on December 26, 1791, the son of Benjamin Babbage, a London banker. As a youth Babbage was his own instructor in algebra, of which he was passionately fond, and was well-read in the continental mathematics of his day. Upon entering Trinity College, Cambridge, in 1811, he found himself far in advance of his tutors in mathematics. With Herschel, Peacock, and others, Babbage founded the Analytical Society for promoting continental mathematics and reforming the mathematics of Newton then taught at the university. In his twenties Babbage worked as a mathematician, principally in the calculus of functions. He was elected a Fellow of the Royal Society in 1816, and played a prominent part in the foundation of the Astronomical Society (later Royal Astronomical Society) in 1820. It was about this time that Babbage first acquired the interest in calculating machinery that became his consuming passion for the remainder of his life. Throughout his life Babbage worked in many intellectual fields typical of his day, and made contributions that would have assured his fame irrespective of the Difference and Analytical Engines.

Prominent among his published works are:

· A Comparative View of the Various Institutions for the Assurance of Lives (1826), an actuarial paper
· Table of Logarithms of the Natural Numbers from 1 to 108,000 (1827)
· Reflections on the Decline of Science in England (1830)
· On the Economy of Machinery and Manufactures (1832)
· Ninth Bridgewater Treatise (1837)
· the autobiographical Passages from the Life of a Philosopher (1864)

Babbage occupied the Lucasian chair of mathematics at Cambridge from 1828 to 1839. He played an important role in the establishment of the British Association for the Advancement of Science and the Statistical Society (later Royal Statistical Society). Despite his many achievements, the failure to construct his calculating engines, and in particular the failure of the government to support his work, left Babbage in his declining years a disappointed and embittered man. He died at his home in Dorset Street, London, on October 18, 1871.

Honors and Awards

· Elected Fellow of the Royal Society - 1816
· First gold medal of the Astronomical Society of London

George Boole (Nov 2, 1815, Lincoln, England - Dec 8, 1864, Ballintemple, Ireland)

George Boole laid the groundwork for what we know today as information theory through the publication of his masterpiece, An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities. In this work, published when the author was 39, Boole reduced logic to an extremely simple type of algebra, in which 'reasoning' is carried out through manipulating formulas simpler than those used in second-year traditional algebra. His theory of logic, which recognizes three basic operations - AND, OR and NOT - was to become germane to the development of telephone circuit switching and the design of electronic computers.
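Boole's three basic operations can be written directly in his two-valued algebra, where truth values are 0 and 1 and reasoning reduces to arithmetic on formulas. The sketch below is illustrative only; the function names are ours, not Boole's notation.

```python
# Boole's two-valued algebra: 0 is false, 1 is true, and the three
# basic operations become simple arithmetic on those values.

def AND(x, y):
    return x * y              # 1 only when both inputs are 1

def OR(x, y):
    return x + y - x * y      # 1 when at least one input is 1

def NOT(x):
    return 1 - x              # swaps 0 and 1

# A truth table over all four input combinations.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", AND(x, y), "OR:", OR(x, y), "NOT x:", NOT(x))
```

Identities such as De Morgan's law, NOT(AND(x, y)) = OR(NOT(x), NOT(y)), hold for every input combination; it is exactly this mechanical character of Boole's algebra that later made it suitable for switching circuits and computer design.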

Ada Byron (Born: London, England, Dec 10, 1815, Died: London, England, Nov 27, 1852)

Ada Byron was the daughter of a brief marriage between the Romantic poet Lord Byron and Anne Isabelle Milbanke, who separated from Byron just a month after Ada was born. Four months later, Byron left England forever. Ada never met her father (who died in Greece in 1823) and was raised by her mother, Lady Byron. Her life was an apotheosis of struggle between emotion and reason, subjectivism and objectivism, poetics and mathematics, ill health and bursts of energy. Lady Byron wished her daughter to be unlike her poetical father, and she saw to it that Ada received tutoring in mathematics and music, as disciplines to counter dangerous poetic tendencies. But Ada's complex inheritance became apparent as early as 1828, when she produced the design for a flying machine. It was mathematics that gave her life its wings.

Lady Byron and Ada moved in an elite London society, one in which gentlemen not members of the clergy or occupied with politics or the affairs of a regiment were quite likely to spend their time and fortunes pursuing botany, geology, or astronomy. In the early nineteenth century there were no "professional" scientists (indeed, the word "scientist" was only coined by William Whewell in 1836), but the participation of noblewomen in intellectual pursuits was not widely encouraged.


One of the gentlemanly scientists of the era was to become Ada's lifelong friend. Charles Babbage, Lucasian professor of mathematics at Cambridge, was known as the inventor of the Difference Engine, an elaborate calculating machine that operated by the method of finite differences. Ada met Babbage in 1833, when she was just 17, and they began a voluminous correspondence on the topics of mathematics, logic, and ultimately all subjects.

In 1835, Ada married William King, ten years her senior, and when King inherited a noble title in 1838, they became the Earl and Countess of Lovelace. Ada had three children. The family and its fortunes were very much directed by Lady Byron, whose domineering manner was rarely opposed by King.

Babbage had made plans in 1834 for a new kind of calculating machine (although the Difference Engine was not finished), an Analytical Engine. His Parliamentary sponsors refused to support a second machine with the first unfinished, but Babbage found sympathy for his new project abroad. In 1842, an Italian mathematician, Luigi Menabrea, published a memoir in French on the subject of the Analytical Engine. Babbage enlisted Ada as translator for the memoir, and during a nine-month period in 1842-43, she worked feverishly on the article and a set of Notes she appended to it. These are the source of her enduring fame.

Ada called herself "an Analyst (& Metaphysician)," and the combination was put to use in the Notes. She understood the plans for the device as well as Babbage but was better at articulating its promise. She rightly saw it as what we would call a general-purpose computer. It was suited for "developping [sic] and tabulating any function whatever. . . the engine [is] the material expression of any indefinite function of any degree of generality and complexity." Her Notes anticipate future developments, including computer-generated music. Ada died of cancer in 1852, at the age of 37, and was buried beside the father she never knew. Her contributions to science were resurrected only recently, but many new biographies attest to the fascination of Babbage's "Enchantress of Numbers."
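The method of finite differences that the Difference Engine mechanized is easy to sketch in software. The polynomial below is an arbitrary example, and the code is only an illustration of the principle; the engine itself worked with decimal wheels, not a program.

```python
def tabulate(seed, n):
    """Tabulate a polynomial by repeated addition (method of finite differences).

    seed: the first d+1 values of a degree-d polynomial at x = 0, 1, 2, ...
    n:    how many table entries to produce.
    """
    # Turn the seed values into the column [p(0), delta p(0), delta^2 p(0), ...].
    diffs = list(seed)
    for i in range(1, len(diffs)):
        for j in range(len(diffs) - 1, i - 1, -1):
            diffs[j] -= diffs[j - 1]
    # The highest difference of a degree-d polynomial is constant, so every
    # further table entry is obtained by additions alone -- no multiplication,
    # which is what made the scheme suitable for a mechanical engine.
    values = [diffs[0]]
    for _ in range(n - 1):
        for j in range(len(diffs) - 1):
            diffs[j] += diffs[j + 1]
        values.append(diffs[0])
    return values

p = lambda x: x * x + x + 41          # illustrative degree-2 polynomial
print(tabulate([p(0), p(1), p(2)], 10))
```

For this quadratic the second difference is the constant 2, so after three seed values the whole table unrolls by pairwise additions, one column carrying into the next.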

Albert Einstein (1879 – 1955)

Biography Albert Einstein is one of the most recognized and well-known scientists of the century. His theories solved centuries-old problems in physics and rocked even non-physicists' view of the world.

Einstein's early years did not mark him as a genius. His parents worried because he was so slow to learn to speak. Although his family was Jewish, he attended a Catholic elementary school, where he did not excel. Because of failed business ventures, the family moved several times during Einstein's childhood, finally to Italy when he was 15. He was supposed to remain in Germany and finish school. He left, however (historians debate whether he was expelled or arranged to be excused for illness), and joined his family in Italy. He also renounced his German citizenship then, which freed him from military service. He belonged to no country until he became a Swiss citizen in 1901.

From Italy he went to Switzerland to finish high school and attend the Swiss Federal Institute of Technology. He didn't care for such organized education; he hated having to attend classes regularly and take exams. He graduated with a teaching degree, but couldn't find a job. Finally he got a post at the Swiss patent office in Bern, in 1902. He worked there for seven years, which turned out to be the most productive period of his life. In 1903 he married a former classmate, Mileva Maric, though his parents disapproved. They'd had a daughter Lieserl in 1902, but she was given up for adoption. They later had two sons.

1905 was a huge year for Einstein. He published five papers in the German journal Annalen der Physik, three of them groundbreaking. The first was on the motion of particles suspended in liquid. He developed a mathematical formula to explain that the visible motion of the particles was due to the invisible motion of the molecules of the liquid. His second paper was on the photoelectric effect, or the release of electrons from metal when light shines on it. Einstein used the very recent ideas of Max Planck to explain the phenomenon. That is, he explained it in terms of quanta, or packets of energy. This was the first use of the theory outside of Planck's own work. Einstein received the Nobel Prize in physics for this paper. Last and perhaps most famous, Einstein published his special theory of relativity. This resulted in the shocking conclusion that time is not constant; neither are length and mass. When moving at high speeds, time slows and lengths contract; only the speed of light remains the same. From the same theory came the famous relation between energy and mass: energy is equal to mass times the speed of light squared, or E = mc².

In the following years, Einstein held positions at universities in Zurich, Prague, and Berlin. In 1914, Einstein was in Berlin. War broke out, and his wife and two sons returned to Switzerland. The couple's relationship had grown increasingly distant, and after the war the two were never reunited. They officially divorced in 1919. Some historians now believe that Mileva Maric was instrumental in Einstein's early work, especially the mathematical calculations. In his letters to her he mentioned "our papers," and in one even wrote, "How happy and proud I will be when both of us together will have brought our work on relative motion to a successful end." As he gained greater prestige and scientific positions, she gained greater household responsibilities and their collaboration ended. When he received the Nobel Prize, however, Einstein gave the cash award to Mileva Maric. Soon after their divorce, Einstein married his cousin Elsa.

Meanwhile, he kept grappling with the ideas of physics. There were problems with his special theory, and he knew it.
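The mass-energy relation E = mc² can be checked with a line of arithmetic; the one-gram mass below is just an illustrative choice.

```python
# E = m c^2 in SI units: even a tiny mass corresponds to an enormous energy.
c = 299_792_458          # speed of light in metres per second
m = 0.001                # an illustrative mass: one gram, in kilograms

E = m * c ** 2           # energy equivalent in joules
print(f"E = {E:.3e} J")  # on the order of 9e13 J, roughly 20 kilotons of TNT
```

The result, roughly 9 × 10¹³ joules for a single gram, is why the formula later mattered so much to the atomic bomb project described below in this biography's own era.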
The problems of gravity bothered him most. Whenever physicists worked out a theory, gravity seemed to confuse it. In 1915, he wrote the general theory of relativity. It was extremely radical. To account for gravity, time and space must be curved around massive objects. The math was very complex and the whole idea so strange that most people didn't accept it. But Einstein suggested three ways it could be proven. One was to make observations of starlight during a solar eclipse. Conveniently, a solar eclipse occurred in 1919 and astronomers made the observations that proved the general theory of relativity. Einstein became a celebrity. Much of the world had just caught its breath after a long and horrifying war, and perhaps in relief, latched on to this amazing human achievement.

Einstein himself had always opposed war. He spoke against it during the First World War, and throughout the 1920s and 1930s. Hitler was rising to power in Germany, and though Einstein had renewed his German citizenship, he was considered suspect as both a Jew and a pacifist. It may be, too, that the absolutist Nazi party found that his relativity theories conflicted with what they considered pure physics. He was in California when Hitler took power in 1933, and he never returned to Germany. He took a position at the Institute for Advanced Study in Princeton, where he remained for the rest of his life.

By the 1920s, Einstein's major contributions to physics were behind him. He debated quantum mechanics and the uncertainty principle with Niels Bohr, which helped Bohr clarify the concept, but it was a theory that Einstein never quite accepted. He spent his latter years in search of a unified field theory, or one basic equation to explain all of the forces of nature. He wrote on many topics, especially peace, but rising fascism in the years before World War II made him sign a 1939 letter to President Roosevelt, warning him that the Germans could create an atomic weapon.
This led FDR to set up the Manhattan Project, an effort to secretly develop an atomic bomb. Though Einstein's formula E = mc² was key to the project, Einstein was considered a security risk and was not involved. In 1940 Einstein renounced his German citizenship for a second time and became a U.S. citizen. He became a supporter of disarmament and of a Jewish state. In 1952 the young nation of Israel offered Einstein the presidency, but he declined. The ninety-ninth element in the periodic table was discovered shortly after Einstein's death in 1955, and it was named "einsteinium."

Alexander Graham Bell (March 3, 1847, Edinburgh, Scotland - August 2, 1922, Baddeck, Nova Scotia, Canada)

Alexander Graham Bell is most well known for inventing the telephone. He came to the U.S. as a teacher of the deaf, and conceived the idea of "electronic speech" while visiting his hearing-impaired mother in Canada. This led him to invent the microphone and later the "electrical speech machine", his name for the first telephone.

Bell worked on technology for virtually all of his life. From the time he helped his father develop methods for teaching the deaf, to his invention of the telephone, he was always fascinated by technology. While a professor at Boston University, he was also working on an invention that would allow several signals to travel over a single wire at the same time. After he had completed his work on simple sound waves over the wire, he was determined to have vocal messages travel over a wire to another destination. With his assistant, Thomas Watson, he made this dream of vocal sound over a single wire become a reality. Even when Bell and Watson had completed the telephone, they still were not satisfied; they were perfectionists, and set about fixing every minor and major defect of the telephone. After their deaths, further changes were left to other inventors and discoverers. Without those later inventors, the phone would not have reached long distance, and the basic connection of the telephone would not be the same. Still, we would not be this far without Alexander Graham Bell and Thomas A. Watson.

Biography Bell was born in Edinburgh, Scotland on March 3, 1847. He enrolled in the University of London to study anatomy and physiology, but his college time was cut short when his family moved to Canada in 1870. His parents had lost two children to tuberculosis, and they insisted that the best way to save their last child was to leave England. When he was eleven, Bell invented a machine that could clean wheat. He later said that if he had understood electricity at all, he would have been too discouraged to invent the telephone. Everyone else "knew" it was impossible to send voice signals over a wire. While trying to perfect a method for carrying multiple messages on a single wire, he heard the sound of a plucked spring along 60 feet of wire in a Boston electrical shop. Thomas A. Watson, one of Bell's assistants, was trying to reactivate a telegraph transmitter. Hearing the sound, Bell believed that he could solve the problem of sending a human voice over a wire. He figured out how to transmit a simple current first, and received a patent for that invention on March 7, 1876. Five days later, he transmitted actual speech. Sitting in one room, he spoke into the phone to his assistant in another room, saying the now famous words: "Mr. Watson, come here. I need you." The telephone patent is one of the most valuable patents ever issued.

Bell had other inventions as well: his own home had a precursor to modern day air conditioning, he contributed to aviation technology, and his last patent, at the age of 75, was for the fastest hydrofoil yet invented. Bell was committed to the advancement of science and technology. As such he took over the presidency of a small, almost unheard-of scientific society in 1898: the National Geographic Society. Bell and his son-in-law, Gilbert Grosvenor, took the society's dry journal and added beautiful photographs and interesting writing, turning National Geographic into one of the world's best-known magazines.
He is also one of the founders of Science magazine. Bell died on August 2, 1922. On the day of his burial, all telephone service in the US was stopped for one minute in his honor.

Sir Ambrose J. Fleming (Nov 29, 1848, Lancaster, UK, Apr 18, 1945, Sidmouth, Devon, UK)

English engineer who made numerous contributions to electronics, photometry, electric measurements, and telegraphy. He is best remembered as the inventor of the two-electrode radio rectifier, which he called the thermionic valve; it is also known as the vacuum diode, kenotron, thermionic tube, and Fleming valve. It was patented in 1904.
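Fleming's valve acts as a one-way gate for current, which is what lets it rectify an alternating signal into a one-directional one. A minimal numeric sketch of ideal half-wave rectification (the waveform and sample count are arbitrary illustrative choices):

```python
import math

# An alternating input signal, sampled over two full cycles.
signal = [math.sin(2 * math.pi * t / 20) for t in range(40)]

# An ideal one-way valve: negative half-cycles are blocked entirely,
# positive half-cycles pass through unchanged.
rectified = [max(0.0, v) for v in signal]

print("input min:", round(min(signal), 3), "rectified min:", min(rectified))
```

A real valve is of course not ideal (it has a forward voltage drop and limited current), but the one-directional behaviour sketched here is the property that made it useful for detecting radio signals.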

Biography Fleming was educated at University College London and at Cambridge University. Early in his career he became a consultant to the Edison Electric Light Company, and he was for many years a popular teacher at University College London; he was knighted in 1929. His two-electrode valve was soon built upon: in 1906 Lee De Forest added an amplifying grid to it, producing the triode.

Howard Hathaway Aiken (March 8, 1900- March 14, 1973)

Achievement (Invented the Mark I) Howard Hathaway Aiken was born March 8, 1900 in Hoboken, New Jersey. However, he grew up in Indianapolis, Indiana, where he attended the Arsenal Technical High School. After high school he studied at the University of Wisconsin, where he received a bachelor's degree in electrical engineering. During college Aiken worked for the Madison Gas Company; after graduation he was promoted to chief engineer there. In 1935 Aiken decided to return to school. In 1939 he received a Ph.D. from Harvard University. It was while working on his doctoral thesis in physics that Aiken began to think about constructing a machine to help with the more tedious calculations. Aiken began to talk about his idea and to do some research into what could be done. With some help from colleagues at the university, Aiken succeeded in convincing IBM to fund his project. The idea was to build an electromechanical machine that could perform mathematical operations quickly and efficiently and allow a person to spend more time thinking instead of laboring over tedious calculations. IBM was to build the machine with Aiken acting as head of the construction team and donate it to Harvard, with the requirement that IBM would get the credit for building it. The construction team was to use machine components that IBM already had in existence.


It took seven years and a lot of money to finally get the machine operational. Part of the delay was due to the intervention of World War II. Officially the computer was called the IBM Automatic Sequence Controlled Calculator, but most everyone called it the Mark I. After completing the Mark I, Aiken went on to produce three more computers, two of which were electronic rather than electromechanical. More important than the actual computer (whose major purpose was to create tables) was the fact that it proved to the world that such a machine was more than a fancy; it was a practical, working machine. Perhaps more important than the invention of the Mark I was Aiken's contribution to academia: he started the first academic program in computer science in the world. Aiken retired from teaching at Harvard in 1961 and moved to Ft. Lauderdale, Florida. He died March 14, 1973 in St. Louis, Missouri.

Charles Ranlett Flint (1850, Maine, USA - 1934, USA)

A venture capitalist and merger-maker, Flint merged several companies together into what would become known as IBM, one of the largest computer manufacturers in the world formed in the 20th century.

Paul Allen (January 21, 1953, Seattle, Washington, USA)

He founded Microsoft together with Bill Gates.

Biography At Lakeside School, Allen (14 years old) and his friend Bill Gates (12 years old) became early computer enthusiasts. Allen and Gates, with a small group of other Lakeside students, began programming in BASIC using a teletype terminal, and Allen helped teach a computer course to junior high students at Lakeside. After graduating from Lakeside, he enrolled at Washington State University, though he dropped out after two years to pursue his and Gates's dream of writing software commercially for the new "personal computers." Along the way, Allen and Gates bought an Intel 8008 chip for $360 and built a computer to measure traffic, launching their first company, Traf-O-Data. Allen was then hired as a programmer by Honeywell in Boston. Allen and Gates wrote the first microcomputer BASIC for the Altair, a computer kit based on Intel's new 8080 chip. They moved to Albuquerque, N.M., where Altair's producer MITS made Allen its associate director of software. Allen divided his time between MITS and a new company he and Gates had started to develop and market microcomputer languages: Micro Soft.


Allen and Gates had founded Microsoft (initially "Micro Soft") in Albuquerque, New Mexico, and begun by selling a BASIC interpreter. Apple commissioned Microsoft to supply a version of its BASIC for the hot-selling Apple II, and Radio Shack bought a Microsoft BASIC for its TRS-80. Microsoft moved from Albuquerque to Bellevue, Wash. Microsoft agreed to develop and license DOS and BASIC to IBM for its new personal computer: Allen spearheaded a deal for Microsoft to buy an operating system called QDOS for $50,000, and Microsoft won the contract to supply it to IBM for use as the operating system of IBM's new PC. This became the foundation for Microsoft's remarkable growth. Around the same time, Gates and Allen discussed graphical user interfaces, planting the seeds that would become Windows.

Allen was forced to resign from Microsoft in 1983 after being diagnosed with Hodgkin's disease, which was successfully treated by several months of radiation therapy. In 1984 he founded Asymetrix, a software development company based in Bellevue, Washington, to make application development tools that nonprogrammers could use. Asymetrix later became Click2learn.com and later still merged with Docent to become SumTotal Systems (2004). In the 1990s the company began to specialize in software for developing and delivering computer-based learning.

In 1992 Allen started Starwave, a producer of online content sites. Starwave did such great work for ESPN SportsZone and ABCNews.com that Disney (NYSE: DIS) bought it in 1998 for a total of $350 million. In April 1998 Allen bought Marcus Cable, the nation's 10th largest cable company, for $2.8 billion, his biggest investment to date. Also that year Allen grabbed a stake of the video-sales market with his purchase of Hollywood Entertainment, and he took another software group public: Asymetrix Learning Systems, maker of products for online classes.

On September 28, 2000, Microsoft Corp. announced that Paul Allen was assuming a new role as senior strategy adviser to top Microsoft executives. The company also announced that Allen and Richard Hackborn had decided not to seek re-election to Microsoft's board of directors at the company's November shareholder meeting.

In December 2003 he announced that he was the sponsor behind the SpaceShipOne private rocket plane venture, as part of the Ansari X Prize competition. In June 2004, SpaceShipOne became the first successful commercial spacecraft when it passed the 100 kilometer threshold of space. In September 2003, Allen founded the Allen Institute for Brain Science, pledging $100 million in seed money to the Seattle-based organization. Its inaugural project is the Allen Brain Atlas, a map of the human brain which will be made publicly accessible. The Brain Atlas is a component of the loosely formed Human Cognome Project. Starting in 2003, Vulcan Ventures began funding Project Halo, an attempt to apply artificial intelligence techniques to the problem of producing a digital Aristotle that might serve as a mentor, providing comprehensive access to the world's knowledge.

Allen is a major contributor to SETI, the Search for Extra-Terrestrial Intelligence project, having stepped in to rescue the project when NASA stopped funding it in the 1990s. He runs a venture capital firm, Vulcan Ventures, and has created the Experience Music Project, a museum of music history in Seattle, Washington, originally inspired by his interest in a museum to house his considerable collection of Jimi Hendrix memorabilia. He owns (through Rose City Radio Corporation) some Portland radio stations. When he heard that a historic Seattle theater was about to shut down, he bought, restored, and updated it into a showplace for movies of all formats.


Currently Allen is the owner of the Portland Trail Blazers (an NBA basketball team) and the Seattle Seahawks of the National Football League. He also owns the Rose Garden Arena, the home court of the NBA Blazers team. Due to declining attendance in 2002 and 2003, as well as difficulties renegotiating the terms of a 1993 loan, the Rose Garden corporation filed for bankruptcy on February 27, 2004. In June 2004, Allen opened the Science Fiction Museum and Hall of Fame, located at the Experience Music Project.

Gene Amdahl (1922, South Dakota, USA)

Achievement One of the original architects of the business mainframe computer, including IBM's System/360 computer line, Amdahl started the IBM-compatible market when he left IBM to found Amdahl Corporation. Amdahl's work has been called brilliant, even genius, by his peers. The Sunday Times of London named him one of the "1,000 Makers of the 20th Century" in 1991, and the industry magazine Computerworld considered Amdahl one of the 25 people "who changed the world." He is the founder of four companies: Amdahl Corporation, Trilogy Systems (now part of Elxsi Corporation), Andor Systems, and Commercial Data Servers (CDS).

Biography Gene Myron Amdahl was born in South Dakota in 1922. After serving two years in the U.S. Navy during World War II, where he learned electronics, he received a bachelor's degree in engineering physics at South Dakota State University in 1948. In 1952 he completed his doctorate in theoretical physics at the University of Wisconsin, where he designed his first computer, the Wisconsin Integrally Synchronized Computer (WISC). He began his career with IBM in 1952, and became the chief design engineer of the IBM 704. In 1955, Amdahl worked with others to design the Datatron, which led to a computer called the Stretch and eventually became the IBM 7030, a computer that used the new transistor technology. In 1956, after just four short years with IBM, Amdahl became unhappy with the company and quit. After five years of working for other computer companies, he returned to IBM in 1960.

During the 1960s, Amdahl gained recognition as the principal architect of IBM's impressive System/360 series of mainframe computers. The System/360 was based on the Stretch, which Amdahl had worked on in 1955. The 360 series was one of the greatest success stories in the computer industry and became the main ingredient of IBM's enormous profitability in the late 1960s. Amdahl became an IBM Fellow and was able to pursue his own research projects. In 1969, he was director of IBM's Advanced Computing Systems Laboratory in Menlo Park, California. He recommended that the laboratory be shut down, which IBM did, and presented his ideas about the internal barriers that prevented IBM from shooting for the high end of computer development. Although his ideas were accepted, IBM executives refused to change policies and Amdahl left IBM again. In 1970, Amdahl formed his own company, Amdahl Corporation, in Sunnyvale, California. His plan was to compete head-to-head with IBM in the mainframe market.
Most industry analysts considered this to be career suicide and gave his start-up company very little chance of surviving. But survive it did, and it actually prospered. Instead of creating a rival system to IBM, Amdahl created discounted computers that could be substituted for the name-brand models and run the same software. Basically, he designed the first computer clones, known then as "plug-to-plug compatibles." Amdahl became the most celebrated entrepreneur in the computer industry for a while. The only major criticism raised about Amdahl Corporation at the time was that Amdahl took start-up money from Fujitsu Ltd. of Japan in exchange for American mainframe technology.

Compiled Deepanjal Shrestha 10 of 33 Social and Professional Issues in IT Pioneers of Computing

In 1975, Amdahl Corporation shipped its first computer, the Amdahl 470 V/6. Over the next few years, Amdahl and IBM leap-frogged each other with faster, smaller, and cheaper computers. In 1979, Gene Amdahl began moving away from Amdahl Corporation when he resigned his post as chairman. He became chairman emeritus for less than a year, leaving Amdahl Corporation in 1980 to found Trilogy Systems Corporation. With the success of Amdahl Corporation, Amdahl had no trouble interesting investors in this new company and easily raised $230 million in start-up money. Again, his plan was to compete with IBM, and also Amdahl Corporation, in the high-end mainframe computer market. In addition, Amdahl planned to completely redesign the chips that powered the computers. His dream was to combine the functions of 100 separate chips onto one superchip that would work faster and more efficiently than the multiple chips. Unfortunately, Trilogy was hounded by disasters. Torrential rains delayed construction of the chip plant, then invaded the air conditioning, destroying the clean room atmosphere and all the chips currently being created. At this point, Amdahl had spent one-third of the start-up money with nothing to show for it. To save Trilogy, Amdahl spent the remainder of the money to acquire Elxsi Corporation, a computer manufacturer, in 1985. The new company continued to flounder and never achieved great success. In 1989, Amdahl stepped down as chairman of Elxsi to devote more time to his next venture. In 1987, Amdahl founded his third company, this one called Andor Systems after the "and" and "or" logic gates of computer circuitry. This time his aim was to build computers that would compete with IBM’s smaller mainframes. Industry analysts uniformly gave the company very little chance of success. But Amdahl felt he had an edge -- he could make small mainframe computers more cheaply than IBM. 
He could use new technology that allowed him to pack the computer’s central processor onto one board, rather than the several used by IBM, and he redesigned the compiler to work more quickly and efficiently. These innovations allowed Andor’s computers to take up less space and generate less heat, a distinct advantage to customers who no longer would need giant air-conditioned rooms in which to place their computers. But Andor was plagued by bad chips, causing a delay of almost two years before the first computers hit the market. Meanwhile, IBM came out with its own midsize computer using some of the same technology employed by Andor. To survive, Andor had to come up with other peripheral products that it could quickly get on the market. But Andor never achieved the success it was after with the small mainframes, and in 1991 it had scaled back products to include only a data backup system. By 1994, the company had yet to turn a profit. Eventually, the company declared bankruptcy. In 1996, at the age of 74, he started his fourth company, this one called Commercial Data Servers (CDS). Through CDS, Amdahl intends to distribute IBM-compatible, PC-based mainframes that use cryogenically-cooled CMOS processors and a new processor design that he created. CDS is targeting its products at companies that need the capabilities and selling price of a smaller mainframe, a market that CDS believes IBM and other manufacturers aren’t serving adequately. Gene Amdahl continues his quest to merge mainframe technologies with the more popular PC technology. Though many find these two areas incompatible (mainframe means centralized, controlled computing; PCs are for individual computing), Amdahl won’t give in to those who believe mainframes are dinosaurs that have outlived their usefulness. And, apparently he doesn’t intend to ever give up.


John Vincent Atanasoff (October 4, 1903, Hamilton, New York, USA – June 15, 1995, Monrovia, Maryland, USA)

Achievement John Vincent Atanasoff, born on 4 October 1903 in Hamilton, New York, is the inventor of the electronic digital computer. Along with being an inventor, he was a mathematical physicist and a businessman.

Biography In any science field, there needs to be a person with the vision to define the future. John Vincent Atanasoff was a genius with such a vision. He developed the first electronic digital computer, a machine that has dramatically changed our lives. John Vincent Atanasoff gave birth to the field of electronic computing, and in doing so he also gave birth to a new era, the era of computers. Today, the computer is an essential part of everyday life as well as of every business. We cannot imagine our lives without a computer being involved: turning on the TV, making a telephone call, and typing up a report all involve the use of a computer. The invention of the computer meant that technology could improve at a faster rate, and our lives became more convenient and safer. Take, for instance, the use of computers in our cars. Anti-lock brakes, air bags, and fuel injection are all controlled by a computer. These advancements make the car safer and more reliable. Computers can also be found in banks, schools, airplanes, businesses, space shuttles, satellites, and numerous other things. In today's society, almost everything involves the use of a computer. The electronic age is the direct result of the invention of the computer. Never before in the history of humanity has there been an invention that has grown as quickly as the computer. Within the last twenty years, the speed and power of the computer have grown at an exponential rate. When John Vincent Atanasoff invented the computer, he probably did not know how much of an impact it would have on people's lives. Computers are involved in every aspect of technology, and they will continue to be a part of technologies to come. The capabilities of computers are advancing every day. Soon, a computer may become more like the human brain than an electronic machine. Computers will take us to Mars, and get us back safely.
Computers will always be on the edge of technology, and anyone who learns to harness their power will be an important part of the future. Every aspect of our lives has changed because of the computer and its inventor, John Vincent Atanasoff. The first electronic computer with vacuum tubes was constructed by John Atanasoff and Clifford Berry at Iowa State College. The Atanasoff-Berry Computer was the first digital computer, built during 1937-1942, and it introduced the concepts of binary arithmetic, regenerative memory, and logic circuits. The machine never reached the production stage and remained a prototype. John Vincent Atanasoff was the first son of John Atanasoff and Iva Lucena Purdy. At a very early age, John Vincent had a great interest in mathematics. When he was about ten years old, he became curious about a Dietzgen slide rule that his father had bought. John Vincent read the instructions on how to use the slide rule, and he became more interested in the mathematical principles behind it. With the help of his mother, John Vincent began to study a college algebra book that belonged to his father. In the years that followed, John Vincent's family moved to Old Chicora, Florida. John Vincent studied at Mulberry High School and graduated in two years, receiving A's in all of his science and math courses. He did not enter college right away because he wanted to work and save money. In 1921 John Vincent entered the University of Florida as an undergraduate. He graduated from the University of Florida in 1925 with a Bachelor of Science degree in electrical engineering, receiving straight A's as an undergraduate.


John Vincent Atanasoff then went to Iowa State College to pursue his master's degree. At Iowa State, John Vincent met his future wife, Lura Meeks. At the time, he did not know that she was three years his senior. John Vincent received his master's degree in mathematics from Iowa State College in 1926. Within a few days of receiving his degree, John Vincent and Lura Meeks were married. After receiving his master's degree, John Vincent went to the University of Wisconsin for his doctorate in theoretical physics. In the same year that John Vincent was accepted as a doctoral candidate, his wife gave birth to their eldest daughter, Elsie. In 1930, John Vincent Atanasoff received his Ph.D. as a theoretical physicist from the University of Wisconsin. Dr. Atanasoff then returned to Iowa State College as an assistant professor in mathematics and physics. Dr. Atanasoff had always been interested in finding new ways to perform mathematical computations faster, and he examined many of the computational devices that existed at that time. These included the Monroe calculator and the International Business Machines (IBM) tabulator. Dr. Atanasoff concluded that these devices were slow and inaccurate. After being promoted to associate professor of mathematics and physics, Dr. Atanasoff began to envision a computational device that was "digital." He believed that analog devices were too restrictive and could not provide the kind of accuracy he wanted. The idea of building an electronic digital computer came to him while he was sitting in a tavern. Dr. Atanasoff came up with four principles for his electronic digital computer.

· He would use electricity and electronics as the medium for the computer.
· In spite of custom, he would use base-two numbers (the binary system) for his computer.
· He would use condensers for memory and would use a regenerative or "jogging" process to avoid lapses that might be caused by leakage of power.
· He would compute by direct logical action and not by enumeration as used in analog calculating devices. (Mollenhoff, 34)
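The second and fourth principles — base-two numbers and computation by direct logical action — can be sketched in modern terms. The following is an illustration only, not Atanasoff's actual circuit design: it adds two numbers bit by bit using nothing but logic operations (XOR, AND, OR), the software analogue of computing with logic gates rather than by analog enumeration.

```python
def full_adder(a, b, carry_in):
    """Add one bit by direct logical action: only XOR, AND, and OR."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_binary(x, y, width=8):
    """Add two non-negative integers by rippling a full adder across their bits."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_binary(22, 13))  # 35, i.e. 10110 + 01101 = 100011 in base two
```

The same one-bit adder, realized in vacuum-tube logic, is essentially what the ABC's add-subtract mechanism did thirty bits at a time.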

As Dr. Atanasoff worked on his computer project, he asked a colleague to recommend a graduate student to assist him. The graduate student introduced to him was Clifford Berry, a gifted electrical engineer with a background very similar to Dr. Atanasoff's. The two got along almost immediately. In December 1939, the first prototype of the Atanasoff-Berry Computer (ABC) was ready. The ABC showed some of the potential of a computer, and it amazed the university. So in 1939, Dr. Atanasoff and his assistant Clifford Berry built the world's first electronic digital computer. With the first prototype working well, Dr. Atanasoff wanted to improve on it as well as obtain patents for the Atanasoff-Berry Computer. Obtaining the patents was a slow process that ultimately cost Dr. Atanasoff the recognition that he deserved. In 1940 Dr. Atanasoff attended a lecture given by Dr. John W. Mauchly. They talked for some time, and Dr. Mauchly was very intrigued by Dr. Atanasoff's electronic digital computer. Dr. Mauchly wanted to see the ABC for himself, and Dr. Atanasoff agreed. This decision would prove to be a mistake, since Dr. Mauchly later used many of Dr. Atanasoff's ideas in the design of the ENIAC. The ENIAC, designed by Dr. Mauchly and Dr. Eckert, is falsely considered by most people to be the world's first electronic digital computer. Charges of piracy were later brought against Dr. Mauchly, co-inventor of the ENIAC. A long trial followed, and it was not until 1972 that Dr. Atanasoff was given the recognition he so deserved. U.S. District Judge Earl R. Larson ruled that the ENIAC was "derived" from the ideas of Dr. Atanasoff. Although Judge Larson did not explicitly say that Dr.
Mauchly "stole" Dr. Atanasoff's ideas, Judge Larson did say that Dr. Mauchly had used many of Dr. Atanasoff's ideas from the ABC to design the ENIAC. When the trial finally ended, Dr. Atanasoff was given credit as the inventor of the electronic digital computer. Clark Mollenhoff, in his book Atanasoff, Forgotten Father of the Computer, details the design and construction of the Atanasoff-Berry Computer with emphasis on the relationships of the individuals. Alice and Arthur Burks, in their book The First Electronic Computer: The Atanasoff Story, describe the design and construction of the ABC and provide a more technical perspective. Numerous articles provide additional information. In recognition of his achievement, Atanasoff was awarded the National Medal of Technology by President George Bush at the White House on November 13, 1990. Dr. John Vincent Atanasoff died 15 June 1995 of a stroke at his home in Monrovia, Md. He was 91 years old. Although Dr. Atanasoff was never able to obtain a patent for the ABC, he held 32 patents for his other inventions.

John Logie Baird (August 13, 1888, Helensburgh, Scotland – June 14, 1946, Bexhill-on-Sea, England)

Achievement John Logie Baird is remembered as the inventor of mechanical television; he also did pioneering early work related to radar and fiber optics. Successfully tested in a laboratory in late 1925 and unveiled with much fanfare in London in early 1926, mechanical television technology was quickly usurped by electronic television, the basis of modern video technology. Nonetheless, Baird's achievements, including making the first trans-Atlantic television transmission, were singular and critical scientific accomplishments. Lonely, driven, tireless and often poor, the native Scot defined the pioneering spirit of scientific inquiry. During his long career, John Baird created a host of television technologies. Among them: Phonovision, a forerunner of the video recorder (which largely still relies on mechanical scanning); Noctovision, an infra-red spotting system for "seeing" in the dark; open-air television, a theater-projection system; stereoscopic color TV; and the first high-definition color TV. According to present-day TV historians, Baird only pursued mechanical scanning to get a television system working as quickly as possible. He changed to electronic scanning in the early 1930s and refined the system to a high degree. Before he died in 1946, Baird was drafting plans for a television with 1,000 lines of resolution, and he held earlier patents for television with up to 1,700 lines of resolution using interlacing technology. The world would not catch up with him until 1990, when the Japanese introduced a TV with 1,125 lines of resolution per frame.
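The interlacing idea mentioned above is simple arithmetic: weave together several low-line-count scans, each vertically offset by one line, and the effective resolution multiplies. The sketch below is a modern illustration of that principle (the field and line labels are invented for the example, not Baird's terminology); three 200-line fields combine into one 600-line frame, as in the system Baird later demonstrated.

```python
def interlace(fields):
    """Weave n fields of k lines each into a single frame of n*k lines.

    Field f contributes its lines to frame rows f, f+n, f+2n, ...,
    so each field is offset vertically by one line from the previous one.
    """
    n = len(fields)
    frame = [None] * (n * len(fields[0]))
    for offset, field in enumerate(fields):
        for i, line in enumerate(field):
            frame[offset + i * n] = line
    return frame

# Three 200-line fields yield a 600-line interlaced frame.
fields = [[f"field{f}-line{i}" for i in range(200)] for f in range(3)]
frame = interlace(fields)
print(len(frame))  # 600
```

Broadcast television later standardized on two-field interlacing (e.g. two 262.5-line fields per 525-line NTSC frame), but the multiplication works the same way for any number of fields.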

Baird and the BBC For some time Baird's exploits had captivated the popular imagination. The press hailed him as a visionary and criticized the BBC, still a fledgling radio broadcaster, as inept and behind the times. One journalist went so far as to suggest that the BBC be dismantled and replaced by Baird Television Limited. Considering him very much an outside competitor, the BBC turned down Baird's requests for a transmitting license. Baird retaliated by first threatening to make pirate television broadcasts in 1928, then actually making them from Berlin in 1929. The BBC soon relented and granted a license. The press criticism and Baird's guerrilla tactics gnawed at J.C.W. Reith, the BBC's general manager, and tainted his perception of Baird. Still, when Baird offered to demonstrate his invention for the BBC in 1929, Reith grudgingly accepted. Baird's system, he conceded, had potential. It was the beginning of an uneasy relationship that lasted until 1935.


In September of 1929, Baird, in association with the BBC, began a series of experimental television transmissions. Working from Baird's cramped studio, the project was plagued with technical difficulties. The worst setback was the lack of synchronized sound: because they had access to only one transmitter, pictures and sound were broadcast alternately. The pictures themselves were minutely small, no larger than a saucer even when magnified. Anxious to create a commercially viable system, the BBC pressured Baird to perfect the Televisor. They wanted a simple product that could be manufactured cheaply and widely distributed. But Baird's mind leap-frogged to ever more fantastic ideas: color television (demonstrated in 1928); big-screen TV; and open-air projections for large audiences. BBC management grew uneasy. Baird saw the Televisor as a prototype, not a finished product; it was replete with bugs and problems. Although BBC engineers had solved the sound synchronization glitch in 1930, the device was still crude, its picture flickering and tiny. In its current state, the Televisor could be no more than a novelty for a handful of amateur radio enthusiasts. Reluctantly, Baird prepared to mass-produce the Televisor. Short of capital, he sought financing from Gaumont British, a formidable conglomerate holding company that owned a large chain of movie theaters and was very interested in showing large-screen television. After that, the future of Baird Television passed forever beyond his control. Produced in kit form, some 20,000 Baird Televisors sold across England and the Continent. It seemed that the mechanical system might have a foothold in the coveted European market. But the BBC was already studying a rival system based on the work of Vladimir Zworykin.

Lost in a vacuum: The Iconoscope comes of age In the 1920s, a number of American companies began developing electronic image scanners based on the cathode ray tube. In 1933 Zworykin, working for RCA, invented a revolutionary device called the Iconoscope. Delivering superior resolution with almost no irritating flicker, the Iconoscope was a formidable challenger to the humble Nipkow disc. In 1933 Baird was told that the BBC would end its relationship with him the following year. Mechanical television, they said, was no match for an all-electronic scanning system. In an arrangement with the newly incorporated EMI, the BBC developed their own version of the new technology. In 1935, EMI unveiled the Emitron camera tube, a device that was uncannily similar to Zworykin's Iconoscope. This was no accident. RCA and EMI had a cross-licensing arrangement, so it is likely that they shared technology. Their goal was to dominate the global market with a single television system. Upstarts like Baird would simply disappear in their wake.

A tale of two tubes: Farnsworth's Image Dissector Baird was determined that mechanical television could work. He was, of course, aware of the advances made with cathode ray tubes, but had neither the inclination nor the financing to pursue them. But Gaumont British had other plans. Anxious about the potential of the Emitron tube, they urged Baird to seek a licensing agreement with Philo T. Farnsworth, a young American who had created a device called the Image Dissector. Farnsworth conceived his television system in 1923, while still in high school. Utilizing a cathode ray tube, his design predated Zworykin's Iconoscope by a decade. By 1927 the boy wonder had transmitted straight-line images from his first Image Dissector. In 1934, the year he met Baird, he was deeply entangled in patent litigation with RCA. By licensing the Image Dissector in Great Britain, he hoped to sidestep RCA and claim a piece of the European market. Baird made a point of being present in London for Farnsworth's demonstration of the Image Dissector and was stunned by what he saw.
The best resolution Baird had achieved was 180 lines per frame. Farnsworth's Image Dissector displayed an astounding 300 lines per frame. Gaumont British's executives were duly impressed. They signed an agreement with Farnsworth and gave Baird the task of putting the Image Dissector at the core of a new television system.

Down in flames: the end of Baird TV Despite the compelling display, Baird was not an easy convert to electronic television. He was convinced that his mechanical system could be synthesized with the Image Dissector to create a superior hybrid. In 1935 he succeeded in demonstrating a 700-line picture for the press. By 1936 Baird Television was in serious trouble because of the BBC's growing preference for EMI. Though Baird's electro-mechanical TV now produced 240 lines per frame versus EMI's 405-line capability, witnesses said they could not see a difference between the two transmissions, possibly because the 405-line system was on the wrong carrier wave. Nevertheless, the BBC was very close to penning an official contract with EMI. Housed in a studio in the Crystal Palace, Baird and his technicians were increasingly isolated. In 1937-38, Baird began to drift from the day-to-day operations of Baird Television. He preferred to work in his home studio, where he could indulge his imagination unconstrained by the politics of business. In a desperate move, Gaumont British brought Farnsworth to London, hoping he could get Baird back on track before the BBC committed to EMI. Not long after Farnsworth departed, the Crystal Palace burned to the ground in a fire that was likely caused by faulty wiring. Within days, EMI's television system was officially adopted by the BBC. Baird's company operated profitably for several years, manufacturing television receivers. Operating from a claustrophobic studio in a surviving section of the Crystal Palace, the inventor enjoyed the occasional triumph. In 1938 at the Dominion Theatre, an audience of 3,000 watched color television images on a 12 x 9 foot screen. These were the first color pictures ever shown publicly. Yet even these highly public achievements could not change the reality of the marketplace. In 1940, Baird's company changed its name to Cinema Television.
At the outbreak of war with Germany, the British government placed severe restrictions on TV signals, fearing that German bombers would use TV signals to home in on London. Eventually, the British used TV signals as a means of jamming enemy bomber guidance systems.

Phoenix from the ashes: Don Quixote rides again Left with scarce resources and no hope of procuring benevolent corporate backers, Baird was on his own. Financing research from his savings, he enjoyed a curious sense of freedom. Like the old London garret days, his work was fueled only by passion and insatiable curiosity. As Hitler raged across Europe, Great Britain poured its resources into the war effort. Electronic components became scarce and Baird had to forage for parts. As ever, his ambitions ran high. By proclaiming he would build the first commercially viable color television, he put himself in direct competition with American monoliths like RCA. Furthermore, he claimed the system would have 600-line pictures, nearly 200 more than EMI's 405-line standard. Baird at last abandoned mechanical systems in favor of electronics. But even here he left his own indelible mark. Images were created by scanning subjects with an intense beam of light from a cathode ray tube. The light passed through spinning colored filters before being relayed to photoelectric cells. A variation of the "flying-spot" scanning method he'd developed and patented in the '20s, it was a brilliant success. By interlacing several 200-line scans he achieved a 600-line picture. In December of 1940 he demonstrated the television in his home before an influential group of journalists. Encouraged by their enthusiastic praise, Baird set to work on a stereoscopic color TV. Despite the significance of his accomplishments, neither this early form of high definition television
nor the stereoscope was commercially produced. The systems championed by EMI and RCA would set the standard for decades to come. In 1943 Baird appeared before the Hankey Committee, a government task force examining the future of television. He encouraged them to consider high-definition systems of 1,000 lines or more for post-war commercial development. He also urged them to pursue stereoscopic TV. In failing health, he no longer had the stamina to finish these projects, though he had succeeded in demonstrating all of them. The Hankey Committee's report mostly concurred and recommended overturning the lesser standards put forth by the Selsdon Committee in 1937. Baird died in June of 1946. The work of John Logie Baird comprised a crucial breakthrough in television technology. Today, 95% of modern TV is pre-recorded, an approach recommended by Baird. A large amount of contemporary TV utilizes the film-scanning system of Rank-Cintel, which absorbed Baird's Cinema Television. Baird's single-electron-gun CRT development work in 1945 was eventually echoed in the design of the Sony Trinitron tube. In a manner that today seems commonplace, his initial mechanical solution was quickly supplanted by newer technology, but his inventive work continued and his legacy endures. Baird succeeded in perfecting visual transmission systems others had long abandoned. His single-minded tenacity proves that most obstacles are no greater than the limits of the imagination.

Gordon Bell (August 19, 1934, Kirksville, Missouri)

Achievement While working for Digital Equipment Corporation (DEC) in the 1960s, Gordon Bell helped build the PDP series of minicomputers, the first minicomputers introduced to the commercial data processing market. He also oversaw the development of one of the industry’s most successful computer lines, DEC’s VAX series. He is currently one of several experts employed by Microsoft to direct that company’s future path.

Biography Bell was born on August 19, 1934 in Kirksville, Missouri. He received a B.S. and an M.S. in electrical engineering from the Massachusetts Institute of Technology (MIT) in 1956 and 1957, respectively. After receiving his master's degree, he worked at MIT's Engineering Speech Communications Laboratory. During his time at MIT, Bell came to know two other MIT computer engineers, Ken Olsen and Harlan Anderson, who decided to start their own company, Digital Equipment Corporation. By 1959 Olsen and Anderson were starting to build a new, smaller computer that was the precursor to the minicomputer. Bell was asked to join DEC as an engineer in 1960, where he began work on the early Program Data Processor (PDP) minicomputers. His work on the PDPs contributed to Bell becoming an industry-recognized expert on minicomputers and interactive computing. In 1966, he left DEC to teach computer science at Carnegie-Mellon University. While there, he also worked on minicomputer-related interactive computing. In 1972, he was convinced to return to DEC as vice-president of engineering to oversee the production of the VAX (Virtual Address Extension) line of minicomputers and move the company into the world of semiconductor (chip) technology. The 32-bit VAX minicomputers went on to become some of the most successful computers ever created and completely changed the face of the minicomputer industry. Bell remained with DEC until his retirement in 1983, which was prompted by a heart attack. In July 1983, he started a new company with Kenneth Fisher and Henry Burkhardt called Encore Computer Corporation. Their intention was to build a new generation of small computers and to help new start-up companies. He then spent the late 1980s leading the National Science Foundation's Information Superhighway initiative and co-writing a book on venture capital called High Tech Ventures: The
Guide for Entrepreneurial Success. The early 1990s found Bell helping several small start-up companies, serving in various positions, including chief scientist at Stardent Computer in Newton, Massachusetts in 1990 and vice-president of Research and Development at Ardent Corporation in Sunnyvale, California in 1991. He was also a director of the Bell-Mason Group, which programmed computers to analyze the strengths and weaknesses of new businesses. From 1991 to 1995, Bell acted as an advisor to Microsoft for future development efforts and helped set up its first research laboratory. Microsoft was trying to become the research power that would dictate the next decade's computer technology. To that end, it tempted some of the best minds in the industry to join in its long-term research and development. In August 1995, Bell officially joined this brain trust. Through his association with Microsoft, Bell plans to explore the use of video and high-speed networks to expand and facilitate human-human interactions and to reduce physical travel, fields known as "telepresence" or "telecomputing." He is also exploring what he calls "scalable network and platform computing (SNAP)," a new architecture that would give a single desktop operator the same operating power as a mainframe, something he believes is feasible shortly after the turn of the century. In addition to his work in the computer field, Bell has published several books and helped establish the Computer Museum in Boston. He has also received several awards for his contributions to the computer industry, including the National Medal of Technology in 1991 and the Eckert-Mauchly Award in 1982, named after the two developers of ENIAC and UNIVAC. In 1986, the National Science Foundation asked him to direct funding for U.S. computer science efforts.
As the first assistant director for computing at the National Science Foundation, he led the National Research Network panel that became the National Information Infrastructure/Global Information Infrastructure (NII/GII) and helped write the High-Performance Computing and Communications Initiative. While many of the pioneers in the computer industry have fallen by the wayside, either of their own volition or because the industry moved away from them, Gordon Bell has remained always at the forefront. He isn't sitting back resting on his laurels or pining for the good old days. One reason he agreed to join Microsoft is to "continue to be a part of the next paradigm shift."

Walter Brattain (February 10, 1902, Amoy, China – October 13, 1987, Seattle, Washington, USA)

Achievement "Walter was a very good experimental physicist. He could put things together out of sealing wax and paper clips, if you wish, and make things work." -- John Pierce, co-worker.

Biography Walter Brattain had the hands. Give him the direction and he could build anything. He was a solid physicist who had a good understanding of theory, but his strength was in physically constructing experiments. Working with the ideas of William Shockley and John Bardeen, Brattain's hands built the first transistor. His father, Ross, and mother, Ottilie, married just after they had graduated from Whitman College in Walla Walla, Washington. Ross got a job teaching science and math in China, and Walter was born on February 10, 1902 in Amoy. The family didn't stay abroad long: by 1903, the Brattains were back in Washington. Walter spent most of his youth on a large cattle ranch just south of the Canadian border. When he wasn't doing school work, Walter had little time for anything besides helping out on the ranch. He was a cowboy. In the fall of 1920, Brattain entered Whitman. He claimed he majored in physics and math because they were the only subjects he was good at.

Compiled Deepanjal Shrestha 18 of 33 Social and Professional Issues in IT Pioneers of Computing

Brattain attended college at a turning point in American science, when physics was being transformed. Older students would have been expected to travel to Europe for a first-class physics education, but Brattain was in the first wave of those who could do just as well in the US. Encouraged by his professor Benjamin Brown to continue his studies, Brattain went on to the University of Oregon for his Masters and to the University of Minnesota for a Ph.D. Brattain's first job out of graduate school was at the National Bureau of Standards as a radio engineer, but after a year there he wanted to get back to physics. At an American Physical Society meeting, he was about to ask his thesis advisor, John Tate, for help. But before he said anything, Tate introduced him to Joseph Becker of Bell Labs. "By the way, Becker is looking for a man," he said, and Brattain quickly responded, "I'm interested!" Becker asked for only one qualification: he wanted to make sure that Brattain was the kind of guy who'd stand up to his superiors when necessary. Brattain, raised on a working ranch with a rifle in his saddle bags to shoot rattlesnakes, laughed. On August 1, 1929, Brattain moved to Becker's lab in New York City. Working with Becker, Brattain spent most of his time studying copper-oxide rectifiers. The pair thought they might be able to make an amplifier by putting a tiny metal grid in the middle of the device, similar to the design of vacuum tubes. A few years later, William Shockley came to him with a similar idea. Neither contraption actually worked. Working with crystals eventually paid off. On March 6, 1940, Brattain and Becker were called into the office of Bell's president, Mervin Kelly. There they saw Russell Ohl's mysterious crystal that increased voltage whenever light was flashed on it. It turned out to be a very crude P-N junction, but no one knew that at the time.
Brattain, who at first thought it was a practical joke, gave an off-the-cuff explanation that electrical current was being generated at a barrier inside. That theory turned out to be true. Kelly was suitably impressed.

Fred Brooks (April 19, 1931, Durham NC, USA)

Achievement Managed the development of IBM's System/360 and its operating system software in the 1960s; wrote The Mythical Man-Month, a classic on software project management.

Edith Clarke (1883 - 1959, USA)

Achievement She translated what many engineers found to be esoteric mathematical methods into graphs or simpler forms during a time when power systems were becoming more complex and when the initial efforts were being made to develop electromechanical aids to problem solving.

John Cocke (30 May 1925, Charlotte, N.C., USA, 16 July 2002, Valhalla, N.Y. USA)

Achievement Chief designer for the RISC chip architecture at IBM

Biography Born on May 30, 1925, Mr. Cocke was raised in Charlotte, N.C. His father, Norman Cocke, was the president of the Duke Power Company and a trustee of Duke University. John's curiosity, which would prove so valuable later in his life, was evident early. As an adult, Mr. Cocke once recalled that when he was given his first bicycle at the age


of 6, he dismantled it within a few hours, much to the chagrin of his mother, Mary. Mr. Cocke joined I.B.M.'s research labs in 1956 after he received a doctorate in mathematics from Duke, and he remained with the company until he retired. Mr. Cocke (rhymes with "sock") was the principal designer of the type of microprocessor that serves as the engine of most of today's large, powerful computers and the Apple Macintosh personal computers. Machines using his chip design — a simplification of the hardware, which opened the door to faster computation — are called reduced instruction-set computers, or RISC. Throughout his long career as a researcher for I.B.M., Mr. Cocke was also responsible for a host of other innovations. He was a leader in the arcane but vital field of designing more efficient software compilers — the software that translates instructions written in a programming language understood by humans into the vernacular of all computers, the 1's and 0's of digital code. Mr. Cocke also came up with ideas that helped advance fields as diverse as speech-recognition technology and data storage. In computer science circles, Mr. Cocke was renowned for the breadth of his intellect, his energy, his insights and his unconventional working methods. A former colleague, Paul M. Horn, who had joined I.B.M.'s research labs after a career as a physics professor at the University of Chicago, recalled that when he worked on weekends, Mr. Cocke was invariably in the labs. The senior researcher, Mr. Horn recalled, would drop by and engage the newcomer in long discussions of the finer points of unification theory in physics. "John Cocke knew as much about high-energy physics as I did, and it wasn't even his field," said Mr. Horn, who is the director of I.B.M.'s research division. Even after he retired in 1992, Mr. Cocke always displayed "a wonderful childlike curiosity — he was interested in everything," recalled R. Andrew Heller, who collaborated with Mr.
Cocke on the RISC technology, beginning in the late 1960's. The RISC chip design, experts say, was a striking example of Mr. Cocke's defining attribute. His deep understanding of both computer hardware and software, and their interaction, often enabled Mr. Cocke to pierce through the complexity of computer problems with fresh insights. Among his many achievements, he was named an IBM Fellow, the company's highest technical honor, in 1972; he was awarded the National Medal of Technology in 1991 by President Bush and later received the National Medal of Science. Over his lifetime as a scientist, he made unique and creative contributions to computer science through his innovative developments in high-performance system design.

[Photo: Cocke with a prototype server at IBM -- the "Father" of RISC architecture]
His expertise in achieving high performance for a broad range of scientific applications led to remarkable advances in compiler design and machine architecture that culminated in his invention of the Reduced Instruction Set Computer (RISC) --- which profoundly affected the broad field of information technology. RISC is the basis for a Unix systems market that last year was $22.3 billion, according to industry analyst group IDC. "His tenure at IBM spanned an amazing time, 1956 to 1992," said Peter Capek, John's colleague at IBM and close friend. "His career was unusual in its breadth. He was known for his work, but he was interested in everything -- circuits, storage, compilers -- any technology that could advance the state of the art." John is a founder and key innovator of the technology of compiler optimization, now used systematically throughout the computer industry to enable computers programmed in higher-level languages such as FORTRAN, C, PASCAL and others to reach levels of efficiency comparable to --


and in some cases exceeding -- the levels of efficiency reachable by much more expensive and time-consuming programming techniques closer to the machine's instruction set.

Edgar F. Codd

Citation For his fundamental and continuing contributions to the theory and practice of database management systems. He originated the relational approach to database management in a series of research papers published commencing in 1970. His paper "A Relational Model of Data for Large Shared Data Banks" was a seminal contribution in a continuing and carefully developed series of papers. Dr. Codd built upon this work, and in doing so provided the impetus for widespread research into numerous related areas, including database languages, query subsystems, database semantics, locking and recovery, and inferential subsystems.

Seymour Cray (1925, Chippewa Falls, WI, USA - 1996, Colorado Springs, CO, USA)

Achievement Creator of the first supercomputer, the Cray-1. In 1972, he founded Cray Research to design and build the world's highest performance general-purpose supercomputers. His Cray-1 computer established a new standard in supercomputing upon its introduction in 1976, and his Cray-2 computer system, introduced in 1985, moved supercomputing forward yet again. In the years following the company's founding, Mr. Cray relinquished the company's management reins to devote more time to computer development. From 1972 to 1977 he served as director, chief executive officer, and president of the company. In October 1977, he left the presidency, but remained chief executive officer and became chairman of the board. In 1980, Mr. Cray resigned as chief executive officer, and in 1981, he stepped aside as chairman of the board to devote himself full time to the Cray-2 project as an independent contractor for Cray Research. For some time, he remained a director and a member of the Policy Committee. Later, Mr. Cray worked to develop a successor to the Cray-2 system and explored gallium arsenide technology. Mr. Cray spent his entire career designing large-scale computer equipment. He was one of the founders of Control Data Corporation in 1957 and was responsible for the design of that company's most successful large-scale computers, the CDC 1604, 6600, and 7600 systems. He served as a director for CDC from 1957 to 1965 and was senior vice president at the time of his departure in 1972. From 1950 to 1957, Mr. Cray held several positions with Engineering Research Associates (ERA) of St. Paul, Minnesota. At ERA, he worked on the development of the ERA 1101 scientific computer for the U.S. government. Later, he had design responsibility for a major portion of the ERA 1103, the first commercially successful scientific computer. While with ERA, he worked with the gamut of computer technologies ranging from vacuum tubes and magnetic amplifiers to transistors. Mr.
Cray was the inventor of a number of technologies that were patented by the companies for which he worked. Among the more significant are the Cray-1 vector register technology, the cooling technologies for the Cray-2 computer, the CDC 6600 freon-cooling system, and a magnetic amplifier for ERA. He also contributed to the Cray-1 cooling technology design.


Mr. Cray earned a bachelor of science degree in electrical engineering in 1950 from the University of Minnesota. In 1951 he earned a master of science degree in applied mathematics from the same institution. In 1968, he was awarded the W.W. McDowell Award by the American Federation of Information Processing Societies for his work in the computer field. In 1972 he was presented with the Harry H. Goode Memorial Award for his contributions to large-scale computer design and the development of multiprocessing systems. Mr. Cray was born in 1925 in Chippewa Falls, Wisconsin; he died in 1996 in Colorado Springs.

Edsger Wybe Dijkstra (11 May 1930, Rotterdam, Netherlands-6 Aug 2002, Nuenen, Netherlands)

Achievement In 1968 Edsger Dijkstra (7) laid a foundation stone in the march toward structured programming by writing, not a scholarly paper on the subject, but a letter to the editor entitled "GO TO Statement Considered Harmful" (Comm. ACM, August 1968). The movement to develop reliable software was underway.

Biography Edsger Wybe Dijkstra was born in Rotterdam, Netherlands in 1930. Both of his parents were intellectuals who had received good educations: his father was a chemist, and his mother was a mathematician. In 1942, when Dijkstra was 12 years old, he entered the Gymnasium Erasmianum, a high school for extremely bright students, where he was educated in a number of different subjects including Greek, Latin, French, German, English, biology, mathematics, and chemistry. In 1945, Dijkstra thought that he might study law and possibly serve as a representative for the Netherlands at the United Nations. However, because he had scored so well in chemistry, mathematics, and physics, he entered the University of Leiden, where he decided to study theoretical physics. He went to summer school on the subject of programming at Cambridge University during the summer of 1951. He began part-time work at the Mathematical Centre in Amsterdam in March 1952, which further fueled his growing interest in programming. He finished the requirements for his theoretical physics degree as quickly as possible and began to pursue his interests in programming. One of the problems that he ran into, however, was that programming still was not officially recognized as a profession. In fact, when he applied for a marriage license in 1957, he had to put down "theoretical physicist" as his profession. Dijkstra continued to work at the Mathematical Centre until he accepted a job as a research fellow for Burroughs Corporation, in the United States, in the early 1970s. He was awarded the ACM Turing Award in 1972 and the AFIPS Harry Goode Memorial Award in 1974. Dijkstra moved to Austin, Texas in the early 1980s. In 1984 he was appointed to a chair in Computer Science at the University of Texas at Austin, where he remained until his death in 2002.

Contributions to Computer Science: In 1956, Dijkstra came up with the "shortest-path algorithm", an algorithm which finds the best route between two points, after he had been assigned the task of showing off the powers of ARMAC, the computer in the Mathematical Centre's possession. He also devised an algorithm to solve a problem the engineers who designed the ARMAC had run into: how to "convey electricity to all essential circuits, while using as little expensive copper wire as possible". He


called it the "shortest subspanning tree algorithm." In the early 1960s, Dijkstra applied the idea of mutual exclusion to communications between a computer and its keyboard. He used the letters P and V to represent the two operations involved in the mutual exclusion problem. This idea has been part of virtually all modern processors and memory boards since 1964, when IBM first used it in its System/360 architecture. Dijkstra also posed a problem that computer engineers still must deal with, the "dining philosophers problem": five philosophers sit at a table with a bowl of rice, with a single chopstick between each pair of neighbors. The problem is how the philosophers can all eat without reaching "deadlock", ending up in a "starvation" situation, or suffering a "lack of fairness." He also helped make the computer software industry more disciplined with one phrase: "GO TO considered harmful." The more GO TO statements there are in a program, the harder it is to follow the program's source code.
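Dijkstra's shortest-path idea can be sketched in a few lines. The following Python sketch uses the modern priority-queue formulation rather than Dijkstra's original 1956 presentation, and the graph and vertex names are purely illustrative:

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's shortest-path algorithm: returns the minimum cost
    from `source` to every reachable vertex. `graph` maps each vertex
    to a list of (neighbor, weight) pairs, with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]              # (distance-so-far, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                  # stale entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# A small weighted graph: the cheapest A-to-D route is A->B->C->D.
graph = {
    "A": [("B", 1), ("D", 10)],
    "B": [("C", 2)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(graph, "A")["D"])  # 4
```

The same greedy idea of repeatedly settling the nearest unsettled vertex underlies both the shortest-path algorithm and Dijkstra's minimum-spanning-tree ("shortest subspanning tree") solution to the copper-wire problem.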

John Presper Eckert (April 9, 1919, Philadelphia - 1995, Bryn Mawr, USA)

Achievement

Co-inventor of the first fully electronic digital computer

Biography At the University of Pennsylvania, Eckert and the late John W. Mauchly created ENIAC (Electronic Numerical Integrator and Computer), a 30-ton leviathan that contained less computational power than a single one of today's tiny chips. Designed to figure trajectories for World War II artillery, ENIAC was 1000 times as fast as contemporary calculators, and became an invaluable tool for scientists working to build the first atom bomb. After selling their business in 1950 to Remington Rand (later Sperry Rand Corp., now Unisys), the partners continued to advance in their field. Eckert was awarded the National Medal of Science in 1968.

Chronology J. Presper Eckert was a graduate of the University of Pennsylvania. He was a researcher at the Moore School of the University of Pennsylvania when he began working with John W. Mauchly on ENIAC (Electronic Numerical Integrator and Computer) in 1943. When they completed ENIAC in 1946, it had 500,000 hand-soldered connections and used punched cards to store data. ENIAC was used by the U.S. Army for military calculations. In 1946 Eckert and Mauchly started a business partnership that became the Eckert-Mauchly Computer Corporation. In 1949 Eckert and Mauchly launched BINAC (Binary Automatic Computer), which used magnetic tape to store data. In 1950 the Remington Rand Corporation acquired the Eckert-Mauchly company and changed the name to the Univac Division of Remington Rand. Later research resulted in UNIVAC I (Universal Automatic Computer), a computer containing many of the elements of today's machines. In 1955 Remington Rand merged with the Sperry Corporation to form Sperry Rand. Eckert remained with the company and became an executive. He continued with the company as it merged with the Burroughs Corporation to become Unisys. In 1989 Eckert retired from Unisys but continued to act as a consultant for the company. He died of leukemia in Bryn Mawr, Pennsylvania, USA.

Edward A. Feigenbaum (20 January 1936, Weehawken, NJ, USA)

"{Knowledge} is power, and computers that amplify that knowledge will amplify every dimension of power"


"Humans are superb problem solvers; superb learners; superb at coordinating functions of sensing and locomotion and problem solving into an integrated unit. However, computer programs can claim intellectual niches that evolution did not provide for us marvelous creatures."

Achievement

The Expert System (4) It is Feigenbaum's development of the expert system that has contributed the most to computer science, particularly the field of artificial intelligence. He began hypothesizing about the creation of an expert system as early as 1962, when he first co-edited Computers and Thought along with Julian Feldman. Wanting to use computers to explore induction -- "an informed guess, one that may have to be changed in light of new evidence" (Sasha, Lazare, 215) -- Feigenbaum considered many possibilities for his first expert system. Eventually, Feigenbaum, along with Joshua Lederberg (chairman of the genetics department) and Carl Djerassi (another colleague, who supplied the needed knowledge of chemistry), created the first expert system -- DENDRAL. DENDRAL was important not only because it helped solve the problem its creators set out to solve (the odds of life on other planets), but also because it became a framework for expert systems to follow. Feigenbaum tested his framework in other areas, proving successful each time. These days, expert systems are used in everything from airlines to computer manufacturers, the military to hospitals. Feigenbaum's framework of the expert system and theories about it continue to influence the leading computer scientists working in artificial intelligence today. Douglas Lenat's Cyc project, for example, builds upon Feigenbaum's knowledge principle. While the expert system is hardly commonplace today, knowledge and acceptance of it are growing.
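The heart of an expert system is a knowledge base of if-then rules plus an inference engine that applies them. The following Python sketch is a minimal forward-chaining engine in that general spirit; the rule contents and fact names are invented for illustration and are not DENDRAL's actual chemistry knowledge:

```python
def forward_chain(facts, rules):
    """Tiny forward-chaining inference loop: repeatedly fire any rule
    whose conditions are all satisfied, adding its conclusion to the
    working set of facts, until nothing new can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and set(conditions) <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Illustrative rules in the style of a diagnostic knowledge base
# (these facts and conclusions are hypothetical examples).
rules = [
    (["has_ring_structure", "molecular_weight_high"], "candidate_aromatic"),
    (["candidate_aromatic", "spectrum_peak_77"], "likely_benzene_derivative"),
]
facts = forward_chain(
    ["has_ring_structure", "molecular_weight_high", "spectrum_peak_77"], rules
)
print("likely_benzene_derivative" in facts)  # True
```

Separating the domain knowledge (the rules) from the general inference machinery is exactly the architectural insight that made Feigenbaum's framework reusable across domains.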

Biography From the earliest moments of his life, Feigenbaum had an interest in the sciences, an interest sparked by his stepfather. With the prodding of his parents, however, Feigenbaum entered the Carnegie Institute of Technology in 1952 to earn a degree in electrical engineering. Earning his BS in 1956, Feigenbaum rediscovered science at Carnegie. A professor of his, James March, introduced Feigenbaum to ideas of game theory; Edward was intrigued. After earning his PhD at the Carnegie Institute of Technology, Feigenbaum went on to become a member of the faculty at Berkeley's School of Business Administration. The school's lack of a computer science program, however, prompted Feigenbaum to leave the world of psychology and human behavior for the world of artificial intelligence. With John McCarthy (who had first worked at MIT) recently hired at Stanford to head the artificial intelligence laboratory, Feigenbaum could not help but move on. At Stanford, Feigenbaum made most of his contributions to the world of AI and computer science in general. Feigenbaum now works as a professor at Stanford, having held numerous positions in the past, including Chief Scientist of the United States Air Force (from 1994-1997). He has written many books in collaboration with various experts, including his wife H. Penny Nii, whom he married in 1975 and with whom he has four children.


Alan Turing (23 June 1912, London, England - 7 June 1954, Wilmslow, England)

Alan Turing was born at Paddington, London. His father, Julius Mathison Turing, was a British member of the Indian Civil Service and he was often abroad. Alan's mother, Ethel Sara Stoney, was the daughter of the chief engineer of the Madras railways and Alan's parents had met and married in India. When Alan was about one year old his mother rejoined her husband in India, leaving Alan in England with friends of the family. Alan was sent to school but did not seem to be obtaining any benefit so he was removed from the school after a few months. Next he was sent to Hazlehurst Preparatory School where he seemed to be an 'average to good' pupil in most subjects but was greatly taken up with following his own ideas. He became interested in chess while at this school and he also joined the debating society. He completed his Common Entrance Examination in 1926 and then went to Sherborne School. Now 1926 was the year of the general strike and when the strike was in progress Turing cycled 60 miles to the school from his home, not too demanding a task for Turing who later was to become a fine athlete of almost Olympic standard. He found it very difficult to fit into what was expected at this public school, yet his mother had been so determined that he should have a public school education. Many of the most original thinkers have found conventional schooling an almost incomprehensible process and this seems to have been the case for Turing. His genius drove him in his own directions rather than those required by his teachers. He was criticised for his handwriting, struggled at English, and even in mathematics he was too interested with his own ideas to produce solutions to problems using the methods taught by his teachers. Despite producing unconventional answers, Turing did win almost every possible mathematics prize while at Sherborne. 
In chemistry, a subject which had interested him from a very early age, he carried out experiments following his own agenda which did not please his teacher. Turing's headmaster wrote :- If he is to stay at Public School, he must aim at becoming educated. If he is to be solely a Scientific Specialist, he is wasting his time at a Public School. This says far more about the school system that Turing was being subjected to than it does about Turing himself. However, Turing learnt deep mathematics while at school, although his teachers were probably not aware of the studies he was making on his own. He read Einstein's papers on relativity and he also read about quantum mechanics in Eddington's The nature of the physical world. An event which was to greatly affect Turing throughout his life took place in 1928. He formed a close friendship with Christopher Morcom, a pupil in the year above him at school, and the two worked together on scientific ideas. Perhaps for the first time Turing was able to find someone with whom he could share his thoughts and ideas. However Morcom died in February 1930 and the experience was a shattering one to Turing. He had a premonition of Morcom's death at the very instant that he was taken ill and felt that this was something beyond what science could explain. Despite the difficult school years, Turing entered King's College, Cambridge, in 1931 to study mathematics. This was not achieved without difficulty. Turing sat the scholarship examinations in 1929 and won an exhibition, but not a scholarship. Not satisfied with this performance, he took the examinations again in the following year, this time winning a scholarship. In many ways Cambridge was a much easier place for unconventional people like Turing than school had been. He was now much more able to explore his own ideas and he read Russell's Introduction to mathematical philosophy in 1933. 
At about the same time he read von Neumann's 1932 text on quantum mechanics, a subject he returned to a number of times throughout his life. The year 1933 saw the beginnings of Turing's interest in mathematical logic. He read a paper to the Moral Science Club at Cambridge in December of that year of which the following minute was recorded:-

Compiled Deepanjal Shrestha 25 of 33 Social and Professional Issues in IT Pioneers of Computing

A M Turing read a paper on "Mathematics and logic". He suggested that a purely logistic view of mathematics was inadequate; and that mathematical propositions possessed a variety of interpretations of which the logistic was merely one. Of course 1933 was also the year of Hitler's rise in Germany and of an anti-war movement in Britain. Turing joined the anti-war movement but he did not drift towards Marxism, nor pacifism, as happened to many. Turing graduated in 1934 then, in the spring of 1935, he attended Max Newman's advanced course on the foundations of mathematics. This course studied Gödel's incompleteness results and Hilbert's question on decidability. In one sense 'decidability' was a simple question, namely given a mathematical proposition could one find an algorithm which would decide if the proposition was true or false. For many propositions it was easy to find such an algorithm. The real difficulty arose in proving that for certain propositions no such algorithm existed. When given an algorithm to solve a problem it was clear that it was indeed an algorithm, yet there was no definition of an algorithm which was rigorous enough to allow one to prove that none existed. Turing began to work on these ideas. Turing was elected a fellow of King's College, Cambridge, in 1935 for a dissertation On the Gaussian error function which proved fundamental results in probability theory, namely the central limit theorem. Although the central limit theorem had recently been discovered, Turing was not aware of this and discovered it independently. In 1936 Turing was a Smith's Prizeman. Turing's achievements at Cambridge had been on account of his work in probability theory. However, he had been working on the decidability questions since attending Newman's course. In 1936 he published On Computable Numbers, with an application to the Entscheidungsproblem.
It is in this paper that Turing introduced an abstract machine, now called a "Turing machine", which moved from one state to another using a precise finite set of rules (given by a finite table) and depending on a single symbol it read from a tape. The Turing machine could write a symbol on the tape, or delete a symbol from the tape. Turing wrote:- Some of the symbols written down will form the sequence of figures which is the decimal of the real number which is being computed. The others are just rough notes to "assist the memory". It will only be these rough notes which will be liable to erasure. He defined a computable number as a real number whose decimal expansion could be produced by a Turing machine starting with a blank tape. He showed that π was computable, but since only countably many real numbers are computable, most real numbers are not computable. He then described a number which is not computable and remarked that this seems to be a paradox, since he appears to have described, in finite terms, a number which cannot be described in finite terms. However, Turing understood the source of the apparent paradox: it is impossible to decide (using another Turing machine) whether a Turing machine with a given table of instructions will output an infinite sequence of numbers. Although this paper contains ideas which have proved of fundamental importance to mathematics and to computer science ever since it appeared, publishing it in the Proceedings of the London Mathematical Society did not prove easy. The reason was that Alonzo Church published An unsolvable problem in elementary number theory in the American Journal of Mathematics in 1936, which also proves that there is no decision procedure for arithmetic. Turing's approach is very different from that of Church, but Newman had to argue the case for publication of Turing's paper before the London Mathematical Society would publish it.
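Turing's abstract machine, a finite table of rules driving a read/write head over a tape, is easy to simulate. The Python sketch below uses an illustrative table format and example machine of our own devising, not Turing's original notation:

```python
def run_turing_machine(table, tape, state="start", steps=100):
    """Simulate a one-tape Turing machine. `table` maps
    (state, symbol) -> (write_symbol, move, next_state), where move is
    -1 (left) or +1 (right). The machine halts when no rule applies
    or the step budget runs out."""
    cells = dict(enumerate(tape))     # sparse tape; blank cells read as "_"
    head = 0
    for _ in range(steps):
        symbol = cells.get(head, "_")
        if (state, symbol) not in table:
            break                     # no matching rule: the machine halts
        write, move, state = table[(state, symbol)]
        cells[head] = write           # write (or overwrite) the current cell
        head += move                  # move the head one cell left or right
    return "".join(cells[i] for i in sorted(cells))

# A two-rule machine that flips every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
}
print(run_turing_machine(flip, "1011"))  # 0100
```

A universal machine, in Turing's sense, is one whose table interprets another machine's table written on the tape itself, which is why his 1936 construction is seen as anticipating the stored-program computer.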
Turing's revised paper contains a reference to Church's results and the paper, first completed in April 1936, was revised in this way in August 1936 and it appeared in print in 1937. A good feature of the resulting discussions with Church was that Turing became a graduate student at Princeton University in 1936. At Princeton, Turing undertook research under Church's supervision

and he returned to England in 1938, having been back in England for the summer vacation in 1937 when he first met Wittgenstein. The major publication which came out of his work at Princeton was Systems of Logic Based on Ordinals which was published in 1939. Newman writes:- This paper is full of interesting suggestions and ideas. ... [It] throws much light on Turing's views on the place of intuition in mathematical proof. Before this paper appeared, Turing published two other papers on rather more conventional mathematical topics. One of these papers discussed methods of approximating Lie groups by finite groups. The other paper proves results on extensions of groups, which were first proved by Reinhold Baer, giving a simpler and more unified approach. Perhaps the most remarkable feature of Turing's work on Turing machines was that he was describing a modern computer before technology had reached the point where construction was a realistic proposition. He had proved in his 1936 paper that a universal Turing machine existed:- ... which can be made to do the work of any special-purpose machine, that is to say to carry out any piece of computing, if a tape bearing suitable "instructions" is inserted into it. Although to Turing a "computer" was a person who carried out a computation, we must see in his description of a universal Turing machine what we today think of as a computer with the tape as the program. While at Princeton Turing had played with the idea of constructing a computer. Once back at Cambridge in 1938 he started to build an analogue mechanical device to investigate the Riemann hypothesis, which many consider today the biggest unsolved problem in mathematics. However, his work would soon take on a new aspect for he was contacted, soon after his return, by the Government Code and Cypher School who asked him to help them in their work on breaking the German Enigma codes.
When war was declared in 1939 Turing immediately moved to work full-time at the Government Code and Cypher School at Bletchley Park. Although the work carried out at Bletchley Park was covered by the Official Secrets Act, much has recently become public knowledge. Turing's brilliant ideas in solving codes, and developing computers to assist in breaking them, may have saved more lives of military personnel in the course of the war than any other. It was also a happy time for him [13]:- ... perhaps the happiest of his life, with full scope for his inventiveness, a mild routine to shape the day, and a congenial set of fellow-workers. Together with another mathematician W G Welchman, Turing developed the Bombe, a machine based on earlier work by Polish mathematicians, which from late 1940 was decoding all messages sent by the Enigma machines of the Luftwaffe. The Enigma machines of the German navy were much harder to break but this was the type of challenge which Turing enjoyed. By the middle of 1941 Turing's statistical approach, together with captured information, had led to the German navy signals being decoded at Bletchley. From November 1942 until March 1943 Turing was in the United States liaising over decoding issues and also on a speech secrecy system. Changes in the way the Germans encoded their messages had meant that Bletchley lost the ability to decode the messages. Turing was not directly involved with the successful breaking of these more complex codes, but his ideas proved of the greatest importance in this work. Turing was awarded the O.B.E. in 1945 for his vital contribution to the war effort. At the end of the war Turing was invited by the National Physical Laboratory in London to design a computer. His report proposing the Automatic Computing Engine (ACE) was submitted in March 1946. Turing's design was at that point an original detailed design and prospectus for a computer in the modern sense.
The size of storage he planned for the ACE was regarded as hopelessly over-ambitious by most of those who considered the report, and there were delays in the project being approved.

Compiled Deepanjal Shrestha 27 of 33 Social and Professional Issues in IT Pioneers of Computing

Turing returned to Cambridge for the academic year 1947-48, where his interests ranged over many topics far removed from computers or mathematics; in particular he studied neurology and physiology. He did not forget about computers during this period, however, and he wrote code for programming computers. He had interests outside the academic world too, having taken up athletics seriously after the end of the war. He was a member of Walton Athletic Club, winning their 3-mile and 10-mile championships in record time. He ran in the A.A.A. Marathon in 1947 and was placed fifth. By 1948 Newman was the professor of mathematics at the University of Manchester and he offered Turing a readership there. Turing resigned from the National Physical Laboratory to take up the post in Manchester. Newman writes that in Manchester:- ... work was beginning on the construction of a computing machine by F C Williams and T Kilburn. The expectation was that Turing would lead the mathematical side of the work, and for a few years he continued to work, first on the design of the subroutines out of which the larger programs for such a machine are built, and then, as this kind of work became standardised, on more general problems of numerical analysis. In 1950 Turing published Computing machinery and intelligence in Mind. It is another remarkable work from his brilliantly inventive mind which seemed to foresee the questions which would arise as computers developed. He studied problems which today lie at the heart of artificial intelligence. It was in this 1950 paper that he proposed the Turing Test, which is still today the test people apply in attempting to answer whether a computer can be intelligent:- ... he became involved in discussions on the contrasts and similarities between machines and brains. Turing's view, expressed with great force and wit, was that it was for those who saw an unbridgeable gap between the two to say just where the difference lay.
Turing did not forget about questions of decidability, which had been the starting point for his brilliant mathematical publications. One of the main problems in the theory of group presentations was the question: given any word in a finitely presented group, is there an algorithm to decide whether the word is equal to the identity? Post had proved that for semigroups no such algorithm exists. Turing thought at first that he had proved the same result for groups but, just before giving a seminar on his proof, he discovered an error. He was able to rescue from his faulty proof the fact that there was a cancellative semigroup with insoluble word problem, and he published this result in 1950. Boone used the ideas from this paper by Turing to prove the existence of a group with insoluble word problem in 1957. Turing was elected a Fellow of the Royal Society of London in 1951, mainly for his work on Turing machines in 1936. By 1951 he was working on the application of mathematical theory to biological forms. In 1952 he published the first part of his theoretical study of morphogenesis, the development of pattern and form in living organisms. Turing was arrested for violation of British homosexuality statutes in 1952 when he reported to the police details of a homosexual affair. He had gone to the police because he had been threatened with blackmail. He was tried as a homosexual on 31 March 1952, offering no defence other than that he saw nothing wrong in his actions. Found guilty, he was given the alternatives of prison or oestrogen injections for a year. He accepted the latter and returned to a wide range of academic pursuits. Not only did he press forward with further study of morphogenesis, but he also worked on new ideas in quantum theory, on the representation of elementary particles by spinors, and on relativity theory. Although he was completely open about his sexuality, he had a further unhappiness which he was forbidden to talk about due to the Official Secrets Act.
The decoding operation at Bletchley Park became the basis for the new decoding and intelligence work at GCHQ. With the cold war this became an important operation, and Turing continued to work for GCHQ, although his Manchester colleagues were totally unaware of this. After his conviction, his security clearance was withdrawn. Worse than that, security officers were now extremely worried that someone with complete knowledge of the work going on at GCHQ was now labelled a security risk. He had many foreign colleagues, as any academic would, but the police began to investigate his foreign visitors. A holiday which Turing took in Greece in 1953 caused consternation among the security officers. Turing died of potassium cyanide poisoning while conducting electrolysis experiments. The cyanide was found on a half-eaten apple beside him. An inquest concluded that it was self-administered, but his mother always maintained that it was an accident.

Ray Holt born, city, USA

Achievement: invented a central processor in 1971. He built what is called a bit-slice computer, using several elements such as the 74181 and others to make a computer of expandable word width... This is not a microcomputer by any means! Once you go outside of one chip for the main control of the organization (all registers, program counter, and arithmetic on one piece of silicon), you are outside the realm of the microcomputer.

John von Neumann

John von Neumann was born János von Neumann. He was called Jancsi as a child, a diminutive form of János, then later he was called Johnny in the United States. His father, Max Neumann, was a top banker and he was brought up in an extended family, living in Budapest where as a child he learnt languages from the German and French governesses that were employed. Although the family were Jewish, Max Neumann did not observe the strict practices of that religion and the household seemed to mix Jewish and Christian traditions.

It is also worth explaining how Max Neumann's son acquired the "von" to become János von Neumann. In 1913 Max Neumann purchased a title but did not change his name. His son, however, used the German form von Neumann, where the "von" indicated the title. As a child von Neumann showed he had an incredible memory. Poundstone writes:- At the age of six, he was able to exchange jokes with his father in classical Greek. The Neumann family sometimes entertained guests with demonstrations of Johnny's ability to memorise phone books. A guest would select a page and column of the phone book at random. Young Johnny read the column over a few times, then handed the book back to the guest. He could answer any question put to him (who has number such and such?) or recite names, addresses, and numbers in order. In 1911 von Neumann entered the Lutheran Gymnasium. The school had a strong academic tradition which seemed to count for more than the religious affiliation both in the Neumanns' eyes and in those of the school. His mathematics teacher quickly recognised von Neumann's genius and special tuition was put on for him. The school had another outstanding mathematician one year ahead of von Neumann, namely Eugene Wigner. World War I had relatively little effect on von Neumann's education but, after the war ended, Béla Kun controlled Hungary for five months in 1919 with a Communist government. The Neumann family fled to Austria as the affluent came under attack. However, after a month, they returned to face the problems of Budapest. When Kun's government failed, the fact that it had been largely
composed of Jews meant that Jewish people were blamed. Such situations are devoid of logic and the fact that the Neumanns were opposed to Kun's government did not save them from persecution. In 1921 von Neumann completed his education at the Lutheran Gymnasium. His first mathematics paper, written jointly with Fekete, the assistant at the University of Budapest who had been tutoring him, was published in 1922. However Max Neumann did not want his son to take up a subject that would not bring him wealth. Max Neumann asked Theodore von Kármán to speak to his son and persuade him to follow a career in business. Perhaps von Kármán was the wrong person to ask to undertake such a task, but in the end all agreed on the compromise subject of chemistry for von Neumann's university studies. Hungary was not an easy country for those of Jewish descent for many reasons and there was a strict limit on the number of Jewish students who could enter the University of Budapest. Of course, even with a strict quota, von Neumann's record easily won him a place to study mathematics in 1921, but he did not attend lectures. Instead he also entered the University of Berlin in 1921 to study chemistry. Von Neumann studied chemistry at the University of Berlin until 1923 when he went to Zurich. He achieved outstanding results in the mathematics examinations at the University of Budapest despite not attending any courses. Von Neumann received his diploma in chemical engineering from the Technische Hochschule in Zurich in 1926. While in Zurich he continued his interest in mathematics, despite studying chemistry, and interacted with Weyl and Pólya, who were both at Zurich. He even took over one of Weyl's courses when Weyl was absent from Zurich for a time. Pólya said:- Johnny was the only student I was ever afraid of.
If in the course of a lecture I stated an unsolved problem, the chances were he'd come to me as soon as the lecture was over, with the complete solution in a few scribbles on a slip of paper. Von Neumann received his doctorate in mathematics from the University of Budapest, also in 1926, with a thesis on set theory. He published a definition of ordinal numbers when he was 20; the definition is the one used today. Von Neumann lectured at Berlin from 1926 to 1929 and at Hamburg from 1929 to 1930. However he also held a Rockefeller Fellowship to enable him to undertake postdoctoral studies at the University of Göttingen. He studied under Hilbert at Göttingen during 1926-27. By this time von Neumann had achieved celebrity status:- By his mid-twenties, von Neumann's fame had spread worldwide in the mathematical community. At academic conferences, he would find himself pointed out as a young genius. Veblen invited von Neumann to Princeton to lecture on quantum theory in 1929. Replying to Veblen that he would come after attending to some personal matters, von Neumann went to Budapest where he married his fiancée Marietta Kovesi before setting out for the United States. In 1930 von Neumann became a visiting lecturer at Princeton University, being appointed professor there in 1931. Between 1930 and 1933 von Neumann taught at Princeton, but this was not one of his strong points:- His fluid line of thought was difficult for those less gifted to follow. He was notorious for dashing out equations on a small portion of the available blackboard and erasing expressions before students could copy them. In contrast, however, he had an ability to explain complicated ideas in physics:- For a man to whom complicated mathematics presented no difficulty, he could explain his conclusions to the uninitiated with amazing lucidity. After a talk with him one always came away with a feeling that the problem was really simple and transparent.
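The definition of ordinal numbers referred to above is concrete enough to demonstrate: in von Neumann's formulation, each ordinal is simply the set of all smaller ordinals, so 0 is the empty set and the successor of n is n together with n itself, n ∪ {n}. A small illustrative sketch in Python, using frozensets as stand-ins for sets:

```python
# Von Neumann's definition of the finite ordinals: 0 is the empty set,
# and each successor is n + 1 = n ∪ {n}, so every ordinal is exactly
# the set of all smaller ordinals.

def ordinal(n):
    """Return the von Neumann ordinal n as a frozenset."""
    result = frozenset()                 # 0 = {}
    for _ in range(n):
        result = result | {result}       # successor step: n ∪ {n}
    return result

three = ordinal(3)
print(len(three))                # 3: an ordinal has as many elements as its value
print(ordinal(2) in three)       # True: every smaller ordinal is an element
print(ordinal(2) < three)        # True: and also a proper subset
```

The pleasant consequence, visible in the last two lines, is that "less than", "element of" and "proper subset of" all coincide for ordinals under this definition.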


He became one of the original six mathematics professors (J W Alexander, A Einstein, M Morse, O Veblen, J von Neumann and H Weyl) in 1933 at the newly founded Institute for Advanced Study in Princeton, a position he kept for the remainder of his life. During the first years that he was in the United States, von Neumann continued to return to Europe during the summers. Until 1933 he still held academic posts in Germany but resigned these when the Nazis came to power. Unlike many others, von Neumann was not a political refugee; rather he went to the United States mainly because he thought that the prospect of academic positions there was better than in Germany. In 1933 von Neumann became co-editor of the Annals of Mathematics and, two years later, he became co-editor of Compositio Mathematica. He held both these editorships until his death. Von Neumann and Marietta had a daughter Marina in 1936, but their marriage ended in divorce in 1937. The following year he married Klára Dán, also from Budapest, whom he met on one of his European visits. After marrying, they sailed to the United States and made their home in Princeton. There von Neumann lived a rather unusual lifestyle for a top mathematician. He had always enjoyed parties:- Parties and nightlife held a special appeal for von Neumann. While teaching in Germany, von Neumann had been a denizen of the Cabaret-era Berlin nightlife circuit. Now married to Klára, the parties continued:- The parties at the von Neumanns' house were frequent, and famous, and long. Ulam summarises von Neumann's work. He writes:- In his youthful work, he was concerned not only with mathematical logic and the axiomatics of set theory, but, simultaneously, with the substance of set theory itself, obtaining interesting results in measure theory and the theory of real variables.
It was in this period also that he began his classical work on quantum theory, the mathematical foundation of the theory of measurement in quantum theory and the new statistical mechanics. His text Mathematische Grundlagen der Quantenmechanik (1932) built a solid framework for the new quantum mechanics. Van Hove writes:- Quantum mechanics was very fortunate indeed to attract, in the very first years after its discovery in 1925, the interest of a mathematical genius of von Neumann's stature. As a result, the mathematical framework of the theory was developed and the formal aspects of its entirely novel rules of interpretation were analysed by one single man in two years (1927-1929). Self-adjoint algebras of bounded linear operators on a Hilbert space, closed in the weak operator topology, were introduced in 1929 by von Neumann in a paper in Mathematische Annalen. Kadison explains:- His interest in ergodic theory, group representations and quantum mechanics contributed significantly to von Neumann's realisation that a theory of operator algebras was the next important stage in the development of this area of mathematics. Such operator algebras were called "rings of operators" by von Neumann and later they were called W*-algebras by some other mathematicians. J Dixmier, in 1957, called them "von Neumann algebras" in his monograph Algebras of operators in Hilbert space (von Neumann algebras). In the second half of the 1930s and the early 1940s von Neumann, working with his collaborator F J Murray, laid the foundations for the study of von Neumann algebras in a fundamental series of papers. However von Neumann is known for the wide variety of his scientific studies. Ulam explains how he was led towards game theory:-
Von Neumann's awareness of results obtained by other mathematicians and the inherent possibilities which they offer is astonishing. Early in his work, a paper by Borel on the minimax property led him to develop ... ideas which culminated later in one of his most original creations, the theory of games. In game theory von Neumann proved the minimax theorem. He gradually expanded his work in game theory, and with co-author Oskar Morgenstern, he wrote the classic text Theory of Games and Economic Behaviour (1944). Ulam continues:- An idea of Koopman on the possibilities of treating problems of classical mechanics by means of operators on a function space stimulated him to give the first mathematically rigorous proof of an ergodic theorem. Haar's construction of measure in groups provided the inspiration for his wonderful partial solution of Hilbert's fifth problem, in which he proved the possibility of introducing analytical parameters in compact groups. In 1938 the American Mathematical Society awarded the Bôcher Prize to John von Neumann for his memoir Almost periodic functions and groups. This was published in two parts in the Transactions of the American Mathematical Society, the first part in 1934 and the second part in the following year. Around this time von Neumann turned to applied mathematics :- In the middle 30's, Johnny was fascinated by the problem of hydrodynamical turbulence. It was then that he became aware of the mysteries underlying the subject of non-linear partial differential equations. His work, from the beginnings of the Second World War, concerns a study of the equations of hydrodynamics and the theory of shocks. The phenomena described by these non-linear equations are baffling analytically and defy even qualitative insight by present methods. Numerical work seemed to him the most promising way to obtain a feeling for the behaviour of such systems. This impelled him to study new possibilities of computation on electronic machines ... 
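The minimax theorem mentioned above says that in a finite two-player zero-sum game, once mixed strategies are allowed, the best payoff the maximising player can guarantee equals the best bound the minimising player can enforce. The sketch below illustrates this numerically for a 2x2 game using a coarse grid search; this is an illustration only, not the exact linear-programming solution one would use in practice.

```python
# Numerical illustration of von Neumann's minimax theorem for a 2x2
# zero-sum game: with mixed strategies, max_p min_q E[payoff] equals
# min_q max_p E[payoff].

def expected_payoff(A, p, q):
    """Row player's expected payoff when rows are played with
    probabilities (p, 1-p) and columns with (q, 1-q)."""
    return (p * q * A[0][0] + p * (1 - q) * A[0][1]
            + (1 - p) * q * A[1][0] + (1 - p) * (1 - q) * A[1][1])

def game_values(A, steps=200):
    """Approximate the two sides of the minimax identity by grid search."""
    grid = [i / steps for i in range(steps + 1)]
    # Row player: choose p to maximise the worst case over q.
    maximin = max(min(expected_payoff(A, p, q) for q in grid) for p in grid)
    # Column player: choose q to minimise the best case over p.
    minimax = min(max(expected_payoff(A, p, q) for p in grid) for q in grid)
    return maximin, minimax

# Matching pennies: the game's value is 0, achieved by the mixed
# strategy (1/2, 1/2) for both players.
A = [[1, -1], [-1, 1]]
lo, hi = game_values(A)
print(abs(lo) < 1e-6, abs(hi) < 1e-6)   # both True: the two values agree
```

Note that with pure strategies only, the two sides of the identity differ for matching pennies (-1 versus +1); the agreement above is exactly the point of allowing mixed strategies.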
Von Neumann was one of the pioneers of computer science, making significant contributions to the development of logical design. Shannon writes:- Von Neumann spent a considerable part of the last few years of his life working in [automata theory]. It represented for him a synthesis of his early interest in logic and proof theory and his later work, during World War II and after, on large scale electronic computers. Involving a mixture of pure and applied mathematics as well as other sciences, automata theory was an ideal field for von Neumann's wide-ranging intellect. He brought to it many new insights and opened up at least two new directions of research. He advanced the theory of cellular automata, advocated the adoption of the bit as a measurement of computer memory, and solved problems in obtaining reliable answers from unreliable computer components. During and after World War II, von Neumann served as a consultant to the armed forces. His valuable contributions included a proposal of the implosion method for bringing nuclear fuel to explosion and his participation in the development of the hydrogen bomb. From 1940 he was a member of the Scientific Advisory Committee at the Ballistic Research Laboratories at the Aberdeen Proving Ground in Maryland. He was a member of the Navy Bureau of Ordnance from 1941 to 1955, and a consultant to the Los Alamos Scientific Laboratory from 1943 to 1955. From 1950 to 1955 he was a member of the Armed Forces Special Weapons Project in Washington, D.C. In 1955 President Eisenhower appointed him to the Atomic Energy Commission, and in 1956 he received its Enrico Fermi Award, knowing that he was incurably ill with cancer. Eugene Wigner wrote of von Neumann's death:- When von Neumann realised he was incurably ill, his logic forced him to realise that he would cease to exist, and hence cease to have thoughts ...
It was heartbreaking to watch the frustration of his mind, when all hope was gone, in its struggle with the fate which appeared to him unavoidable but unacceptable.


von Neumann's death is described in these terms:- ... his mind, the amulet on which he had always been able to rely, was becoming less dependable. Then came complete psychological breakdown; panic, screams of uncontrollable terror every night. His friend said, "I think that von Neumann suffered more when his mind would no longer function, than I have ever seen any human being suffer." Von Neumann's sense of invulnerability, or simply the desire to live, was struggling with unalterable facts. He seemed to have a great fear of death until the last... No achievements and no amount of influence could save him now, as they always had in the past. Johnny von Neumann, who knew how to live so fully, did not know how to die. It would be almost impossible to give even an idea of the range of honours which were given to von Neumann. He was Colloquium Lecturer of the American Mathematical Society in 1937 and received its Bôcher Prize as mentioned above. He held the Gibbs Lectureship of the American Mathematical Society in 1947 and was President of the Society in 1951-53. He was elected to many academies including the Academia Nacional de Ciencias Exactas (Lima, Peru), Academia Nazionale dei Lincei (Rome, Italy), American Academy of Arts and Sciences (USA), American Philosophical Society (USA), Instituto Lombardo di Scienze e Lettere (Milan, Italy), National Academy of Sciences (USA) and Royal Netherlands Academy of Sciences and Letters (Amsterdam, The Netherlands). Von Neumann received two Presidential Awards, the Medal for Merit in 1947 and the Medal of Freedom in 1956. Also in 1956 he received the Albert Einstein Commemorative Award and the Enrico Fermi Award mentioned above. Peierls writes:- He was the antithesis of the "long-haired" mathematics don. Always well groomed, he had as lively views on international politics and practical affairs as on mathematical problems.

Konrad Zuse (June 22, 1910 in Berlin)

Inventor of the first freely programmable computer. In 1935 he took his degree in civil engineering at the Polytechnical Institute of Berlin-Charlottenburg (predecessor of the Technische Universität Berlin). His first job was in the German aircraft industry. Zuse left this job early to set up an "inventor's workshop" in the living room of his parents' apartment in Berlin, where he had his first idea to construct a "mechanical brain".
