1956 and the Origins of Artificial Intelligence Computing
Recommended publications
Artificial Consciousness and the Consciousness-Attention Dissociation
Consciousness and Cognition 45 (2016) 210–225. Review article. Journal homepage: www.elsevier.com/locate/concog
Harry Haroutioun Haladjian (Laboratoire Psychologie de la Perception, CNRS UMR 8242, Université Paris Descartes, Centre Biomédical des Saints-Pères, 45 rue des Saints-Pères, 75006 Paris, France) and Carlos Montemayor (San Francisco State University, Philosophy Department, 1600 Holloway Avenue, San Francisco, CA 94132, USA)
Article history: received 6 July 2016; accepted 12 August 2016.
Keywords: artificial intelligence, artificial consciousness, consciousness, visual attention, phenomenology, emotions, empathy.
Abstract: Artificial Intelligence is at a turning point, with a substantial increase in projects aiming to implement sophisticated forms of human intelligence in machines. This research attempts to model specific forms of intelligence through brute-force search heuristics and also reproduce features of human perception and cognition, including emotions. Such goals have implications for artificial consciousness, with some arguing that it will be achievable once we overcome short-term engineering challenges. We believe, however, that phenomenal consciousness cannot be implemented in machines. This becomes clear when considering emotions and examining the dissociation between consciousness and attention in humans. While we may be able to program ethical behavior based on rules and machine learning, we will never be able to reproduce emotions or empathy by programming such control systems—these will be merely simulations. Arguments in favor of this claim include considerations about evolution, the neuropsychological aspects of emotions, and the dissociation between attention and consciousness found in humans.
Biographies of Computer Scientists
1 Charles Babbage
26 December 1791 (London, UK) – 18 October 1871 (London, UK)

Life and Times
Charles Babbage was born into a wealthy family, and started his mathematics education very early. By 1811, when he went to Trinity College, Cambridge, he found that he knew more mathematics than his professors. He moved to Peterhouse, Cambridge, from where he graduated in 1814. However, rather than come second to his friend Herschel in the final examinations, Babbage decided not to compete for an honors degree. In 1815 he co-founded the Analytical Society, dedicated to studying continental reforms of Newton's formulation of "The Calculus". He was one of the founders of the Astronomical Society in 1820.

In 1821 Babbage started work on his Difference Engine, designed to accurately compile tables. Babbage received government funding to construct an actual machine, but the government stopped the funding in 1832 when it became clear that its construction was running well over budget. Georg Scheutz completed a machine based on the design of the Difference Engine in 1854. On completing the design of the Difference Engine, Babbage started work on the Analytical Engine, capable of more general symbolic manipulations. The design of the Analytical Engine was complete in 1856, but a complete machine would not be constructed for over a century.

Babbage's interests were wide. It is claimed that he invented cow-catchers for railway engines, the uniform postal rate, and a means of recognizing lighthouses. He was also interested in locks and ciphers. He was politically active and wrote many treatises. One of the more famous proposed the banning of street musicians.
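The entry above notes that the Difference Engine was designed to compile tables. As an illustration only, and not a description of Babbage's mechanism, the following Python sketch tabulates a polynomial by the method of finite differences, which reduces table-making to repeated addition; the function names are invented for this example.

# Hedged sketch: the method of finite differences that the Difference Engine
# mechanized. For a degree-n polynomial the n-th differences are constant,
# so the table can be extended using additions alone.

def leading_differences(values, order):
    """From the first few tabulated values, return [f(0), 1st difference, ..., order-th difference]."""
    table = [list(values)]
    for _ in range(order):
        prev = table[-1]
        table.append([b - a for a, b in zip(prev, prev[1:])])
    return [col[0] for col in table]

def extend_table(leading, steps):
    """Produce further table entries by cascaded additions, as the engine would."""
    diffs = list(leading)
    out = []
    for _ in range(steps):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]   # each register is incremented by the one below it
    return out

if __name__ == "__main__":
    f = lambda x: x * x + x + 41                 # a sample second-degree polynomial
    seed = [f(x) for x in range(3)]              # enough values to form 2nd differences
    print(extend_table(leading_differences(seed, 2), 8))
    # -> [41, 43, 47, 53, 61, 71, 83, 97]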
Computability and Complexity
Computability and Complexity
Lecture Notes, Herbert Jaeger, Jacobs University Bremen

Version history:
- Jan 30, 2018: created as copy of CC lecture notes of Spring 2017
- Feb 16, 2018: cleaned up font conversion hiccups up to Section 5
- Feb 23, 2018: cleaned up font conversion hiccups up to Section 6
- Mar 15, 2018: cleaned up font conversion hiccups up to Section 7
- Apr 5, 2018: cleaned up font conversion hiccups up to the end of these lecture notes

1 Introduction
1.1 Motivation
This lecture will introduce you to the theory of computation and the theory of computational complexity. The theory of computation offers full answers to the questions:
- what problems can in principle be solved by computer programs?
- what functions can in principle be computed by computer programs?
- what formal languages can in principle be decided by computer programs?
Full answers to these questions have been found in the last 70 years or so, and we will learn about them. (And it turns out that these questions are all the same question.) The theory of computation is well-established, transparent, and basically simple (you might disagree at first). The theory of complexity offers many insights into questions like:
- for a given problem / function / language that has to be solved / computed / decided by a computer program, how long does the fastest program actually run?
- how much memory space has to be used at least?
- can you speed up computations by using different computer architectures or different programming approaches?
The theory of complexity is historically younger than the theory of computation; the first surge of results came in the 1960s.
Women in Computing
History of Computing, CSE P590A (UW), PP190/290-3 (UCB), CSE 290 291 (D00)
Women in Computing
Katherine Deibel, University of Washington, [email protected]

An Amazing Photo
Philadelphia Inquirer, "Your Neighbors" article, 8/13/1957

Diversity Crisis in Computer Science
Percentage of CS/IS bachelor degrees awarded to women (National Center for Education Statistics, 2001)

Goals of this talk
- Highlight the many accomplishments made by women in the computing field
- Learn their stories, both good and bad

Augusta Ada King, Countess of Lovelace
- Translated and extended Menabrea's article on Babbage's Analytical Engine
- Predicted computers could be used for music and graphics
- Wrote the first algorithm: how to compute Bernoulli numbers
- Developed notions of looping and subroutines

Garbage In, Garbage Out
"The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths." — Ada Lovelace, Note G

On her genius and insight
"If you are as fastidious about the acts of your friendship as you are about those of your pen, I much fear I shall equally lose your friendship and your Notes. I am very reluctant to return your admirable & philosophic 'Note A.' Pray do not alter it… All this was impossible for you to know by intuition and the more I read your notes the more surprised I am at them and regret not having earlier explored so rich a vein of the noblest metal."
Inventing Computational Rhetoric
INVENTING COMPUTATIONAL RHETORIC
By Michael W. Wojcik
A THESIS submitted to Michigan State University in partial fulfillment of the requirements for the degree of Digital Rhetoric and Professional Writing — Master of Arts, 2013

ABSTRACT
Many disciplines in the humanities are developing "computational" branches which make use of information technology to process large amounts of data algorithmically. The field of computational rhetoric is in its infancy, but we are already seeing interesting results from applying the ideas and goals of rhetoric to text processing and related areas. After considering what computational rhetoric might be, three approaches to inventing computational rhetorics are presented: a structural schema, a review of extant work, and a theoretical exploration.

Copyright by MICHAEL W. WOJCIK, 2013

For Malea

ACKNOWLEDGEMENTS
Above all else I must thank my beloved wife, Malea Powell, without whose prompting this thesis would have remained forever incomplete. I am also grateful for the presence in my life of my terrific stepdaughter, Audrey Swartz, and wonderful granddaughter Lucille. My thesis committee, Dean Rehberger, Bill Hart-Davidson, and John Monberg, provided me with generous guidance and inspiration. Other faculty members at Michigan State who helped me explore relevant ideas include Rochelle Harris, Mike McLeod, Joyce Chai, Danielle Devoss, and Bump Halbritter. My previous academic program at Miami University did not result in a degree, but faculty there also contributed greatly to my theoretical understanding, particularly Susan Morgan, Mary-Jean Corbett, Brit Harwood, J. Edgar Tidwell, Lori Merish, Vicki Smith, Alice Adams, Fran Dolan, and Keith Tuma.
Ada Lovelace the first Computer Programmer 1815 - 1852
Ada Lovelace
The first computer programmer, 1815 - 1852

Biography
- Born on December 10th, 1815 in London as Augusta Ada Byron
- Parents separated when she was a baby
- Father Lord Byron was a poet and died when she was 8 years old
- Mother Lady Wentworth was a social reformer
- Descended from a wealthy family
- Early interest in mathematics and science, encouraged by her mother
- Obtained private classes and got in touch with intellectuals, e.g. Mary Somerville, who tutored her and later introduced Lovelace to Charles Babbage at the age of 17
- Married William King in 1835 at the age of 19, shortly after becoming the Countess of Lovelace
- By 1839, she had given birth to 3 children
- Continued studying maths, supported among others by Augustus De Morgan, a math professor in London who taught her via correspondence
- In 1843, she published a translation of an Italian academic paper about Babbage's Analytical Engine and added her famous note section (see Contributions)
- Died on November 27th, 1852 at the age of 36

Ada Lovelace Day
Each second Tuesday in October is Ada Lovelace Day, a day to raise the profile of women in science, technology, engineering, and maths to create new role models for girls and women in these fields. During this day the accomplishments of those women are celebrated.

Portrait
Figure 3: Ada Lovelace

Contributions
- First computer programmer, roughly a century before the electronic computer
- A correspondence with Babbage lasting two decades about his idea of an Analytical Engine
- Developed an algorithm that would enable the Analytical Engine to calculate a sequence of Bernoulli numbers; unfortunately, the machine was never built
- First person to realize the power of computer programs: not only used for calculations with numbers
- Combined arts and logic, calling it poetical science
- First reflections about artificial intelligence, but she rejected the idea

Bernoulli Numbers
- Play an important role in several domains of mathematics, e.g.

Quotes
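The Contributions section above mentions Lovelace's algorithm for having the Analytical Engine calculate Bernoulli numbers. Purely as a modern illustration of that sequence, and not as a reconstruction of the program in Note G, here is a short Python sketch that computes the first Bernoulli numbers from a standard recurrence; the function name and the B_1 = -1/2 convention are choices made for this example.

# Hedged sketch: Bernoulli numbers via the recurrence
# sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1.
from fractions import Fraction

def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions (B_1 = -1/2 convention)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = Fraction(0)
        comb = 1                                  # C(m+1, 0)
        for j in range(m):
            acc += comb * B[j]
            comb = comb * (m + 1 - j) // (j + 1)  # update to C(m+1, j+1)
        B.append(-acc / (m + 1))                  # since C(m+1, m) = m + 1
    return B

if __name__ == "__main__":
    for i, b in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {b}")   # B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, B_4 = -1/30, ...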
Computability Theory
CSC 438F/2404F Notes (S. Cook and T. Pitassi), Fall 2019

Computability Theory
This section is partly inspired by the material in "A Course in Mathematical Logic" by Bell and Machover, Chap. 6, sections 1-10. Other references: "Introduction to the Theory of Computation" by Michael Sipser, and "Computability, Complexity, and Languages" by M. Davis and E. Weyuker.

Our first goal is to give a formal definition for what it means for a function on N to be computable by an algorithm. Historically the first convincing such definition was given by Alan Turing in 1936, in his paper which introduced what we now call Turing machines. Slightly before Turing, Alonzo Church gave a definition based on his lambda calculus. About the same time Gödel, Herbrand, and Kleene developed definitions based on recursion schemes. Fortunately all of these definitions are equivalent, and each of many other definitions proposed later are also equivalent to Turing's definition. This has led to the general belief that these definitions have got it right, and this assertion is roughly what we now call "Church's Thesis".

A natural definition of a computable function f on N allows for the possibility that f(x) may not be defined for all x in N, because algorithms do not always halt. Thus we will use the symbol ∞ to mean "undefined".

Definition: A partial function is a function f : (N ∪ {∞})^n → N ∪ {∞}, n ≥ 0, such that f(c_1, ..., c_n) = ∞ if some c_i = ∞.

In the context of computability theory, whenever we refer to a function on N, we mean a partial function in the above sense.
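As a concrete illustration (not part of the course notes), the following Python sketch shows how an ordinary program defines only a partial function: it searches with no a priori bound on the number of steps, and on any input where the search never terminates the value is the notes' ∞, i.e. undefined. The Collatz iteration is used here because it is not known to halt on all inputs.

# Hedged sketch: a partial function defined by unbounded search.
# collatz_steps(n) counts how many Collatz steps it takes to reach 1.
# No proof is known that the loop terminates for every n, so a priori this
# defines only a partial function, "undefined" on any counterexample.

def collatz_steps(n: int) -> int:
    steps = 0
    while n != 1:                                  # unbounded search
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

if __name__ == "__main__":
    print([collatz_steps(n) for n in range(1, 11)])
    # -> [0, 1, 7, 2, 5, 8, 16, 3, 19, 6]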
A Framework for Representing Knowledge Marvin Minsky MIT-AI Laboratory Memo 306, June, 1974. Reprinted in the Psychology of Comp
A Framework for Representing Knowledge
Marvin Minsky, MIT-AI Laboratory Memo 306, June 1974. Reprinted in The Psychology of Computer Vision, P. Winston (ed.), McGraw-Hill, 1975. Shorter versions in J. Haugeland (ed.), Mind Design, MIT Press, 1981, and in Cognitive Science, Collins, Allan and Edward E. Smith (eds.), Morgan-Kaufmann, 1992, ISBN 55860-013-2.

FRAMES
It seems to me that the ingredients of most theories both in Artificial Intelligence and in Psychology have been on the whole too minute, local, and unstructured to account–either practically or phenomenologically–for the effectiveness of common-sense thought. The "chunks" of reasoning, language, memory, and "perception" ought to be larger and more structured; their factual and procedural contents must be more intimately connected in order to explain the apparent power and speed of mental activities.

Similar feelings seem to be emerging in several centers working on theories of intelligence. They take one form in the proposal of Papert and myself (1972) to sub-structure knowledge into "micro-worlds"; another form in the "Problem-spaces" of Newell and Simon (1972); and yet another in new, large structures that theorists like Schank (1974), Abelson (1974), and Norman (1972) assign to linguistic objects. I see all these as moving away from the traditional attempts both by behavioristic psychologists and by logic-oriented students of Artificial Intelligence in trying to represent knowledge as collections of separate, simple fragments.

I try here to bring together several of these issues by pretending to have a unified, coherent theory. The paper raises more questions than it answers, and I have tried to note the theory's deficiencies.
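To make the idea of larger, structured "chunks" more concrete, here is a minimal sketch of a frame-like data structure with slots, default expectations, and inheritance from a more general frame. This is an illustration in the spirit of the memo, not Minsky's own formalism; all class and slot names are invented for the example.

# Hedged sketch: a frame bundles named slots whose default values can be
# overridden by specific observations, falling back on a more general frame.

class Frame:
    def __init__(self, name, parent=None, **defaults):
        self.name = name
        self.parent = parent          # more general frame to inherit from
        self.slots = dict(defaults)   # slot -> default or filled value

    def fill(self, slot, value):
        self.slots[slot] = value

    def get(self, slot):
        # Look in this frame first, then inherit from the parent frame.
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:
            return self.parent.get(slot)
        return None

# Usage: a generic "room" frame with expectations, specialized to an office.
room = Frame("room", walls=4, has_ceiling=True, typical_contents=["door"])
office = Frame("office", parent=room, typical_contents=["desk", "chair"])
office.fill("occupant", "a researcher")

print(office.get("walls"))             # 4, inherited default expectation
print(office.get("typical_contents"))  # ['desk', 'chair'], overridden locally
print(office.get("occupant"))          # 'a researcher', filled from observation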
Lovelace & Babbage and the Creation of the 1843 'Notes'
Lovelace & Babbage and the Creation of the 1843 ‘Notes’
John Fuegi and Jo Francis, Flare/MITH

Augusta Ada Lovelace worked with Charles Babbage to create a description of Babbage's unbuilt invention, the Analytical Engine, a highly advanced mechanical calculator often considered a forerunner of the electronic calculating computers of the 20th century. Ada Lovelace's "Notes," describing the Analytical Engine, published in Taylor's Scientific Memoirs in 1843, contained a ground-breaking description of the possibilities of programming the machine to go beyond number-crunching to "computing" in the wider sense in which we understand the term today. This article expands on research first presented by the authors in their documentary film, To Dream Tomorrow.

"What shall we do to get rid of Mr. Babbage and his calculating Machine? Surely if completed it would be worthless as far as science is concerned?" —British Prime Minister Sir Robert Peel, 1842 [1]

"The Analytical Engine does not occupy common ground with mere 'calculating machines.' … In enabling mechanism to combine together general symbols, in successions of unlimited variety and extent, a uniting link is established between the operations of matter and the abstract mental processes of the most abstract branch of mathematical science."

... known to have crossed the intellectual threshold between conceptualizing computing as only for calculation on the one hand, and on the other hand, computing as we know it today: with wider applications made possible by symbolic substitution.

In an early background interview at the Science Museum (London) for the historical documentary film about collaboration between Lovelace and Babbage, To Dream Tomorrow [3], Babbage authority Doron Swade mentioned that he thought Babbage and Lovelace had "very different qualities of mind." Swade's ...
A Brief History of Computers
History of Computers
http://www.cs.uah.edu/~rcoleman/Common/History/History.html

A Brief History of Computers
Where did these beasties come from?

Ancient Times
Early Man relied on counting on his fingers and toes (which, by the way, is the basis for our base 10 numbering system). He also used sticks and stones as markers. Later, notched sticks and knotted cords were used for counting. Finally came symbols written on hides, parchment, and later paper. Man invents the concept of number, then invents devices to help keep up with the numbers of his possessions.

Roman Empire
The ancient Romans developed an Abacus, the first "machine" for calculating. While it predates the Chinese abacus, we do not know if it was the ancestor of that abacus. Counters in the lower groove are 1 × 10^n, those in the upper groove are 5 × 10^n.

Industrial Age - 1600
John Napier, a Scottish nobleman and politician, devoted much of his leisure time to the study of mathematics. He was especially interested in devising ways to aid computations. His greatest contribution was the invention of logarithms. He inscribed logarithmic measurements on a set of 10 wooden rods and thus was able to do multiplication and division by matching up numbers on the rods. These became known as Napier's Bones.

1621 - The Sliderule
Napier invented logarithms, Edmund Gunter invented the logarithmic scales (lines etched on metal or wood), but it was William Oughtred, in England, who invented the sliderule. Using the concept of Napier's bones, he inscribed logarithms on strips of wood and invented the calculating "machine" which was used up until the mid-1970s, when the first hand-held calculators and microcomputers appeared.
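The passages above turn on one fact about logarithms: multiplication and division reduce to adding and subtracting logs, which rods or sliding scales can perform mechanically. Here is a tiny Python sketch of that principle, as an illustration only, not a model of any particular device; the function names are invented for the example.

import math

def multiply_via_logs(a: float, b: float) -> float:
    # log(a*b) = log(a) + log(b); exponentiating recovers the product.
    return math.exp(math.log(a) + math.log(b))

def divide_via_logs(a: float, b: float) -> float:
    # log(a/b) = log(a) - log(b)
    return math.exp(math.log(a) - math.log(b))

print(multiply_via_logs(23, 17))   # ~391.0, up to floating-point rounding
print(divide_via_logs(391, 17))    # ~23.0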
Introduction to the Theory of Computation Computability, Complexity, and the Lambda Calculus Some Notes for CIS262
Introduction to the Theory of Computation
Computability, Complexity, and the Lambda Calculus
Some Notes for CIS262
Jean Gallier and Jocelyn Quaintance
Department of Computer and Information Science, University of Pennsylvania, Philadelphia, PA 19104, USA
e-mail: [email protected]
© Jean Gallier. Please do not reproduce without permission of the author.
April 28, 2020

Contents
1 RAM Programs, Turing Machines ... 7
1.1 Partial Functions and RAM Programs ... 10
1.2 Definition of a Turing Machine ... 15
1.3 Computations of Turing Machines ... 17
1.4 Equivalence of RAM Programs and Turing Machines ... 20
1.5 Listable Languages and Computable Languages ... 21
1.6 A Simple Function Not Known to be Computable ... 22
1.7 The Primitive Recursive Functions ... 25
1.8 Primitive Recursive Predicates ... 33
1.9 The Partial Computable Functions ... 35
2 Universal RAM Programs and the Halting Problem ... 41
2.1 Pairing Functions ... 41
2.2 Equivalence of Alphabets ... 48
2.3 Coding of RAM Programs; The Halting Problem ... 50
2.4 Universal RAM Programs ... 54
2.5 Indexing of RAM Programs ... 59
2.6 Kleene's T-Predicate ... 60
2.7 A Non-Computable Function; Busy Beavers ... 62
3 Elementary Recursive Function Theory ... 67
3.1 Acceptable Indexings ... 67
3.2 Undecidable Problems ... 70
3.3 Reducibility and Rice's Theorem ... 73
3.4 Listable (Recursively Enumerable) Sets ... 76
3.5 Reducibility and Complete Sets ... 82
4 The Lambda-Calculus ... 87
4.1 Syntax of the Lambda-Calculus ... 89
4.2 β-Reduction and β-Conversion; the Church–Rosser Theorem ... 94
4.3 Some Useful Combinators ...
McCarthy as Scientist and Engineer, with Personal Recollections
McCarthy as Scientist and Engineer, with Personal Recollections
Edward Feigenbaum

John McCarthy, professor emeritus of computer science at Stanford University, died on October 24, 2011. McCarthy, a past president of AAAI and an AAAI Fellow, helped design the foundation of today's internet-based computing and is widely credited with coining the term, artificial intelligence. This remembrance by Edward Feigenbaum, also a past president of AAAI and a professor emeritus of computer science at Stanford University, was delivered at the celebration of John McCarthy's accomplishments, held at Stanford on 25 March 2012. – AI Magazine

In the late 1950s and early 1960s, there were very few people actually doing AI research — mostly the handful of founders (John McCarthy, Marvin Minsky, and Oliver Selfridge in Boston, Allen Newell and Herbert Simon in Pittsburgh) plus their students, and that included me. Everyone knew everyone else, and saw them at the few conference panels that were held. At one of those conferences, I met John. We renewed contact upon his rearrival at Stanford, and that was to have major consequences for my professional life. I was a faculty member at the University of California, Berkeley, teaching the first AI courses at that university, and John was doing the same at Stanford. As Stanford moved toward a computer science department under the leadership of George Forsythe, John suggested to George, and then supported, the idea of hiring me into the founding faculty of the department. Since we were both Advanced Research Project Agency (ARPA) contract awardees, we quickly formed a close bond concerning ARPA-sponsored AI research and graduate student teaching.