The Turing Test and Turing's Own Objections

Cognitive Scientists vs. Engineers

One of the major divisions in AI (and you can see it in those definitions above) is between:

- those who think AI is the only serious way of finding out how WE work (since opening heads doesn't yet tell you much): the Cognitive Scientists; and
- those who want computers to do very smart things, independently of how WE work: the Engineers.

Think about a reading computer that read English (very well) from right to left. What follows, if anything, from its success?

There is another group, separate from the Cognitive Scientists and Engineers we just distinguished: those who are interested in attributing mental capacities to machines (a group that could overlap with either of the first two). Their interest is the mentality of machines, not the machine-likeness of humans. Here is Dennett, the major US philosopher concerned with AI:

"In a recent conversation with the designer of a chess-playing program I heard the following criticism of a rival program: 'It thinks it should get its queen out early.' This ascribes a propositional attitude to the program in a very useful and predictive way, for the designer went on to say one can usually count on chasing that queen around the board. But for all the many levels of explicit representation to be found in that program, nowhere is there anything roughly synonymous with 'I should get my queen out early' explicitly tokened."

For Dennett, machines and people are in roughly the same position: we have a language for talking about how they work and why, which he calls FOLK PSYCHOLOGY, i.e. the propositional attitudes BELIEVE, INTEND, etc. But he says that in neither case should we assume those attitudes correspond to anything real inside, in the brain or the program. Dennett does not really think people or machines have mental states: both stand in the same position with respect to 'as if' explanation; the program behaves AS IF it wants to get its queen out early.
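To make Dennett's point concrete, here is a small hypothetical sketch in Python. The scoring function, the weights, and the move encoding are all invented for illustration; no real chess program is being quoted. Nothing in the code tokens anything like "I should get my queen out early", yet the weights make the program reliably behave as if it wants exactly that.

```python
# Hypothetical sketch of Dennett's point: a toy move scorer in which nothing
# synonymous with "I should get my queen out early" is explicitly tokened,
# yet the program behaves AS IF it wants that. All names and numbers are
# invented for illustration.

def score_move(piece, mobility_gain, capture_value, develops_minor_piece):
    score = 0.0
    score += 1.0 * capture_value                      # material is good
    score += 0.8 * mobility_gain                      # active pieces are good
    score += 0.5 * (1 if develops_minor_piece else 0) # development is good
    return score

# The queen offers the largest mobility gains, so queen sorties tend to win
# the comparison: an emergent "attitude", not an explicit rule.
candidate_moves = [
    ("queen",  9, 0, False),   # early queen sortie
    ("knight", 3, 0, True),    # sound developing move
    ("pawn",   1, 0, False),   # quiet pawn move
]

best = max(candidate_moves, key=lambda m: score_move(*m))
print("chosen move:", best[0])   # usually the queen move
```

The rival designer's remark "it thinks it should get its queen out early" is then a prediction made from this 'as if' stance, not a report of any line of code.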
Strong vs. Weak AI

An important distinction we shall need later, due to the philosopher John Searle.

- For Searle, WEAK AI is like Cognitive Science above (i.e. it is about people): it uses machine representations and hypotheses to mimic human mental function, but never ascribes those properties to the machine.
- STRONG AI is the claim that machines programmed with the appropriate behaviour have the same mental states as people who behaved the same way would have, i.e. that machines can have MENTAL STATES.

The Turing Test

- Turing in 1950 published a philosophical paper designed to stop people arguing about whether or not machines could think.
- He proposed that the question be replaced with a test, which was not quite what is now called the Turing Test.
- Turing's test was about whether or not an interrogator could tell a man from a woman! An interrogator in another room asks questions of a subject by teletype(!), trying to determine their sex. The subject is sometimes a man and sometimes a woman.
- If, after some agreed time, the interrogator cannot distinguish situations where a machine has been substituted for the man/woman, we should just agree to say the machine can think (says Turing).
- NOTICE: the question of whether the subject is a machine never comes up in the questions.
- Nowadays, the 'Turing Test' is precisely about whether the other party is a machine or not.

Turing's own objections

Turing considered, and dismissed, possible objections to the idea that computers can think. Some of these objections might still be raised today; some are easier to refute than others. The objections he considered:

1. The theological objection
2. The 'heads in the sand' objection
3. The mathematical objection
4. The argument from consciousness
5. Arguments from various disabilities
6. Lady Lovelace's objection
7. The argument from continuity in the nervous system
(8.) The argument from informality of behaviour
(9.) The argument from extra-sensory perception

The theological objection

- '…Thinking is a function of man's immortal soul. God has given an immortal soul to every man and woman, but not to any other animal or to machines. Hence no animal or machine can think…'
- Why not believe that God could give a soul to a machine if He wished?

Heads in the sand objection

- i.e. the consequences of machines thinking would be too dreadful; let us hope and believe that they cannot do so.
- Related to the theological objection: the idea that humans are superior to the rest of creation, and must stay so.
- '…Those who believe in (this and the previous objection) would probably not be interested in any criteria…'

The mathematical objection

- There are results of mathematical logic which can be used to show that there are limitations to the powers of discrete-state machines, e.g. the halting problem: will the execution of a program P eventually halt, or will it run forever? Turing (1936) proved that for any algorithm H that purports to solve halting problems there will always be a program Pi such that H will not be able to answer the halting question for Pi correctly (a sketch of the argument follows below).
- i.e. certain questions cannot be answered correctly by any formal system.
- But similar limitations may also apply to the human intellect.
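The following Python sketch restates Turing's diagonal argument under the assumption, made only for the sake of contradiction, that a halting decider H exists. The names `halts` and `troublemaker` are invented for the illustration.

```python
# A standard sketch of the diagonal argument behind the halting problem.
# `halts` stands for the hypothetical algorithm H from the text; no such
# function can actually be written, and the point of the sketch is to show
# why: assuming it exists leads to a contradiction.

def halts(program_source: str, input_data: str) -> bool:
    """Hypothetical decider H: True iff running the program on the input halts."""
    raise NotImplementedError("assumed only for the sake of argument")

TROUBLEMAKER = """
def troublemaker(source):
    if halts(source, source):   # ask H: does this program halt on its own text?
        while True:             # if H says "halts", loop forever
            pass
    return "done"               # if H says "loops forever", halt at once
"""

# Running troublemaker on its own source is the program Pi from the text:
# if H answers "halts", troublemaker loops forever; if H answers "loops
# forever", troublemaker halts. Either way H is wrong about Pi, so no
# algorithm H can decide halting for every program.
```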
Argument from consciousness

- '…This argument is very well expressed in Professor Jefferson's Lister Oration for 1949, from which I quote: "Not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, and not by the chance fall of symbols, could we agree that machine equals brain – that is, not only write it but know that it had written it. No mechanism could feel (and not merely artificially signal, an easy contrivance) pleasure at its successes, grief when its valves fuse, be warmed by flattery, be made miserable by its mistakes, be charmed by sex, be angry or depressed when it cannot get what it wants."…'
- On this view, the only way one could be sure that a machine thinks would be to be that machine and feel oneself thinking. Similarly, the only way to be sure someone else thinks is to be that person.
- How do we know that anyone is conscious? Solipsism.
- Instead, we assume that others can think and are conscious: it is a polite convention. Similarly, we could assume that a machine which passes the Turing Test is conscious too.

Consciousness

- Thought and consciousness do not always go together.
- Freud and unconscious thought.
- Thought we cannot introspect about (e.g. searching for a forgotten name).
- Blindsight (Weiskrantz): after removal of visual cortex, patients are blind in certain areas, but can still locate a spot of light without being conscious of it.

Arguments from various disabilities

- i.e. 'I grant that you can make machines do all the things you have mentioned, but you will never be able to make one do X.'
- e.g. be kind, resourceful, beautiful, friendly, have initiative, have a sense of humour, tell right from wrong, make mistakes, fall in love, enjoy strawberries and cream, make someone fall in love with it, learn from experience, use words properly, be the subject of its own thought, have as much diversity of behaviour as a man, do something really new.
- These criticisms are often disguised forms of the argument from consciousness.

Lady Lovelace's objection

- (From Lady Lovelace's memoir about Babbage's Analytical Engine.) Babbage (1792-1871) designed the Analytical Engine, a general-purpose calculator, entirely mechanical. The full contraption was never built: the engineering was not up to it, and there was no electricity!
- '…The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform…'
- i.e. a computer cannot be creative; it cannot originate anything, only carry out what was given to it by the programmer.
- But computers can surprise their programmers, i.e. by producing answers that were not expected. The original data may have been given to the computer, but it may then be able to work out their consequences and implications (cf. the level of chess programs and their programmers).

Argument from continuity in the nervous system

- The nervous system is continuous; the digital computer is a discrete-state machine.
- i.e. in the nervous system a small error in the information about the size of a nervous impulse impinging on a neuron may make a large difference to the size of the outgoing impulse.
- Discrete-state machines move by sudden jumps and clicks from one state to another. For example, consider the 'convenient fiction' that switches are either definitely on or definitely off.
- However, a discrete-state machine can still give answers that are indistinguishable from those of a continuous machine (see the numerical sketch at the end of these notes).

Other objections

- Copeland (1993) [see 'Artificial Intelligence: a Philosophical Introduction'] discusses four further objections to the Turing Test. The first three of these he dismisses, and the fourth he incorporates into a modified version of the Turing Test.

1. Too conservative: the chimpanzee objection

- Chimpanzees, dolphins, dogs, and pre-linguistic infants all can think (?) but could not pass the Turing Test.
- But this only means that the Turing Test cannot be a litmus test (where red = acid and not-red = non-acidic): nothing definite follows if a computer, animal, or baby fails the test.
- i.e. a negative outcome does not mean the computer cannot think. (In philosophical terms: the Turing Test gives a sufficient, not a necessary, condition of thought.)

2. Too easy: the sense organs objection

- The Turing Test focuses only on verbal responses: the computer is able to use words without really knowing their meanings (like taking a driving test that consists only of answering questions).
- Equip it with sense organs, and you can then test its knowledge of the world (remember turning the turtle over in 'Blade Runner' as a way of spotting robots).
- But there are many words which can be investigated without the computer having sense organs, e.g. 'teasing'.
- Think about getting to know people through webchat: can you?
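Returning to the argument from continuity above: the following minimal sketch, which assumes a toy leaky-integrator "neuron" and arbitrarily chosen step sizes (illustrative assumptions, not neuroscience), shows the sense in which a machine moving in discrete jumps can give answers indistinguishable from a continuous one.

```python
import math

# A toy "neuron" modelled as a leaky integrator, dV/dt = -V + I. The equation
# is continuous, but a discrete-state machine stepping in small jumps can
# track it as closely as we like.

def simulate(input_current, total_time=1.0, dt=0.001):
    """Euler integration: a discrete machine jumping from state to state."""
    v = 0.0
    for _ in range(int(total_time / dt)):
        v += dt * (-v + input_current)
    return v

exact = 2.0 * (1 - math.exp(-1.0))   # closed-form value of V at t = 1 for I = 2
coarse = simulate(2.0, dt=0.1)       # crude discretisation
fine = simulate(2.0, dt=0.0001)      # finer discretisation

print(f"exact {exact:.5f}   coarse {coarse:.5f}   fine {fine:.5f}")
# The finer the steps, the smaller the gap: an interrogator who sees only the
# answers could not tell the discrete machine from a continuous one.
```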
