COMPUTER THOUGHT: PROPOSITIONAL ATTITUDES AND META-KNOWLEDGE (ARTIFICIAL INTELLIGENCE, SEMANTICS, PSYCHOLOGY, ALGORITHMS)

Item Type: text; Dissertation-Reproduction (electronic)
Author: Dietrich, Eric Stanley
Publisher: The University of Arizona
Rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Link to Item: http://hdl.handle.net/10150/188116
COMPUTER THOUGHT: PROPOSITIONAL ATTITUDES AND META-KNOWLEDGE

by Eric Stanley Dietrich

A Dissertation Submitted to the Faculty of the
DEPARTMENT OF PHILOSOPHY
In Partial Fulfillment of the Requirements For the Degree of
DOCTOR OF PHILOSOPHY
In the Graduate College
THE UNIVERSITY OF ARIZONA
1985

THE UNIVERSITY OF ARIZONA GRADUATE COLLEGE

As members of the Final Examination Committee, we certify that we have read the dissertation prepared by Eric Dietrich, entitled "Computer Thought: Propositional Attitudes and Meta-Knowledge," and recommend that it be accepted as fulfilling the dissertation requirement for the Degree of Doctor of Philosophy.

[Committee signatures and dates]

Final approval and acceptance of this dissertation is contingent upon the candidate's submission of the final copy of the dissertation to the Graduate College. I hereby certify that I have read this dissertation prepared under my direction and recommend that it be accepted as fulfilling the dissertation requirement.

[Dissertation Director's signature and date]

STATEMENT BY AUTHOR

This dissertation has been submitted in partial fulfillment of requirements for an advanced degree at The University of Arizona and is deposited in the University Library to be made available to borrowers under rules of the Library. Brief quotations from this dissertation are allowable without special permission, provided that accurate acknowledgment of source is made. Requests for permission for extended quotation from or reproduction of this manuscript in whole or in part may be granted by the head of the major department or the Dean of the Graduate College when in his or her judgment the proposed use of the material is in the interests of scholarship. In all other instances, however, permission must be obtained from the author.

ACKNOWLEDGMENTS

I wish to thank Kent Bach, Myles Brand, Rob Cummins, Chris Fields, Ron Sauers, and Kathy Yagel for help in understanding the matters discussed herein. I also wish to thank the Clipr staff and Lyle Bourne for computers and funding. Funding was also provided by the Dietrich Family Foundation.

TABLE OF CONTENTS

ABSTRACT
1. INTRODUCTION
2. FOUR PROBLEMS FOR A COMPUTATIONAL THEORY OF PROPOSITIONAL ATTITUDES
   2.1 Introduction
   2.2 Functionalism
   2.3 A Taxonomy of Theories of the Mind
       2.3.1 An Introduction to Dennett's Intentional-System Theory
       2.3.2 Functional Theories that Deny the Existence of Propositional Attitudes
       2.3.3 Causal Functionalism and Propositional Attitudes
   2.4 The Four Problems
3. SEMANTICS AND MENTAL REPRESENTATION
   3.1 Introduction
   3.2 Representation in Computers
       3.2.1 Data Structures
       3.2.2 Knowledge Representations: Definitions and Examples
       3.2.3 The Functional Definition of Knowledge Representations
   3.3 The Constraints on Intelligent Computer Thought
       3.3.1 Definitions of Computational Efficiency and Expressive Power
       3.3.2 On the Search for Universal and Scientific Schemes
   3.4 Two Theories of Mental Representation
       3.4.1 Non-mental Representation
       3.4.2 Mental Representation
   3.5 Conclusion
4. A THEORY OF PROPOSITIONAL ATTITUDES
   4.1 Introduction
   4.2 Computational Functionalism
       4.2.1 Propositional Attitudes and Computational Relations
       4.2.2 Programs, Algorithms, and Functions
       4.2.3 What Propositional Attitudes Are
   4.3 Propositional Attitudes and Stephen Stich
       4.3.1 The Autonomy Principle
       4.3.2 An Argument for the Autonomy Principle
       4.3.3 The Tension between the Autonomy Principle and Propositional Attitudes
   4.4 Conclusion
5. MACHINES WITH MIND AND CONSCIOUSNESS
   5.1 Introduction
   5.2 Implicit Information
       5.2.1 Control-Implicit Information
       5.2.2 Domain-Implicit Information
   5.3 Meta-Level Processing
       5.3.1 Strategic Meta-Knowledge
       5.3.2 Descriptive Meta-Knowledge
       5.3.3 Systemic Meta-Knowledge
   5.4 Ip-Intelligence
       5.4.1 Abstract Data Types and Implicit Information
       5.4.2 Content and Implicit Information
   5.5 Ip-Intelligence, Propositional Attitudes, Consciousness, and Mind
BIBLIOGRAPHY

ABSTRACT

Though artificial intelligence scientists frequently use words such as "belief" and "desire" when describing the computational capacities of their programs and computers, they have completely ignored the philosophical and psychological theories of belief and desire. Hence, their explanations of computational capacities which use these terms are frequently little better than folk-psychological explanations. Conversely, though philosophers and psychologists attempt to couch their theories of belief and desire in computational terms, they have consistently misunderstood the notions of computation and computational semantics. Hence, their theories of such attitudes are frequently inadequate.

A computational theory of propositional attitudes (belief and desire) is presented here. It is argued that the theory of propositional attitudes put forth by philosophers and psychologists entails that propositional attitudes are a kind of abstract data type. This refined computational view of propositional attitudes bridges the gap between artificial intelligence, philosophy, and psychology. Lastly, it is argued that this theory of propositional attitudes has consequences for meta-processing and consciousness in computers.

CHAPTER I

INTRODUCTION

I have an acquaintance whose vocation is artificial intelligence. His research involves a sort of robot--a computer equipped with a camera and one manipulator arm--and some colored blocks. The robot arranges the blocks in any manner one wishes: told to pick up a red block and put it on a green one, the robot scans its work area for a red block, extends its arm, and grasps the block; then, finding a green block, it puts the former on the latter.

Recently, this acquaintance of mine discovered a mouse in his kitchen at home and, characteristically, built an electronic mousetrap which could sense a mouse, exterminate it, and then signal that it had a dead mouse in its chamber. He placed the mousetrap on the floor in a corner of his kitchen. A few mornings later, awakened by the signal, he found the mouse caught in the trap, dead.

When I heard of this incident a few days later, I remarked that there seemed to be little difference between his mousetrap and the robot in his laboratory. He disagreed, claiming that, in fact, his robot reasoned but the mousetrap did not. The trap was merely an electro-mechanical device obeying the laws of physics. His robot, on the other hand, had thoughts, in particular beliefs, because it could perform a cluster of processes and actions associated with manipulating an environment consisting of itself and various blocks. The mousetrap had no beliefs or thoughts of any kind.
As an example, he said, his robot could believe that a red cube was located at such-and-such a position relative to it, that a block it had picked up was red, that it had been requested to pick up a red block, that there were such things as blocks, and so forth.

I asked him what would be required to turn the mousetrap into a thinking mousetrap, as opposed to a merely well-functioning physical device. His answer was quite typical of researchers in artificial intelligence. The essential ingredient, he claimed, was a capacity to represent. A thinking mousetrap would need representations and processes defined over them. According to him, computers can have thoughts because they have internal structures the processing of which can count as adding 2 and 2, parsing "please pick up a red block," etc.
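The picture my acquaintance describes (representations together with processes defined over them, which the abstract above recasts as propositional attitudes being a kind of abstract data type) can be made concrete with a small sketch. The fragment below is purely illustrative and is not taken from the dissertation; every name in it (Proposition, BeliefStore, adopt, holds, blocks_satisfying) is invented for the example, and Python is used only because it is compact.

# A minimal, purely illustrative sketch (not from the dissertation) of a
# "belief" treated as an abstract data type: a hidden representation plus
# the processes defined over it. All names here are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Proposition:
    """A toy blocks-world proposition, e.g. ('red', ('block-3',))."""
    predicate: str
    args: tuple

class BeliefStore:
    """An abstract data type for beliefs: only the operations below are exposed."""

    def __init__(self):
        self._beliefs = set()                      # internal representation

    def adopt(self, p: Proposition) -> None:       # process: come to believe p
        self._beliefs.add(p)

    def holds(self, p: Proposition) -> bool:       # process: query a belief
        return p in self._beliefs

    def blocks_satisfying(self, predicate: str):   # process: simple reasoning over beliefs
        return [p.args[0] for p in self._beliefs if p.predicate == predicate]

# Usage: a scanning routine would call adopt() as the camera surveys the scene;
# a planner would call holds() or blocks_satisfying() before reaching for a block.
robot = BeliefStore()
robot.adopt(Proposition("red", ("block-3",)))
robot.adopt(Proposition("on-table", ("block-3",)))
print(robot.blocks_satisfying("red"))              # -> ['block-3']

On this toy picture, the robot has internal structures for its processes to operate on, while the mousetrap, by contrast, has none; that contrast, and whether it really amounts to a difference in thought, is what the chapters that follow examine.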