Can the Mind Change the World?

NED BLOCK

I am grateful for support to the American Council of Learned Societies and to the National Science Foundation, grant number DTR88 12559. I am also grateful to Jerry Fodor, Paul Horwich, Frank Jackson, Gabriel Segal, and Marianne Talbot for their comments on an earlier draft; to Thomas Nagel, who was the commentator on the paper on which this chapter is based at a conference at Columbia University in December 1988; to Simon Blackburn, who was the commentator on the paper at Oxford in February 1989; to Martin Davies, Dorothy Edgington, and Barry Smith for discussion of the issues while I was at Birkbeck College, University of London; and to the audiences at a number of philosophy colloquia.

Hilary Putnam originated the idea that mental states are computational states. At first (Putnam, 1960), his view was that although mental states are not identical with computational states (or "logical states," as he then called them), there are useful analogies between them. Later (Putnam, 1967), he argued in favor of the identity on the grounds that it was more plausible to suppose mental states are functional states (as he then called them) than that they are behavioral or physical states. This doctrine - functionalism - has dominated the philosophy of mind for over twenty years. Shortly after proposing functionalism, Putnam rejected it again (1973), and he has maintained this position ever since (Putnam, 1988). Putnam was my teacher during both my undergraduate and graduate days, and I fear I have absorbed his ambivalence toward functionalism. My teacher has had a habit of changing his mind, but never has he done so within a single essay, and so in this chapter I have surpassed him. My chapter starts out as an argument for functionalism, but it ends up suggesting an argument against it. The issue is whether we can avoid epiphenomenalism, which I here understand as the doctrine that what we think or want has no causal relevance to what we do. I propose functionalism as a way of warding off arguments for epiphenomenalism, but then I argue that functionalism may bring epiphenomenalism in its wake.

The orientation of the chapter is toward the sciences of the mind, and their relation to intentional content, that is, what is shared by the belief that grass grows and the desire that grass grows, the that grass grows that both states are directed toward. The question at hand is whether the sciences of the mind preclude intentional content from causal relevance to behavior. One argument that the intentional contents of our beliefs, thoughts, and the like have no effects on our behavior could be put this way: The processors in the head are not sensitive to content, so how could content have any effect on the outputs or changes of state of the system of processors? And if content can't affect the operation of this system of processors, how could it play any role in producing behavior? This argument seems formidable whether one thinks of the processors as neural devices reacting to neural inputs or, instead, from the cognitive science point of view, as computational devices processing representations. In this chapter, I confine myself to the problem as it arises in the cognitive science approach that is dominated by the computer model of the mind. I assume a very specific picture of cognitive science and its relation to the commonsense conception of intentional content, namely, the view according to which there is an internal system of representation from whose meanings our intentional contents derive (Fodor, 1975; Pylyshyn, 1984). One of my reasons for couching the discussion in terms of this view is that although those who adopt this view are motivated by the aim of showing how our commonsense beliefs about content (including our belief in content's causal efficacy) are vindicated by the computer model of the mind, the problem of the epiphenomenalism of content arises within this view in an extremely simple and straightforward (and poignant) way. The viewpoint assumed throughout the chapter is that of a supporter of the computer model in cognitive science who also would like to believe that the contents of our thoughts are indeed causally relevant to what we do.

The problem I have in mind might be put in terms of The Paradox of the Causal Efficacy of Content, namely, that the following claims all seem to be true, yet incompatible:

I. The intentional content of a thought (or other intentional state) is causally relevant to its behavioral (and other) effects.
II. Intentional content reduces to meanings of internal representations.
III. Internal processors are sensitive to the "syntactic forms" of internal representations, not their meanings.

The first claim is meant to be part of the commonsense view of the mind. The third is plausibly taken to be a basic claim of the computer model of the mind, and the second is a useful and plausible way of thinking how commonsense psychology meshes with the computer model. This second claim is by far the most controversial, but I won't be questioning it here. My reasons are that I think it is true, that I see no useful purpose to dividing meaning and content in this context, and that I think the best bets for resolving the paradox are to question the third premise and whether the reasoning that leads to the paradox is right.

The reasoning behind the paradox goes something like this: Any Turing machine can be constructed from simple primitive processors such as and gates, or gates, and the like. (See Minsky, 1967.) Gates are sensitive to the syntactic forms of representations, not their meanings. But if the meaning of a representation cannot influence the behavior of a gate, how could it influence the behavior of a computer - a system of gates? Since intentional content reduces to meanings of internal representations, and since meanings of internal representations cannot influence behavior, content cannot influence behavior either. The reasoning assumes that at least as far as our thinking is concerned, we are computers. This idea - which is simply the computer model of the mind (our cognitive mind, that is) - may be wrong, but I will be assuming it to explore where it leads.

My plan for the chapter involves

I. Explaining each premise
II. Examining and rejecting a putative solution based on a nomological conception of causal relevance of content
III. Suggesting a solution based on a functionalist conception of content and meaning and a counterfactual theory of causal relevance
IV. Discussing a problem with the proposed solution, one that suggests that functionalism actually breeds epiphenomenalism

A subtheme of the chapter is that a nomological theory of causal relevance (a theory that explains causal relevance in terms of the notion of a law of nature) has more of a problem with epiphenomenalism than a counterfactual approach.

1. The premises

The first premise uses the notion of a causally relevant property. Some properties of a cause are relevant to the production of an effect, and some are not. Hurricane Eliza broke my window. Eliza's wind speed and geographical path are causally relevant to the breaking, but its name and the location of its records in the United States Weather Bureau are not. According to the first premise, if my belief that the United States is a dangerous place causes me to leave the country, the content of the belief is causally relevant to the behavior; a property that is not causally relevant to the behavior is the last letter of the name of the city in which the belief was formed. Note that the point is not that beliefs, thoughts, desires, and the like (mental states or events that have content) are causes, for example, of behavior. (I assume that they are.) Rather, the point is that when mental events have effects, they typically have those effects (rather than different effects) because the mental events have the contents that they have, rather than some other contents.
