
AMERICAN PHILOSOPHICAL QUARTERLY Volume 41, Number 3, July 2004

ON BEING SIMPLE MINDED

Peter Carruthers

The question ‘Do fishes think?’ does not exist among our applications of language, it is not raised.—Wittgenstein

1. ON HAVING A MIND

How simple minded can you be? Many philosophers would answer: no more simple than a language-using human being. Many other philosophers, and most cognitive scientists, would allow that mammals, and perhaps birds, possess minds. But few have gone to the extreme of believing that very simple organisms, such as insects, can be genuinely minded.1 This is the ground that the present paper proposes to occupy and defend. It will argue that ants and bees, in particular, possess minds. So it will be claiming that minds can be very simple indeed.

What does it take to be a minded organism? Davidson (1975) says: you need to be an interpreter of the speech and behavior of another minded organism. Only creatures that speak, and that both interpret and are subject to interpretation, count as genuinely thinking anything at all. McDowell (1994) says: you need to exist in a space of reasons. Only creatures capable of appreciating the normative force of a reason for belief, or a reason for action, can count as possessing beliefs or engaging in intentional action. And Searle (1992) says: you need consciousness. Only creatures that have conscious beliefs and conscious desires can count as having beliefs or desires at all.

Such views seem to the present author to be ill-motivated. Granted, humans speak and interpret the speech of others. And granted, humans weigh up and evaluate reasons for belief and for action. And granted, too, humans engage in forms of thinking that have all of the hallmarks of consciousness. But there are no good reasons for insisting that these features of the human mind are necessary conditions of mindedness as such. Or so, at least, this paper will briefly argue now, and then take for granted as an assumption in what follows.

Common sense has little difficulty with the idea that there can be beliefs and desires that fail to meet these demanding conditions. This suggests, at least, that those conditions are not conceptually necessary ones.

Most people feel pretty comfortable in ascribing simple beliefs and desires to non-language-using creatures. They will say, for example, that a particular ape acts as she does because she believes that the mound contains termites and wants to eat them. And our willingness to entertain such
thoughts seems unaffected by the extent to which we think that the animal in question can appreciate the normative implications of its own states of belief and desire. Moreover, most of us are now (post-Freud and the cognitive turn in cognitive science) entirely willing to countenance the existence of beliefs and desires that are not conscious ones.

It isn’t only ordinary folk who think that beliefs and desires can exist in the absence of the stringent requirements laid down by some philosophers. Many cognitive scientists and comparative psychologists would agree. (Although sometimes, admittedly, the language of ‘belief’ and ‘desire’ gets omitted in deference to the sensibilities of some philosophical audiences.) There is now a rich and extensive body of literature on the cognitive states and processes of non-human animals (e.g., Walker, 1983; Gallistel, 1990; Gould and Gould, 1994). And this literature is replete with talk of information-bearing conceptualized states that guide planning and action-selection (beliefs), as well as states that set the ends planned for and that motivate action (desires).

True enough, it can often be difficult to say quite what an animal believes or desires. And many of us can, on reflection, rightly be made to feel uncomfortable when using a that-clause constructed out of our human concepts to describe the thoughts of an animal. If we say of an ape that she believes that the mound contains termites, for example, then we can easily be made to feel awkward about so doing. For how likely is it that the ape will have the concept termite? Does the ape distinguish between termites and ants, for instance, while also believing that both kinds belong to the same super-ordinate category (insects)? Does the ape really believe that termites are living things which excrete and reproduce? And so on.

These considerations give rise to an argument against the very possibility of non-linguistic thought, which was initially presented by Davidson (1975). The argument claims, first, that beliefs and desires are content-bearing states whose contents must be expressible in a sentential complement (a that-clause). Then, second, the argument points out that it must always be inappropriate to use sentential complements that embed our concepts when describing the thoughts of an animal (given the absence of linguistic behavior of the appropriate sorts). In which case (putting these two premises together) it follows that animals cannot be said to have thoughts at all.

The error in this argument lies in its assumption that thought contents must be specifiable by means of that-clauses, however. For this amounts to the imposition of a co-thinking constraint on genuine thoughthood. In order for another creature (whether human or animal) to be thinking a particular thought, it would have to be the case that someone else should also be capable of entertaining that very thought, in such a way that it can be formulated into a that-clause. But why should we believe this? For we know that there are many thoughts—e.g., some of Einstein’s thoughts, or some of the thoughts of Chomsky—that we may be incapable of entertaining. And why should we nevertheless think that the real existence of those thoughts is contingent upon the capacity of someone else to co-think them? Perhaps Einstein had some thoughts so sophisticated that there is no one else who is capable of entertaining their content.

The common-sense position is that (in addition to being formulated and co-thought from the inside, through a that-clause) thoughts can equally well be characterized from the outside, by means of an indirect description. In the case of the ape dipping for termites, for example, most of us would, on reflection, say something like this: we do not know how much the ape knows about termites, nor how exactly she conceptualizes them, but we do know that she believes of the termites in that mound that they are there, and we know that she wants to eat them. And on this matter common sense and cognitive science agree. Through careful experimentation scientists can map the boundaries of a creature’s concepts, and can explore the extent of its knowledge of the things with which it deals (Bermúdez, 2003). These discoveries can then be used to provide an external characterization of the creature’s beliefs and goals, even if the concepts in question are so alien to us that we couldn’t co-think them with the creature in the content of a that-clause.

What does it take to be a minded organism, then? We should say instead: you need to possess a certain core cognitive architecture. Having a mind means being a subject of perceptual states, where those states are used to inform a set of belief states which guide behavior, and where the belief states in turn interact with a set of desire states in ways that depend upon their contents, to select from amongst an array of action schemata so as to determine the form of the behavior. This sort of belief/desire architecture is one that humans share. It is represented diagrammatically in Figure 1.

[Figure 1: The Core Architecture of a Mind. Elements: percept, belief, desire, practical reason, action schemata, body states, motor output.]

The crucial components of this account are the beliefs and desires. For it is unlikely that possession of perceptual states alone, where those states are used to guide a suite of innate behavioral programs or fixed action schemata, could be sufficient for a creature to count as possessing a mind. Consider the architecture represented in Figure 2. The innate behaviors of many mammals, birds, reptiles, and insects can be explained in such terms. The releasing factors for an innate behavior will often include both bodily states (e.g., pregnancy) and perceptual information, with the perceptual states serving to guide the detailed performance of the behavior in question, too.2 (Consider a butterfly landing neatly on a leaf to lay her eggs.)

[Figure 2: An Unminded Behavioral Architecture. Elements: percept, body states, action schemata, motor output.]

But engaging in a suite of innately coded action patterns isn’t enough to count as having a mind, even if the detailed performance of those patterns is guided by perceptual information. And nor, surely, is the situation any different if the action patterns are not innate ones, but are, rather, acquired habits, learned through some form of conditioning.

This paper will assume that it isn’t enough for an organism to count as having a mind, either, that the animal in question should be merely interpretable as possessing beliefs and desires. For as Dennett (1987) has taught us, we can adopt what he calls ‘the intentional stance’ in respect of even the most rigidly pre-programmed of behaviors. No, the architecture represented in Figure 1 needs to be construed realistically. There needs to be a real distinction between the belief-states and the desire-states, in virtue of which they possess their distinctive causal roles (guiding and motivating action, respectively). And these states must, in addition, be both discrete, and structured in a way that reflects their semantic contents. And their detailed causal roles, too (the ways in which particular belief states and particular desire states interact), must be sensitive to those structural features.

Dennett (1991) thinks that only language can provide these properties. It is only with the first appearance of the Joycean machine—the stream of inner verbalization that occupies so much of our waking lives—that humans come to have discrete structured semantically-evaluable states, where the interactions of those states are sensitive to their structures. So although Dennett might say that many animals possess thoughts, all he really means is that their behavior is rich enough to make it worth our while to adopt the intentional stance towards them. But he denies that non-human animals possess minds realistically construed, in the way that the present paper proposes to construe them.

To be minded means to be a thinker, then. And that means (it will hereafter be assumed) having distinct belief states and desire states that are discrete, structured, and causally efficacious in virtue of their structural properties. These are demanding conditions on mindedness. The question is: how simple can an organism be while still having states with these features?

2. MEANS/ENDS REASONING IN RATS

Dickinson and colleagues have conducted an elegant series of experiments, reported in a number of influential papers, providing evidence for the view that rats, at least, are genuine means/ends reasoners (Dickinson and Balleine, 1994; Dickinson and Shanks, 1995; Dickinson and Balleine, 2000). They argue that rats engage in goal-directed behavior, guided by beliefs about causal contingencies, and by goals that are only linked to basic forms of motivation via learning. (We, too, have to monitor our own reactions to learn what we want; Damasio, 1994.) These arguments are convincing. But at the same time Dickinson advances a particular conception of what it takes to be a truly minded creature. The latter will need to be resisted if the position that insects have minds is to be defensible.

Here are some of the data (Dickinson and Balleine, 2000). Rats do not press a lever for food any more when hungry than when nearly satiated, unless they have had experience of eating that food when hungry. While the reward-value of the food is something that they know, the increased value that attaches to food when hungry is something that they have to learn. Similarly, rats caused to have an aversion to one food rather than another (via the injection of an emetic shortly after eating the one food but not the other) will thereafter stop performing a trained action causally paired with just that food, in the absence of feedback resulting from their actions (i.e., without receiving any rewards). However, they will do so only if they have been re-presented with the relevant food in the interval. They have to learn that they now have an aversion to the food in question. But once they have learned this, they make appropriate decisions about which of two previously learned actions to perform—they know which action will make them sick, and choose accordingly.

Dickinson thinks that the data warrant ascribing to rats a two-tier motivational system much like our own. There is a basic level of biological drives, which fixes the reward value of experiences received. And there is an intentional level of represented goals (e.g., to eat this stuff rather than that stuff), which has to be linked up to the basic level via learning to achieve its motivating status. Others have made similar proposals to explain aspects of the human motivational system (Damasio, 1994; Rolls, 1999).

In other experiments Dickinson and colleagues have shown that rats are sensitive to the degree of causal efficacy of their actions (Dickinson and Charnock, 1985; Dickinson and Shanks, 1995). In particular, the rat’s rate of action drops as the causal connection between act and outcome falls. It even turns out that rats display exactly the same illusions of causality as do humans. The set-up in these experiments is that the probability of an event occurring (e.g., a figure appearing on a TV monitor, for the humans) or a reward being delivered (for the rats) is actually made independent of the action to be performed (pressing the space-bar, pressing a lever), while sometimes occurring in a way that happens to be temporally paired with that action. If (but only if) the unpaired outcomes are signaled in some way (by a coincident sound, say), then both rats and humans continue to believe (and to behave) as if the connection between act and outcome were a causal one.

The conclusion drawn from these and similar studies, then, is that rats are genuine means/ends reasoners. They possess learned representations of the goals they are trying to achieve (e.g., to receive a particular type of food). And they have acquired representations of the relative causal efficacy of the actions open to them. Taken together, these representations will lead them (normally, when not fooled by cunning experimenters) to act appropriately. However, Dickinson and colleagues claim that only creatures with these abilities can count as being genuinely minded, or as possessing a belief/desire cognitive architecture of the sort depicted in Figure 1 (Heyes and Dickinson, 1990; Dickinson and Balleine, 2000). Their reasoning is that otherwise the animal’s behavior will be explicable in terms of mere innate motor programs or learned habits created through some form of associative conditioning. These further claims are unwarranted, however, as we shall see.

3. NON-ASSOCIATIVE LEARNING AND NON-CAUSAL INSTRUMENTAL REASONING

Dickinson assumes that if behavior isn’t caused by means/ends reasoning in the above sense, then it must either be innate or the product of associative conditioning. But this assumption is false. The animal world is rife with non-associative forms of learning (Gallistel, 1990). Many animals (including the Tunisian desert ant) can navigate by dead reckoning, for example. This requires the animal to compute the value of a variable each time it turns—integrating the direction in which it has just been traveling (as calculated from the polarization of the sun’s light in the sky; Wehner, 1994) with an estimate of the distance traveled in that direction, to produce a representation of current position in relation to a point of origin (home base, say). This plainly isn’t conditioned behavior of any sort; and nor can it be explained in terms of associative mechanisms, unless those mechanisms are organized into an architecture that is then tantamount to algorithmic symbol processing (Marcus, 2001).

Similarly, many kinds of animal will construct mental maps of their environment which they use when navigating; and they update the properties of the map through observation without conditioning (Gould and Gould, 1994).3 Many animals can adopt the shortest route to a target (e.g., a source of food), guided by landmarks and covering ground never before traveled. This warrants ascribing to the animals a mental map of their environment. But they will also update the properties of the map on an on-going basis.

Food-caching birds, for example, can recall the positions of many hundreds or thousands of hidden seeds after some months; but they generally won’t return to a cache location that they have previously emptied. Similarly, rats can be allowed to explore a maze on one day, finding a food reward in both a small dark room and a large white one. Next day they are given food in a (distinct) large white room, and shocked in a small dark one. When placed back in the maze a day later they go straight to the white room and avoid the dark one (Gould and Gould, 1994). Having learned that dark rooms might deliver a shock, they have updated their representation of the properties of the maze accordingly.4

Many animals make swift calculations of relative reward abundance, too, in a way that isn’t explicable via conditioning. For example, a flock of ducks will distribute 1:1 or 1:2 in front of two feeders throwing food at rates of 1:1 or 1:2 within one minute of the onset of feeding, during which time many ducks get no food at all, and very few of them experience rewards from both sources (Harper, 1982). Similarly, both pigeons and rats on a variable reward schedule from two different alcoves will match their behavior to the changing rates of reward. They respond very rapidly, closely tracking random variations in the immediately preceding rates (Dreyfus, 1991; Mark and Gallistel, 1994). They certainly are not averaging over previous reinforcements, as associationist models would predict.

Gallistel and colleagues have argued, indeed, that even classical conditioning—the very heartland of associationist general-purpose learning models—is better explained in terms of the computational operations of a specialized rate-estimating foraging system (Gallistel, 1990, 2000; Gallistel and Gibbon, 2001). One simple point they make is that animals on a delayed reinforcement schedule, in which the rewards only become available once the conditioned stimulus (e.g., an illuminated panel) has been present for a certain amount of time, will only respond on each occasion after a fixed proportion of the interval has elapsed. This is hard to explain if the animals are merely building an association between the illuminated panel and the reward. It seems to require, in fact, that they should construct a representation of the reinforcement intervals, and act accordingly.

Moreover, there are many well-established facts about conditioning behaviors that are hard to explain on associationist models, but that are readily explicable within a computational framework. For example, delay of reinforcement has no effect on rate of acquisition so long as the intervals between trials are increased by the same proportions. And the number of reinforcements required for acquisition of a new behavior isn’t affected by interspersing a significant number of unreinforced trials. This is hard to explain if the animals are supposed to be building associations, since the unreinforced trials should surely weaken those associations. But it can be predicted if what the animals are doing is estimating relative rates of return. For the rate of reinforcement per stimulus presentation relative to the rate of reinforcement in background conditions remains the same, whether or not significant numbers of stimulus presentations remain unreinforced.

There are many forms of learning in animals that are not simply learned associations, then. And consequently there are many animal behaviors that are not mere habits, but that do not involve representations of causality, either. A bird who navigates to a previously established food
cache, using landmarks and a mental map on which the location of the cache is represented, certainly isn’t acting out of habit. But then nor does the action involve any explicit representation of the causality of the bird’s own behavior. The bird doesn’t have to think, ‘Flying in that direction will cause me to be in that place.’ It just has to integrate its perception of the relevant landmarks with the representations on its mental map, then key into action the flying-in-that-direction action schema. The causation can be (and surely is) left implicit in the bird’s action schemata and behaviors, not explicitly represented in the bird’s reasoning.5

But for all that, why should such animals not count as exemplifying the belief/desire architecture depicted in Figure 1? If the animal can put together a variety of goals with the representations on a mental map, say, and act accordingly, then why should we not say that the animal behaves as it does because it wants something and believes that the desired thing can be found at a certain represented location on the map? There seems to be no good reason why we should not. (And nor is there any good reason to insist that an animal only has genuine desires if its goals are acquired through learning.) Dickinson is surely misled in thinking that means/ends reasoning has to involve representations of causal, rather than merely spatial, ‘means.’6

The difference between rats and many other animals is just that rats (like humans) are generalist foragers and problem solvers. For this reason they have to learn the value of different foods, and they are especially good at learning what acts will cause rewards to be delivered. But unlearned values can still steer intentional behavior, guided by maps and other learned representations of location and direction. And so there can, surely, be minds that are simpler than the mind of a rat.

4. INSECTS (1): INFLEXIBLE FIXED ACTION PATTERNS

How simple minded can you be? Do insects, in particular, have beliefs and desires? Few seem inclined to answer, ‘Yes.’ In part this derives from a tradition of thought (traceable back at least to Descartes) of doubting whether even higher mammals such as apes and monkeys have minds. And in part it derives from the manifest rigidity of much insect behavior. The tradition will here be ignored. (Some aspects of it have been discussed briefly in section 1 above.) But the rigidity requires some comment.

We are all familiar with examples of the behavioral rigidity of insects. Consider the tick, which sits immobile on its perch until it detects butyric acid vapor, whereupon it releases its hold (often enough falling onto the bodies of mammals passing below, whose skins emit such a vapor); and then, when it detects warmth, it burrows. Or there are the caterpillars who follow the light to climb trees to find food, in whom the mechanism that enables them to do this is an extremely simple one: when more light enters one eye than the other, the legs on that side of the body move more slowly, causing the animal to turn towards the source of the light. When artificial lighting is provided at the bottom of the trees, the caterpillars climb downwards and subsequently starve to death. And when blinded in one eye, these animals will move constantly in circles. How dumb can you be! Right?

Even apparently sophisticated and intelligent sequences of behavior can turn out, on closer investigation, to be surprisingly rigid. There is the well-known example of the Sphex wasp, which leaves a paralyzed cricket in a burrow with its eggs, so that its offspring will have something to feed on when they hatch. When it captures a cricket, it drags it to the entrance of the burrow, then leaves it outside for a moment while it enters, seemingly to check for intruders. However, if an interfering experimenter moves the cricket back a few inches while the wasp is inside, she repeats the sequence: dragging the insect to the burrow’s entrance, then entering briefly once more alone. And this sequence can be made to ‘loop’ indefinitely many times over.

Or consider the Australian digger wasp, which builds an elaborate tower-and-bell structure over the entrance of the burrow in which she lays her eggs (Gould and Gould, 1994). (The purpose of the structure is to prevent a smaller species of parasitic wasp from laying her eggs in the same burrow. The bell is of such a size, and hung at such an angle, and worked so smooth on the inside, that the smaller wasp cannot either reach far enough in, or gain enough purchase, to enter.) She builds the tower three of her own body-lengths high. If the tower is progressively buried while she builds, she will keep on building. But once she has finished the tower and started on the bell, the tower can be buried without her noticing—with disastrous results, since the bell will then be half on the ground, and consequently quite useless. Similarly, if a small hole is drilled in the neck of the tower, she seems to lack the resources to cope with a minor repair. Instead she builds another tower and bell structure, constructed on top of the hole.

In order to explain such behaviors, we do not need to advance beyond the architecture of Figure 2. The digger wasp would seem to have an innately represented series of nested behavioral sub-routines, with the whole sequence being triggered by its own bodily state (pregnancy). Each sub-routine is guided by perceptual input, and is finished by a simple stopping-rule. But once any given stage is completed, there is no going back to make corrections or repairs. The wasp appears to have no conception of the overall goal of the sequence, nor any beliefs about the respective contributions made by the different elements. If this were the full extent of the flexibility of insect behaviors, then there would be no warrant for believing that insects have minds at all.

It turns out that even flexibility of behavioral strategy isn’t really sufficient for a creature to count as having a mind, indeed. For innate behavioral programs can have a conditional format. It used to be thought, for example, that all male crickets sing to attract mates. But this isn’t so; and for good reason. For singing exposes crickets to predation, and also makes them targets for parasitic flies who drop their eggs on them. Many male crickets adopt the alternative strategy of waiting silently as a satellite of a singing male, and intercepting and attempting to mate with any females attracted by the song. But the two different strategies are not fixed. A previously silent male may begin to sing if one or more of the singing males is removed (Gould and Gould, 1994).

Admittedly, such examples suggest that something like a decision process must be built into the structure of the behavioral program. There must be some mechanism that takes information about, for example, the cricket’s own size and condition, the ratio of singing to non-singing males in the vicinity, and the loudness and vigor of their songs, and then triggers into action one behavioral strategy or the other. But computational complexity of this sort, in the mechanism that triggers an innate behavior, isn’t the same as saying that the insect acts from its beliefs and desires. And the
latter is what mindedness requires, we are assuming.

5. INSECTS (2): A SIMPLE BELIEF/DESIRE PSYCHOLOGY

From the fact that many insect behaviors result from the triggering of innately represented sequences of perceptually guided activity, however, it doesn’t follow that all do. And it is surely no requirement on mindedness that every behavior of a minded creature should result from interactions of belief states with desire states. Indeed, some of our own behaviors are triggered fixed action sequences—think of sneezing or coughing, for example, or of the universal human disgust reaction, which involves fixed movements of the mouth and tongue seemingly designed to expel noxious substances from the oral cavity. So it remains a possibility that insects might have simple minds as well as a set of triggerable innately represented action sequences.

In effect, it remains a possibility that insects might exemplify the cognitive architecture depicted in Figure 1, only with an arrow added between ‘body states’ and ‘action schemata’ to subserve a dual causal route to behavior (see Figure 3).7 In what follows it will be argued that this is indeed the case, focusing on the minds of honey bees in particular. While this paper won’t make any attempt to demonstrate this, it seems likely that the conclusion we reach in the case of honey bees will generalize to all navigating insects. In which case belief/desire cognitive architectures are of very ancient ancestry indeed.

[Figure 3: A Mind with Dual Routes to Action. Elements: percept, belief, desire, practical reason, action schemata, body states, motor output, with a direct route from body states to action schemata.]

Like many other insects, bees use a variety of navigation systems. One is dead reckoning (integrating a sequence of directions of motion with the velocity traveled in each direction, to produce a representation of one’s current location in relation to the point of origin). This in turn requires that bees can learn the expected position of the sun in the sky at any given time of day, as measured by an internal clock of some sort. Another mechanism permits bees to recognize and navigate from landmarks, either distant or local (Collett and Collett, 2002). And some researchers have claimed that bees will, in addition, construct crude mental maps of their environment from which they can navigate. (The maps have to be crude because of the poor resolution of bee eyesight. But they may still contain the relative locations of salient landmarks, such as a large free-standing tree, a forest edge, or a lake shore.)

Gould (1986) reports, for example, that when trained to a particular food source, and then carried from the hive in a dark box to a new release point, the bees will fly directly to the food, but only if there is a significant landmark in their vicinity when they are released. (Otherwise they fly off on the compass bearing that would previously have led from the hive to the food.) While other scientists have been unable to replicate these experiments directly, Menzel et al. (2000) found that bees that had never foraged more than a few meters from the nest, but who were released at random points much further from it, were able to return home swiftly. They argue that this either indicates the existence of a map-like structure, built during the bees’ initial orientation flights before they had begun foraging, or else the learned association of vectors-to-home with local landmarks. But either way, they claim, the spatial representations in question are allocentric rather than egocentric in character.

As is well known, honey bees dance to communicate information of various sorts to other bees. The main elements of the code have now been uncovered through patient investigation (Gould and Gould, 1988). They generally dance in a figure-of-eight pattern on a vertical surface in the dark inside the hive. The angle of movement through the center of the figure of eight, as measured from the vertical, corresponds to the angle from the expected direction of the sun for the time of day. (E.g., a dance angled at 30° to the right of vertical at midday would represent 30° west of south, in the northern hemisphere.) And the number of ‘waggles’ made through the center of the figure of eight provides a measure of distance. (Different bee species use different innately fixed measures of waggles-to-distance.)

Honey bees have a number of innately structured learning mechanisms, in fact. They have one such mechanism for learning the position of the sun in the sky for the time of day. (This mechanism—like the human language faculty—appears to have an innate ‘universal grammar.’ All bees in the northern hemisphere are born knowing that the sun is in the east in the morning, and in the west in the afternoon; Dyer and Dickinson, 1994.) And they have another such mechanism for learning, by dead reckoning, where things are in relation to the hive. (Here it is ‘visual flow’ that seems to be used as the measure of distance traveled; Srinivasan et al., 2000.) They may have yet another mechanism for constructing a mental map from a combination of landmark and directional information. (At the very least they have the capacity for learning to associate landmarks with vectors pointing to the hive; Menzel et al., 2000.) And they have yet another mechanism again for decoding the dances of other bees, extracting a representation of the distance and direction of a target (generally nectar, but also pollen, water, tree sap, or the location of a potential nest site; Seeley, 1995).

Although basic bee motivations are, no doubt, innately fixed, the goals they adopt on particular occasions (e.g., whether or not to move from one foraging patch to another, whether to finish foraging and return to the hive, and whether or not to dance on reaching it) would appear to be influenced by a number of factors (Seeley, 1995). Bees are less likely to dance for dilute sources of food, for example; they are less likely to dance for the more distant of two sites of fixed value; and they are less likely to dance in the evening or when there is an approaching storm, when there is a significant chance that other bees might not be capable of completing a return trip. And careful experimentation has shown that bees scouting for a new nest site will weigh up a number of factors, including cavity volume, shape, size and direction of entrance, height above ground, dampness, draftiness, and distance away. Moreover, dancing scouts will sometimes take time out to observe the dances of others and check out their discoveries, making a comparative assessment and then dancing accordingly (Gould and Gould, 1988).

Bees do not just accept and act on any information they are offered, either. On the contrary, they evaluate it along a number of dimensions. They check the nature and quality of the goal being offered (normally by sampling it, in the case of food). And they factor in the distance to the indicated site before deciding whether or not to fly out to it. Most strikingly, indeed, it has been claimed that bees will also integrate communicated information with the representations on their mental map, rejecting even rich sources of food that are being indicated to exist in the middle of a lake, for example.8

How should these bee capacities be explained? Plainly the processes in question cannot be associative ones, and these forms of bee learning are not conditioned responses to stimuli. Might the bee behaviors be explained through the existence of some sort of ‘subsumption architecture’ (Brooks, 1986)? That is, instead of having a central belief/desire system of the sort depicted in Figure 3, might bees have a suite of input-to-output modular systems, one for each different type of behavior? This suggestion is wildly implausible. For (depending on how one counts behaviors) there would have to be at least five of these input-to-output modules (perhaps dozens, if each different ‘goal’ amounts to a different behavior), each of which would have to duplicate many of the costly computational processes undertaken by the others.

… the information states to create a potentially unique behavior, never before seen in the life of that particular bee. It appears, indeed, that bees exemplify the architecture depicted in Figure 3. In which case, there can be minds that are capable of just a few dozen types of desire, and that are capable of just a few thousand types of belief.10 How simple minded can you be? Pretty simple.

6. STRUCTURE-DEPENDENT INFERENCE

Recall, however, that the conditions on genuine mindedness that we laid down in section 1 included not just a distinction between information states and goal states, but also that these states should interact with one another to determine behavior in ways that are sensitive to their compositional structures. Now, on the face of it this condition is satisfied. For if one and the same item of directional information can be drawn on both to guide a bee in search
of nectar and to guide the same bee return- There would have to be a scouting-from- ing to the hive, then it would seem that the the-hive module, a returning-to-the-hive bee must be capable of something resem- module, a deciding-to-dance-and-dancing bling the following pair of practical infer- module, a returning-to-food-source mod- ences (using BEL to represent belief, DES ule, and a perception-of-dance-and-flying- to represent desire, MOVE to represent to-food-source module. Within each of action—normally flight, but also walking these systems essentially the same com- for short distances—and square brackets putations of direction and distance infor- to represent contents). mation would have to be undertaken. (1) BEL [nectar is 200 meters north of hive] The only remotely plausible interpreta- BEL [here is at hive] tion of the data is that honey bees have a DES [nectar] suite of information-generating systems MOVE [200 meters north] that construct representations of the rela- tive directions and distances between a (2) BEL [nectar is 200 meters north of hive] variety of substances and properties and the BEL [here is at nectar] hive,9 as well as a number of goal-gener- DES [hive] MOVE [200 meters south] ating systems taking as inputs body states and a variety of kinds of contextual infor- These are inferences in which the con- mation, and generating a current goal as clusions depend upon structural relations output. Any one of these goal states can amongst the premises.11 then in principle interact with any one of 216 / AMERICAN PHILOSOPHICAL QUARTERLY
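The dance code and the pair of inferences above lend themselves to a toy computational rendering. The sketch below is purely illustrative: the tuple encoding of contents, the function names, and the figure of 50 meters per waggle are assumptions of this illustration, not claims drawn from the bee literature. Its point is only that a decoded dance can yield a structured belief, and that one and the same belief can combine with different desires and position beliefs to yield the distinct conclusions of inferences (1) and (2).

```python
# Illustrative sketch only: the tuple-encoded BEL/DES/MOVE contents and the
# matching procedure are assumptions of this example, not bee neurology.
# Compass bearings in degrees: 0 = north, 90 = east, 180 = south, 270 = west.

def decode_dance(angle_from_vertical, waggles, sun_azimuth, meters_per_waggle, substance):
    """Decode a waggle dance into a structured belief about a target.

    The dance angle is read relative to vertical; adding the sun's current
    azimuth converts it into a compass bearing from the hive. The paper's
    example: 30 degrees right of vertical at midday (sun azimuth 180) gives
    bearing 210, i.e., 30 degrees west of south.
    """
    bearing = (sun_azimuth + angle_from_vertical) % 360
    return ("vector", substance, waggles * meters_per_waggle, bearing, "hive")

def practical_inference(beliefs, desire):
    """Derive a MOVE from structured beliefs plus a current desire.

    The conclusion depends on where the desired item and the bee's location
    figure *inside* the premises: the same vector belief supports different
    movements given different desires and position beliefs.
    """
    here = next(b[1] for b in beliefs if b[0] == "at")
    for b in beliefs:
        if b[0] != "vector":
            continue
        _, substance, meters, bearing, origin = b
        if substance == desire and here == origin:
            # Inference (1): fly out along the stored vector.
            return ("MOVE", meters, bearing)
        if origin == desire and here == substance:
            # Inference (2): the very same belief, exploited in reverse.
            return ("MOVE", meters, (bearing + 180) % 360)
    return None

# A dance aimed 180 degrees from vertical at midday encodes a target due
# north of the hive; 4 waggles at 50 m per waggle encode 200 meters.
nectar_belief = decode_dance(180, 4, 180, 50, "nectar")

# Inference (1): at the hive, desiring nectar.
print(practical_inference([("at", "hive"), nectar_belief], "nectar"))
# -> ('MOVE', 200, 0), i.e., fly 200 meters north

# Inference (2): at the nectar, desiring the hive.
print(practical_inference([("at", "nectar"), nectar_belief], "hive"))
# -> ('MOVE', 200, 180), i.e., fly 200 meters south
```

Nothing here should be read as a hypothesis about how bees implement these transitions; the sketch only makes vivid that the conclusions of (1) and (2) are computable just in case the operation is sensitive to the internal structure of the premises.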

It might be suggested that we have moved too swiftly, however. For perhaps there needn't be a representation of the goal substance built explicitly into the structure of the directional information-state. To see why this might be so, notice that bees do not represent what it is that lies in the direction indicated as part of the content of their dance; and nor do observers acquire that information from the dance itself. Rather, dancing bees display the value on offer by carrying it; and observing bees know what is on offer by sampling some of what the dancing bee is carrying.

It might be claimed, then, that what really happens is this. An observing bee samples some of the dancing bee's load, and discovers that it is nectar, say. This keys the observer into its fly-in-the-direction-indicated sub-routine. The bee computes the necessary information from the details of the dance, and flies off towards the indicated spot. If it is lucky, it then discovers nectar-bearing flowers when it gets there and begins to forage. But at no point do the contents of goal-states and the contents of the information-states need to interact with one another.

This idea won't wash, however. Although the presence of nectar isn't explicitly represented in the content of the dance, it does need to be represented in the content of both the dancer's and the observer's belief-states. For recall that bees do not dance even for a rich source of nectar that is too far away (Gould and Gould, 1988). The distance-information therefore needs to be integrated with the substance-information in determining the decision to dance. Equally, observers ignore dances indicating even rich sources of nectar if the indicated distances are too great. So again, the distance information derived from the dance needs to be integrated with the value information before a decision can be reached. So we can conclude that not only do bees have distinct information states and goal states, but that such states interact with one another in ways that are sensitive to their contents in determining behavior. In which case bees really do exemplify the belief/desire architecture depicted in Figure 3, construed realistically.

One final worry remains, however. Do the belief and desire states in question satisfy what Evans (1982) calls 'the Generality Constraint'? This is a very plausible constraint on genuine (i.e., compositionally structured) concept possession. It tells us that any concept possessed by a thinker must be capable of combining appropriately with any other concept possessed by the same thinker. If you can think that a is F and you can think that b is G, then you must also be capable of thinking that a is G and that b is F. For these latter thoughts are built out of the very same components as the former ones, only combined together with one another differently.

Now, bees can represent the spatial relationships between nectar and hive, and between pollen and hive; but are they capable of representing the spatial relationships between nectar and pollen? Are bees capable of thoughts of the following form?

BEL [nectar is 200 meters north of pollen]

If not, it may be said, then bees cannot be counted as genuine concept-users; and so they cannot count as genuine believer/desirers, either.

It is possible that this particular example isn't a problem. Foragers returning to a nectar site to find it almost depleted might fly directly to any previously discovered foraging site that is nearby; including one containing pollen rather than nectar, if pollen is in sufficient demand back at the hive. But there will, almost certainly, be other relationships that are never explicitly represented. It is doubtful, for example, that any scout will ever explicitly represent the relations between one potential nest site and another (as opposed to some such belief being implicit in the information contained in a mental map). So it is doubtful whether any bee will ever form an explicit thought of the form:

BEL [cavity A is 200 meters north of cavity B]

Such examples are not really a problem for the Generality Constraint, however. From the fact that bees never form beliefs of a certain kind, it doesn't follow that they cannot. (Or at least, this doesn't follow in such a way as to undermine the claim that their beliefs are compositionally structured.) Suppose, first, that bees do construct genuine mental maps of their environment. Then it might just be that bees are only ever interested in the relationships amongst potential new nest sites and the existing colony, and not between the nest sites themselves. But the same sort of thing is equally true of human beings. Just as there are some spatial relationships that might be implicit in a bee's mental map, but never explicitly believed; so there are some things implicit in our beliefs about the world, but never explicitly entertained, either, because they are of no interest. Our beliefs, for example, collectively entail that mountains are less easy to eat than rocks. (We could at least pound a rock up into powder, which we might have some chance of swallowing.) But until finding oneself in need of a philosophical example, this isn't something one would ever have bothered to think. Likewise with the bees. The difference is just that bees do not do philosophy.

Suppose, on the other hand, that bees do not construct mental maps; rather they learn a variety of kinds of vector information linking food sources (or nest sites) and the hive, and linking landmarks and the hive. Then the reason why no bee will ever come to believe that one nest-cavity stands in a certain relation to another will have nothing to do with the alleged absence of genuine compositional structure from the bees' belief states, but will rather result from the mechanisms that give rise to new bee beliefs, combined with the bees' limited inferential abilities.

For here once again, part of the explanation is just that bees are only interested in a small sub-set of the spatial relations available to them. They only ever compute and encode the spatial relationships between desired substances and the hive, or between landmarks and the hive, not amongst the locations of those substances or those landmarks themselves. And nor do they have the inferential abilities needed to work out the vector and distance relations amongst landmarks from their existing spatial beliefs. But these facts give us no reason to claim that bees do not really employ compositionally structured belief states, which they integrate with a variety of kinds of desire state in such a way as to select an appropriate behavior. And in particular, the fact that the bees lack the ability to draw inferences freely and promiscuously amongst their belief states should not exclude them from having any belief states at all. Which is to say: these facts about the bees' limitations give us no reason for denying that bees possess simple minds.

How simple minded can you be, then? Pretty simple; and a good deal more simple than most philosophers seem prepared to allow.12

University of Maryland

NOTES

1. One notable philosophical exception is Tye (1997). While the present paper shares some of Tye's conclusions (specifically, that honey bees have beliefs and desires) it offers different arguments. And the main focus of Tye's paper is on the question whether insects have phenomenally conscious experiences. This is quite another question from ours. Whether bees are belief/desire reasoners is one thing; whether they are phenomenally conscious is quite another. For a negative verdict on the latter issue, see Carruthers, 2000. It should also be stressed that the main question before us in this paper is quite different from the ones that formed the focus of Bennett's famous discussion of honey bee behavior (Bennett, 1964). Bennett argued that bee signaling systems are not a genuine language, and that honey bees are not genuinely rational in the fully-fledged sense of 'rationality' that is distinctive of human beings. (His purpose was to build towards an analysis of the latter notion.) Our concern, rather, is just with the question whether bees have beliefs and desires. (On this question, Bennett expressed no clear opinion.)

2. Do informational states that do not interact with beliefs and desires deserve to be counted as genuine perceptions? Common sense suggests that they do. Ask someone whether they think that the fish sees the net sweeping towards it through the water and they will answer, 'Of course, because it swims deftly in such a way as to avoid it.' But ask whether the fish has thoughts about the net or its own impending capture, and many will express skepticism. Moreover, the 'dual systems' theory of vision (Milner and Goodale, 1995; Clark, 2002) suggests that the perceptual states that guide our own movements on-line are not available for belief or desire formation, either. Yet we wouldn't deny, even so, that our detailed movements occur as they do because we see the shapes and orientations of the objects surrounding us.

3. Other animals navigate by the stars, or by using the Earth's magnetic field. Night-migrating birds study the sky at night when they are chicks in the nest, thereby extracting a representation of the center of rotation of the stars in the night sky. When they later leave the nest, they use this information to guide them when flying south (in fall in the northern hemisphere) and again when flying north (in spring in the northern hemisphere). The representations in question are learned, not innate, as can be demonstrated by rearing chicks in a planetarium where they observe an artificially generated center of night-sky rotation.

4. Similarly, western scrub jays will update the representations on their map of cached foods, and behave accordingly, once they learn independently of the different decay rates of different types of food. Thereafter they access the faster-decaying caches first. See Clayton et al., 2003.

5. The same is surely true of humans. When one wants a beer and recalls that there is a beer in the fridge, and then sets out for the kitchen, one doesn't explicitly represent one's walking as the cause of one getting the beer. Rather, once one knows the location, one just starts to walk.

6. Bermúdez (2003), too, claims that there can be no genuine decision-making (and so no real belief/desire psychology) in the absence of representations of instrumental causality. And this seems to be because he, too, assumes that there are no forms of learning between classical kinds of conditioning and genuine causal belief. But given the reality of a variety of forms of spatial learning (reviewed briefly above), it seems unmotivated to insist that sophisticated navigation behaviors are not really guided by decision-making, merely on the grounds that there are no causal beliefs involved.

7. Note that if the 'dual visual systems' hypothesis of Milner and Goodale (1995) generalizes to other perceptual modalities and to other species, then there should really be two distinct 'percept boxes' in figure 3, one feeding into conceptual thought and decision making, and another feeding into the action schemata so as to provide fine-grained on-line guidance of movement.

8. In these experiments two groups of bees were trained to fly to weak sugar solutions equidistant from the hive, one on a boat in the middle of a lake, and one on the lake shore. When both sugar solutions were increased dramatically, both sets of bees danced on returning to the hive. None of the receiving bees flew out across the lake. But this wasn't just a reluctance to fly over water. In experiments where the boat was moved progressively closer and closer to the far lake shore, more and more receiving bees were prepared to fly to it. See Gould and Gould, 1988.

9. Note that this satisfies the two criteria laid down by Bennett (1964, §4) for a languageless creature to possess beliefs. One is that the creature should be capable of learning. And the other is that the belief-states should be sensitive to a variety of different kinds of evidence.

10. The bee's capacity for representing spatial relations is by no means unlimited. There is probably an upper limit on distances that can be represented. And discriminations of direction are relatively crude (at least, by comparison with the almost pin-point accuracy of the Tunisian desert ant; see Wehner and Srinivasan, 1981). However, bees are also capable of forming a limited range of other sorts of belief, too. They come to believe that certain odors and colors signal nectar or pollen, for example (Gould and Gould, 1988).

11. Is there some way of specifying in general terms the practical inference rule that is at work here? Indeed there is. The rule might be something like the following: BEL [here is at x], BEL [F is m meters and n° from x], DES [F] → MOVE [m meters at n°]. This would require the insertion of an extra premise into argument (2) above, transforming the first premise into the form, BEL [hive is 200 meters south of nectar].

12. The author is grateful to his students from a recent graduate seminar on the architecture of the mind for their skeptical discussion of this material, and to Ken Cheng, Anthony Dickinson, Keith Frankish, Robert Kirk, and Mike Tetzlaff for comments on an earlier draft.

REFERENCES

Bennett, J. 1964. Rationality: An Essay Towards an Analysis. London: Routledge.
Bermúdez, J. 2003. Thinking without Words. Oxford: Oxford University Press.
Brooks, R. 1986. "A Robust Layered Control System for a Mobile Robot." IEEE Journal of Robotics and Automation, vol. RA-2, pp. 14–23.
Carruthers, P. 2000. Phenomenal Consciousness: A Naturalistic Theory. Cambridge: Cambridge University Press.
Clark, A. 2002. "Visual Experience and Motor Action: Are the Bonds Too Tight?" Philosophical Review, vol. 110, pp. 495–520.
Clayton, N., N. Emery, and A. Dickinson. 2003. "The Rationality of Animal Memory: The Cognition of Caching." In Animal Rationality, ed. S. Hurley. Oxford: Oxford University Press.
Collett, T., and M. Collett. 2002. "Memory Use in Insect Visual Navigation." Nature Reviews: Neuroscience, vol. 3, pp. 542–552.
Damasio, A. 1994. Descartes' Error. Picador Press.
Davidson, D. 1975. "Thought and Talk." In Mind and Language, ed. S. Guttenplan. Oxford: Oxford University Press.
Dennett, D. 1987. The Intentional Stance. Cambridge, Mass.: MIT Press.
———. 1991. Consciousness Explained. Allen Lane.
Dickinson, A., and B. Balleine. 1994. "Motivational Control of Goal-Directed Action." Animal Learning and Behavior, vol. 22, pp. 1–18.

———. 2000. "Causal Cognition and Goal-Directed Action." In The Evolution of Cognition, ed. C. Heyes and L. Huber. Cambridge, Mass.: MIT Press.
Dickinson, A., and D. Charnock. 1985. "Contingency Effects with Maintained Instrumental Reinforcement." Quarterly Journal of Experimental Psychology, vol. 37B, pp. 397–416.
Dickinson, A., and D. Shanks. 1995. "Instrumental Action and Causal Representation." In Causal Cognition, ed. D. Sperber, D. Premack, and A. Premack. Oxford: Oxford University Press.
Dreyfus, L. 1991. "Local Shifts in Relative Reinforcement Rate and Time Allocation on Concurrent Schedules." Journal of Experimental Psychology, vol. 17, pp. 486–502.
Dyer, F., and J. Dickinson. 1994. "Development of Sun Compensation by Honeybees." Proceedings of the National Academy of Sciences, vol. 91, pp. 4471–4474.
Evans, G. 1982. The Varieties of Reference. Oxford: Oxford University Press.
Gallistel, R. 1990. The Organization of Learning. Cambridge, Mass.: MIT Press.
———. 2000. "The Replacement of General-Purpose Learning Models with Adaptively Specialized Learning Modules." In The New Cognitive Neurosciences (second edition), ed. M. Gazzaniga. Cambridge, Mass.: MIT Press.
Gallistel, R., and J. Gibson. 2001. "Time, Rate and Conditioning." Psychological Review, vol. 108, pp. 289–344.
Gould, J. 1986. "The Locale Map of Bees: Do Insects Have Cognitive Maps?" Science, vol. 232, pp. 861–863.
Gould, J., and C. Gould. 1988. The Honey Bee. Scientific American Library.
———. 1994. The Animal Mind. Scientific American Library.
Harper, D. 1982. "Competitive Foraging in Mallards." Animal Behavior, vol. 30, pp. 575–584.
Heyes, C., and A. Dickinson. 1990. "The Intentionality of Animal Action." Mind and Language, vol. 5, pp. 87–104.
Marcus, G. 2001. The Algebraic Mind. Cambridge, Mass.: MIT Press.
Mark, T., and R. Gallistel. 1994. "The Kinetics of Matching." Journal of Experimental Psychology, vol. 20, pp. 1–17.
McDowell, J. 1994. Mind and World. Cambridge, Mass.: MIT Press.
Menzel, R., R. Brandt, A. Gumbert, B. Komischke, and J. Kunze. 2000. "Two Spatial Memories for Honeybee Navigation." Proceedings of the Royal Society: London B, vol. 267, pp. 961–966.
Milner, D., and M. Goodale. 1995. The Visual Brain in Action. Oxford: Oxford University Press.
Rolls, E. 1999. Emotion and the Brain. Oxford: Oxford University Press.
Searle, J. 1992. The Rediscovery of the Mind. Cambridge, Mass.: MIT Press.
Seeley, T. 1995. The Wisdom of the Hive: The Social Physiology of Honey Bee Colonies. Cambridge, Mass.: Harvard University Press.
Tye, M. 1997. "The Problem of Simple Minds." Philosophical Studies, vol. 88, pp. 289–317.
Walker, S. 1983. Animal Thought. London: Routledge.
Wehner, R. 1994. "The Polarization-Vision Project." In The Neural Basis of Behavioral Adaptations, ed. K. Schildberger and N. Elsner. Gustav Fischer.
Wehner, R., and M. Srinivasan. 1981. "Searching Behavior of Desert Ants." Journal of Comparative Physiology, vol. 142, pp. 315–338.