The behavioral decision-making architecture

Markus Domeier* & Pierre Sachse*

* Institute of Psychology, Leopold-Franzens-University Innsbruck

ABSTRACT Decision-makers in real life often have to deal with different situational influences while making a decision. They don't know the odds of the outcomes of different options and thus make their decisions under uncertainty. Moreover, most real-life situations are fast changing and dynamic, and the decision-maker doesn't always know the exact cause of a given circumstance. This intransparency and interdependency of the decision's different elements can lead to a high complexity of the situation (Schroda, 2000) and thus to a difficult decision. Potential consequences are, besides errors, cognitive biases in the decision-making process, which can lead to erroneous decisions. But why do these systematic unconscious effects occur so frequently, and what makes them so robust? This paper investigates the mechanisms and processes which lead to biased decisions. To this end, a Behavioral Decision-Making Architecture model is presented. It takes a closer look at the interaction between the characteristics of complex situations (Schroda, 2000), the computational architecture of psychological processes (PSI theory, Dörner & Güss, 2013), and the occurrence of cognitive biases (Carter, Kaufmann & Michel, 2007), as well as their behavioral consequences in the decision-making process. The model depicts these processes and provides an approach to explain the unconscious upside (positive influence on motivational needs) of cognitive biases.

Keywords Behavioral Decision-Making Architecture – PSI theory – Cognitive Biases – Erroneous Decisions – Real-life Decisions

1 Introduction

In everyday life, we come across a lot of decisions, many small ones and some bigger ones. Usually, one would assume that when you do something on a daily basis, you become better at it over time. Consequently, over the years, we should master the art of decision-making, which allows us to make good decisions, especially when they are important. A group of people who should be very experienced in decision-making are managers. A manager's job is, in fact, to take decisions on a frequent basis, so that this amount of practice should enable them to make good decisions without any problems. But reality looks different, as the following example shows.

January 19, 2012 marked the temporary end of a 120-year success story. With the filing for insolvency (in accordance with Chapter 11 of the Bankruptcy Code), Eastman Kodak applied for creditor protection. Until the 1990s, Kodak was one of the five most valuable brands in the world. The company, which was founded in 1892 by George Eastman, had a fabulous rise behind it and quickly became an entrepreneurial success story. With the introduction of the „Kodak Brownie" in 1900, every photographer could afford a camera for the price of $ 1. The related sale of films secured the company a highly profitable business for a long time. Thus, for the following decades, Kodak was omnipresent in the US and European markets. In 1988, the photo empire had over 145,000 employees, and only three years later their sales rose to nearly $ 20 billion (The Economist, 2012).

However, in 2012, the former world market leader was no longer able to pay its suppliers, employees, and partners. What had happened? How could a company that dominated the photo market with its numerous inventions for over a century go bankrupt?

In the case of Kodak, a decisive point was the advent of digital photography, which took away a large part of the market share for analog photography. Ignoring this new technology was afterwards called Kodak's „cardinal error of the management board" (Lehky, 2012, p.1). Ironically, Kodak engineer Steven Sasson invented the first digital camera in-house in the 1970s.


When Sasson presented his first functional prototype of a digital camera to the management, they replied: „That's cute – but do not tell anyone about it" (Deutsch, 2008, p.1) – as he later reported to the New York Times. Despite the fact that, with a weight of 4 kg and a low resolution, the camera was still far away from mass production, its potential was already emerging. The managers of the Eastman Kodak Company, however, completely underestimated this new business field (dpa, 2012). For this reason, very little had been invested in the digital solution. With a price of $ 20,000, Kodak's first digital cameras were far too expensive for the consumer. Regarding this, a former Kodak manager told the WirtschaftsWoche (Eckl-Dorna, 2010, p.2), „This was further developed, but not further promoted".

Although the company had been facing digitalization back then, Kodak entered the digital market only in 2004. However, at this time, digital cameras had replaced traditional analog cameras to a large extent, and digital cameras themselves were already being threatened by the increasing popularity of mobile phone cameras. In addition, the competitors, among them Sony, Fuji, Canon, and Olympus, already had a huge share in the market of digital photography. For Kodak, this competitive struggle ended with the filing of the bankruptcy petition (Gödecke, 2011).

Christensen (2013) has raised the question addressing the underlying reasons of such phenomena: Why do market leaders lose their market share in such a significant way? Have they simply been mismanaged for many years? Christensen himself contradicts this assumption: „an alternative explanation, however, is that these failed firms were as well-run as one could expect a firm managed by mortals to be – but that there is something about the way decisions are made in successful organizations that sows the seed of eventual failure" (p. 6). Also, Campbell, Whitehead and Finkelstein (2009), based on their analysis of management decisions, state that „the daunting reality is that enormously important decisions made by intelligent, responsible people with the best information and intentions are sometimes hopelessly flawed" (p.2).

How is it possible that experienced and well-trained managers and motivated founders make decisions for which the outcome is sometimes far away from the actual needs? The decline of the Eastman Kodak Company is certainly an extreme case, but it serves as a good example for the impact of wrong decisions. Inevitably, companies all over the world repeatedly make decisions that lead to suboptimal results. The problem is that unforeseeable factors come into play, which influence the results afterwards and couldn't have been prevented (Edwards, 1984). However, in the analysis of wrong decisions there are also cases in which people could have decided in another, perhaps better way. These cases have striking patterns which show up again and again.

Based on the example of the Eastman Kodak Company, the question arises how such decisions can be explained. Let us assume that the managers were well educated, intelligent and wanted the best for the company – how can such decisions happen? Why did they only react when it was already far too late? An explanation along the lines that the managers acted in an irrational way is too simple and not satisfying. The question is: what are the underlying psychological mechanisms behind this seemingly irrational behavior?

2 Theoretical Background

The reason for such devastating decisions is often based on automatic cognitive processes of the individual decision-maker (Dörner & Güss, 2013; Carter, Kaufmann & Michel, 2007). These unconscious, systematically recurring processes are called cognitive biases. This explains why, as Campbell, Whitehead and Finkelstein (2009) notice, experience, intelligence, and good intentions may not necessarily avoid such biases and the associated sub-optimal decisions. Fiske and Taylor (2013) provide a key reason for the occurrence of these biases. They claim that people are cognitive misers. Many decisions, especially in everyday life, happen intuitively (Kahneman, 2003) or through heuristics (Gigerenzer & Gaissmaier, 2011). These time- and effort-reducing decision-making strategies are mostly easy and quick solutions, but can lead to fatal errors if, for example, the surrounding situation has changed. We want to take a closer look at this – as Dörner (2003) calls it – logic behind failures. Therefore, in the following, the important aspects of decision-making in real life are highlighted.

2.1 Erroneous Decisions

There is a broad interest in the research question why decisions fail. Nutt (2002) states: „Debacles highlight blunders. They offer insights into how a decision can go wrong, why it went wrong, and what changes in decision-making practices could improve the chance of success" (p.8). Thus, failed decisions offer the possibility to deduce the mechanisms of decision-making.

Not every failed decision has the same impact. Therefore, Spychiger's (2008) classification of errors can be applied to the results of failed decisions. The decisions can be described on two dimensions, consequence and reversibility. The four kinds of errors are clustered as big and bad errors (high consequences and low reversibility), smaller errors, annoyances (low consequences and low reversibility), smaller errors, bagatelles (low consequences and high reversibility), and noticeable but not tragic errors (high consequences and high reversibility).

When can we say that a decision has failed? Yates, Veinott and Patalano (2003) defined 5 criteria for good decision-making. According to their theory, an erroneous decision doesn't fulfill one or more of these criteria. The factors are the aim criterion (the decision is congruent with the former objective of the decision), the need criterion (the decision fulfills the needs of the decision-maker), the aggregated outcomes criterion (the chosen option is the best one in the set of all possible options) and the process cost criterion (the decision needed a minimal amount of resources).

Real-life decisions are affected by chance or environmental influences and are mostly made under uncertainty. So when looking at erroneous decisions, a precondition is that the circumstances which led to the error could have been influenced, avoided, or predicted by the decision-maker (Zapf, Frese & Brodbeck, 1999). Having enough degrees of freedom to make the decision in another or even better way is another important precondition to classify a decision as erroneous.

Moreover, the needed information must have been potentially available to the decision-maker. Hacker and Sachse (2014) classify erroneous actions due to lack of information, which may be caused by objectively missing information or by deficiencies in the usage of available information. Examples of such deficiencies are the failure to use available information (overlooking, forgetting, skipping, subjective information reduction and mental capacity limits) or its incorrect usage (at the point of orientation, the design of programs, the fit in actions and the decision-making).

But only looking at the results can be distorting. Good decisions may have a bad outcome and bad decisions may turn out to have good results. Focusing on the outcome doesn't seem to be a mutually exclusive way to separate good from bad decision-making (Brown, Kahr & Peterson, 1974; Edwards, 1984). As the outcome doesn't perfectly indicate whether a decision is good or bad, the focus lies on the decision process when it comes to assessing decisions. A good decision process increases the chance of getting good results. Errors or biases in the process may result in suboptimal output.

2.2 Decision Process

Decision-making can be seen as a process with sequential steps. There are different approaches about how the exact sequence is shaped and which parts are included. Most decision process models show the three main phases „Situational Awareness", „Situation Analysis" and „Action" as described in Jost (2001). At first, a decision-maker has to be aware that there is a problem and that a decision has to be made. Second, the situation is analyzed with the aim to gain further information. In the third phase, a decision is taken on the basis of the found and weighted information, and the result is implemented. Several decision process models follow roughly this structure (e.g., Bazerman & Moore, 2013). The PrOACT (acronym for the core steps) approach of Hammond, Keeney and Raiffa (1998) has eight elements. The core steps, suggested by the authors, encompass: problem, objectives, alternatives, consequences, and tradeoffs. Additionally, the last three steps address the role of uncertainty, risk tolerance, and the consideration of linked decisions.

Dörner (2012) describes five steps of a decision-making process (finding objectives, situation analysis, forecast, planning and deciding, action, and self-reflection). Especially the forecast is an interesting addition to the aforementioned sequence. It means the anticipation of upcoming events or developments which might be important for the selection of the right alternative. In their seven-step decision process model, Grünig and Kühn (2013) also include the step „Determining environmental scenarios", which has the same purpose.

2.3 Cognitive Biases

As mentioned in the introduction, the reasons for failed decisions might be errors or biases. The difference is that an error occurs randomly, but a bias occurs in a systematic way and therefore can be prevented. There is a huge amount of literature about cognitive biases (e.g., Kahneman, 2011; Ariely, 2008; Ariely & Jones, 2010). Depending on which collection is referred to, there are over 100 types of this „systematic departure from the normative standard of judgment" (Kerr, MacCoun & Kramer, 1996, p.688). Many researchers have tried to find logical categories to deal with the different biases. A clear taxonomy divides the biases into categories which put similar ones together and is mutually exclusive and exhaustive. These groups should differ from the others and should show a higher internal consistency. The struggle is that there is a lack of agreement on the definitions of biases and their groupings. Most of the taxonomies in the literature sorted the biases along a literature review. The categories represent the origin of their biases or the aspects they have in common. To name a few, with this method, Bhandari and Hassanein (2012) found three categories (Cognitive biases, Affective biases and Conative biases), Stanovich, Toplak, and West (2008) presented four categories (cognitive miserliness, override failure, mindware gap and contaminated mindware), and Arnott (1998) developed six categories (Memory biases, Statistical biases, Confidence biases, Adjustment biases, Presentation biases, and Situation biases), as did McFadden (1999; context, reference point, availability, superstition, process and projection).

Carter, Kaufmann and Michel (2007) identified 76 biases in their literature review. They conducted a qualitative cluster analysis and grouped the biases with the Q-sort methodology. Finally, the authors assigned the 76 biases to nine categories (Availability cognition biases, Base rate biases, Commitment biases, Confirmatory biases, Control illusion biases, Output evaluation biases, Persistence biases, Presentation biases, and Reference point biases). The following Table 1 is based on their meta-analysis.

2.4 Situation Characteristics

What characterizes real-life decisions? In which situations do we have problems or struggle with making a decision? Dörner (2003) describes four different aspects which characterize a complex situation: complexity, connectedness, intransparency, and dynamics. Moreover, people often lack knowledge on how to cope with the situation. In this tradition, Schroda (2000) analyzed construction tasks for engineers and identified six aspects which describe the complexity of a task, namely conflicting goals, complexity, transparency, degrees of freedom, dynamics, and required knowledge. Depending on these aspects, a problem can be described as more or less complex.

Higher levels in several aspects lead to a higher complexity of the task. Conflicting goals means that a task has several goals that can be contradictory. Complexity refers to whether a problem has dependent subfunctions, to their number, and to the strength of their interconnections.

Table 1: Description of categories after Carter, Kaufmann and Michel (2007).

Availability cognition bias. Information is judged as more probable due to its cognitive availability. Associated biases: Availability; Country of origin; Cultural; Familiarity; Home; Imaginability; Recall.

Base rate bias. The decision-maker ignores the base rates and gives less relevant data more weight. Associated biases: Base rate; Recency effect; Subset.

Commitment bias. Due to a commitment in the past (time, money, effort, resources etc.), the decision-maker sticks with the option he invested in, even when it is not the best possible option. Associated biases: Aversion to regret; Concorde fallacy; Endowment; Escalating commitment; Escalation; Loss aversion; Sunk costs fallacy.

Confirmatory bias. To confirm one's own hypotheses or opinions about specific facts, only the information that agrees with the prior opinion is searched for; other information is ignored. Moreover, other information is interpreted in the same manner. Associated biases: Aversion to ambiguity; Belief; Confirmation; Confirmation evidence; Confirmatory; Desire; Fact-value confusion; Halo effect; Prior hypothesis bias; Selectivity; Self-fulfilling prophecy; Wishful thinking.

Control illusion bias. Random events are weighted in favor of the accuracy or certainty of one's own judgment. Unrealistic assumptions may be the consequence. Associated biases: Attenuation; Chance; Completeness; Complexity; Conjunction; Control; Correlation; Disjunction; False consensus; Gambler's fallacy; Hot hand fallacy; Impact; Law of small numbers; Magical thinking; Overconfidence; Planning fallacy; Sample; Similarity; Test.

Output evaluation bias. The extent to which past events could have been predicted is overestimated, or reasons are misattributed in retrospect. Associated biases: Egocentric; Hindsight; Self-serving; Success; Testimony.

Persistence bias. An option is chosen with a higher probability when it has been chosen before. Associated biases: Habit; Persistence; Status quo.

Presentation bias. The presentation of the stimulus material influences the perception of subsequent information in a systematic way. Associated biases: Contrast; Framing; Frequency/redundancy; Issue valence; Mere exposure effect; Mode; Order; Primacy effect; Scale; Search; Series position effect; Von Restorff effect.

Reference point bias. Judgments and evaluations are influenced by initial positions and thus are not adjusted in a sufficient way. Associated biases: Anchoring and adjustment; Conservatism; First impression; Non-linear extrapolation; Reference; Regression.
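Read linearly, Table 1 is essentially a mapping from nine bias categories to the individual biases assigned to them. The following abbreviated rendering as a data structure is our own illustration; the category names follow Carter, Kaufmann and Michel (2007), but only a few associated biases per category are listed:

```python
# Abbreviated rendering of Table 1 (Carter, Kaufmann & Michel, 2007);
# the selection of associated biases per category is shortened here.
BIAS_TAXONOMY = {
    "Availability cognition bias": ["Availability", "Familiarity", "Recall"],
    "Base rate bias": ["Base rate", "Recency effect", "Subset"],
    "Commitment bias": ["Sunk costs fallacy", "Escalating commitment", "Loss aversion"],
    "Confirmatory bias": ["Confirmation", "Selectivity", "Wishful thinking"],
    "Control illusion bias": ["Overconfidence", "Planning fallacy", "Gambler's fallacy"],
    "Output evaluation bias": ["Hindsight", "Self-serving", "Egocentric"],
    "Persistence bias": ["Habit", "Status quo"],
    "Presentation bias": ["Framing", "Order", "Primacy effect"],
    "Reference point bias": ["Anchoring and adjustment", "Conservatism"],
}

def category_of(bias_name):
    """Look up the category a single bias belongs to (None if not listed)."""
    for category, biases in BIAS_TAXONOMY.items():
        if bias_name in biases:
            return category
    return None

print(category_of("Sunk costs fallacy"))  # -> "Commitment bias"
```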

The objective availability of information leads to transparency, and vice versa to intransparency if the information is not available or accountable. Degrees of freedom are defined as the number of different solution attempts available to the decision-maker. A dynamic problem is subject to temporal changes of a decision's future impact. However, this impact is predictable only to a limited extent. The required knowledge is divided into knowledge about the conditions and about the methods of processing a problem (Schroda, 2000).

Most of these aspects can also be found in the features of Naturalistic Decision Making (Klein, 2008). Von der Weth (2001) takes individual differences into account. He proposes a description of complexity on three levels. On the job level, objective sources of complexity are analyzed. The requirement level defines complexity in relation to performance prerequisites. On the experience level, the individual representation of complexity is considered.

2.5 PSI Theory

How does the human mind work when solving problems in real-life situations? How does it process information and make decisions? With the PSI theory (named after the 23rd letter of the Greek alphabet, which is associated with psychology), Dörner presents an architecture of the human mind which attempts to explain human behavior through the interaction of cognition, motivation, and emotion (Dörner & Güss, 2013; Dörner, 1999). He compares the model with other cognitive architectures like SOAR (States, Operators and Results; Newell, 1987) and ACT-R (Adaptive Control of Thought – Rational; Anderson, 1990). The PSI theory differs from the aforementioned models as it attempts to integrate cognitive, motivational, and emotional processes as well as their interaction into one architecture. For a detailed explanation of the model, see Dörner and Güss (2013).

Cognition in PSI is described with quads. Quads are defined as one central neuron that is surrounded by and connected to four other neurons. These four neurons point forward, backward, downward, and upward, respectively. When a need triggers a goal that could fulfill the need, a search process is initiated. The upward neuron looks for contexts that may contain a certain goal (e.g., searching for a restaurant when craving a burger). When the context is found, the backward neuron is activated, looking for successful behavior tendencies from the past to reach the goal. If such a behavior tendency is stored in memory, it can be executed; if not, a new behavior sequence has to be created. This is what Dörner and Güss (2013) describe as planning: „One adjusts to the demand of the situation before the situation actually has happened" (p.306). A more detailed explanation of the quad structure is given in Dörner et al. (2002).

PSI refers to different needs. They can be divided into three categories. Besides physiological needs (need for food, sleep, and sexuality), there are informational needs (need for competence and certainty) and a social need (need for affiliation). Dörner and Güss (2013) postulate that every human motivation can be explained through the aforementioned five needs. In comparison with the findings of Reiss (2004), who identified 16 human motives from a survey with more than 20,000 participants, Dörner and Güss (2013) conclude that these 16 motives can be summarized within their model of five basic needs, namely "existential needs (eating, physical activity, tranquility), sexuality (romance), affiliation (acceptance, honor, social contact, family), certainty (order, savings, curiosity, idealism), and competence (power, status, independence, vengeance)" (p.302).

In PSI, these needs are symbolized as tanks. These tanks have an inflow and an outflow. Related positive signals lead to an inflow and thus to a higher level in the specific tank; related negative information, in turn, can lead to an outflow. Moreover, there is consumption over time – a positive inflow does not have an everlasting effect – so that there is always a continuous drain. On the one hand, a high level in a tank leads to a pleasure signal. On the other hand, a low level or the anticipation of a further loss leads to a displeasure signal. Low levels in the tanks indicate that one or more needs should be fulfilled. According to the motive strength of every specific need, the one with the highest product is selected and becomes the dominant motive (for the detailed calculation see Dörner & Güss, 2013). When talking about needs, we focus on the informational and social needs, which are explained in more detail hereinafter.
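Before turning to the individual needs, the tank metaphor and the selection of a dominant motive can be made concrete in a small computational sketch. The update rule and the way motive strength is computed below (deviation from the set point multiplied by a weight standing in for urgency/importance) are simplified assumptions of ours; the actual calculation is given in Dörner and Güss (2013):

```python
class NeedTank:
    """Simplified PSI-style need tank with inflow, outflow and continuous drain.
    Levels and rates are on an arbitrary 0..1 scale (our assumption)."""
    def __init__(self, name, level=0.8, set_point=0.7, drain=0.02):
        self.name, self.level, self.set_point, self.drain = name, level, set_point, drain

    def update(self, inflow=0.0, outflow=0.0):
        # Positive signals fill the tank; negative signals and time empty it.
        self.level = max(0.0, min(1.0, self.level + inflow - outflow - self.drain))

    def deviation(self):
        # How far the level has fallen below the set point (0 = need satisfied).
        return max(0.0, self.set_point - self.level)

def dominant_motive(tanks, weights):
    """Pick the need whose product (deviation * weight) is largest.
    This is only an illustrative stand-in for the detailed formula
    in Dörner & Güss (2013)."""
    return max(tanks, key=lambda t: t.deviation() * weights[t.name])

tanks = [NeedTank("certainty"), NeedTank("competence"), NeedTank("affiliation")]
weights = {"certainty": 1.0, "competence": 0.8, "affiliation": 0.6}
tanks[0].update(outflow=0.4)                 # e.g., an intransparent, dynamic situation
print(dominant_motive(tanks, weights).name)  # -> "certainty"
```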

The need for competence is described as the extent to which a person feels capable of solving problems in his or her environment (Dörner, 2012). A high level in the tank for competence means active control of the situation and the ability to cope with it (Dörner, Gerdes & Hagg, 2008). The need for competence can also be described as the need for power, control, or autonomy and is related to self-confidence and self-worth (Dörner & Güss, 2011).

The competence tank is filled by the fulfillment of other needs, when the person experiences success, especially in difficult situations, and in general through the perception of efficacy signals. The tank is emptied by failed attempts to fulfill needs and by failure signals in general.

The need for certainty is described as the need for passive control of the environment. That means knowing or being able to predict what is going to happen in the future and what it means for one's own needs (Brüggemann, Strohschneider & Rek, 2006). A confirmation of one's own knowledge of the future and the elimination of uncertainty is targeted. Also, the need for efficacy of one's actions is described here. The tank is filled when predicted events occur or when an action has the expected consequences. It is emptied by events which are unexpected or new or whose probability of occurrence is unknown (Dörner & Güss, 2013).

The need for affiliation is understood as a need for legitimacy signals, which means accordance with the norms of a social group. The need gets higher with the lapse of time. The tank is filled by group-conforming behavior or the feeling of acceptance in the group, by positive signals as a consequence of supportive and/or helping behavior, and in general by signals of legitimacy. The tank is emptied when people experience rejection, exclusion or a lack of recognition.

Emotion plays a special role within the PSI theory. It is not seen as a process besides motivation and cognition, but as a psychological macro, meaning an overarching model of organization. Dörner and Güss (2013) sum up: „As we will see, motivations determine what must be done, emotions determine how it is to be done" (p.307).

There is a need to explain behavior with emotions. Without them, no model can provide any good prediction. Looking for a satisfactory definition of emotions, Dörner and Güss take a closer look at the work of Ekman (1992, 1999), who proposed basic emotions, and the early work of Wundt (1896), who characterized emotions as states that are organized along three different dimensions (pleasure-displeasure, tension-relaxation, and excitement-inhibition/tranquilization). Dörner and Güss (2013) conclude that the approach of simply dividing emotions into categories (basic emotions) doesn't provide a sufficient explanation, as this approach does not clarify what emotions really are. In PSI, emotion is seen as a modulation of cognitive-motivational processes in combination with specific behavior tendencies. To explain human behavior, these three entities are inextricably linked. This view of emotion is similar to that of Scherer (2009), who defines emotion as "an emergent, dynamic process based on an individual's subjective appraisal of significant events" (p.1307).

The PSI theory was validated by the prediction of human behavior, by the construction of artificial agents (Dörner, Gerdes & Hagg, 2008), by analyzing historical events (Dörner & Güss, 2013), and by experimental comparisons with human behavior (Dörner et al., 2002).

2.6 Synthesis

The decision cases to which the following model is applied are erroneous decisions. They are defined as decisions which violate one or more criteria for a good decision from Yates, Veinott, and Patalano (2003), could have been influenced or prevented (Zapf, Frese & Brodbeck, 1999), and where the needed information had been potentially available to the decision-maker (Hacker & Sachse, 2014). Cases which have high consequences and aren't reversible are the most interesting ones (Spychiger, 2008).

We follow the decision process steps of Dörner (2012) in a slightly adapted way. To uncover the mechanism behind decisions which went wrong, we take a closer look at the systematic occurrence of cognitive biases. For this, the taxonomy of Carter, Kaufmann and Michel (2007) provides a good basis. To detect the complexity of situations in real life and its influence on the decision-making process, the characteristics of Schroda (2000) are used. With its holistic approach, which integrates cognitive, motivational, and emotional aspects as well as their interaction in one theory, the PSI theory (Dörner, 1999; Dörner & Güss, 2013) provides the fundamental computational mechanisms.

As the authors are aware of no model which covers all of the above-mentioned aspects and processes of erroneous real-life decisions, the Behavioral Decision-Making Architecture was designed. The goal is to provide a holistic model that describes and explains erroneous real-life decisions. The model integrates important aspects like the influence of the situation on cognitive, motivational, and emotional processes, and depicts the dynamics of the decision process as well as the effects of the cognitive biases. In the following, the Behavioral Decision-Making Architecture is described in more detail.

3 The Behavioral Decision-Making Architecture

The present model (Figure 1) attempts to combine the PSI theory and the research on cognitive biases in order to explain the occurrence of erroneous decisions through the interaction of situation, cognition, motivation, and emotion. The goal is to provide an architecture that explains real-life decisions with a negative outcome due to failures – conscious or unconscious – in the decision-making process.

These errors occur due to two reasons (Dörner, 2003; Schaub, 1996). First, there are cognitive reasons. Humans have bounded processing and a limited memory capacity. This leads them to think economically. The consequence is that complex systems are simplified and steps in the decision process are skipped.

Figure 1: The Behavioral Decision-Making Architecture.

Second, there are motivational reasons. Depending on the kind of information flowing in, the motivational processes are influenced. The most important ones, such as the need for competence, the need for certainty, and the need for affiliation, are building different motives. These motives can influence our cognition, and therefore our emotions. In the adapted PSI model (Dörner & Güss, 2013), emotions are seen as a modulation of cognitive-motivational processes equipped with specific behavior tendencies. The influence of needs on cognition may increase the occurrence of cognitive biases (Carter, Kaufmann & Michel, 2007; Dörner, 2003) – confirmatory biases, control illusion biases, commitment biases etc. – in the decision-making process (Dörner, 2012), with the aim to cope with the uncertainty and complexity of the situation. As a consequence, instead of satisfying the decision's demands, only the fulfillment of personal needs may be addressed by the decision. Biased cognition can be a personal short-term advantage (optimism, protection of competence, keeping up a positive self-perception etc.) – but in the long run, it might lead to suboptimal decisions.

3.1 Description of the Process

The Behavioral Decision-Making Architecture proposes that the process is divided into eight steps (see Figure 1). The following paragraphs explain those steps in more detail. Plus signs mean positive influence, minus signs negative influence (+/-).

1. Situation-Characteristics. The characteristics of the situation influence the cognitive needs (+/-), which are represented by the three tanks (Affiliation, Competence, and Certainty). This influence can lead to an inflow (positive information) or to an outflow (negative information). The situation is perceived as more demanding when it contains increasing conflicting goals, connected subfunctions (complexity), intransparency, dynamics, a lot of degrees of freedom and demands of knowledge (Schroda, 2000). According to the levels proposed by von der Weth (2001), the characteristics of the situation are an objective source of complexity.

2. Need-Indicator. The context of the situational information becomes more important, and moreover its meaning for the decision-maker. Due to its complexity, is the situation a threat to the need for competence, as the person is not feeling capable of coping with the situation? In addition to that, is the decision-maker losing passive control and the ability to predict future events? When other people are involved, the complex decision could also be a threat to the need for affiliation, as the status in the group might be in danger. The decision-maker's interpretation of the situation (see von der Weth, 2001, requirement and experience level) leads to an in- or outflow and thus to a deviation of the level in the different tanks. If the level falls under a specific set point (negative information or consumption over time), a need is indicated. The kind of need depends on the information flowing in. An unsatisfied need also means a displeasure signal for the decision-maker.

3. Need-Selector. To satisfy the needs and to fill up the tanks, an actual need is selected out of the several tanks and their combinations. This need or an amalgam of needs becomes an action-leading motive.

4. Decision Process. The different steps of the decision-making process are used to reduce the deviation in the tanks from their targeted set points. At first, the problem of the situation has to be recognized in order to set goals which should be achieved, followed by the analysis of the situation. The future development of events is forecasted and, depending on this, alternatives are generated and weighted. The steps can be taken sequentially, but as the whole cycle has several recurring phases, some steps might also be left out or skipped. Taking steps back is also possible.

5. Cognitive filter. The steps aim at increasing an insufficient present tank level to an aspired higher target level. For example, with more information, uncertainty can be reduced. However, if there is too much information, which the decision-maker can't cope with, uncertainty increases. The cognitive filter can reflect the information from the decision-making process, depending on its resolution level (see step 6). This process of self-reflection can reduce competence and certainty in the short run. However, in the long run, it can increase the level in the tanks as past failures are analyzed and potentially avoided (Dörner & Güss, 2011). Thus, the process itself can lead to in- or outflow in the tanks.

6. Internal Modulation. The assessment of the information flowing in and the selected need (and thus the indication of a demand) can lead to a modification of the decision-maker's arousal (Dörner & Güss, 2013). This internal modulation depends on the competence (assessment of one's own abilities to cope with the situation), the importance and the urgency of the current motive, the menace for one's own existence, and the relative importance of the current motive (Dörner, Gerdes & Hagg, 2008). Low levels in the tanks increase the activation, as the uncomfortable situation should be resolved as soon as possible. Thus, a higher arousal increases the readiness to act, but also leads to a higher inhibition and a lower resolution level. The resolution level describes the accuracy during the perception of the situation. When it is high, the situation is analyzed and compared very precisely; when it is low, the situation is detected more roughly. Inhibition is the counterpart of the resolution level. A high inhibition has several effects on planning, like a narrow memory search, a decrease in the number of alternatives, and no adequate attention to long-term or side effects (Dörner & Güss, 2013). Possible consequences are that important aspects are forgotten, overlooked, missed, or skipped for other reasons, and moreover that myopic decisions occur (Hacker & Sachse, 2014). The selection threshold is positively influenced, which means that the current motive stays action-leading. Thereby, the system is stabilized, as the different motives are not overlapping each other; however, it gets less sensitive to opportunities and threats.

7. Increasing Measures/Behavior Tendencies. The chosen needs trigger certain behavior tendencies. To reduce the difference in the motivational tanks or to prevent the tanks from further drain, these tendencies work unconsciously and can lead to cognitive biases through overly optimistic or pessimistic evaluations, erroneous evaluations of outcomes and probabilities, and the disregard of information or alternatives (Carter, Kaufmann & Michel, 2007; Dörner & Güss, 2011). The subsequent biases take part in the different steps of the decision-making process and can lead to a quick positive inflow (dashed line) in the tanks.

8. Choice. When the need-indicator is satisfied, a choice is made. This point is similar to the effect which Thompson, Turner, and Pennycook (2011) call the „Feeling of Rightness" (FOR). It is described as a metacognitive experience which signals if additional analysis is needed (p.107). During a constant monitoring of the process, the motivational tanks can be filled up or emptied. The search continues until a set point is reached and the Feeling of Rightness sets in. Moreover, Ackerman (2014) found that this set point lowers itself over time (Diminishing Criterion Model) when no satisfying answer could be found yet. When the satisfaction comes from a thorough search and a good process, in which every important aspect has been analyzed and weighted, the needs meet the decision-demands and, consequently, a „good" choice – in relation to the decision process – is made. If there is satisfaction but it is based on biased information, the needs don't match the decision-demands (Dörner & Güss, 2013). In this scenario, an erroneous choice can be the consequence.

The model describes a big circle with recurring phases. During a decision, this circle can be run through several times on its different paths until the tanks have reached their specific set points. The motivational tanks constantly monitor every action. These tanks are influenced externally by the characteristics of the situation and internally by the decision process. The steps of the decision process are mostly conscious ones, while the internal modulation and the increasing measures are unconscious to a large extent. The Behavioral Decision-Making Architecture thus describes the origins of erroneous decisions within the decision process.

3.2 Behavioral Consequences in the Decision Process

Because of the described reasons for errors when faced with complex situations – bounded capacity of processing and memory, motivational influences – there are behavioral tendencies that can be observed in the decision process (the following examples refer to Dörner, 2003; Detje, 1996; Dörner, 2014).

Frequent errors in the stage of setting goals are the overevaluation of the current motive, encapsulation, repair-service behavior, or the inaccurate or missing setting of objectives. In the first case, a current motive leads to the fulfillment of an unimportant goal; the important one is ignored even when it is relevant. When faced with complex problems, only manageable problems are solved while no attention is paid to the bigger picture. This leads to an encapsulation due to a high level of uncertainty. Another consequence is the repair-service behavior: only the current problems are solved – more important ones, which aren't urgent yet, are not paid any attention. An inaccurate or missing setting of objectives and priorities is the reason behind this effect.

A main error is not to build a model of the situation (situation analysis). Thus, the situation and its developments are hard to predict, and long-term and side effects are ignored. Simplified models are also problematic when every component of a complex system is reduced to one causing variable. This central reduction leads to an incomplete representation of the problem.

Another main error is an incorrect, simplified or missing representation of the situation. When the complexity is high, a model helps to understand interdependencies between the different parts of the decision and to recognize long-term and side effects. However, in order to deal with the complex situation, models are often simplified and reduced to one causing variable. This central reduction leads to an incomplete and wrong representation of the problem. In many cases, the confrontation with the situation is avoided and no model is created at all. But even when there is a model and it doesn't fit the situation, it is often argued that one's own model is good but the circumstances aren't. This behavior is described as immunizing marginal conditioning. When building a model, selecting only information that is compatible with one's own prior information can lead to an incomplete representation of the situation. An incomplete representation of the situation leads to a behavior in which symptoms instead of causes are processed.

Besides a solid representation, the development of the situation has to be forecasted in order to adjust the decision (forecasting). Complex decisions are mostly dynamic and fast changing. A good forecast attempts to take these aspects into account. A purely linear continuation of the current situation can lead to errors, as temporal change and the development of important variables are ignored. Another source of errors – when faced with complexity – is the orientation towards a known variable, while other (unknown) ones are ignored. Finally, inappropriate optimism may result in bad decisions. This behavior, resulting from wishful thinking, ignores given probabilities and draws an overoptimistic course of events.

The phase of planning describes the mental anticipation of action steps to reach a certain goal. Within a cross-linked system, an action has one or more consequences. These consequences are decisive when it comes to good decisions. A central error in planning is not considering long-term and side effects or frictions (obstacles occurring in planning). Thus, no alternatives are generated if an option is omitted. Holding on to known planning strategies can also lead to bad decisions. This methodism takes place when familiar strategies are applied unreflectedly to a new situation with different characteristics. If the decision-maker avoids the confrontation with the complex situation, this can lead to what is called a horizontal or a vertical escape. The horizontal escape describes a behavior that is characterized by a retreat into a well-known field of planning. Thus, unknown and uncertain aspects are ignored. The vertical escape is planning within one's own worldview. Thereby, the planning seems to fit perfectly the own and biased picture one wants to draw of reality. Another misleading strategy is intuition actionism. Solely relying on one's own feeling can be a good strategy in an environment that is predictable and has learning potential (Klein, 2008), but in complex situations which don't show these characteristics, this behavior carries the risk of an insufficient decomposition of the problem.

As described in the model, these behavioral consequences and their origins also have positive effects on the self-concept, the feeling of competence and certainty, the affiliation to others, the self-protection etc., and thus serve a purpose. This is what makes these effects so robust in decision-making. The problem arises when the decision-maker is not aware that these mostly unconscious positive effects take place at the expense of the decision's quality.

4 Limitations

Real-life decisions themselves are complex and have a lot of different influences. Therefore, it can be difficult to identify the mechanisms and processes which lead to erroneous decisions. The model does not claim to be a complete representation of the environment and the cognitive architecture. However, it attempts to contain the most important processes in order to depict real-life decisions and the reasons for failure. A retrospective analysis of decisions can be problematic, as decision-makers might forget things or reinterpret the situation (Weingardt, 2008). Nonetheless, only afterwards can a behavior be labeled as erroneous and short- and long-term advantages be separated. A process of self-reflection can support the insight into the processes leading to a wrong decision (Spychiger, 2008). It will be the focus of another study to reveal the exact relations of how the decision's situation influences the motivational needs. Moreover, a deeper look will be taken at the effect of the deviation in the tanks and the occurrence of biases or errors.

5 Conclusion

As described, cognitive biases can influence the motivational tanks and lead to an increased tank level. The question whether cognitive biases are irrational depends on the model they are compared to. In relation to the goal of the decision, cognitive biases in the decision-making process may have the consequence that the goal is not met, even if the decision-maker had the best intentions in mind and wanted to achieve it. According to a rational view of the decision-maker, this seems quite irrational. The cognitive biases, however, fulfill another function. By filling up the tanks of certainty, competence, or affiliation, they keep us in the game in complex and threatening situations. The unconscious usage of biases provides a shortcut for the thinking process, is less effortful, and fulfills motivational functions associated with the avoidance of uncertainty, keeping up a positive self-image, making a competent impression and thus strengthening the affiliation to others (Dörner, 2003; Dörner & Güss, 2011; Dörner, 2012).

In the introduction, we asked for the logic behind Eastman Kodak's downfall. If we look at this case again from the viewpoint of the Behavioral Decision-Making Architecture, a deeper insight is provided. Kodak's strategy to stick with analog photography for too long can be categorized as an erroneous decision with high consequences and without reversibility. According to statements in Forbes Magazine by the former Kodak manager Vince Barabba, the information on the development of digital photography had been present at that time (Mui, 2012). Back then, in 1981 – when there was a chance to change the course – Barabba was the Head of Market Intelligence. After Sony launched the first digital camera on the market, one of Kodak's largest retailers asked whether Kodak was not worried about digital photography. Thereupon, Barabba conducted a profound analysis and compared the development of analog with digital photography.

„The results of the study produced both 'bad' and 'good' news. The 'bad' news was that digital photography had the potential capability to replace Kodak's established film based business. The 'good' news was that it would take some time for that to occur and that Kodak had roughly ten years to prepare for the transition." (Mui, 2012, p.2)

As it turned out, the management did not use this time properly. Almost eight years later, CEO Kay R. Whitmore announced that his company, which is known for its films in the yellow boxes, would continue to focus on its original core business: „We defined our focus too broadly (...). Now we are focusing on the sectors we want to be in. We are defining what we mean when we say imaging, chemicals and health" (Holusha, 1989, p.2). When he retired, Whitmore's successor George M. C. Fisher still relied on Kodak's established business strategy: „Maybe my real failure is that I haven't communicated how powerful our digitization strategy really is" (Deutsch, 1999, p.2).

The retrospective analysis of this case shows that the information had been available and the decision could have been made in a better way. But when looking at the statements of the CEOs, the logic behind their behavior emerges. The advent of digital photography meant a huge change for the company (threat to certainty, because they couldn't predict what was going to happen). Kodak was world leading in the field of photography (threat to competence, because analog photography would be replaced), employed a lot of chemists and developers who were specialized in the analog field (threat to affiliation, if they were not needed anymore), and had huge chemical installations for the development of the films. What did this situation provoke?

When looking for an explanation, three cognitive biases offer the most elucidation: Confirmation Bias, Sunk Cost Effect, and Overconfidence Effect. Kodak's leadership ignored the information about the threat and highlighted the advantages of analog photography (Confirmation Bias). As the leading company in analog photography, Kodak had invested a lot of resources in this field and therefore ignored the conflicting digital solution for too long (Sunk Cost Effect). Moreover, there was an overestimation of Kodak's strategy regarding the future challenges (Overconfidence Effect). What was the behavioral consequence? Important aspects had been ignored, there was no sufficient consideration of long-term effects, and alternatives to analog photography had been developed far too late.

This behavior had some short-term advantages. The competence was protected and a positive self-image was kept. However, in the long run, it led to the downfall of the company. A comparison drawn by „The Economist" (2012) dramatically highlights this descent: „Strange to recall, Kodak was the Google of its day" (p.1).

References

Ackerman, R. (2014). The diminishing criterion model for metacognitive regulation of time investment. Journal of Experimental Psychology: General, 143 (3), 1349-1368.

Anderson, J. R. (1990). The adaptive character of thought. Hillsdale, NJ: Erlbaum.

Ariely, D. (2008). Predictably irrational. New York: HarperCollins.

Ariely, D. & Jones, S. (2010). The upside of irrationality: The unexpected benefits of defying logic at work and at home. New York: Harper.

Arnott, D. (1998). A taxonomy of decision biases. Technical Report 1/98, School of Information Management and Systems, Monash University, Caulfield.

Bazerman, M. & Moore, D. A. (2013). Judgment in managerial decision making (8th ed.). New York: John Wiley & Sons.

Bhandari, G. & Hassanein, K. (2012). An agent-based framework for investment decision-support systems. Behaviour & Information Technology, 31 (5), 495-507.

Brown, R. V., Kahr, A. S. & Peterson, C. (1974). Decision analysis for the manager. New York: Holt, Rinehart & Winston.

Brüggemann, U., Strohschneider, S. & Rek, U. (2006). Die PSI-Theorie: Psychologische Grundlagen für das Simulationsmodell eines Nautikers. Bamberg: Institut für Theoretische Psychologie der Otto-Friedrichs-Universität.

Campbell, A., Whitehead, J. & Finkelstein, S. (2009). Why good leaders make bad decisions. Harvard Business Review, 87 (2), 60-66.

Carter, C. R., Kaufmann, L. & Michel, A. (2007). Behavioral supply management: a taxonomy of judgment and decision-making biases. International Journal of Physical Distribution & Logistics Management, 37 (8), 631-669.

Christensen, C. (2013). The innovator's dilemma: when new technologies cause great firms to fail. Watertown: Harvard Business Review Press.

Detje, F. (1996). Sprichwörter und Handeln. Eine psychologische Untersuchung. Frankfurt/M.: Lang.

Deutsch, C. H. (1999). Chief Says Kodak Is Pointed in the Right Direction. New York Times. Retrieved from http://www.nytimes.com/1999/12/25/business/chief-says-kodak-is-pointed-in-the-right-direction.html?src=pm&pagewanted=1 (10/24/2015).

Deutsch, C. H. (2008). At Kodak, Some Old Things Are New Again. New York Times. Retrieved from http://www.nytimes.com/2008/05/02/technology/02kodak.html?pagewanted=all&_r=0 (10/24/2015).

dpa (2012). Kodak entlässt 200 weitere Mitarbeiter. Handelsblatt. Retrieved from http://www.handelsblatt.com/unternehmen/industrie/insolventer-fotopionier-kodak-entlaesst-200-weitere-mitarbeiter/7195690.htm (10/24/2015).

Dörner, D. (1999). Bauplan für eine Seele. Reinbek: Rowohlt.

Dörner, D. (2003). Die Logik des Mißlingens (2nd rev. ed.). Reinbek: Rowohlt.

Dörner, D. (2012). Emotion und Handeln. In P. Badke-Schaub, G. Hofinger & K. Lauche (eds.), Human Factors (pp. 94-112). Berlin: Springer.

Dörner, D. (2014). Torheit oder wie kluge Leute sich selbst verdummen. In P. Sachse & E. Ulich (eds.), Psychologie menschlichen Handelns: Wissen und Denken – Wollen und Tun (pp. 329-338). Lengerich: Pabst.

Dörner, D. & Güss, C. D. (2011). A psychological analysis of Adolf Hitler's decision making as Commander in Chief: Summa confidentia et nimius metus. Review of General Psychology, 15, 37-49.

Dörner, D. & Güss, C. D. (2013). PSI: a computational architecture of cognition, motivation, and emotion. Review of General Psychology, 17 (3), 297-317.

Dörner, D., Gerdes, J. & Hagg, J. (2008). Angst und andere Gefühle oder: Sind Gefühle berechenbar? Psychologie des Alltagshandelns / Psychology of Everyday Activity, 1 (1), 44-56.

Dörner, D., Bartl, C., Detje, F., Gerdes, J., Halcour, D., Schaub, H. & Starker, U. (2002). Die Mechanik des Seelenwagens. Eine neuronale Theorie der Handlungsregulation. Bern: Hans Huber.

Eckl-Dorna, W. (2010). Kodak müht sich um sein Zukunfts-Bild. WirtschaftsWoche. Retrieved from http://www.wiwo.de/unternehmen/foto-markt-kodak-mueht-sich-um-sein-zukunfts-bild/5693090.html (10/24/2015).

Economist, The (2012). The last Kodak moment? The Economist. Retrieved from http://www.economist.com/node/21542796 (10/24/2015).

Edwards, W. (1984). What constitutes „a good decision"? Acta Psychologica, 56, 5-27.

Ekman, P. (1992). An argument for basic emotions. Cognition and Emotion, 6, 169-200.

Ekman, P. (1999). Basic Emotions. In T. Dalgleish & M. Power (eds.), Handbook of cognition and emotion (pp. 45-60). Sussex, UK: Wiley.

Fiske, S. T. & Taylor, S. E. (2013). Social cognition: From brains to culture. Sage.

Gigerenzer, G. & Gaissmaier, W. (2011). Heuristic decision making. Annual Review of Psychology, 62, 451-482.

Gödecke, C. (2011). Hochglanz und Gloria. Der Spiegel. Retrieved from http://einestages.spiegel.de/static/topicalbumbackground/23634/hochglanz_und_gloria.html.

Grünig, R. & Kühn, R. (2013). Successful decision-making (3rd ed.). Berlin: Springer.

Hacker, W. & Sachse, P. (2014). Allgemeine Arbeitspsychologie: Psychische Regulation von Tätigkeiten (3rd rev. ed.). Göttingen: Hogrefe.

Hammond, J. S., Keeney, R. L. & Raiffa, H. (1998). The hidden traps in decision making. Harvard Business Review, 16 (5), 47-58.

Holusha, J. (1989, December 9). Click: Up, Down and Out at Kodak. New York Times. Retrieved from http://www.nytimes.com/1989/12/09/business/click-up-down-and-out-at-kodak.html (10/24/2015).

Jost, P. J. (2001). Organisation und Motivation: Eine ökonomisch-psychologische Einführung. Wiesbaden: Gabler.

Kahneman, D. (2003). Maps of bounded rationality: Psychology for behavioral economics. The American Economic Review, 93 (5), 1449-1475.

Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus & Giroux.

Kerr, N. L., MacCoun, R. J. & Kramer, G. P. (1996). Bias in judgment: Comparing individuals and groups. Psychological Review, 103 (4), 687-719.

Klein, G. (2008). Naturalistic decision making. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50, 456-460.

Lehky, A. (2012). Kodak: Wie man einen Konzern an die Wand fährt. Die Presse. Retrieved from http://karrierenews.diepresse.com/home/karrieretrends/725362/Kodak_Wie-man-einen-Konzern-an-die-Wand-faehrt (10/24/2015).

McFadden, D. (1999). Rationality for economists? Journal of Risk and Uncertainty, 19 (1), 73-105.

Mui, C. (2012). How Kodak Failed. Forbes. Retrieved from http://www.forbes.com/sites/chunkamui/2012/01/18/how-kodak-failed/2/ (10/24/2015).

Newell, A. (1987). Unified theories of cognition. Cambridge, MA: Harvard University Press.

Nutt, P. C. (2002). Why decisions fail: Avoiding the blunders and traps that lead to debacles. San Francisco: Berrett-Koehler.

Reiss, S. (2004). Multifaceted nature of intrinsic motivation: The theory of 16 basic drives. Review of General Psychology, 8, 179-193.

Schaub, H. (1996). Exception Error: Über Fehler und deren Ursachen beim Handeln in Unbestimmtheit und Komplexität. gdi impuls, 4 (96), 1-12.

Scherer, K. R. (2009). The dynamic architecture of emotion: Evidence for the component process model. Cognition and Emotion, 23 (7), 1307-1351.

Schroda, F. (2000). Über das Ende wird am Anfang entschieden (Doctoral dissertation). Technische Universität Berlin, Universitätsbibliothek.

Spychiger, M. B. (2008). Lernen aus Fehlern und Entwicklung von Fehlerkultur. Erwägen Wissen Ethik, 19 (3), 274-282.

Stanovich, K. E., Toplak, M. E. & West, R. F. (2008). The development of rational thought: A taxonomy of heuristics and biases. Advances in Child Development and Behavior, 36, 251-285.

Thompson, V. A., Turner, J. A. P. & Pennycook, G. (2011). Intuition, reason, and metacognition. Cognitive Psychology, 63 (3), 107-140.

von der Weth, R. (2001). Management der Komplexität: Ressourcenorientiertes Handeln in der Praxis. Bern: Huber.

Weingardt, M. (2008). Fehlerparadoxon. Transdisziplinäre Grundlagen zur Theorie und Produktivität des Fehlers. Erwägen Wissen Ethik, 19 (3), 283-290.

Wundt, W. (1896). Grundriß der Psychologie. Leipzig: Engelmann.

Yates, J. F., Veinott, E. S. & Patalano, A. L. (2003). Hard decisions, bad decisions: On decision quality and decision aiding. Emerging perspectives on judgment and decision research, 13-63.

Zapf, D., Frese, M. & Brodbeck, F. C. (1999). Fehler und Fehlermanagement. In D. Frey & C. Graf Hoyos (eds.), Arbeits- und Organisationspsychologie. Ein Lehrbuch (pp. 398-411). Weinheim: Beltz Psychologie Verlags Union.

Correspondence to:
Mag. Markus Domeier
Institute of Psychology
University of Innsbruck
Innrain 52
A-6020 Innsbruck
[email protected]