Security and Cognitive Bias: Exploring the Role of the Mind
SECURE SYSTEMS
Editors: Patrick McDaniel, [email protected]; Sean W. Smith, [email protected]

Sean W. Smith | Dartmouth College
1540-7993/12/$31.00 © 2012 IEEE. Copublished by the IEEE Computer and Reliability Societies. IEEE Security & Privacy, September/October 2012.

Computer security aims to ensure that only "good" behavior happens in computer systems, despite potential action by malicious adversaries. Consequently, practitioners have focused primarily on the technology to prohibit "bad" things—according to some set of rules—and to a lesser extent on the structure of such rules.

Unfortunately, fieldwork and anecdotes report how we continue to get the rules wrong. We keep hearing that security is hard to use and gets in the way. In the workplace, writing down passwords on Post-it notes hidden under keyboards, under tables, or in desk drawers is endemic, because humans have too many to remember—and perhaps also because the IT system forces an authentication system that doesn't meet users' needs. (Recently, a school system secretary was lambasted for misusing the superintendent's password to change grades—but no one seemed to think it odd that she knew the password in the first place.1) IT staffs know that keeping software updated is important to patch holes, but balancing those updates while keeping mission-critical applications running unimpaired is tricky—many users just give up. (Stuxnet was lauded for the number of 0-day holes it used, but five-year holes would suffice to penetrate much of our information infrastructure.) Savvy home users, trying to (legally) share music files with another household computer, will struggle over drop-down menu options attempting to open only the proper holes in the network perimeter. Developers might know that advanced protection technology, such as SELinux, will help keep programs in the bounds of secure behavior, but they have no easy way of formally telling the system what those bounds are.

So, it's hard to create and configure security technology and hard to use it after deployment. However, the charter of this department is to look at the broader "system" context of security—and the human mind is a component in both security creation and use. The human mind is the arena in which security engineers translate "goodness" to machine rules; it's where users experience frustration and is the medium through which that frustration is conveyed.

While we practitioners have spent the last 40 years building fancier machines, psychologists have spent those decades documenting ways in which human minds systematically (and predictably) misperceive things. Minds are part of the system, and cognitive biases tell us how minds get things wrong. (For quick introductions to this field, see Rational Choice in an Uncertain World, an undergraduate-level textbook;2 Cognitive Illusions, a graduate-level book;3 or Stumbling on Happiness, more casual reading.4 A pioneer in this space, Daniel Kahneman—a Nobel laureate—also has a new book out, Thinking, Fast and Slow, for a general audience.5)

To What Extent Might This Affect the Usable Security Problem?
Consider the creation of security policies—the formal rules stating whether subject S can perform action A on object O right now (let's call this time t1). It's tempting to imagine that an omniscient deity hovers in the computer, looking at a request's full context and implications and making the wisest possible decision. However, in reality, this decision was probably made much earlier in time (at a time t0 ≪ t1) by a security officer trying to imagine what S would be doing in the future and whether action A would be consistent with the organization's goals and values. We can pretend that the policy rules came from the deity at t1, but it was all in the officer's head at t0. Cognitive bias can tell us how these rules might differ. If we don't pay attention to this difference, we risk creating incorrect policies.

Alternatively, consider the case of a subject S complaining about unusable security features (or, for that matter, other unusable aspects of IT). It's tempting to imagine that an omniscient deity is hovering in S's mind, who recorded this bad experience at time t1. However, in reality, we have a security engineer hearing S's recollections, at time t2 ≫ t1, of how S felt at t1. We can pretend these recollections are the same as the deity's observations, but they were all filtered through S's head. Cognitive bias can tell us how the recollections and observations might differ. In this case, if we don't pay attention to this difference, we risk "fixing" the wrong thing.

The Dual-Process Model
In my lab at Dartmouth, my colleagues and I have performed some initial exploration into how two sources of cognitive bias—the dual-process model and the empathy gap—affect security policy creation.

The dual-process model partitions the mind into two parts: an intuitive, nonverbal, and almost nonconscious system 1, and a verbal, introspective, conscious system 2. Some tasks are better done by one system or the other, and the systems can interfere with each other. However, this isn't just abstract theory; what makes the last few decades of this science so interesting for people like me is that these theories are reinforced by experiments. We can use the theories to make predictions that are borne out in practice!

For example, psychologists Timothy Wilson and Jonathan Schooler carried out some experiments regarding jam (and by "jam," I mean the sweet condiment one puts on toast, not an obscure security acronym).6 Trained taste experts ranked a set of jams. One set of test subjects ranked the jams without thinking (that is, using system 1); their rankings closely correlated with the experts' rankings. However, other sets of test subjects were asked to think carefully while ranking the jams—and their rankings were very different. Nonexperts could do the task with system 1 but not with system 2. For jam, introspection inhibits intuition.

Sticky jam made me think of sticky security policy problems. We technologists build elaborate sets of knobs—drop-down menus, check boxes, access control lists—and expect users to figure out how to map their notion of "goodness" to a setting of the knobs, perhaps moving a system 1 goal to a system 2 task. Might we see the same inhibition phenomenon here? To test this, my team created a fictional social network. Users had various categories of personal information, and the GUI told users the various levels of connection they had with each friend. We presented one group of test subjects a sequence of friends and asked them to decide which information they'd share with each friend. Another group was asked to think about various social network privacy issues, and then given the same choices. The second group made significantly different choices—but to our surprise, the difference was one-sided: the group asked to think about privacy gave more information away!7 Perhaps introspection inhibits intuition also when it comes to security policy. (In hindsight, I wonder whether the cognitive bias toward dissonance reduction might have been at play; maybe the results would have differed if we didn't call them "friends.")

The Empathy Gap
We also examined what psychologists call the empathy gap: the very different decisions people make, even about dry factual things such as an estimated selling price for a coffee mug, when they are in the situation themselves versus when they are speculating about themselves in the future or about someone else.8–10 In our fieldwork in access control in large enterprises, we kept hearing how users needed to work around the access control system because the policy didn't allow them to do what they needed to perform their jobs. In the case of healthcare IT, some researchers have even reached the conclusion that the problem is a dearth of clinicians among the policy makers.

Could the empathy gap be playing a role here? To examine this question, we recruited nearly 200 clinicians and staff members at a large hospital and partitioned them into two groups.11 We gave one group a series of access control scenarios we developed with a medical informatics specialist. These scenarios were all phrased in an abstract, role-based way, as is often found in security policies (for example, "Should a physician be able to see information I about patient A in this particular context?"). We gave the other group the same scenarios but instead phrased them in a way that put the test subject directly in the setting; each wildcard became specific (for example, "You are a physician treating patient Alice...").

For two-thirds of the scenarios, the direct-experience group made significantly looser judgments than the policy maker group, suggesting that even experienced medical staff will make access control policies that experienced medical staff will find overly constraining. (However, in some of the other scenarios, the direct-experience group made significantly tighter decisions, oddly.) Maybe the problem with policy creation isn't the policy makers' backgrounds but the cognitive bias built into human minds.

Bounded Rationality and the Anchoring Effect
At the University of Southern California, Milind Tambe has also been looking at the role of cognitive bias, but in the context of optimiz- […] approaching many of these problems, human minds show no evidence of actually carrying out these algorithms, and so are perhaps doing something much simpler and less correct. […] outperformed the mathematically optimal ones! In subsequent work, Pita's group further improved their model by taking into account prospect theory, which describes how
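As an aside, the (subject S, action A, object O) rule formalism that runs through this column can be made concrete with a minimal sketch. This is a generic illustration only; the roles, objects, and policy entries below are hypothetical and are not drawn from the hospital study or any system discussed above.

```python
# Minimal sketch of the (subject, action, object) policy formalism.
# The policy is written at time t0 by a security officer in abstract,
# role-based terms ("Should a physician be able to read a medical
# record?") -- long before any concrete request arrives at time t1.
# All role and object names here are illustrative assumptions.

POLICY = {
    ("physician", "read", "medical_record"),
    ("physician", "write", "medical_record"),
    ("billing_clerk", "read", "billing_record"),
}

def allowed(role: str, action: str, object_type: str) -> bool:
    """Decide a request at time t1 using rules fixed at t0 << t1."""
    return (role, action, object_type) in POLICY

# The actual request carries far more context than the rule captures
# ("You are a physician treating patient Alice...") -- context the
# policy maker at t0 could only imagine.
print(allowed("physician", "read", "medical_record"))      # True
print(allowed("billing_clerk", "read", "medical_record"))  # False
```

The gap the column describes lives precisely in the distance between the few fields this check inspects and the rich situation the requester actually faces.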