
Go Beyond the One Line™

Human Factors and Situational Awareness

Mike Legatt, Ph.D., CPT
CEO and Founder

EMMOS Conference, September 22, 2019
CONFIDENTIAL: DO NOT CIRCULATE WITHOUT WRITTEN PERMISSION FROM RESILIENTGRID, INC

Core Philosophies: The most important components on the grid are the human beings in control rooms and in the field.

“All critical infrastructures are perfectly aligned to get the results they get.”
(adapted from “All organizations are perfectly aligned to get the results they get.” – Arthur W. Jones)

Complexity, Complicatedness, Speed and Depth

Complex: Many interdependent components

Hard to get order, control, or predictability. An “emergent system.”

Complicated: Many independent components
Once you can separate the components, you can deal with each of them systematically

Speed and Depth: Growth of big and fast data

Drowning in data, thirsty for information

Electric Power is increasing in all of these areas!

Where are we? A Systems Theory View


Increasing Challenges

Growing Rates of Change (Image © Don Hinchliffe)

Challenges are Changing (Source: ERCOT)

Legacies Can Persist Even When Harmful (Images © NASA, Alex Colville)

Legacy Mental Models (Image © Monitor Mapboard Systems)

Legacies Can Lead to Failure in New Systems
Some examples:

• Reliance on system operators for turning data into information

• Operators having time to stare at a mapboard (or to trace a path across it with a laser pointer)

• Mental models of how the system works that no longer apply

• Safety assumptions (e.g., assuming an open breaker is safe to work on, when generator/solar/battery backflow can still energize the line)

Situational Awareness

(Illustration: scanning vs. focus when a threat appears)

Situational Awareness
Three levels of situation awareness:
– Level 1 (Perception): What is going on?
– Level 2 (Comprehension): What does it mean?
– Level 3 (Projection): Where is it going? What am I going to do about it?
Situation awareness is necessary both in individuals and within teams.
Sources: Endsley (1988); The Far Side
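Endsley's three levels can also be made concrete in software. As a minimal, hedged sketch (the class and example values are hypothetical, not from the talk or any ResilientGrid product), a tool could carry all three levels alongside each raw observation so an operator sees the data, its meaning, and its trajectory together:

from dataclasses import dataclass

@dataclass
class SituationAssessment:
    perception: str     # Level 1: what is going on (the raw observation)
    comprehension: str  # Level 2: what it means in context
    projection: str     # Level 3: where it is going / what to do about it

line_overload = SituationAssessment(
    perception="Line 12 loading at 98% of its limit",
    comprehension="Post-contingency overload risk on the west corridor",
    projection="Likely to exceed the limit within 15 minutes; prepare redispatch",
)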

Situational Awareness: Components of Situation Awareness
(Diagram, based on Steve Kass, Military Psychology: stress / perceived safety, mental models, pattern recognition, working memory, and attention feed situation awareness, which in turn drives performance.)

Information Processing in Humans
Information is most difficult to process under extreme stress (reptilian complex). Humans make 3-7 mistakes per hour while awake, and 11-17 under extreme stress.

Working Memory
Short-term / processing memory holds about 7 ± 2 items (Miller, 1956). These items can be “chunks”, allowing for better storage and processing. Working memory capacity decreases under stress (from 7 ± 2 down to roughly 3-5, or even lower with long-term damage).
(Image: 747 cockpit)
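As a tiny illustration (not from the slides) of why chunking helps, grouping a long digit string into familiar chunks cuts the number of items working memory has to hold:

digits = "18007241234"                # 11 separate items to remember
chunks = ["1-800", "724", "1234"]     # 3 chunks, comfortably within 7 +/- 2
print(len(digits), "raw digits vs", len(chunks), "chunks")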

Attention and Human Performance

Situation Awareness Errors

Source: SA Technologies

(Illustration: scanning vs. focus when a threat appears. Situational awareness?)

© Independent Electricity System Operator, CA

Data vs. Information

Out of the Loop Syndrome

Focus on what’s important

Source: J. Merlo

“Multitasking” or Switchtasking
“Multitasking” is the attempt to carry on two or more tasks or activities at the same time. What is really happening is both:
– Habit / automation: learned behaviors being repeated with little thought
– Frequent “mental set shifting” / “context switching”, which is computationally intensive and risky
– Should something (e.g., driving a car) need a jump in attention, there may be insufficient resources available to help
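A back-of-the-envelope model (all numbers assumed, purely illustrative) shows how per-switch costs add up when two tasks are interleaved instead of done one after the other:

TASK_A_STEPS = 10
TASK_B_STEPS = 10
STEP_TIME = 1.0      # time units per step while staying on one task
SWITCH_COST = 0.5    # extra time per mental set shift (assumed for illustration)

sequential = (TASK_A_STEPS + TASK_B_STEPS) * STEP_TIME   # finish task A, then do task B
switches = TASK_A_STEPS + TASK_B_STEPS - 1               # alternating after every step
interleaved = sequential + switches * SWITCH_COST

print(f"sequential: {sequential:.1f}  interleaved: {interleaved:.1f}  "
      f"(+{(interleaved / sequential - 1) * 100:.0f}% from switching alone)")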

Responding to Human Error

Outcome Bias

Over-Reaction:
• Discipline of discrete error
• Discipline person who didn’t see risk
• Over-reaction to singular events

Under-Reaction:
• Turn a blind eye to risky choices
• Allow reckless people to go unchecked
• Pass over severe system design flaws

“The single greatest impediment to error prevention… is that we punish people for making mistakes.” – Dr. Lucian Leape, Professor, Harvard School of Public Health, 1999 Testimony before Congress on Health Care Quality Improvement

Continuously Improve Organizational Culture
• Capture and act upon “small signals”, such as near misses
• Recognize that success in an emergency requires adaptive capacity
• Ensure deference to expertise
• Avoid outcome bias and other cognitive biases
• Ensure fast-moving automation keeps operators “in the loop” and recognizes human information processing capabilities

Near-Miss Reporting

An organization focused on being proactive, not reactive: one with the ability to not only capture, but also track and trend near misses (organizational situational awareness).
(Diagram: pyramid from accident at the top, down through near misses and lucky missteps, with error catches such as safety checks and peer checks at the base.)
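A minimal sketch of “capture, track, and trend” (the fields and categories are hypothetical, not ResilientGrid's product): log each near miss, then count by category per month so a rising trend becomes visible before it turns into an accident:

from collections import Counter
from datetime import date

near_misses = [
    {"when": date(2019, 7, 3),  "category": "switching order", "caught_by": "peer check"},
    {"when": date(2019, 7, 21), "category": "switching order", "caught_by": "safety check"},
    {"when": date(2019, 8, 9),  "category": "alarm overload",  "caught_by": "self check"},
    {"when": date(2019, 8, 30), "category": "switching order", "caught_by": "peer check"},
]

# Count events per (month, category) to surface trends over time
trend = Counter((e["when"].strftime("%Y-%m"), e["category"]) for e in near_misses)
for (month, category), count in sorted(trend.items()):
    print(month, category, count)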

Human Factors Engineering
• A great many human factors influence the overall reliability of the system, for example: exercise, sleep, diet, mood, stress & fear, perceived safety, visual system function, cognitive biases, genetic factors, abstract reasoning, empathy, self-monitoring, self-actualization, positive thinking, training, corporate culture, and key performance indicators / HR metrics.

Field of Vision

• Resolution on the retina is highest at the center and lower towards the periphery.
  – Farther out, you can only detect motion and vague shapes.
  – Motion in the periphery can be distracting.
Source: Lean Valley
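A quick way to reason about this when laying out displays is the standard visual-angle formula, angle = 2 * atan(size / (2 * distance)). The sketch below checks whether an element falls inside high-acuity central vision when fixated; the ~2 degree foveal span is an approximation, and the element size and viewing distance are made-up example values:

import math

def visual_angle_deg(size: float, distance: float) -> float:
    """Angle subtended at the eye by an object of `size` viewed from `distance` (same units)."""
    return math.degrees(2 * math.atan(size / (2 * distance)))

FOVEAL_LIMIT_DEG = 2.0   # approximate span of high-acuity central vision

angle = visual_angle_deg(size=0.02, distance=0.7)   # a 2 cm alarm tile viewed from 70 cm
print(f"{angle:.2f} degrees; within fovea: {angle <= FOVEAL_LIMIT_DEG}")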

Area of Reach

• Lay out your workspace and user interfaces so that the most commonly used interfaces and materials are directly in front of you, and less frequently used materials/interfaces are on the periphery.

Source: UN Enable

Blind Spot

• Close one eye and stare at the cross (if your right eye is open) or the dot (if your left eye is open). Move your head backward or forward until the other symbol disappears into your blind spot.

Issues with Color
• Color discriminability decreases both with age and with nicotine use
• Color deficiency / blindness

Source: Chroma: A wearable augmented reality…

Shift Work Disorder
• Social Rhythm Disruption is also something to consider:
  – Loss of social cues predicts a bipolar event more than general psychological stress does
  – Social support may be lacking
  – Spending time with loved ones and friends may be harder on night shift rotations, etc.
  – Even outside shift work, doesn’t make people feel as connected as in-person interactions

Source: Schartz, H. (2013)

Systems Design of the Future

• Design systems based on knowledge of the visual system (e.g., color choice, contrast, reducing misplaced-salience risks)
• Look holistically at the system (e.g., contrast differentials between system and background)
• As improved methods for human-systems interfaces grow (e.g., AI, ML, neural interfaces), ensure designs are built around operations in high-stress, fast-paced environments
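For the color and contrast point, one concrete check is the published WCAG 2.x relative-luminance and contrast-ratio formulas; this is a minimal sketch of those formulas, and the example colors are made up rather than taken from any real console design:

def relative_luminance(rgb):
    """WCAG relative luminance for an sRGB color given as (R, G, B) in 0-255."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# e.g., amber alarm text on a dark gray console background; WCAG AA asks for >= 4.5:1 for body text
print(round(contrast_ratio((255, 191, 0), (40, 40, 40)), 2))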

Unintended Consequences

The Cobra Effect

The Most Important Things Still Are
• Well-implemented technology leads to Reliability (and Robustness)

• Resiliency, especially in unforeseen circumstances, comes from humans

• “What you can measure you can manage” doesn’t work with people

• Continuous improvement towards a series of goals: success is multiple measures staying within tolerance bands, not a single metric (see the sketch after this list)

• Organizational culture is critical. There are research-based frameworks that can help organizations reduce risks and accelerate into disruption successfully.
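A minimal sketch of the tolerance-band idea (the metric names and bands below are hypothetical examples, not recommendations): success means every tracked measure sits inside its band, and anything outside the band gets attention.

tolerance_bands = {
    "frequency_hz":      (59.95, 60.05),
    "ace_mw":            (-50.0, 50.0),
    "near_miss_reports": (5, 50),   # too few suggests under-reporting, too many suggests a trend
}

current = {"frequency_hz": 60.01, "ace_mw": -12.0, "near_miss_reports": 3}

# Flag every measure that falls outside its tolerance band
out_of_band = {
    name: value
    for name, value in current.items()
    if not (tolerance_bands[name][0] <= value <= tolerance_bands[name][1])
}
print(out_of_band or "all measures within tolerance")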

Thank You!!!

Mike Legatt, Ph.D.

resilientgrid.com/emmos2019
