SAVING HUMAN LIVES

Issues in Business Ethics

VOLUME 21

Series Editors

Henk van Luijk, Emeritus Professor of Business Ethics
Patricia Werhane, University of Virginia, U.S.A.

Editorial Board

Brenda Almond, University of Hull, Hull, U.K.
Antonio Argandoña, IESE, Barcelona, Spain
William C. Frederick, University of Pittsburgh, U.S.A.
Georges Enderle, University of Notre Dame, U.S.A.
Norman E. Bowie, University of Minnesota, U.S.A.
Brian Harvey, Manchester Business School, U.K.
Horst Steinmann, University of Erlangen-Nurnberg, Nurnberg, Germany

The titles published in this series are listed at the end of this volume.

Saving Human Lives
Lessons in Management Ethics

by

ROBERT ELLIOTT ALLINSON
Soka University of America
The Chinese University of Hong Kong

A C.I.P. Catalogue record for this book is available from the Library of Congress.

ISBN-10 1-4020-2905-5 (HB)
ISBN-10 1-4020-2980-2 (e-book)
ISBN-13 978-1-4020-2905-9 (HB)
ISBN-13 978-1-4020-2980-6 (e-book)

Published by Springer, P.O. Box 17, 3300 AA Dordrecht, The Netherlands. www.springeronline.com

Printed on acid-free paper

All Rights Reserved
© 2005 Springer
No part of this work may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, microfilming, recording or otherwise, without written permission from the Publisher, with the exception of any material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work.

Printed in the Netherlands.

To my wife, Iréna, whose unfailing support and brilliant suggestions inform this book and grace my existence.

To the bravery of the late Justice Peter Mahon, Stuart Macfarlane and Captain Gordon Vette, who risked their careers and their lives to research and investigate the events that made up the disaster on Mt. Erebus, and whose work may serve as an inspiration to us all for the courage that humanity needs to struggle for the cause of justice.

Saving Human Lives is an attempt to demonstrate how ethical management can save human lives. Its thesis is explored through a detailed investigation of case studies, all of which support the thesis that greater attention to ethics in management would have saved human lives. To this author, the saving of human lives is the greatest possible contribution ethics can make to management and it is to this end-goal that this book is dedicated. As one reads through the chapters, one readily notices parallels to numerous global crises and disasters that have occurred since the preparation of this book. The Columbia space shuttle disaster is an ominous reminder that the lessons of history were not learned from the Challenger space shuttle disaster. The terrible events of September 11 in the United States are horrific reminders that the lessons of management ethics could also have prevented the needless loss of lives. In both of these recent cases, warnings of the impending disasters were out there but were neglected or trivialized. The prioritization of ethics that is strongly argued for throughout this volume would not have tolerated such a neglect of the advance warnings of the serious, potential threat to human lives.

In giving special thanks to those who have helped in the preparation of this volume, I wish to acknowledge the contributions of Stuart Macfarlane, Captain Gordon Vette, Mrs. Justice Mahon, Mrs. Jim Collins and Peter McErlane, all of whom gave valuable information which aided my investigations into the disaster on Mt. Erebus enormously. All of these individuals were kind enough to give of their personal time when I visited with them in New Zealand and provided both materials and a focus for my investigation which were of immeasurable value. Without the opportunity for personal interviews with all of the aforementioned, there would have been no way of obtaining crucial information which has remained unpublished. I also wish to thank Roger Boisjoly for keeping an extensive, personal correspondence with me that shed invaluable light on my investigations of the Challenger space shuttle disaster. I wish to thank the University of Canterbury, which offered me an Erskine Fellowship that enabled me to come to New Zealand. I was able to deepen my investigations of the disaster on Mt. Erebus that I had originally undertaken over a decade earlier. Some of the material in this book originally appeared in my earlier volume, Global Disasters: Inquiries into Management Ethics.

The materials in the chapters on the Challenger disaster and the disaster on Mt. Erebus are revised and new chapters have been added, including case studies of the Titanic disaster and the Vasa disaster. The chapters on the Challenger disaster and the disaster on Mt. Erebus contain key points never before published which cast great light on these disasters and provide powerful support for the thesis of the present book that ethical management could have prevented these disasters.

TABLE OF CONTENTS

CHAPTER 1 ACCIDENTS, TRAGEDIES AND DISASTERS ...... 1
THE RULE OF ACCIDENTAL ...... 3
THE EXPLANATION OF HUMAN ERROR ...... 3
THE EXPLANATION OF A BREAKDOWN OF A MATERIAL OR TECHNICAL COMPONENT AND ITS COROLLARY, “RISKY TECHNOLOGY” ...... 3
RISKY OR UNRULY TECHNOLOGY? ...... 4
THE EXPLANATION OF ORGANIZATIONAL INERTIA OR BUREAUCRATIC DRIFT ...... 6
ACCIDENTS WILL HAPPEN ...... 7
THE WORD ‘ACCIDENT’ ...... 8
THE BELIEF IN MONOCAUSALITY ...... 12
MULTI-CAUSALITY AND MULTIPLE RESPONSIBILITY ...... 14
FAULT FINDING AND THE SCAPEGOAT ...... 15
WARNINGS AND ETHICS ...... 17
FREEDOM AND ETHICS ...... 18
NOTES ...... 19

CHAPTER 2 THE INTERNAL RELATIONSHIP BETWEEN ETHICS AND BUSINESS ...... 21
ETHICS AS INVOLVED IN THE GOALS OF AN ORGANIZATION ...... 22
ETHICS AND THE CONDUCT OF BUSINESS ENTERPRISE ...... 33


ETHICS AND THE INFRASTRUCTURE OF A BUSINESS ORGANIZATION ...... 33
THE WILL TO COMMUNICATE: FORMAL AND INFORMAL REPORTING CHANNELS ...... 34
THE UNDERLYING PRINCIPLE OF ETHICAL COMMUNICATION: RESPECT FOR PERSONS ...... 34
ETHICS AND INFORMAL CHANNELS OF COMMUNICATION ...... 36
ETHICS AND FORMAL REPORTING CHANNELS ...... 37
THE ARGUMENT FOR THE EQUIVALENCE IN EFFECT OF COMPETING NORMATIVE ETHICAL JUSTIFICATIONS ...... 39
NOTES ...... 46

CHAPTER 3 THE BUCK STOPS HERE AND IT STOPS EVERYWHERE ELSE AS WELL ...... 48
THE BUCK STOPS HERE ...... 48
THE WILL TO COMMUNICATE ...... 50
THE MANAGER’S TASK ...... 52
THE MANAGER AS EDUCATOR AND FACILITATOR OF GOOD WILL ...... 53
NOTES ...... 55

CHAPTER 4 CRISIS MANAGEMENT AND DISASTER PREVENTION MANAGEMENT ...... 60
CRISIS MANAGEMENT: THE “BAND-AID” APPROACH ...... 60

CONCEPTUAL PREPAREDNESS ...... 64
THE EXPLICIT PRIORITIZATION OF A SAFETY ETHOS ...... 66
THE INDEPENDENTLY FUNDED SAFETY BOARD WITH FULL VETO POWERS OVER OPERATIONAL DECISIONS ...... 67
NOTES ...... 68

CHAPTER 5 THE VASA DISASTER ...... 71
THE CONSTRUCTION OF THE VASA ...... 72
THE STABILITY TEST ...... 73
THE QUESTION OF BALLAST ...... 74
THE WIND PRESSURE ON THE SAILS ...... 75
THE VASA’S CENTER OF GRAVITY ...... 76
WHY THE VASA CAPSIZED ...... 76
CONCLUSIONS ...... 82
NOTES ...... 82

CHAPTER 6 THE TITANIC DISASTER ...... 84
METAPHYSICAL BELIEF SYSTEMS ...... 84
LOSS OF LIFE ...... 87
REPORT OF THE COURT (BRITISH REPORT) ...... 87
THE COLLISION ...... 87
CAUSES OF THE DISASTER ...... 88
WARNINGS ...... 88
WARNINGS TO PASSENGERS ...... 91

SPEED OF THE SHIP ...... 92
WEATHER ...... 92
CAUSES OF DEATHS ...... 92
HOW ELSE COULD THE TITANIC DISASTER HAVE BEEN PREVENTED? ...... 92
THE SAILING ORDERS ...... 93
RELEVANT DESIGN FEATURES ...... 94
RIVETS ...... 94
THE INADEQUACY OF THE HUMAN ERROR HYPOTHESIS ...... 96
LIFEBOATS ...... 97
THIRD-CLASS PASSENGERS ...... 98
NEARBY RESCUE POSSIBILITIES ...... 99
THE RESCUE BY THE S.S. “CARPATHIA” ...... 99
S.S. “CALIFORNIAN” ...... 99
FINDINGS OF THE COURT ...... 101
LOOK-OUT ...... 101
SPEED ...... 102
RECOMMENDATIONS ...... 102
NOTES ...... 104

CHAPTER 7 THE SPACE SHUTTLE CHALLENGER DISASTER ...... 107
A BRIEF SYNOPSIS ...... 108
KEY WORDS ...... 109
THE WORD ‘ACCIDENT’ ...... 109
CAUSE AND CONTRIBUTING CAUSE ...... 112
DECISION MAKING ...... 115

THE ATMOSPHERE OF THE DECISION MAKING PROCESS ...... 115
THE DECISION MUST BE MADE UNDER PRESSURE ...... 117
A FIXED DEADLINE MUST BE MET ...... 118
A WRONG DECISION WILL HAVE GRAVE CONSEQUENCES ...... 119
THE PRESENCE OF IRREGULARITIES ...... 120
ABBREVIATED CHRONOLOGY OF EVENTS LEADING TO CHALLENGER LAUNCH ...... 121
(a.) THE LACK OF ANY CLEAR UNIFORM GUIDELINES AS TO MORAL CRITERIA ...... 123
(b.) A FORMULA FOR THE ATTRIBUTION OF THE RESPONSIBILITY FOR THE BURDEN OF PROOF IN AN ARGUMENT ...... 124
(c.) WHAT WOULD COUNT AS SUFFICIENT JUSTIFICATION FOR CONCLUSIONS ...... 126
(d.) THE LACK OF UNDERSTANDING OF WHAT INPUT IS RELEVANT TO A DECISION ...... 126
THE LACK OF A SPELLED OUT DECISION MAKING MECHANISM ...... 130
MANAGEMENT STRUCTURE ...... 131
THE LANGUAGE OF COMMUNICATION ...... 134
RESPONSIBILITY: BOTTOM UP ...... 137
TOP DOWN RESPONSIBILITY ...... 139
DORMANT STAGE ...... 139
THE WILL TO COMMUNICATE ...... 141
NOTES ...... 143

CHAPTER 8 POST-CHALLENGER INVESTIGATIONS ...... 154
THE WORD ‘ACCIDENT’ ...... 154
THE CONFLATION OF GENERAL UNKNOWN RISK OF SPACE TRAVEL WITH THE SPECIFICALLY FOREKNOWN RISK OF THE O-RINGS ...... 155
SAFETY PRIORITY ...... 157
DECISION MAKING ...... 157
SAFETY FIRST? ...... 158
IS THERE A GREATER SENSE OF RESPONSIBILITY NOW? ...... 162
WERE MIDDLE MANAGERS SIMPLY FOLLOWING POLICY? ...... 163
WERE THE MIDDLE MANAGERS MORAL? ...... 164
WAS THE CHALLENGER DISASTER A MISTAKE? ...... 165
WAS THE CHALLENGER DISASTER AN ACCIDENT? ...... 165
WAS THE CHALLENGER DISASTER THE INEVITABLE OUTCOME OF CULTURAL DETERMINISM? AND, WAS MORAL CHOICE AND MORAL RESPONSIBILITY THEREFORE IRRELEVANT AS AN EXPLANATION? ...... 166
THE HISTORY OF O-RING EROSION ...... 168
NORMALIZED DECISIONS? ...... 170
LINKS BETWEEN TEMPERATURE AND EROSION ...... 170
TRUSTING ESTABLISHED PATTERNS OF DEVIANCE? ...... 173

FAITH IN THE SECONDARY SEAL? ...... 173
DID ENGINEERS BELIEVE THAT THE CHALLENGER WAS SAFE TO FLY? ...... 176
WERE THE ENGINEERS UNAWARE OF WHAT WAS A PROBABLE OUTCOME? ...... 178
THE QUESTION OF “HARD DATA” ...... 180
ETHICAL DECISION MAKING ...... 181
CONCLUSION ...... 182
NOTES ...... 183

CHAPTER 9 THE HERALD OF FREE ENTERPRISE DISASTER ...... 198
THE IMMEDIATE CAUSE OF THE DISASTER AND MULTI-RESPONSIBILITY ...... 199
THE ORDERS ...... 202
BOTTOM UP RESPONSIBILITY ...... 204
TOP DOWN RESPONSIBILITY ...... 208
A DYSFUNCTIONAL MANAGEMENT ...... 209
THE LACK OF A CLEAR ATTRIBUTION OF DOMAINS OF RESPONSIBILITY FOR OFFICERS ...... 210
THE LACK OF THE ISSUANCE OF CLEAR INSTRUCTIONS ...... 210
TECHNICAL COMPONENT ...... 212
THE CLOSING OF THE DOORS ...... 213
THE WILL TO COMMUNICATE ...... 213
CONCLUSIONS OF THE COURT ...... 218
NOTES ...... 220

CHAPTER 10 THE KING’S CROSS UNDERGROUND FIRE ...... 223
EPISTEMOLOGICAL FRAMEWORKS COMPARED ...... 224
FENNELL’S EPISTEMOLOGICAL FRAMEWORK ...... 227
THE USE OF WORDS ...... 228
THE CAUSE OF THE FIRE ...... 230
RESPONSIBILITY FOR THE FIRE: TOP DOWN ...... 233
RESPONSIBILITY: BOTTOM UP ...... 235
THE IMPORTANCE OF A SAFETY ETHOS ...... 246
THE ROLE OF MANAGEMENT STRUCTURE IN BLOCKING THE FLOW OF INFORMATION ...... 247
FENNELL’S RECOMMENDATIONS: THE PRIMACY OF SAFETY ...... 249
PHILOSOPHICAL UNDERPINNINGS ...... 251
NOTES ...... 252

CHAPTER 11 THE DISASTER ON MT. EREBUS ...... 256
A SHORT HISTORY OF THE DISASTER ...... 258
THE CHIPPINDALE REPORT ...... 258
JUSTICE MAHON’S CRITIQUE OF THE CHIPPINDALE REPORT ...... 259
THE EVIDENCE FROM THE FLIGHT-DECK TAPES ...... 260
VETTE’S TEXT ...... 265
MACFARLANE’S NOTES ON VETTE’S TEXT ...... 265

MACFARLANE’S REPLY TO 1. (THAT THE CHANGE IN COMPUTER FLIGHT PATH DID NOT MISLEAD THE CREW) ...... 266
MACFARLANE’S REPLY TO 2. (THAT PILOTS VIOLATED INSTRUCTIONS NOT TO DESCEND BELOW 16,000 FEET) ...... 268
MACFARLANE’S REPLY TO 3. (THAT THE CREW WERE UNCERTAIN OF THEIR POSITION) ...... 270
MACFARLANE’S REPLY TO 4. (THAT RADAR WOULD HAVE SHOWN EREBUS AHEAD) ...... 271
TAKING A PHENOMENOLOGICAL VIEW ...... 272
THE WHITEOUT PHENOMENON ...... 272
PHENOMENOLOGICAL APPROACH: TAPES ...... 274
THE COHERENCE THEORY OF TRUTH ...... 276
MISMANAGEMENT ...... 278
THE CAUSE OF THE DISASTER ...... 280
DEFECTS IN ADMINISTRATIVE STRUCTURE ...... 283
DEFECTS IN ADMINISTRATIVE COMMUNICATIONS SYSTEM ...... 284
SUMMARY OF MANAGEMENT DEFECTS ...... 285
THE LACK OF ANY SAFETY ETHOS ...... 287
TOP DOWN RESPONSIBILITY ...... 287
NOTES ...... 289

APPENDICES TO CHAPTER 11 ...... 313
APPENDIX ONE ...... 313
APPENDIX TWO ...... 316
APPENDIX THREE ...... 320
NOTES ...... 320

CHAPTER 12 MORAL RESPONSIBILITY AND TECHNO-ORGANIZATION ...... 322
NOTES ...... 332

BIBLIOGRAPHY...... 334

INDEX ...... 347

CHAPTER 1

ACCIDENTS, TRAGEDIES AND DISASTERS

A prime thesis of this book is that a leading cause of man-made, corporate or organizational disasters is a lack of accountability, especially on the part of top management. This lack of accountability can be traced to, or at least shown to be compatible with, a certain metaphysical view of the world, the view of determinism. The thesis of determinism is that no matter what human beings do, or do not do, certain outcomes are inevitable. In ancient Greek times, to believe that certain outcomes were inevitable was to believe that one was a tool of the play of the gods, or of Fate. In terms of highly consequential and unfortunate events of life, such a view that one was ruled by Fate was linked with the idea that life was inevitably tragic. Tragedy as a form of drama came into being to justify the fated outcomes and in the process to ennoble the participants in the tragic events. The usage of such terms in these modern times has degenerated.1 Whenever a disaster is described as a ‘tragedy’, one does not need to inquire further into the cause or causes of this event because it has now been lifted outside of human power into the domain of Greek drama and fate. As a tragedy, it was fated to be and the only response is to accept it (and others of its kind) as part of the inescapable human situation. One may mourn it and sympathize briefly with the victims. But one is freed (by thinking of it as a tragedy) from the need to consider how one may prevent like disasters from happening in the future. While it may be objected by the reader that this is too purist a view of the everyday person’s understanding of the word ‘tragedy’, it is the view of the present author that one must be careful both of unconscious associations in one’s own mind and of the associations that such a word usage creates in the minds of future readers. The word ‘tragedy’ possesses a grand and influential history.
The tragic heroes and heroines of Greek and Shakespearean tragedy generally stood for higher values and it was because they stood for those higher values combined with some internal character flaws that they determined their own fate. They come down as heroes and heroines to present readers precisely because of a combination of nobility and internal character flaws that brought about their own end. It is Oedipus’ desire to know coupled with his arrogance and rash action that in the end is the cause of his own tragic fate. It is Antigone’s placing of her ethical principles above the law of the land coupled with her open defiance of her uncle, the king, that brings about her tragic fate. It is Macbeth’s ambition to be king coupled with a susceptibility to be influenced by Lady Macbeth that causes him to commit murder. In the cases of disasters caused by a combination of a lack of sound management practices and a lack of ethics on the part of senior and line management, there are no noble heroes or heroines and character flaws should play no role in a properly managed organization.

While few today would be willing to subscribe to a view that life was fated or tragic in the high sense of the term, it is the thesis of the present author that other forms of determinism rule today. The idea of chance or accident has overtaken the whimsical actions of the Greek gods and the mysteries of Technology have overtaken the mysteries of the three Fates of Greek mythology. Such beliefs, which can be referred to as determinism by Accidents and determinism by Technological/Organizational complexity, are argued in the course of this work to be key factors which have influenced decision making that has led to disastrous outcomes. While these are certainly not the only factors, they can play a role in influencing or justifying additional key factors such as a lack of ethical and epistemological accountability. In his book, Normal Accidents: Living with High-Risk Technologies, Charles Perrow has given birth to the idea that certain technological disasters are inevitable due to the interlocking complexity of technology and bureaucratic organizations and that accidents under such circumstances become normal and expected. With such a thesis, cosmic and technological determinism are woven into the same fabric.

THE RULE OF ACCIDENTAL

The tacit thesis that the leading cause of disasters is the accidental typically manifests itself in three different forms. One form in which the accidental appears is in the guise of human error; the second form in which the accidental appears is in the guise of a breakdown of a material or technical component; the third form in which the accidental appears is through the inertia of compliance with the mechanisms of organizational routines. This is not intended as a complete list of the manifestations of the accident or the mistake; it is only intended to be illustrative and educational.

THE EXPLANATION OF HUMAN ERROR

Human error appears in a number of guises. It can appear simply as human error. It can appear as operator error, faulty judgement, a judgement call, and so on. In one and all of these forms, it is ultimately ineliminable because human error is accepted as a fundamental human condition. Human beings, being finite and imperfect, cannot avoid error. As there will always be human error, there is not much that can be done about it. The inevitability of human error is noted in such popular phrases as ‘we all make mistakes’; ‘nobody’s perfect’; and even, ‘it’s human nature’; or, as a justification for someone accused of an error, ‘I’m only human’. Since human error is ineliminable (under anyone’s view), if the disaster is caused by human error, then the disaster was not preventable and the matter can be put to rest.2

THE EXPLANATION OF A BREAKDOWN OF A MATERIAL OR TECHNICAL COMPONENT AND ITS COROLLARY, “RISKY TECHNOLOGY”

One often hears the phrase, ‘it was the part that was faulty.’ If the explanation offered as the cause of a disaster is not human error, it will most frequently be a malfunctioning component part. A famous example of the faulty part explanation is the O-ring malfunction explanation of the Challenger disaster. Since ultimately all material parts are finite and imperfect and subject to breakdown, it is inevitable that eventually any and every part may malfunction. Hence, the explanation of the breakdown of a material component appears, like the explanation of human error, to be ineliminable. Sometimes the material part explanation is compounded with the use of the phrase, ‘high risk technology’, so as to imply that there is an even greater likelihood of a parts breakdown the higher the technology that is in use. What is usually forgotten in the parts analysis are several key questions, such as why such technology was chosen in the first place (in the case of the Challenger, the malfunctioning O-rings were the least safe out of four engineering designs submitted). Thus, it was not so much a case of ‘risky technology’ (that was bound to go wrong), but a case of risky choice on the part of the management that selected the design to be built. Other key questions that are usually neglected in the ‘malfunctioning part’ explanation are: Were there regular, recent and thorough inspections of the part in question? Did the part pass rigorous safety standards? Did any warnings exist that such a part was likely to malfunction with disastrous consequences? The answer to the third question, the existence of prior warnings that the part was likely to malfunction with disastrous consequences, was ‘yes’ in the case of the Challenger disaster, as there was an eight year history of red-flagged warnings concerning the continued reliance on the O-rings.

RISKY OR UNRULY TECHNOLOGY?

From the title of Perrow’s book, Normal Accidents: Living with High-Risk Technologies (1984, 1999), to the title of Diane Vaughan’s 1996 book, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, one gains the impression in post-Challenger literature that one is a victim today of the ever increasing advances in risky technology. Vaughan states in her preface that,

‘The practical lessons from the Challenger accident warn us about the hazards of living in this technological age’. In the last sentence of her preface she writes, ‘This case directs our attention to the relentless inevitability of mistake in organizations - and the irrevocable harm that can occur when those organizations deal in risky technology’. But there was no necessity to fly if the technology were that risky or to choose that technology in the first place. The O-ring design of giant rubber gaskets keeping combustible hot gases from leaking out ranked fourth out of four submitted engineering designs and, according to one important article co-written by Trudy Bell, Senior Editor of the engineering journal IEEE Spectrum, and Karl Esch, ‘The Space Shuttle: A case of subjective engineering’, Vol. 26, No. 6, June 1989, p. 44 [curiously omitted from her list of sources], was the chief cause of the Challenger disaster. (IEEE is the acronym for the Institute of Electrical and Electronics Engineers.) The real culprit was not the ‘risky technology’ of her subtitle, which locates the risk in the technology, but ‘risky assessment’, which locates the risk in the decision to employ that technology while basing the decision on subjective judgements rather than on known safe designs and performance data. The phrase ‘risky technology’ mistakes the effect for the cause. It is more appropriate to speak of risky assessment (not ‘faulty judgment’, for that could be an even more sophistic version of the fallacious inevitable human error hypothesis discussed above). ‘Risky assessment’ clearly locates the problem in the choice of unreliable and, in cases where life and death risks (of others) are being taken, unethical assessment strategies. Perhaps the best descriptive phrase to use is ‘ethically and epistemologically reckless assessment’.
What is lost in the pseudo-concept of technological hazard or risky technology is that there is no need to choose a risky or an unruly technology. Turner and Pidgeon are mistaken (as is Vaughan) when they say that ‘ ... Vaughan points out that the shuttle and its operation were at the cutting edge of high performance aerospace systems. [certainly not the O-rings!] As a consequence the engineers were dealing with the ongoing development of an ‘unruly’ technology’3. Not so: there was no need to choose the poorest design available. Safer designs were available at the time. Aerojet’s design was safer. In fact, the solid rocket joints have actually been completely redesigned. There are now three O-rings and when the boosters are firing, the pressure from the booster makes for a tighter seal. The astronauts now have access to explosive bolt escape hatches and are now provided both with parachutes and space pressure suits. The astronauts are now informed of any potential problems with a launch in question4. As will be discussed in Chapter Eight below, the astronauts could in fact have been saved with the installation of an abort system, whereas Vaughan states that no abort system was possible (despite the fact that an abort system was indeed possible according to a number of sources which are not to be found in her bibliography, including Mike W. Martin and Roland Schinzinger, Ethics in Engineering, Second Edition, New York: McGraw-Hill, 1989, and even a source that she frequently cites, Richard Lewis, Challenger: The Fateful Voyage, New York: Columbia University Press, 1988, which points out that it was an issue of policy, not possibility, that no abort system was installed), leaving her readers with the impression that the two civilians and five astronauts were helpless victims of risky technology.
According to Schwartz, at the time of the Challenger launch, NASA was cutting its safety budget by half a billion dollars (New York Times, 24 April, 1986) and its quality control staff by 70% (New York Times, 8 May, 1986).5

THE EXPLANATION OF ORGANIZATIONAL INERTIA OR BUREAUCRATIC DRIFT

While this is not a common manifestation of the accidental cause, it can be cited as one. The explanation is that as bureaucracy in organizations increases, there is a tendency for human responsibility for decisions to be supplanted by bureaucratic processes. As a result, as the good poet says, ‘there is many a slip twixt cup and lip’. While large organizations inevitably breed inertia, what is neglected in this explanation is that if the decision making of the organization poses life and death consequences for certain individuals, then such decision making cannot be left to the vagaries of inertia.

Management is responsible for ensuring that life and death decisions are not allowed to be decided by bureaucratic drift.

ACCIDENTS WILL HAPPEN

One of the most prevalent of beliefs that tends to undermine the project of disaster prevention management before it gets underway is the belief that accidents are inevitable given the cosmic design, or lack of design, of the universe. Just as the belief in the inevitability of human error can function as a strong disincentive to developing an orientation towards disaster prevention, the belief that accidents will happen and there is no way to prevent them also functions as a strong disincentive to building a comprehensive program of disaster prevention management. The important thing to focus on for the present moment is that both of these are beliefs; they are not established facts in any sense of the word. While no one would attempt to argue for the possibility of eliminating human error, that is not the same as arguing against the possibility of reducing human error to an ineliminable minimum. That human error can be reduced in its occurrence and its effects is one of the by-products of disaster prevention management. How this occurs will be demonstrated in the empirical cases to be examined. But the first step is to demythologize the belief that nothing can be done about human error. The term ‘demythologize’ is preferable to the term ‘deconstruct’ because while deconstruction can take place to serve a number of purposes, the motive behind demythologizing is to attempt to take away the power of the belief in question to influence choices and behavior by showing that the belief is a myth. While human error cannot be eliminated, one can reduce the importance of this fact by replacing the phrase ‘there will always be human error’ with the phrase ‘the incidence and the effects of human error can be greatly reduced’. Likewise with the thesis ‘accidents will happen’: there is no need to argue that accidents will never happen.
In order to properly demythologize the phrase ‘accidents will happen’, it can be altered to read, ‘corporate disasters are largely not accidents at all, but nearly always functions of mismanagement’. Thus, one need not lock horns with the view that “accidents are inevitable”. All one needs to do is to provide sound arguments that disasters of the sort that this book addresses are hardly accidental in their occurrence but arise from a series of bad management practices. These bad management practices can be said to arise from a lack of ethical sensitivity or an ethical consciousness.

THE WORD ‘ACCIDENT’

If the view that is presented in this book turns out to be correct, it is rather important that one avoids the use of the word ‘accident’ to refer to the kinds of disasters under discussion. According to its dictionary definition, the word ‘accidental’ carries with it both the connotations of something that occurs by chance and something nonessential or incidental.6 The use of the word ‘accident’ in this connection carries with it the connotation that whatever happened (as the Challenger “accident”) was inevitable (since accidents are inevitable and the Challenger disaster was an accident, it was inevitable), and therefore could not have been prevented. As will be discussed below, literature concerned with the Challenger disaster abounds with the term ‘accident’ to refer to the Challenger disaster. Even the title of the Presidential Report claims to be the Report of the Presidential Commission ... on the Challenger Accident. It is indeed unfortunate that such a prestigious body adds its weight through its choice of language to the belief system that disasters such as the Challenger were not preventable and, also, as the word ‘accident’ also connotes, were somewhat trivial in nature. (One does not refer to an earthquake, for example, as an accident even though it was not preventable by known human means.) While the Presidential Committee might not have thought that the Challenger disaster was not preventable, their use of the term ‘accident’ in the title and throughout their report carried with it an influence all of its own. The thesis that “accidents will happen” and that therefore nothing can be done to prevent their occurrence reaches its logical fulfillment in the thesis of Charles Perrow that accidents are so inevitable and therefore non-preventable that we are even justified in calling them

“normal”. Such a usage does extreme violence to the concept of an accident, which by definition would be an occurrence that was not a regular occurrence and therefore abnormal rather than normal. Perrow’s intention must be to create the impression that “accidents” are part of the normal sequence of events. But statistically this is not the case. Nevertheless, we find the main title of his book to be Normal Accidents, where he not only refers to such major disasters as Three Mile Island as accidents (here, he is not alone, as the Presidential Report also refers to the Three Mile Island Accident), but moreover implies that nothing at all can be done to prevent their occurrence. Nothing could be further from the thesis of this present book. Perrow’s view, however, cannot be simply dismissed. He argues that the high technology of today carries with it such interlocking complexity that it is impossible to eliminate the possibility of accidents. The problem with his analysis is that one must bear in mind that his thesis that “accidents are inevitable and normal” is a belief, and not a given fact. Some of the evidence that he provides in his book does not support his conclusion. For example, in his treatment of the Erebus disaster, which occupies three pages of discussion, he cites the complexity and the coupling of the system that resulted in the substitution of the wrong programming for the pilot to follow. In the chapter of this present book, ‘The Disaster on Mt. Erebus’, it is argued that the fact that this substitution went undetected and/or was not communicated to the pilot was due to faulty management practices and a lack of an ethical imperative, both of which could have been otherwise.7 If one leaves it as simple “complexity and interactive coupling”, then management is not responsible.
The Erebus example, at the very least, does not support Perrow’s conclusion that this disaster was ineliminable, unless one simply wants to hold onto the implacable belief that the complexity of causes is in and of itself of such a nature that there is no place at all for the concept of responsibility. The ‘accident’ and ‘tragedy’ language pervades many sources. Claus Jensen’s book on the Challenger, published by Farrar, Straus and Giroux in 1996, bears the title No Downlink, A Dramatic Narrative about the Challenger Accident and Our Time. In Rosa Lynn B. Pinkus, Larry J. Shuman, Norman P. Hummon, and Harvey Wolfe, Engineering Ethics, Balancing Cost, Schedule, and Risk, Lessons Learned from the Space Shuttle, published by Cambridge University Press in 1997, the term ‘accident’ appears on page 140 when describing the Challenger; ‘accident’ is used three times in the next two pages, ‘tragic accident’ on pp. 240 and 278 (a double whammy), and the term forms part of the title of chapter 14, thus appearing as a header of every page from p. 277 to p. 327 in the powerfully influencing phrase, ‘The Challenger [sic] Accident’. When the Challenger disaster is referred to as a tragedy, it summons up an aura of inevitability and nobility when, in fact, it was horrific and not tragic at all. To reference it as ‘tragic’ implies not only that it was noble, and hence in some important sense worthy and redeeming, but also that it was fated and thus unpreventable. The term ‘tragedy’ takes the responsibility for the decision out of the hands of the launch decision makers altogether and assigns the responsibility to a higher force. Today it is the gods of Technology instead of the gods of Fate. The meaning of ‘tragedy’ as mournful serves to focus attention on the victims and not on the responsible decision-makers. In Caroline Whitbeck’s Ethics in Engineering Practice and Research, published by Cambridge University Press in 1998, there is a mixed usage. The term ‘accident’ is freely interspersed with ‘disaster’, thus diluting the meaning of ‘disaster’. For example, the word ‘accident’ appears five times on p. 144 to three uses of ‘disaster’. On p. 145, ‘accident’ appears three more times. The very first page of Perrow’s new Afterword to his 1999 edition of Normal Accidents: Living with High-Risk Technologies features the word ‘accident’ no less than ten times, and the word recurs throughout the Afterword. 
Whatever arguments do appear conferring ethical responsibility on decision-makers, the incessant reference to the outcome as an ‘accident’ trivializes and neutralizes the ethical responsibility supposedly attributed to those decision-makers. The subliminal effect of this word usage in terms of influencing the possibility of future Challenger-like disasters cannot be ignored. At the same time it reveals metaphysical belief systems most likely shared by those who made the decision to launch: metaphysical beliefs that exempted them from responsibility in advance. It is important to take note of metaphysical belief systems that are ethically neutering and to address this issue. 

The practice of employing the words ‘accident’ and ‘tragedy’ in literature on the Challenger and other corporate disasters is endemic. Diane Vaughan, whose later book on the Challenger was to attract much attention, made an early start as an arch user of the accident and tragedy language. On the first page of her article, ‘Autonomy, Interdependence, and Social Control: NASA and the Space Shuttle Challenger’, Administrative Science Quarterly, Vol. 35, No. 1, March 1990, the word ‘accident’ appears nine times, three times further buttressed by the phrase ‘technical system’ and once by the adjective ‘normal’. The word ‘accident’ is employed a total of forty-seven times in the course of her article. ‘Tragedy’ appears twice on page one and a total of nine times during the course of her article. In a later article, as mentioned above, she switches her preference to ‘tragedy’, as in the title, ‘The Trickle-Down Effect: Policy Decisions, Risky Work, the Challenger Tragedy’, California Management Review, Vol. 39, No. 2, Winter 1997. (Here the concept of “risky work” also makes an appearance – the real question is whose choice it is and whether it is necessary to work under risky conditions.) Such language implies a lack of responsibility since accidents do happen and tragedies are of course due to Fate. Occasionally, there has been a change from the use of the word ‘accident’ to the word ‘incident’, as in the title of Joseph R. Herkert’s otherwise excellent article, ‘Management’s Hat Trick: Misuse of “Engineering Judgement” in the Challenger Incident’, in the Journal of Business Ethics, Vol. 10, No. 8, August 1991. Herkert’s usage in his article is evenly distributed between the two terms ‘accident’ and ‘incident’. In ‘Risk Reporting And The Management of Industrial Crises’, Dorothy Nelkin does raise the issue of the use of words and the importance of the proper word choice. 
For example, she asks, ‘Was Three Mile Island an ‘accident’ or an ‘incident’?’ And she goes on to comment that, ‘Selective use of adjectives can trivialize an event or render it important ... define an issue as a problem or reduce it to a routine’. Cf. Journal of Management Studies, Vol. 25, No. 4, July 1988, p. 348. Nevertheless, the trend to use the accident and tragedy language carries on. An interesting new trend may be indicated in the Post-Modern mixture of the language of ‘disaster’, ‘accident’ and even ‘catastrophe’, ‘catastrophic accident’ and ‘calamitous accident’ at several junctures to describe the Challenger disaster in Ann Larabee, Decade of Disaster, published by the University of Illinois Press in 2000. (While the present author originally thought that the ‘Post-Modern’ description was tongue in cheek, it was later discovered that Larabee’s work on the Challenger actually received the Postmodern Culture Journal’s Electronic text award.) The addition of the word ‘catastrophe’ to describe the Challenger disaster betokens the powerful hand of both Chance and Fate at once, since one tends to think of catastrophes, like a comet crashing into the earth, as something which is not due to man’s actions and at the same time is unpreventable. ‘Calamity’ is not much better. It emphasizes the woeful side of the matter and thus falls in more with the ‘tragedy’ language while still borrowing the association with Chance and Fate that is connoted by ‘catastrophe’. With such a variety of descriptions the Challenger disaster becomes more and more opaque in Larabee’s treatment, at one point even becoming a ‘necrodrama’, thus casting the Challenger disaster as an even more ghoulish spectacle than a ‘tragedy’. A ‘necrodrama’ presumably would be a kind of horror movie which would provide delectable appeal to those of certain theatrical and amoral (or immoral) palates. Perhaps the change from ‘tragedy’ to ‘necrodrama’ is a reflection of the changing sensibility of the times.

THE BELIEF IN MONOCAUSALITY

While it may be argued that no one takes the thesis of single causality seriously nowadays, there is a great deal of discussion of the concept of “human error”, which seems to attempt to place blame for disasters on the shoulders of a particular individual or individuals. Such an approach, which tends to ignore systemic management factors, would seem to be subject to the belief that events take place in isolation from a network or system of interacting causes. Whenever there is a strong effort to fix blame on a certain individual or group, such as is noticed in the case of the Herald of Free Enterprise disaster or the case of the disaster at Mt. Erebus, which was and still is argued to be a case of “pilot error”, there is frequently a behavior known as ‘scapegoating’. This was especially true of the Presidential Commission’s findings in the case of the Three Mile Island disaster, where they argued that ‘the major cause of the accident was due to the inappropriate actions by those who were operating the plant’8. Even in cases where multiple causes are considered, as in the Presidential Report on the Challenger disaster, the causes are separated out as if they operated in isolation from each other as single causes, divided into technical causes on the one hand and faulty management causes on the other. In this case, as will be seen below, it is not at all clear that the technical cause can, properly speaking, be considered a cause, or at least not a primary cause, as it is so designated in the Presidential Report. If the idea of causality is to be utilized, it must be borne in mind that no cause operates in splendid isolation from other causes. The belief in monocausality may lead the unwary thinker either to find particular scapegoats on which to affix blame or to single out technical factors on which to place major blame. 
A cause cannot operate singly: it always operates as an ingredient in a network of connections. In order better to appreciate the causes of a disaster, it is best to steer clear of the belief in single causes operating alone, or in small clusters, apart from some systemic background out of which they emerge. The reason why it is important to appreciate fully the systemic origin of any particular cause is that one way of initiating a change in how disasters are approached is to look for systemic causality in addition to the particular causation of any single element in the system. It is not that by paying attention to systemic factors one will ignore particular causes; rather, one will be in a better position to initiate a possible change in the causal pattern if the matter is approached both from the perspective of the individual causal agent and from the perspective of the systemic background out of which that causal agent arises. When the systemic background is fully considered, it should be discovered that no one cause operates, or can operate, in isolation from both a general system of causation and the cooperation of a series of co-contributing causes. Some so-called single causes, such as the technical defect in the O-ring design in the case of the Challenger disaster, are not, properly speaking, single causes at all, as by themselves they possess no capacity for action but are set