Thomas Virgona Doctoral Dissertation: Defense
September 11, 2001:
A Study of the Human Aspects of Disaster Recovery Efforts for Wall Street Financial Services Firms
A Dissertation
Submitted to the Faculty
Of
Long Island University
By
Thomas James Virgona
In partial fulfillment of the requirements for the degree
Of
Doctor of Philosophy in Information Studies
Spring 2008
Thomas Virgona
74 Waverly Avenue
East Rockaway, New York 11518
516-599-2890
This dissertation is dedicated to public use; copying and reprinting are encouraged.
Page: 1 of 237
Dedication
To my loving wife of over twenty years, for her unwavering support and confidence. Without Denise, who has been my emotional anchor through not only the challenges of my doctoral studies, but my entire adult life, this achievement would hold no value. To my mother, for instilling the importance of hard work and higher education.
To my late father, who was my role model for persistence and personal sacrifice, and who instilled in me the inspiration to set high goals and the confidence to achieve them.
To my children, TJ, Nicole and Joey, who have grown into three of the most special people any parent could wish for.
To my colleagues in the program, especially Dan and Chris, who have provided (largely useless) advice and support during the coursework, comprehensive exams and dissertation process.
I wish to thank my committee members who were more than generous with their expertise and precious time. A special thanks to Dr. Hunter, Dr. Knapp, and Dr. Hildreth, for their countless hours of reflecting, reading, encouraging, and most of all patience throughout the entire process. Thank you Dr. Westermann and Dr. Koenig for agreeing to serve on my committee.
Table of Contents

1 INTRODUCTION AND CONTEXT ...... 8
1.1 A SUCCINCT STATEMENT OF THE PROBLEM UNDER INVESTIGATION, INCLUDING ITS IMPORTANCE TO THE DISCIPLINE ...... 8
1.2 THEORETICAL FRAMEWORK ...... 20
2 LITERATURE REVIEW ...... 30
2.1 INFORMATION SYSTEMS TECHNOLOGY – THE BEGINNING ...... 31
2.2 THE GROWTH OF INFORMATION SYSTEMS TECHNOLOGY ...... 33
2.3 CHANGES TO DISASTER PLANNING CAUSED BY THE COLD WAR ...... 36
2.4 CHANGES TO DISASTER PLANNING IN RELATION TO OTHER “DISASTERS” ...... 42
2.5 INFORMATION SYSTEMS AND TECHNOLOGY – THEORIES AND METHODOLOGIES ...... 48
2.6 THE HUMAN COMPONENT OF INFORMATION SYSTEMS TECHNOLOGY ...... 56
2.7 THE RELATIONSHIP OF THE EVENTS OF SEPTEMBER 11, 2001 TO INFORMATION SYSTEMS ...... 61
2.8 SCHOLARLY LITERATURE: THE IMPACT OF SEPTEMBER 11, 2001 IN VARIOUS DISCIPLINES ...... 64
2.9 SEPTEMBER 11, 2001: INFORMATION TECHNOLOGY AND DISASTER RECOVERY ...... 74
3 STUDY DESIGN AND METHODOLOGY ...... 89
3.1 RESEARCH QUESTIONS ...... 89
3.2 OPERATIONAL DEFINITIONS AND LIMITATIONS ...... 94
3.3 A SPECIFIC METHODOLOGY AND JUSTIFICATION ...... 97
3.4 SAMPLE DATA-GATHERING IMPLEMENTS ...... 98
3.5 A STATEMENT IDENTIFYING POTENTIAL ANALYTICAL METHODS AND EXPECTED RESULTS ...... 105
3.6 MAPPING OF METHODOLOGICAL TECHNIQUES TO RESEARCH QUESTIONS ...... 106
4 RESULTS ...... 108
4.1 UNSTRUCTURED INTERVIEWS ...... 108
4.1.1 Senior Technology Manager ...... 108
4.1.2 Help Desk Manager ...... 110
4.1.3 Application Manager ...... 113
4.1.4 Network Engineer ...... 116
4.1.5 Business User ...... 118
4.1.6 Database Administrator ...... 121
4.1.7 Summary ...... 123
4.2 DISASTER RECOVERY TEST OBSERVATIONS ...... 124
4.2.1 Summary ...... 127
4.3 FOCUS GROUP ...... 128
4.3.1 Summary ...... 130
4.4 ARTIFACT ANALYSIS ...... 131
4.4.1 Asset Sales System ...... 131
4.4.2 Global Technology Department ...... 136
4.4.3 Funds Transfer Application ...... 140
4.4.4 Globally Deployed Application ...... 140
4.4.5 Loss of a Building ...... 144
4.4.6 Summary ...... 150
5 DATA ANALYSIS AND FINDINGS (AND RELATIONSHIP TO PRIOR RESEARCH) ...... 152
6 CONCLUSIONS AND RECOMMENDATIONS FOR FURTHER STUDY...... 167
7 REFERENCES...... 172
8 GLOSSARY ...... 202
9 APPENDICES ...... 205
9.1 APPENDIX A: SDLC DELIVERABLES AND APPROVERS ...... 205
9.2 APPENDIX B: TYPES OF RESEARCH DESIGN ...... 208
9.3 APPENDIX C: INFORMED CONSENT FORM ...... 209
9.4 APPENDIX D: INSTITUTIONAL REVIEW BOARD APPROVAL ...... 210
9.5 APPENDIX E: INTERVIEW SCRIPT ...... 212
9.6 APPENDIX F: DEFENSE ACCEPTANCE FORM ...... 214
9.7 APPENDIX G: INDIVIDUAL INTERVIEW TRANSCRIPTS ...... 216
List of Tables
Table 1 - Dynes’ Four Types of Tasks ...... 43
Table 2 - Definition of Terms Used in this Dissertation ...... 94
Table 3 - Berg’s Four Steps for Conducting a Focus Group ...... 99
Table 4 - Methodological Techniques Used to Address Research Questions ...... 106
Table 5 - Recovery Strategy ...... 138
Table 6 - Loss of Staff ...... 139
Table 7 - Assembly Points ...... 145
Abstract
Single events have impacted information technology and the general society. The effects of Gutenberg’s printing press were extraordinarily far-reaching: speed, uniformity of texts and relative cheapness (Eisenstein 1979). Another groundbreaking event in the field of information technology was the Soviet Union’s launch of Sputnik in October 1957. As a result, the American government, educators and general society placed heavy emphasis on organizing scientific information, increasing science and technology education, and establishing a national center for scientific and technical information. A more recent phenomenon is the diffusion of computers and, more specifically, the Internet. The Internet is the fabric of our lives, a ubiquitous presence. Information technology is the present-day equivalent of electricity in the industrial era.
This research examined the changes to disaster recovery plans for financial firms located in the Wall Street area since the events of September 11, 2001. The literature indicates that disaster recovery plans usually rely upon human capital and expertise, which has required critical information service providers (e.g., the financial services industry) to reexamine existing contingency plans.1 This dissertation investigates the role people played in the disaster recovery efforts and the subsequent updates to disaster recovery plans to account for those roles and tasks.
Why study disasters at all? There is one main reason: The scholarly study of disasters helps answer important questions as societies try to maintain order in the face of uncertainty
(Robert Stallings in Quarantelli 1998). For this dissertation, a non-experimental (exploratory and descriptive) design was used. Qualitative analysis is intended to produce an explanation of a phenomenon, particularly an identification of patterns. Specifically, the study included a
focus group, unstructured interviews, observations of a disaster recovery test, and a content analysis of disaster recovery plans. The research goal was to uncover what, if anything, we have learned from the events of September 11, 2001.
The devastating loss of life on September 11, 2001 was concentrated in the financial industry. Fatalities in that industry represented over 74 percent of the total civilian2 casualties in the World Trade Center attacks, and one firm, Cantor Fitzgerald, lost 658 employees
(General Accounting Office 2003). Ironically, September 11, 2001 was not the first such attack aimed at Wall Street. On September 16, 1920, a horse-drawn wagon carrying hundreds of pounds of explosives was detonated at the corner of Wall and Broad Streets in lower
Manhattan, killing thirty people and causing the New York Stock Exchange to close. The exchange reopened the next day and banking and financial activity quickly returned to normal
(Brooks 1969). One difference between the incident in 1920 and 2001, aside from the magnitude of the loss of life, is the reliance on technology. Lacker calls this a “technology shock”: significant damage to operational capability due to either the inoperability of physical capital or the loss of staff (Lacker 2003).
The events of September 11, 2001 literally struck at the heart of America’s financial information center, causing both immediate and long-term adjustments. Information technology professionals on Wall Street on September 11, 2001 were placed in a unique position at the center of a disaster, concerned for family members and asked to recover information systems with little or no “status” information. These professionals performed the
1 The penalty of late information, even if only a few minutes late, can be catastrophic to a company (Horton 1985).
2 In this specific context, fire fighters and police are not considered “civilian”.
task to the best of their abilities despite enormous distractions. What was learned during this process may be critical for future designers of information systems: dependence on human resources is critical during emergencies, recovery plans may provide little or no value during an actual emergency, assumptions about travel or communications may prove invalid, and dependency on other firms may be essential.
1 Introduction and Context
1.1 A Succinct Statement of the Problem under Investigation, Including its Importance to the Discipline
Information systems and technology have changed as a result of both major revolutionary events and smaller evolutionary adjustments. Scholars have devoted a tremendous amount of focus to three “events” (the printing press, the launch of Sputnik and the Internet) that had dramatic effects on how information technology impacted the general society. In the late 15th century, the reproduction of written materials began to move from the copyist’s desk to the printer’s workshop because of Gutenberg’s invention of the printing press in Mainz, Germany. The effects of Gutenberg’s invention were extraordinarily far-reaching: speed of reproduction, uniformity of texts and relative cheapness (Eisenstein 1979).
The invention of movable type has been discussed as one source that ignited the Protestant
Reformation. There is a tendency to forget the awesome power the church had in the area of information control. The weekly church sermon provided news, real estate transactions and other mundane matters. The Roman Church had moved against Bible-printing and even developed a new form of censorship. Gutenberg’s invention represented a fundamental shift of informational control away from the church to other disseminators of information.
Some of the subtle nuances of the introduction of the printing press included the ability for dispersed readers simultaneously to read maps, images, text and diagrams. Books that were known to be banned had a built-in attraction (Eisenstein 1979). Eisenstein noted the new kinds of medical self-help that ensued, as physicians were killing more patients than they saved (Eisenstein 1979)!
Possibly no social revolution in European history is as fundamental as that which saw book learning (previously assigned to older men and monks) gradually become the focus of daily life during childhood, adolescence and early adulthood. It also widened the gap between literate and non-literate cultures in a manner that placed the well-read adult at an increasing distance from the unschooled small child (Eisenstein 1979)3. There are arguments for regarding Guttenberg’s invention as part of a continuously unfolding process. For at least
50 years after the introduction of movable type, there was no striking evidence of a cultural shift. Eisenstein (1979) argues that one must wait a full century for the evidence to emerge into full view.
The printing press also introduced some unique organizational shifts. Early printers were not only responsible for printing reference guides, but also compiling them (Eisenstein
1979). The growing power of the press as an independent group empowered all classes.
Gifted boys who might have become preachers became publicists.
“Perhaps no social revolution in European history is as fundamental as that which saw book learning gradually become the focus of daily life during childhood, adolescence and early manhood” (Eisenstein 1979, page 432). One of the most important cultural shifts resulting from the printing press was learning by reading, with the transmission of knowledge becoming much more efficient. “The nature of collective memory was transformed”
(Eisenstein 1979, page 66). The act of putting the Bible in everyone’s hand did encourage splintering of congregations and a new tendency towards religious self-help.
3 "The prospect of tackling this subject, far too vast to be assessed by any present or future assemblage, is apt to daunt even the most audacious individual. If it is too vast to be handled by any single scholar, however, it is, by the same token, also too vast to be avoided by any single scholar" (Eisenstein 1979).
Before the printing press, scientific knowledge was slow to spread. Charting the planets, mapping the earth, codifying laws, synchronizing chronologies and compiling bibliographies were all revolutionized by the printing press (Eisenstein 1979). Main centers of knowledge dispersed. The local storyteller was replaced by the literate villager. With the new editions of books and dictionaries, scholars found teaching to be an easier task, matching information from books to lectures (Eisenstein 1979). Journals sped up the circulation of scientific news and enabled scattered scholars to keep abreast of each other’s work. Hebrew and Arabic studies gained new momentum, as medieval Bible studies had depended upon oral contact with the Jews and Greeks. The result was a move from manual to mental labor. The printer’s workshop attracted the scholars of the day. Easily transmitted information enabled each subsequent generation to probe deeper into the past and advance beyond the position of its predecessor. The new format of technical literature increased scholars’ ability to cite prior works. One question still remained: “Had technology gone to press or was it still largely concealed?” (Eisenstein 1979, page 555), or as stated in Polanyi’s maxim: “we know more than we can tell” (Srikantaiah and Michael 2000).
In the 15th century, the Roman Catholic Church was an integral part of people’s lives, in many ways shaping thoughts and perceptions. Many changes in society are caused by the introduction of technology. Today, people have integrated computers into their thought processes, into the way they work and think. As the printing press had a large impact on the way information was used throughout society in the 15th century, will the events of September
11, 2001 have a similar impact on information service providers in the area of disaster recovery, specifically the firms on Wall Street? Will “owners” of the financial information systems be forced to change their views on the protection of information, not unlike the
changes the Roman Catholic Church went through as a result of the introduction of the printing press? As Postman (1992) said, new technologies alter the structure of our interests: the things we think about. Modern computer technology has made information available in many forms and virtually instantaneously. As a result, critical information service providers need to respond to outages or users may go elsewhere for information, just as the Roman
Catholic Church discovered.
A second significant event in the field of information technology was the Soviet Union’s launch of Sputnik in October 1957. Kippenberger, looking at the beginning of the e-commerce boom and its worldwide growth, believes it all began after the Soviet Union launched Sputnik in 1957 (Kippenberger 1999). Since Sputnik, the growth in American technology has been unprecedented. When the Russians astonished the world with Sputnik I, even the general public became concerned. Governments listened as well and provided funding using the following rationale: because science and technology are strategically important for society, all efforts that help them, information activities in particular, are also important (Saracevic 1999). President Eisenhower was much less concerned about the Soviet actions than was the general public but nonetheless substantially altered many defense programs in order to meet perceived public demands. The President acknowledged privately that at least two-thirds of a spending supplement was used to meet public fears, not real security needs (Payne 1994).
On October 9, 1957, Eisenhower faced the press for the first time since the launch. Seeking to calm Congress and the public, he assured reporters that Sputnik contained "no additional threat to the United States," adding that "from what [the Soviets] say, they have put one small ball in the air." When asked how his administration could have let the Soviets be first in space, Eisenhower said that "no one ever suggested to me . . . a race except, of course, more than once we would say, well, there is going to be a great psychological advantage in world politics to putting the thing up, but . . . in view of the real scientific character of our development, there didn't seem to be a reason for just trying to grow hysterical about it." He added that he had provided the
U.S. satellite and missile efforts with funds "to the limit of my ability . . . and that is all I can do." (Anonymous 1957)
The launching of the Sputnik satellite into space by the Soviets in October of 1957 prompted ample public interest in the U.S.A. and led to a mini-explosion in Soviet studies.
New Russian and Soviet study programs were established at many universities and the few already existing programs got a second wind (Zilper 2002). In the U.S., the National Science
Foundation (NSF) Act of 1950 established NSF and provided funding for a number of mandates, among them “to foster the interchange of scientific information among scientists in the U.S. and foreign countries” and “to further the full dissemination of [scientific and technical] information of scientific value consistent with the national interest”. The 1958
National Defense Education Act (also known as the “Sputnik Act”) enlarged the scope of the
National Science Foundation to include a task “to develop methods, including mechanized systems, for making scientific information available”. By those mandates, an NSF division, which after a number of name and direction changes is now called the Division of Information and Intelligent Systems (IIS), has supported research in these areas since the 1950s.
Importantly, the field-defining studies that NSF supported included, among others, Cranfield
IR4 evaluation studies in the 1950s and 1960s, large portions of the SMART studies from the
1960s to 1990s, and now the Digital Libraries Initiatives (Saracevic 1999). Information
Science developed and flourished, as did many other fields, due in large part to government support by a host of agencies. Historically, the support was a success: it was instrumental in the creation of the whole enterprise of information science and even of the online information industry based on Information Retrieval (IR) (Hahn 1996). But to the credit of information science, it kept growing on its own even after government support slackened substantially. The same cannot be said of a number of other fields or areas that
floundered after the government stopped being their main source and resource (Saracevic
1999).
Workers are now technically educated and able to take initiative, which leads to a third growth event in information technology: the introduction of the Internet. Of all the technical innovations featured in the ARPANET, forerunner of the Internet, perhaps the most celebrated was packet switching. The Internet is the fabric of our lives, a ubiquitous presence.
Information technology is the present-day equivalent of electricity in the industrial era. In reality, the social impact of cyberspace upon the individual is only beginning to be understood
(Conway et al 2003). Winner suggests the most significant challenge posed by the linking of computers and telecommunications is the prospect that the basic structures of political order will be recast (Winner 1986).
The present time could be characterized as the era of the Internet. The Internet seems to have some positive effect on social interaction, and it tends to increase exposure to other sources of information. The Internet provides a tool to give a voice to people who would otherwise find difficulty in obtaining that voice. However, the Internet still has difficulty in attracting the most deprived and socially excluded in society. It is in these respects that the
Internet, rather than providing a vehicle for liberation, serves to reinforce the prevailing control, as the more powerful have the louder and more eloquent voices (Conway et al 2003).
Indeed, wider access and participation in the information society is paramount for broader issues of social inclusion. Many theorists reject any suggestion that the “information revolution” has overturned everything that went before. On the contrary, they explain that it is primarily an outcome and expression of established and continuing relations (Webster 1995). Herbert Schiller suggests that the information explosion of the
4 The Cranfield experiments were designed to evaluate the performance of various indexing languages in retrieval.
post-war years is the consequence, for the most part, of corporate capitalism’s inexorable march (Webster 1995).
The Internet is not just a technology, but a technology of freedom. The fundamental digital divide is not measured by the number of connections to the Internet, but by the consequences of both connection and lack of connection (Castells 2003). Technology is a vitally important aspect of the human condition. Technologies feed, clothe, and provide shelter for us; they transport, entertain, and heal us; they provide the bases of wealth and of leisure; they also pollute and kill (Mackenzie and Wajcman 1999). Castells defined the digital divide along the following categories: income, education, age and ethnicity.
Littlefield researched the impact of the Internet on real estate sales. Internet access provides convenience and opportunity for home buyers, providing the ability to search the
Internet for house-related information. The younger generation of home buyers appears to use the Internet to aid their home purchases more than older generations do (Littlefield et al.
2000).
It is critical to note that these three events (the printing press, Sputnik, and the
Internet) were protracted historical developments and did not occur as singular phenomena.
Fischer refers to the reduction of these extended trends to momentary transformation as
“telescopic fallacies” (Fischer 1970). An exhaustive study of any or all three of these innovations and their origins would demonstrate an extended development timeline.
Eisenstein (1979) favors the gradualist, evolutionary approach. However, these innovations are treated here as singular phenomena to illustrate their impact on information systems, not to compress a long story into a short one. Historians find similar telescoping problems when survey respondents recall events (Fischer 1970).
The underlying issue of this research is that on September 11, 2001, the terrorist attacks that struck downtown Manhattan rendered Wall Street area financial services firms unable to provide critical information services. This research investigates the role people played in the disaster recovery efforts and the subsequent updates to disaster recovery plans to account for these roles and tasks. The research indicates that the disruption to the workings of the financial information systems rendered them unusable by customers and clients. It is important to remember that many of the system failures and outages that occurred on that fateful day are not public knowledge and are treated as confidential information.
Social scientists generally agree on what disasters are and how they are distinguished from other social phenomena (Kreps, and Kroll-Smith/Gunter in Quarantelli 1998). For this study of September 11, 2001, the Porfiriev definition is sufficient (in Quarantelli 1998). A disaster is a condition destabilizing the social system that manifests itself in a malfunctioning or disruption of connections and communications of a social unit, partial or total destruction, making it necessary to take extraordinary or emergency countermeasures to reestablish stability (Kreps, and Kroll-Smith/Gunter in Quarantelli 1998).
Gilbert classifies disasters into three paradigms (Quarantelli 1998). The first is a catastrophe imputed by an external agent or human communities reacting against an aggression. Gilbert calls this a duplication of war. The second disaster is an expression of social vulnerabilities. The third disaster is an entrance into a state of uncertainty.5 September
11, 2001 would fall into the first paradigm – duplication of war. These events involve considerable harm to the physical and social environment. They happened suddenly and something might have been done to mitigate their effect before or after they happened (Kreps in Quarantelli 1998).
The devastating loss of life was concentrated in the financial industry. Fatalities in that industry represented over 74 percent of the total civilian casualties in the World Trade Center attacks, and one firm, Cantor Fitzgerald, lost 658 employees (General Accounting Office
2003). It is not specifically known how many held positions in information technology or were responsible for disaster recovery tasks. Terrorist attacks on physical infrastructure are capable of interrupting major financial, banking and payment functions. Ironically,
September 11, 2001 was not the first such attack aimed at Wall Street. On September 16,
1920, a horse-drawn wagon carrying hundreds of pounds of explosives was detonated at the corner of Wall and Broad Streets in lower Manhattan, killing thirty people and causing the
New York Stock Exchange to close. The exchange reopened the next day and banking and financial activity quickly returned to normal (Brooks 1969). One difference, aside from the magnitude of the loss of life, between the incident in 1920 and 2001 is the reliance on technology. Lacker calls this a “technology shock,” a significant loss of operational capability due to either the loss or malfunction of physical capital or the loss of staff (Lacker
2003).
One of the most visible disruptions was that the New York Stock Exchange (NYSE) ceased all operations for four business days. Although the NYSE was not a direct target of the attack, its dependency on other financial systems (e.g., inter-bank payments and the Federal Reserve Bank) made normal business operations impossible. At the core of the issue, however, was the disruption of inter-bank payment systems (Lacker 2003). On that Wednesday (9/12/2001),
5 Disasters need to be studied within human groups, and not as the results of external factors. The framework for disaster is not conflict or external attacks, but the result of upsetting human relations.
Richard A. Grasso, chairman of the New York Stock Exchange, vowed that U.S. stock trading would resume no later than Monday (9/17/2001) (Blustein and Day 2001). The decision to shut down the NYSE, and when to return to operation, was a difficult one, fraught with risk. There was a risk of bringing the markets back too soon if too few participants were functioning again. Also, poor liquidity could hamper trading and exacerbate the expected price declines. Moreover, physical conditions in Lower Manhattan were unpleasant and potentially harmful. Conversely, the symbolic value of a return to normalcy was very attractive. The questions remain unanswered: Were these plans fully tested in regularly scheduled disaster recovery simulations? Were operational and technology decisions on September 11, 2001 dependent on individuals?6
From an economic and operational perspective, the banking system was in relatively healthy condition on September 11, 2001. From a geographical perspective, it was a true disaster. The facilities of the New York Board of Trade in Four World Trade Center were destroyed. Several firms, including the Federal Reserve Bank itself, were forced to relocate to disaster recovery sites. Regional stock exchanges, the NASDAQ Stock Market, the Chicago
Board of Trade, the Bond Market Association and the Chicago Mercantile Exchange all closed as well. European markets remained officially open but from a “human” perspective, traders found it difficult to do much business. Connections to the Bank of New York (BoNY) were lost for part of the week and as a result the bank did not know what securities and cash it had received, and it was unable to transmit settlement instructions (Costa 2001).
6 In some scenarios, individuals do not make decisions to implement disaster recovery plans. For example: during a power outage, computer systems cut over to auxiliary power without human intervention.
On the Federal Reserve’s Fedwire Funds Transfer System, payments are initiated by the sender of funds, but a major bank’s inability to send funds transfer payment instructions following the September 11, 2001 attacks meant that funds accumulated in that bank’s account. At one point during the week after September 11, BoNY was publicly reported to be overdue on $100 billion in payments (Beckett and Sapsford 2001). The Moscow International Currency
Exchange (MICEX), which used BoNY as a business partner, suspended trading due to
BoNY’s problems.
Banks with excess balances found it difficult to locate borrowers. The general disruption in payment flows meant uncertainty for many banks about whether scheduled incoming payments would be received as planned (McAndrews and Potter 2002). This lack of clarity on bank finances at a macro level caused trickle-down concerns at a microeconomic level. Reports of increased cash withdrawals by bank depositors were common. Currency in circulation increased by $4.4 billion from Monday (9/10/2001) to Wednesday (9/12/2001)
(Lacker 2003). The government securities market also was hit particularly hard because many critical market participants were incapacitated, in part because the government securities market opens earlier than the stock market. Trading in U.S. government securities starts at 8 a.m. in New York. Dealers in U.S. government securities trade with each other through inter-dealer brokers (IDBs). Cantor Fitzgerald, which suffered tragic losses, was the largest IDB prior to the attack.
Two hundred thousand voice access lines, 100,000 PBX/Centrex lines and 3.6 million data circuits went out; 10 cellular towers were lost or damaged; and approximately 14,000 businesses and 20,000 residential customers were affected (Lacker
2003). The disruption to communications links impaired many institutions’ ability to initiate payment instructions. The failure of many communications links between government
securities dealers and the market clearing and settling institutions was also a source of major disruption. The voice communication systems that, as a contingency, replaced e-mail and computer system communication for conducting business proved to be unreliable (Lacker 2003).
In New York and Washington, bank branch closings were widespread, but many banks outside those cities closed branches briefly as well. Some state banking agencies and the Office of the Comptroller of the Currency issued statements allowing banks to close at their discretion (Lacker 2003). Keeping in mind that Wall Street is a major information service provider, the financial world was in a state of flux immediately after September 11, 2001. There was no reliable way to know one’s bank balance, whether checks had been processed, or the status of one’s stock and bond portfolio.
What happened the first day the financial markets re-opened? The Dow Jones Industrial Average experienced its largest single-day point drop to that date, with the Industrials tumbling more than 680 points (7%). By comparison, on Tuesday, October 29, 1929, Wall Street witnessed a 13% drop, known in financial lore as “Black Tuesday”.
The events of September 11, 2001 struck at the very heart of America’s financial information center. What impact will September 11, 2001 have on Information Systems and Technology, specifically on disaster recovery planning and implementation? Will it be revolutionary or evolutionary? What were the human factors encountered that day? Has the role of individuals been accounted for in updated disaster recovery plans? In many ways, September 11, 2001 was the first full-scale implementation of disaster recovery efforts, and it exposed deficiencies in recovery plans. As scholars, we must investigate the root causes and impact of these changes, and their future implications for the discipline as a whole: “So it is natural to ask how the events of 11 September will affect our profession in the months and years ahead” (Hayhoe 2002). Scholars believe these “human” factors present one of the
most unpredictable areas for disaster recovery researchers (Sikich 2003). How will humans react to unfolding events? Sikich also puts forth a definition of human factors (not to be confused with human ‘aspects’) in the context of business continuity (Sikich 2003). Questions that are now relevant include:
• How well do you know your workforce?
• What is the extent of background checks that are part of the screening
process?
• Can someone, either overtly, clandestinely, or unwittingly, be
compromised into creating an exposure that puts the firm at risk?
• How can you implement checks and balances so that critical information is
not subject to compromise?
1.2 Theoretical Framework
Several theoretical frameworks are available for this research. As Georges Anderla observed (in Horton and Lewis 1991), many disciplines, especially economics, land themselves in trouble whenever they attempt to integrate technological innovation into theoretical frameworks.
Everett Rogers has developed the very versatile diffusion theory and has already applied the theory to information communication on September 11, 2001 (Rogers 2003). The events were so shocking that people felt they needed to share the news and their reactions with others. In a survey, 88 of the 127 respondents (69%) communicated the news to others
(Rogers 2003). Those 88 people communicated with 418 others (average of 4.8 people
each). Despite advanced technology, this is a clear example of the reliance on human communication networks.
Control of information has always been dictated by technology. Frederick Ferre stated that technology is, by definition, the practical implementation of intelligence (Ferre 1988).
There is a tendency to forget that sermons once carried news, real estate transactions, and other mundane matters (Eisenstein 1979). The Sunday paper has replaced churchgoing as an information source. Until Gutenberg, the church had censored ideas more than texts. In the big cities, newspapers succeeded in reaching the general population, whose cultural and educational level was low (Martin 1994). The printing press was such a major technological advancement that Sir Francis Bacon said it was one of the three inventions (printing, gunpowder, and the compass) that changed the state of the whole world (Eisenstein 1979).
Christopher Burns (in Horton and Lewis 1991) raises the information control issue in a more recent event: Three Mile Island. What went wrong at Three Mile Island is a classic information-management collapse that raised critical questions about how to control new information environments. As with September 11, 2001, communication lines were unavailable or simply broken.
More specific to this research effort are Hollan’s views on Distributed
Cognition. The theory of distributed cognition seeks to understand the organization of cognitive systems (Hollan, Hutchins and Kirsch 2000). Unlike traditional theories, it reaches beyond what is considered cognitive and beyond the individual to encompass interactions between people and with resources and materials in the environment. In this particular scenario, technology was the material and the environment was September 11, 2001. Cognition is distributed by placing memories, facts, or knowledge on the objects, individuals, and tools in our environment. Distributed cognition is a useful approach for (re)designing social aspects
of cognition by putting emphasis on the individual and his or her environment. Distributed cognition views a system as a set of representations, and models the interchange of information among these representations. These representations can be either in the mental space of the participants or external representations available in the environment. This model establishes a new foundation for human-computer interaction research. In the more traditional views of cognitive factors, the boundaries are those of the individuals. One aim of cognition, or the mental process of knowing, is to create knowledge through perception, reasoning, or intuition. In Hollan’s view of distributed cognition, one expects systems to configure themselves dynamically to align subsystems and accomplish detailed functions
(Hollan et al 2000). Christopher Burns (in Horton and Lewis 1991) noted the importance of team processing in technology. He recognized that the old industrial approach of segregating work and assigning it to specialists does not produce the best result when information work is involved. A carefully engineered collaborative approach is more effective.
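The model described above, a system as a set of internal and external representations exchanging information, can be sketched in code. The following is a toy illustration only, not drawn from the dissertation; all names (the operator, the runbook, the manager) are hypothetical examples chosen to echo the disaster recovery setting.

```python
# Toy sketch of distributed cognition as a set of representations that
# exchange information. Internal representations are mental (held by
# people); external ones are artifacts in the environment.
from dataclasses import dataclass, field


@dataclass
class Representation:
    """One unit of the system: a person's memory or an environmental artifact."""
    name: str
    internal: bool                      # True = mental, False = external artifact
    knowledge: set = field(default_factory=set)


def propagate(source: Representation, target: Representation) -> None:
    """Model one interchange of information between two representations."""
    target.knowledge |= source.knowledge


# Hypothetical participants in a disaster recovery scenario.
operator = Representation("help-desk operator", internal=True,
                          knowledge={"outage status"})
runbook = Representation("printed DR runbook", internal=False,
                         knowledge={"failover steps"})
manager = Representation("technology manager", internal=True)

propagate(runbook, operator)            # operator reads the external artifact
propagate(operator, manager)            # operator briefs the manager

# The cognitive state of the whole system exceeds any one member's.
system_knowledge = operator.knowledge | runbook.knowledge | manager.knowledge
print(sorted(system_knowledge))
```

The point of the sketch is Hollan’s claim restated in miniature: the system-level knowledge emerges from the interchanges, not from any single representation.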
When applied to observing human activity in its natural setting, three types of distributed cognitive processes become apparent (Hollan et al 2000). In order to understand and design effective human-computer interactions, it is critical that these processes be understood.
• Cognitive processes may be distributed across the members of a social group.
They involve trajectories of information (transmission and transformation) and are
reflective of an underlying cognitive architecture. The broader context includes
phenomena that emerge in social interactions as well as the interactions between
people and the structure in their environments. In this research, the application of
the trajectories is the design and execution of disaster recovery information and the
subsequent human interactions.
• Cognitive processes may involve coordination between internal and external
(material or environmental) structures. This is an essential fact of cognition
that people are designed to use. It is these specific undocumented interactions
during September 11, 2001 that require investigation to determine gaps in disaster
recovery plans on that day. “Well-designed work materials become integrated into
the way people think, and control activities, part of the distributed system of
cognitive control” (Hollan et al 2000, page 178).
• Processes may be distributed through time in such a way that the products of
earlier events can transform later events. Culture accumulates solutions to
frequently encountered problems. The meeting of cognition and culture is the
concept that a person’s environment is a reservoir of resources for learning,
problem solving and reasoning. One of the goals of distributed cognition is to
return culture, context, and history to the cognitive view. One aim of this study is
to learn from the disaster recovery failures on September 11, 2001 (learning can also
happen from success, but this study focuses on failures) and to design
better-controlled and more effective plans.
Since the cognitive properties of the entire system are larger than any one individual’s activity, cognitive ethnography must be event-centered (e.g. September 11, 2001); Jones (2005), for example, reports a cognitive ethnography of information practices in experimental life sciences research. When speaking to experts about the lessons of financial information systems on September 11, 2001, one must know the design structures and how they were organized. Hollan believes this forces us to look at the barriers between what is defined as inside and outside, forcing exploration of interface components. A rapport must be established with the participants.
What processes and tasks were people engaged in, and what actions performed during the event (September 11, 2001) were meaningful? Hollan believes this is invariably revealing
and surprising. “As we build richer, more all-encompassing computational environments it becomes more important than ever to understand the ways human agents and their local environments are tightly coupled in the processing loops that result in intelligent action”
(Hollan et al 2000, page 186). After all, we are constantly reorganizing our work environments to optimize performance. Individual work tasks are no longer confined to a desk, but reach into the globally networked world. Distributed cognition is tailored to understanding the interactions among people and technology. The framework requires the observation of human activity, the analysis of the cognitive processes of social groups, the coordination of internal and external structures, and attention to how the products of earlier events can transform the nature of later events.
Distributed cognition is a popular framework for researchers. A recent Google search indicated 249 scholarly articles citing Hollan’s work. Rogers (2004) wrote that the distributed cognition approach has been used primarily by researchers to analyze a variety of cognitive systems, including:
• Airline cockpits: As with supporting financial systems on Wall Street, flying a
modern jet transport is a job that cannot (at least not in current practice) be done by an
individual acting alone. The distribution of access to information is an important
property of systems of the distributed cognition theory. The shared understanding of
the situation is known as an inter-subjective understanding. A cockpit provides an
opportunity to study the interactions of internal and external representational structure
and the distribution of cognitive activity among the members of the crew. The
properties of the larger system emerge from the interactions among the members of
the crew and the contents of those communications. Interpretations are determined in
part by the access to information of the crew. Through an analysis of audio and video
recordings of the behaviors of real airline flight crews performing in a high fidelity
flight simulator, Hutchins and Klausen were able to demonstrate that the expertise in
this system resides not only in the knowledge and skills of the human actors, but in the
organization of the tools in the work environment as well (Hutchins and Klausen,
1996). This research included analysis of cockpit transcripts. The analysis reveals a pattern of cooperation and coordination of actions
among the crew. Ironically, as one pilot remarked, although the cockpit is a poor classroom, a
considerable amount of training takes place there.
Hutchins and Palen also studied distributed cognition in a cockpit environment. Their research, which employed videotapes, looked at how space and gestures supplement the meaning of verbal communications. The gestures acquire their meaning by virtue of being superimposed on the meaningful spatial layout of the control panel. The same gestures produced in the absence of the panel would, of course, be quite meaningless (Hutchins and Palen 1997).
• Call centers: Ackerman and Halverson studied organizational memory in the
framework of distributed cognition in a telephone hotline group. It was noted that
most studies of organization memory have largely focused on the technology
systems designed to replace human and paper-based memory systems (Ackerman
and Halverson, 1998); their study combined direct observation, video, semi-structured interviews, and social network analysis. Telephone hotlines in general are good places to study
memory in an organization, because their operation is so information intensive. In
the telephone hotline study, memories were complexly distributed, interwoven,
and occasionally overlaid, which makes telephone hotlines a good research area
for distributed cognition. It was found that memory is both an artifact that holds
its state and an artifact that is embedded in many organizational and individual
processes.
• Software teams: Software development can be a highly social activity involving
frequent interaction between programmers and with their development tools in the
performance of a task (Flor and Hutchins, 1992). The development and
comprehension of a computer program is a function of how well the system
performs as a whole. Other system-level variables include how well programmers
communicate inside and outside the group and the use of development tools. The
system of system-level properties is very complex, yet, is too difficult to ignore.
The system performs the task, not any individual!
• Control systems: In a study of operations in emergency resource centers, public
displays (e.g. a flip chart) were noted to perform central roles in indicating status
information and facilitating discussions among decision makers (Garbis & Waern,
1999).
• Engineering practice: Rogers studied how networking technology has changed
the working practices of an engineering company (Rogers 1994). Rogers
specifically examined how a close-knit group of engineers attempt to collaborate
when managing a networked system, while at the same time trying to maintain
coordination of their interdependent activities. Through a Distributed Cognition
analysis, Rogers was able to reveal various breakdowns that occurred in the work
activities and the mechanisms by which the group had adapted its working practice
to overcome them.
One of the main outcomes of the distributed cognition approach is the discovery of complex interdependencies between people and artifacts in their work activities (Rogers
2004). In this sense, the distributed cognition approach is difficult to apply, since there is not a set of clear features to look for, nor is there a check-list that can be easily followed when doing the analysis. The distributed cognition framework can provide insights for changing a design to improve user performance, or more generally, a work practice (Rogers 2004).
Distributed cognition is the appropriate theoretical grounding for this study, describing how systems technologists worked together during the disaster. Technical coordination and systemic processes occur by means of shared practices, beliefs, values, and structures of interaction, which are institutionally based. The cognitive processes studied here are distributed across the members of a social group – technology professionals located at Ground
Zero on September 11, 2001. The cognitive processes to be studied involve coordination between internal (corporate communications) and external (news of that day) structures. Did the tasks performed during that horrific day transform the nature of later disaster recovery events?
Distributed cognition provides the theoretical framework that will be most fruitful for this research.
The expected outcome of this research will be a better understanding of the reliance on
“humans” during information technology disasters. Specifically, when disaster recovery plans are compiled and tested, what is the “new” (or modified) role of individuals during a real disaster scenario? Although the research will focus on Wall Street firms directly impacted by the events of September 11, 2001, all information technology professionals
entrusted to build and maintain information systems can use the findings to enhance existing recovery plans. Disaster recovery plans are no longer a “second thought” or “we will get to that later.” As a result of the events of September 11, 2001, information systems must be constructed with functional and tested disaster recovery plans. Usability during a disaster is now a critical component of the Systems Design phase of the Systems Development Life
Cycle (SDLC). Additionally, auditors and regulators will include the Disaster Recovery (DR) design and testing results in reviews.
Below is a “waterfall” depiction of the research problem’s relation to the discipline of Information Studies, showing how the final research questions are a subset of the Information Science discipline:
Information Science: The theoretical discipline concerned with the application of mathematics, systems design, and other information processing concepts; it is an interdisciplinary science involving the efforts and skills of librarians, logicians, engineers, mathematicians and behavioral scientists. The application of information science results in an information system (Borko 1968).
Systems Analysis: Systems analysis is a means of viewing circumstances realistically
and designing practical solutions (Osborne and Nakamura 2000). This includes a set
of guidelines and techniques that assists a systems analyst in stating functional
requirements of a system in logical terms (Yourdon 1979).
Systems Design: Systems design is a set of guidelines and techniques that
assists a systems designer in determining which modules, interconnected in a
way, will best solve a well-stated problem (Yourdon 1979).
Disaster Recovery: Any system that relies on a computer (though not all information systems do) should
include a plan to cope with the loss of that computer. It should be
written to take into account several levels of a disaster and should be
reviewed at regular intervals. Some responses may include: revert to a
manual system, create a special temporary system, maintain a second
backup computer, move operation to another location, or stop
operations (Osborne and Nakamura 2000).
Human Factors: Human Factors is the scientific discipline
concerned with the fundamental understanding of interactions
among humans and other elements of a system, and the
application of appropriate methods and theory to improve
human well-being and overall system performance (Karwowski,
2001).
2 Literature Review
The events of September 11, 2001 represent one of the most significant moments in
American history. The terrorist attacks that occurred that day impacted every area of
American life. Financial markets were closed, and remained skewed for months. Simple everyday tasks, such as commuting to work or going to an airport, changed immediately. Many citizens were psychologically affected and lived with a fear of another attack. Beniger once stated that important transformations of society rarely result from a single discrete event (Beniger
1986). September 11, 2001 may be that rare exception (Pearl Harbor was arguably another). Time will determine if the events of
September 11, 2001 will rise to the level of a true information disaster. Barker (2007) listed what he believed to be the biggest information technology disasters to date (http://resources.zdnet.co.uk/articles/0,1000001991,39290976,00.htm):
1. Faulty Soviet early warning system nearly causes WWIII (1983)
2. The AT&T network collapse (1990)
3. The explosion of the Ariane 5 (1996)
4. Airbus A380 suffers from incompatible software issues (2006)
5. Mars Climate Orbiter metric problem (1998)
6. EDS and the Child Support Agency (2004)
7. The two-digit year-2000 problem (1999/2000)
8. When the laptops exploded (2006)
9. Siemens and the passport system (1999)
10. LA Airport flights grounded (2007)
One discipline clearly impacted, but not yet researched, is information technology. The fallout from the September 11, 2001 investigations indicates a vast array of
information failures, both on a human and a machine level; Horton and Lewis (1991) addressed this topic by bringing together examples of “how information mismanagement led to human misery, political misfortune and business failure.” As business skills change, decision-makers must now seek out, retrieve, reorganize, assimilate, interpret, and utilize data
(Horton 1994a). As Gray and Altmann wrote in 2001, information in the world is useful only if we can find it when we need it. This will be a critical theme of September 11, 2001 – people now think of information as an all-pervasive/universal resource (Horton 1994a).
Single events can have a significant impact on this growing field. Horton and Lewis (1991) stated that in many information cases the senders of specific messages were either uninformed or misinformed, or, if they were informed, were not able to fit the information into preconceived stereotypes, value systems, belief systems or attitudes. Several events that depict information disasters, according to Horton and Lewis, are war (Hitler’s decision to attack the Soviet Union in 1941, Civil War intelligence and signals in 1864), tragedy (Three
Mile Island, and The Tacoma Bridge Disaster) and business (the stock market crash of
October 1987).
2.1 Information Systems Technology – The Beginning
In relative terms, information science does not have the deep and rich history of other disciplines, such as mathematics or astronomy. Shera wrote about the origins of information science; the information society concept itself dates back to Fritz Machlup in 1962, and there is an ongoing scholarly debate on whether we are an “information society” (Beniger 1986). Like librarianship, information science is still largely an agglomeration of knowledge and technologies drawn from other areas (Bennet 1988). During the Second
World War, an outburst of scientific and technical innovations occurred, including a new reliance on information. In the same general period, Vannevar Bush (Bush 1945) published his landmark article describing the theoretical Memex machine, which he argued would transform thought
and human creative activity. When the Russians astonished the world with Sputnik I, even the general public became concerned (Bennet 1988). Technology arose in a world of bizarre contradiction: starving people vs. bigger bombs (Winner 1986). Early in February 1958,
Allen Kent and his colleagues organized a national meeting at Western Reserve University in
Cleveland to discuss a proposal to establish a national center for scientific and technical information. Many U.S. scientists suggested that one of the reasons for the Soviets’ taking the lead in the space race was the existence of their Institute of Scientific Information (Kent in
Hahn 1998).
Scientists themselves became national heroes as the nation’s strength came to be determined equally by military might and by scientific capability (Bowles 1998). Information systems for science and technology had a privileged existence because of industrial and military needs and government policies. In the beginning, much of the pioneering was done by individuals trained in chemistry (Buckland 1998). Eventually, information science grew and evolved into a practical discipline. The demarcation between the old and new ways of searching for information was the change from paper indexes and card files to online databases
(Hahn 1996). The three groups of pioneers, which overlap and are not mutually exclusive categories, were:
1. Programmers, system designers and researchers
2. Large scale commercial and government operations
3. Early PC users; their perspective provides yet another angle from which to judge
successes and failures and to measure the rate of development.
2.2 The Growth of Information Systems Technology
Americans have enthusiastically embraced each new information technology that has come along (Chandler and Cortada 2000). The development of online systems and services is not just a story of tapes, terminals, telephones, search engines, algorithms and downtime; it is also a story of people. Acquisition of information technology is associated with social privilege
(Chandler and Cortada 2000).
The leaders of the online age can be divided into three groups: the developers, the managers, and the users (Hahn 1998). The developers were diverse in their geographic and disciplinary backgrounds and their underlying goals. They were aggressive, competitive and imaginative in creating opportunities to exploit the latest hardware and software of the initial period. The second group, managers and trainers, demonstrated the problems of online systems. With zeal, perseverance, charm and even chicanery, they recruited and trained the first users. The users were the third group, playing a critical role in evaluating new systems, testing documentation, and assessing training programs (Hahn 1998).
Measurements can be used to demonstrate the growth and change of information systems. Moore’s Law is a simple example of the volatility of the discipline: it states that processor power doubles every 18 months (Otto, Cook and Chung 2001). A 2004 study at Berkeley states that annually we generate about 4.5 exabytes of magnetically stored information, the equivalent of roughly 34,000 Libraries of Congress (Scholl 2004). The changes are fast and significant. A historical comparison can be made to another technology
– railroads. In the middle of the 19th century, there were 7 distinct track gauges for railroads operating in North America. This infrastructural impediment to the flow of goods had demonstrable effects on economic development, to say nothing of the additional costs of
supporting such a rail network. It is important to note that nobody stopped the trains to wait for the tracks to become the same width (Weibel 1997). The analogy is clear. Current attempts to standardize information technology have not always been successful. Other information age issues include the misinterpretation of market data, complex systems going awry, and making difficult decisions with the information “at hand” (Burns in Horton and
Lewis 1991).
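The scale of the growth cited above follows directly from the doubling arithmetic. The sketch below is illustrative only (the function name and the ten-year window are hypothetical choices, not from the source); it simply computes the growth multiple implied by an 18-month doubling period.

```python
# Illustrative arithmetic for Moore's-law growth: processor power
# doubling every 18 months, as cited in the text above.

def moores_law_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Growth multiple after `years`, with one doubling every 1.5 years."""
    return 2 ** (years / doubling_period_years)

# Over a single decade, an 18-month doubling period compounds to
# roughly a hundredfold increase: 2 ** (10 / 1.5).
print(round(moores_law_factor(10), 1))
```

This compounding, a factor of about one hundred per decade, is why the discipline's measurements (storage volumes, processor power) date so quickly.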
Information technology is the present day equivalent of electricity in the industrial era.
The Internet is the fabric of our lives (Castells 2003). What made it possible for the Internet to embrace the world at large was the development of the World Wide Web, an information-sharing application developed in 1990 by an English programmer named Tim Berners-Lee.
The Internet was purposely designed as a technology of free communication (Castells 2003).
The origins of the Internet are found in the ARPANET in 1969. The diffusion of the Internet provides a platform for a vast array of changes in information systems and technology, including growth in the number of Internet users, new uses for software, and new ways to communicate and shop.
The Internet continues to evolve. Web portals are seen as positive potential frameworks for achieving order out of chaos (Lakos 2004). Key principles govern any portal design: simplicity, dependability, quantifiable value, personalization, and systematic management. Structured development concepts now include usability, self-navigation, self- sufficiency, personalization, and identifying content that is vital to the users. From many points of view, hypertext, and hypermedia have been a success (Jones and Willett 1997).
People pass around URLs as a way of sharing experiences (Brown et al, 2002).
WWW URLs are a unique interface, and security is a major, but not a singular, concern. The direct manipulation interface with its point-and-click modus operandi has made non-sequential reading of an information resource easy and productive (Jones and Willett
1997). Copyright and patent rights present issues (Baeza-Yates 1999) and are growing global concerns. The web then leads to other concerns: distributed data, high percentage of volatile data, large volume, unstructured and redundant data, quality of data and heterogeneous data.
Also, problems continue to arise from internal politics and the “war for screen real estate”
(Tennant and Michalak 2004). Two types of changes lead to web page and web site mortality: content and structure (Koehler 1999). Almost without exception, over the period of a year, all web documents change (Koehler 1999). Web sites demonstrate a great deal of volatility (and variability). Networked electronic information is often transitory, without quality control or stability (Velluci 1998). Near term challenges for the Internet include, but are not limited to security, ownership and structure (Rowley and Farrow 2000).
There is sometimes a cultural preference for paper over on-screen information. Why do air traffic controllers prefer paper to electronics? Flexibility in spatial layout, ease of manipulation, easy and direct marking, and information at a glance are a few noted benefits.
When observing police officers trying to use laptops, it was discovered that laptop design, shape, input methods, and software did not support interweaving of the police officer’s computer activity with a conversation with the crime victims (Sellen and Harper 2002).
2.3 Changes to Disaster Planning Caused by the Cold War
The 1950s and 1960s saw an emergence of policies and plans dealing with the threat of a nuclear attack resulting from the Cold War between the United States and the Soviet
Union (Fagan et al 2005). The first federal disaster planning administration, the Federal Civil
Defense Administration (FCDA), was created by President Harry Truman in 1949 after the
Soviet Union detonated its first atomic weapon. Congress remedied the fact that there was no instrument in place to offer direct federal aid to state and local governments during an emergency by passing the Federal Civil Defense Act of 1950. The FCDA was later made an independent agency (Fagan et al 2005). It then took over the responsibilities of what was once the National Security Resources Board (NSRB), which was created by the National
Security Act of 1947. This board was created to advise the President on coordinated mobilization of the United States during times of war.
There was tension within all levels of government at this time about the difference between civil defense activities during times of war and natural disaster relief efforts and what types of aid and activities were to be used for each. Also during this time, civil defense planners were creating mass evacuation policies for assumed targets of the USSR, on the belief that major cities and installations would become prime targets for nuclear missiles
(Fagan et al 2005). The Federal Civil Defense Act was modified in 1958 to allow the government to allocate money for civil emergency preparedness. During the 1960s, the Office of Emergency Planning (OEP), later renamed the Office of Emergency
Preparedness, became the lead organization for the coordination of all civilian emergency preparedness activities (Fagan et al 2005). These activities included disaster relief, post-attack analysis, financial stabilization, resource deployment, and continuity of government functions.
During the Cold War era,20 disasters became more important to practitioners and scholars (McEntire 2004). The threat of “mutually assured destruction” reached its pinnacle during the Cuban Missile Crisis in 1962. Civil defense grew during this time to organize air-raid precautions, shelters and alarms for everyday citizens. Civil defense during the Cold War (1948-1989) included the development of plans to relocate large civilian populations in the event of a threatened nuclear attack (Alexander 2002).
One element of civil defense in the nuclear era has been a strategy to preserve a functional government by protecting key political and military leaders. Underground bunkers were set up with dedicated communications systems and food stockpiles (Jackson 1994).
While civil defense was being ingrained in the institutional fiber of the American government, military leaders wondered how the populace would react after a nuclear exchange. Because it would be impossible and unethical to run a test on humans, the government looked to scholars for assistance. Millions of dollars were poured into the social sciences (particularly Sociology): academic institutions (such as the well-known Disaster
Research Center) were created to answer the questions (McEntire 2004). Although people’s responses to disasters had been studied for years, scholars were now able to conclude that victims generally exhibited rational behavior in natural disasters (McEntire 2004).
20 Russett (1993) defined the Cold War era as a period of escalated tension and hostilities between the United States and the Soviet Union in the areas of politics, military affairs, ideology, etc.
It is also critical to note the role and responsibilities that the military can take during a national crisis. Military involvement in direct law enforcement activities is normally prohibited by the Posse Comitatus Act (Brake 2003). The act prohibits the use of the military in activities such as arrest; seizure of evidence; search of persons or buildings; investigation of a crime; interviewing witnesses; pursuit of an escaped prisoner; search of an area for a suspect; and other like activities. The Posse Comitatus Act, however, does not stop the military from providing logistical support, technical advice, facilities, training, and other forms of assistance to civilian law enforcement agencies. Courts have held that providing such assistance falls in the “passive” category and does not violate the Posse Comitatus Act. Technical support activities such as explosive ordnance disposal, providing specialized equipment, and expert advice on weapons of mass destruction (WMD) devices also do not violate the act (Brake 2003).
The military has now started dispensing its crisis management expertise to civilian groups through the use of simulators. The University of Central Florida, in conjunction with the U.S. Army and the Orange County (Florida) Fire Rescue Department, has developed and fielded a series of simulations for conducting disaster exercises and training public safety personnel to respond to disasters (Kincaid et al 2003). Researchers are also gathering persuasive evidence that training effectiveness is substantially improved by the use of simulation as compared with traditional field exercises. The training is in two distinct areas: emergency management incident command and emergency medical care performed in the field. The simulated scenarios include treatment of battlefield casualties and crisis management. It is interesting to note that police officers with military training experience the same levels of stress as police officers without military training (Patterson 2002).
A goal of Cold War civil defense was to enable the greatest number of Americans to survive a Soviet nuclear attack should one occur, with a clear focus on the people as a strategic national resource (Dory 2003). Through outreach and educational events, the government provided the public with a basic understanding of the nature of the Soviet threat, the nation’s vulnerability to nuclear attack, and potential consequences if one were to occur.
The federal government developed some contingencies defining the roles and activities that agencies would perform under grave scenarios (e.g., Soviet sneak attack on Washington, and nuclear war) (Carafano 2006).
As cuts in military spending started at the end of World War II, military planners (along with some civilian supporters) proposed a new understanding of military forces. Rather than rely upon rapid mobilizations following the outbreak of war, these planners argued that it was necessary to permanently prepare for unannounced attacks: what Sherry called an “ideology of preparedness” (Sherry 1977). Technological developments of the atomic bomb and the long-range bomber rendered obsolete the traditional reliance on oceans for a defense in geographic isolation. Instead, these planners believed that a new era of “total war” had begun in which “the battle was not confined to the front lines but extended to the home front as well.” E.B. White (White 1949) contemplated nuclear attacks at the beginning of the Cold War, and his words seem eerily prescient in relation to September 11, 2001. He wrote:
The subtlest change in New York… is something people don't speak much about but that is in everyone's mind. The city, for the first time in its long history, is destructible. A single flight of planes no bigger than a wedge of geese can quickly end this island fantasy, burn the towers, crumble the bridges, turn the underground passages into lethal chambers, cremate the millions. The intimation of mortality is part of New York now: in the sound of jets overhead, the black headlines of the latest edition. All dwellers in cities must live with the stubborn fact of annihilation; in New York the fact is somewhat more concentrated because of the concentration of the city itself, and because, of all targets, New York has a certain clear priority. In the mind of whatever perverted dreamer who might loose the lightning, New York must hold a steady, irresistible charm.
Records management also underwent a transition and growth during this period. Executive Order 9784 in 1946 required all executive branch agencies to implement records management programs and expanded the management authority of the National Archives (Cox 2000). The mandate was better defined in the 1950 Federal Records Act. Records management developed into records creation, maintenance, and disposition. The act also required each agency to establish an ongoing program for records management and to work in concert with the National Archives. During the 1950s, vital records programs originated as part of the “Continuity of Government” program. The initial focus of vital records programs was the continuation of Federal agency operations under national emergency conditions, including a possible enemy nuclear attack upon the United States, and the reconstitution of normal agency activities at the emergency’s conclusion. The Bureau of the Budget established requirements21 for vital operating records protection programs. Executive Order 10346 in 1952 made each Federal department and agency responsible for carrying out its essential functions in an emergency. Subsequent presidents have issued various executive orders that have modified Federal continuity of government and emergency preparedness responsibilities. The vital records program has increasingly been dedicated to meeting the challenges Federal agencies encounter in continuing their operations and protecting their records in the face of natural disasters and terrorism (National Archives and Records Administration 1996).
During this period, the military spawned a new industry for the private sector. Companies were now bidding on defense contracts for the United States and NATO military forces. Adding to the complexity were new multi-national corporations that campaigned to strengthen international economies (Latham22 in Schain 2001). Much of the debate at this time centered on the Marshall Plan and the financial aid provided to Western Europe at the end of World War II. Some in the business community expressed concern with the level of government involvement in global economics. Others, such as General Electric president Philip D. Reed, believed the plan would open more markets to American companies (McGlade23 in Schain 2001). Citing fears of industrial espionage, major firms such as DuPont and General Electric began to restrict, and in some cases prohibit, United States Technical Assistance & Productivity Program24 teams from visiting (McGlade in Schain 2001).
21 Bulletins No. 51-14, May 22, 1951, and No. 52-5, September 6, 1951.
22 Chapter 3: Cooperation and Community in Europe: What the Marshall Plan Proposed, NATO Disposed.
Information resources are now embedded in new services in such a way as to appear indistinguishable from the product itself25 (Horton 1994a). Disaster recovery requirements for these services also created new opportunities for the private sector. One such company is
Iron Mountain.26
Iron Mountain was founded in 1951 in Livingston, NY, 125 miles north of New York City. In 1936, Herman Knaust purchased a depleted iron ore mine and 100 acres of land for $9,000 because he needed more space to grow his product. But by 1950, the mushroom market shifted and Mr. Knaust was looking for alternative uses for his mine, which he had named Iron Mountain. After World War II, Mr. Knaust sponsored the relocation to the United States of many Jewish immigrants who had lost their identity because their personal records had been destroyed. During that same period, the world was embroiled in cold war apprehension about atomic security. Both factors impressed upon Mr. Knaust the need to protect information from the havoc of wars or lesser disasters. In 1951, Iron Mountain Atomic Storage, Inc. was founded. Mr. Knaust opened the first "vaults" inside Iron Mountain and put a sales office in the Empire State Building. Having a knack for publicity, he persuaded luminaries such as General Douglas MacArthur to visit Iron Mountain. The attendant publicity was the extent of the new venture's marketing program. Iron Mountain's first customer was East River Savings Bank, who brought microfilm copies of deposit records and duplicate signature cards in armored cars for storage in the new mountain facility. Other corporate customers soon followed as New York-based companies began to see the need to protect their vital records.
23 Chapter 10: A Single Path for European Recovery? American Business Debates and Conflicts over the Marshall Plan.
24 USTA&P was started in 1948 as an “exchange of persons in industry” program.
25 Horton cited classic examples: the Merrill Lynch Cash Management Account, Federal Express Zap Mail and MCI Mail (Horton 1994).
26 http://www.ironmountain.com/company/history.asp
Today, the focus of many governments has shifted to terrorism. Modern disasters are complex enough to require the utmost flexibility in their management. From the 1970s onwards, disaster research stressed non-military models of civil protection, such as the incident command system (ICS). Civil protection later emerged as demand increased under the duress of more serious, civilian disasters such as earthquakes, hurricanes, floods, and transportation crashes (Blanchard 1984). The ICS differs from the traditional command-and-control model derived from the direction of troops during combat, as it relies on information sharing and collaboration among task forces (Irwin 1989). Decision making is a major problem in disasters. Other areas of concern during disasters include bureaucratic politics and procedures, groupthink and misperception (McEntire 2004).
2.4 Changes to Disaster Planning in Relation to Other “Disasters”
Although September 11, 2001 has spurred disaster recovery planning, disaster recovery and continuity of business plans can be implemented for a variety of reasons. The reasons can be natural (flooding, hurricane, etc.) or human (terrorism, war, blackout, etc.). In both cases, uncertainty is at the core of the problem (Hewitt in Quarantelli 1998).
Russell Dynes developed a disaster typology to describe tasks performed pre- and post-event. The purpose of the typology here is to provide a framework for describing extraordinary efforts and judgments during a disaster. The four types of tasks are (Dynes in Quarantelli 1998):
Table 1 – Dynes’ Four Types of Tasks

                                                  Tasks
  Organizational Structure      Routine                    Non-routine
  Same as pre-disaster          Type I – Established       Type III – Extending
  New                           Type II – Expanding        Type IV – Emergent
• Type I – Organizations carry on the same tasks with the same structure, but often expand their conventional efforts by extending the workday and double shifting.
• Type II – Organizations expand their structures to carry out anticipated disaster tasks. These organizations anticipate the involvement and use of volunteers to cope with the extraordinary effort.
• Type III – Organizations with no anticipated emergency responsibility that may become involved because they possess manpower and other resources.
• Type IV – Organizations that do not exist before a disaster. They become involved with new tasks and develop a structure to deal with the assigned work.
Tucson, Arizona, experienced two large-scale floods in October 1983 and January 1993. McHugh’s research into the human response to the 1983 event found that the community's emergency co-ordination center was ineffective and isolated from the public safety response network (McHugh 1995). Local government mitigated these deficiencies before the January 1993 flood in two ways: first, the community's emergency management agency merged into the Sheriff's Department; second, through consensus building and training, the community institutionalized an effective disaster response organizational
structure (Type II). During the Mexico City earthquake in 1985, the federal government’s only plan for disasters was the assignment of responsibility to the Mexican military (Kreps in Quarantelli 1998); there had been no formal planning for disasters of any kind. Much of the Mexico City response was tied to specific locations and not centrally controlled (Type IV).
During the Chernobyl reactor accident in 1986, initial attempts to monitor and control radioactive contamination were largely improvised (Type IV) (Kreps in Quarantelli 1998).
Simply put, being on-site or responding to the accident was likely to result in acute radiation sickness.
Regardless of all we know, we acknowledge the impossibility of predicting future events. The private sector also recognizes the problems of predicting the future and planning for inevitable disasters. Consequently, companies should not use time and resources attempting to plan for patterns that are simply unpredictable (Day, et al 2004). Rather, it is critical that companies pay attention to contingency planning (such as crisis management plans). Crisis management plans must be robust enough to handle all forms of the unexpected. As events arise that give us insight into the unforeseen, it is essential that organizations reexamine their crisis management plans to see if they were designed effectively enough to handle the unique features of our evolving environment (Day, et al 2004). Decision-making under pressure requires certain capabilities, and the factors that shape decisions under pressure are quite different from those in non-disaster circumstances (Childs 2004). When researching firefighters specifically, “freelancing” decisions during a disaster present other problems, including endangering lives. Issues encountered during a disaster, which create dysfunction under intensely stressful “battlefield” conditions, can be mitigated by repeated practice under more realistic conditions (McHugh 1995). This practice enhances performance and adaptability to varied conditions.
The first systematic efforts by the United States federal government to give some assistance after a disaster came during the Dust Bowl years of the late 1920s and early 1930s, after farmlands were devastated. The onslaught of WWII and Germany’s development of missiles capable of traveling several hundred miles were the catalyst for the federal government to develop a federal civil defense system (Fagan et al 2005). In 1979, the Federal
Emergency Management Agency (FEMA) was created by President Jimmy Carter to house civil defense emergency preparedness functions together in one organization. For the next two decades, FEMA would be the center for state and local emergency preparedness.
In the 1980s, an idea known as “Comprehensive Emergency Management” (CEM) developed within FEMA’s civilian programs. CEM refers to the responsibility for managing responses to all types of disasters and emergencies through the coordination of multiple agencies or entities. It is in this process that many feel the government failed during Hurricane
Katrina. Before September 11, 2001, there was no comprehensive federal emergency response plan available that integrated all the federal agencies and their respective roles. To remedy this, the Department of Homeland Security created the National Incident
Management System (NIMS), which works as a guide for the federal government, as well as the state and local governments (Fagan et al 2005). NIMS is an emergency response system aimed at providing flexibility and standardization throughout the life cycle of an incident. The main goal of NIMS is to provide effective and efficient coordination among the various levels of government during an emergency. NIMS is designed to function regardless of the size or difficulty of the incident; it uses a standard language to unify the response effort.
Hurricane Katrina provides tremendous lessons in disaster recovery, as the federal government has been criticized for its response. Katrina once again showed the reliance on technology and communications. The aftermath of Katrina has reinforced the role of
communication networks and information management in providing effective response to a large-scale disaster (Banipal 2006). Breakdown of phone circuits, flooding of public service transmission lines and disruption of electricity contributed to the failure of communication systems. Overall, wireless voice and data networks had faster recovery times and performed better than the landline networks. The absence of an inter-agency information system contributed to delayed response. Banipal (2006) specifically targets the design aspect in lessons learned from Katrina. Officials will need to refocus on the design of networks and information management systems so as to improve inter-agency communication, speed up recovery efforts and limit loss in business value. It is imperative that organizations involved in the disaster recovery process have all the information they need – quickly and accurately. Quick response to disaster has the potential to reduce total loss significantly.
In the specific case of records management, many problems were identified following Hurricane Katrina (Ritzenthaler 2006). During the storm, records were exposed to sewage, petrochemicals and coroner lab contaminants. Building issues included power, access and security. The records recovery followed these steps: vacuum freeze-dry the item, sterilize the item, clean it and then reformat the record. For example, the Orleans Parish District Attorney’s records contained 785 cubic feet of archives and 36 computers (Ritzenthaler 2006). The critical lesson learned from Katrina from the records management viewpoint was to “know your records from several perspectives: vital, permanent, media, locations.”
Quarantelli (2005) wrote of the “catastrophe consequences” that can be learned from
Hurricane Katrina. In Katrina, there was across-the-board and almost total disruption of community functions. Most of the community infrastructure was heavily impacted. Local officials were unable to undertake their usual work role, and this often extended into the recovery period. In catastrophic situations local personnel are often unable for some time,
both immediately after impact and into the recovery period, to carry out their formal and organizational work roles. This is because some local workers either were unable to communicate with or be contacted by their usual clients or customers and/or were unable to provide whatever information, knowledge or skills, etc. they usually can provide (Quarantelli 2005). Help from nearby communities could not be provided. (In many catastrophes, not only are all or most of the residents in a particular community affected, but often those in nearby localities are also impacted.) Most, if not all, places of work, recreation, worship and education such as schools were totally shut down, and the lifeline infrastructure was so badly disrupted that there were extensive shortages of electricity, water, mail and phone services as well as other means of communication and transportation. One of the more important Katrina consequences was the media activity. With Katrina, there was far more diffusion of rumors than usually occurs in disasters (Quarantelli 2005). The media were not always accurate in reports of looting in the post-disaster time period (Barsky et al 2006). While looting did occur, which is atypical for disasters, the anti-social behavior was widely depicted as out of control. The question of “who is in charge?” was reiterated over and over again, depicting the command and control model as inept (Quarantelli 2005).
One example of a positive post-Katrina lesson is the New Orleans Veterans Administration Medical Center. With a new Computerized Patient Record System, all patient records, prescriptions, and laboratory and radiology results for every New Orleans VA patient are now available at any VA medical center and to any VA physician nationwide (Anonymous 2005).
A study into the role of social capital in the post earthquake reconstruction programs in two cases (Kobe, Japan and Gujarat, India) demonstrated the reliance upon people during a disaster. Social capital refers to the trust, social norms, and networks which affect social and
economic activities. It is not a new idea that trust and networks help reduce transaction costs and make things easier. The Kobe case study shows that a community with social capital and a tradition of community activities can pro-actively participate in the reconstruction program, and thereby make a successful and speedy recovery (Nakagawa and Shaw 2004).
2.5 Information Systems and Technology – Theories and Methodologies
The growth of Information Technology is remarkable when one considers the age of the discipline. Early approaches emphasized the “waterfall”27 approach. One flaw was that critical requirements often emerge during system development and cannot be anticipated.
Brooks concluded that software designers should plan to throw one version of the software away (John Carroll in Baecker, Ronald, et al. 1995). That lesson continues, and design is now seen as opportunistic, concrete, and necessarily iterative.
Austrian biologist Ludwig von Bertalanffy defined a system as an entity which maintains its existence through the mutual interaction of its parts (Bertalanffy 1968). The environment is the part of the world that lies outside the information system itself but interacts with it. In order to understand the relationship among other systems, inputs, outputs and processes, one needs to understand the environment in which all of this occurs. The environment represents everything that is important to understanding the functioning of the system, but is not part of the system. It includes competition, people, technology, capital, raw materials, data, regulation and opportunities. Prescriptive information system methodologies are unlikely to cope well with strategic uncertainty, user communication or staff development (Middleton 1999). Middleton’s recommendations are to
27 The waterfall approach emphasizes feedback loops between the following development phases: system feasibility, requirements, design, coding, integration, implementation and operations/maintenance (Boehm 1988).
focus more on soft organizational issues and to use approaches tailored to each project. While each technology project progresses through the System Development Life Cycle (SDLC), each project needs to ensure that the disaster recovery designed for the developing application is in alignment with the organization’s needs.
From a pragmatic point of view, the traditional System Development Life Cycle is one of the most critical methodologies in information technology. Disaster recovery depends on the SDLC to ensure that disaster recovery planning is integrated throughout the technology development process: the requirements for the system’s recovery are defined in the analysis phase, the system is designed to provide service during a disaster within the specified timeframes, and testing the recovery capabilities is part of the creation of the project, thus ensuring continued use during a disaster.
Over the years, the basic SDLC has been modified for newer technologies, such as object-oriented design, the Unified Modeling Language (UML) and prototyping, but the basic construct remains in place. The steps in modern systems analysis are: problem definition; data collection and analysis; analysis of alternatives; feasibility determination; systems proposal; system design; pilot study; implementation; system review and evaluation. Systems analysis is a means of viewing circumstances realistically and designing practical solutions (Osborne and Nakamura 2000). However, there is no guarantee that a solution will be found.
The drawbacks of the “waterfall” approach have been well documented: managing ever-shifting requirements, poor relationships with the users and the emergence of serious problems late in a project (Middleton 1999). Poor quality is largely attributed to design problems (Cole 1981), which can be avoided by paying attention to quality problems during design, understanding customer requirements, and designing modularized objects for reuse. Participation of users, vendors, and developers in the core design and development process
promotes mutual understanding of issues and constraints to be addressed to improve quality.
Project management problems can be quickly summarized as (Middleton 1999):
1) Users did not know what they wanted.
2) Users did not know the possibilities of the technology.
3) Users’ perceptions changed while the system was being developed.
4) The developers did not understand the intricacies of the users’ work.
5) There were constant changes in the external environment that were not anticipated.
In recognition of these issues, information system professionals enhanced the SDLC to include the use of:
• Flowcharts: A diagram that shows the operations performed in an information processing system and the sequence in which the operations are performed. Flowchart symbols are used to represent the operations and the sequence of operations (IBM 1969).
• Data dictionaries: A data dictionary is a set of metadata28 that contains definitions and representations of data elements (Yourdon 1989).
• Decision tables: A decision table describes the conditions that must be satisfied in order to carry out the actions it specifies. With every decision table a set of decision rules, called a decision algorithm, can be associated. It is shown that every decision algorithm reveals some well-known probabilistic properties; in particular, it satisfies the total probability theorem and Bayes’ theorem. These properties give a new method of drawing conclusions from data, without referring to prior and posterior probabilities (Pawlak 2000).
• ER diagrams: Entity-relationship modeling is a method used to present a system and its requirements in a top-down approach. This approach is commonly used in database design. The diagrams created using this method are called ER diagrams (Chen 1976).
• UML: The Unified Modeling Language (UML) is a family of graphical notations, backed by a single meta-model, that help in describing and designing software systems, particularly software systems built using the object-oriented (OO) style (Fowler 2004).
• Gantt chart: This chart displays the time span of each task as indicated by the length of a line on an adjacent calendar (Carter 1987).
• DFD: A data flow diagram (DFD) is a graphical representation of the “flow” of data through an information system. A data flow diagram can also be used for the visualization of data processing (structured design). It is common practice for a designer to draw a context-level DFD first, which shows the interaction between the system and outside entities. This context-level DFD is then “exploded” to show more detail of the system being modeled (Yourdon 1979).
28 Hurley’s definition will be used as the working definition: metadata is data that in some manner describes the content of the object but is separate from the content (Dentinger 1998).
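The decision-table idea described above lends itself to a short illustration. The following is a minimal sketch, not drawn from the dissertation's sources; the conditions, actions and default rule are hypothetical disaster-recovery choices, chosen only to show how a set of decision rules maps condition values to an action.

```python
# A minimal decision-table sketch: each rule pairs a set of required
# condition values with an action, and the first rule whose conditions
# all match the observed facts fires. All names here are hypothetical.

RULES = [
    # ({condition: required_value, ...}, action)
    ({"site_down": True,  "backup_ok": True},  "fail over to backup site"),
    ({"site_down": True,  "backup_ok": False}, "restore from off-site tapes"),
    ({"site_down": False},                     "continue normal operations"),
]

def decide(facts):
    """Return the action of the first rule whose conditions all hold."""
    for conditions, action in RULES:
        if all(facts.get(key) == value for key, value in conditions.items()):
            return action
    return "escalate to operations manager"  # default rule

print(decide({"site_down": True, "backup_ok": True}))
# -> fail over to backup site
```

Laid out on paper, the same rules form the familiar grid of condition rows and action columns; the code simply makes the table executable.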
CASE tools are being used by larger firms to emphasize the prototyping and code generation facilities and to build completed systems. Smaller firms are primarily using the tools for analysis and design and to share development work across teams. Support for data flow diagrams and the data dictionary was revealed as a key factor for improving productivity (Post, Kagan, Leim 1998).
With the birth of object oriented programming, information technology was introduced to another subtle change to the SDLC methodology. It is important to understand the ongoing changes to information technology. Object Oriented (OO) analysis and design is not as mature as other structured techniques. In general, the OO style is to use several little objects29 with many small methods.30 The technique of keeping data with objects and, if necessary, providing techniques for making it available is called encapsulation, and has been part of OO since its inception. Another core concept of OOP is polymorphism: the idea that a super-class defines a generic behavior that is refined by the specific classes that inherit from it (Osborne and Nakamura 2000). This style is very confusing to people used to long procedures; indeed, this change is the heart of the paradigm shift of object orientation. The Unified Modeling Language (UML) can be used to support object oriented project development initiatives. It is important to note that the UML is a modeling language, not a methodology; the UML has no notion of process, which is an important part of a methodology. Models based on objects provide a different perspective since they are structured around real-world objects. The benefits of object oriented analysis and design are reusability, reliability, seamless integration with a graphical user interface (GUI), and speedier design. The suggested framework for object oriented analysis and design is (Osborne and Nakamura 2000):
• Prototyping
• Diagramming tools
• UML
Designing the system includes:
• Functional Specification
[29] There is much debate over the definition of an object. In OOP, an object is a finite set of components (Xing 2003).
• Determination of alternatives
• Conceptual design (inputs, outputs, processes, files)
• System integration
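The encapsulation and polymorphism concepts described above can be sketched in a brief illustrative example. The class names and fee logic here are hypothetical, invented for illustration rather than drawn from any system discussed in this study:

```python
class Account:
    """A super-class defines generic behavior (the basis of polymorphism)."""

    def __init__(self, balance):
        self._balance = balance  # data kept with the object (encapsulation)

    def deposit(self, amount):
        self._balance += amount

    def balance(self):
        # Internal data is exposed only through a method, not accessed directly.
        return self._balance

    def monthly_fee(self):
        return 10.0  # generic behavior defined by the super-class


class PremiumAccount(Account):
    def monthly_fee(self):
        return 0.0  # a subclass refines the generic behavior


def charge(account):
    # Callers program against the super-class interface; the refined
    # behavior is selected at run time (polymorphism).
    account.deposit(-account.monthly_fee())


a, p = Account(100.0), PremiumAccount(100.0)
charge(a)
charge(p)
print(a.balance(), p.balance())  # 90.0 100.0
```

The point of the sketch is that `charge` never needs to know which kind of account it was given, which is exactly the "generic behavior refined by subclasses" idea attributed to Osborne and Nakamura above.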
An example of a highly successful conceptual model based on an object is the spreadsheet (Winograd 1996). The first spreadsheet, VisiCalc, was designed by Dan Bricklin. The main reasons the spreadsheet became so successful are that Bricklin understood what kind of tool would be useful to people in the financial world, such as accountants, and he knew how to design it so that those people would find it usable (Preece 2002).
Methodologies developed over the last half century are further examples of change and growth in the field of Information Science. Although not born of disasters or singular events, they represent the evolution of the field. The SDLC may be the most widely known, but it is not the only contribution.
A mental model is one's way of looking at the world, a framework for the cognitive processes of our mind. In other words, it determines how we think and act. Much of the work involving mental models comes from Chris Argyris and his colleagues at Harvard University.
The object of activity theory[31] is to understand the unity of consciousness and activity. The concern is that activity theory is hard to learn, and, because its actual benefits have not been realized in specific empirical studies, the time spent learning it would be of dubious benefit (Nardi 1995). The GOMS methodology involves Goals, simple Operations, Methods of accomplishing a goal, and a Selection rule for choosing among alternatives. Cognitive modeling is the application of cognitive theory to applied problems (Gray and Altmann 2001). Models vary in their concern with generality versus realism.
[30] A method is the body of a programming procedure for an object (Fowler 2004).
[31] Vygotskian activity theory. In this approach the main feature of the psyche is the active position of human beings toward the world in which they live; humans are continually changing objects and creating artifacts, or tools (Verenikina and Gould 1998).
Systems thinking is a set of tools, a unique perspective on reality, and a specific vocabulary. It dates back to the 1940s and 1950s, when thinkers such as Wiener, von Bertalanffy, Ashby and von Foerster founded the domain through a series of interdisciplinary meetings (Heylighen, Joslyn, and Turchin 1999). Systems theory, or systems science, argues that however complex or diverse the world, we will always find different types of organization in it, and that such organization can be described by concepts and principles which are independent of the specific domain at which we are looking. The steps in systems thinking are the following: specify a problem or issue, construct a hypothesis, test the hypothesis, and implement changes (including feedback loops). Systems thinking skills require (Richmond 2000):
• Dynamic thinking: framing a problem in terms of behavior over time
• System-as-cause thinking: placing responsibility for a behavior on the internal actors who manage the policies and plumbing of a system
• Forest thinking: believing that, to know something, you must understand the context of relationships
• Operational thinking: concentrating on causality and understanding how a behavior is actually generated
• Closed-loop thinking: viewing causality as an ongoing process, not a one-time event, with the effect feeding back to influence the causes, and the causes affecting each other
• Quantitative thinking: accepting that you can always quantify, but you can't always measure
• Scientific thinking: recognizing that all models are working hypotheses that always have limited applicability
The benefits of systems thinking include:
• More effective problem solving
• More effective leadership
• More effective communications
• More effective planning
• More effective organizational development
• Avoiding Founder's Syndrome
Founder's Syndrome occurs when an organization operates primarily according to the personality of one of the members of the organization (usually the founder), rather than according to the mission (purpose) of the organization. When first starting their organizations, founders often have to do whatever it takes to get the organization off the ground, including making seat-of-the-pants decisions in order to deal with frequent crises that suddenly arise in the workplace. As a result, founders often struggle to see the larger picture and to plan effectively in order to make more proactive decisions. Consequently, the organization gets stuck in a highly reactive mode characterized by lack of funds and having to deal with one major crisis after another. The best "cure" for this syndrome is a broader understanding of the structures and processes of an organization, including an appreciation for the importance of planning (McNamara 1999).
McNamara describes how an organization seems to experience the same kinds of
problems over and over again. The problems seem to cycle through the organization. Over
time, members of the organization come to recognize the pattern of events in the cycle, rather
than the cycle itself. Parents notice this as they gain experience. Over time, they recognize
the various phases their children go through and consider these phases when dealing with the
specific behaviors of their children (McNamara 1999). Systems that do not interact with their
environment (e.g., get feedback from customers) tend to reach limits.
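The closed-loop idea running through this literature, an effect feeding back to influence its cause, can be illustrated with a minimal simulation. The thermostat-style control loop below is a hypothetical sketch, not an example taken from the systems-thinking sources cited; the gain value is arbitrary:

```python
def thermostat_step(temp, target, gain=0.5):
    """One pass through a closed loop: the error (effect) feeds back
    to adjust the heating (cause), which in turn changes the error."""
    error = target - temp
    return temp + gain * error  # correction proportional to the error


# A system that keeps sensing its environment converges on its goal;
# with no feedback (gain = 0) it would stay stuck at its limit.
temp = 10.0
for _ in range(20):
    temp = thermostat_step(temp, target=20.0)
print(round(temp, 2))  # converges toward 20.0
```

The contrast McNamara draws is visible in the parameters: setting `gain=0` models an organization that never reacts to feedback from its environment, and the temperature never improves.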
Prescriptive information system methodologies are unlikely to cope well with strategic
uncertainty, user communication or staff development. Middleton’s recommendations are to
focus more on soft organizational issues and to tailor approaches to each project. All
elements of the organization need to be developed in order to attain quality goals; piecemeal adoption of selected quality management practices is unlikely to be effective (Middleton 1997). This is essential in disaster planning, as critical components of the organization must continue to function in harmony with other organizational functions. Fleshing out the key areas during the initial system design avoids haphazard decisions during a disaster.
2.6 The Human Component of Information Systems Technology
Despite the evolution and advances in information systems and technology, it is an almost universal finding in studies investigating human information behavior that people choose other people as their preferred source of information (Johnson 2004). Studies of academic researchers in both the sciences and the humanities have revealed the importance of consulting with colleagues at different stages of their research (Johnson 2004). Professionals such as engineers, nurses, physicians and dentists rely on co-workers and knowledgeable colleagues in their search for work-related information (Leckie et al. 1996). People are also among the most important sources consulted by chief executive officers during their environmental scanning (Choo 1993). Studies of ordinary citizens' preferred sources of information also confirm the importance of personal contacts in information seeking behavior (Warner 1993). The poor, as well, prefer people over other sources of information (Agada 1999). The explanation for the use of people as information sources has often been that they are "typically easier and more readily accessible than the most authoritative printed sources" (Case 2002). Immigrants are generally perceived to be information poor, meaning they face major challenges in finding and using greatly needed everyday information (Agada 1999).
Research findings suggest that personal networks were used more readily than any other type
of information source (Fisher 2004). The ability of these populations to establish themselves independently is limited and often restricted by barriers of language and influence. There is a negative spiral effect for these populations as they work to improve their socio-economic situation while being unable to operate outside of the community information system they have established for themselves (Fisher 2004).
Human Computer Interaction (HCI) is a growing and maturing field. Although not the primary focus of this dissertation, the discipline of HCI has an impact on systems design, including disaster recovery planning. A unique aspect of the field is that HCI treats the computer and its operator as equals (Verenikina and Gould 1998). HCI research has investigated usability and where it fits into the concept of systems design. For good design, the designers need to know the users and their tasks (Karat and Karat 2003). Developing new systems is always done within a context of design trade-offs and limited resources (Karat and Karat 2003). Since the events of September 11, 2001, usability may have taken on a new context, especially for information providers such as Wall Street financial firms. External parties, vendors and internal staff who use financial information to make critical economic decisions require that data be available when needed. System analysis and design now includes Disaster Recovery and Continuity of Business as critical components of the design phase (Osborne and Nakamura 2000). Perhaps Everett Brenner put it best in "Brenner's Law": determine the best system you can foresee before designing the system you can afford (Hahn 1998).
Donald Norman has documented a common sense approach to usable design. Designing well is not easy, and it usually takes five or six attempts to get a product right. If an error is possible, someone will make it. In keeping with the human element theme of September 11, 2001 and design problems, Norman cites an incident aboard a Lockheed L-1011 airliner flight to Miami as an example of poor design during disasters. The pilots were too busy to instruct the flight crew properly, so the passengers were not given safety instructions by the personnel (Norman 1998). This is an example of a disaster plan that relied on humans to convey information during a crisis, despite those people being occupied with other critical tasks.
Norman introduces several concepts that he uses in his analysis of both good and bad design: affordances (buttons are for pushing, menus are for choosing); constraints (logical relationships between the functional layout of components); conceptual models (a good conceptual model allows us to predict the effects of our actions); mappings (the relationship between the controls and the results); visibility (the system state should be visible and interpretable); and feedback (sending information back to the user about what action has actually been done and what results have been accomplished). When applied to information systems, these concepts mean that computer systems must be capable of making things visible (or audible). Norman also points out a critical linkage between usability and design (Norman 1988). Usability is rarely a consideration when purchasing; in fact, the purchaser is rarely the user.
Norman’s key concepts of user-centered design are (Norman 1988):
• Make it easy to determine what actions are possible at any moment.
• Make things visible, including the conceptual model.
• Make it easy to evaluate the current state of the system.
• Follow natural mappings between intentions and the required actions.
Landauer believes these contributions are marginal, arguing that useful theory is impossible because the behavior of human-computer systems is chaotic or worse: highly complex, dependent on many unpredictable variables, or just too hard to understand. Middleton (1999) also questioned strict methodologies, observing that prescriptive information system methodologies are unlikely to cope well with strategic uncertainty, user communication or staff development. He recommends focusing more on soft organizational issues and tailoring approaches to each project.
With the Internet and many other sources available online, there is a need to ensure that people who are information technology savvy do not confuse this with having information literacy skills. There is more to information seeking than just knowing where to find information; that is, it also includes problem solving and evaluation of sources. The ability to validate sources is probably even more important today with the volume of information available on the Internet (Kerins 2004). The theory of social capital, however, suggests that the use of people as information sources is not necessarily an easy option, but may also require a considerable effort (Johnson 2004). The Internet seems to have a positive effect on social interaction, and it tends to increase exposure to other sources of information.
The body of evidence does not support the thesis that the Internet leads to lower social interaction or causes greater social isolation (Castells 2003).
Human information behavior is a highly active area of research within Information
Science and other fields. Research that has been carried out to date has contributed greatly to our understanding of human-information interaction. Yet, Fidel states that very few studies have generated results that are directly relevant to the design of information systems (Fidel
2004).
In recent years, researchers in HCI have criticized the gap between research results and practical design. There is an emerging consensus among researchers that the cognitive approach to HCI may be limited (Uden and Willis N.D.). Landauer writes that useful theory is impossible, because the behavior of human-computer systems is chaotic or worse: highly complex, dependent on many unpredictable variables, or just too hard to understand. Theories such as Fitts's Law[32] and Hick's Law[33] have had only minor impact (Landauer 1991). Even the best applications of theory have produced only small and/or local gains in productivity. The few successful computer and HCI inventions to date have come from lucky hunches and produced mundane results, such as rules of thumb for the use of color and empirical generalizations of user needs and characteristics.
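The two laws cited above are among the few quantitative models in HCI, and their commonly used logarithmic forms can be sketched as follows. The coefficients `a` and `b` are device- and task-dependent constants normally fit from experimental data; the values below are purely illustrative assumptions:

```python
import math


def fitts_time(distance, width, a=0.1, b=0.15):
    """Fitts's law (Shannon form): movement time grows with
    log2(D/W + 1), the so-called index of difficulty."""
    return a + b * math.log2(distance / width + 1)


def hick_time(n_choices, b=0.2):
    """Hick's law: decision time grows with log2(n + 1)
    for n equally likely alternatives."""
    return b * math.log2(n_choices + 1)


# A distant, small target takes longer to acquire than a near, large one:
print(fitts_time(800, 20) > fitts_time(100, 100))  # True

# Doubling the number of menu items adds a constant increment,
# rather than doubling the decision time:
print(hick_time(16) < 2 * hick_time(8))  # True
```

Landauer's point stands even in this sketch: the models yield useful rules of thumb (make frequent targets large and near, keep choice sets small) rather than a general theory of the human-computer system.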
Clearly, information systems would be most effective if their design were informed by an understanding of the human-information interaction of their intended users (Fidel and Pejtersen 2004). Yet information systems have been designed and widely used almost completely unaffected by the results of studies in human information behavior. It is important to examine how human-information behavior research could inform design. A variety of reasons have probably motivated systems designers to ignore this research, such as pressure to design systems quickly, no obvious relevance of research results to design, and a lack of appreciation of soft research. Instead of analyzing these reasons, Dervin thought it might be useful to examine how results of human information behavior research projects can increase their applicability to systems design. This addresses a standing concern: bridging the gap between designers and researchers and increasing the relevance of academic research to practitioners' work (Dervin 2003).
The information systems themselves – not the people – can become the stable structure of the organization (Srikantaiah and Koenig 2000). This, in turn, will remove some of the reliance on human resources to solve problems during disasters. As Dombrowsky
[32] Fitts's law is a model of human movement which predicts the time required to rapidly move to a target area, as a function of the distance to the target and the size of the target (e.g., on computers with a mouse) (Fitts 1954).
[33] Hick's law is a model of human-computer interaction that describes the time it takes for a user to make a decision as a function of the possible choices he or she has (Hick 1952).
wrote, it is foolish to intervene in systems upon which people depend without knowing how the systems work and how they will react (Quarantelli 1998).
2.7 The Relationship of the Events of September 11, 2001 to Information Systems
Humans have deployed technology to combat disaster since the beginning of recorded history. The cradle of Western civilization, the Tigris-Euphrates river valley, was settled and urbanized through an extensive flood control infrastructure that stabilized the flow of water to fields while also protecting fixed settlements (Moss and Townsend 2006). Over the past century, the role of technology has expanded from merely mitigating the impacts of natural disaster to producing disaster itself. The devastating aerial bombardment of cities during the 20th century may well have killed more people than all natural disasters in history combined. Chernobyl (1986) and Bhopal (1984) demonstrate the potential for nuclear and chemical industrial accidents to cause major disasters (Moss and Townsend 2006).
A disaster is an unexpected occurrence inflicting widespread destruction and distress and having long-term adverse effects on society. An emergency is a situation or occurrence of a serious nature, developing suddenly and unexpectedly, and demanding immediate action (e.g., a power failure or minor flooding) (Hunter 1997). The events of September 11, 2001 can be defined as both an emergency and a disaster.
In reviewing the September 11, 2001 investigations, a common and tragic theme is the failure of information, both in quality and in communication, as documented in The Complete Investigation; The September 11, 2001 Report: The National Commission on Terrorist Attacks Upon the United States. On the morning of September 11, 2001, the existing information infrastructure design was unsuited in every respect for what was about to happen.
Problems with information that morning were vast. Even the President told investigators he was frustrated with communications. The airlines were facing an escalating number of conflicting and, for the most part, erroneous reports about other flights, as well as a continuing lack of vital information from the Federal Aviation Administration (FAA) about the hijacked flights. Several FAA air traffic control officials told investigators it was the air carriers’ responsibility to notify their planes of security problems. Most federal agencies learned about the crash in New York from CNN. Some startling revelations about information communication breakdowns on the fateful morning: