Evaluating Emergency Management Models and Data Bases: A Suggested Approach
NISTIR 88-3826

Evaluating Emergency Management Models and Data Bases: A Suggested Approach

Robert E. Chapman, Saul I. Gass and James J. Filliben
U.S. DEPARTMENT OF COMMERCE
National Institute of Standards and Technology
(Formerly National Bureau of Standards)
Center for Computing and Applied Mathematics
National Engineering Laboratory
Gaithersburg, MD 20899

and

Carl M. Harris
Department of Operations Research and Applied Statistics
George Mason University
Fairfax, VA 22030

March 1989

National Bureau of Standards became the National Institute of Standards and Technology on August 23, 1988, when the Omnibus Trade and Competitiveness Act was signed. NIST retains all NBS functions. Its new programs will encourage improved use of technology by U.S. industry.

Sponsored by:
Federal Emergency Management Agency
500 C Street, SW
Washington, DC 20472

and

U.S. DEPARTMENT OF COMMERCE
National Institute of Standards and Technology
Center for Computing and Applied Mathematics
National Engineering Laboratory
Gaithersburg, MD 20899

U.S. DEPARTMENT OF COMMERCE
Robert A. Mosbacher, Secretary
Ernest Ambler, Acting Under Secretary for Technology

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY
Raymond G. Kammer, Acting Director

A NOTE TO THE READER

On August 23, 1988, the Omnibus Trade and Competitiveness Act (Public Law 100-418) was signed into law. The Act created the National Institute of Standards and Technology (NIST) from the National Bureau of Standards (NBS). Since 1901, NBS has served as the Nation's central laboratory for developing and disseminating measurement standards and scientific data. Central to the competitiveness portion of the Act was the expanded mission of NIST. The Institute's new responsibilities augment and build on the technical expertise of NBS, the only Federal laboratory with a specific mission to support U.S. industry. NIST will maintain the traditional functions of NBS and will continue to offer the full array of measurement and quality assurance services that were provided by NBS. These services include calibration services, standard reference materials, standard reference data, and measurement assurance programs.

PREFACE

The Center for Computing and Applied Mathematics, National Engineering Laboratory, National Institute of Standards and Technology (NIST) is conducting a research effort under the sponsorship of the Federal Emergency Management Agency (FEMA) to develop a set of guidelines which promotes both a critical review and an unbiased evaluation of FEMA models and data bases. Large-scale models and data bases are key informational resources for FEMA.
The type and nature of FEMA models and data bases, however, are perhaps unique within the Federal government in that they are often concerned with severe changes or disruptions, ranging from limited effects to extremes. In order to carry out its emergency missions, FEMA must determine which models, modeling techniques and data bases are appropriate for what purposes and which ones need modification, updating and maintenance. The development of evaluation guidelines is therefore of direct benefit to FEMA in discharging its emergency management duties.

FEMA's Dynamic General Equilibrium Model (DGEM) was subjected to a critical review and evaluation of its capabilities as a military mobilization model to test and illustrate the approach recommended for evaluating emergency management models and data bases. DGEM is ideally suited for this purpose because it was designed to deal with the economic and policy impacts of energy and resource supply disruptions, catastrophic losses of resources and recovery from enemy attack, and demand and production surges and resource constraints during a military mobilization. It is hoped that these evaluation guidelines and their application to DGEM will prove useful to emergency managers and decision makers, both in the application of these guidelines to other emergency management models and data bases and in their understanding and appreciation of the capabilities provided in DGEM.

EXECUTIVE SUMMARY

The purpose of this report is twofold. First, it is designed to provide the reader with a generic set of guidelines which can be used to evaluate large-scale, computer-based models and data bases. Second, the guidelines are illustrated through a critical evaluation of the Dynamic General Equilibrium Model (DGEM). DGEM is currently being used by the Federal Emergency Management Agency (FEMA) to analyze a variety of emergency management problems. The evaluation of DGEM serves both as a step-by-step procedure for conducting an in-depth model evaluation and as an introduction to a non-proprietary model which has broad applicability to the analysis of macroeconomic issues.

Many of the models currently in use within government and the private sector are by definition complex and large-scale. Consequently, producing accurate and realistic estimates of alternative courses of action has become a most challenging task. Furthermore, many models depend on data bases that are generated and maintained by diverse organizations, with each data base having a raison d'être that may or may not be relevant to the problem under review. It should then be clear that accepting model outputs as inputs to the decision-making process, without investigating the validity of their generation, is a luxury that the community of model users can no longer afford.

Over the past 15 years a body of research has been developed that addresses the concern of model users who, upon being presented with the results of a computer-based model analysis, must decide whether the results are accurate enough or appropriate for the problem under review. This research, termed model evaluation, has contributed greatly to our ability to better understand the role of modeling in policy-oriented decision making.

Model evaluation may take on a variety of forms and levels of complexity. However, any evaluation procedure has at least the following essential phases:

1. Obtain a clear and comprehensive statement of user requirements and objectives pertaining to the application of the model;

2. Generate appropriate information about the model design and performance pertaining to user requirements; and

3. Evaluate model attributes and properties according to predetermined criteria of performance required by the user.

These phases are characterized by the following activities (a schematic sketch in code follows the list):

- Determine user requirements and objectives.

- Based on user requirements, develop questions to be answered about model performance and identify problems to be resolved or analyses to be performed.

- Ascertain from the questions any problems for which specialized techniques are to be used to generate information needed about model design or performance relevant to the intended application of the model by the user.

- Perform the necessary tasks to generate the needed information for the evaluation.

- Establish criteria for the model evaluation.

- Conduct a formal evaluation to judge the integrity of the model relative to its intended use.
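The report prescribes no software for carrying out these phases. Purely as an illustration, the following minimal Python sketch shows one way an evaluator might record user requirements, evaluation questions, and performance criteria as a structured checklist and then render the formal judgment called for in the last activity. All names (Criterion, ModelEvaluation, judge, verdict) and the sample entries are hypothetical and are not drawn from the report.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Criterion:
    """A predetermined performance criterion (phase 3)."""
    description: str
    satisfied: Optional[bool] = None  # None until formally judged

@dataclass
class ModelEvaluation:
    """A checklist mirroring the three-phase procedure above."""
    model_name: str
    user_requirements: List[str] = field(default_factory=list)  # phase 1
    questions: List[str] = field(default_factory=list)          # phase 2
    criteria: List[Criterion] = field(default_factory=list)     # phase 3

    def judge(self, index: int, satisfied: bool) -> None:
        """Record a formal judgment against one criterion."""
        self.criteria[index].satisfied = satisfied

    def verdict(self) -> str:
        """Judge the integrity of the model relative to its intended use."""
        if any(c.satisfied is None for c in self.criteria):
            return "evaluation incomplete"
        return "acceptable" if all(c.satisfied for c in self.criteria) else "deficient"

# Hypothetical usage, loosely patterned on the DGEM evaluation described below.
evaluation = ModelEvaluation(
    model_name="DGEM",
    user_requirements=["analyze the U.S. economy during a military mobilization"],
    questions=["Does a mobilization surge yield stable, plausible output paths?"],
    criteria=[
        Criterion("base-case simulation reproduces documented results"),
        Criterion("perturbed (mobilization) runs remain stable and plausible"),
    ],
)
evaluation.judge(0, satisfied=True)
evaluation.judge(1, satisfied=True)
print(evaluation.model_name, "->", evaluation.verdict())  # DGEM -> acceptable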
The first five chapters of this report focus on the development of a generic evaluation procedure. Additional material on the general topic of model evaluation is presented in Appendices A, B and C. The remainder of the report illustrates how the procedure was applied to DGEM.

DGEM was selected for evaluation primarily for its unique approach to modeling emergency situations. The availability of documentation was also a major factor. We wish to alert the reader to the importance of documentation in any model evaluation effort because it is a means through which performance against user requirements can be measured. From a review of the DGEM documentation, it became clear that much care went into the design, development and testing of the model. Furthermore, the model's documentation is quite complete and should be sufficient to: (1) set the model up on the host system; (2) execute a base-case simulation; (3) interpret the results of the base-case simulation; and (4) create, run, and interpret user-specified simulations.

The analysis of the U.S. economy during a military mobilization is used to illustrate the generic evaluation procedure. This approach was taken for two reasons. First, a military mobilization represents a major perturbation about DGEM's "business as usual" base-case simulation. Consequently, any weaknesses of the model (e.g., instabilities or implausible results) should be revealed by such a perturbation. Second, a military mobilization requires an explicit treatment of the joint interactions of several factors (e.g., international trade, the business