System Regularities in Design of Experiments and Their Application

Total Pages: 16

File Type: pdf; Size: 1020 KB

System Regularities in Design of Experiments and Their Application

by Xiang Li

B.S. and M.S., Engineering Mechanics, Tsinghua University, 2000, 2002

Submitted to the Department of Mechanical Engineering in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Mechanical Engineering at the Massachusetts Institute of Technology, June 2006

© 2006 Massachusetts Institute of Technology. All rights reserved.

Signature of Author: Department of Mechanical Engineering, May 9, 2006
Certified by: Daniel D. Frey, Assistant Professor of Mechanical Engineering and Engineering Systems, Thesis Supervisor
Accepted by: Lallit Anand, Chairman, Department Committee on Graduate Students

Submitted to the Department of Mechanical Engineering on May 9, 2006, in partial fulfillment of the requirements for the Degree of Doctor of Philosophy in Mechanical Engineering.

ABSTRACT

This dissertation documents a meta-analysis of 113 data sets from published factorial experiments. The study quantifies regularities observed among main effects and multi-factor interactions. Such regularities are critical to efficient planning and analysis of experiments, and to robust design of engineering systems. Three previously observed properties are analyzed: effect sparsity, hierarchy, and heredity. A new regularity on effect synergism is introduced and shown to be statistically significant. It is shown that a preponderance of active two-factor interaction effects are synergistic, meaning that when main effects are used to increase the system response, the interactions provide an additional increase, and that when main effects are used to decrease the response, the interactions generally counteract the main effects.
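The synergism claim can be made concrete with a small sketch (not taken from the dissertation): estimating the main effects and the two-factor interaction of a 2x2 full factorial from synthetic, noise-free responses in which the interaction coefficient shares the sign of the main effects.

```python
import itertools

def effects_from_full_factorial(y):
    """Estimate main effects and the two-factor interaction from a 2^2
    full factorial; y maps each run (x1, x2) in {-1, +1}^2 to a response."""
    runs = list(itertools.product([-1, 1], repeat=2))
    half = len(runs) / 2
    a  = sum(x1 * y[(x1, x2)] for x1, x2 in runs) / half
    b  = sum(x2 * y[(x1, x2)] for x1, x2 in runs) / half
    ab = sum(x1 * x2 * y[(x1, x2)] for x1, x2 in runs) / half
    return a, b, ab

# Synthetic responses: y = 10 + 3*x1 + 2*x2 + 1*x1*x2 (no noise).
y = {(x1, x2): 10 + 3 * x1 + 2 * x2 + 1 * x1 * x2
     for x1, x2 in itertools.product([-1, 1], repeat=2)}
a, b, ab = effects_from_full_factorial(y)
# Both main effects are positive and the interaction shares their sign,
# so setting x1 = x2 = +1 gains an extra contribution from the
# interaction on top of the main effects: a synergistic interaction.
```

Each estimated effect is twice the underlying regression coefficient (the standard plus-one-minus-one coding convention), so a = 6, b = 4, and ab = 2 here.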
Based on the investigation of system regularities, a new strategy is proposed for evaluating and comparing the effectiveness of robust parameter design methods. A hierarchical probability model is used to capture assumptions about robust design scenarios. A process is presented employing this model to evaluate robust design methods. This process is then used to explore three topics of debate in robust design: 1) the relative effectiveness of crossed versus combined arrays; 2) the comparative advantages of signal-to-noise ratios versus response modeling for analysis of crossed arrays; and 3) the use of adaptive versus “one shot” methods for robust design. For the particular scenarios studied, it is shown that crossed arrays are preferred to combined arrays regardless of the criterion used in selection of the combined array. It is shown that when analyzing the data from crossed arrays, signal-to-noise ratios generally provide superior performance, although response modeling should be used when three-factor interactions are absent. Most significantly, it is shown that using an adaptive inner array design crossed with an orthogonal outer array resulted in far more improvement on average than other alternatives.

Thesis Supervisor: Daniel D. Frey
Title: Assistant Professor of Mechanical Engineering and Engineering Systems

Acknowledgments

I would like to thank a number of people who contributed in many ways to making this thesis possible. First, I am grateful to my advisor, Professor Daniel D. Frey, for giving me the opportunity to work on such a challenging topic. With deep research insights and rich engineering knowledge, he has patiently guided me in exploring the complicated topic of effectively applying experimental designs to engineering systems and in conquering a variety of difficulties along the way. In addition, he always gave me the freedom to pursue my interests and always offered me great opportunities to widen my perspective.
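The signal-to-noise analysis mentioned in the abstract can be illustrated with a minimal sketch, assuming Taguchi's standard smaller-the-better ratio (the dissertation's actual evaluation procedure is more elaborate); the row labels and response values below are hypothetical.

```python
import math

def sn_smaller_the_better(replicates):
    """Taguchi smaller-the-better signal-to-noise ratio for one row of a
    crossed array; replicates are the responses observed as the outer
    (noise) array is run: SN = -10 * log10(mean of y^2)."""
    n = len(replicates)
    return -10 * math.log10(sum(y * y for y in replicates) / n)

# Hypothetical crossed array: 2 control settings x 3 noise conditions.
rows = {"A": [2.0, 2.2, 1.8], "B": [1.5, 3.0, 1.5]}
sn = {name: sn_smaller_the_better(y) for name, y in rows.items()}
best = max(sn, key=sn.get)  # higher SN is better: small AND consistent y
```

Setting "A" wins here: although both rows have similar averages, "B" is far more sensitive to the noise conditions, and the squared-response form of the ratio penalizes that spread.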
I also wish to thank my doctoral committee members, Professor Warren P. Seering and Professor Roy E. Welsch, for their invaluable advice and inspiring discussions on this research. Thanks to the staff members of MIT who have been of great assistance to this work. I would especially like to thank Ms. Leslie Regan, Ms. Joan Kravit, Ms. Maggie Sullivan, Mr. Jason Pring, and Ms. Danielle Guichard-Ashbrook. I also owe a debt of gratitude to Mr. Steven Schondorf for being my mentor and giving me continuous support. I would like to sincerely acknowledge the encouragement from my colleagues in the Robust Design Group: Rajesh Jugulum, Yiben Lin, Hungjen Wang, Chad Forster, Nandan Sudarsanam, Danielle Zurovcik, Jagmeet Singh, and Troy Savoie, who have been wonderful friends and research partners. Finally, I would like to express my deepest thanks to my parents and my sister for their love and encouragement. This work could not have been done without their support. In addition, my close friends have always kept me sane and motivated: Elaine, Chen, Shuang, Dave, Bolero, Huajie, and Hao – thanks for everything.

Contents

1. Introduction .......... 11
1.1 Overview .......... 11
1.2 Motivation .......... 13
1.2.1 Debate on Adaptive Experimentation .......... 14
1.2.2 Debate on Robust Design Methods .......... 15
1.3 Research Objectives .......... 16
1.4 Research Roadmap .......... 17
1.5 Organization of the Dissertation .......... 19
2. Regularities in Data from Experiments .......... 21
2.1 Overview .......... 21
2.2 Design of Experiments .......... 22
2.2.1 A Historical Review .......... 23
2.2.2 A Technical Review .......... 24
2.2.3 Adaptive One-Factor-at-a-Time (OFAT) Experiments .......... 27
2.3 Introduction to System Regularities .......... 29
2.3.1 Effect Sparsity .......... 30
2.3.2 Effect Hierarchy .......... 33
2.3.3 Effect Heredity .......... 35
2.4 DOE Models Incorporating System Regularities .......... 36
2.4.1 The General Linear Model .......... 36
2.4.2 The Relaxed Weak Heredity Model .......... 37
2.4.3 The Hierarchical Probability Model .......... 39
2.5 Effects of System Regularities on DOE Methods .......... 42
2.6 Summary .......... 49
3. Verification and Quantification of System Regularities .......... 50
3.1 Overview .......... 50
3.2 Objectives and Methods .......... 51
3.2.1 The General Linear Model Revisited .......... 52
3.2.2 The Lenth Method for Effect Analysis .......... 52
3.2.3 Method for Quantifying Effect Sparsity .......... 56
3.2.4 Method for Quantifying Effect Hierarchy .......... 57
3.2.5 Method for Quantifying Effect Heredity .......... 58
3.3 The Set of Experimental Data .......... 59
3.4 An Illustrative Example for a Single Data Set .......... 62
3.5 Results of Meta-Analysis of 113 Data Sets .......... 68
3.6 Quantification of the Standard Deviation c .......... 71
3.7 Conclusions and Discussion .......... 73
3.8 Summary ..........
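Section 3.2.2 of the contents refers to the Lenth method for effect analysis. As background, here is a minimal sketch of Lenth's pseudo standard error (PSE) for unreplicated factorials, with made-up contrast values; the two large contrasts stand in for active effects.

```python
import statistics

def lenth_pse(contrasts):
    """Lenth's pseudo standard error (Lenth, 1989) for effect contrasts
    from an unreplicated factorial: a robust scale estimate that first
    trims contrasts that look active."""
    abs_c = [abs(c) for c in contrasts]
    s0 = 1.5 * statistics.median(abs_c)          # initial scale estimate
    trimmed = [c for c in abs_c if c < 2.5 * s0]  # drop likely-active effects
    return 1.5 * statistics.median(trimmed)

# Five small (inert) contrasts plus two large (active) ones: the trimming
# keeps the PSE anchored to the inert effects.
contrasts = [0.2, -0.3, 0.1, 0.25, -0.15, 12.0, -9.5]
pse = lenth_pse(contrasts)
```

An effect is then flagged as active when its contrast exceeds a t-quantile multiple of the PSE; here the PSE works out to 0.3, so the two large contrasts stand out clearly.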
Recommended publications
  • Design of Experiments Application, Concepts, Examples: State of the Art
    Periodicals of Engineering and Natural Sciences, ISSN 2303-4521, Vol. 5, No. 3, December 2017, pp. 421–439. Available online at: http://pen.ius.edu.ba

    Design of Experiments Application, Concepts, Examples: State of the Art
    Benjamin Durakovic, Industrial Engineering, International University of Sarajevo

    Article history: Received Aug 3rd, 2018; Revised Oct 20th, 2018; Accepted Dec 1st, 2018

    Keywords: Design of Experiments; full factorial design; fractional factorial design; product design; quality improvement

    ABSTRACT: Design of Experiments (DOE) is a statistical tool deployed in various types of system, process and product design, development and optimization. It is a multipurpose tool that can be used in various situations such as design for comparisons, variable screening, transfer function identification, optimization and robust design. This paper explores historical aspects of DOE, provides a state of the art of its application, and guides researchers in how to conceptualize, plan and conduct experiments, and how to analyze and interpret the data, including examples. The paper also reveals that in the past 20 years the application of DOE has been growing rapidly in manufacturing as well as non-manufacturing industries. It was the most popular tool in the scientific areas of medicine, engineering, biochemistry, physics and computer science, accounting for about 50% of its applications compared to all other scientific areas.

    Corresponding Author: Benjamin Durakovic, International University of Sarajevo, Hrasnicka cesta 15, 7100 Sarajevo, Bosnia. Email: [email protected]

    1. Introduction: Design of Experiments (DOE) is a mathematical methodology used for planning and conducting experiments, as well as for analyzing and interpreting the data obtained from them. It is a branch of applied statistics that is used for conducting scientific studies of a system, process or product in which input variables (Xs) are manipulated to investigate their effects on a measured response variable (Y).
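The excerpt's definition of DOE, manipulating input variables (Xs) to study their effect on a response (Y), is often operationalized with a two-level full factorial design. A minimal sketch in coded units (illustrative, not from the paper):

```python
from itertools import product

def full_factorial(k):
    """All 2**k runs of a two-level full factorial, factors coded -1/+1."""
    return [list(run) for run in product([-1, 1], repeat=k)]

design = full_factorial(3)
# 8 runs; each factor sits at each level in half of the runs, and every
# pair of columns is orthogonal (elementwise products sum to zero), which
# is what lets each effect on Y be estimated independently.
```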
  • Design of Experiments (DOE) Using JMP, Charles E. Shipp
    PharmaSUG 2014 - Paper PO08
    Design of Experiments (DOE) Using JMP
    Charles E. Shipp, Consider Consulting, Los Angeles, CA

    ABSTRACT: JMP has provided some of the best design of experiment software for years. The JMP team continues the tradition of providing state-of-the-art DOE support. In addition to the full range of classical and modern design of experiment approaches, JMP provides a template for Custom Design for specific requirements. The other choices include: Screening Design; Response Surface Design; Choice Design; Accelerated Life Test Design; Nonlinear Design; Space Filling Design; Full Factorial Design; Taguchi Arrays; Mixture Design; and Augmented Design. Further, sample size and power plots are available. We give an introduction to these methods followed by a few examples with factors.

    BRIEF HISTORY AND BACKGROUND: From early times, there has been a desire to improve processes with controlled experimentation. A famous early experiment cured scurvy with seamen taking citrus fruit. Lives were saved. Thomas Edison invented the light bulb with many experiments. These experiments depended on trial and error with no software to guide, measure, and refine experimentation. Japan led the way with improving manufacturing—Genichi Taguchi wanted to create repeatable and robust quality in manufacturing. He had been taught management and statistical methods by William Edwards Deming. Deming taught “Plan, Do, Check, and Act”: PLAN (design) the experiment; DO the experiment by performing the steps; CHECK the results by testing information; and ACT on the decisions based on those results. In America, other pioneers brought us “Total Quality” and “Six Sigma” with “Design of Experiments” being an advanced methodology.
  • Concepts Underlying the Design of Experiments
    Concepts Underlying the Design of Experiments
    March 2000
    University of Reading, Statistical Services Centre, Biometrics Advisory and Support Service to DFID

    Contents: 1. Introduction; 2. Specifying the objectives; 3. Selection of treatments (3.1 Terminology; 3.2 Control treatments; 3.3 Factorial treatment structure); 4. Choosing the sites; 5. Replication and levels of variation; 6. Choosing the blocks; 7. Size and shape of plots in field experiments; 8. Allocating treatment to units; 9. Taking measurements; 10. Data management and analysis; 11. Taking design seriously

    © 2000 Statistical Services Centre, The University of Reading, UK

    1. Introduction: This guide describes the main concepts that are involved in designing an experiment. Its direct use is to assist scientists involved in the design of on-station field trials, but many of the concepts also apply to the design of other research exercises. These range from simulation exercises, and laboratory studies, to on-farm experiments, participatory studies and surveys. What characterises the design of on-station experiments is that the researcher has control of the treatments to apply and the units (plots) to which they will be applied. The results therefore usually provide a detailed knowledge of the effects of the treatments within the experiment. The major limitation is that they provide this information within the artificial "environment" of small plots in a research station. A laboratory study should achieve more precision, but in a yet more artificial environment. Even more precise is a simulation model, partly because it is then very cheap to collect data.
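The guide's steps on choosing blocks and allocating treatments to units can be sketched as a randomized complete block design, in which every block receives each treatment once in an independently randomized order; the treatment names below are hypothetical.

```python
import random

def rcbd(treatments, n_blocks, seed=0):
    """Randomized complete block design: each block receives every
    treatment exactly once, in an independently randomized order."""
    rng = random.Random(seed)  # fixed seed so the layout is reproducible
    layout = {}
    for block in range(1, n_blocks + 1):
        order = treatments[:]
        rng.shuffle(order)
        layout[block] = order
    return layout

plan = rcbd(["control", "fertilizer_A", "fertilizer_B"], n_blocks=4)
# plan maps block number -> ordered list of plots' treatments; blocking
# removes block-to-block variation from treatment comparisons.
```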
  • The Theory of the Design of Experiments
    The Theory of the Design of Experiments
    D.R. Cox, Honorary Fellow, Nuffield College, Oxford, UK, and N. Reid, Professor of Statistics, University of Toronto, Canada
    CHAPMAN & HALL/CRC, Boca Raton, London, New York, Washington, D.C.

    Library of Congress Cataloging-in-Publication Data: Cox, D. R. (David Roxbee). The theory of the design of experiments / D. R. Cox, N. Reid. p. cm. (Monographs on statistics and applied probability; 86). Includes bibliographical references and index. ISBN 1-58488-195-X (alk. paper). 1. Experimental design. I. Reid, N. II. Title. III. Series. QA279 .C73 2000 001.4'34 dc21 00-029529 CIP

    This book contains information obtained from authentic and highly regarded sources. Reprinted material is quoted with permission, and sources are indicated. A wide variety of references are listed. Reasonable efforts have been made to publish reliable data and information, but the author and the publisher cannot assume responsibility for the validity of all materials or for the consequences of their use. Neither this book nor any part may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, microfilming, and recording, or by any information storage or retrieval system, without prior permission in writing from the publisher. The consent of CRC Press LLC does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific permission must be obtained in writing from CRC Press LLC for such copying. Direct all inquiries to CRC Press LLC, 2000 N.W.
  • Argus: Interactive a Priori Power Analysis
    © 2020 IEEE. This is the author’s version of the article that has been published in IEEE Transactions on Visualization and Computer Graphics. The final version of this record is available at: xx.xxxx/TVCG.201x.xxxxxxx/

    Argus: Interactive a priori Power Analysis
    Xiaoyi Wang, Alexander Eiselmayer, Wendy E. Mackay, Kasper Hornbæk, Chat Wacharamanotham

    Fig. 1: Argus interface: (A) Expected-averages view helps users estimate the means of the dependent variables through an interactive chart. (B) Confound sliders incorporate potential confounds, e.g., fatigue or practice effects. (C) Power trade-off view simulates data to calculate statistical power; and (D) Pairwise-difference view displays confidence intervals for mean differences, animated as a dance of intervals. (E) History view displays an interactive power history tree so users can quickly compare statistical power with previously explored configurations.

    Abstract: A key challenge HCI researchers face when designing a controlled experiment is choosing the appropriate number of participants, or sample size. A priori power analysis examines the relationships among multiple parameters, including the complexity associated with human participants, e.g., order and fatigue effects, to calculate the statistical power of a given experiment design.
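A priori power analysis of the kind Argus supports can be approximated by simulation. Below is a generic Monte Carlo power estimate for a two-sample t-test, not Argus's actual algorithm; the critical value of about 2 is an approximation to the 5% two-sided threshold at moderate sample sizes.

```python
import math
import random

def power_two_sample_t(effect, n, sims=2000, crit=2.0, seed=1):
    """Monte Carlo estimate of two-sample t-test power: the fraction of
    simulated experiments (unit-variance normal groups, mean shift
    `effect`) whose |t| exceeds an approximate 5% critical value."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        a = [rng.gauss(0.0, 1.0) for _ in range(n)]
        b = [rng.gauss(effect, 1.0) for _ in range(n)]
        ma, mb = sum(a) / n, sum(b) / n
        va = sum((x - ma) ** 2 for x in a) / (n - 1)
        vb = sum((x - mb) ** 2 for x in b) / (n - 1)
        t = (mb - ma) / math.sqrt(va / n + vb / n)  # Welch-style statistic
        hits += abs(t) > crit
    return hits / sims

power = power_two_sample_t(effect=1.0, n=30)  # a large effect: high power
```

Sweeping `n` upward until the estimate crosses a target (say 0.8) is the usual way such a simulation is used to pick a sample size.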
  • Design of Experiments and Data Analysis,” 2012 Reliability and Maintainability Symposium, January, 2012
    Copyright © 2012 IEEE. Reprinted, with permission, from Huairui Guo and Adamantios Mettas, “Design of Experiments and Data Analysis,” 2012 Reliability and Maintainability Symposium, January, 2012. This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of ReliaSoft Corporation's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to [email protected]. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.

    2012 Annual RELIABILITY and MAINTAINABILITY Symposium
    Design of Experiments and Data Analysis
    Huairui Guo, Ph.D., CPR, and Adamantios Mettas, CPR, ReliaSoft Corporation, 1450 S. Eastside Loop, Tucson, AZ 85710 USA
    e-mail: [email protected]; [email protected]
    Tutorial Notes © 2012 AR&MS

    SUMMARY & PURPOSE: Design of Experiments (DOE) is one of the most useful statistical tools in product design and testing. While many organizations benefit from designed experiments, others are getting data with little useful information and wasting resources because of experiments that have not been carefully designed. Design of Experiments can be applied in many areas including but not limited to: design comparisons, variable identification, design optimization, process control and product performance prediction. Different design types in DOE have been developed for different purposes.
  • Introduction to Statistically Designed Experiments, NREL
    Introduction to Statistically Designed Experiments
    Mike Heaney ([email protected])
    ASQ Denver Section 1300 Monthly Membership Meeting
    September 13, 2016, Englewood, Colorado
    NREL/PR-5500-66898

    Objectives: Give some basic background information on statistically designed experiments, and demonstrate the advantages of using them. (Statistically designed experiments are often referred to as design of experiments, DOE or DOX, or experimental design.)

    A Few Thoughts: Many companies/organizations do experiments for process improvement, product development, and marketing strategies. We need to plan our experiments as efficiently and effectively as possible. Statistically designed experiments have a proven track record (Mullin 2013 in Chemical & Engineering News): conclusions supported by statistical tests, and a substantial reduction in total experiments. Why are we still planning experiments like it’s the 19th century?

    Outline: Research Problems; The Linear Model; Key Types of Designs (Full Factorial, Fractional Factorial, Response Surface); Sequential Approach; Summary

    Research Problems: Is a dependent variable (Y) affected by multiple independent variables (Xs)?

    Example. Objective: Increase bearing lifetime (Hellstrand 1989).
    Y: Bearing lifetime (hours)
    X1: Inner ring heat treatment
    X2: Outer ring osculation (ratio between ball diameter and outer ring raceway radius)
    X3: Cage design

    Example. Objective: Hydrogen cyanide (HCN) removal in diesel exhaust gas (Zhao et al. 2006).
    Y: HCN conversion
    X1: Propene
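The outline above lists fractional factorial designs. A minimal sketch of a 2^(3-1) half fraction, which could screen three factors such as the bearing example's heat treatment, osculation, and cage design in four runs instead of eight (illustrative, not from the slides):

```python
from itertools import product

def fractional_factorial_2_3_1():
    """A 2^(3-1) half fraction with generator I = ABC: the third factor's
    column is set equal to the product of the first two, so main effects
    are aliased with two-factor interactions."""
    return [(x1, x2, x1 * x2) for x1, x2 in product([-1, 1], repeat=2)]

design = fractional_factorial_2_3_1()
# 4 runs; each column is balanced (-1 and +1 appear equally often), which
# is the price-performance trade-off of screening designs: fewer runs,
# but aliased effects.
```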
  • Appendix A: Mathematical Formulas
    Appendix A: Mathematical Formulas

    Logarithms: $\ln(ab) = \ln a + \ln b$; $\ln(a/b) = \ln a - \ln b$; $\ln a^b = b \ln a$.

    Geometric series: if $|r| < 1$, then $\sum_{k=0}^{\infty} a r^k = \frac{a}{1-r}$ and $\sum_{k=1}^{\infty} a k r^k = \frac{ar}{(1-r)^2}$.

    Limits. L'Hospital's rule: suppose that $\lim_{x \to x_0} f(x) = \lim_{x \to x_0} g(x) = 0$, or that $\lim_{x \to x_0} |g(x)| = \infty$. Then, under some conditions, we can write that
    $$\lim_{x \to x_0} \frac{f(x)}{g(x)} = \lim_{x \to x_0} \frac{f'(x)}{g'(x)}.$$
    If the functions $f'(x)$ and $g'(x)$ satisfy the same conditions as $f(x)$ and $g(x)$, we can repeat the process. Moreover, the constant $x_0$ may be equal to $\pm\infty$.

    Derivatives. Derivative of a quotient:
    $$\frac{d}{dx} \frac{f(x)}{g(x)} = \frac{g(x) f'(x) - f(x) g'(x)}{g^2(x)} \quad \text{if } g(x) \neq 0.$$
    Remark: this formula can also be obtained by differentiating the product $f(x) h(x)$, where $h(x) := 1/g(x)$.
    Chain rule: if $u = g(x)$, then we have
    $$\frac{d}{dx} f(u) = \frac{d}{du} f(u) \cdot \frac{du}{dx} = f'(u) g'(x) = f'(g(x)) g'(x).$$
    For example, if $u = x^2$, we calculate
    $$\frac{d}{dx} \exp(u^2) = \exp(u^2) \cdot 2u \cdot 2x = 4x^3 \exp(x^4).$$

    Integrals. Integration by parts: $\int u \, dv = uv - \int v \, du$.
    Integration by substitution: if the inverse function $g^{-1}$ exists, then we can write that
    $$\int_a^b f(x) \, dx = \int_c^d f(g(y)) \, g'(y) \, dy,$$
    where $a = g(c) \Leftrightarrow c = g^{-1}(a)$ and $b = g(d) \Leftrightarrow d = g^{-1}(b)$. For example,
    $$\int_0^1 e^{\sqrt{x}} \, dx = \int_0^1 2y e^y \, dy = 2y e^y \Big|_0^1 - 2 \int_0^1 e^y \, dy = 2e - 2(e - 1) = 2.$$
  • STATISTICAL SCIENCE Volume 31, Number 4 November 2016
    STATISTICAL SCIENCE, Volume 31, Number 4, November 2016

    Approximate Models and Robust Decisions .......... James Watson and Chris Holmes 465
    Model Uncertainty First, Not Afterwards .......... Ingrid Glad and Nils Lid Hjort 490
    Contextuality of Misspecification and Data-Dependent Losses .......... Peter Grünwald 495
    Selection of KL Neighbourhood in Robust Bayesian Inference .......... Natalia A. Bochkina 499
    Issues in Robustness Analysis .......... Michael Goldstein 503
    Nonparametric Bayesian Clay for Robust Decision Bricks .......... Christian P. Robert and Judith Rousseau 506
    Ambiguity Aversion and Model Misspecification: An Economic Perspective .......... Lars Peter Hansen and Massimo Marinacci 511
    Rejoinder: Approximate Models and Robust Decisions .......... James Watson and Chris Holmes 516
    Filtering and Tracking Survival Propensity (Reconsidering the Foundations of Reliability) .......... Nozer D. Singpurwalla 521
    On Software and System Reliability Growth and Testing .......... Frank P. A. Coolen 541
    How About Wearing Two Hats, First Popper’s and then de Finetti’s? .......... Elja Arjas 545
    What Does “Propensity” Add? .......... Jane Hutton 549
    Reconciling the Subjective and Objective Aspects of Probability .......... Glenn Shafer 552
    Rejoinder: Concert Unlikely, “Jugalbandi” Perhaps .......... Nozer D. Singpurwalla 555
    Chaos Communication: A Case of Statistical Engineering .......... Anthony
  • Human-Subject Experiments in Augmented Reality
    Quantitative and Qualitative Methods for Human-Subject Experiments in Augmented Reality
    IEEE VR 2014 Tutorial
    J. Edward Swan II, Mississippi State University (organizer)
    Joseph L. Gabbard, Virginia Tech (co-organizer)

    Schedule:
    9:00–10:30 (1.5 hrs): Experimental Design and Analysis (Ed)
    10:30–11:00 (0.5 hrs): Coffee Break
    11:00–12:30 (1.5 hrs): Experimental Design and Analysis / Formative Usability Evaluation (Ed / Joe)
    12:30–2:00 (1.5 hrs): Lunch Break
    2:00–3:30 (1.5 hrs): Formative Usability Evaluation (Joe)
    3:30–4:00 (0.5 hrs): Coffee Break
    4:00–5:30 (1.5 hrs): Formative Usability Evaluation / Panel Discussion (Joe / Ed)

    Experimental Design and Analysis
    J. Edward Swan II, Ph.D., Department of Computer Science and Engineering, Department of Psychology (Adjunct), Mississippi State University

    Motivation and Goals: Course attendee backgrounds? Studying experimental design and analysis at Mississippi State University:
    PSY 3103 Introduction to Psychological Statistics
    PSY 3314 Experimental Psychology
    PSY 6103 Psychometrics
    PSY 8214 Quantitative Methods In Psychology II
    PSY 8803 Advanced Quantitative Methods
    IE 6613 Engineering Statistics I
    IE 6623 Engineering Statistics II
    ST 8114 Statistical Methods
    ST 8214 Design & Analysis Of Experiments
    ST 8853 Advanced Design of Experiments I
    ST 8863 Advanced Design of Experiments II
    7 undergrad hours; 30 grad hours; 3 departments!

    Motivation and Goals: What can we accomplish in one day? Study a subset of basic techniques, which the presenters have found to be the most applicable to VR and AR systems. Focus on the intuition behind basic techniques. Become familiar with basic concepts and terms, to facilitate working with collaborators from psychology, industrial engineering, statistics, etc.
  • Statistics: Challenges and Opportunities for the Twenty-First Century
    Statistics: Challenges and Opportunities for the Twenty-First Century
    Edited by: Jon Kettenring, Bruce Lindsay, & David Siegmund
    Draft: 6 April 2003

    Contents:
    1 Introduction (1.1 The workshop; 1.2 What is statistics?; 1.3 The statistical community; 1.4 Resources)
    2 Historical Overview
    3 Current Status (3.1 General overview: 3.1.1 The quality of the profession; 3.1.2 The size of the profession; 3.1.3 The Odom Report: Issues in mathematics and statistics)
    4 The Core of Statistics (4.1 Understanding core interactivity; 4.2 A detailed example of interplay; 4.3 A set of research challenges: 4.3.1 Scales of data; 4.3.2 Data reduction and compression; 4.3.3 Machine learning and neural networks; 4.3.4 Multivariate analysis for large p, small n; 4.3.5 Bayes and biased estimation; 4.3.6 Middle ground between proof and computational experiment; 4.4 Opportunities and needs for the core: 4.4.1 Adapting to data analysis outside the core; 4.4.2 Fragmentation of core research; 4.4.3 Growth in the professional needs; 4.4.4 Research funding; 4.4.5 A Possible Program)
    5 Statistics in Science and Industry (5.1 Biological Sciences; 5.2 Engineering and Industry; 5.3 Geophysical and Environmental Sciences ...)
  • Model Alignment Using Optimization and Design of Experiments
    Proceedings of the 2017 Winter Simulation Conference
    W. K. V. Chan, A. D'Ambrogio, G. Zacharewicz, N. Mustafee, G. Wainer, and E. Page, eds.

    MODEL ALIGNMENT USING OPTIMIZATION AND DESIGN OF EXPERIMENTS
    Alejandro Teran-Somohano, Levent Yilmaz, Alice E. Smith
    Department of Industrial and Systems Engineering and Department of Computer Science and Software Engineering, Auburn University, 3301 and 3101 Shelby Center, Auburn, AL 36849 USA

    ABSTRACT: The use of simulation modeling for scientific tasks demands that these models be replicated and independently verified by other members of the scientific community. However, determining whether two independently developed simulation models are “equal” is not trivial. Model alignment is the term for this type of comparison. In this paper, we present an extension of the model alignment methodology for comparing the outcomes of two simulation models that searches the response surfaces of both models for significant differences. Our approach incorporates elements of both optimization and design of experiments for achieving this goal. We discuss the general framework of our methodology, its feasibility of implementation, as well as some of the obstacles we foresee in its generalized application.

    1 INTRODUCTION: The present work is part of a larger research endeavor focused on simulation model transformation across different platforms. The overarching goal is to support the replicability and reproducibility of simulation experiments for scientific discovery. Model transformation is not, of itself, sufficient to achieve this goal. It is also necessary to verify that the transformed model properly mimics the behavior of the original. But what does it mean for a model to mimic another model? In this paper, we consider two models to be equal if the response surfaces of both models are equal.
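The paper's idea of comparing two models' response surfaces can be sketched crudely as scanning a design grid for the largest disagreement between their outputs; the two metamodels below are hypothetical stand-ins, not the authors' method, which additionally uses optimization and statistical tests.

```python
def max_response_difference(model_a, model_b, grid):
    """Largest absolute disagreement between two models' responses over a
    design grid: a crude alignment check (zero would mean the surfaces
    agree at every sampled design point)."""
    return max(abs(model_a(*x) - model_b(*x)) for x in grid)

# Two hypothetical metamodels that differ only by a small interaction term.
def f(x1, x2):
    return 1.0 + 2.0 * x1 - x2

def g(x1, x2):
    return 1.0 + 2.0 * x1 - x2 + 0.1 * x1 * x2

# A 3x3 factorial grid in coded units.
grid = [(x1, x2) for x1 in (-1, 0, 1) for x2 in (-1, 0, 1)]
gap = max_response_difference(f, g, grid)
# The largest gap occurs at the corner points, where the interaction
# term is largest in magnitude.
```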