Inferential Statistics Methods and the Computer-Based Approach to Project Management

Gordana DUKIĆ
Josip Juraj Strossmayer University of Osijek, Faculty of Philosophy
Department of Information Sciences
Lorenza Jägera 9, 31000 Osijek, Croatia

Mate SESAR, Ph.D. Student
Josip Juraj Strossmayer University of Osijek, Faculty of Agriculture
Department of Agroeconomics
Trg Sv. Trojstva 3, 31000 Osijek, Croatia

Ivana SESAR, Student
Josip Juraj Strossmayer University of Osijek, Faculty of Agriculture
Department of Agroeconomics
Trg Sv. Trojstva 3, 31000 Osijek, Croatia

ABSTRACT

Project management is a discipline of outstanding importance, especially in cases when accomplishing a task requires substantial human, technical and financial resources. When estimating project duration and costs, as well as identifying critical-path activities, project managers can gain crucial support from computer simulation. The decision-making model presented in this paper is based on the assumption that activity durations can be defined as random variables that follow a triangular or beta PERT distribution. In order to estimate project duration as accurately as possible, the proposed computer-based model envisages a higher number of simulation sets. By using the analysis of variance (ANOVA) and post hoc comparisons we are able to determine the simulation sets whose averages of project duration have no statistically significant differences. After their frequency distribution is formed, we can calculate the confidence interval for the mean project duration, which is one of the key pieces of information in project management.

Keywords: project management, computer-based approach, decision-making model, computer simulation, random variable, ANOVA, post hoc comparisons, confidence interval for the mean project duration

INTRODUCTION

Successful operation of any organization is directly related to project realization. Projects as such can serve different purposes, e.g. they can result in shortening the time required for a certain activity, cost reduction, quality improvement, an increase in sales, or improved working conditions. There are numerous definitions of the concept 'project', two of which will be pointed out here. G.H. Blackiston [9] refers to projects when speaking of activities that result in new or changed products, services, environments, processes and organizations. According to H.A. Levine [12], a project is a group of tasks, performed in a definable time period, in order to meet a specific set of objectives. This author believes that a project is characterized by being a one-time program, by its life cycle, by work areas that can be categorized into certain tasks, by a budget, by the use of multiple resources, and possibly by the need to establish a special organization or to cross the boundaries of the traditional one. Although the issue of project management is directly connected to the development of human civilization, modern project management, based on a scientific approach and the systematic application of technological achievements, came into being only in the mid-20th century. Over the past few decades, different organizations, including non-profit, public and government institutions, have recognized the importance of project management. In today's turbulent conditions, efficient realization of any complex task is inconceivable without this discipline.

Project management can be defined as the application of relevant logic and tools to planning, directing, and controlling a temporary endeavor by means of planning and control (M.M. Helms (eds.) [5]). In line with the Project Management Body of Knowledge (PMBOK), J.P. Lewis [13] defines project management as the application of knowledge, skills, tools and techniques to project activities to achieve project requirements. It follows from the above that project management encompasses a wide spectrum of different concepts, tools and techniques designed to facilitate the efficient achievement of planned objectives.

According to E. Verzuh [18], project management can be divided into five main functions: selection, definition, planning, control and close-out. During selection, a project is assessed regarding its purpose and benefits, its objectives are related to the organization's strategic goals, and resource constraints are determined. Having selected the project, the organization identifies all the stakeholders interested in its performance, appoints the people in charge, and draws up the required documents. These list the objectives, constraints, management structure, performance benchmarks, and all the other facts or assumptions that are important for completing the project. In this way the project is defined. This is followed by the planning stage, which includes a detailed description of the activities, their schedule, an estimate of project costs and duration, as well as the other required resources, especially personnel. Within the control function, project progress is monitored to establish possible departures from the set plan, which requires feedback between particular phases. If there are departures from the plan, the project team takes corrective action. An important aspect of this function is the communication of the control bodies with the project team and all the other stakeholders. Close-out is the last phase in project implementation. As a function, it includes disbanding the team that worked on the project and dismantling the project infrastructure. There should also be a systematic follow-up of the outcomes achieved in the realization phase; if these outcomes do not meet the organization's requirements, project management needs to be put in place again. As its efficiency is closely related to the available management information, an area that merits special attention within project management is the modeling and construction of an adequate database.

RESEARCH METHODOLOGY

The decision-making model presented in this paper is based on project management techniques. The two best known, CPM (Critical Path Method) and PERT (Project Evaluation and Review Technique), are the starting point for the model we have developed. The former belongs to the group of deterministic methods, whereas the latter is a stochastic one. The procedures used within these methods to determine the critical path, the expected project duration, and the slack are available in numerous publications (R. Bronson, G. Naadimuthu [2], F.S. Hillier, G.J. Lieberman [7], H.A. Taha [17]). Within the proposed decision-making model, activity durations are defined as random variables that follow a triangular or beta PERT distribution (F.E. Williams [19], C. Hendrickson [6], J. Fente, K. Knutson, C. Schexnayder [4], P.A. Jensen [8]). In order to estimate project duration as accurately as possible, and to identify the activities on the critical path, the model envisages repeating a number of simulation sets. This procedure is inconceivable without a computer and appropriate program applications.

Project managers can significantly improve their decision-making by using inferential statistics methods (D.-E. Lee, D. Arditi [10], D.-E. Lee, J.J. Shi [11], G. Dukić, D. Dukić, M. Sesar [3]). In the model analyzed here, the simulation sets whose averages of project duration have no statistically significant differences are determined by means of the analysis of variance (ANOVA) and post hoc comparisons. These are then the basis for calculating the confidence interval for the mean project duration, which is one of the most valuable pieces of information for project managers.

PROJECT PLANNING USING CPM/PERT

The aim of the CPM and PERT methods is to provide an analytical framework for scheduling activities. Four key values are therefore determined within both methods: the earliest start time (ES), the earliest finish time (EF), the latest start time (LS) and the latest finish time (LF). They are calculated on the basis of the estimated duration of each activity (Dij). In this way we obtain the elements required to create a network diagram. An example of such a diagram, where the nodes are marked with A, is given in Figure 1.

Figure 1. An example of a network diagram (nodes A(0) to A(4), each showing its earliest start time ESi and latest finish time LFi, connected by arcs labeled with the activity durations Dij)

The basic difference between the CPM and PERT methods lies in the way activity duration is determined. Whereas CPM is based on a single time estimate, within PERT three time estimates are given for every activity: optimistic (a), most likely (m) and pessimistic (b). In that case, the expected duration of each activity (i, j) is calculated by means of the formula:

    Dij = (a + 4m + b) / 6

If the earliest start time of the first activity is marked with ES0, then it must be ES0 = 0. The earliest start time of any activity (ESj) is determined in the following way:

    ESj = max_i {ESi + Dij}, for all i entering into j.

The latest finish time of an activity (LFi) can be calculated as follows:

    LFi = min_j {LFj − Dij}, for all j leaving from i.

An activity is critical if it satisfies the following conditions:

1. ESi = LFi
2. ESj = LFj
3. ESj − ESi = LFj − LFi = Dij

The probability that an event E(j) is completed by a set time T is then estimated by means of the standard normal variable z:

    P(D{E(j)} ≤ T) = P(z ≤ (T − μ{E(j)}) / σ{E(j)})

With an increasing number of activities the parameter approximation improves, and with it the estimate of the probability that an event is completed within the set timeframe.

SIMULATION OF ACTIVITY DURATION BASED ON A TRIANGULAR AND BETA PERT DISTRIBUTION

In comparison to the CPM method, the usage of PERT can only slightly decrease the risk of an inaccurate estimate of project duration and its costs, as well as of the incorrect identification of activities on the critical path. The reason for this is that, when using PERT in project planning, only one time estimate is taken into account, namely the one obtained by calculating the expected duration of an activity. What is disregarded in this way are all the possible subcritical paths, which could be generated due to wrong estimates of this value. Hence, the initial model based on the PERT method needs to include the concept of computer simulation.
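The PERT expected-duration formula Dij = (a + 4m + b) / 6 reduces to one line of code. The following is a minimal sketch in Python (function names are ours, not the authors'); the companion standard-deviation estimate (b − a) / 6 is the conventional PERT value, included here even though this section does not state it explicitly:

```python
def pert_expected_duration(a, m, b):
    """Expected activity duration D_ij = (a + 4m + b) / 6 from the
    optimistic (a), most likely (m) and pessimistic (b) estimates."""
    return (a + 4 * m + b) / 6

def pert_std_dev(a, b):
    """Conventional PERT standard deviation (b - a) / 6 (an assumption:
    this value is standard PERT practice, not stated in the text)."""
    return (b - a) / 6
```

For example, estimates a = 2, m = 4, b = 12 give an expected duration of 5.0.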
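The forward pass for ESj, the backward pass for LFi, and the criticality condition ESj − ESi = LFj − LFi = Dij can be sketched as follows. This is an illustrative implementation, not the authors' program; it assumes nodes are numbered so that sorted order is a topological order (as in Figure 1), and the example network and durations are hypothetical:

```python
def cpm(arcs):
    """Forward/backward pass over a project network given as a dict
    {(i, j): D_ij}. Returns earliest start times ES, latest finish
    times LF, and the critical arcs, i.e. those satisfying
    ES_j - ES_i = LF_j - LF_i = D_ij. Assumes sorted node labels
    form a topological order."""
    nodes = sorted({v for arc in arcs for v in arc})
    first, last = nodes[0], nodes[-1]
    ES = {first: 0.0}                      # ES_0 = 0 by convention
    for j in nodes[1:]:                    # forward pass
        ES[j] = max(ES[i] + d for (i, k), d in arcs.items() if k == j)
    LF = {last: ES[last]}                  # project duration at the sink
    for i in reversed(nodes[:-1]):         # backward pass
        LF[i] = min(LF[j] - d for (k, j), d in arcs.items() if k == i)
    critical = [(i, j) for (i, j), d in arcs.items()
                if abs(ES[j] - ES[i] - d) < 1e-9
                and abs(LF[j] - LF[i] - d) < 1e-9]
    return ES, LF, critical
```

On the sample network {(0,1): 2, (0,2): 4, (1,3): 3, (2,3): 2, (3,4): 1} this yields a project duration of 7 with critical path 0 → 2 → 3 → 4.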
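The simulation step the model relies on — drawing activity durations from a triangular or beta PERT distribution and recomputing the project duration many times — can be sketched as below. The beta PERT shape parameters use the standard form with lambda = 4; helper names and the example networks are hypothetical, and this is only a sketch of the idea, not the authors' application:

```python
import random

def beta_pert_sample(a, m, b, rng):
    """One draw from a beta PERT distribution with optimistic a,
    most likely m, pessimistic b (standard shape parameter lambda = 4)."""
    alpha = 1 + 4 * (m - a) / (b - a)
    beta = 1 + 4 * (b - m) / (b - a)
    return a + rng.betavariate(alpha, beta) * (b - a)

def simulate_project(arcs_est, n=10_000, dist="triangular", seed=1):
    """Monte Carlo estimate of project duration. arcs_est maps each
    arc (i, j) to its three-point estimate (a, m, b); one replication
    samples every duration and takes the longest path to the sink.
    Assumes sorted node labels form a topological order."""
    rng = random.Random(seed)
    nodes = sorted({v for arc in arcs_est for v in arc})
    totals = []
    for _ in range(n):
        d = {}
        for arc, (a, m, b) in arcs_est.items():
            d[arc] = (rng.triangular(a, b, m) if dist == "triangular"
                      else beta_pert_sample(a, m, b, rng))
        ES = {nodes[0]: 0.0}
        for j in nodes[1:]:                # forward pass per replication
            ES[j] = max(ES[i] + d[(i, k)] for (i, k) in d if k == j)
        totals.append(ES[nodes[-1]])
    return totals
```

Repeating such runs in several independent sets produces the simulation sets whose averages the model then compares.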
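The inferential step — comparing simulation-set averages with a one-way ANOVA and then pooling the homogeneous sets into a confidence interval for the mean project duration — can be sketched as follows. This is a minimal illustration: it computes only the F statistic (the post hoc comparisons the paper mentions are omitted), and the 1.96 critical value assumes a large-sample 95% interval, a confidence level the text does not specify:

```python
import math

def one_way_anova_F(groups):
    """F statistic of a one-way ANOVA across simulation sets, each
    given as a list of simulated project durations."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    means = [sum(g) / len(g) for g in groups]
    grand = sum(sum(g) for g in groups) / N
    ss_between = sum(len(g) * (m - grand) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

def mean_ci(data, z=1.96):
    """Large-sample confidence interval for the mean project duration
    (z = 1.96 corresponds to an assumed 95% level)."""
    n = len(data)
    mean = sum(data) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    half = z * s / math.sqrt(n)
    return mean - half, mean + half
```

Sets whose averages show no statistically significant difference (small F, confirmed by post hoc tests) would be pooled before calling mean_ci.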