Algorithms for Operations on Probability Distributions in a Computer Algebra System


Algorithms for Operations on Probability Distributions in a Computer Algebra System

W&M ScholarWorks: Dissertations, Theses, and Masters Projects, 2001. Diane Lynn Evans, College of William & Mary - Arts & Sciences. Follow this and additional works at: https://scholarworks.wm.edu/etd. Part of the Mathematics Commons, and the Statistics and Probability Commons.

Recommended Citation: Evans, Diane Lynn, "Algorithms for operations on probability distributions in a computer algebra system" (2001). Dissertations, Theses, and Masters Projects. Paper 1539623382. https://dx.doi.org/doi:10.21220/s2-bath-8582

This Dissertation is brought to you for free and open access by the Theses, Dissertations, & Master Projects at W&M ScholarWorks. It has been accepted for inclusion in Dissertations, Theses, and Masters Projects by an authorized administrator of W&M ScholarWorks. For more information, please contact [email protected].

Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.

ALGORITHMS FOR OPERATIONS ON PROBABILITY DISTRIBUTIONS IN A COMPUTER ALGEBRA SYSTEM

A Dissertation Presented to The Faculty of the Department of Applied Science, The College of William & Mary in Virginia, in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy, by Diane Lynn Evans, July 2001.

UMI Number: 3026405. Copyright 2001 by Evans, Diane Lynn. All rights reserved. UMI Microform 3026405, copyright 2001 by Bell & Howell Information and Learning Company. All rights reserved. This microform edition is protected against unauthorized copying under Title 17, United States Code.
Bell & Howell Information and Learning Company, 300 North Zeeb Road, P.O. Box 1346, Ann Arbor, MI 48106-1346.

APPROVAL SHEET

This dissertation is submitted in partial fulfillment of the requirements for the Degree of Doctor of Philosophy.

Diane L. Evans, Author

APPROVED, July 2001: Lawrence Leemis; Rex Kincaid; Dennis Manos; John Drew; Sidney Lawrence, Mathematics Department.

Contents

Acknowledgements
List of Tables
List of Figures
Abstract

1 Introduction
1.1 Notation and Nomenclature
1.2 Introductory Examples

2 Data Structure
2.1 Standard Discrete Data Structure Formats
2.2 The Six Functional Representations
2.3 Algorithms for Fundamental Procedures

3 Order Statistics
3.1 Implementation for Discrete Populations
3.2 Examples
3.3 Range Statistics
3.4 Eliminating Resampling Error in Bootstrapping

4 Convolutions and Products
4.1 Conceptual Framework
4.2 Algorithm
4.3 Implementation
4.4 Examples
4.5 Products of Random Variables with Finite Supports

5 Transformations
5.1 Theory
5.2 Implementation
5.3 Applications

6 Minimums and Maximums
6.1 PDF of the Minimum
6.2 PDF of the Maximum

7 Algorithms for Operations on Continuous Distributions
7.1 Existence Conditions for PDFs
7.2 Method of Moments Estimation
7.3 Maximum Likelihood Estimation with Right Censoring
7.4 Mixture and Truncate Procedures

8 Survival Distributions Satisfying Benford's Law
8.1 Benford's Law
8.2 Parametric Survival Distributions
8.3 Conditions for Conformance to Benford's Law
8.4 Variate Generation
8.5 Conclusions

9 Input Modeling
9.1 Examples
9.2 Further Work

10 APPLications
10.1 Kolmogorov-Smirnov Test Statistic for Estimated Parameters
10.2 Others

11 Future Work

A Algorithm for OrderStat
B Maple Code for NextCombination and NextPermutation
C Determining Candidate Sums for the Heap
D Algorithm for BruteForceMethod
E Algorithm for MovingHeapMethod
F APPL Code for Benford

Bibliography

Acknowledgements

I would like to thank: my committee members, Dr. Drew, Dr. Kincaid, Dr. Manos, and Dr. Lawrence, for their careful reading of my dissertation, for their suggestions, and for being great instructors, both in and out of the classroom; Dr. Andrew Glen for allowing me to become part of "APPL" and showing me the ropes of becoming a Maple programmer; the Operations Research faculty at The College of William & Mary for outstanding instruction and a strong probability and statistics foundation; Dr. Frank Carroll for being my mentor, friend, and mathematical "sounding board" for many years; the Clare Boothe Luce Foundation for the generous fellowship that allowed me to continue my education and gave me the freedom to delve into my research; and Dr. Larry Leemis for being himself: an excellent teacher, researcher, and advisor.
I have spent three of the best years of my life working with him and will always admire and respect him in many ways. The time he has spent with me will always be appreciated, and I hope that someday I may also make such a positive impact, mathematically and otherwise, on another person's life.

List of Tables

1.1 Observed horse kick fatalities
2.1 Discrete random variable support categories
2.2 The six functional representations of a random variable X
2.3 Distribution representation relationships
3.1 Categorization of discrete order statistics with associated examples
3.2 Rat survival data
3.3 Bootstrap estimates of the standard error of the median
3.4 Bootstrap estimates of the standard error of the mean
4.1 Comparison of BruteForceMethod and MovingHeapMethod
4.2 Probability table for a convolution
4.3 The exact probabilities and normal PDF approximations for Pr(S = s) for s = 7, 8, ..., 21
5.1 Categories for computing the PDF of the random variable Y = g(X) when X is a discrete random variable with support Ω_X in a Dot support format
5.2 Life tests on a three-component system
Recommended publications
  • A Tail Quantile Approximation Formula for the Student T and the Symmetric Generalized Hyperbolic Distribution
A service of econstor, Leibniz-Informationszentrum Wirtschaft (Leibniz Information Centre for Economics, zbw). Schlüter, Stephan; Fischer, Matthias J. Working Paper: A tail quantile approximation formula for the Student t and the symmetric generalized hyperbolic distribution. IWQW Discussion Papers, No. 05/2009. Provided in cooperation with: Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics. Suggested citation: Schlüter, Stephan; Fischer, Matthias J. (2009): A tail quantile approximation formula for the student t and the symmetric generalized hyperbolic distribution, IWQW Discussion Papers, No. 05/2009, Friedrich-Alexander-Universität Erlangen-Nürnberg, Institut für Wirtschaftspolitik und Quantitative Wirtschaftsforschung (IWQW), Nürnberg. This version is available at: http://hdl.handle.net/10419/29554. Terms of use: Documents in EconStor may be saved and copied for your personal and scholarly purposes. You are not to copy documents for public or commercial purposes, to exhibit the documents publicly, to make them publicly available on the internet, or to distribute or otherwise use the documents in public. If the documents have been made available under an Open Content Licence (especially Creative Commons Licences), you may exercise further usage rights as specified in the indicated licence.
  • Generating Random Samples from User-Defined Distributions
The Stata Journal (2011) 11, Number 2, pp. 299-304. Generating random samples from user-defined distributions. Katarína Lukácsy, Central European University, Budapest, Hungary. lukacsy [email protected]

Abstract. Generating random samples in Stata is very straightforward if the distribution drawn from is uniform or normal. With any other distribution, an inverse method can be used; but even in this case, the user is limited to the built-in functions. For any other distribution function, its inverse must be derived analytically, or numerical methods must be used if analytical derivation of the inverse function is tedious or impossible. In this article, I introduce a command that generates a random sample from any user-specified distribution function using numeric methods, which makes this command very generic.

Keywords: st0229, rsample, random sample, user-defined distribution function, inverse method, Monte Carlo exercise

1 Introduction

In statistics, a probability distribution identifies the probability of a random variable. If the random variable is discrete, it identifies the probability of each value of this variable. If the random variable is continuous, it defines the probability that this variable's value falls within a particular interval. The probability distribution describes the range of possible values that a random variable can attain, further referred to as the support interval, and the probability that the value of the random variable is within any (measurable) subset of that range. Random sampling refers to taking a number of independent observations from a probability distribution. Typically, the parameters of the probability distribution (often referred to as true parameters) are unknown, and the aim is to retrieve them using various estimation methods on the random sample generated from this probability distribution.
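The numeric inverse method described in the abstract can be sketched in a few lines of Python (a hedged illustration, not the actual Stata `rsample` command; the function name, the bisection bracket, and the Exponential(2) test case are assumptions chosen for the example):

```python
import math
import random

def rsample(cdf, n, lo=0.0, hi=1e6, tol=1e-10):
    """Draw n variates from a distribution given only its CDF, by
    numerically inverting the CDF with bisection: find x with cdf(x) = u."""
    out = []
    for _ in range(n):
        u = random.random()
        a, b = lo, hi
        while b - a > tol:
            m = 0.5 * (a + b)
            if cdf(m) < u:
                a = m
            else:
                b = m
        out.append(0.5 * (a + b))
    return out

# Exponential(rate=2) has a known inverse, so it makes a convenient check.
random.seed(1)
sample = rsample(lambda x: 1.0 - math.exp(-2.0 * x), 10_000)
print(sum(sample) / len(sample))  # should land near the true mean, 0.5
```

Because only pointwise CDF evaluations are needed, the same routine works for any monotone user-supplied distribution function, which is exactly the genericity the article claims for its command.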
  • Random Variables Generation
Random Variables Generation. Revised version of the slides based on the book Discrete-Event Simulation: A First Course, L.L. Leemis & S.K. Park, Sections 6.1, 6.2, 7.1, 7.2. © 2006 Pearson Ed., Inc., 0-13-142917-5.

Introduction. Monte Carlo simulators differ from trace-driven simulators because they use random number generators to represent the variability that affects the behavior of real systems. Uniformly distributed random variables are the most elementary representations that we can use in Monte Carlo simulation, but they are not enough to capture the complexity of real systems. We must thus devise methods for generating instances (variates) of arbitrary random variables. By properly using uniform random numbers, it is possible to obtain this result. In the sequel we will first recall some basic properties of discrete and continuous random variables, and then we will discuss several methods for obtaining their variates.

Basic Probability Concepts. Empirical probability derives from performing an experiment many times, say n, and counting the number of occurrences n_a of an event A. The relative frequency of occurrence of the event is n_a/n. The frequency theory of probability asserts that the relative frequency converges as n → ∞:

Pr(A) = lim_{n→∞} n_a / n.

Axiomatic probability is a formal, set-theoretic approach: mathematically construct the sample space and calculate the number of events A. The two are complementary.
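The relative-frequency convergence stated above is easy to demonstrate empirically; a minimal Python sketch (the die-roll event and the sample sizes are illustrative choices, not from the slides):

```python
import random

random.seed(0)
freqs = {}
for n in (100, 10_000, 1_000_000):
    # Event A: a fair six-sided die shows 1 or 2, so Pr(A) = 1/3.
    na = sum(1 for _ in range(n) if random.randint(1, 6) <= 2)
    freqs[n] = na / n
    print(n, freqs[n])  # the relative frequency n_a/n approaches 1/3
```

As n grows, the relative frequency settles ever closer to the axiomatic value 1/3, which is the frequency theory's claim in miniature.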
  • On Half-Cauchy Distribution and Process
International Journal of Statistika and Mathematika, ISSN: 2277-2790, E-ISSN: 2249-8605, Volume 3, Issue 2, 2012, pp. 77-81. On Half-Cauchy Distribution and Process. Elsamma Jacob (Malabar Christian College, Calicut, Kerala, INDIA), K. Jayakumar (Department of Statistics, University of Calicut, Kerala, INDIA). Corresponding addresses: [email protected], [email protected]. Research Article.

Abstract: A new form of half-Cauchy distribution using the Marshall-Olkin transformation is introduced. The properties of the new distribution, such as its density, cumulative distribution function, quantiles, measure of skewness, and distribution of the extremes, are obtained. Time series models with the half-Cauchy distribution as stationary marginal distribution have not been developed so far. We develop a first-order autoregressive process with the new distribution as stationary marginal distribution, and the properties of the process are studied. Applications of the distribution in various fields are also discussed. Keywords: Autoregressive Process, Geometric Extreme Stable, Half-Cauchy Distribution, Skewness, Stationarity, Quantiles.

1. Introduction: The half-Cauchy (HC) distribution is derived from the standard Cauchy distribution by folding the curve at the origin so that only positive values can be observed. A continuous random variable X is said to have the half-Cauchy distribution if its survival function is given by

S(x) = 1 − (2/π) tan⁻¹(x), x > 0. (1.1)

Here the sine and cosine integrals are

si(ν) = −∫_ν^∞ (sin ξ / ξ) dξ,  ci(ν) = −∫_ν^∞ (cos ξ / ξ) dξ,  ν ≥ 0.

Remark 1.1. For the HC distribution the moments do not exist.
Remark 1.2. The HC distribution is infinitely divisible (Bondesson (1987)) and self-decomposable (Diedhiou (1998)).

Relationship with other distributions:
1. Let Y be a folded t variable with pdf given by

f(y) = [2 Γ((ν+1)/2) / (Γ(ν/2) √(νπ))] (1 + y²/ν)^{−(ν+1)/2}, y > 0, ν > 0. (1.5)

When ν = 1, (1.5) reduces to

f(y) = 2 / (π(1 + y²)), y > 0.

Thus, the HC distribution coincides with the folded t distribution with ν = 1 degree of freedom.
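The survival function (1.1) gives the CDF F(x) = (2/π) tan⁻¹(x), whose inverse is tan(πu/2), so half-Cauchy variates can be generated by the inverse-CDF method. A small Python sketch to check this numerically (function names and the sample size are illustrative assumptions; since the moments do not exist, the check uses the median, tan(π/4) = 1, rather than the mean):

```python
import math
import random

def half_cauchy_sf(x):
    """Survival function of the standard half-Cauchy: 1 - (2/pi) arctan(x)."""
    return 1.0 - (2.0 / math.pi) * math.atan(x)

def half_cauchy_rvs(n, rng):
    """Inverse-CDF sampling: F^{-1}(u) = tan(pi * u / 2)."""
    return [math.tan(math.pi * rng.random() / 2.0) for _ in range(n)]

rng = random.Random(42)
xs = sorted(half_cauchy_rvs(100_001, rng))
print(xs[50_000])           # sample median; the true median is tan(pi/4) = 1
print(half_cauchy_sf(1.0))  # survival at the median, which should be 0.5
```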
  • Handbook on Probability Distributions
R-powered R-forge project. Handbook on Probability Distributions. R-forge distributions Core Team, 2009-2010. LaTeX-powered; edited with Mac OS's TeXShop.

Contents:
Introduction
I Discrete distributions
1 Classic discrete distributions
2 Not-so-common discrete distributions
II Continuous distributions
3 Finite support distributions
4 The Gaussian family
5 Exponential distribution and its extensions
6 Chi-squared distribution and related extensions
7 Student and related distributions
8 Pareto family
9 Logistic distribution and related extensions
10 Extreme Value Theory distributions
III Multivariate and generalized distributions
11 Generalization of common distributions
12 Multivariate distributions
13 Misc
Conclusion
Bibliography
A Mathematical tools

Introduction: This guide is intended to provide a fairly exhaustive view of probability distributions. It is constructed in chapters by distribution family, with a section for each distribution. Each section focuses on the triptych definition - estimation - application. The ultimate bibles for probability distributions are Wimmer & Altmann (1999), which lists 750 univariate discrete distributions, and Johnson et al. (1994), which details continuous distributions. In the appendix, we recall the basics of probability distributions as well as "common" mathematical functions; cf. section A.2.
For all distributions, we use the following notation:
• X, a random variable following a given distribution;
• x, a realization of this random variable;
• f, the density function (if it exists);
• F, the (cumulative) distribution function;
• P(X = k), the probability mass function at k;
• M, the moment generating function (if it exists);
• G, the probability generating function (if it exists);
• φ, the characteristic function (if it exists).
Finally, all graphics are produced with the open-source statistical software R and its numerous packages available on the Comprehensive R Archive Network (CRAN).
  • 5 Introduction to the Theory of Order Statistics and Rank Statistics • This
5 Introduction to the Theory of Order Statistics and Rank Statistics

• This section contains a summary of important definitions and theorems that are useful for understanding the theory of order and rank statistics. In particular, results are presented for linear rank statistics.
• Many nonparametric tests are based on test statistics that are linear rank statistics.
  - For one sample: the Wilcoxon Signed-Rank Test is based on a linear rank statistic.
  - For two samples: the Mann-Whitney-Wilcoxon Test, the Median Test, the Ansari-Bradley Test, and the Siegel-Tukey Test are based on linear rank statistics.
• Most of the information in this section can be found in Randles and Wolfe (1979).

5.1 Order Statistics

• Let X_1, X_2, ..., X_n be a random sample of continuous random variables having cdf F(x) and pdf f(x).
• Let X_(i) be the i-th smallest random variable (i = 1, 2, ..., n).
• X_(1), X_(2), ..., X_(n) are referred to as the order statistics for X_1, X_2, ..., X_n. By definition, X_(1) < X_(2) < ... < X_(n).

Theorem 5.1: Let X_(1) < X_(2) < ... < X_(n) be the order statistics for a random sample from a distribution with cdf F(x) and pdf f(x). The joint density for the order statistics is

g(x_(1), x_(2), ..., x_(n)) = n! ∏_{i=1}^n f(x_(i)) for −∞ < x_(1) < x_(2) < ... < x_(n) < ∞ (16)
and 0 otherwise.

Theorem 5.2: The marginal density for the j-th order statistic X_(j) (j = 1, 2, ..., n) is

g_j(t) = [n! / ((j−1)!(n−j)!)] [F(t)]^{j−1} [1 − F(t)]^{n−j} f(t), −∞ < t < ∞.

• For a random variable X with cdf F(x), the inverse distribution F^{−1}(·) is defined as F^{−1}(y) = inf{x : F(x) ≥ y}, 0 < y < 1.
• If F(x) is strictly increasing between 0 and 1, then there is only one x such that F(x) = y.
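For Uniform(0,1) samples, the marginal density in Theorem 5.2 is a Beta(j, n − j + 1) density, so E[X_(j)] = j/(n + 1). A quick Python simulation agrees with this (the choice n = 5, j = 2 and the repetition count are illustrative):

```python
import random

def order_stat_mean(rng, n, j, reps):
    """Empirical mean of the j-th order statistic of n Uniform(0,1) draws."""
    total = 0.0
    for _ in range(reps):
        xs = sorted(rng.random() for _ in range(n))
        total += xs[j - 1]
    return total / reps

rng = random.Random(7)
n, j = 5, 2
est = order_stat_mean(rng, n, j, 200_000)
print(est)  # theory: Beta(j, n-j+1) mean = j/(n+1) = 2/6, about 0.3333
```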
• Introduction to Monte Carlo Simulations - Ebrahim Shayesteh
F19: Introduction to Monte Carlo Simulations. Ebrahim Shayesteh.

Agenda:
• Introduction and repetition
• Monte Carlo methods: background, introduction, motivation
• Example 1: Buffon's needle
• Simple sampling
• Example 2: Travel time from A to B
• Accuracy: variance reduction techniques
• VRT 1: Complementary random numbers
• Example 3: DC OPF problem

Repetition, fault models (note: "non-repairable systems"). A summary of functions describing the stochastic variable T (the time at which a failure occurs):
• Cumulative distribution function: F(t) = P(T ≤ t)
• Survivor function: R(t) = P(T > t) = 1 − F(t)
• Probability density function: f(t) = F′(t) = −R′(t)
• Failure rate: z(t) = f(t)/R(t) = f(t)/(1 − F(t))

Repetition, repairable systems ("alternating renewal process"): the state X(t) alternates between 1 (up, during times T_1, T_2, T_3, ...) and 0 (down, during times D_1, D_2, D_3, ...).
• Mean Time To Failure (MTTF)
• Mean Down Time (MDT)
• Mean Time To Repair (MTTR): sometimes, but not always, the same as MDT
• Mean Time Between Failures (MTBF = MTTF + MDT)

The "availability" of a unit is defined as the probability that the unit is operational at a given time t. Note: if the unit cannot be repaired, A(t) = R(t); if the unit can be repaired, the availability will depend both on the lifetime distribution and on the repair time.

The share of time during which the unit has been working is thus [(1/n) Σ_{i=1}^n T_i] / [(1/n) Σ_{i=1}^n T_i + (1/n) Σ_{i=1}^n D_i]. As n → ∞, this results in the average availability

A_av = E[T] / (E[T] + E[D]) = MTTF / (MTTF + MDT).
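The long-run availability MTTF/(MTTF + MDT) can be verified by simulating an alternating renewal process; a Python sketch under the assumption of exponentially distributed up and down times (the parameter values and cycle count are illustrative):

```python
import random

def availability(rng, mttf, mdt, cycles):
    """Simulate an alternating renewal process with exponential up and down
    times and estimate long-run availability as the share of up-time."""
    up = sum(rng.expovariate(1.0 / mttf) for _ in range(cycles))
    down = sum(rng.expovariate(1.0 / mdt) for _ in range(cycles))
    return up / (up + down)

rng = random.Random(3)
a = availability(rng, mttf=100.0, mdt=5.0, cycles=50_000)
print(a)  # theory: MTTF / (MTTF + MDT) = 100/105, about 0.9524
```

Only the means of the up and down times enter the limit, so any lifetime and repair distributions with these means would give the same long-run availability.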
  • Lecture 3: Probability Metrics
Lecture 3: Probability metrics. Prof. Dr. Svetlozar Rachev, Institute for Statistics and Mathematical Economics, University of Karlsruhe. Portfolio and Asset Liability Management, Summer Semester 2008.

Copyright: these lecture notes cannot be copied and/or distributed without permission. The material is based on the textbook: Svetlozar T. Rachev, Stoyan Stoyanov, and Frank J. Fabozzi, Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization: The Ideal Risk, Uncertainty, and Performance Measures, John Wiley, Finance, 2007. Prof. Svetlozar (Zari) T. Rachev, Chair of Econometrics, Statistics and Mathematical Finance, School of Economics and Business Engineering, University of Karlsruhe, Kollegium am Schloss, Bau II, 20.12, R210, Postfach 6980, D-76128 Karlsruhe, Germany. Tel. +49-721-608-7535, +49-721-608-2042(s). Fax: +49-721-608-3811. http://www.statistik.uni-karslruhe.de

Introduction. The theory of probability metrics came from investigations related to limit theorems in probability theory. A well-known example is the Central Limit Theorem (CLT), but there are many other limit theorems, such as the Generalized CLT, the max-stable CLT, functional limit theorems, etc. The limit law can be regarded as an approximation to the stochastic model considered and, therefore, can be accepted as an approximate substitute. How large an error do we make by adopting the approximate model? This question can be investigated by studying the distance between the limit law and the stochastic model and whether it is, for example, sum or maxima of i.i.d.
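The distance between a stochastic model and its limit law can be made concrete with the Kolmogorov (uniform) metric; a Python sketch for the classical CLT case (the uniform summands, sample sizes, and function names are illustrative assumptions, not from the lecture):

```python
import math
import random

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kolmogorov_distance(sample, cdf):
    """sup_x |F_n(x) - F(x)|, evaluated at the jumps of the empirical CDF."""
    xs = sorted(sample)
    n = len(xs)
    return max(max(abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
               for i, x in enumerate(xs))

rng = random.Random(9)

def std_sum(k):
    """A standardized sum of k Uniform(0,1) variables; approaches N(0,1)."""
    s = sum(rng.random() for _ in range(k))
    return (s - k / 2.0) / math.sqrt(k / 12.0)

dists = {}
for k in (1, 2, 12):
    sample = [std_sum(k) for _ in range(20_000)]
    dists[k] = kolmogorov_distance(sample, normal_cdf)
    print(k, dists[k])  # the distance to the limit law shrinks as k grows
```

The shrinking distance quantifies exactly "how large an error we make" by substituting the normal limit for the true distribution of the sum.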
  • A Note on Generalized Inverses
    A note on generalized inverses Paul Embrechts1, Marius Hofert2 2014-02-17 Abstract Motivated by too restrictive or even incorrect statements about generalized inverses in the literature, properties about these functions are investigated and proven. Examples and counterexamples show the importance of generalized inverses in mathematical theory and its applications. Keywords Increasing function, generalized inverse, distribution function, quantile function. MSC2010 60E05, 62E15, 26A48. 1 Introduction It is well known that a real-valued, continuous, and strictly monotone function of a single variable possesses an inverse on its range. It is also known that one can drop the assumptions of continuity and strict monotonicity (even the assumption of considering points in the range) to obtain the notion of a generalized inverse. Generalized inverses play an important role in probability theory and statistics in terms of quantile functions, and in financial and insurance mathematics, for example, as Value-at-Risk or return period. Generalized inverses of increasing functions which are not necessarily distribution functions also frequently appear, for example, as transformations of random variables. In particular, proving the famous invariance principle of copulas under strictly increasing transformations on the ranges of the underlying random variables involves such transformations. One can often work with generalized inverses as one does with ordinary inverses. To see this, one has to have several properties about generalized inverses at hand. Although these properties are often stated in the literature, one rarely finds detailed proofs of these results. Moreover, some of the statements found and often referred to are incorrect. The main goal of this paper is therefore to state and prove important properties about generalized inverses of increasing functions.
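For a discrete distribution function, the ordinary inverse does not exist, but the generalized inverse F^−(y) = inf{x : F(x) ≥ y} does; a minimal Python sketch (the support points and probabilities are made-up illustrations):

```python
import bisect

def generalized_inverse(xs, ps):
    """Generalized inverse (quantile function) of a discrete CDF:
    F^-(y) = inf{x : F(x) >= y}, for sorted support xs with probabilities ps."""
    cum = []
    total = 0.0
    for p in ps:
        total += p
        cum.append(total)
    def quantile(y):
        i = bisect.bisect_left(cum, y)  # first index with F(x_i) >= y
        return xs[i]
    return quantile

q = generalized_inverse([1, 2, 5], [0.25, 0.5, 0.25])
print(q(0.25), q(0.5), q(0.75), q(0.9))  # -> 1 2 2 5
```

Note how the flat piece of the CDF between F(2) = 0.75 and F(5) = 1 causes no trouble: the infimum picks the left endpoint, which is precisely the behavior the generalized-inverse definition is designed to guarantee.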
  • Introduction to Stochastic Processes Frans Willekens 19 October 2015
Introduction to Stochastic Processes. Frans Willekens. 19 October 2015.

Overview. Actions of agents and interactions between agents cannot be predicted with certainty, even if we know a lot about an actor, his or her social network, and the contextual factors that could trigger a need or desire to act. Decisions to act are made under uncertainty. Agent-based models (ABM) should account for the uncertainties and the impact of chance on decision outcomes. As a consequence, agent-based models should be probability models. Random variables constitute the elementary building blocks of a probability model. Random variables may take on a finite number of values (discrete random variable) or an infinite number of values (continuous random variable). The likelihood of a value or range of values is expressed as probabilities. Each random variable is characterized by a probability distribution. A distinction is made between empirical (observed) distributions and theoretical distributions. The normal distribution, the exponential distribution, the binomial/multinomial distribution, and the Poisson distribution are common probability distributions. The waiting time to an action or interaction is a random variable, characterized by a waiting time distribution. The exponential distribution, the Gompertz distribution, the extreme value distribution, and the gamma distribution are used regularly in demography. The outcome of an action is a random variable too. If the outcome is a continuous variable (e.g. reward), possible values are described by a probability density function. If the number of possible outcomes is finite, which is often the case in demography and social sciences, the random variable is discrete and the distribution of the likelihood of each value is the probability mass function.
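The exponential waiting-time distribution mentioned above, including its defining memoryless property, can be illustrated in a few lines of Python (the rate and sample size are arbitrary choices for the example):

```python
import random

rng = random.Random(11)
rate = 0.5  # events per unit time, so the mean waiting time is 1/rate = 2
waits = [rng.expovariate(rate) for _ in range(200_000)]

mean_wait = sum(waits) / len(waits)
print(mean_wait)  # should be close to 2.0

# Memorylessness: P(T > s + t | T > s) = P(T > t) for exponential T.
s, t = 1.0, 2.0
cond = sum(1 for w in waits if w > s + t) / sum(1 for w in waits if w > s)
uncond = sum(1 for w in waits if w > t) / len(waits)
print(cond, uncond)  # the two estimates should be close
```

Memorylessness is what makes the exponential the natural waiting-time model when the propensity to act does not change with elapsed time; the Gompertz and gamma distributions relax exactly this assumption.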
  • Gain Control Explains the Effect of Distraction in Human Perceptual, Cognitive, and Economic Decision Making
Gain control explains the effect of distraction in human perceptual, cognitive, and economic decision making. Vickie Li, Elizabeth Michael, Jan Balaguer, Santiago Herce Castañón, and Christopher Summerfield. Department of Experimental Psychology, University of Oxford, OX2 6GG Oxford, United Kingdom; Department of Psychology, University of Cambridge, CB2 3EB Cambridge, United Kingdom; Department of Psychology and Educational Sciences, University of Geneva, 1202 Geneva, Switzerland. Edited by Randolph Blake, Vanderbilt University, Nashville, TN, and approved July 3, 2018 (received for review March 26, 2018). When making decisions, humans are often distracted by irrelevant information. Distraction has a different impact on perceptual, cognitive, and value-guided choices, giving rise to well-described behavioral phenomena such as the tilt illusion, conflict adaptation, or economic decoy effects. However, a single, unified model that can account for all these phenomena has yet to emerge. Here, we offer one such account, based on adaptive gain control, and additionally show that it successfully predicts a range of counterintuitive new behavioral phenomena on variants of a classic cognitive paradigm, the Eriksen flanker task. Economists have charted the irrational influence that a decoy alternative of value Z has on choices between two choice-relevant prospects X and Y, where X > Y (10-13). A common finding is that rational choices (i.e., for X > Y) initially decline as Z increases in value but then increase sharply as Z comes to approximately match the other two items in value (Fig. 1C); other stereotypical decoy effects are observed when alternatives are characterized by more than one attribute (discussed below).
  • Stratified Random Sampling for Dependent Inputs
Stratified Random Sampling for Dependent Inputs

Anirban Mondal, Case Western Reserve University, Cleveland, OH 44106, USA
Abhijit Mandal, Wayne State University, Detroit, MI 48202, USA
April 2, 2019
arXiv:1904.00555v1 [stat.ME] 1 Apr 2019

Abstract: A new approach for obtaining stratified random samples from statistically dependent random variables is described. The proposed method can be used to obtain samples from the input space of a computer forward model in estimating expectations of functions of the corresponding output variables. The advantage of the proposed method over existing methods is that it preserves the exact form of the joint distribution of the input variables. The asymptotic distribution of the new estimator is derived. Asymptotically, the variance of the estimator using the proposed method is less than that obtained using simple random sampling, with the degree of variance reduction depending on the degree of additivity in the function being integrated. The technique is applied to a practical example related to the performance of a river flood inundation model.

Keywords: Stratified sampling; Latin hypercube sampling; Variance reduction; Sampling with dependent random variables; Monte Carlo simulation.

1 Introduction

Mathematical models are widely used by engineers and scientists to describe physical, economic, and social processes. Often these models are complex in nature and are described by a system of ordinary or partial differential equations which cannot be solved analytically. Given the values of the input parameters of the processes, complex computer codes are widely used to solve such systems numerically, providing the corresponding outputs. These computer models, also known as forward models, are used for prediction, uncertainty analysis, sensitivity analysis, and model calibration.
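The variance reduction that stratification buys can be demonstrated in the simple independent-input case; the Python sketch below compares simple and stratified Monte Carlo for a one-dimensional integral (this is a textbook illustration with equal-probability strata, not the authors' dependent-input method; the test integrand and constants are arbitrary):

```python
import random

def simple_mc(rng, g, n):
    """Plain Monte Carlo estimate of the integral of g over [0, 1]."""
    return sum(g(rng.random()) for _ in range(n)) / n

def stratified_mc(rng, g, n):
    """One draw per equal-probability stratum [i/n, (i+1)/n)."""
    return sum(g((i + rng.random()) / n) for i in range(n)) / n

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

rng = random.Random(5)

def g(u):
    return u * u  # true integral over [0, 1] is 1/3

reps, n = 2000, 100
sv = [simple_mc(rng, g, n) for _ in range(reps)]
st = [stratified_mc(rng, g, n) for _ in range(reps)]
print(variance(sv), variance(st))  # the stratified variance is far smaller
```

The gain is largest for smooth, nearly additive integrands, which mirrors the paper's point that the degree of variance reduction depends on the degree of additivity in the function being integrated.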