Contributions to Statistical Distribution Theory with Applications
Recommended publications
-
Power Comparisons of the Rician and Gaussian Random Fields Tests for Detecting Signal from Functional Magnetic Resonance Images Hasni Idayu Binti Saidi
University of Northern Colorado, Scholarship & Creative Works @ Digital UNC, Dissertations, August 2018. Recommended citation: Saidi, Hasni Idayu Binti, "Power Comparisons of the Rician and Gaussian Random Fields Tests for Detecting Signal from Functional Magnetic Resonance Images" (2018). Dissertations. 507. https://digscholarship.unco.edu/dissertations/507. A dissertation submitted in partial fulfillment of the requirement for the degree of Doctor of Philosophy, Department of Applied Statistics and Research Methods, College of Education and Behavioral Sciences. Research advisor: Khalil Shafie Holighi, Ph.D.; committee members: Trent Lalonde, Ph.D., and Jay Schaffer, Ph.D.; faculty representative: Heng-Yu Ku, Ph.D.
-
The Poisson Burr X Inverse Rayleigh Distribution and Its Applications
Journal of Data Science, 18(1), p. 56–77, 2020. DOI: 10.6339/JDS.202001_18(1).0003. The Poisson Burr X Inverse Rayleigh Distribution and Its Applications. Rania H. M. Abdelkhalek*, Department of Statistics, Mathematics and Insurance, Benha University, Egypt. * [email protected]
ABSTRACT: A new flexible extension of the inverse Rayleigh model is proposed and studied. Some of its fundamental statistical properties are derived. We assessed the performance of the maximum likelihood method via a simulation study. The importance of the new model is shown via three applications to real data sets. The new model is much better than other important competitive models. Keywords: Inverse Rayleigh; Poisson; Simulation; Modeling.
1. Introduction and physical motivation. The well-known inverse Rayleigh (IR) model is considered as a distribution for a lifetime random variable (r.v.). The IR distribution has many applications in the area of reliability studies. Voda (1972) proved that the distribution of lifetimes of several types of experimental (Exp) units can be approximated by the IR distribution and studied some properties of the maximum likelihood estimation (MLE) of its parameter. Mukerjee and Saran (1984) studied the failure rate of an IR distribution. Aslam and Jun (2009) introduced group acceptance sampling plans for truncated life tests based on the IR. Soliman et al. (2010) studied the Bayesian and non-Bayesian estimation of the parameter of the IR model. Dey (2012) mentioned that the IR model has also been used as a failure time distribution, although its variance and higher-order moments do not exist. -
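The one-parameter IR distribution has the closed-form CDF F(x; θ) = exp(−θ/x²), which makes both inverse-transform sampling and the MLE of θ one-liners: setting the score of the log-likelihood to zero gives θ̂ = n / Σ 1/xᵢ². A minimal sketch of that simulation-plus-estimation workflow (the function names are ours, not the paper's):

```python
import math
import random

def ir_cdf(x, theta):
    """CDF of the inverse Rayleigh distribution: F(x) = exp(-theta / x^2)."""
    return math.exp(-theta / x**2)

def ir_sample(theta, n, rng):
    """Inverse-transform sampling: solving u = exp(-theta/x^2) for x
    gives x = sqrt(-theta / ln(u)), with u uniform on (0, 1]."""
    return [math.sqrt(-theta / math.log(1.0 - rng.random())) for _ in range(n)]

def ir_mle(xs):
    """Closed-form MLE: theta_hat = n / sum(1 / x_i^2)."""
    return len(xs) / sum(1.0 / x**2 for x in xs)

rng = random.Random(0)
theta = 2.0
xs = ir_sample(theta, 20_000, rng)
theta_hat = ir_mle(xs)  # should recover theta up to Monte Carlo error
```

With 20,000 draws the estimator lands within a few percent of the true θ, consistent with the simulation study the abstract describes for the extended model.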
Incorporating Extreme Events Into Risk Measurement
Lecture notes on risk management, public policy, and the financial system. Incorporating Extreme Events into Risk Measurement. Allan M. Malz, Columbia University. © 2021 Allan M. Malz. Last updated: July 25, 2021.
Outline: stress testing and scenario analysis; expected shortfall; extreme value theory.
What are stress tests? Stress tests analyze performance under extreme loss scenarios (heuristic portfolio analysis). Steps in carrying out a stress test: 1. Determine appropriate scenarios. 2. Calculate shocks to risk factors in each scenario. 3. Value the portfolio in each scenario. Objectives of stress testing: address tail risk; reduce model risk by reducing reliance on models; "know the book", since stress tests can reveal vulnerabilities in specific positions or groups of positions. Criteria for appropriate stress scenarios: they should be tailored to the firm's specific key vulnerabilities, should avoid assumptions that favor the firm (e.g. competitive advantages in a crisis), and should be extreme but not implausible. Approaches to formulating stress scenarios: historical -
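The three steps of a stress test reduce to a small computation once each scenario is expressed as a set of shocks to risk factors. A toy sketch (the portfolio, factor names, and shock sizes are hypothetical illustrations, not figures from the notes):

```python
# Hypothetical positions: dollar exposure to each risk factor.
portfolio = {"equities": 1_000_000, "credit": 500_000, "rates": -250_000}

# Step 1: determine appropriate scenarios (factor returns under stress).
scenarios = {
    "2008-style crisis": {"equities": -0.40, "credit": -0.15, "rates": -0.02},
    "rate shock":        {"equities": -0.10, "credit": -0.05, "rates": 0.03},
}

# Steps 2-3: apply the shocks to each factor and revalue the portfolio.
def stress_pnl(portfolio, shocks):
    """P&L of the portfolio under one scenario's factor shocks."""
    return sum(portfolio[f] * shocks.get(f, 0.0) for f in portfolio)

results = {name: stress_pnl(portfolio, s) for name, s in scenarios.items()}
```

Note the short rates position gains in the crisis scenario; a line-by-line view like this is what "know the book" refers to, revealing which positions drive the loss in each scenario.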
New Proposed Length-Biased Weighted Exponential and Rayleigh Distribution with Application
Mathematical Theory and Modeling, www.iiste.org, ISSN 2224-5804 (Paper), ISSN 2225-0522 (Online), Vol. 4, No. 7, 2014. New Proposed Length-Biased Weighted Exponential and Rayleigh Distribution with Application. Kareema Abed Al-kadim (1) and Nabeel Ali Hussein (2), College of Education of Pure Sciences, University of Babylon/Hilla. 1. E-mail of the corresponding author: [email protected] 2. E-mail: [email protected]
Abstract: The concept of length-biased distribution can be employed in the development of proper models for lifetime data. The length-biased distribution is a special case of the more general form known as the weighted distribution. In this paper we introduce a new class of length-biased weighted exponential and Rayleigh distributions (LBWED, LBWRD). This paper surveys some of the possible uses of length-biased distributions, and we study some of their statistical properties with applications of these new distributions. Keywords: length-biased weighted Rayleigh distribution, length-biased weighted exponential distribution, maximum likelihood estimation.
1. Introduction. Fisher (1934) [1] introduced the concept of weighted distributions, and Rao (1965) [2] developed this concept in general terms in connection with modeling statistical data where the usual practice of using standard distributions was not found to be appropriate. Various situations can be modeled by weighted distributions: non-observability of some events, damage caused to the original observation resulting in a reduced value, or a sampling procedure that gives unequal chances to the units in the original sample. This means the recorded observations cannot be considered an original random sample. -
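Length-biasing reweights a density by size, f_LB(x) = x·f(x)/E[X], so larger units are proportionally more likely to be recorded. A quick simulation check of this mechanism (our own illustration, not code from the paper): for an Exp(λ) baseline the length-biased law is Gamma(2, λ), so the observed mean doubles from 1/λ to 2/λ.

```python
import random
import statistics

rng = random.Random(1)
lam = 1.0

# A large "population" of lifetimes from the Exp(lam) baseline.
population = [rng.expovariate(lam) for _ in range(100_000)]

# Length-biased observation: each unit is recorded with probability
# proportional to its size x, i.e. f_LB(x) = x * f(x) / E[X].
observed = rng.choices(population, weights=population, k=50_000)

# For Exp(lam) the length-biased law is Gamma(2, lam):
# observed mean approaches 2/lam rather than the population mean 1/lam.
pop_mean = statistics.fmean(population)
obs_mean = statistics.fmean(observed)
```

This is the "unequal chances to the units" situation the introduction describes: the recorded sample systematically overstates the population mean unless the length-biased model is used.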
Suitability of Different Probability Distributions for Performing Schedule Risk Simulations in Project Management
2016 Proceedings of PICMET '16: Technology Management for Social Innovation. Suitability of Different Probability Distributions for Performing Schedule Risk Simulations in Project Management. J. Krige Visser, Department of Engineering and Technology Management, University of Pretoria, Pretoria, South Africa.
Abstract: Project managers are often confronted with the question of what the probability is of finishing a project within budget or finishing a project on time. One method or tool that is useful in answering these questions at various stages of a project is to develop a Monte Carlo simulation for the cost or duration of the project and to update and repeat the simulations with actual data as the project progresses. The PERT method became popular in the 1950s to express the uncertainty in the duration of activities. Many other distributions are available for use in cost or schedule simulations. This paper discusses the results of a project to investigate the output of schedule simulations when different distributions, e.g. triangular, normal, lognormal or betaPert, are used to express the uncertainty.
The Project Evaluation and Review Technique (PERT), in conjunction with the Critical Path Method (CPM), was developed in the 1950s to address the uncertainty in project duration for complex projects [11], [13]. The expected or mean value for each activity of the project network was calculated by applying the beta distribution and three estimates for the duration of the activity. The total project duration was determined by adding the duration values of all the activities on the critical path. The Monte Carlo simulation (MCS) provides a distribution for the total project duration and is therefore more useful as a method or tool for decision making. -
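A schedule risk simulation of the kind described needs only a distribution per activity and repeated sampling of the critical-path sum. A minimal sketch using triangular distributions (the activity estimates and the 32-day target are invented for illustration):

```python
import random

# Hypothetical critical-path activities:
# (optimistic, most likely, pessimistic) duration estimates in days.
activities = [(4, 6, 10), (8, 10, 16), (2, 3, 5), (5, 7, 12)]

def simulate_duration(rng):
    """One Monte Carlo trial: sample each activity and sum along the path."""
    # random.triangular takes (low, high, mode).
    return sum(rng.triangular(lo, hi, mode) for lo, mode, hi in activities)

rng = random.Random(0)
durations = [simulate_duration(rng) for _ in range(20_000)]

mean_duration = sum(durations) / len(durations)
# Probability of finishing within a 32-day target, read off the simulation.
p_on_time = sum(d <= 32 for d in durations) / len(durations)
```

Swapping the triangular sampler for normal, lognormal or betaPert draws is exactly the comparison the paper investigates; the simulation loop itself is unchanged.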
University of Regina Lecture Notes Michael Kozdron
University of Regina. Statistics 441 – Stochastic Calculus with Applications to Finance. Lecture Notes, Winter 2009. Michael Kozdron, [email protected], http://stat.math.uregina.ca/~kozdron
List of Lectures and Handouts: #1 Introduction to Financial Derivatives; #2 Financial Option Valuation Preliminaries; #3 Introduction to MATLAB and Computer Simulation; #4 Normal and Lognormal Random Variables; #5 Discrete-Time Martingales; #6 Continuous-Time Martingales; #7 Brownian Motion as a Model of a Fair Game; #8 Riemann Integration; #9 The Riemann Integral of Brownian Motion; #10 Wiener Integration; #11 Calculating Wiener Integrals; #12 Further Properties of the Wiener Integral; #13 Itô Integration (Part I); #14 Itô Integration (Part II); #15 Itô's Formula (Part I); #16 Itô's Formula (Part II); #17 Deriving the Black–Scholes Partial Differential Equation; #18 Solving the Black–Scholes Partial Differential Equation; #19 The Greeks; #20 Implied Volatility; #21 The Ornstein–Uhlenbeck Process as a Model of Volatility; #22 The Characteristic Function for a Diffusion; #23 The Characteristic Function for Heston's Model; #24–26 Review; #27 Risk Neutrality; #28 A Numerical Approach to Option Pricing Using Characteristic Functions; #29 An Introduction to Functional Analysis for Financial Applications; #30 A Linear Space of Random Variables; #31 Value at Risk; #32 Monetary Risk Measures; #33 Risk Measures and their Acceptance Sets; #34 A Representation of Coherent Risk Measures; #35 Further Remarks on Value at Risk; #36 Midterm Review. Statistics 441 (Winter 2009), January 5, 2009, Prof. -
Benford's Law As a Logarithmic Transformation
Benford's Law as a Logarithmic Transformation. J. M. Pimbley, Maxwell Consulting, LLC. (J. M. Pimbley, "Benford's Law as a Logarithmic Transformation," Maxwell Consulting Archives, 2014.)
Abstract: We review the history and description of Benford's Law – the observation that many naturally occurring numerical data collections exhibit a logarithmic distribution of digits. By creating numerous hypothetical examples, we identify several cases that satisfy Benford's Law and several that do not. Building on prior work, we create and demonstrate a "Benford Test" to determine from the functional form of a probability density function whether the resulting distribution of digits will find close agreement with Benford's Law. We also discover that one may generalize the first-digit distribution algorithm to include non-logarithmic transformations. This generalization shows that Benford's Law rests on the tendency of the transformed and shifted data to exhibit a uniform distribution.
Introduction: What the world calls "Benford's Law" is a marvelous mélange of mathematics, data, philosophy, empiricism, theory, mysticism, and fraud. Even with all these qualifiers, one easily describes this law and then asks the simple questions that have challenged investigators for more than a century. Take a large collection of positive numbers which may be integers or real numbers or both. Focus only on the first (non-zero) digit of each number. Count how many numbers of the large collection have each of the nine possibilities (1-9) as the first digit. For typical number collections – which we'll generally call "datasets" – the first digit is not equally distributed among the values 1-9. -
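Benford's Law predicts first-digit frequencies P(d) = log₁₀(1 + 1/d), so digit 1 leads about 30.1% of the time and digit 9 only 4.6%. A quick check (our own illustration): powers of 2 grow geometrically, a classic example of a Benford-conforming dataset.

```python
import math
from collections import Counter

# Benford's predicted first-digit probabilities: P(d) = log10(1 + 1/d).
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(n):
    """First (non-zero) digit of a positive integer."""
    return int(str(n)[0])

# Powers of 2: geometric growth makes the first digits Benford-distributed.
data = [2**n for n in range(1, 1001)]
counts = Counter(first_digit(x) for x in data)
freq = {d: counts[d] / len(data) for d in range(1, 10)}
```

The logarithmic connection is visible in why this works: the fractional parts of n·log₁₀(2) are uniformly distributed, which is precisely the "transformed and shifted data exhibit a uniform distribution" observation of the abstract.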
1988: Logarithmic Series Distribution and Its Use in Analyzing Discrete Data
LOGARITHMIC SERIES DISTRIBUTION AND ITS USE IN ANALYZING DISCRETE DATA. Jeffrey R. Wilson, Arizona State University, Tempe, Arizona 85287.
1. Introduction. In a previous paper, Wilson and Koehler (1988) used the generalized-Dirichlet multinomial model to account for extra variation. The model allows for a second order of pairwise correlation among units, a type of assumption found reasonable in some biological data. In that paper, the two-way crossed generalized Dirichlet multinomial model was used to analyze repeated measures on the categorical preferences of insurance customers. The number of respondents was assumed to be fixed and known.
… is assumed to be π for any unit and any trial. Furthermore, for each unit the m trials are identical and independent, so upon summing across trials, X_j = (X_1j, X_2j, ..., X_Ij)', the vector of counts for the j-th unit, has a multinomial distribution with probability vector π = (π_1, π_2, ..., π_I)' and sample size m. However, responses given by the J units at a particular trial may be correlated, producing a set of J correlated multinomial random vectors X_1, X_2, ..., X_J. Tallis (1962) developed a model for this situation, which he called the generalized-multinomial distribution, in which a single parameter ρ is used to reflect the common dependency between any two of the dependent multinomial random vectors.
In this paper a generalization of the model is made, allowing the number of respondents m to be random. Thus both the number of units m and the underlying probability vector are allowed to vary. The model presented here uses the logarithmic series distribution to account for the category totals X_i. -
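The logarithmic series distribution has pmf P(K = k) = −pᵏ / (k·ln(1−p)) for k = 1, 2, … and mean −p / ((1−p)·ln(1−p)), which is what lets it model a random number of respondents. A quick numerical check of these two facts (our own illustration, not code from the paper):

```python
import math

def logser_pmf(k, p):
    """Logarithmic series pmf: P(K = k) = -p**k / (k * ln(1 - p)), k >= 1."""
    return -p**k / (k * math.log(1 - p))

p = 0.7

# The pmf sums to 1 (truncating the series where p**k is negligible) ...
total = sum(logser_pmf(k, p) for k in range(1, 200))

# ... and the mean matches the closed form -p / ((1 - p) * ln(1 - p)).
mean = sum(k * logser_pmf(k, p) for k in range(1, 200))
mean_closed_form = -p / ((1 - p) * math.log(1 - p))
```

The normalizing constant −1/ln(1−p) is where the distribution gets its name: the pmf terms are exactly the terms of the Taylor series of −ln(1−p).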
Final Paper (PDF)
Analytic Method for Probabilistic Cost and Schedule Risk Analysis. Final Report, 5 April 2013. Prepared for: National Aeronautics and Space Administration (NASA), Office of Program Analysis and Evaluation (PA&E), Cost Analysis Division (CAD). Felecia L. London, Contracting Officer, NASA Goddard Space Flight Center, Procurement Operations Division, Office for Headquarters Procurement, 210.H. Phone: 301-286-6693, Fax: 301-286-1746, e-mail: [email protected]. Contract Number: NNH10PR24Z, Order Number: NNH12PV48D. Prepared by: Raymond P. Covert, Covarus, LLC, under subcontract to Galorath Incorporated.
Table of contents: 1 Executive Summary. 2 Introduction. 2.1 Probabilistic Nature of Estimates. 2.2 Uncertainty and Risk. 2.2.1 Probability Density and Probability Mass. 2.2.2 Cumulative Probability. 2.2.3 Definition of Risk. 2.3 -
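The report's distinction between a probability density and a cumulative probability is exactly what a probabilistic cost model produces in practice: sampled totals and an empirical CDF that can be read at any confidence level. A small sketch (the cost model below is a made-up placeholder, not NASA's method):

```python
import bisect
import random

rng = random.Random(42)

# Hypothetical cost model: total cost as the sum of ten uncertain
# element costs, each drawn from a right-skewed lognormal.
samples = sorted(
    sum(rng.lognormvariate(0.0, 0.5) for _ in range(10)) for _ in range(20_000)
)

def cdf(samples, x):
    """Empirical cumulative probability P(cost <= x)."""
    return bisect.bisect_right(samples, x) / len(samples)

def percentile(samples, q):
    """Cost at confidence level q: the CDF read backwards."""
    return samples[int(q * (len(samples) - 1))]

p80 = percentile(samples, 0.80)  # e.g. a budget set at 80% confidence
```

Reading the CDF backwards like this is how a "cost at the 80th percentile" budget figure is produced from the simulated distribution.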
Thompson Sampling on Symmetric Α-Stable Bandits
Thompson Sampling on Symmetric α-Stable Bandits. Abhimanyu Dubey and Alex Pentland, Massachusetts Institute of Technology. {dubeya, pentland}@mit.edu
Abstract: Thompson Sampling provides an efficient technique to introduce prior knowledge in the multi-armed bandit problem, along with providing remarkable empirical performance. In this paper, we revisit the Thompson Sampling algorithm under rewards drawn from α-stable distributions, which are a class of heavy-tailed probability distributions utilized in finance and economics, in problems such as modeling stock prices and human behavior. We present an efficient framework for α-stable posterior inference, which leads to two algorithms for Thompson Sampling in this setting. We prove finite-time regret bounds for both algorithms, and demonstrate through a series of experiments the stronger performance of Thompson Sampling in this setting. With our results, we provide an exposition of α-stable distributions in sequential decision-making, and enable sequential Bayesian inference in applications from diverse fields in finance and complex systems that operate on heavy-tailed features.
1. Introduction. The multi-armed bandit (MAB) problem is a fundamental model in understanding the exploration-exploitation dilemma in sequential decision-making. The problem and several of its variants have been studied extensively over the years, and a number of algorithms have been proposed that optimally solve the bandit problem when the reward distributions are well-behaved, i.e. have a finite support, or are sub-exponential. The most prominently studied class of algorithms are the Upper Confidence Bound (UCB) algorithms, that employ an "optimism in the face of uncertainty" heuristic [ACBF02], which have been shown to be optimal (in terms of regret) in several cases [CGM+13, BCBL13]. -
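For intuition about the algorithm being revisited, here is a minimal Thompson Sampling loop for the much simpler Gaussian-reward case (unit variance, standard normal prior on each arm's mean), not the paper's α-stable posterior machinery: sample each arm's posterior, pull the argmax, and update conjugately.

```python
import random

def thompson_gaussian(means, horizon, rng):
    """Thompson Sampling with N(0, 1) priors and unit-variance Gaussian
    rewards; the conjugate posterior of arm i is N(s_i/(n_i+1), 1/(n_i+1))."""
    k = len(means)
    n = [0] * k       # number of pulls per arm
    s = [0.0] * k     # summed rewards per arm
    for _ in range(horizon):
        # Sample one draw from each arm's posterior and pull the best-looking arm.
        draws = [rng.gauss(s[i] / (n[i] + 1), (1.0 / (n[i] + 1)) ** 0.5)
                 for i in range(k)]
        i = draws.index(max(draws))
        r = rng.gauss(means[i], 1.0)  # observe a reward from the true arm
        n[i] += 1
        s[i] += r
    return n

rng = random.Random(0)
pulls = thompson_gaussian([0.0, 0.5, 1.0], 5_000, rng)  # arm 2 is best
```

The paper's contribution is replacing this conjugate Gaussian update with tractable posterior inference when rewards are heavy-tailed α-stable, where no such closed-form update exists.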
On Products of Gaussian Random Variables
On products of Gaussian random variables. Željka Stojanac (1), Daniel Suess (1), and Martin Kliesch (2). (1) Institute for Theoretical Physics, University of Cologne, Germany. (2) Institute of Theoretical Physics and Astrophysics, University of Gdańsk, Poland. May 29, 2018.
Sums of independent random variables form the basis of many fundamental theorems in probability theory and statistics, and therefore, are well understood. The related problem of characterizing products of independent random variables seems to be much more challenging. In this work, we investigate such products of normal random variables, products of their absolute values, and products of their squares. We compute power-log series expansions of their cumulative distribution function (CDF) based on the theory of Fox H-functions. Numerically we show that for small arguments the CDFs are well approximated by the lowest orders of this expansion. For the two non-negative random variables, we also compute the moment generating functions in terms of Meijer G-functions, and consequently, obtain a Chernoff bound for sums of such random variables. Keywords: Gaussian random variable, product distribution, Meijer G-function, Chernoff bound, moment generating function. AMS subject classifications: 60E99, 33C60, 62E15, 62E17.
1. Introduction and motivation. Compared to sums of independent random variables, our understanding of products is much less comprehensive. Nevertheless, products of independent random variables arise naturally in many applications including channel modeling [1, 2], wireless relaying systems [3], quantum physics (product measurements of product states), as well as signal processing. Here, we are particularly motivated by a tensor sensing problem (see Ref. [4] for the basic idea). In this problem we consider tensors T ∈ R^(n_1 × n_2 × ⋯ × n_d) and wish to recover them from measurements of the form y_i := ⟨A_i, T⟩, with the sensing tensors also being of rank one, A_i = a_i^1 ⊗ a_i^2 ⊗ ⋯ ⊗ a_i^d with a_i^j ∈ R^(n_j). -
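The heavy tails of a product Z = XY of independent standard normals are easy to see numerically: E[Z] = 0 and Var(Z) = 1, but E[Z⁴] = E[X⁴]·E[Y⁴] = 3·3 = 9, versus 3 for a unit-variance normal. A quick Monte Carlo check of these moments (our illustration, not the paper's H-function analysis):

```python
import random
import statistics

rng = random.Random(7)
n = 200_000

# Products of pairs of independent standard normal draws.
prods = [rng.gauss(0.0, 1.0) * rng.gauss(0.0, 1.0) for _ in range(n)]

mean = statistics.fmean(prods)          # should be near E[XY] = 0
var = statistics.pvariance(prods)       # should be near Var(XY) = 1
m4 = statistics.fmean(x**4 for x in prods)  # near E[(XY)^4] = 9, vs 3 for N(0,1)
```

The fourth moment of 9 (excess kurtosis 6) is a simple footprint of the heavier-than-Gaussian tails whose precise CDF expansion the paper derives.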
A New Family of Odd Generalized Nakagami (Nak-G) Distributions
TURKISH JOURNAL OF SCIENCE, http://dergipark.gov.tr/tjos, Volume 5, Issue 2, 85-101, ISSN: 2587-0971. A New Family of Odd Generalized Nakagami (Nak-G) Distributions. Ibrahim Abdullahi (Yobe State University, Department of Mathematics and Statistics) and Obalowu Job (University of Ilorin, Department of Statistics).
Abstract: In this article, we proposed a new family of generalized Nak-G distributions and study some of its statistical properties, such as moments, the moment generating function, the quantile function, and probability weighted moments. The Rényi entropy and an expression for the distribution of order statistics are derived, and the parameters of the model are estimated by means of the maximum likelihood technique. We prove, by providing three applications to real-life data, that the Nakagami Exponential (Nak-E) distribution could give a better fit when compared to its competitors.
1. Introduction. Recent developments have focused on generalized classes of continuous distributions obtained by adding at least one shape parameter to a baseline distribution, studying the properties of these distributions, and using them to model data in many applied areas, including engineering, biological studies, environmental sciences and economics. Numerous methods for generating new families of distributions have been proposed by many researchers [8]. These include the beta-generalized family of distributions, the Kumaraswamy generated family of distributions [5], the Beta-Nakagami distribution [19], the Weibull generalized family of distributions [4], additive Weibull generated distributions [12], the Kummer beta generalized family of distributions [17], the exponentiated-G family [6], the Gamma-G (type I) family [21], the Gamma-G (type II) family [18], the McDonald-G family [1], the Log-Gamma-G family [3], a new beta generated Kumaraswamy Marshall-Olkin-G family of distributions with applications [11], the Beta Marshall-Olkin-G family [2], and the Logistic-G family [20].