Lecture 3: Probability Metrics
Prof. Dr. Svetlozar Rachev
Institute for Statistics and Mathematical Economics, University of Karlsruhe
Portfolio and Asset Liability Management, Summer Semester 2008

Copyright

These lecture notes cannot be copied and/or distributed without permission. The material is based on the textbook:

Svetlozar T. Rachev, Stoyan Stoyanov, and Frank J. Fabozzi, Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization: The Ideal Risk, Uncertainty, and Performance Measures, John Wiley, Finance, 2007.

Prof. Svetlozar (Zari) T. Rachev
Chair of Econometrics, Statistics and Mathematical Finance
School of Economics and Business Engineering, University of Karlsruhe
Kollegium am Schloss, Bau II, 20.12, R210
Postfach 6980, D-76128, Karlsruhe, Germany
Tel. +49-721-608-7535, +49-721-608-2042(s), Fax: +49-721-608-3811
http://www.statistik.uni-karslruhe.de

Introduction

The theory of probability metrics came from investigations related to limit theorems in probability theory. A well-known example is the Central Limit Theorem (CLT), but there are many other limit theorems, such as the Generalized CLT, the max-stable CLT, functional limit theorems, etc.

The limit law can be regarded as an approximation to the stochastic model under consideration and can therefore be accepted as an approximate substitute. How large an error do we make by adopting the approximate model? This question can be investigated by studying the distance between the limit law and the stochastic model; whether the model is, for example, a sum or a maximum of i.i.d. random variables makes no difference as far as this universal principle is concerned.

The theory of probability metrics studies the problem of measuring distances between random quantities. First, it provides the fundamental principles for building probability metrics, the means of measuring such distances. Second, it studies the relationships between various classes of probability metrics. It also concerns problems which require a particular metric, while the basic results can be obtained in terms of other metrics.

The theory of probability metrics places no limitations on the nature of the random quantities, which makes its methods fundamental and appealing. It is more appropriate to refer to the random quantities as random elements: random variables, random vectors, random functions, or random elements of general spaces. For instance, in the context of financial applications, we can study the distance between two random stock prices, or between vectors of financial variables building portfolios, or between entire yield curves, which are much more complicated objects.

⇒ The methods of the theory remain the same, no matter the nature of the random elements.
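To make the approximation-error question concrete before turning to the discrete case, here is a short Python sketch. It is an illustration added to these notes, not from the textbook; the function names and sample sizes are my own choices. It simulates standardized sums of n i.i.d. Uniform(0,1) variables and estimates the uniform distance sup_x |F_n(x) − Φ(x)| to the N(0,1) limit law, so the printed numbers quantify how large an error we make by adopting the limit law as a substitute for the true model.

    # Hypothetical illustration (not from the lecture): estimate the uniform
    # distance sup_x |F_n(x) - Phi(x)| between the distribution of a
    # standardized sum of n i.i.d. Uniform(0,1) variables and its N(0,1)
    # limit law, using the empirical cdf of simulated sums.
    import math
    import random

    def normal_cdf(x):
        return 0.5 * (1 + math.erf(x / math.sqrt(2)))

    def distance_to_normal_limit(n, n_samples=50_000, seed=0):
        rng = random.Random(seed)
        # Standardize: a sum of n Uniform(0,1) has mean n/2, variance n/12.
        samples = sorted(
            (sum(rng.random() for _ in range(n)) - n / 2) / math.sqrt(n / 12)
            for _ in range(n_samples)
        )
        # The empirical cdf jumps at each sample point; check both sides.
        return max(
            max(abs((i + 1) / n_samples - normal_cdf(x)),
                abs(i / n_samples - normal_cdf(x)))
            for i, x in enumerate(samples)
        )

    for n in (1, 2, 5, 30):
        print(n, round(distance_to_normal_limit(n), 4))

The estimated distance shrinks as n grows: the limit law becomes an increasingly accurate substitute for the stochastic model.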
Measuring distances: the discrete case

So how can we measure the distance between two random quantities? The important topics to be discussed are:

- examples of metrics defined on sets of characteristics of discrete distributions;
- examples of metrics based on the cumulative distribution function of discrete random variables;
- examples of metrics defined on the joint probability of discrete random variables;
- minimal and maximal distances.

Sets of characteristics

Let us consider a pair of unfair dice and label the elements of the pair “die X” and “die Y”.

    Face i             1     2     3     4     5     6
    “Die X”  p_i      3/12  2/12  1/12  2/12  2/12  2/12
    “Die Y”  q_i      2/12  2/12  2/12  1/12  2/12  3/12

Table: The probabilities of the faces of “die X” and “die Y”

In the case of “die X”, the probability of face 1 is higher than 1/6, which is the probability of a face of a fair die, and the probability of face 3 is less than 1/6. The probabilities of “die Y” show similar deviations from those of a fair die.

We can view the pair of dice as an example of two discrete random variables: X for “die X” and Y for “die Y”. The two discrete random variables have different distributions and, consequently, different characteristics, such as the mean and higher moments. Therefore, we can compare the two random variables in terms of the differences in some of their characteristics. For example, let us choose the mathematical expectation:

EX = \sum_{i=1}^{6} i p_i = 40/12   and   EY = \sum_{i=1}^{6} i q_i = 44/12.

The distance between the two random variables, μ(X, Y), may be computed as the absolute difference between the corresponding mathematical expectations,

μ(X, Y) = |EX − EY| = 4/12.

The second moments can be calculated as well:

EX^2 = \sum_{i=1}^{6} i^2 p_i = 174/12   and   EY^2 = \sum_{i=1}^{6} i^2 q_i = 202/12.

If we add the second moment to the mathematical expectation, for the distance we obtain

μ(X, Y) = |EX − EY| + |EX^2 − EY^2| = 32/12.

If we considered a pair of fair dice, these characteristics would coincide and we would obtain that the distance between the two random variables is zero. However, it is possible to obtain zero deviation between given characteristics in the case of unfair dice. Let us illustrate this with the variance of X and Y. The variance of a random variable Z, denoted DZ, is defined as

DZ = E(Z − EZ)^2 = EZ^2 − (EZ)^2.

The variance of X equals

DX = EX^2 − (EX)^2 = 174/12 − (40/12)^2 = 61/18

and the variance of Y equals

DY = EY^2 − (EY)^2 = 202/12 − (44/12)^2 = 61/18.

We obtain that DX = DY.

⇒ Thus, any attempt to measure the distance between the two random variables in terms of differences in variance will indicate zero distance, even though “die X” is quite different from “die Y”.
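These computations are easy to verify with exact arithmetic. The snippet below is an added illustration, not part of the original slides (the helper `moment` is my own); it reproduces the moments of the two dice and shows that their variances coincide:

    # Verify the dice computations above with exact fractions.
    from fractions import Fraction as Fr

    p = [Fr(3, 12), Fr(2, 12), Fr(1, 12), Fr(2, 12), Fr(2, 12), Fr(2, 12)]  # die X
    q = [Fr(2, 12), Fr(2, 12), Fr(2, 12), Fr(1, 12), Fr(2, 12), Fr(3, 12)]  # die Y

    def moment(probs, k):
        # E Z^k = sum over faces i = 1..6 of i^k * P(Z = i)
        return sum(Fr(i ** k) * pi for i, pi in enumerate(probs, start=1))

    EX, EY = moment(p, 1), moment(q, 1)      # 40/12 and 44/12
    EX2, EY2 = moment(p, 2), moment(q, 2)    # 174/12 and 202/12

    print(abs(EX - EY))                      # metric on the mean: 1/3 (= 4/12)
    print(abs(EX - EY) + abs(EX2 - EY2))     # adding the 2nd moment: 8/3 (= 32/12)
    print(EX2 - EX ** 2, EY2 - EY ** 2)      # variances: 61/18 and 61/18

Running it prints 1/3, then 8/3, then 61/18 twice, confirming that a variance-based distance cannot separate the two dice.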
Distribution functions

By including additional characteristics when measuring the distance between two random variables, we incorporate in μ(X, Y) more information from their distribution functions. How many characteristics do we have to include, when X and Y have discrete distributions, so that we can be sure that the entire distribution function of X, F_X(x) = P(X ≤ x), agrees with the entire distribution function of Y, F_Y(x) = P(Y ≤ x)? Let us consider

μ(X, Y) = \sum_{k=1}^{n} |EX^k − EY^k|,    (1)

assuming that X and Y are the two dice considered above, but this time we do not know the probabilities p_i and q_i, i = 1, …, 6.

How large should n be so that μ(X, Y) = 0 guarantees that the distributions of X and Y agree completely? μ(X, Y) = 0 is equivalent to the system

EX = EY        ⟺   \sum_{i=1}^{6} i (p_i − q_i) = 0,
EX^2 = EY^2    ⟺   \sum_{i=1}^{6} i^2 (p_i − q_i) = 0,
...
EX^n = EY^n    ⟺   \sum_{i=1}^{6} i^n (p_i − q_i) = 0.

We need exactly n = 5 such equations in order to guarantee that P(X = i) = p_i = P(Y = i) = q_i for i = 1, …, 6. The reason is that there are 6 differences p_i − q_i in the equations, so we need 6 equations in total: the 5 moment equations above plus the additional equation \sum_{i=1}^{6} (p_i − q_i) = 0, which holds because all probabilities sum up to one.

⇒ If X and Y are positive integer-valued with k outcomes, then we need k − 1 moment equations in order to solve the linear system.

⇒ If a given number of characteristics of two discrete random variables with finitely many outcomes agree, then their distribution functions agree completely.

Then, instead of trying to figure out how many characteristics to include in a metric of a given type, is it possible to consider ways of measuring the distance between X and Y directly through their distribution functions? If the distribution functions of two random variables coincide, then the corresponding probabilities of any event are equal, and we can conclude that the two random variables have the same probabilistic properties. In the pair-of-dice example, all events are described by the set of all possible unions of the outcomes.

The distribution functions F_X(x) and F_Y(x) of “die X” and “die Y” are easy to calculate. With F_X(x) = \sum_{i=1}^{[x]} p_i for [x] ≥ 1 and F_X(x) = 0 for [x] < 1,

    F_X(x) =  0,      x < 1
              3/12,   1 ≤ x < 2
              5/12,   2 ≤ x < 3
              6/12,   3 ≤ x < 4        (2)
              8/12,   4 ≤ x < 5
              10/12,  5 ≤ x < 6
              1,      x ≥ 6

and, likewise, with F_Y(x) = \sum_{i=1}^{[x]} q_i for [x] ≥ 1 and F_Y(x) = 0 for [x] < 1,

    F_Y(x) =  0,      x < 1
              2/12,   1 ≤ x < 2
              4/12,   2 ≤ x < 3
              6/12,   3 ≤ x < 4        (3)
              7/12,   4 ≤ x < 5
              9/12,   5 ≤ x < 6
              1,      x ≥ 6

where [x] denotes the largest integer less than or equal to x.
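As a companion to equations (2) and (3), the sketch below builds F_X and F_Y from the probability tables and compares them pointwise. It is an added illustration; the helper `cdf` and the use of exact fractions are my own choices, not from the slides:

    # Build the distribution functions of "die X" and "die Y" and compare them.
    from fractions import Fraction as Fr
    from math import floor

    p = [Fr(3, 12), Fr(2, 12), Fr(1, 12), Fr(2, 12), Fr(2, 12), Fr(2, 12)]  # die X
    q = [Fr(2, 12), Fr(2, 12), Fr(2, 12), Fr(1, 12), Fr(2, 12), Fr(3, 12)]  # die Y

    def cdf(probs, x):
        # F(x) = sum of probs[0:[x]], where [x] is the floor of x,
        # clipped to the support {1, ..., 6}.
        k = min(floor(x), len(probs))
        return sum(probs[:k], Fr(0)) if k >= 1 else Fr(0)

    # F_X and F_Y agree for 3 <= x < 4 and for x >= 6 (and trivially for
    # x < 1), but differ on the rest of the interval [1, 6):
    for x in range(1, 7):
        print(x, cdf(p, x), cdf(q, x))

    # Both cdfs are step functions with jumps only at the faces 1..6, so the
    # largest pointwise difference is attained at integer arguments:
    print(max(abs(cdf(p, x) - cdf(q, x)) for x in range(1, 7)))

Here the largest pointwise difference max_x |F_X(x) − F_Y(x)| equals 1/12; measuring the maximum deviation between the distribution functions is one natural way of comparing X and Y directly, in the spirit of the question raised above.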