There Is a VaR Beyond Usual Approximations. Towards a Toolkit to Compute Risk Measures of Aggregated Heavy Tailed Risks


There is a VaR beyond usual approximations. Towards a toolkit to compute risk measures of aggregated heavy tailed risks

Marie Kratz, September 2013
Swiss Financial Market Supervisory Authority (FINMA), Bern - Zürich

Acknowledgement. This work was carried out during my internship (July-December 2012) at the Swiss Financial Market Supervisory Authority (FINMA). I would like to thank FINMA for its hospitality and rich environment, and ISFA (Univ. Claude Bernard, Lyon 1) for having given me the opportunity to learn more about actuarial sciences and finance, in view of becoming an IA actuary. A special thanks to Dr. Hansjörg Furrer (Head of the QRM department, FINMA) for having made this stay possible and for bringing this interesting practical issue to my attention. Thanks to all of my colleagues from the Quantitative Risk Management (QRM) - Financial Risk SST department for making my stay at FINMA a rich, pleasant and fruitful experience. Besides the study presented in this report, it was very interesting to get the chance to participate in the review process, discussion and report of some companies, to learn more about the Swiss Solvency Test (SST) through discussions with colleagues, and to work on the revision of the French translation of the SST. My warm thanks also to Dr. Michel Dacorogna for stimulating and interesting discussions on this study.

Abstract

Basel II and Solvency 2 both use the Value-at-Risk (VaR) as the risk measure to compute the capital requirements. In practice, to calibrate the VaR, a normal approximation is often chosen for the unknown distribution of the yearly log returns of financial assets. This is usually justified by the use of the Central Limit Theorem (CLT), when assuming aggregation of independent and identically distributed (iid) observations in the portfolio model. Such a choice of modeling, in particular using light-tailed distributions, proved during the crisis of 2008/2009 to be an inadequate approximation in the presence of extreme returns; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluations of the aggregated risks distribution and risk measures when working on financial or insurance data in the presence of heavy tails, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. We explore new approaches to handle this problem, numerically as well as theoretically, based on properties of upper order statistics. We compare them with existing methods, for instance with one based on the Generalized Central Limit Theorem.

Résumé

The Basel II and Solvency 2 frameworks are both based on the choice of the Value-at-Risk (VaR) as the risk measure for computing the solvency capital. In practice, a normal approximation is often chosen to estimate the underlying distribution of the yearly log returns of financial assets when evaluating the VaR. This is justified by the use of the CLT when the portfolio model is assumed to consist of an aggregation of iid observations. Such a modeling choice, relying on a so-called light-tailed distribution, proved completely inappropriate during the financial crisis of 2008/09, in which the presence of extreme returns became manifest.
This model error led to a substantial underestimation of the risks. The main objective of this study is to obtain the best possible evaluation of the distribution of aggregated risks and of the associated risk measures when dealing with heavy-tailed financial or insurance data, and to propose practical solutions for estimating the extreme quantiles of aggregated risks as accurately as possible. We explore new approaches to tackle this problem, from a theoretical as well as a numerical point of view, based on the properties of the upper order statistics. We then compare these approaches with existing methods, such as the one based on the generalized CLT.

Keywords: aggregated risk, (refined) Berry-Esséen inequality, (generalized) central limit theorem, conditional (Pareto) distribution, conditional (Pareto) moment, convolution, expected shortfall, extreme values, financial data, high frequency data, market risk, order statistics, Pareto distribution, rate of convergence, risk measures, stable distribution, Value-at-Risk

Synopsis

• Motivation

Every financial institution, bank or insurance company, has to manage a portfolio of risks. These aggregated risks, modeled by random variables, form the basis of any internal model. In practice, it is still very common to use a normal approximation to estimate the (unknown) distribution of the yearly log returns of financial assets. This is justified by the application of the Central Limit Theorem (CLT) under the assumption of independent and identically distributed (iid) observations with finite variance. Such a modeling choice, using a so-called light-tailed distribution, has proven inappropriate for the evaluation of risk measures, because it underestimates the risk.

Two essential problems arise when applying the CLT to this type of data with a relatively heavy-tailed distribution (for instance of Pareto type with a shape parameter larger than 2). The first concerns the quality of the approximation, which is linked to the moments of the parent variable; the second concerns the relevance of a graphical method for detecting a heavy-tailed distribution once the data have been aggregated, which leaves samples of much smaller size.

↪ Let us come back to the first question raised. One should keep in mind that the CLT is a theorem for evaluating the average behaviour of a phenomenon, not the behaviour of the extremes (located in the tail of the distribution). It has been shown, since the studies of the 1980s and in particular that of Hall ([27]), that removing the extremes from the sample can improve the rate of convergence (smaller variance) of the mean when applying the CLT. Moreover, even if one were interested only in the average behaviour, the CLT is an asymptotic theorem depending on the sample size, and the approximation can be poor for small samples. To improve the quality of the approximation, the existence of moments of order higher than 2 turns out to be necessary, as shown by the Edgeworth expansion.
↪ Regarding the second problem, related to aggregation, note that the existence of an underlying heavy-tailed distribution can be detected with empirical/graphical methods such as the QQ-plot clearly on high-frequency data (for instance daily data), but no longer on aggregated data, for instance yearly data (small samples), even though it is well known, since the Fisher theorem, that the tail index of the underlying distribution remains invariant under aggregation. Many authors have insisted on this phenomenon (see for instance [13]).

This second problem is well illustrated by the QQ-plots obtained for the S&P 500 returns shown in the Introduction. Indeed, while the QQ-plot based on daily data from 1987 to 2007 detects a heavy-tailed distribution, this is no longer the case once these data are aggregated monthly. In that case the QQ-plot looks rather normal, and would then suggest treating the financial crises of 1998 and 1987 as outliers. Even when considering the monthly aggregated data over a somewhat longer period, namely from 1987 to 2013, the QQ-plot remains more or less unchanged, i.e. normal, except that this time another 'outlier' appears, dated ... October 2008! Without going back to daily data, but increasing the sample size substantially by using monthly data from 1791 to 2013, it becomes clear again that the underlying distribution is heavy tailed. The financial crisis of 2008 is an event appearing in the tail of the distribution and can then no longer be discarded from the analysis as an outlier. These figures illustrate the importance of the sample size when evaluating risks in the presence of a relatively heavy-tailed distribution. It is therefore essential to propose a method that does not depend on the sample size for detecting the shape of the tail of the distribution.

• Objective

Finding such approach(es), in order to obtain the sharpest possible evaluations of the risk measures when analysing financial data in the presence of heavy tails, is our main objective. We explore various methods, existing and new, to solve this problem, theoretically and numerically.

With financial or actuarial applications in mind, we use power-law models for the marginal distributions of the risks, such as the Pareto distribution with parameter α defined by

$\bar F(x) := 1 - F(x) = x^{-\alpha}, \qquad \alpha > 0, \; x \ge 1 \qquad (1)$

We define the returns by

$X_i^{(n)} := \ln P_i - \ln P_{i-n}, \qquad n \ge 1,$

where $P_i$ is the daily price and $n$ the aggregation factor. Note that we can also write

$X_i^{(n)} = \sum_{j=1}^{n} X_{i-n+j}^{(1)}.$

We simplify the notation $X_i^{(1)}$ to $X_i$ and then consider an n-sample $(X_i,\ i = 1, \dots, n)$ with parent variable $X$ following a Pareto distribution, and with associated order statistics $X_{(1)} \le \cdots \le X_{(n)}$.
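To make the setting concrete, the following Python sketch (an illustration only, not code from the report; the shape parameter, aggregation factor and sample sizes are arbitrary assumptions) simulates Pareto risks, aggregates them, and compares the empirical high quantile of the aggregate with the quantile given by a normal (CLT) approximation with matched mean and variance.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

alpha = 2.5        # Pareto shape parameter (illustrative assumption)
n = 250            # aggregation factor, e.g. daily -> yearly
n_sim = 100_000    # number of simulated aggregated risks
level = 0.99       # VaR confidence level

# Pareto(alpha) samples on [1, inf) by inverse transform: X = (1 - U)^(-1/alpha)
daily = (1.0 - rng.uniform(size=(n_sim, n))) ** (-1.0 / alpha)

# Aggregated risk S_n = X_1 + ... + X_n
aggregated = daily.sum(axis=1)

# Empirical VaR of the aggregate
var_empirical = np.quantile(aggregated, level)

# Normal (CLT) approximation with the exact Pareto mean and variance
mean_x = alpha / (alpha - 1.0)                          # E[X], needs alpha > 1
var_x = alpha / ((alpha - 1.0) ** 2 * (alpha - 2.0))    # Var[X], needs alpha > 2
var_normal = n * mean_x + np.sqrt(n * var_x) * norm.ppf(level)

print(f"empirical VaR at {level:.0%}:     {var_empirical:10.1f}")
print(f"normal-approx VaR at {level:.0%}: {var_normal:10.1f}")
```

With a shape parameter this close to 2, the normal approximation typically falls visibly short of the empirical quantile, which is precisely the underestimation discussed above.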
Recommended publications
  • Incorporating Extreme Events Into Risk Measurement
Lecture notes on risk management, public policy, and the financial system: Incorporating extreme events into risk measurement. Allan M. Malz, Columbia University. © 2021 Allan M. Malz. Last updated: July 25, 2021. Outline: Stress testing and scenario analysis; Expected shortfall; Extreme value theory. What are stress tests? Stress tests analyze performance under extreme loss scenarios (heuristic portfolio analysis). Steps in carrying out a stress test: 1. Determine appropriate scenarios. 2. Calculate shocks to risk factors in each scenario. 3. Value the portfolio in each scenario. Objectives of stress testing: address tail risk; reduce model risk by reducing reliance on models; "know the book": stress tests can reveal vulnerabilities in specific positions or groups of positions. Criteria for appropriate stress scenarios: they should be tailored to the firm's specific key vulnerabilities, avoid assumptions that favor the firm (e.g. competitive advantages in a crisis), and be extreme but not implausible. Approaches to formulating stress scenarios: Historical
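As a rough illustration of the three steps listed above (purely hypothetical risk factors, shock sizes and sensitivities, not taken from the lecture notes):

```python
# Step 1: scenarios expressed as shocks to risk factors (hypothetical values)
scenarios = {
    "equity crash":   {"equity": -0.40, "rates_bp": -100, "fx": 0.00},
    "rate spike":     {"equity": -0.10, "rates_bp": +200, "fx": 0.05},
    "fx dislocation": {"equity": -0.05, "rates_bp": +50,  "fx": 0.20},
}

# Portfolio sensitivities: P&L per unit move of each factor (hypothetical)
sensitivity = {
    "equity":   10_000_000,   # P&L per 100% equity move
    "rates_bp":     -5_000,   # P&L per basis point rise in rates
    "fx":        2_000_000,   # P&L per 100% move in the FX rate
}

# Steps 2 and 3: apply the shocks and value the portfolio in each scenario
for name, shock in scenarios.items():
    pnl = sum(sensitivity[factor] * shock[factor] for factor in sensitivity)
    print(f"{name:15s} scenario P&L: {pnl:>14,.0f}")
```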
  • University of Regina Lecture Notes Michael Kozdron
University of Regina Statistics 441 – Stochastic Calculus with Applications to Finance Lecture Notes Winter 2009 Michael Kozdron [email protected] http://stat.math.uregina.ca/∼kozdron List of Lectures and Handouts Lecture #1: Introduction to Financial Derivatives Lecture #2: Financial Option Valuation Preliminaries Lecture #3: Introduction to MATLAB and Computer Simulation Lecture #4: Normal and Lognormal Random Variables Lecture #5: Discrete-Time Martingales Lecture #6: Continuous-Time Martingales Lecture #7: Brownian Motion as a Model of a Fair Game Lecture #8: Riemann Integration Lecture #9: The Riemann Integral of Brownian Motion Lecture #10: Wiener Integration Lecture #11: Calculating Wiener Integrals Lecture #12: Further Properties of the Wiener Integral Lecture #13: Itô Integration (Part I) Lecture #14: Itô Integration (Part II) Lecture #15: Itô's Formula (Part I) Lecture #16: Itô's Formula (Part II) Lecture #17: Deriving the Black–Scholes Partial Differential Equation Lecture #18: Solving the Black–Scholes Partial Differential Equation Lecture #19: The Greeks Lecture #20: Implied Volatility Lecture #21: The Ornstein-Uhlenbeck Process as a Model of Volatility Lecture #22: The Characteristic Function for a Diffusion Lecture #23: The Characteristic Function for Heston's Model Lecture #24: Review Lecture #25: Review Lecture #26: Review Lecture #27: Risk Neutrality Lecture #28: A Numerical Approach to Option Pricing Using Characteristic Functions Lecture #29: An Introduction to Functional Analysis for Financial Applications Lecture #30: A Linear Space of Random Variables Lecture #31: Value at Risk Lecture #32: Monetary Risk Measures Lecture #33: Risk Measures and their Acceptance Sets Lecture #34: A Representation of Coherent Risk Measures Lecture #35: Further Remarks on Value at Risk Lecture #36: Midterm Review Statistics 441 (Winter 2009) January 5, 2009 Prof.
  • Modeling Vehicle Insurance Loss Data Using a New Member of T-X Family of Distributions
Journal of Statistical Theory and Applications, Vol. 19(2), June 2020, pp. 133–147. DOI: https://doi.org/10.2991/jsta.d.200421.001; ISSN 1538-7887. https://www.atlantis-press.com/journals/jsta. Modeling Vehicle Insurance Loss Data Using a New Member of T-X Family of Distributions. Zubair Ahmad (Department of Statistics, Yazd University, Yazd, Iran), Eisa Mahmoudi (Department of Statistics, Yazd University, Yazd, Iran; corresponding author), Sanku Dey (Department of Statistics, St. Anthonys College, Shillong, India), Saima K. Khosa (Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan). Article History: Received 16 Oct 2019; Accepted 14 Jan 2020. Abstract: In actuarial literature, we come across a diverse range of probability distributions for fitting insurance loss data. Popular distributions are lognormal, log-t, various versions of Pareto, log-logistic, Weibull, gamma and its variants and a generalized beta of the second kind, among others. In this paper, we try to supplement the distribution theory literature by incorporating the heavy tailed model, called weighted T-X Weibull distribution. The proposed distribution exhibits desirable properties relevant to the actuarial science and inference. Shapes of the density function and key distributional properties of the weighted T-X Weibull distribution are presented. Some actuarial measures such as value at risk, tail value at risk, tail variance and tail variance premium are calculated. A simulation study based on the actuarial measures is provided. Finally, the proposed method is illustrated via analyzing vehicle insurance loss data. Keywords: Heavy-tailed distributions; Weibull distribution; Insurance losses; Actuarial measures; Monte Carlo simulation; Estimation. 2000 Mathematics Subject Classification: 60E05, 62F10. © 2020 The Authors. Published by Atlantis Press SARL.
  • A New Heavy Tailed Class of Distributions Which Includes the Pareto
Risks (journal article). A New Heavy Tailed Class of Distributions Which Includes the Pareto. Deepesh Bhati 1, Enrique Calderín-Ojeda 2,* and Mareeswaran Meenakshi 1. 1 Department of Statistics, Central University of Rajasthan, Kishangarh 305817, India; [email protected] (D.B.); [email protected] (M.M.) 2 Department of Economics, Centre for Actuarial Studies, University of Melbourne, Melbourne, VIC 3010, Australia * Correspondence: [email protected] Received: 23 June 2019; Accepted: 10 September 2019; Published: 20 September 2019 Abstract: In this paper, a new heavy-tailed distribution, the mixture Pareto-loggamma distribution, derived through an exponential transformation of the generalized Lindley distribution is introduced. The resulting model is expressed as a convex sum of the classical Pareto and a special case of the loggamma distribution. A comprehensive exploration of its statistical properties and theoretical results related to insurance are provided. Estimation is performed by using the method of log-moments and maximum likelihood. Also, as the modal value of this distribution is expressed in closed-form, composite parametric models are easily obtained by a mode matching procedure. The performance of both the mixture Pareto-loggamma distribution and composite models is tested by employing different claims datasets. Keywords: Pareto distribution; excess-of-loss reinsurance; loggamma distribution; composite models 1. Introduction In general insurance, pricing is one of the most complex processes and is no easy task but a long-drawn exercise involving the crucial step of modeling past claim data. For modeling insurance claims data, finding a suitable distribution to unearth the information from the data is a crucial work to help the practitioners to calculate fair premiums.
  • An Introduction to Risk Measures for Actuarial Applications
An Introduction to Risk Measures for Actuarial Applications Mary R Hardy CIBC Professor of Financial Risk Management University of Waterloo 1 Introduction In actuarial applications we often work with loss distributions for insurance products. For example, in P&C insurance, we may develop a compound Poisson model for the losses under a single policy or a whole portfolio of policies. Similarly, in life insurance, we may develop a loss distribution for a portfolio of policies, often by stochastic simulation. Profit and loss distributions are also important in banking, and most of the risk measures we discuss in this note are also useful in risk management in banking. The convention in banking is to use profit random variables, that is Y where a loss outcome would be Y<0. The convention in insurance is to use loss random variables, X = −Y. In this paper we work exclusively with loss distributions. Thus, all the definitions that we present for insurance losses need to be suitably adapted for profit random variables. Additionally, it is usually appropriate to assume in insurance contexts that the loss X is non-negative, and we have assumed this in Section 2.5 of this note. It is not essential however, and the risk measures that we describe can be applied (perhaps after some adaptation) to random variables with a sample space spanning any part of the real line. Having established a loss distribution, either parametrically, non-parametrically, analytically or by Monte Carlo simulation, we need to utilize the characteristics of the distribution for pricing, reserving and risk management.
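As a small illustration of that last step (a hypothetical compound Poisson portfolio with made-up frequency and severity parameters, not an example from the note), quantile-based risk measures follow directly once a loss distribution has been simulated:

```python
import numpy as np

rng = np.random.default_rng(1)

# Compound Poisson loss model for a portfolio (hypothetical parameters)
n_sim, lam = 100_000, 3.0                  # about 3 claims expected per year
claim_counts = rng.poisson(lam, size=n_sim)
losses = np.array([rng.lognormal(mean=8.0, sigma=1.2, size=k).sum()
                   for k in claim_counts])

p = 0.995
var_p = np.quantile(losses, p)           # Value-at-Risk at level p
es_p = losses[losses > var_p].mean()     # Expected Shortfall: mean loss beyond the VaR

print(f"VaR at {p:.1%}: {var_p:,.0f}")
print(f"ES  at {p:.1%}: {es_p:,.0f}")
```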
  • On Approximations of Value at Risk and Expected Shortfall Involving Kurtosis
On approximations of Value at Risk and Expected Shortfall involving kurtosis. Mátyás Barczy*, Ádám Dudás**, József Gáll***. * MTA-SZTE Analysis and Stochastics Research Group, Bolyai Institute, University of Szeged, Aradi vértanúk tere 1, H-6720 Szeged, Hungary. ** Former master student of Faculty of Science and Technology, University of Debrecen, Hungary. *** Faculty of Informatics, University of Debrecen, Pf. 12, H-4010 Debrecen, Hungary. e-mail: [email protected] (M. Barczy), [email protected] (Á. Dudás), [email protected] (J. Gáll). Corresponding author. arXiv:1811.06361v3 [q-fin.RM] 23 Dec 2020. Abstract: We derive new approximations for the Value at Risk and the Expected Shortfall at high levels of loss distributions with positive skewness and excess kurtosis, and we describe their precisions for notable ones such as for exponential, Pareto type I, lognormal and compound (Poisson) distributions. Our approximations are motivated by the kind of extensions of the so-called Normal Power Approximation, used for approximating the cumulative distribution function of a random variable, which incorporate not only the skewness but the kurtosis of the random variable in question as well. We show the performance of our approximations in numerical examples and we also give comparisons with some known ones in the literature. 1 Introduction Value at Risk (VaR) and Expected Shortfall (ES) are standard risk measures in financial and insurance mathematics. VaR permits to measure the maximum aggregate loss of a portfolio with a given confidence level, while ES can be defined as the conditional expectation of the loss for losses beyond the corresponding VaR-level (see Definitions 3.1 and 3.2).
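The general flavour of such moment-based corrections can be seen in the classical Cornish-Fisher adjustment of a normal quantile by skewness and excess kurtosis, sketched below; this is a generic textbook formula, not the specific approximations derived in that paper.

```python
from scipy.stats import norm

def cornish_fisher_quantile(mean, std, skew, excess_kurt, level=0.99):
    """Approximate quantile of a loss distribution via the Cornish-Fisher expansion."""
    z = norm.ppf(level)
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * excess_kurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return mean + std * z_cf

# Exponential(1) loss: mean 1, std 1, skewness 2, excess kurtosis 6
print(cornish_fisher_quantile(1.0, 1.0, 2.0, 6.0, level=0.99))   # roughly 4.7
# exact 99% quantile of an Exponential(1) loss for comparison: -ln(0.01), about 4.61
```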
  • Risk Measures in Quantitative Finance
Risk Measures in Quantitative Finance, by Sovan Mitra. arXiv:0904.0870v1 [q-fin.RM] 6 Apr 2009. Abstract: This paper was presented and written for two seminars: a national UK University Risk Conference and a Risk Management industry workshop. The target audience is therefore a cross section of academics and industry professionals. The current ongoing global credit crunch[1] has highlighted the importance of risk measurement in finance to companies and regulators alike. Despite risk measurement's central importance to risk management, few papers exist reviewing them or following their evolution from its foremost beginnings up to the present day risk measures. This paper reviews the most important portfolio risk measures in Financial Mathematics, from Bernoulli (1738) to Markowitz's Portfolio Theory, to the presently preferred risk measures such as CVaR (conditional Value at Risk). We provide a chronological review of the risk measures and survey less commonly known risk measures, e.g. the Treynor ratio. Key words: Risk measures, coherent, risk management, portfolios, investment. 1. Introduction and Outline. Investors are constantly faced with a trade-off between adjusting potential returns for higher risk. However the events of the current ongoing global credit crisis and past financial crises (see for instance [Sto99a] and [Mit06]) have demonstrated the necessity for adequate risk measurement. Poor risk measurement can result in bankruptcies and threaten collapses of an entire finance sector [KH05]. Risk measurement is vital to trading in the multi-trillion dollar derivatives industry [Sto99b] and insufficient risk analysis can misprice derivatives [GF99]. Additionally [1] "Risk comes from not knowing what you're doing", Warren Buffett.
  • Multinomial VaR Backtests: a Simple Implicit Approach to Backtesting Expected Shortfall
Multinomial VaR Backtests: A simple implicit approach to backtesting expected shortfall. Marie Kratz*, Yen H. Lok†, Alexander J. McNeil‡. Abstract: Under the Fundamental Review of the Trading Book (FRTB), capital charges for the trading book are based on the coherent expected shortfall (ES) risk measure, which shows greater sensitivity to tail risk. In this paper it is argued that backtesting of expected shortfall - or the trading book model from which it is calculated - can be based on a simultaneous multinomial test of value-at-risk (VaR) exceptions at different levels, an idea supported by an approximation of ES in terms of multiple quantiles of a distribution proposed in Emmer et al. (2015). By comparing Pearson, Nass and likelihood-ratio tests (LRTs) for different numbers of VaR levels N, it is shown in a series of simulation experiments that multinomial tests with N ≥ 4 are much more powerful at detecting misspecifications of trading book loss models than standard binomial exception tests corresponding to the case N = 1. Each test has its merits: Pearson offers simplicity; Nass is robust in its size properties to the choice of N; the LRT is very powerful though slightly over-sized in small samples and more computationally burdensome. A traffic-light system for trading book models based on the multinomial test is proposed and the recommended procedure is applied to a real-data example spanning the 2008 financial crisis. 2010 AMS classification: 60G70; 62C05; 62P05; 91B30; 91G70; 91G99. Keywords: backtesting; banking regulation; coherence; elicitability; expected shortfall; heavy tail; likelihood ratio test; multinomial distribution; Nass test; Pearson test; risk management; risk measure; statistical test; tail of distribution; value-at-risk. 1 Introduction Techniques for the measurement of risk are central to the process of managing risk in financial institutions and beyond.
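A stripped-down sketch of the Pearson variant of that idea (the VaR levels, data and normal benchmark model are all illustrative assumptions; the Nass and likelihood-ratio variants of the paper are not shown):

```python
import numpy as np
from scipy.stats import chi2, norm

def pearson_multinomial_backtest(losses, var_forecasts, levels):
    """Pearson chi-square test of VaR exceptions counted at several levels.

    losses: realised losses; var_forecasts: the model's VaR at each level;
    levels: increasing confidence levels. Returns (statistic, p-value)."""
    losses = np.asarray(losses)
    levels = np.asarray(levels)
    var_forecasts = np.asarray(var_forecasts)
    # cell of each loss = number of VaR levels it exceeds (0, ..., N)
    cells = (losses[:, None] > var_forecasts[None, :]).sum(axis=1)
    observed = np.bincount(cells, minlength=len(levels) + 1)
    expected = len(losses) * np.diff(np.concatenate(([0.0], levels, [1.0])))
    stat = ((observed - expected) ** 2 / expected).sum()
    return stat, chi2.sf(stat, df=len(levels))

# Heavy-tailed (Student-t) losses backtested against a misspecified normal model
rng = np.random.default_rng(2)
losses = rng.standard_t(df=3, size=5000)
levels = np.array([0.975, 0.9875, 0.99, 0.9975])
normal_var = norm.ppf(levels)        # VaR forecasts from the (wrong) normal model
print(pearson_multinomial_backtest(losses, normal_var, levels))
```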
  • Computing the Value at Risk of a Portfolio: Academic Literature and Practitioners' Response
Computing the Value at Risk of a Portfolio: Academic literature and Practitioners' response. Contents: Part I, Historical approaches: 1. Historical simulations and recent improvements (introduction to historical simulations; the BRW and Hull and White methods; from historical simulations to filtered historical simulations); 2. Extreme Value Theory (introduction to EVT; quantile estimation with EVT); 3. Quantile regressions. Part II, Parametric methods for Value at Risk: 4. The original RiskMetrics parametric approach (distributional assumptions underlying the RiskMetrics approach; a first approximation of the portfolio: the Delta-Normal approach; introduction to the Delta-Gamma-(Theta)-Normal approach); 5. The Delta-Gamma-Normal approach (Gaussian approximation; Johnson transformation; Cornish-Fisher approximation; Solomon-Stephens approximation; saddle point approximation; Fourier transform; principal component Value at Risk; partial Monte Carlo simulations); 6. Going beyond normality (the Delta-(Mixture) Elliptical approach; more on the saddle-point approximation and Fourier transform approaches; principal component Value at Risk: a general framework); 7. Concluding remarks on the linear and quadratic approximations. Part III, Monte-Carlo simulations and stochastic algorithms: 8. Monte-Carlo simulations; 9. Variance reduction techniques (importance sampling; stratification); 10. Stochastic algorithms (the Bardou-Frikha-Pagès approach ...)
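For orientation, the two simplest entries in that catalogue, historical simulation and the Delta-Normal approach, can be sketched for a single-asset portfolio as follows (the position size and return data are made up for the example):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
returns = rng.normal(0.0, 0.01, size=1000)   # stand-in for observed daily returns
position = 1_000_000                         # portfolio value (assumed)
level = 0.99

# Historical simulation: empirical quantile of the historical P&L
pnl = position * returns
var_hist = -np.quantile(pnl, 1 - level)

# Delta-Normal (RiskMetrics-style): return volatility scaled by a normal quantile
var_delta_normal = position * returns.std(ddof=1) * norm.ppf(level)

print(f"historical-simulation VaR: {var_hist:,.0f}")
print(f"delta-normal VaR:          {var_delta_normal:,.0f}")
```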
  • On the Subadditivity of Tail-Value at Risk: an Investigation with Copulas
On the Subadditivity of Tail-Value at Risk: An Investigation with Copulas. S. Desmedt*‡, J.F. Walhin†§. * Secura, Avenue des Nerviens 9-31, B-1040 Brussels, Belgium. ‡ Corresponding author, Tel: +32 2 5048228, e-mail: [email protected]. † Fortis, Montagne du Parc 3 (1MA3F), B-1000 Brussels, Belgium. § Institut de Sciences Actuarielles, Université Catholique de Louvain, rue des Wallons 6, B-1348 Louvain-la-Neuve, Belgium. Abstract: In this paper, we compare the point of view of the regulator and the investors about the required solvency level of an insurance company. We assume that the required solvency level is determined using the Tail-Value at Risk and analyze the diversification benefit, both on the required capital and on the residual risk, when merging risks. To describe the dependence structure, we use a range of various copulas. This allows us to judge whether or not the Tail-Value at Risk is too subadditive under a wide range of conditions. Furthermore, we discuss the effect of different copulas on the diversification possibilities. Keywords: Required solvency level, Tail-Value at Risk, Diversification benefit, Stochastic dependence, Copulas, Tail dependence. 1 Introduction Assume that the loss incurred by an insurance company is given by the realization of a random variable X, defined on a probability space (Ω, F, P). To protect the insured, regulators demand that the insurance company holds an amount of money large enough to be able to compensate the policyholders with a high probability. Obviously, a fraction of that amount is provided by the premiums paid by the policyholders.
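The kind of numerical comparison described can be sketched as follows (a Gaussian copula with Pareto marginals, chosen arbitrarily here; the paper works through a much wider range of copulas and parameters):

```python
import numpy as np
from scipy.stats import norm

def tvar(sample, level):
    """Tail-Value at Risk: mean loss beyond the level-quantile."""
    q = np.quantile(sample, level)
    return sample[sample >= q].mean()

rng = np.random.default_rng(4)
n, rho, alpha, level = 200_000, 0.5, 2.5, 0.99

# Gaussian copula: correlated standard normals mapped to uniforms
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = norm.cdf(z)

# Pareto(alpha) marginals via the inverse transform (1 - u)^(-1/alpha)
x = (1.0 - u[:, 0]) ** (-1.0 / alpha)
y = (1.0 - u[:, 1]) ** (-1.0 / alpha)

print("TVaR(X) + TVaR(Y):", round(tvar(x, level) + tvar(y, level), 2))
print("TVaR(X + Y)      :", round(tvar(x + y, level), 2))  # smaller => diversification benefit
```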
  • A Survey of Capital Allocation Methods with Commentary Topic 3
A Survey of Capital Allocation Methods with Commentary. Topic 3: Risk Control. Gary G Venter, Guy Carpenter Instrat, One Madison Avenue, New York NY USA, +1-212-323-1739, Fax: +1-212-390-4739, [email protected]. Abstract: A number of methods of allocating capital to business unit, e.g., line of business, profit center, etc., are discussed. Goals of capital allocation include testing the profitability of business units and determining which units could best be grown to add value to the firm. Methods of approaching these questions without allocating capital are included in the discussion. Keywords: capital allocation, risk measures. Capital allocation is generally not an end in itself, but rather an intermediate step in a decision-making process. Trying to determine which business units are most profitable relative to the risk they bear is a typical example. Pricing for risk is another. Return-on-capital thinking would look to allocate capital to each business unit, then divide the units' profits by that capital. Of course if profit were negative, you would not need to divide by anything to know it is not sufficient. But this approach would hope to be able to distinguish the profitable-but-not-enough-so units from the real value-adders. The same issue can be approached without allocating capital, using a theory of market risk pricing. The actual pricing achieved by each business unit can be compared to the risk price needed. This would require having a good theory of risk pricing, where the previous approach would depend on having a good theory of capital allocation.
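One of the simplest members of the family being surveyed is covariance-based (Euler-type) allocation; the sketch below uses made-up business-unit loss simulations and a VaR-based definition of total capital, neither of which is prescribed by the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated annual losses of three business units (hypothetical, correlated)
mean = [10.0, 6.0, 4.0]
cov = [[4.0, 1.0, 0.5],
       [1.0, 2.0, 0.3],
       [0.5, 0.3, 1.0]]
losses = rng.multivariate_normal(mean, cov, size=100_000)
total = losses.sum(axis=1)

# Total capital: 99.5% VaR of the aggregate loss in excess of its mean (one common definition)
capital = np.quantile(total, 0.995) - total.mean()

# Covariance (Euler-type) allocation: unit i receives Cov(X_i, S) / Var(S) of the capital
covariances = np.array([np.cov(losses[:, i], total)[0, 1] for i in range(3)])
allocation = capital * covariances / total.var(ddof=1)

print("total capital     :", round(capital, 2))
print("allocation by unit:", np.round(allocation, 2))
print("sum of allocations:", round(allocation.sum(), 2))   # adds back to the total capital
```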
  • Modeling Fundamentals: Evaluating Risk Measures
January 2014, AIR Currents: Modeling Fundamentals: Evaluating Risk Measures. By David Lalonde and Alissa Legenza, edited by Sara Gambrill. Editor's Note: Senior Vice President David Lalonde and Risk Consultant Alissa Legenza describe various risk measures that can be used when assessing capital and solvency requirements and pricing risk transfer opportunities. The Article: Promotes awareness of the advantages and limitations associated with common risk measures. Highlights: Examines less common techniques for measuring financial risks. Presents case studies comparing peer companies to illustrate each risk measure's advantages and limitations. Use of appropriate risk measures is crucial when assessing capital and solvency requirements and when pricing risk transfer opportunities, among many other business needs. When risk measures are not well understood, opportunities to optimize the allocation of risks are too often missed. It is therefore of fundamental importance that business decisions are based on the most robust measures of risk available. This article aims to promote awareness of the advantages and limitations associated with common risk measures. In addition, some lesser-known techniques for measuring financial risks are also examined. Value at Risk: The financial industry began researching new methods for measuring exposure to downside risks in the late 1980s, following the infamous Black Monday stock market crash. It was then that value at risk (VaR) became popular as a risk measure. VaR is a single value from a loss distribution, often with an associated probability of exceedance. The insurance industry later adopted the widespread use of VaR as a measure of catastrophe risk. During the early years of its use, VaR