Stochastic Methods in Risk Management

Total Pages: 16

File Type: PDF, Size: 1,020 KB

Stochastic Methods in Risk Management

Dissertation submitted in fulfilment of the requirements for the academic degree of Doktor der Wirtschaftswissenschaften (Dr. rer. pol.) to the Fakultät für Wirtschaftswissenschaften of the Universität Duisburg-Essen, Campus Essen, by Jinsong Zheng, born in Zhejiang, China. Essen, 2018.

This dissertation is made available via DuEPublico, the document and publication server of the Universität Duisburg-Essen, and is also available in print. DOI: 10.17185/duepublico/70203. URN: urn:nbn:de:hbz:464-20190624-093546-8. All rights reserved.

Date of the oral examination: 12 June 2019. First referee: Prof. Dr. Rüdiger Kiesel. Second referee: Prof. Dr. Antje Mahayni.

Zusammenfassung (German abstract)

Stochastic methods are widely used in the finance industry, for example in stochastic modeling and simulation, risk-neutral valuation, derivative pricing, and many other applications. Under the Solvency II framework, insurance companies must be adequately capitalized in order to meet the capital requirements arising from Solvency II, which serve to protect shareholders and policyholders. Two key quantities therefore have to be considered: the available risk capital (or basic own funds) and the required risk capital. In general, these quantities are computed from the means of stochastic simulations, and an Economic Scenario Generator (ESG) is therefore used to simulate the potential evolution of economic and financial-market risk factors over time. For the calculation of the available risk capital (defined as the difference between the market value of assets and the value of liabilities), a stochastic cash-flow projection model is used to perform a market-consistent valuation of assets and liabilities based on risk-neutral scenarios. The required risk capital is determined from the probability distribution of the available risk capital over a one-year horizon by means of a risk measure; for example, the Solvency Capital Requirement (SCR) is measured by the Value-at-Risk at a confidence level of 99.5%.

We first gave an overview of the existing literature and found that the large majority of authors who study stochastic methods under Solvency II concentrate on one of the three components of a partial internal model, namely the input model, the valuation model, or the risk capital model. In this thesis we set out to build a partial internal model comprising all components and to show, step by step, how stochastic methods can be used to perform a market-consistent valuation and to compute the required risk capital.

For the input model, instead of an academically preferred simple ESG, consisting of a one-factor interest rate model and an equity model driven by a geometric Brownian motion, we developed a more complex model that is better suited to practice. For the modeling of the interest rate we used the extended three-factor Cox-Ingersoll-Ross model, which is able to capture the three main principal components of the yield curve.
We derived the price of zero-coupon bond options via the Fourier transform of the characteristic function of a linear combination of the state variables and, building on this, the price of swaptions via a stochastic duration approximation. For the modeling of equity we used a stochastic volatility model (the Heston model) combined with the stochastic interest rate model described above. In a similar way, we obtained the closed form of the discounted characteristic function of the log equity price by solving a system of ordinary differential equations arising from an affine partial differential equation, and then derived the price of European options using Fourier techniques as well. In addition, we formulated the generation of economic scenarios by Monte Carlo simulation, using an Euler discretization scheme together with variance reduction techniques.

For the valuation model, we developed a stochastic cash-flow projection model that captures the evolution of the balance sheet, of the asset portfolio consisting of coupon bonds and equities, and of the liability portfolio consisting of participating life insurance contracts. We then performed a market-consistent valuation of the assets and liabilities based on the cash flows projected by the stochastic model and on risk-neutral economic scenarios as input. In addition, we modeled the management rules; for instance, we implemented a constant asset allocation strategy to rebalance the asset portfolio. We accounted for unrealized gains and losses by modeling both the book value and the market value of the assets. Furthermore, we modeled the MUST case for the surplus participation between shareholders and policyholders.

For the risk capital model, we first implemented nested stochastic simulation to determine the required risk capital. Since nested simulation is computationally very expensive, we also investigated the proxy methods Least Squares Monte Carlo (LSMC), replicating portfolio, and curve fitting. In particular, we developed a general strategy for constructing a good replicating portfolio: we first described the construction of an asset pool, then illustrated the construction of sensitivity sets via recalibration or reweighting techniques, and finally proposed a calibration procedure that combines least-squares optimization with subset selection according to certain criteria in order to select the optimal replicating portfolio and compute the required risk capital.

Finally, in an empirical application we illustrated the entire process: the calibration of the ESG models to real market data, the generation and validation of the economic scenarios, the market-consistent valuation, and the determination of the SCR by nested simulation and by the replicating portfolio.
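The least-squares step of the replicating-portfolio calibration described above can be sketched as follows. This is only a minimal illustration, not the thesis's actual procedure: the asset-value matrix, the liability values and all numbers are hypothetical, and the asset-pool construction and subset-selection steps are omitted.

```python
import numpy as np

def calibrate_replicating_portfolio(asset_values, liability_values):
    """Least-squares fit of candidate-asset values to liability values.

    asset_values:     (n_scenarios, n_assets) matrix of scenario-wise present
                      values of the candidate assets.
    liability_values: (n_scenarios,) vector of scenario-wise present values
                      of the liability cash flows.
    Returns the weights w minimizing ||asset_values @ w - liability_values||_2.
    """
    w, residuals, rank, _ = np.linalg.lstsq(asset_values, liability_values, rcond=None)
    return w

# Toy example with purely synthetic numbers
rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 5))                   # candidate asset values per scenario
true_w = np.array([0.5, 1.2, 0.0, -0.3, 0.8])
L = A @ true_w + 0.01 * rng.normal(size=1000)    # liabilities close to a portfolio of assets
print(np.round(calibrate_replicating_portfolio(A, L), 2))
```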
Abstract

Stochastic methods, such as stochastic modeling and simulation, risk-neutral valuation, and derivative pricing, are widely used in the finance industry. Under the Solvency II framework, an insurance company should be adequately capitalized to fulfill the capital requirement for solvency and thereby protect the interests of shareholders and policyholders. Two main quantities are therefore taken into account: the available capital (or basic own funds) and the required capital. In general, these two quantities are calculated by means of stochastic simulation, and hence an Economic Scenario Generator (ESG) is used to simulate the potential evolution of the risk factors of the economies and financial markets over time. For the calculation of the available capital (defined as the difference between the market value of assets and liabilities), a stochastic cash-flow projection model is used to perform the market-consistent valuation of assets and liabilities given the risk-neutral scenarios. For the calculation of the required capital, the probability distribution of the available capital over a one-year time horizon and a risk measure based on this distribution are taken into account. For instance, the Solvency Capital Requirement (SCR) is measured by the Value-at-Risk at a confidence level of 99.5%.

We began by reviewing the existing literature and found that most authors applying stochastic methods in risk management under the Solvency II framework focus on one of the three components of a partial internal model, i.e. the input model, the valuation model, or the risk capital model. In this thesis, we aimed to build a partial internal model including all components and to show how stochastic methods can be used to perform a market-consistent valuation and to calculate the required capital.

For the input model, instead of using academically preferred simple ESG models, e.g. a one-factor short-rate model together with a geometric Brownian motion equity model, we developed more advanced models that are more suitable in practice. For the modeling of the interest rate, we used the extended three-factor Cox-Ingersoll-Ross model, which is able to capture the three main principal components of the yield curve. We derived the pricing of zero-coupon options by Fourier transformation of the characteristic function of a linear combination of the state variables and subsequently the pricing of swaptions using a stochastic duration approximation. For the modeling of equity, we used a stochastic volatility model (the Heston model) together with the above-mentioned stochastic interest rate. Similarly, we first showed the closed form of the discounted characteristic function of the log equity price by solving a system of Ordinary Differential Equations (ODEs) resulting from an affine Partial Differential Equation (PDE).
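As a simple illustration of the SCR definition used above, the following sketch computes the 99.5% Value-at-Risk of the one-year loss in own funds from an already simulated distribution. All inputs are synthetic, the discounting convention is an assumption, and the nested valuation that would produce the one-year own-funds values is not shown.

```python
import numpy as np

def scr_from_own_funds(own_funds_now, own_funds_one_year, discount_factor=1.0, alpha=0.995):
    """SCR as the alpha-quantile (Value-at-Risk) of the one-year loss in own funds.

    own_funds_one_year: simulated available capital at t = 1, one value per
                        real-world (outer) scenario, e.g. from a nested simulation.
    """
    losses = own_funds_now - discount_factor * np.asarray(own_funds_one_year)
    return np.quantile(losses, alpha)

# Synthetic example: 10,000 outer scenarios of own funds in one year
rng = np.random.default_rng(1)
of_one_year = 100.0 + rng.normal(0.0, 15.0, size=10_000)
print(round(scr_from_own_funds(100.0, of_one_year), 2))   # 99.5% quantile of the loss
```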
Recommended publications
  • Minimizing Spectral Risk Measures Applied to Markov Decision Processes
    MINIMIZING SPECTRAL RISK MEASURES APPLIED TO MARKOV DECISION PROCESSES. NICOLE BÄUERLE AND ALEXANDER GLAUNER.
    Abstract. We study the minimization of a spectral risk measure of the total discounted cost generated by a Markov Decision Process (MDP) over a finite or infinite planning horizon. The MDP is assumed to have Borel state and action spaces and the cost function may be unbounded above. The optimization problem is split into two minimization problems using an infimum representation for spectral risk measures. We show that the inner minimization problem can be solved as an ordinary MDP on an extended state space and give sufficient conditions under which an optimal policy exists. Regarding the infinite dimensional outer minimization problem, we prove the existence of a solution and derive an algorithm for its numerical approximation. Our results include the findings in Bäuerle and Ott (2011) in the special case that the risk measure is Expected Shortfall. As an application, we present a dynamic extension of the classical static optimal reinsurance problem, where an insurance company minimizes its cost of capital.
    Key words: Risk-Sensitive Markov Decision Process; Spectral Risk Measure; Dynamic Reinsurance. AMS subject classifications: 90C40, 91G70, 91G05.
    1. Introduction. In the last decade, there have been various proposals to replace the expectation in the optimization of Markov Decision Processes (MDPs) by risk measures. The idea behind it is to take the risk-sensitivity of the decision maker into account. Using simply the expectation models a risk-neutral decision maker whose optimal policy sometimes can be very risky, for an example see e.g.
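    A spectral risk measure weights the quantiles of the loss distribution by a risk spectrum. The sketch below estimates only this static quantity from samples; it does not reproduce the MDP optimization of the paper, and the exponential spectrum and all numbers are purely illustrative.

```python
import numpy as np

def spectral_risk_measure(losses, spectrum, n_grid=10_000):
    """Estimate rho_phi(X) = integral_0^1 phi(u) * VaR_u(X) du from samples.

    spectrum: risk spectrum phi on (0,1); nonnegative, nondecreasing and
              integrating to one for a coherent measure.
    """
    u = (np.arange(n_grid) + 0.5) / n_grid     # midpoint grid on (0, 1)
    var_u = np.quantile(losses, u)             # empirical VaR at each level
    return np.mean(spectrum(u) * var_u)        # midpoint approximation of the integral

def exp_spectrum(k):
    """Exponential risk spectrum phi_k(u) = k * exp(-k*(1-u)) / (1 - exp(-k))."""
    return lambda u: k * np.exp(-k * (1.0 - u)) / (1.0 - np.exp(-k))

rng = np.random.default_rng(2)
losses = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)
print(round(spectral_risk_measure(losses, exp_spectrum(5.0)), 3))
```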
  • Robust Spectral Risk Optimization When Information on Risk Spectrum Is Incomplete
    Robust Spectral Risk Optimization When Information on Risk Spectrum Is Incomplete. Wei Wang, Huifu Xu. June 25, 2020.
    Abstract: Spectral risk measure (SRM) is a weighted average of value at risk (VaR) where the weighting function (also known as risk spectrum or distortion function) characterizes the decision maker's risk attitude. In this paper, we consider the case where the decision maker's risk spectrum is ambiguous and introduce a robust SRM model based on the worst risk spectrum from a ball of risk spectra centred at a nominal risk spectrum. When the ball consists of step-like risk spectra, we show that the robust SRM can be computed by solving a linear programming problem. For the general case, we propose a step-like approximation scheme and derive an error bound for the approximation. As an application, we apply the proposed robust SRM to one-stage stochastic optimization with the objective of minimizing the robust SRM and propose an alternating iterative algorithm for solving the resulting minimax optimization problem. Moreover, to examine stability of the robust spectral risk optimization model with respect to perturbation of observed data from the underlying exogenous uncertainty in data-driven environments, we investigate statistical robustness of the model and derive sufficient conditions for the required stability.
    Keywords: robust spectral risk measure, robust spectral risk optimization, step-like approximation, alternating iterative algorithm, statistical robustness.
    1 Introduction. Spectral risk measure (SRM) is a weighted average of value at risk (VaR) where the weighting function captures an investor's risk preference. The weighting function is formally called risk spectrum as it determines the nature of SRM.
  • A Discussion on Recent Risk Measures with Application to Credit Risk: Calculating Risk Contributions and Identifying Risk Concentrations
    Risks, Article: A Discussion on Recent Risk Measures with Application to Credit Risk: Calculating Risk Contributions and Identifying Risk Concentrations. Matthias Fischer 1,*, Thorsten Moser 2 and Marius Pfeuffer 1.
    1 Lehrstuhl für Statistik und Ökonometrie, Universität Erlangen-Nürnberg, Lange Gasse 20, 90403 Nürnberg, Germany; [email protected]
    2 Risikocontrolling Kapitalanlagen, R+V Lebensversicherung AG, Raiffeisenplatz 1, 65189 Wiesbaden, Germany; [email protected]
    * Correspondence: matthias.fi[email protected]
    Received: 11 October 2018; Accepted: 30 November 2018; Published: 7 December 2018.
    Abstract: In both financial theory and practice, Value-at-Risk (VaR) has become the predominant risk measure in the last two decades. Nevertheless, there is a lively and controversial ongoing discussion about possible alternatives. Against this background, our first objective is to provide a current overview of related competitors with the focus on credit risk management, which includes definition, references, striking properties and classification. The second part is dedicated to the measurement of risk concentrations of credit portfolios. Typically, credit portfolio models are used to calculate the overall risk (measure) of a portfolio. Subsequently, Euler's allocation scheme is applied to break the portfolio risk down to single counterparties (or different subportfolios) in order to identify risk concentrations. We first carry together the Euler formulae for the risk measures under consideration. In two cases (Median Shortfall and Range-VaR), explicit formulae are presented for the first time. Afterwards, we present a comprehensive study for a benchmark portfolio according to Duellmann and Masschelein (2007) and nine different risk measures in conjunction with the Euler allocation. It is empirically shown that, in principle, all risk measures are capable of identifying both sectoral and single-name concentration.
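    For Expected Shortfall, the Euler contribution of position i can be estimated from simulated losses as the average loss of that position on the scenarios where the total loss exceeds its VaR. The following is a minimal Monte Carlo sketch with synthetic losses; it does not reproduce the paper's benchmark portfolio or the other eight risk measures.

```python
import numpy as np

def euler_es_contributions(loss_matrix, alpha=0.99):
    """Euler allocation of Expected Shortfall to single positions.

    loss_matrix: (n_scenarios, n_positions) simulated losses per position.
    Returns (portfolio ES, vector of Euler contributions); by construction
    the contributions sum to the portfolio ES estimate (full allocation).
    """
    total = loss_matrix.sum(axis=1)
    var = np.quantile(total, alpha)
    tail = total >= var                               # tail scenarios
    contributions = loss_matrix[tail].mean(axis=0)    # E[X_i | total >= VaR]
    return total[tail].mean(), contributions

rng = np.random.default_rng(3)
X = rng.standard_t(df=4, size=(200_000, 3)) * np.array([1.0, 2.0, 0.5])
es, contrib = euler_es_contributions(X)
print(round(es, 3), np.round(contrib, 3), round(contrib.sum(), 3))
```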
  • Determination of Risk Pricing Measures from Market Prices of Risk
    Determination of risk pricing measures from market prices of risk. Henryk Gzyl and Silvia Mayoral*. IESA, Caracas, Venezuela and UNAV, Pamplona, España.
    Abstract: A new insurance provider or a regulatory agency may be interested in determining a risk measure consistent with observed market prices of a collection of risks. Using a relationship between distorted coherent risk measures and spectral risk measures, we provide a method for reconstructing distortion functions from the observed prices of risk. The technique is based on an appropriate application of the method of maximum entropy in the mean, which builds upon the classical method of maximum entropy.
    Key words: Distortion function, Spectral measures, Risk Aversion Function, Maximum entropy in the mean, Inverse problems.
    * The research of this author was partially funded by Welzia Management, SGIIC SA, RD Sistemas SA, Comunidad Autónoma de Madrid Grant s-0505/tic/000230, and MEyC Grant BEC2000-1388-C04-03.
    1 Introduction. The problem of pricing actuarial risk has received a great deal of attention in recent years, generating a large amount of theoretical and practical work. A good account of the traditional and modern pricing methods appears in the book by Kaas, Goovaerts, Dhaene and Denuit (Kaas et al. (2005)). Among pricing measures the class of distorted risk measures is popular and appealing. For further details and the general philosophy of distorted risk measures the interested reader may consult Wang (1996), Wang, Young and Panjer (1997), and Wang and Young (1998), where, building on Quiggin (1982), Yaari (1987), and Schmeidler (1989), an axiomatic characterization and many applications to insurance can be found.
  • Estimation of Distortion Risk Measures
    Int. Statistical Inst.: Proc. 58th World Statistical Congress, 2011, Dublin (Session CPS046), p. 5035. Estimation of Distortion Risk Measures. Hideatsu Tsukahara, Department of Economics, Seijo University, Tokyo; e-mail address: [email protected]
    The concept of coherent risk measure was introduced in Artzner et al. (1999). They listed some properties, called axioms of 'coherence', that any good risk measure should possess, and studied the (non-)coherence of widely used risk measures such as Value-at-Risk (VaR) and expected shortfall (also known as tail conditional expectation or tail VaR). Kusuoka (2001) introduced two additional axioms called law invariance and comonotonic additivity, and proved that the class of coherent risk measures satisfying these two axioms coincides with the class of distortion risk measures with convex distortions. To be more specific, let X be a random variable representing a loss of some financial position, and let $F(x) := P(X \le x)$ be the distribution function (df) of X. We denote its quantile function by $F^{-1}(u) := \inf\{x \in \mathbb{R} : F(x) \ge u\}$, $0 < u < 1$. A distortion risk measure is then of the following form
    $$\rho_D(X) := \int_{[0,1]} F^{-1}(u)\, dD(u) = \int_{\mathbb{R}} x\, d(D \circ F)(x), \qquad (1)$$
    where D is a distortion function, which is simply a df D on [0,1], i.e., a right-continuous, increasing function on [0,1] satisfying D(0) = 0 and D(1) = 1. For $\rho_D(X)$ to be coherent, D must be convex, which we assume throughout this paper. The celebrated VaR can be written in the form (1), but with non-convex D; this implies that the VaR is not coherent.
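    A natural plug-in estimate of the quantity in (1) replaces F by the empirical distribution function, which weights the order statistics by increments of D. The sketch below shows only this plain L-statistic form, not the estimators analyzed in the paper; the convex distortion D(u) = u^2 is chosen purely for illustration.

```python
import numpy as np

def distortion_risk_measure(losses, distortion):
    """Plug-in estimate of rho_D(X) = integral_0^1 F^{-1}(u) dD(u).

    Weights the i-th order statistic by D(i/n) - D((i-1)/n).
    distortion: a df D on [0,1] with D(0) = 0, D(1) = 1; convex for coherence.
    """
    x = np.sort(losses)
    n = x.size
    grid = np.arange(n + 1) / n
    weights = np.diff(distortion(grid))     # D(i/n) - D((i-1)/n)
    return np.dot(weights, x)

rng = np.random.default_rng(4)
losses = rng.exponential(scale=1.0, size=100_000)
# With D(u) = u^2 this equals E[max(X1, X2)], which is 1.5 for Exp(1) losses.
print(round(distortion_risk_measure(losses, lambda u: u**2), 3))
```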
  • Simulation-Based Portfolio Optimization with Coherent Distortion Risk Measures
    Degree Project in Mathematics, Second Cycle, 30 credits, Stockholm, Sweden 2020. Simulation-Based Portfolio Optimization with Coherent Distortion Risk Measures. Andreas Prastorfer. KTH Royal Institute of Technology, School of Engineering Sciences. Degree Projects in Financial Mathematics (30 ECTS credits), Degree Programme in Applied and Computational Mathematics, year 2020. Supervisor at SAS Institute: Jimmy Skoglund. Supervisor and examiner at KTH: Camilla Johansson Landén. TRITA-SCI-GRU 2020:005, MAT-E 2020:05. KTH SCI, SE-100 44 Stockholm, Sweden. URL: www.kth.se/sci
    Abstract: This master's thesis studies portfolio optimization using linear programming algorithms. The contribution of this thesis is an extension of the convex framework for portfolio optimization with Conditional Value-at-Risk, introduced by Rockafellar and Uryasev [28]. The extended framework considers risk measures belonging to the intersection of the classes of coherent risk measures and distortion risk measures, which are known as coherent distortion risk measures. The considered risk measures belonging to this class are the Conditional Value-at-Risk, the Wang Transform, the Block Maxima and the Dual Block Maxima measures. The extended portfolio optimization framework is applied to a reference portfolio consisting of stocks, options and a bond index. All assets are from the Swedish market. The returns of the assets in the reference portfolio are modelled with an elliptical distribution and normal copulas with asymmetric marginal return distributions. The portfolio optimization framework is a simulation-based framework that measures the risk using the simulated scenarios from the assumed portfolio distribution model.
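    The scenario-based Conditional Value-at-Risk optimization underlying such frameworks can be written as a small linear program (the Rockafellar-Uryasev linearization). The sketch below uses synthetic return scenarios and a plain long-only constraint; the thesis's coherent distortion measures, copula model and reference portfolio are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_weights(returns, alpha=0.95):
    """Rockafellar-Uryasev LP: long-only weights minimizing CVaR_alpha of the
    portfolio loss, based on simulated return scenarios.

    returns: (n_scenarios, n_assets) scenario returns.
    Decision vector x = (w_1..w_n, t, z_1..z_N).
    """
    N, n = returns.shape
    c = np.concatenate([np.zeros(n), [1.0], np.full(N, 1.0 / ((1.0 - alpha) * N))])
    # z_j >= -returns_j . w - t   <=>   -returns_j . w - t - z_j <= 0
    A_ub = np.hstack([-returns, -np.ones((N, 1)), -np.eye(N)])
    b_ub = np.zeros(N)
    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(N)]).reshape(1, -1)  # sum(w) = 1
    b_eq = np.array([1.0])
    bounds = [(0.0, None)] * n + [(None, None)] + [(0.0, None)] * N
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[:n], res.fun        # optimal weights and minimal CVaR

rng = np.random.default_rng(5)
R = rng.multivariate_normal([0.03, 0.05, 0.01], np.diag([0.02, 0.08, 0.005]), size=1000)
w, cvar = min_cvar_weights(R)
print(np.round(w, 3), round(cvar, 4))
```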
  • Nonparametric Estimation of Risk Measures of Collective Risks
    Nonparametric estimation of risk measures of collective risks. Alexandra Lauer, Henryk Zähle. Department of Mathematics, Saarland University.
    Abstract: We consider two nonparametric estimators for the risk measure of the sum of n i.i.d. individual insurance risks where the number of historical single claims that are used for the statistical estimation is of order n. This framework matches the situation that nonlife insurance companies are faced with within the scope of premium calculation. Indeed, the risk measure of the aggregate risk divided by n can be seen as a suitable premium for each of the individual risks. For both estimators divided by n we derive a sort of Marcinkiewicz-Zygmund strong law as well as a weak limit theorem. The behavior of the estimators for small to moderate n is studied by means of Monte-Carlo simulations.
    Keywords: Aggregate risk, Total claim distribution, Convolution, Law-invariant risk measure, Nonuniform Berry-Esséen inequality, Marcinkiewicz-Zygmund strong law, Weak limit theorem, Panjer recursion. arXiv:1504.02693v2 [math.ST] 16 Sep 2015.
    1. Introduction. Let $(X_i)$ be a sequence of nonnegative i.i.d. random variables on a common probability space with distribution $\mu$. In the context of actuarial theory, the random variable $S_n := \sum_{i=1}^n X_i$ can be seen as the total claim of a homogeneous insurance collective consisting of n risks. The distribution of $S_n$ is given by the n-fold convolution $\mu^{*n}$ of $\mu$. A central task in insurance practice is the specification of the premium $\mathcal{R}_\rho(\mu^{*n})$ for the aggregate risk $S_n$, where $\mathcal{R}_\rho$ is the statistical functional associated with any suitable law-invariant risk measure $\rho$ (henceforth referred to as risk functional associated with $\rho$).
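    One way to read the premium $\mathcal{R}_\rho(\mu^{*n})/n$ concretely is as a Monte Carlo plug-in: resample n claims from the observed single claims, sum them, and apply the risk measure to the simulated aggregate. The sketch below is only this naive plug-in illustration, not the estimators analyzed in the paper; the claim data, the choice of Expected Shortfall and all parameters are hypothetical.

```python
import numpy as np

def premium_per_risk(single_claims, n, rho, n_sim=20_000, rng=None):
    """Monte Carlo plug-in estimate of rho(S_n)/n, where S_n sums n i.i.d. risks
    resampled from the observed claims (empirical approximation of mu^{*n})."""
    if rng is None:
        rng = np.random.default_rng()
    sums = rng.choice(single_claims, size=(n_sim, n), replace=True).sum(axis=1)
    return rho(sums) / n

# Expected Shortfall at level 99% as an example of a law-invariant risk measure
es_99 = lambda s: s[s >= np.quantile(s, 0.99)].mean()
claims = np.random.default_rng(7).pareto(3.0, size=5_000) + 1.0   # hypothetical claim data
print(round(premium_per_risk(claims, n=500, rho=es_99, rng=np.random.default_rng(8)), 4))
```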
  • Orlicz Risk Measures
    Convex Risk Measures based on Divergence. Paul Dommel, Alois Pichler. March 26, 2020. arXiv:2003.07648v2 [q-fin.RM] 25 Mar 2020.
    Abstract: Risk measures connect probability theory or statistics to optimization, particularly to convex optimization. They are nowadays standard in applications of finance and in insurance involving risk aversion. This paper investigates a wide class of risk measures on Orlicz spaces. The characterizing function describes the decision maker's risk assessment towards increasing losses. We link the risk measures to a crucial formula developed by Rockafellar for the Average Value-at-Risk based on convex duality, which is fundamental in corresponding optimization problems. We characterize the dual and provide complementary representations.
    Keywords: risk measures, Orlicz spaces, Duality. MSC classification: 91G70, 94A17, 46E30, 49N1.
    1 Introduction. Risk measures are of fundamental importance in assessing risk, they have numerous applications in finance and in actuarial mathematics. A cornerstone is the Average Value-at-Risk, which has been considered in insurance first. Rockafellar and Uryasev (2002, 2000) develop its dual representation, which is an important tool when employing risk measures for concrete optimization. Even more, the Average Value-at-Risk is the major building block in what is known as the Kusuoka representation. The duality relations are also elaborated in Ogryczak and Ruszczyński (2002, 1999). Risk measures are most typically considered on Lebesgue spaces as L1 or L∞, although these are not the most general Banach space to consider them. An important reason for choosing this domain is that risk measures are Lipschitz continuous on L∞. A wide class of risk measures can be properly defined on function spaces as Orlicz spaces. These risk functionals get some attention in Bellini et al.
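    The Rockafellar formula for the Average Value-at-Risk mentioned above states that AVaR_alpha(X) = min over t of { t + E[(X - t)^+] / (1 - alpha) }. The following short numerical check (illustrative only, with synthetic normal losses) compares the variational formula with the direct tail-average computation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def avar_variational(losses, alpha=0.95):
    """Average Value-at-Risk via Rockafellar's variational formula
    AVaR_alpha(X) = min_t { t + E[(X - t)^+] / (1 - alpha) }."""
    objective = lambda t: t + np.mean(np.maximum(losses - t, 0.0)) / (1.0 - alpha)
    return minimize_scalar(objective).fun

def avar_tail_average(losses, alpha=0.95):
    """The same quantity computed directly from the alpha-quantile of the losses."""
    q = np.quantile(losses, alpha)
    return q + np.mean(np.maximum(losses - q, 0.0)) / (1.0 - alpha)

rng = np.random.default_rng(6)
X = rng.normal(size=200_000)
# Both values should agree and lie close to 2.06, the AVaR_0.95 of a standard normal.
print(round(avar_variational(X), 3), round(avar_tail_average(X), 3))
```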
  • Distortion Risk Measures, ROC Curves, and Distortion Divergence
    UvA-DARE (Digital Academic Repository). Distortion risk measures, ROC curves, and distortion divergence. Schumacher, J.M. DOI: 10.2139/ssrn.2956334. Publication date: 2017. Document version: Submitted manuscript.
    Citation for published version (APA): Schumacher, J. M. (2017). Distortion risk measures, ROC curves, and distortion divergence. University of Amsterdam, section Quantitative Economics. https://doi.org/10.2139/ssrn.2956334
    Distortion Risk Measures, ROC Curves, and Distortion Divergence. Johannes M. Schumacher. October 18, 2017.
    Abstract: Distortion functions are employed to define measures of risk. Receiver operating characteristic (ROC) curves are used to describe the performance of parametrized test families in testing a simple null hypothesis against a simple alternative.
  • An Axiomatic Foundation for the Expected Shortfall
    An Axiomatic Foundation for the Expected Shortfall. Ruodu Wang, Ričardas Zitikis. 12th March 2020.
    Abstract: In the recent Basel Accords, the Expected Shortfall (ES) replaces the Value-at-Risk (VaR) as the standard risk measure for market risk in the banking sector, making it the most popular risk measure in financial regulation. Although ES is, in addition to many other nice properties, a coherent risk measure, it does not yet have an axiomatic foundation. In this paper we put forward four intuitive economic axioms for portfolio risk assessment (monotonicity, law invariance, prudence and no reward for concentration) that uniquely characterize the family of ES. The herein developed results, therefore, provide the first economic foundation for using ES as a globally dominating regulatory risk measure, currently employed in Basel III/IV. Key to the main results, several novel notions such as tail events and risk concentration naturally arise, and we explore them in detail. As a most important feature, ES rewards portfolio diversification and penalizes risk concentration in a special and intuitive way, not shared by any other risk measure.
    Keywords: Risk measure, Expected Shortfall, risk concentration, diversification, risk aggregation.
    Ruodu Wang: Department of Statistics and Actuarial Science, University of Waterloo, Waterloo, Ontario N2L 5A7, Canada. E-mail: [email protected]
    Ričardas Zitikis: School of Mathematical and Statistical Sciences, University of Western Ontario, London, Ontario N6A 5B7, Canada. E-mail: [email protected]
  • Messages from the Academic Literature on Risk Measurement for the Trading Book
    Basel Committee on Banking Supervision, Working Paper No. 19. Messages from the academic literature on risk measurement for the trading book. 31 January 2011.
    The Working Papers of the Basel Committee on Banking Supervision contain analysis carried out by experts of the Basel Committee or its working groups. They may also reflect work carried out by one or more member institutions or by its Secretariat. The subjects of the Working Papers are of topical interest to supervisors and are technical in character. The views expressed in the Working Papers are those of their authors and do not represent the official views of the Basel Committee, its member institutions or the BIS. Copies of publications are available from: Bank for International Settlements, Communications, CH-4002 Basel, Switzerland. E-mail: [email protected]. Fax: +41 61 280 9100 and +41 61 280 8100. © Bank for International Settlements 2011. All rights reserved. Limited extracts may be reproduced or translated provided the source is stated. ISSN: 1561-8854.
    Contents: Executive summary; 1. Selected lessons on VaR implementation; 2. Incorporating liquidity; 3. Risk measures; 4. Stress testing practices for market risk
  • Distortion Risk Measures Or the Transformation of Unimodal Distributions Into Multimodal Functions Dominique Guegan, Bertrand Hassani
    Distortion Risk Measures or the Transformation of Unimodal Distributions into Multimodal Functions. Dominique Guégan, Bertrand Hassani. 2014. HAL Id: halshs-00969242, https://halshs.archives-ouvertes.fr/halshs-00969242. Submitted on 2 Apr 2014.
    Documents de Travail du Centre d'Economie de la Sorbonne, 2014.08. Maison des Sciences Économiques, 106-112 boulevard de L'Hôpital, 75647 Paris Cedex 13. http://centredeconomiesorbonne.univ-paris1.fr/ ISSN: 1955-611X.
    Distortion Risk Measures or the Transformation of Unimodal Distributions into Multimodal Functions. D. Guégan, B. Hassani. February 10, 2014.
    • Pr. Dr. Dominique Guégan: University Paris 1 Panthéon-Sorbonne and NYU - Poly, New York, CES UMR 8174, 106 boulevard de l'Hôpital, 75647 Paris Cedex 13, France, phone: +33144078298, e-mail: [email protected].
    • Dr. Bertrand K. Hassani: University Paris 1 Panthéon-Sorbonne, CES UMR 8174 and Grupo Santander, 106 boulevard de l'Hôpital, 75647 Paris Cedex 13, France, phone: +44 (0)2070860973, e-mail: [email protected].