Higher Moment Coherent Risk Measures∗

Pavlo A. Krokhmal
Department of Mechanical and Industrial Engineering
The University of Iowa, 2403 Seamans Center, Iowa City, IA 52242
E-mail: [email protected]

April 2007

Abstract

The paper considers modeling of risk-averse preferences in stochastic programming problems using risk measures. We utilize the axiomatic foundation of coherent risk measures and deviation measures in order to develop simple representations that express risk measures via specially constructed stochastic programming problems. Using the developed representations, we introduce a new family of higher-moment coherent risk measures (HMCR), which includes, as a special case, the Conditional Value-at-Risk measure. It is demonstrated that the HMCR measures are compatible with second-order stochastic dominance and utility theory, can be efficiently implemented in stochastic optimization models, and perform well in portfolio optimization case studies.

Keywords: risk measures, stochastic programming, stochastic dominance, portfolio optimization

1 Introduction

Research and practice of portfolio management and optimization are driven to a large extent by tailoring the measures of reward (satisfaction) and risk (dissatisfaction/regret) of the investment venture to the specific preferences of an investor. While there is broad consensus that an investment’s reward may be adequately associated with its expected return, the methods for proper modeling and measurement of an investment’s risk are subject to much more pondering and debate. In fact, risk-reward or mean-risk models constitute an important part of investment science and, more generally, of the field of decision making under uncertainty. The cornerstone of modern portfolio analysis was laid by Markowitz (1952, 1959), who advocated identifying a portfolio’s risk with the volatility (variance) of its returns.
On the other hand, Markowitz’s work led to formalization of the fundamental view that any decision under uncertainty may be evaluated in terms of its risk and reward. Markowitz’s seminal ideas are still widely used today in many areas of decision making, and the entire paradigm of bi-criteria “risk-reward” optimization has received extensive development, both in increasing computational efficiency and in enhancing the models for risk measurement and estimation. At the same time, it has been recognized that the symmetric attitude of the classical Mean-Variance (MV) approach, where both the “positive” and “negative” deviations from the expected level are penalized equally, does not always yield an adequate estimate of the risks induced by the uncertainties. Hence, significant effort has been devoted to the development of downside risk measures and models. Replacing the variance by the lower standard semideviation as a measure of investment risk, so as to take into account only “negative” deviations from the expected level, was proposed as early as Markowitz (1959); see also the more recent works by Ogryczak and Ruszczyński (1999, 2001, 2002).

∗ Supported in part by NSF grant DMI 0457473.

Among the popular downside risk models we mention the Lower Partial Moment and its special case, the Expected Regret, which is also known as the Integrated Chance Constraint in stochastic programming (Bawa, 1975; Fishburn, 1977; Dembo and Rosen, 1999; Testuri and Uryasev, 2003; van der Vlerk, 2003). Widely known in the finance and banking industry is the Value-at-Risk measure (JP Morgan, 1994; Jorion, 1997; Duffie and Pan, 1997). Being simply a quantile of the loss distribution, the Value-at-Risk (VaR) concept has its counterparts in stochastic optimization (probabilistic, or chance, programming; see Prékopa, 1995), reliability theory, etc.
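The asymmetry argument above can be made concrete with a small numerical sketch (illustrative code, not from the paper; all names are hypothetical): the ordinary standard deviation penalizes deviations on both sides of the mean equally, while the lower standard semideviation counts only shortfalls below the mean, so a single favorable outlier inflates the former but not the latter.

```python
import math

def stdev(xs):
    """Population standard deviation: penalizes deviations on both
    sides of the mean symmetrically, as in the classical MV approach."""
    mu = sum(xs) / len(xs)
    return math.sqrt(sum((x - mu) ** 2 for x in xs) / len(xs))

def lower_semidev(xs):
    """Lower standard semideviation: only shortfalls below the mean count,
    in the spirit of the downside risk measures discussed above."""
    mu = sum(xs) / len(xs)
    return math.sqrt(sum(max(mu - x, 0.0) ** 2 for x in xs) / len(xs))

# A right-skewed return sample: frequent small losses, one large gain.
# The symmetric measure is inflated by the favorable outlier, while the
# semideviation reflects only the downside.
returns = [-0.02, -0.015, -0.01, -0.005, 0.10]
print(stdev(returns), lower_semidev(returns))
```

For this sample the semideviation is less than half the standard deviation, illustrating why a symmetric measure can overstate the risk of a favorably skewed position.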
Yet, minimization or control of risk using the VaR measure proved to be technically and methodologically difficult, mainly due to VaR’s notorious non-convexity as a function of the decision variables. A downside risk measure that circumvents the shortcomings of VaR while offering a similar quantile approach to the estimation of risk is the Conditional Value-at-Risk (CVaR) measure (Rockafellar and Uryasev, 2000, 2002; Krokhmal et al., 2002a). Risk measures that are similar to CVaR, and may coincide with it, include Expected Shortfall and Tail VaR (Acerbi and Tasche, 2002); see also the Conditional Drawdown-at-Risk (Chekhlov et al., 2005; Krokhmal et al., 2002b). A simple yet effective risk measure closely related to CVaR is the so-called Maximum Loss, or Worst-Case Risk (Young, 1998; Krokhmal et al., 2002b), whose use in problems with uncertainties is also known as the robust optimization approach (see, e.g., Kouvelis and Yu, 1997).

In the last few years, the formal theory of risk measures received a major impetus from the works of Artzner et al. (1999) and Delbaen (2002), who introduced an axiomatic approach to the definition and construction of risk measures by developing the concept of coherent risk measures. Among the risk measures that satisfy the coherency properties are Conditional Value-at-Risk and Maximum Loss (Pflug, 2000; Acerbi and Tasche, 2002), coherent risk measures based on one-sided moments (Fischer, 2003), etc. Recently, Rockafellar et al. (2006) have extended the theory of risk measures to the case of deviation measures and demonstrated a close relationship between coherent risk measures and deviation measures; spectral measures of risk have been proposed by Acerbi (2002).
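The distinction between the two quantile-based measures above can be seen in a minimal sketch (illustrative, not the paper's formulation): historical VaR is an empirical quantile of a loss sample, while CVaR averages the losses in the tail beyond that quantile. The simple tail-average estimator below ignores the weighted split of the probability atom at the quantile that the exact discrete CVaR requires, which is an acceptable simplification for illustration.

```python
import math

def historical_var(losses, alpha):
    """Empirical alpha-quantile of a loss sample (larger loss = worse)."""
    xs = sorted(losses)
    return xs[math.ceil(alpha * len(xs)) - 1]

def historical_cvar(losses, alpha):
    """Average of the losses at or beyond the empirical alpha-quantile.
    A simple estimator; the exact discrete CVaR splits the quantile atom."""
    xs = sorted(losses)
    k = math.ceil(alpha * len(xs)) - 1
    tail = xs[k:]
    return sum(tail) / len(tail)

losses = [float(x) for x in range(1, 101)]   # toy loss scenarios 1..100
print(historical_var(losses, 0.95), historical_cvar(losses, 0.95))
```

By construction CVaR is never smaller than VaR at the same confidence level: it answers "how bad are the losses beyond the quantile, on average" rather than "what loss is exceeded with small probability."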
An approach to decision making under uncertainty different from the risk-reward paradigm is embodied by the von Neumann and Morgenstern (vNM) utility theory, which exercises a mathematically sound axiomatic description of preferences and construction of the corresponding decision strategies. Along with its numerous modifications and extensions, the vNM utility theory is widely adopted as a basic model of rational choice, especially in economics and social sciences (see, among others, Fishburn, 1970, 1988; Karni and Schmeidler, 1991). Thus, substantial attention has been paid in the literature to the development of risk-reward optimization models and risk measures that are consistent with expected utility maximization. In particular, it has been shown that under certain conditions the Markowitz MV framework is consistent with the vNM theory (Kroll et al., 1984). Ogryczak and Ruszczyński (1999, 2001, 2002) developed mean-semideviation models that are consistent with stochastic dominance concepts (Fishburn, 1964; Rothschild and Stiglitz, 1970; Levy, 1998); a class of risk-reward models with SSD-consistent coherent risk measures was discussed in De Giorgi (2005). Optimization with stochastic dominance constraints was recently considered by Dentcheva and Ruszczyński (2003); stochastic dominance-based portfolio construction was discussed in Roman et al. (2006).

In this paper we aim to offer additional insight into the properties of axiomatically defined measures of risk by developing a number of representations that express risk measures via solutions of stochastic programming problems (Section 2.1); using the developed representations, we construct a new family of higher-moment coherent risk (HMCR) measures. In Section 2.2 it is demonstrated that the suggested representations are amenable to seamless incorporation into stochastic programming problems.
In particular, implementation of the HMCR measures reduces to p-order conic programming, and can be approximated via linear programming. Section 2.3 shows that the developed results are applicable to deviation measures, while Section 2.4 illustrates that the HMCR measures are compatible with second-order stochastic dominance and utility theory. The conducted case study (Section 3) indicates that the family of HMCR measures has a strong potential for practical application in portfolio selection problems. Finally, the Appendix contains the proofs of the theorems introduced in the paper.

2 Modeling of risk measures as stochastic programs

The discussion in the Introduction has illustrated the variety of approaches to the definition and estimation of risk. Arguably, the recent advances in risk theory are associated with the axiomatic approach to construction of risk measures pioneered by Artzner et al. (1999). The present endeavor essentially exploits this axiomatic approach in order to devise simple computational recipes for dealing with several types of risk measures by representing them in the form of stochastic programming problems. These representations can be used to create new risk measures tailored to specific risk preferences, as well as to incorporate these preferences into stochastic programming problems. In particular, we present a new family of Higher Moment Coherent Risk (HMCR) measures. It will be shown that the HMCR measures are well behaved in terms of theoretical properties, and demonstrate very promising performance in test applications.

Within the axiomatic framework of risk analysis, a risk measure R(X) of a random outcome X from some probability space (Ω, F, µ) may be defined as a mapping R : 𝒳 → ℝ, where 𝒳 is a linear space of F-measurable functions X : Ω → ℝ. In a more general setting one may assume 𝒳 to be a separated locally convex space; for our purposes it suffices to consider 𝒳 = Lp(Ω, F, P), 1 ≤ p ≤ ∞, where the particular value of p shall be clear from the context. Following the tradition of convex analysis, we call a function f : 𝒳 → ℝ proper if f(X) > −∞ for all X ∈ 𝒳 and dom f ≠ ∅, i.e., there exists X ∈ 𝒳 such that f(X) < +∞ (see, e.g.,
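To make the idea of expressing a risk measure as the optimal value of a stochastic program concrete, the following sketch assumes a representation of the form min over η of η + (1 − α)⁻¹ (E[((X − η)₊)ᵖ])^(1/p) for a discrete loss sample; the precise HMCR definition is the one given in Section 2.1 of the paper, and all names below are illustrative. For p = 1 this is the familiar Rockafellar–Uryasev minimization formula for CVaR; p > 1 weights large shortfalls more heavily.

```python
def hmcr(losses, alpha, p, iters=200):
    """Sketch of a higher-moment risk measure of a discrete loss sample,
    computed by minimizing the scalar convex objective
        eta + (1 - alpha)**-1 * (mean of ((x - eta)^+)^p)^(1/p)
    over eta by ternary search (valid because the objective is convex).
    p = 1 recovers Conditional Value-at-Risk."""
    n = len(losses)

    def objective(eta):
        moment = sum(max(x - eta, 0.0) ** p for x in losses) / n
        return eta + moment ** (1.0 / p) / (1.0 - alpha)

    lo, hi = min(losses), max(losses)   # a minimizer lies in this bracket
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if objective(m1) <= objective(m2):
            hi = m2
        else:
            lo = m1
    return objective((lo + hi) / 2.0)

losses = [float(x) for x in range(1, 101)]
print(hmcr(losses, 0.95, 1))   # p = 1 recovers CVaR; ≈ 98 for this sample
print(hmcr(losses, 0.95, 2))   # p = 2 penalizes large shortfalls more
```

In an actual stochastic programming model the inner minimization would not be solved by search: η becomes one more decision variable, and for p > 1 the p-norm term is handled by a p-order cone constraint (or its linear approximation), which is the reduction described above.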