Credit Risk Evaluation Modeling – Analysis – Management
Total pages: 16 | File type: PDF | Size: 1,020 KB
Recommended publications
Basel Violations, Volatility Model Variants and Value at Risk: Optimization of Performance Deviations in Banks
Economics and Business Letters 10(3), 240-248, 2021

Shahid Anjum (1, *) and Naveeda Qaseem (2)
1. Universiti Teknologi Brunei, Brunei Darussalam; 2. University of Westminster, United Kingdom

Received: 26 October 2020 | Revised: 12 February 2021 | Accepted: 14 February 2021

Abstract
Basel penalties originate from VaR violations, which depend on the variance model and the VaR-threshold estimation method a bank employs; violations lead to excess regulatory capital charges that can curb a bank's risk-taking capacity and hence its profitability. Several approaches are used to evaluate the performance of variance models and VaR-threshold forecasts: hypothesis tests, back-testing procedures, and the Basel Accord's regulatory calculations for penalty zones. This study introduces a multi-criteria performance measure, the analytic hierarchy process (AHP). The approach supports objective evaluation and selection of the optimal internal volatility model across performance-evaluation techniques, which in turn can reduce VaR violations and leave more capital in banks' hands for profitable ventures.

Keywords: volatility models; value-at-risk estimation and performance measures; multivariate techniques; multi-criteria decision making
JEL Classification: G21, K23, M16, N20, O16

1. Introduction
Volatility is the focus of market-risk measurement, management and reporting. Measuring the variance, or volatility, of the market portfolio is at the heart of estimating the VaR threshold for the market-risk capital charge (Bhandari, 2010; Dardac et al., 2011; Mehta et al., 2012) and thus of banks' compliance with Basel regulations (Anjum, 2018).
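The Basel penalty-zone calculation the abstract refers to can be sketched as a small function. The zone boundaries and multiplier add-ons below are the commonly cited "traffic light" values for 250 trading days of 99% VaR back-testing, quoted from memory as an illustration, not as this paper's own calibration:

```python
def basel_zone(violations):
    """Map the number of 99% VaR violations observed over 250 trading
    days to the Basel traffic-light zone and the capital multiplier
    (base multiplier 3.0 plus a zone-dependent add-on)."""
    yellow_add_ons = {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}
    if violations <= 4:       # green zone: model accepted, no add-on
        return "green", 3.0
    if violations <= 9:       # yellow zone: graduated penalty
        return "yellow", 3.0 + yellow_add_ons[violations]
    return "red", 4.0         # red zone: maximum penalty

def count_violations(returns, var_thresholds):
    """Count days on which the realised loss exceeded the VaR forecast."""
    return sum(1 for r, v in zip(returns, var_thresholds) if r < -v)
```

A volatility model that forecasts VaR thresholds well keeps the bank in the green zone and avoids the add-on, which is precisely the capital the paper argues can be freed for profitable ventures.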
A Guide to Presenting Clinical Prediction Models for Use in the Clinical Setting Laura J Bonnett, Kym IE Snell, Gary S Collins, Richard D Riley
Addresses & positions
- Laura J Bonnett – University of Liverpool (Tenure Track Fellow); [email protected]; Department of Biostatistics, Waterhouse Building, Block F, 1-5 Brownlow Street, University of Liverpool, Liverpool, L69 3GL; https://orcid.org/0000-0002-6981-9212
- Kym IE Snell – Centre for Prognosis Research, Research Institute for Primary Care and Health Sciences, Keele University (Research Fellow in Biostatistics); [email protected]; https://orcid.org/0000-0001-9373-6591
- Gary S Collins – Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, University of Oxford (Professor of Medical Statistics); [email protected]; https://orcid.org/0000-0002-2772-2316
- Richard D Riley – Centre for Prognosis Research, Research Institute for Primary Care and Health Sciences, Keele University (Professor of Biostatistics); [email protected]; https://orcid.org/0000-0001-8699-0735

Corresponding author: [email protected]

Standfirst
Clinical prediction models estimate the risk of existing disease or a future outcome for an individual, conditional on their values of multiple predictors such as age, sex and biomarkers. In this article, Bonnett and colleagues provide a guide to presenting clinical prediction models so that they can be implemented in practice, if appropriate. They describe how to create four presentation formats, and discuss the advantages and disadvantages of each. A key message is the need for stakeholder engagement to determine the best presentation option in relation to the clinical context of use and the intended user.
Point-in-Time versus Through-the-Cycle Ratings
Authors: Scott D. Aguais, Lawrence R. Forest, Jr., Elaine Y. L. Wong and Diana Diaz-Ledezma (2)

1. The authors would like to acknowledge the many Basel- and credit-risk-related discussions they have had with various members of the Barclays Risk Management Team over the last year. The authors also want to thank Tim Thompson, Julian Shaw and Brian Ranson for helpful comments. Moody's KMV also deserves thanks for providing a historical data set of five-year KMV EDF term structures for all of their counterparties. Finally, we thank Zoran Stanisavljevic for his unending credit-data management support for our credit research and modelling efforts. All errors remain the responsibility of the authors. The views and opinions expressed in this chapter are those of the authors and do not necessarily reflect the views and opinions of Barclays Bank PLC.
2. Scott Aguais is Director and Head of Credit Risk Methodology. Lawrence Forest and Elaine Wong are Associate Directors, and Diana Diaz-Ledezma is a Manager, all on the Credit Risk Methodology Team. All four work at Barclays Capital, the investment banking unit of Barclays.

Draft – For Discussion Purposes Only

I. Introduction
To qualify for advanced internal ratings-based (AIRB) status under the Basel II Capital Accord, "internationally active banks" must document the validity not only of their probability of default (PD) estimates but also of the loss given default (LGD) and exposure at default (EAD) measures that they plan to use as alternatives to the foundation internal ratings-based (FIRB) assumptions.
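The three AIRB inputs named in the introduction combine into the standard expected-loss identity EL = PD x LGD x EAD. A minimal sketch, with illustrative parameter values; the 45% LGD used for the FIRB comparison is the commonly cited supervisory figure for senior unsecured exposures, quoted from memory:

```python
def expected_loss(pd, lgd, ead):
    """Expected loss: probability of default x loss given default
    x exposure at default."""
    return pd * lgd * ead

# AIRB: the bank estimates PD, LGD and EAD itself.
airb_el = expected_loss(pd=0.02, lgd=0.35, ead=1_000_000)

# FIRB: own PD estimate, but supervisory LGD (e.g. 45% senior unsecured).
firb_el = expected_loss(pd=0.02, lgd=0.45, ead=1_000_000)
```

The gap between the two figures is why banks invest in validating their own LGD and EAD estimates: a defensible internal LGD below the supervisory assumption directly lowers measured expected loss.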
Systematic Risk of CDOs and CDO Arbitrage
Alfred Hamerle (University of Regensburg), Thilo Liebig (Deutsche Bundesbank), Hans-Jochen Schropp (University of Regensburg)

Discussion Paper Series 2: Banking and Financial Studies No 13/2009

Discussion Papers represent the authors' personal opinions and do not necessarily reflect the views of the Deutsche Bundesbank or its staff.

Editorial Board: Heinz Herrmann, Thilo Liebig, Karl-Heinz Tödter

Deutsche Bundesbank, Wilhelm-Epstein-Strasse 14, 60431 Frankfurt am Main; Postfach 10 06 02, 60006 Frankfurt am Main. Tel. +49 69 9566-0; telex 41227 (within Germany), 414431 (from abroad). Please address all orders in writing to: Deutsche Bundesbank, Press and Public Relations Division, at the above address, or via fax +49 69 9566-3077. Internet: http://www.bundesbank.de. Reproduction permitted only if source is stated.

ISBN 978-3-86558-570-7 (print version), ISBN 978-3-86558-571-4 (internet version)

Abstract
"Arbitrage CDOs" recorded explosive growth during the years before the outbreak of the financial crisis. In the present paper we discuss potential sources of such arbitrage opportunities, in particular arbitrage gains due to mispricing. For this purpose we examine the risk profiles of collateralized debt obligations (CDOs) in some detail. The analyses reveal significant differences between the risk profiles of CDO tranches and corporate bonds, in particular the tranches' considerably increased sensitivity to systematic risk. This has far-reaching consequences for risk management, pricing and regulatory capital requirements. A simple analytical valuation model based on the CAPM and the single-factor Merton model is used in order to keep the framework simple.
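The single-factor Merton model the abstract mentions can be sketched with the standard Vasicek conditional-default-probability formula, plus a helper for how a portfolio loss hits a tranche; the correlation, PD and attachment points below are illustrative, not the paper's calibration:

```python
from statistics import NormalDist

_N = NormalDist()

def conditional_pd(pd, rho, m):
    """Default probability conditional on the systematic factor m in the
    one-factor Merton model: Phi((Phi^-1(pd) - sqrt(rho)*m) / sqrt(1-rho))."""
    return _N.cdf((_N.inv_cdf(pd) - rho ** 0.5 * m) / (1.0 - rho) ** 0.5)

def tranche_loss(portfolio_loss, attach, detach):
    """Fraction of a tranche with the given attachment/detachment points
    that is wiped out by a given portfolio loss rate."""
    return min(max(portfolio_loss - attach, 0.0), detach - attach) / (detach - attach)
```

In a bad systematic state (m well below zero) the conditional PD rises sharply, and a thin mezzanine tranche goes from untouched to fully written off over a narrow range of portfolio losses — the amplified systematic sensitivity, relative to a corporate bond, that the paper analyses.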
Hong Leong Finance Chooses Kamakura's Suite of Solutions for Asset and Liabilities Management, Funds Transfer Pricing, and Liquidity Management
Press contacts: Martin Zorn, President and COO, Kamakura Corporation, 2222 Kalakaua Avenue, 14th Floor, Honolulu, Hawaii 96815, USA. Telephone: 1-808-791-9888, extension 8700; fax: 808-791-9898; [email protected]; www.kamakuraco.com; www.kris-online.com

For Immediate Release

Hong Leong Finance Singapore Chooses Kamakura's Suite of Solutions for Asset and Liabilities Management, Funds Transfer Pricing, and Liquidity Management

SINGAPORE, Nov 14, 2017 – Hong Leong Finance has signed an agreement to use Kamakura Corporation's set of solutions for its balance sheet management, funds transfer pricing, and liquidity management processes. The Singapore-listed finance company, part of the Hong Leong Group Singapore, made its decision after a comprehensive upgrade exercise involving its risk management processes.

"Hong Leong Finance was clear in its risk vision, and wanted to incorporate an integrated framework for balance sheet management, funds transfer pricing and liquidity management initiatives," said Dr. Clement Ooi, Kamakura's EVP and Managing Director of Asia-Pacific Operations. "They wanted to load the data once, and reconcile the data once. Kamakura Corporation's solution fit the bill perfectly, providing the company with an accurate assessment of transaction cash flows that can then be used for gap, duration, and mismatch management, as well as providing a crystal-clear understanding of the margins associated with transactions. The company now has an integrated, holistic computation engine—the platform upon which it will build its integrated risk framework. The company will be one of the first to generate both historical and forward-looking views of its balance sheet and gain a transaction-level understanding of profit margins."
Transforming Summary Statistics from Logistic Regression to the Liability Scale: Application to Genetic and Environmental Risk Scores
bioRxiv preprint doi: https://doi.org/10.1101/385740; this version posted August 6, 2018. The copyright holder for this preprint (which was not certified by peer review) is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.

Transforming summary statistics from logistic regression to the liability scale: application to genetic and environmental risk scores

Alexandra C. Gillett (a), Evangelos Vassos (a), Cathryn M. Lewis (a, b, *)

a. Social, Genetic & Developmental Psychiatry Centre, Institute of Psychiatry, Psychology & Neuroscience, King's College London, London, UK
b. Department of Medical and Molecular Genetics, Faculty of Life Sciences and Medicine, King's College London, London, UK

Short title: Transforming the odds ratio to the liability scale

*Corresponding author: Cathryn M. Lewis, Institute of Psychiatry, Psychology & Neuroscience, Social, Genetic & Developmental Psychiatry Centre, King's College London, 16 De Crespigny Park, Denmark Hill, London, SE5 8AF, UK. Tel: +44 (0)20 7848 0873; fax: +44 (0)20 7848 0866; e-mail: [email protected]

Keywords: statistical genetics, risk, logistic regression, liability threshold model, schizophrenia

1. Abstract
1.1. Objective
Stratified medicine requires models of disease risk incorporating genetic and environmental factors.
The Use of Clinical and Laboratory Data to Detect
1. INTRODUCTION
The use of clinical and laboratory data to detect conditions and predict patient outcomes is a mainstay of medical practice. Classification and prediction are equally important in other fields of course (e.g., meteorology, economics, computer science) and have been subjects of statistical research for a long time. The field is currently receiving more attention in medicine, in part because of biotechnologic advancements that promise accurate non-invasive modes of testing. Technologies include gene expression arrays, protein mass spectrometry and new imaging modalities. These can be used for purposes such as detecting subclinical disease, evaluating the prognosis of patients with disease and predicting their responses to different choices of therapy. Statistical methods have been developed for assessing the accuracy of classifiers in medicine (Zhou, Obuchowski, and McClish 2002; Pepe 2003), although this is an area of statistics that is evolving rapidly.

In practice there may be multiple sources of information available to assist in prediction. For example, clinical signs and symptoms of disease may be supplemented by results of laboratory tests. As another example, it is expected that multiple biomarkers will be needed for detecting subclinical cancer with adequate sensitivity and specificity (Pepe et al. 2001). The starting point for the research we describe in this paper is the need to combine multiple predictors together, somehow, in order to predict outcome. We note in Section 2 that for a binary outcome, D, the best classifiers using the predictors Y = (Y_1, Y_2, ..., Y_P) are based on the risk score function

    RS(Y) = P(D = 1 | Y)    (1)

or equivalently any monotone increasing transformation of it.
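A common way to realise the risk score in equation (1) — collapsing the predictors Y into a single monotone function of P(D = 1 | Y) — is a logistic model. A minimal sketch with made-up coefficients; the two predictors stand in for, say, a clinical sign and a biomarker level:

```python
import math

def risk_score(y, intercept, coefs):
    """RS(Y) = P(D = 1 | Y) under a logistic model.  The linear
    predictor is itself a monotone increasing transform of RS(Y),
    so it ranks subjects identically."""
    linear_predictor = intercept + sum(b * yi for b, yi in zip(coefs, y))
    return 1.0 / (1.0 + math.exp(-linear_predictor))

# Hypothetical subjects: no risk factors vs. both elevated.
low = risk_score([0.0, 0.0], intercept=-2.0, coefs=[1.2, 0.8])
high = risk_score([1.0, 2.5], intercept=-2.0, coefs=[1.2, 0.8])
```

Because any monotone increasing transform of RS(Y) classifies equivalently, ranking subjects by the linear predictor alone gives the same ROC curve as ranking by the probability itself.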
Support Vector Hazards Machine: a Counting Process Framework for Learning Risk Scores for Censored Outcomes
Journal of Machine Learning Research 17 (2016) 1-37. Submitted 1/16; revised 7/16; published 8/16. Editor: Karsten Borgwardt.

Support Vector Hazards Machine: A Counting Process Framework for Learning Risk Scores for Censored Outcomes

Yuanjia Wang ([email protected]), Department of Biostatistics, Mailman School of Public Health, Columbia University, New York, NY 10032, USA
Tianle Chen ([email protected]), Biogen, 300 Binney Street, Cambridge, MA 02142, USA
Donglin Zeng ([email protected]), Department of Biostatistics, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA

Abstract
Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at-risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using the kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood.
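The counting-process view the abstract describes starts from the risk set at each observed event time: the subjects still under observation and therefore still able to experience the event. A minimal sketch with made-up right-censored data:

```python
def risk_sets(times, events):
    """For right-censored data (events[i] == 1 marks an observed event,
    0 a censoring), return the size of the risk set at each distinct
    event time: the number of subjects with follow-up time >= that time."""
    event_times = sorted({t for t, d in zip(times, events) if d == 1})
    return {t: sum(1 for ti in times if ti >= t) for t in event_times}

# Four subjects: events at t=2 and t=5, censorings at t=3 and t=6.
sets = risk_sets([2, 3, 5, 6], [1, 0, 1, 0])
```

The shrinking risk-set denominators are what SVHM's time-varying offset accounts for, playing the same role the denominator plays in the Cox partial likelihood.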
Developing a Credit Risk Model Using SAS® – Amos Taiwo Odeleye, TD Bank
Paper 3554-2019
Developing a Credit Risk Model Using SAS®
Amos Taiwo Odeleye, TD Bank

ABSTRACT
A credit risk score is an analytical method of modeling the credit riskiness of individual borrowers (prospects and customers). While there are several generic, one-size-might-fit-all risk scores developed by vendors, numerous factors are increasingly driving the development of in-house risk scores. This presentation introduces the audience to developing an in-house risk score using SAS®, reject-inference methodology, and machine learning and data science methods.

INTRODUCTION
According to Frank H. Knight (1921), University of Chicago professor, risk can be thought of as any variability that can be quantified. Generally speaking, there are four steps in risk management:
- Assess: identify risk
- Quantify: measure and estimate risk
- Manage: avoid, transfer, mitigate, or eliminate risk
- Monitor: evaluate the process and make necessary adjustments

In the financial environment, risk can be classified in several ways. In 2017, I coined what I termed the C-L-O-M-O risk acronym:
- Credit risk, e.g., default, recovery, collections, fraud
- Liquidity risk, e.g., funding, refinancing, trading
- Operational risk, e.g., bad actors, system or process interruption, natural disaster
- Market risk, e.g., interest rate, foreign exchange, equity, commodity price
- Other risk not already classified above, e.g., model risk, reputational risk, legal risk

In the pursuit of international financial stability and cooperation, there are financial regulators at the global
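Once a model produces a probability of default, scorecards conventionally rescale it to points via a log-odds transformation. The sketch below (in Python rather than SAS) uses the standard points-to-double-the-odds (PDO) scaling; the base score, base odds and PDO values are illustrative choices, not values from the paper:

```python
import math

def scorecard_points(p_default, base_score=600.0, base_odds=50.0, pdo=20.0):
    """Convert P(default) to scorecard points so that good:bad odds of
    base_odds score exactly base_score, and every pdo points doubles
    the odds of being a good account."""
    factor = pdo / math.log(2.0)
    offset = base_score - factor * math.log(base_odds)
    good_bad_odds = (1.0 - p_default) / p_default
    return offset + factor * math.log(good_bad_odds)
```

With these settings a borrower at 50:1 good:bad odds scores 600, and one at 100:1 odds scores 620 — the kind of interpretable scale an in-house score is usually published on.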
A Better Coefficient of Determination for Genetic Profile Analysis
Sang Hong Lee (1, *), Michael E Goddard (2, 3), Naomi R Wray (1, 4), Peter M Visscher (1, 4, 5)

This is the peer-reviewed version of the following article: Lee, S. H., M. E. Goddard, N. R. Wray and P. M. Visscher (2012). "A better coefficient of determination for genetic profile analysis." Genetic Epidemiology 36(3): 214-224, which has been published in final form at 10.1002/gepi.21614. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for self-archiving.

1. Queensland Institute of Medical Research, 300 Herston Rd, Herston, QLD 4006, Australia
2. Biosciences Research Division, Department of Primary Industries, Victoria, Australia
3. Department of Agriculture and Food Systems, University of Melbourne, Melbourne, Australia
4. Queensland Brain Institute, University of Queensland, Brisbane, QLD 4072, Australia
5. Diamantina Institute, University of Queensland, Brisbane, QLD 4072, Australia

Running head: A better coefficient of determination

*Correspondence: Sang Hong Lee, Queensland Institute of Medical Research, 300 Herston Rd, Herston, QLD 4006, Australia. Tel: +61 7 3362 0168; [email protected]

ABSTRACT
Genome-wide association studies have facilitated the construction of risk predictors for disease from multiple SNP markers. The ability of such "genetic profiles" to predict outcome is usually quantified in an independent data set. Coefficients of determination (R²) have been a useful measure to quantify the goodness-of-fit of the genetic profile. Various pseudo-R² measures for binary responses have been proposed. However, there is no standard or consensus measure because the concept of residual variance is not easily defined on the observed probability scale.
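The kind of correction the paper motivates can be illustrated with the liability-threshold transformation of an observed-scale R². The formula below is the widely used Lee et al.-style correction as I recall it (K is population prevalence, P the case fraction in the sample), so treat it as a sketch to verify against the paper before relying on it:

```python
from statistics import NormalDist

def liability_r2(r2_observed, prevalence, case_fraction):
    """Transform an R^2 computed on the observed 0/1 scale to the
    liability scale, correcting for case-control ascertainment
    under the liability threshold model."""
    nd = NormalDist()
    t = nd.inv_cdf(1.0 - prevalence)   # liability threshold
    z = nd.pdf(t)                      # normal density at the threshold
    k, p = prevalence, case_fraction
    c = (k * (1 - k) / z ** 2) * (k * (1 - k) / (p * (1 - p)))
    return r2_observed * c
```

The correction factor depends only on K and P, so profiles evaluated in samples with different case:control ratios become comparable on the liability scale.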
Risk Management & Trading Conference
RiskMathics, aware that the most important factors in developing and consolidating the financial markets are training and the promotion of a high-level financial culture, will host for the third time in Mexico "The Risk Management & Trading Conference", which will have the participation of leading authorities who hold key roles in the global financial industry.

OBJECTIVES
One of the primary objectives of this Conference is to provide, through workshops, presentations and round-table discussions, the latest advances in risk management, trading, technology and market regulation, and to have this knowledge transmitted by local and international authorities in the field. Other objectives are to explain and show in detail the current situation and where the global financial industry is heading, advances in pricing, and how intermediaries and direct or indirect market participants need to be prepared to remain competitive in spite of the new challenges and paradigms.

WHO SHOULD ATTEND?
The Risk Management & Trading Conference is aimed at practitioners directly or indirectly involved in the areas of trading, risk management, regulation, technology, and research & development at stock exchanges, brokers, brokerage houses, banks, institutional investors (pension funds, mutual funds, insurance companies, etc.), hedge funds, and independent investors.

It will be of particular relevance to:
- Chief Executive Officers of financial institutions and intermediaries
- Traders
- Risk Managers
- Consultants
- Regulators
- Technology managers and staff
Conference Overview
Attend the First International Model Risk Management Conference and hear from thought leaders and today's leading practitioners working in the areas of model development, validation, and audit, or using models and their outputs. Sessions include discussions of risk topics and challenges across a broad spectrum of industries.

The Conference will also support the development of professionalism and best practices among Model Risk Management (MRM) practitioners working in the regulated financial industry across sectors and geographies. Through presentations and discussions, the Conference aims to strengthen industry and regulatory MRM practice, grow and develop the current and next generation of MRM practitioners, and facilitate the incorporation of best practices across industries.

Tailored to address critical issues arising from global financial services regulation as well as emerging issues, the Conference will provide a thorough understanding of regulatory requirements and their impact on leading MRM practices. The Conference will achieve these outcomes by providing relevant, meaningful and useful content that can be taken away and put into practice. Audience participation and collaboration are encouraged.

This two-day Conference runs from 8:00 a.m. until late afternoon each day, including lunch as well as morning and afternoon breaks. Opening remarks by Jamey Hubbs, Assistant Superintendent, Deposit-Taking Supervision Sector, OSFI.

Sponsors
NETWORKING RECEPTION SPONSOR
Scotiabank is Canada's international bank and a leading financial services provider in North America, Latin America, the Caribbean and Central America, and parts of Asia. Scotiabank is helping our 21 million customers become better off through a broad range of advice, products and services, including personal and commercial banking, wealth management and private banking, corporate and investment banking.