Quantifying Operational Risk: Possibilities and Limitations
Hansjörg Furrer, ETH Zürich
[email protected]/∼hjfurrer
(Based on joint work with Paul Embrechts and Roger Kaufmann)
Deutsche Bundesbank Training Centre, Eltville, 19-20 March 2004

Managing OpRisk

Contents

A. The New Accord (Basel II)
B. Risk measurement methods for OpRisks
C. Advanced Measurement Approaches (AMA)
D. Conclusions
E. References

A. The New Accord (Basel II)

• 1988: Basel Accord (Basel I): minimum capital requirements against credit risk; one standardised approach.
• 1996: Amendment to Basel I: market risk.
• 1999: First Consultative Paper on the New Accord (Basel II).
• 2003: CP3, Third Consultative Paper on the New Basel Capital Accord (www.bis.org/bcbs/bcbscp3.htm).
• Mid 2004: revision of CP3.
• End of 2006: full implementation of Basel II ([9]).

What's new?

• Rationale for the New Accord: more flexibility and risk sensitivity.
• Structure of the New Accord: a three-pillar framework:
  - Pillar 1: minimal capital requirements (risk measurement)
  - Pillar 2: supervisory review of capital adequacy
  - Pillar 3: public disclosure

What's new? (cont'd)

• Two options for the measurement of credit risk:
  - standardised approach
  - internal ratings-based approach (IRB)
• Pillar 1 sets out the minimum capital requirements:
  total amount of capital ≥ 8% of risk-weighted assets
• MRC (minimum regulatory capital) := 8% of risk-weighted assets
• New regulatory capital approach for operational risk (the risk of losses resulting from inadequate or failed internal processes, people and systems, or from external events)

What's new? (cont'd)

• Notation: C_OP: capital charge for operational risk
• Target: C_OP ≈ 12% of MRC
• Estimated total operational losses in the US (2001): $50b
• Some examples:
  - 1977: Credit Suisse, Chiasso affair
  - 1995: Nick Leeson / Barings Bank, £1.3b
  - 2001: Enron (largest US bankruptcy so far)
  - 2003: Banque Cantonale Vaudoise; KBV Winterthur
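The Pillar 1 figures above combine multiplicatively. A minimal arithmetic sketch, assuming a purely hypothetical risk-weighted-assets figure:

```python
rwa = 1_000.0        # hypothetical risk-weighted assets (e.g. in CHF m)
mrc = 0.08 * rwa     # minimum regulatory capital: 8% of risk-weighted assets
c_op = 0.12 * mrc    # OpRisk capital charge target: roughly 12% of MRC
print(mrc, c_op)
```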
B. Risk measurement methods for OpRisks

Pillar 1 regulatory minimal capital requirements for OpRisk: three distinct approaches:
• Basic Indicator Approach (BIA)
• Standardised Approach (SA)
• Advanced Measurement Approaches (AMA)

Basic Indicator Approach

• Capital charge: C_OP^BIA = α × GI
• C_OP^BIA: capital charge under the Basic Indicator Approach
• GI: average annual gross income over the previous three years
• α = 15% (set by the Committee)

Standardised Approach

• Similar to the BIA, but on the level of each business line:
  C_OP^SA = Σ_{i=1}^{8} β_i × GI_i,   β_i ∈ [12%, 18%], i = 1, 2, ..., 8
• 8 business lines:
  - Corporate finance
  - Trading & sales
  - Retail banking
  - Commercial banking
  - Payment & settlement
  - Agency services
  - Asset management
  - Retail brokerage

C. Advanced Measurement Approaches (AMA)

• Allow banks to use their own methods for assessing their exposure to OpRisk
• Preconditions: a bank must meet qualitative and quantitative standards before using the AMA
• Risk mitigation via insurance is allowed

Quantitative standards

• The Committee does not stipulate which (analytical) approach banks must use
• However, banks must demonstrate that their approach captures severe tail loss events, e.g. the 99.9% VaR
• In essence, MRC = EL + UL (expected plus unexpected loss). If ELs are already captured in a bank's internal business practices, then MRC = UL

Quantitative standards (cont'd)

• Internal data: banks must record internal loss data
• Internally generated OpRisk measures must be based on a five-year observation period
• External data: a bank's OpRisk measurement system must also include external data

Some internal data

[Figure: four panels of yearly operational loss data, 1992-2002 — loss event types 1, 2 and 3, and the pooled operational losses]

Modeling issues

• Stylized facts about OpRisk losses:
  - Loss occurrence
times are irregularly spaced (selection bias, economic cycles, regulation, management interactions, ...)
  - Loss amounts show extremes
• Large losses are of main concern!
• Repetitive vs non-repetitive losses
• Red flag: are the observations in line with the modeling assumptions?
• Example: the "iid" assumption implies NO structural changes in the data as time evolves, and irrelevance of which loss is denoted X_1, which one X_2, ...

A mathematical (actuarial) model

• OpRisk loss database:
  { X_ℓ^(t,k) : t ∈ {T − n + 1, ..., T − 1, T} (years), k ∈ {1, 2, ..., 7} (loss event type), ℓ ∈ {1, 2, ..., N^(t,k)} (number of losses) }
• Loss event types:
  - Internal fraud
  - External fraud
  - Employment practices and workplace safety
  - Clients, products & business practices
  - Damage to physical assets
  - Business disruption and system failures
  - Execution, delivery & process management

A mathematical (actuarial) model (cont'd)

• Truncation:
  X_ℓ^(t,k) = X_ℓ^(t,k) 1{X_ℓ^(t,k) > d}
  and (random) censoring
• A further index j, j ∈ {1, ..., 8}, indicating the business line can be introduced (suppressed for this talk)
• Estimate a risk measure for F_{L^(T+1,k)}(x) = P[L^(T+1,k) ≤ x], such as
  Op-VaR_α^(T+1,k) = F←_{L^(T+1,k)}(α)
  Op-ES_α^(T+1,k) = E[L^(T+1,k) | L^(T+1,k) > Op-VaR_α^(T+1,k)]
  where L^(T+1,k) = Σ_{ℓ=1}^{N^(T+1,k)} X_ℓ^(T+1,k) is the OpRisk loss for loss event type k over the period [T, T+1].

A mathematical (actuarial) model (cont'd)

• Discussion: recall the stylized facts:
  - the X's are heavy-tailed
  - N shows non-stationarity
• Conclusions:
  - F_X(x) and F_{L^(t,k)}(x) are difficult to estimate
  - in-sample estimation of VaR_α (α large) is almost impossible!
  - actuarial tools may be useful:
    * approximation (translated gamma/lognormal)
    * inversion methods (FFT)
    * recursive methods (Panjer)
    * simulation
    * Extreme Value Theory (EVT)
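The aggregate loss L = Σ_{ℓ=1}^{N} X_ℓ and the risk measures Op-VaR and Op-ES introduced above can be approximated by plain Monte Carlo simulation. A minimal sketch, assuming a hypothetical Poisson(25) annual loss frequency and Pareto(2.5) severities (illustrative parameters, not estimates from real loss data):

```python
import math
import random

def poisson(lam, rng):
    """Poisson(lam) sample via Knuth's product-of-uniforms algorithm."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def annual_loss(lam, theta, rng):
    """One draw of L = sum of N Pareto(theta) severities, N ~ Poisson(lam)."""
    n = poisson(lam, rng)
    return sum(rng.paretovariate(theta) for _ in range(n))

def op_var_es(losses, alpha):
    """Empirical Op-VaR_alpha and Op-ES_alpha from simulated annual losses."""
    s = sorted(losses)
    var = s[min(int(alpha * len(s)), len(s) - 1)]
    tail = [x for x in s if x > var]
    es = sum(tail) / len(tail) if tail else var
    return var, es

rng = random.Random(2004)
losses = [annual_loss(lam=25.0, theta=2.5, rng=rng) for _ in range(20_000)]
var99, es99 = op_var_es(losses, 0.99)
print(var99, es99)   # ES dominates VaR, as it must
```

Note that with heavier tails (θ close to 1) the empirical 99%+ quantile becomes very unstable, which is precisely the accuracy question studied next.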
How accurate are VaR-estimates?

• Make inference about the tail decay of the aggregate loss L^(t,k) via the tail decay of the individual losses X^(t,k):
  L = Σ_{ℓ=1}^{N} X_ℓ,   1 − F_X(x) ∼ x^(−α) h(x), x → ∞
  ⟹ 1 − F_L(x) ∼ E[N] x^(−α) h(x), x → ∞.
• Assumptions: (X_m) iid ∼ F and, for some ξ, β and u large,
  F_u(x) := P[X − u ≤ x | X > u] = G_{ξ,β(u)}(x),
  where
  G_{ξ,β}(x) = 1 − (1 + ξx/β)^(−1/ξ),  ξ ≠ 0,
  G_{ξ,β}(x) = 1 − e^(−x/β),           ξ = 0.

How accurate are VaR-estimates? (cont'd)

• Tail and quantile estimates:
  1 − F̂_X(x) = (N_u/n) (1 + ξ̂ (x − u)/β̂)^(−1/ξ̂),  x > u,
  VaR̂_α = q̂_α = u − (β̂/ξ̂) (1 − (N_u/(n(1 − α)))^ξ̂).   (1)
• Idea: compare estimated quantiles with the corresponding theoretical ones by means of a simulation study ([8]).

How accurate are VaR-estimates? (cont'd)

• Simulation procedure:
  - Choose F and fix α_0 < α < 1, N_u ∈ {25, 50, 100, 200} (N_u: number of data points above u)
  - Calculate u = q_{α_0} and the true value of the quantile q_α
  - Sample N_u independent points of F above u by the rejection method; record the total number n of sampled points this requires
  - Estimate ξ, β by fitting the GPD to the N_u exceedances over u by means of MLE
  - Determine q̂_α according to (1)
  - Repeat the above N times to arrive at estimates of Bias(q̂_α) and SE(q̂_α)

How accurate are VaR-estimates? (cont'd)

• Accuracy of the quantile estimate expressed in terms of bias and standard error:
  Bias(q̂_α) = E[q̂_α − q_α],   SE(q̂_α) = (E[(q̂_α − q_α)^2])^(1/2),
  with empirical estimates
  Bias(q̂_α) ≈ (1/N) Σ_{j=1}^{N} (q̂_α^j − q_α),   SE(q̂_α) ≈ ((1/N) Σ_{j=1}^{N} (q̂_α^j − q_α)^2)^(1/2)
• For comparison purposes (different distributions) introduce
  Percentage Bias := Bias(q̂_α)/q_α,   Percentage SE := SE(q̂_α)/q_α

How accurate are VaR-estimates? (cont'd)

• Criterion for a "good estimate": Percentage Bias and Percentage SE should be small, e.g.
                    α = 0.99   α = 0.999
  Percentage Bias   ≤ 0.05     ≤ 0.10
  Percentage SE     ≤ 0.30     ≤ 0.60

Example: Pareto distribution, 1 − F_X(x) = x^(−θ), θ = 2

  u = F←(q)   α       Goodness of VaR̂_α
  q = 0.7     0.99    A minimum of 100 exceedances (corresponding to 333 observations) is required to ensure accuracy w.r.t. bias and standard error.
              0.999   A minimum of 200 exceedances (corresponding to 667 observations) is required to ensure accuracy w.r.t. bias and standard error.
  q = 0.9     0.99    Full accuracy can already be achieved with the minimum number of 25 exceedances (corresponding to 250 observations).
              0.999   A minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy w.r.t. bias and standard error.

Example: Pareto distribution, 1 − F_X(x) = x^(−θ), θ = 1

  u = F←(q)   α       Goodness of VaR̂_α
  q = 0.7     0.99    For all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations), the VaR estimates fail to meet the accuracy criteria.
              0.999   For all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations), the VaR estimates fail to meet the accuracy criteria.
  q = 0.9     0.99    A minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy w.r.t. bias and standard error.
              0.999   A minimum of 200 exceedances (corresponding to 2000 observations) is required to ensure accuracy w.r.t. bias and standard error.

How accurate are VaR-estimates? (cont'd)

• A large number of observations is necessary to achieve the targeted accuracy.
• The minimum number of observations increases as the tails become thicker ([8]).
• Remember: the simulation study was done under idealistic assumptions. OpRisk losses, however, typically do NOT fulfil these assumptions.
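Two quick checks on this section, sketched under the idealised Pareto assumption (sample figures taken from the tables above): (i) for an exact Pareto tail 1 − F(x) = x^(−θ), the excesses over u are exactly GPD with ξ = 1/θ and β(u) = u/θ, and plugging these true parameters together with the true exceedance fraction N_u/n = 1 − α_0 into estimator (1) recovers the true quantile q_α = (1 − α)^(−1/θ); (ii) the observation counts in the tables are the exceedance counts grossed up by the tail weight above u = F←(q), i.e. n ≈ N_u/(1 − q).

```python
def gpd_quantile(u, xi, beta, nu, n, alpha):
    """Quantile estimator (1): u - (beta/xi)*(1 - (nu/(n*(1-alpha)))**xi)."""
    return u - (beta / xi) * (1.0 - (nu / (n * (1.0 - alpha))) ** xi)

# (i) Pareto(theta = 2): true GPD parameters reproduce the true quantile.
theta, alpha0, alpha = 2.0, 0.7, 0.99
u = (1.0 - alpha0) ** (-1.0 / theta)               # u = q_{alpha0}
q_hat = gpd_quantile(u, xi=1.0 / theta, beta=u / theta,
                     nu=300, n=1000, alpha=alpha)  # nu/n = 1 - alpha0
q_true = (1.0 - alpha) ** (-1.0 / theta)           # = 10.0
assert abs(q_hat - q_true) < 1e-9

# (ii) Sample size needed so that ~n_exceed points fall above u = F_inv(q).
def implied_sample_size(n_exceed, q):
    return round(n_exceed / (1.0 - q))

sizes = [implied_sample_size(k, q)
         for k, q in [(100, 0.7), (200, 0.7), (25, 0.9), (100, 0.9), (200, 0.9)]]
print(sizes)   # [333, 667, 250, 1000, 2000] -- matches the tables above
```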