Quantifying Operational Risk: Possibilities and Limitations

Hansjörg Furrer, ETH Zürich, [email protected]/∼hjfurrer

(Based on joint work with Paul Embrechts and Roger Kaufmann)

Deutsche Bundesbank Training Centre, Eltville 19-20 March 2004

Contents

A. The New Accord (Basel II)

B. Risk measurement methods for OpRisks

C. Advanced Measurement Approaches (AMA)

D. Conclusions

E. References

A. The New Accord (Basel II)

• 1988: Basel Accord (Basel I): minimum capital requirements against credit risk. One standardised approach

• 1996: Amendment to Basel I: market risk.

• 1999: First Consultative Paper on the New Accord (Basel II).

• 2003: CP3: Third Consultative Paper on the New Basel Capital Accord. (www.bis.org/bcbs/bcbscp3.htm)

• mid 2004: Revision of CP3

• end of 2006: full implementation of Basel II ([9])

What’s new?

• Rationale for the New Accord: More flexibility and risk sensitivity

• Structure of the New Accord: Three-pillar framework:

– Pillar 1: minimum capital requirements (risk measurement)

– Pillar 2: supervisory review of capital adequacy

– Pillar 3: public disclosure

What’s new? (cont’d)

• Two options for the measurement of credit risk:

– Standardised approach
– Internal ratings-based approach (IRB)

• Pillar 1 sets out the minimum capital requirements: total amount of capital ≥ 8% of risk-weighted assets

• MRC (minimum regulatory capital) := 8% of risk-weighted assets

• New regulatory capital approach for operational risk (the risk of losses resulting from inadequate or failed internal processes, people and systems, or external events)

What’s new? (cont’d)

• Notation: C_OP: capital charge for operational risk

• Target: C_OP ≈ 12% of MRC

• Estimated total losses in the US (2001): $50b

• Some examples

– 1977: Credit Suisse, Chiasso affair
– 1995: Nick Leeson/Barings Bank, £1.3b
– 2001: Enron (largest US bankruptcy so far)
– 2003: Banque Cantonale Vaudoise; KBV Winterthur

B. Risk measurement methods for OpRisks

Pillar 1 regulatory minimum capital requirements for OpRisk: three distinct approaches:

– Basic Indicator Approach

– Standardised Approach

– Advanced Measurement Approaches (AMA)

Basic Indicator Approach

• Capital charge: C_OP^BIA = α × GI

• C_OP^BIA: capital charge under the Basic Indicator Approach

• GI: average annual gross income over the previous three years

• α = 15% (set by the Committee)
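• Worked example (the gross income figure is assumed for illustration only): with GI = EUR 200m averaged over the previous three years, C_OP^BIA = 0.15 × 200m = EUR 30m.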

Standardised Approach

• Similar to the BIA, but on the level of each business line:

C_OP^SA = Σ_{i=1}^{8} β_i × GI_i,

β_i ∈ [12%, 18%], i = 1, 2, ..., 8.

• 8 business lines:
– Corporate finance
– Trading & sales
– Retail banking
– Commercial banking
– Payment & settlement
– Agency services
– Asset management
– Retail brokerage
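A minimal sketch of the SA calculation in Python. The gross income figures are invented for illustration; the β-factors shown are, to our understanding, the values proposed by the Committee (all within the stated [12%, 18%] range):

    # Standardised Approach: C_OP^SA = sum_i beta_i * GI_i
    # beta factors as proposed by the Committee; GI figures (EUR m) are assumed
    betas = {
        "corporate finance":    0.18,
        "trading & sales":      0.18,
        "retail banking":       0.12,
        "commercial banking":   0.15,
        "payment & settlement": 0.18,
        "agency services":      0.15,
        "asset management":     0.12,
        "retail brokerage":     0.12,
    }
    gi = {line: 25.0 for line in betas}  # hypothetical gross income per line

    c_op_sa = sum(betas[line] * gi[line] for line in betas)
    print(f"C_OP^SA = EUR {c_op_sa:.1f}m")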

C. Advanced Measurement Approaches (AMA)

• Allows banks to use their own methods for assessing their exposure to OpRisk

• Preconditions: banks must meet qualitative and quantitative standards before using the AMA

• Risk mitigation via insurance allowed

Quantitative standards

• The Committee does not stipulate which (analytical) approach banks must use

• However, banks must demonstrate that their approach captures severe tail loss events, e.g. 99.9% VaR

• In essence,

MRC = EL + UL

(EL: expected losses, UL: unexpected losses). If ELs are already captured in a bank’s internal business practices, then

MRC = UL

Quantitative standards (cont’d)

• Internal data: Banks must record internal loss data

• Internally generated OpRisk measures must be based on a five-year observation period

• External data: A bank’s OpRisk measurement system must also include external data

Some internal data

[Figure: yearly internal loss data, 1992–2002; panels for loss types 1–3 and the pooled operational losses]

Modeling issues

• Stylized facts about OpRisk losses:
– Loss occurrence times are irregularly spaced in time (selection bias, economic cycles, regulation, management interactions, ...)
– Loss amounts show extremes

• Large losses are of main concern!

• Repetitive vs non-repetitive losses

• Red flag: Are observations in line with modeling assumptions?

• Example: the “iid” assumption implies

– NO structural changes in the data as time evolves
– irrelevance of which loss is denoted X_1, which one X_2, ...

A mathematical (actuarial) model

• OpRisk loss database

{ X_ℓ^{(t,k)} :  t ∈ {T−n+1, ..., T−1, T} (years),  k ∈ {1, 2, ..., 7} (loss event type),  ℓ ∈ {1, 2, ..., N^{(t,k)}} (number of losses) }

• Loss event types:
– Internal fraud
– External fraud
– Employment practices and workplace safety
– Clients, products & business practices
– Damage to physical assets
– Business disruption and system failures
– Execution, delivery & process management

A mathematical (actuarial) model (cont’d)

• Truncation: X_ℓ^{(t,k)} = X_ℓ^{(t,k)} 1{X_ℓ^{(t,k)} > d^{(t,k)}}, and (random) censoring

• A further index j ∈ {1, ..., 8}, indicating the business line, can be introduced (suppressed for this talk)

• Estimate a risk measure for F_{L^{(T+1,k)}}(x) = P[L^{(T+1,k)} ≤ x], like

Op-VaR_α^{(T+1,k)} = F←_{L^{(T+1,k)}}(α)

Op-ES_α^{(T+1,k)} = E[ L^{(T+1,k)} | L^{(T+1,k)} > Op-VaR_α^{(T+1,k)} ],  where

• L^{(T+1,k)} = Σ_{ℓ=1}^{N^{(T+1,k)}} X_ℓ^{(T+1,k)} : OpRisk loss for loss event type k over the period [T, T+1].
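A Monte Carlo sketch of Op-VaR and Op-ES for one event type under an assumed compound Poisson model (Poisson frequency, Pareto severities; all parameters below are illustrative, not from the talk):

    import numpy as np

    rng = np.random.default_rng(2004)
    lam, theta = 25, 2.0            # assumed Poisson intensity, Pareto tail index
    n_sims, alpha = 100_000, 0.999

    # simulate the annual aggregate loss L = X_1 + ... + X_N for each scenario;
    # (1 - U)**(-1/theta) samples a Pareto loss with survival x^(-theta), x >= 1
    counts = rng.poisson(lam, size=n_sims)
    L = np.array([((1 - rng.uniform(size=n))**(-1/theta)).sum() for n in counts])

    op_var = np.quantile(L, alpha)      # empirical Op-VaR_alpha
    op_es = L[L > op_var].mean()        # empirical Op-ES_alpha

The instability of such high empirical quantiles for heavy-tailed severities is exactly the issue taken up on the following slides.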

A mathematical (actuarial) model (cont’d)

• Discussion: recall the stylized facts
– the X’s are heavy-tailed
– N shows non-stationarity

• Conclusions:

– F_X(x) and F_{L^{(t,k)}}(x) difficult to estimate
– In-sample estimation of VaR_α (α large) almost impossible!
– Actuarial tools may be useful:
∗ Approximation (translated gamma/lognormal)
∗ Inversion methods (FFT)
∗ Recursive methods (Panjer)
∗ Simulation
∗ Extreme Value Theory (EVT)
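To make one of the recursive methods concrete: a minimal Panjer recursion for a compound Poisson loss with a severity distribution discretized on a lattice (the intensity and severity pmf below are assumptions for illustration):

    import numpy as np

    def panjer_poisson(lam, f, n_max):
        # f[k]: severity pmf on the lattice {0, 1, 2, ...} with f[0] = 0
        # returns g[0..n_max], the pmf of S = X_1 + ... + X_N, N ~ Poisson(lam)
        g = np.zeros(n_max + 1)
        g[0] = np.exp(-lam)                       # P[S = 0] = P[N = 0]
        for n in range(1, n_max + 1):
            j = np.arange(1, min(n, len(f) - 1) + 1)
            g[n] = lam / n * np.sum(j * f[j] * g[n - j])
        return g

    f = np.array([0.0, 0.5, 0.3, 0.2])            # assumed severity pmf on {1,2,3}
    g = panjer_poisson(5.0, f, n_max=80)
    q999 = np.searchsorted(np.cumsum(g), 0.999)   # smallest n: P[S <= n] >= 0.999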

How accurate are VaR-estimates?

• Make inference about the tail decay of the aggregate loss L^{(t,k)} via the tail decay of the individual losses X^{(t,k)}:

L = Σ_{ℓ=1}^{N} X_ℓ ,   1 − F_X(x) ∼ x^{−α} h(x),  x → ∞

⟹  1 − F_L(x) ∼ E[N] x^{−α} h(x),  x → ∞.

• Assumptions: (X_m) iid ∼ F and, for some ξ, β(u) and u large,

F_u(x) := P[X − u ≤ x | X > u] = G_{ξ,β(u)}(x),

where

G_{ξ,β}(x) = 1 − (1 + ξx/β)^{−1/ξ}  if ξ ≠ 0,   G_{ξ,β}(x) = 1 − e^{−x/β}  if ξ = 0.

How accurate are VaR-estimates? (cont’d)

• Tail- and quantile estimate:

1 − F̂_X(x) = (N_u/n) (1 + ξ̂ (x − u)/β̂)^{−1/ξ̂},  x > u,

VaR̂_α = q̂_α = u − (β̂/ξ̂) (1 − (N_u/(n(1−α)))^{ξ̂})     (1)
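A sketch of estimator (1) in Python, fitting the GPD to the exceedances by MLE via scipy (the threshold u must be supplied by the user; assumes ξ̂ ≠ 0 and enough exceedances for the fit):

    import numpy as np
    from scipy.stats import genpareto

    def pot_quantile(losses, u, alpha):
        # tail/quantile estimate (1): GPD fitted to the N_u exceedances over u
        losses = np.asarray(losses)
        exc = losses[losses > u] - u
        n, n_u = len(losses), len(exc)
        xi, _, beta = genpareto.fit(exc, floc=0)  # MLE, location fixed at 0
        return u - beta / xi * (1 - (n_u / (n * (1 - alpha)))**xi)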

• Idea: Comparison of estimated quantiles with the corresponding theoretical ones by means of a simulation study ([8]).

How accurate are VaR-estimates? (cont’d)

• Simulation procedure:

1. Choose F and fix α_0 < α < 1, N_u ∈ {25, 50, 100, 200} (N_u: number of data points above u)

2. Calculate u = q_{α_0} and the true value of the quantile q_α

3. Sample N_u independent points of F above u by the rejection method; record the total number n of sampled points this requires

4. Estimate ξ, β by fitting the GPD to the N_u exceedances over u by means of MLE

5. Determine q̂_α according to (1)

6. Repeat the above N times to arrive at estimates of Bias(q̂_α) and SE(q̂_α); a code sketch follows the bias and standard error definitions below

How accurate are VaR-estimates? (cont’d)

• Accuracy of the quantile estimate expressed in terms of bias and standard error:

Bias(q̂_α) = E[q̂_α − q_α],   SE(q̂_α) = ( E[(q̂_α − q_α)^2] )^{1/2}

Bias-hat(q̂_α) = (1/N) Σ_{j=1}^{N} (q̂_α^j − q_α),   SE-hat(q̂_α) = ( (1/N) Σ_{j=1}^{N} (q̂_α^j − q_α)^2 )^{1/2}

• For comparison purposes (different distributions) introduce

Percentage Bias := Bias-hat(q̂_α) / q_α,   Percentage SE := SE-hat(q̂_α) / q_α
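The simulation procedure and these accuracy measures, sketched in Python for the Pareto examples of the next slides (run under the stated idealized iid assumptions; N = n_rep kept small here to stay cheap):

    import numpy as np
    from scipy.stats import genpareto

    def percentage_bias_se(theta=2.0, q=0.9, alpha=0.99, n_u=100, n_rep=200, seed=0):
        rng = np.random.default_rng(seed)
        u = (1 - q)**(-1/theta)            # u = F^{<-}(q) for 1 - F(x) = x^(-theta)
        q_true = (1 - alpha)**(-1/theta)   # true alpha-quantile of F
        q_hat = np.empty(n_rep)
        for j in range(n_rep):
            exc, n = [], 0
            while len(exc) < n_u:          # rejection sampling: keep points above u
                x = (1 - rng.uniform())**(-1/theta)
                n += 1
                if x > u:
                    exc.append(x - u)
            xi, _, beta = genpareto.fit(np.array(exc), floc=0)   # GPD by MLE
            q_hat[j] = u - beta / xi * (1 - (n_u / (n * (1 - alpha)))**xi)
        bias = (q_hat - q_true).mean()
        se = np.sqrt(((q_hat - q_true)**2).mean())
        return bias / q_true, se / q_true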

How accurate are VaR-estimates? (cont’d)

• Criterion for a “good estimate”: Percentage Bias and Percentage SE should be small, e.g.

                    α = 0.99    α = 0.999

Percentage Bias     ≤ 0.05      ≤ 0.10

Percentage SE       ≤ 0.30      ≤ 0.60

Example: Pareto distribution, 1 − F_X(x) = x^{−θ}, θ = 2

u = F←(q)   α       Goodness of the VaR estimate q̂_α

q = 0.7     0.99    A minimum of 100 exceedances (corresponding to 333 observations) is required to ensure accuracy wrt bias and standard error.
q = 0.7     0.999   A minimum of 200 exceedances (corresponding to 667 observations) is required to ensure accuracy wrt bias and standard error.
q = 0.9     0.99    Full accuracy can be achieved with the minimum number of 25 exceedances (corresponding to 250 observations).
q = 0.9     0.999   A minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy wrt bias and standard error.

Example: Pareto distribution, 1 − F_X(x) = x^{−θ}, θ = 1

u = F←(q)   α       Goodness of the VaR estimate q̂_α

q = 0.7     0.99    For all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations) the VaR estimates fail to meet the accuracy criteria.
q = 0.7     0.999   For all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations) the VaR estimates fail to meet the accuracy criteria.
q = 0.9     0.99    A minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy wrt bias and standard error.
q = 0.9     0.999   A minimum of 200 exceedances (corresponding to 2000 observations) is required to ensure accuracy wrt bias and standard error.

How accurate are VaR-estimates? (cont’d)

• Large number of observations necessary to achieve targeted accuracy.

• Minimum number of observations increases as the tails become thicker ([8]).

• Remember: The simulation study was done under idealistic assumptions. OpRisk losses, however, typically do NOT fulfil these assumptions.

Pricing risk under incomplete information

• Recall: L^{(T+1,k)} is the OpRisk loss for loss event type k over the period [T, T+1],

L^{(T+1,k)} = Σ_{ℓ=1}^{N^{(T+1,k)}} X_ℓ^{(T+1,k)}

• Question: Suppose we have calculated risk measures ρ_α^{(T+1,k)}, k ∈ {1, ..., 7}, for each loss event type. When can we consider

Σ_{k=1}^{7} ρ_α^{(T+1,k)}

a “good” risk measure for the total loss L^{T+1} = Σ_{k=1}^{7} L^{(T+1,k)} ?

Pricing risk under incomplete information (cont’d)

• Answer: Ingredients

– (non-)coherence of risk measures (Artzner, Delbaen, Eber, Heath framework)

– optimization problem: given ρ_α^{(T+1,k)}, k ∈ {1, ..., 7}, what is the worst case for the overall risk for L^{T+1}? Solution: using copulas; see [4] and references therein

– aggregation of banking risks [1]
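VaR is not coherent, and for very heavy tails summing the per-type VaRs need not be conservative. A minimal simulation sketch (assumed iid infinite-mean Pareto losses with θ = 0.8, purely illustrative) in which the VaR of the sum exceeds the sum of the VaRs:

    import numpy as np

    rng = np.random.default_rng(1)
    theta, alpha, n = 0.8, 0.99, 1_000_000

    # two independent loss types with infinite-mean Pareto tails
    X = (1 - rng.uniform(size=n))**(-1/theta)
    Y = (1 - rng.uniform(size=n))**(-1/theta)

    var_x, var_y = np.quantile(X, alpha), np.quantile(Y, alpha)
    var_xy = np.quantile(X + Y, alpha)
    print(var_x + var_y, var_xy)    # typically var_xy > var_x + var_y here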

A ruin-theoretic problem motivated by OpRisk

• OpRisk process:

V_t^{(k)} = u^{(k)} + p^{(k)}(t) − L_t^{(k)},   t ≥ 0,

for some initial capital u^{(k)} and a premium function p^{(k)}(t) satisfying P[ L_t^{(k)} − p^{(k)}(t) → −∞ ] = 1.

• Given ε > 0, calculate u^{(k)}(ε) such that

P[ inf_{T ≤ t ≤ T+1} ( u^{(k)}(ε) + p^{(k)}(t) − L_t^{(k)} ) < 0 ] ≤ ε     (2)

u^{(k)}(ε) is a risk capital charge (internal)
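A Monte Carlo sketch of the capital charge defined by (2), under an assumed compound Poisson loss process with Pareto severities and a linear premium p(t) = ct (all parameters illustrative). Since ruin over [T, T+1] can only occur at loss epochs, u(ε) is the (1−ε)-quantile of the pathwise maximum of L_t − ct:

    import numpy as np

    rng = np.random.default_rng(7)
    lam, theta, c, eps = 20, 2.0, 50.0, 0.01   # assumed; note lam*E[X] = 40 < c
    n_sims = 100_000

    required = np.empty(n_sims)
    for i in range(n_sims):
        n = rng.poisson(lam)
        t = np.sort(rng.uniform(size=n))                       # loss epochs in [0, 1]
        s = np.cumsum((1 - rng.uniform(size=n))**(-1/theta))   # cumulative losses
        required[i] = (s - c*t).max() if n > 0 else 0.0        # worst shortfall

    u_eps = np.quantile(required, 1 - eps)   # smallest u with ruin prob <= eps
    print(f"u(eps) = {u_eps:.1f}")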

Ruin-theoretic problem (cont’d)

• Solving (2) is difficult:
– complicated loss process L_t^{(k)}
– heavy-tailed case
– finite time horizon [T, T+1]

• Recall, for the classical Cramér-Lundberg model,

Y(t) = u + ct − Σ_{k=1}^{N(t)} X_k = u + ct − S_N(t),

Ψ(u) = P[ sup_{t≥0} ( S_N(t) − ct ) > u ]

Ruin-theoretic problem (cont’d)

• In the heavy-tailed case, P[X_1 > x] ∼ x^{−(β+1)} L(x) with L slowly varying implies that

Ψ(u) ∼ κ u^{−β} L(u),   u → ∞.

• Important: net-profit condition

P[ lim_{t→∞} ( S_N(t) − ct ) = −∞ ] = 1

• Now assume, for some general loss process (S(t)),

P[ lim_{t→∞} ( S(t) − ct ) = −∞ ] = 1,

Ψ_1(u) = P[ sup_{t≥0} ( S(t) − ct ) > u ] ∼ u^{−β} L(u),   u → ∞.     (3)

Ruin-theoretic problem (cont’d)

• Question: How much can we change S keeping (3)?

• Solution: use the time change S_Δ(t) = S(Δ(t)),

Ψ_Δ(u) = P[ sup_{t≥0} ( S_Δ(t) − ct ) > u ]

Under some technical conditions on Δ and S, general models are given so that

lim_{u→∞} Ψ_Δ(u) / Ψ_1(u) = 1,

i.e. ultimate ruin behaves similarly under the time change

Ruin-theoretic problem (cont’d)

• Example:

– start from the homogeneous Poisson case (classical Cramér-Lundberg, heavy-tailed case)

– use Δ to transform to changes in intensities motivated by operational risk; see [6].

D. Conclusions

• OpRisk ≠ market risk, credit risk

• “Multiplicative structure” of OpRisk losses ([7]): S × T × M (Selection-Training-Monitoring)

• Standard actuarial methods (including EVT) aiming to derive capital charges are of limited use due to

– lack of data
– inconsistency of the data with the modeling assumptions

• OpRisk loss databases must grow

• interesting source of mathematical problems

Conclusions (cont’d)

• challenges: choice of risk measures, aggregation of risk measures

• OpRisk charges cannot be based on statistical modeling alone

⇒ Pillar 2 (overall OpRisk management, such as analysis of causes, prevention, ...) more important than Pillar 1

E. References

[1] Alexander, C., and Pezier, P. (2003). Assessment and aggregation of banking risks. 9th IFCI Annual Risk Management Round Table, International Financial Risk Institute (IFCI).

[2] Basel Committee on Banking Supervision (2003). The New Basel Capital Accord. April 2003. BIS, Basel. www.bis.org/bcbs

[3] Embrechts, P., Furrer, H.J., and Kaufmann, R. (2003). Quantifying Regulatory Capital for Operational Risk. Derivatives Use, Trading and Regulation, Vol. 9, No. 3, 217-233. Also available on www.bis.org/bcbs/cp3comments.htm

[4] Embrechts, P., Hoeing, A., and Juri, A. (2003). Using copulae to bound the Value-at-Risk for functions of dependent risks. Finance and Stochastics, Vol. 7, No. 2, 145-167.

[5] Embrechts, P., Kaufmann, R., and Samorodnitsky, G. (2002). Ruin theory revisited: stochastic models for operational risk. Submitted.

[6] Embrechts, P., and Samorodnitsky, G. (2003). Ruin problem and how fast stochastic processes mix. Annals of Applied Probability, Vol. 13, 1-36.

[7] Geiger, H. (2000). Regulating and Supervising Operational Risk for Banks. Working paper, University of Zurich.

[8] McNeil, A. J., and Saladin, T. (1997). The peaks over thresholds method for estimating high quantiles of loss distributions. Proceedings of the XXVIIth International ASTIN Colloquium, Cairns, Australia, 23-43.

[9] The Economist (2003). Blockage in Basel. Vol. 369, No 8344, October 2003.
