Robust covariance estimation for financial applications

Tim Verdonck, Mia Hubert, Peter Rousseeuw

Department of Mathematics K.U.Leuven

August 30 2011

Contents

1 Introduction: Robust Statistics

2 Multivariate Location and Scatter Estimates

3 Minimum Covariance Determinant Estimator (MCD): FAST-MCD algorithm, DetMCD algorithm

4 Principal Component Analysis

5 Multivariate Time Series

6 Conclusions

7 Selected references

Introduction: Robust Statistics

Real data often contain outliers. Most classical methods are highly influenced by them. What is robust statistics? Robust statistical methods try to fit the model imposed by the majority of the data. They aim to find a 'robust' fit, similar to the fit we would have found without the outliers. This also allows outlier detection: outliers are the observations that deviate strongly from the robust fit.

A robust estimate applied to all observations is comparable to the classical estimate applied to the outlier-free data set. Robust estimator: a good robust estimator combines high robustness with high efficiency.

◮ Robustness: being less influenced by outliers. ◮ Efficiency: being precise on uncontaminated data.

Univariate Scale Estimation: Wages data set

6000 households with a male head earning less than USD 15,000 annually in 1966. Classified into 39 demographic groups (we concentrate on the variable AGE).

◮ Standard Deviation (SD): $\sqrt{\frac{1}{n-1} \sum_{i=1}^n (x_i - \bar{x})^2} = 4.91$
◮ Interquartile Range (IQR): $0.74 \, (x_{(\lfloor 0.75n \rfloor)} - x_{(\lfloor 0.25n \rfloor)}) = 0.91$
◮ Median Absolute Deviation (MAD): $1.48 \, \mathrm{med}_i |x_i - \mathrm{med}_j x_j| = 0.96$
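The three estimators above are easy to compare numerically; below is a small numpy sketch (the toy `clean`/`dirty` samples are made up for illustration, not the Wages data):

```python
import numpy as np

def robust_scales(x):
    """SD, normalised IQR and normalised MAD of a 1-D sample."""
    x = np.asarray(x, dtype=float)
    sd = x.std(ddof=1)
    q75, q25 = np.percentile(x, [75, 25])
    iqr = 0.74 * (q75 - q25)                          # consistency factor for normal data
    mad = 1.48 * np.median(np.abs(x - np.median(x)))  # idem
    return sd, iqr, mad

clean = np.arange(1.0, 21.0)           # 20 regular observations
dirty = np.append(clean, 1000.0)       # add one gross outlier

sd_c, iqr_c, mad_c = robust_scales(clean)
sd_d, iqr_d, mad_d = robust_scales(dirty)
# the SD explodes under contamination while the MAD barely moves
```

A single outlier is enough to make the SD arbitrarily large, while the IQR and MAD stay close to their values on the clean sample.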

Measures of robustness

Breakdown Point: the breakdown point of a scale estimator S is the smallest fraction of observations that must be contaminated to make S ↑ ∞ or S ↓ 0.

Scale estimator   Breakdown point
SD                1/n ≈ 0
IQR               25%
MAD               50%

Note that a breakdown value of ε does not imply that contamination fractions below ε leave the estimator unaffected.

Measures of robustness

A specific type of contamination is point contamination

$F_{\varepsilon,y} = (1 - \varepsilon) F + \varepsilon \Delta_y$

with $\Delta_y$ the Dirac measure at y.

Influence Function (Hampel, 1986): the influence function measures how T(F) changes when contamination is added at y:

$IF(y; T, F) = \lim_{\varepsilon \to 0} \frac{T(F_{\varepsilon,y}) - T(F)}{\varepsilon}$

where $T(\cdot)$ is the functional version of the estimator.

◮ The IF is a local measure of robustness, whereas the breakdown point is a global measure. ◮ We prefer estimators that have a bounded IF.

Influence Function (Hampel, 1986)

Multivariate Location and Scatter

Scatterplot of bivariate data (ρ = 0.990)

◮ $\hat{\rho} = 0.779$
◮ $\hat{\rho}_{MCD} = 0.987$.

Boxplot of the marginals

In the multivariate setting, outliers cannot be detected simply by applying outlier detection rules to each variable separately.

Only by correctly estimating the covariance structure can we detect such outliers.

Classical Estimator

Data: $X_n = \{x_1, \dots, x_n\}$ with $x_i \in \mathbb{R}^p$. Model: $x_i \sim N_p(\mu, \Sigma)$. More generally, we can assume that the data are generated from an elliptical distribution, i.e. a distribution whose density contours are ellipses.

The classical estimators for µ and Σ are the empirical mean and covariance matrix

$\bar{x} = \frac{1}{n} \sum_{i=1}^n x_i \qquad S_n = \frac{1}{n-1} \sum_{i=1}^n (x_i - \bar{x})(x_i - \bar{x})'$

Both are highly sensitive to outliers:
◮ zero breakdown value
◮ unbounded IF.

Tolerance Ellipsoid

The boundary contains the x-values with constant Mahalanobis distance to the mean:

$MD_i = \sqrt{(x_i - \bar{x})' S_n^{-1} (x_i - \bar{x})}$

Classical Tolerance Ellipsoid

$\{x \mid MD(x) \le \sqrt{\chi^2_{p,0.975}}\}$

with $\chi^2_{p,0.975}$ the 97.5% quantile of the $\chi^2$ distribution with p degrees of freedom. We expect (in large samples) that 97.5% of the observations belong to this ellipsoid. We can flag observation $x_i$ as an outlier if it does not belong to the tolerance ellipsoid.
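A minimal sketch of this flagging rule with classical estimates and the chi-square cutoff (the generated data set is hypothetical):

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_flags(X, alpha=0.975):
    """Flag points whose squared Mahalanobis distance to the sample
    mean exceeds the chi-square(p) quantile."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    diff = X - mu
    # squared Mahalanobis distances for all rows at once
    md2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)
    return md2 > chi2.ppf(alpha, df=p)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
X[0] = [10.0, 10.0, 10.0]              # one clear outlier
flags = mahalanobis_flags(X)
```

Because the mean and covariance themselves are non-robust, this classical rule can miss clustered outliers (masking); that is exactly the motivation for the MCD-based robust distances later in the deck.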

Tolerance Ellipsoid

Tolerance Ellipsoid for example

Robust Estimator

Minimum Covariance Determinant Estimator (MCD)

◮ Estimator of multivariate location and scatter [Rousseeuw, 1984].
◮ Raw MCD estimator:
  ◮ Choose h between $\lfloor (n + p + 1)/2 \rfloor$ and n.
  ◮ Find the h < n observations whose classical covariance matrix has lowest determinant:
    $H_0 = \operatorname{argmin}_H \det(\operatorname{cov}(x_i \mid i \in H))$
  ◮ $\hat{\mu}_0$ is the mean of those h observations:
    $\hat{\mu}_0 = \frac{1}{h} \sum_{i \in H_0} x_i$.
  ◮ $\hat{\Sigma}_0$ is the covariance matrix of those h observations, multiplied by a consistency factor:
    $\hat{\Sigma}_0 = c_0 \operatorname{cov}(x_i \mid i \in H_0)$

Robust Estimator

Minimum Covariance Determinant Estimator (MCD)

◮ Estimator of multivariate location and scatter [Rousseeuw, 1984]. ◮ Raw MCD estimator. ◮ Reweighted MCD estimator: ◮ Compute initial robust distances

$d_i = D(x_i, \hat{\mu}_0, \hat{\Sigma}_0) = \sqrt{(x_i - \hat{\mu}_0)' \hat{\Sigma}_0^{-1} (x_i - \hat{\mu}_0)}$.
◮ Assign weights $w_i = 0$ if $d_i > \sqrt{\chi^2_{p,0.975}}$, else $w_i = 1$.
◮ Compute the reweighted mean and covariance matrix:
  $\hat{\mu}_{MCD} = \left(\sum_{i=1}^n w_i x_i\right) \Big/ \sum_{i=1}^n w_i$
  $\hat{\Sigma}_{MCD} = c_1 \left(\sum_{i=1}^n w_i (x_i - \hat{\mu}_{MCD})(x_i - \hat{\mu}_{MCD})'\right) \left(\sum_{i=1}^n w_i\right)^{-1}$
◮ Compute final robust distances and assign new weights $w_i$.
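The reweighting step can be sketched as follows; in this toy usage the classical mean and covariance of a clean sample stand in for the raw MCD estimates, and the consistency factors $c_0$, $c_1$ are omitted:

```python
import numpy as np
from scipy.stats import chi2

def reweight(X, mu0, S0, alpha=0.975):
    """One reweighting step: discard points whose robust distance to the
    raw fit exceeds sqrt(chi2_{p,alpha}), then refit mean and covariance.
    Consistency factors are omitted in this sketch."""
    X = np.asarray(X, dtype=float)
    p = X.shape[1]
    diff = X - mu0
    d = np.sqrt(np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S0), diff))
    w = (d <= np.sqrt(chi2.ppf(alpha, df=p))).astype(float)
    mu = (w[:, None] * X).sum(axis=0) / w.sum()
    diff = X - mu
    Sigma = (w[:, None] * diff).T @ diff / w.sum()
    return mu, Sigma, w

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 2))
# stand-in for the raw MCD fit on this clean sample:
mu, Sigma, w = reweight(X, X.mean(axis=0), np.cov(X, rowvar=False))
```

On clean data almost all weights equal 1, so the reweighted estimate recovers nearly full efficiency while keeping the breakdown point of the raw MCD.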

Outlier detection

For outlier detection, recompute the robust distances (based on MCD).

$RD_i = \sqrt{(x_i - \hat{\mu}_{MCD})' \hat{\Sigma}_{MCD}^{-1} (x_i - \hat{\mu}_{MCD})}$

Flag observation $x_i$ as an outlier if $RD_i > \sqrt{\chi^2_{p,0.975}}$. This is equivalent to flagging the observations that do not belong to the robust tolerance ellipsoid

$\{x \mid RD(x) \le \sqrt{\chi^2_{p,0.975}}\}$.

Outlier detection

Robust Tolerance Ellipsoid (based on MCD) for example

Properties of the MCD

◮ Robust ◮ breakdown point from 0 to 50% ◮ bounded influence function [Croux and Haesbroeck, 1999] . ◮ Positive definite ◮ Affine equivariant ◮ given X, the MCD estimates satisfy

$\hat{\mu}(XA + 1_n v') = \hat{\mu}(X) A + v'$
$\hat{\Sigma}(XA + 1_n v') = A' \hat{\Sigma}(X) A$

for all nonsingular matrices A and all constant vectors v. ⇒ data may be rotated, translated or rescaled without affecting the outlier detection diagnostics. ◮ Not very efficient: improved by reweighting step. ◮ Computation: FAST-MCD algorithm [Rousseeuw and Van Driessen, 1999].

FAST-MCD algorithm

Computation of the raw estimates for $n \le 600$:
◮ For m = 1 to 500:
  ◮ Draw a random subset of size p + 1.
  ◮ Apply two C-steps: compute robust distances
    $d_i = D(x_i, \hat{\mu}, \hat{\Sigma}) = \sqrt{(x_i - \hat{\mu})' \hat{\Sigma}^{-1} (x_i - \hat{\mu})}$,
    take the h observations with smallest robust distance, and compute the mean and covariance matrix of this h-subset.
◮ Retain the 10 h-subsets with lowest covariance determinant.
◮ Apply C-steps on these 10 subsets until convergence.
◮ Retain the h-subset with lowest covariance determinant.
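A minimal sketch of the C-step and its iteration from one starting subset (a single random start rather than the 500 used by FAST-MCD; the two-cluster data set is made up):

```python
import numpy as np

def c_step(X, subset, h):
    """One concentration step: fit mean/covariance on the current subset,
    then keep the h observations closest in Mahalanobis distance."""
    mu = X[subset].mean(axis=0)
    S = np.cov(X[subset], rowvar=False)
    diff = X - mu
    d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)
    return np.sort(np.argsort(d2)[:h])

def raw_mcd_from_start(X, start, h, max_iter=100):
    """Iterate C-steps from one starting subset until the h-subset
    no longer changes."""
    subset = np.asarray(start)
    for _ in range(max_iter):
        new = c_step(X, subset, h)
        if np.array_equal(new, subset):
            break
        subset = new
    return subset

rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((80, 2)),
               rng.standard_normal((20, 2)) + 8.0])   # 20% shifted outliers
h = 60
start = rng.choice(len(X), size=3, replace=False)      # p + 1 = 3 points
best = raw_mcd_from_start(X, start, h)
```

Since each C-step can only lower (or keep) the covariance determinant and there are finitely many h-subsets, the iteration converges; the converged subset concentrates on the uncontaminated cluster.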

FAST-MCD algorithm

◮ A C-step can never increase the determinant of the covariance matrix. ◮ As there are only finitely many h-subsets, convergence to a (local) minimum is guaranteed. ◮ The algorithm is not guaranteed to yield the global minimum. The fixed number of initial (p + 1)-subsets (500) is a compromise between robustness and computation time. ◮ Implementations of the FAST-MCD algorithm are widely available: ◮ R: in the packages robustbase and rrcov ◮ Matlab: in the LIBRA toolbox and the PLS toolbox of Eigenvector Research ◮ SAS: in PROC ROBUSTREG ◮ S-plus: built-in function cov.mcd.

Example: Animal set

Logarithm of body and brain weight for 28 animals.

Outlier detection based on MCD correctly indicates the outliers.

Example: Animal set

In dimension p > 2, a scatterplot and tolerance ellipsoid cannot be drawn. To expose the differences between a classical and a robust analysis, a distance-distance plot can be made.

Outlier detection based on MCD correctly indicates the outliers.

DetMCD algorithm

Deterministic algorithm for the MCD [Hubert, Rousseeuw and Verdonck, 2010]. ◮ Idea: ◮ Compute several 'robust' h-subsets, based on robust transformations of the variables and robust estimators of multivariate location and scatter. ◮ Apply C-steps until convergence.

Computation of DetMCD

◮ Standardize X by subtracting the median and dividing by Qn (location and scale equivariant).
◮ Standardized data: Z with rows $z_i'$ and columns $Z_j$.
◮ Obtain an estimate S of the covariance/correlation matrix of Z.
◮ To overcome lack of positive definiteness:
  1 Compute the eigenvectors P of S and define B = ZP.
  2 $\hat{\Sigma}(Z) = P L P'$ with $L = \mathrm{diag}(Q_n(B_1)^2, \dots, Q_n(B_p)^2)$.
◮ Estimation of the center: $\hat{\mu}(Z) = \mathrm{med}(Z \hat{\Sigma}^{-1/2}) \, \hat{\Sigma}^{1/2}$.
◮ Compute statistical distances $d_i = D(z_i, \hat{\mu}(Z), \hat{\Sigma}(Z))$.

◮ Initial h-subset: h observations with smallest distance. ◮ Apply C-steps until convergence.
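The positive-definiteness fix above can be sketched as follows; the normalised MAD stands in here for the Qn scale estimator used by the actual DetMCD algorithm:

```python
import numpy as np

def mad(v):
    """Normalised median absolute deviation (stand-in for the Qn
    scale estimator of the real DetMCD algorithm)."""
    return 1.48 * np.median(np.abs(v - np.median(v)))

def fix_scatter(Z, S):
    """Turn a possibly non positive definite symmetric estimate S into a
    proper scatter matrix: project Z on the eigenvectors of S and
    re-estimate the variance robustly along each eigendirection."""
    _, P = np.linalg.eigh(S)           # eigenvectors of S (columns of P)
    B = Z @ P                          # projected data
    L = np.diag([mad(B[:, j]) ** 2 for j in range(B.shape[1])])
    return P @ L @ P.T

rng = np.random.default_rng(4)
Z = rng.standard_normal((50, 3))
S = np.array([[1.0, 0.9, 0.9],
              [0.9, 1.0, -0.9],
              [0.9, -0.9, 1.0]])       # symmetric but not positive definite
Sigma = fix_scatter(Z, S)
```

The result inherits the eigenvectors of S but replaces its (possibly negative) eigenvalues by robust variances of the projected data, so it is positive definite whenever no projection degenerates.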

Construct preliminary estimates S

1 Take the hyperbolic tangent of the standardized data,
  $Y_j = \tanh(Z_j)$ for $j = 1, \dots, p$,
  and take the Pearson correlation matrix of Y:
  $S_1 = \mathrm{corr}(Y)$.
2 Spearman correlation matrix:
  $S_2 = \mathrm{corr}(R)$
  where $R_j$ is the rank of $Z_j$.
3 Compute Tukey normal scores $T_j$ from the ranks $R_j$:
  $T_j = \Phi^{-1}\left(\frac{R_j - 1/3}{n + 1/3}\right)$
  where $\Phi(\cdot)$ is the normal cdf, and set $S_3 = \mathrm{corr}(T)$.
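A sketch of these three correlation-based preliminary estimates, using scipy for the ranks and the normal quantile function:

```python
import numpy as np
from scipy.stats import rankdata, norm

def s1_tanh(Z):
    """Pearson correlation of tanh of the standardized data
    (tanh dampens the influence of large values)."""
    return np.corrcoef(np.tanh(Z), rowvar=False)

def s2_spearman(Z):
    """Spearman correlation: Pearson correlation of the column ranks."""
    R = np.apply_along_axis(rankdata, 0, Z)
    return np.corrcoef(R, rowvar=False)

def s3_normal_scores(Z):
    """Pearson correlation of the Tukey normal scores of the ranks."""
    n = Z.shape[0]
    R = np.apply_along_axis(rankdata, 0, Z)
    T = norm.ppf((R - 1.0 / 3.0) / (n + 1.0 / 3.0))
    return np.corrcoef(T, rowvar=False)

rng = np.random.default_rng(5)
Z = rng.standard_normal((100, 3))
S1, S2, S3 = s1_tanh(Z), s2_spearman(Z), s3_normal_scores(Z)
```

All three are correlation matrices (unit diagonal), each trading some efficiency for robustness against heavy-tailed marginals.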

Construct preliminary estimates S

4 Related to the spatial sign covariance matrix [Visuri et al., 2000]. Define $k_i = z_i / \|z_i\|$ and let
  $S_4 = \frac{1}{n} \sum_{i=1}^n k_i k_i'$.
5 Take the first step of the BACON algorithm [Billor et al., 2000]: consider the $\lceil n/2 \rceil$ standardized observations $z_i$ with smallest norm, and compute their mean and covariance matrix.
6 Obtained from the raw OGK estimator for scatter [Maronna and Zamar, 2002].
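Estimates $S_4$ and $S_5$ can be sketched directly from their definitions:

```python
import numpy as np

def spatial_sign_cov(Z):
    """S4: average outer product of the standardized observations
    projected onto the unit sphere."""
    norms = np.linalg.norm(Z, axis=1, keepdims=True)
    K = Z / norms
    return K.T @ K / Z.shape[0]

def bacon_start(Z):
    """S5 (first BACON step): mean and covariance of the ceil(n/2)
    observations with smallest norm."""
    n = Z.shape[0]
    idx = np.argsort(np.linalg.norm(Z, axis=1))[: int(np.ceil(n / 2))]
    return Z[idx].mean(axis=0), np.cov(Z[idx], rowvar=False)

rng = np.random.default_rng(6)
Z = rng.standard_normal((40, 3))
S4 = spatial_sign_cov(Z)
mu5, S5 = bacon_start(Z)
```

Projecting onto the unit sphere makes $S_4$ insensitive to the magnitude of outliers (its trace is exactly 1), while the half-sample in $S_5$ simply discards the points farthest from the origin of the standardized data.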

Simulation study

Compare DetMCD with FASTMCD on artificial data.
◮ Different small and moderate data sets:
  A: n = 100 and p = 2
  B: n = 100 and p = 5
  C: n = 200 and p = 10
  D: n = 400 and p = 40
  E: n = 600 and p = 60.
◮ Also consider correlated data [Maronna and Zamar, 2002].
◮ Different contamination models: ε = 0, 10, 20, 30 and 40%.
◮ Different types of contamination: point, cluster and radial contamination.

Simulation study

Compare DetMCD with FASTMCD on artificial data.
◮ Measures of performance:
  ◮ The objective function of the raw scatter estimator: $OBJ = \det \hat{\Sigma}_{raw}(Y)$.
  ◮ An error measure of the location estimator: $e_\mu = \|\hat{\mu}(Y)\|^2$.
  ◮ An error measure of the scatter estimate, defined as the logarithm of its condition number: $e_\Sigma = \log_{10}(\mathrm{cond}(\hat{\Sigma}(Y)))$.
  ◮ The computation time t (in seconds).
Each of these performance measures should be as close to 0 as possible.

Simulation study

Figure: The error of the scatter estimate, $e_\Sigma$, for different values of r when n = 400, p = 40, for (a) 10% and (b) 40% cluster contamination (DetMCD versus FASTMCD).

The value of r determines the distance between the outliers and the main center.

Simulation results for clean data

        A                B                C                D                E
        DetMCD   OGK     DetMCD   OGK    DetMCD  OGK      DetMCD  OGK     DetMCD   OGK
OBJ     0.088    0.086   0.031    0.030  0.009   0.009    1e-5    1e-5    4.35e-7  8.68e-7
e_µ     0.028    0.031   0.065    0.073  0.060   0.063    0.124   0.132   0.1250   0.1285
e_Σ     0.175    0.202   0.390    0.460  0.393   0.418    0.636   0.668   0.6424   0.6576
t       0.019    0.498   0.029    0.581  0.096   0.868    1.775   4.349   5.7487   8.7541


                Point (10%)                    Cluster (10%)                  Radial (10%)
        DetMCD          FASTMCD        DetMCD          FASTMCD        DetMCD   FASTMCD
A  OBJ  0.120 / 0.120   0.117 / 0.117  0.119 / 0.120   0.117 / 0.117  0.119    0.117
   e_µ  0.027 / 0.028   0.028 / 0.029  0.027 / 0.027   0.028 / 0.028  0.027    0.029
   e_Σ  0.156 / 0.158   0.171 / 0.172  0.157 / 0.157   0.171 / 0.171  0.161    0.177
   t    0.018 / 0.019   0.483 / 0.482  0.018 / 0.018   0.482 / 0.482  0.018    0.496
B  OBJ  0.047 / 0.047   0.045 / 0.045  0.047 / 0.047   0.045 / 0.045  0.047    0.045
   e_µ  0.068 / 0.068   0.074 / 0.074  0.068 / 0.068   0.074 / 0.074  0.067    0.074
   e_Σ  0.383 / 0.383   0.425 / 0.425  0.382 / 0.383   0.426 / 0.426  0.379    0.425
   t    0.028 / 0.028   0.556 / 0.555  0.028 / 0.028   0.557 / 0.557  0.028    0.579
C  OBJ  0.014 / 0.015   0.014 / 0.013  0.015 / 0.015   0.014 / 0.014  0.015    0.014
   e_µ  0.064 / 0.063   0.065 / 0.855  0.063 / 0.064   0.065 / 0.065  0.063    0.066
   e_Σ  0.399 / 0.398   0.415 / 1.037  0.398 / 0.398   0.415 / 0.415  0.397    0.414
   t    0.092 / 0.092   0.823 / 0.825  0.093 / 0.093   0.828 / 0.828  0.092    0.861
D  OBJ  3e-05 / 3e-05   5e-05 / 3e-05  4e-05 / 4e-05   5e-05 / 5e-05  4e-05    5e-05
   e_µ  0.131 / 0.130   0.135 / 175    0.131 / 0.130   0.135 / 0.135  0.129    0.136
   e_Σ  0.651 / 0.650   0.672 / 4.639  0.651 / 0.651   0.672 / 0.673  0.645    0.670
   t    1.694 / 1.710   4.395 / 4.305  1.715 / 1.717   4.362 / 4.344  1.739    4.336
E  OBJ  1e-06 / 2e-06   5e-10 / 6e-07  1e-06 / 1e-06   2e-06 / 2e-06  1e-06    2e-06
   e_µ  0.288 / 0.134   51.5 / 65317   0.134 / 0.134   0.134 / 0.134  0.135    0.136
   e_Σ  0.666 / 0.661   3.098 / 6.201  0.660 / 0.660   0.663 / 0.663  0.660    0.669
   t    5.527 / 5.527   8.530 / 8.769  5.649 / 5.644   8.773 / 8.758  5.703    8.617

                Point (40%)                    Cluster (40%)                  Radial (40%)
        DetMCD          FASTMCD        DetMCD          FASTMCD        DetMCD   FASTMCD
A  OBJ  0.018 / 0.436   0.010 / 0.165  0.436 / 0.436   0.433 / 0.433  0.435    0.433
   e_µ  13.79 / 0.033   15.24 / 272.0  0.033 / 0.033   0.033 / 0.033  0.095    0.091
   e_Σ  2.615 / 0.144   2.870 / 4.102  0.144 / 0.144   0.144 / 0.144  0.352    0.361
   t    0.019 / 0.017   0.483 / 0.483  0.017 / 0.017   0.482 / 0.482  0.016    0.495
B  OBJ  1e-04 / 0.313   3e-05 / 0.053  0.371 / 0.312   0.309 / 0.309  0.313    0.309
   e_µ  79.0 / 0.084    96.8 / 2e+05   1.206 / 0.084   0.134 / 0.085  0.086    0.086
   e_Σ  3.46 / 0.391    4.58 / 7.84    0.465 / 0.391   0.395 / 0.392  0.398    0.400
   t    0.027 / 0.027   0.550 / 0.553  0.030 / 0.027   0.553 / 0.554  0.027    0.577
C  OBJ  3e-04 / 0.168   4e-09 / 6e-06  0.168 / 0.168   110 / 1404     0.168    0.166
   e_µ  160 / 0.084     187 / 3e+05    0.084 / 0.084   7111 / 90886   0.084    0.084
   e_Σ  3.58 / 0.441    4.20 / 7.43    0.441 / 0.441   4.089 / 5.127  0.440    0.442
   t    0.088 / 0.088   0.804 / 0.809  0.093 / 0.093   0.824 / 0.830  0.089    0.850
D  OBJ  5e-33 / 0.004   2e-32 / 1e-29  0.004 / 0.004   0.003 / 12.2   0.004    0.004
   e_µ  766 / 0.171     760 / 1e+06    15.7 / 0.171    99.76 / 4e+05  0.172    0.174
   e_Σ  4.57 / 0.734    5.06 / 8.13    1.03 / 0.733    2.62 / 6.21    0.735    0.737
   t    1.64 / 1.64     4.00 / 4.18    1.76 / 1.78     4.34 / 4.33    1.72     4.23
E  OBJ  5e-49 / 5e-04   6e-49 / 8e-46  1e-04 / 4e-04   1e-04 / 0.819  4e-04    4e-04
   e_µ  1152 / 0.172    1142 / 2e+06   75.4 / 0.172    84.7 / 6e+05   0.171    0.171
   e_Σ  4.72 / 0.744    4.88 / 8.14    2.43 / 0.742    2.53 / 6.37    0.739    0.740
   t    5.33 / 5.32     7.13 / 7.39    5.91 / 5.77     8.70 / 8.76    5.59     8.43

Properties of DetMCD

Advantages ◮ Very fast ◮ DetMCD typically needs 3 or 4 C-steps to converge, hence about 21 C-steps in total. ◮ FASTMCD uses 1000 C-steps. ◮ Fully deterministic ◮ Permutation invariant ◮ Easy to compute DetMCD for different values of h ◮ The initial subsets are independent of h. Disadvantages ◮ Not fully affine equivariant

Example: Philips data

Engineers measured 9 characteristics for 667 diaphragm parts for television sets.

Figure: Robust distances of the Philips data with (a) DetMCD and (b) FASTMCD.

Example: Philips data

Engineers measured 9 characteristics for 667 diaphragm parts for television sets.
◮ Estimates for location and scatter are almost identical:
  ◮ $d_\mu = \|\hat{\mu}_{MCD} - \hat{\mu}_{DetMCD}\| = 0.0000$
  ◮ $d_\Sigma = \operatorname{cond}\left(\hat{\Sigma}_{MCD}^{-1/2} \, \hat{\Sigma}_{DetMCD} \, (\hat{\Sigma}_{MCD}^{-1/2})'\right) = 1.0000$.
◮ Objective functions almost the same: $OBJ_{MCD} / OBJ_{DetMCD} = 0.9992$.
◮ The optimal h-subsets differed in only 1 observation.
◮ Computation time: DetMCD 0.2676 s, FASTMCD 1.0211 s.

Applications of MCD

MCD has been applied in numerous research fields, such as ◮ Finance ◮ Medicine ◮ Quality control ◮ Image analysis ◮ Chemistry MCD has also been used as a basis to develop robust and computationally efficient multivariate techniques, such as ◮ Principal Component Analysis (PCA) ◮ Classification ◮ Factor Analysis ◮ Multivariate Regression

Principal Component Analysis (PCA)

PCA summarizes the information in the data into a few principal components (PCs).
◮ Let $X \in \mathbb{R}^{n \times p}$ be the data (n cases and p variables).
◮ The PCs $t_i$ are defined as linear combinations of the data: $t_i = X p_i$, where
  $p_i = \operatorname{argmax}_a \{\operatorname{var}(Xa)\}$
  under the constraints $\|p_i\| = 1$ and $\operatorname{cov}(X p_i, X p_j) = 0$ for $j < i$.
◮ The PCs are uncorrelated and ordered so that the first few retain most of the variation present in all of the original variables.
◮ From the Lagrange multiplier method: the PCs can be computed as eigenvectors of the variance-covariance matrix Σ.
◮ The variance and variance-covariance matrix are sensitive to outliers ⇒ PCA is a non-robust method.
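The eigenvector computation can be sketched as follows; passing a robust scatter estimate (e.g. MCD) instead of `np.cov` robustifies the result:

```python
import numpy as np

def pca_from_cov(X, Sigma, k=2):
    """First k principal components as the dominant eigenvectors of a
    given covariance estimate Sigma (classical or robust)."""
    vals, vecs = np.linalg.eigh(Sigma)           # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]
    P = vecs[:, order]                           # loadings
    T = (X - X.mean(axis=0)) @ P                 # scores
    return vals[order], P, T

rng = np.random.default_rng(7)
X = rng.standard_normal((100, 3))
vals, P, T = pca_from_cov(X, np.cov(X, rowvar=False), k=2)
```

With the classical covariance the scores are exactly uncorrelated and the retained eigenvalues are their variances; the same code applied to a robust Σ estimate gives a robust PCA in low dimensions.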

PCA

◮ Example: Chinchilla data ◮ 50 Chinchillas for auditory research ◮ 3 measurements (cm): tail length, whisker length and ear length ◮ the data are standardized.

◮ Visualization.

PCA

◮ Example: Chinchilla data ◮ Measurements for 10 more Chinchillas from USA → added to the data. ◮ Visualization.

PCA

◮ Example: Chinchilla data ◮ Measurements for 10 more Chinchillas from USA → added to the data. ◮ Solution: Robust PCA when data contains outliers. ◮ Visualization.

PCA

◮ Example: Chinchilla data ◮ Measurements for 10 more Chinchillas from USA → added to the data. ◮ Solution: Robust PCA when data contains outliers ◮ Reason for outliers → wrong Chinchillas from USA.

Robust covariance matrix

◮ PCA corresponds to a spectral decomposition of the variance-covariance matrix: $\Sigma = P \Lambda P'$.
  ◮ P contains as columns the eigenvectors $p_i$ of Σ.
  ◮ Λ is a diagonal matrix whose diagonal elements $\lambda_{ii}$ are the eigenvalues of Σ corresponding to the $p_i$.
◮ Simple idea: compute the principal components as eigenvectors of a robust covariance matrix (a robust estimate of Σ).
◮ However, robust scatter estimators cannot be computed, or have poor statistical properties, in high dimensions.

ROBPCA: a hybrid method (Hubert et al.)

◮ Use projection pursuit (PP) to find the directions in which observations are most outlying.
◮ The Stahel-Donoho Outlyingness (SDO) is defined as
  $r(x_i, X) = \sup_{v \in \mathbb{R}^p} \frac{|v' x_i - M(v'X)|}{S(v'X)}$
  ◮ $x_i$: rows of X
  ◮ M: estimator of location (univariate MCD)
  ◮ S: estimator of scale (univariate MCD)
  ◮ v: a p-variate direction.
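In practice the supremum is approximated over finitely many directions; a sketch with random directions, in which the median and normalised MAD stand in for the univariate MCD location and scale:

```python
import numpy as np

def sdo(X, n_dirs=500, seed=0):
    """Approximate Stahel-Donoho outlyingness: maximum standardized
    deviation of each point over a set of random unit directions."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    V = rng.standard_normal((n_dirs, p))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    out = np.zeros(n)
    for v in V:
        proj = X @ v
        med = np.median(proj)
        scale = 1.48 * np.median(np.abs(proj - med))
        if scale > 0:
            out = np.maximum(out, np.abs(proj - med) / scale)
    return out

rng_data = np.random.default_rng(8)
X = rng_data.standard_normal((100, 2))
X[0] = [15.0, -15.0]                    # one far outlier
r = sdo(X)
```

Points far from the bulk score a large outlyingness in at least one direction, which is how ROBPCA selects its initial h-subset of least outlying points.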

ROBPCA: a hybrid method (Hubert et al.)

◮ Use PP to find the directions in which observations are most outlying (via the Stahel-Donoho Outlyingness).
◮ Apply classical PCA on the h data points with smallest SDO and retain k components.
◮ Obtain an improved robust subspace estimate as the subspace spanned by the k dominant eigenvectors of the covariance matrix of all points with $OD_i = \|x_i - \hat{x}_i\| < c_h$ ($c_h$ a cutoff value based on a robust measure of location and scale of the $OD_i^{2/3}$). Project the data points on this subspace.
◮ Apply the MCD covariance estimator in the subspace: mean and covariance of the h points with smallest $RD_i$.
◮ The final PCs are the eigenvectors of this robust covariance matrix.
◮ The robustness properties are inherited from the MCD.

Outliers

An outlier is an observation that does not obey the pattern of the majority of the data. ◮ Different kinds of outliers:

◮ 1,2: good leverage points 3,4: orthogonal outliers 5,6: bad leverage points

Outlier Map

◮ Displays the orthogonal distance $OD_{i,k}$ versus the score distance $SD_{i,k}$:
  $OD_{i,k} = \|x_i - \hat{\mu}_x - P_{p,k} t_i\| = \|x_i - \hat{x}_i\|$
  $SD_{i,k} = \sqrt{t_i' (L_{k,k})^{-1} t_i}$
◮ Here $\hat{\mu}_x$, P, $t_i$ and L represent, respectively, the robust center of the data, the robust loading matrix, the robust scores, and a diagonal matrix containing the k largest robust eigenvalues, all resulting from a robust PCA method.

◮ Cut-off value to determine outliers for each distance.
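Given the outputs of a PCA fit (center, loadings, leading eigenvalues), the two distances can be sketched as below; the usage plugs in a classical PCA as a stand-in for the robust fit:

```python
import numpy as np

def outlier_map_distances(X, mu, P, L):
    """Orthogonal distance to the PCA subspace and score distance
    within it, given a center mu, p x k loadings P and the k largest
    eigenvalues L."""
    Xc = X - mu
    T = Xc @ P                                   # scores
    od = np.linalg.norm(Xc - T @ P.T, axis=1)    # distance to subspace
    sd = np.sqrt(np.einsum('ik,k,ik->i', T, 1.0 / L, T))
    return od, sd

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 3))
vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
P, L = vecs[:, -2:], vals[-2:]          # classical PCA as a stand-in
mu = X.mean(axis=0)
od, sd = outlier_map_distances(np.vstack([X, mu]), mu, P, L)
```

The appended center itself has both distances equal to zero, and plotting `od` against `sd` with the two cutoffs reproduces the four regions of the outlier map.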

Outlier Map

◮ Example: 3-dimensional data (generated from multivariate normal distribution) projected on a robust 2-dimensional PCA-subspace.

[Figure: ROBPCA outlier map, orthogonal distance versus score distance (2 LV), with observations 1-6 marked.]

1 & 2: good leverage points (outlying SD, regular OD)
3 & 4: orthogonal outliers (outlying OD, regular SD)
5 & 6: bad leverage points (outlying OD, outlying SD)

Example 1: Swiss bank notes (n = 100 and p = 6)

◮ Highly correlated variables and outliers ⇒ Robust PCA method. ◮ Missing values in the data ⇒ Methodology of Serneels and Verdonck (2008). ◮ 3 PCs explained 92% of the variance.

[Figure: outlier maps of the Swiss bank notes data, orthogonal distance versus score distance (3 LV); a group of observations (e.g. 71, 67, 80, 2, 60) is flagged.]

◮ Time: FASTMCD took 197s, whereas DetMCD needed 10s.

Example 2: Credit Default Swap (CDS) from iTraxx Europe

Price of the CDS of 125 companies over 58 weeks.
◮ Take log ratios $\log(x_{i,j} / x_{i,j-1})$ for every $x_i$ (i is the company and j is the week).
◮ Delete variables containing more than 63 zeroes. ⇒ 125 companies and 37 log ratios.
Applying ROBPCA:

Figure: Outlier map of the CDS data based on ROBPCA (orthogonal distance versus score distance, 10 LV); companies such as Hilton, BAA, VNU, TDCAS and CarltonComm stand out.

Example 2: Credit Default Swap (CDS) from iTraxx Europe

Price of the CDS of 125 companies over 58 weeks.
◮ Take log ratios $\log(x_{i,j} / x_{i,j-1})$ for every $x_i$ (i is the company and j is the week).
◮ Delete variables containing more than 63 zeroes. ⇒ 125 companies and 37 log ratios.
Cope with the skewness of the variables [Hubert et al., 2009]:

[Figure: ROBPCA-AO outlier map of the CDS data (orthogonal distance versus score distance, 10 LV); BAA, CarltonComm, VNU, Hilton and TDCAS remain the most outlying companies.]

Econometric application: Multivariate Time Series

Multivariate exponential smoothing is a popular technique to forecast time series.
◮ It cannot cope with outliers. Robust version based on MCD [Croux et al., 2010].
◮ Assume
  ◮ $y_1, \dots, y_T$: multivariate time series
  ◮ $\hat{y}_1, \dots, \hat{y}_{t-1}$: robust smoothed values of $y_1, \dots, y_{t-1}$ have already been computed.
◮ $\hat{y}_t = \Lambda y_t^* + (I - \Lambda) \hat{y}_{t-1}$, where
  ◮ Λ is the smoothing matrix
  ◮ $y_t^*$ is the cleaned version of the p-dimensional vector $y_t$.
◮ The forecast for $y_{T+1}$ that can be made at time T:
  $\hat{y}_{T+1|T} = \hat{y}_T = \sum_{k=0}^{T-1} \Lambda (I - \Lambda)^k y_{T-k}$.
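The smoothing recursion itself is a few lines; this sketch is the plain, non-robust version (replacing each $y_t$ by a cleaned $y_t^*$ would give the robust variant):

```python
import numpy as np

def exp_smooth(Y, Lam, y0=None):
    """Multivariate exponential smoothing:
    y_hat_t = Lam @ y_t + (I - Lam) @ y_hat_{t-1}."""
    T, p = Y.shape
    I = np.eye(p)
    yhat = Y[0].copy() if y0 is None else np.asarray(y0, dtype=float)
    out = np.empty_like(Y, dtype=float)
    for t in range(T):
        yhat = Lam @ Y[t] + (I - Lam) @ yhat
        out[t] = yhat
    return out

Lam = np.array([[0.5, 0.0],
                [0.0, 0.5]])            # hypothetical smoothing matrix
Y = np.full((20, 2), 3.0)               # constant toy series
smoothed = exp_smooth(Y, Lam)
```

On a constant series the smoothed values reproduce the series exactly, which is a quick sanity check of the recursion.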

Econometric application: Multivariate Time Series

Multivariate exponential smoothing is a popular technique to forecast time series.
◮ It cannot cope with outliers. Robust version based on MCD [Croux et al., 2010].
◮ The multivariate cleaned series is calculated as
  $y_t^* = \frac{\psi\left(\sqrt{r_t' \hat{\Sigma}_t^{-1} r_t}\right)}{\sqrt{r_t' \hat{\Sigma}_t^{-1} r_t}} \, r_t + \hat{y}_{t|t-1}$
  where
  ◮ $r_t = y_t - \hat{y}_{t|t-1}$ denotes the one-step-ahead forecast error
  ◮ ψ is the Huber ψ-function with clipping constant $\sqrt{\chi^2_{p,0.95}}$
  ◮ $\hat{\Sigma}_t$ is the estimated covariance matrix of the one-step-ahead forecast error at time t.
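A sketch of this cleaning step for a single observation, directly from the formula (the toy inputs, identity covariance and zero forecast, are hypothetical):

```python
import numpy as np
from scipy.stats import chi2

def clean_observation(y, y_pred, Sigma):
    """Huberised cleaning: shrink the one-step-ahead forecast error r_t
    toward zero when its statistical distance exceeds the clipping
    constant sqrt(chi2_{p,0.95})."""
    r = y - y_pred
    p = len(y)
    c = np.sqrt(chi2.ppf(0.95, df=p))
    d = np.sqrt(r @ np.linalg.inv(Sigma) @ r)
    if d == 0:
        return y_pred.copy()
    psi = d if d <= c else c               # Huber psi-function
    return y_pred + (psi / d) * r

y_clean = clean_observation(np.array([0.1, 0.1]), np.zeros(2), np.eye(2))
y_shrunk = clean_observation(np.array([10.0, 0.0]), np.zeros(2), np.eye(2))
```

Observations within the cutoff are returned unchanged, while an outlying observation is pulled back onto the boundary of the tolerance region around the forecast, so it cannot drag the smoothed series away.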

Econometric application: Multivariate Time Series

Multivariate exponential smoothing is a popular technique to forecast time series. ◮ It cannot cope with outliers. Robust version based on MCD [Croux et al., 2010].

⇒ The robust version uses the MCD in two different stages of the algorithm: ◮ starting values are obtained by MCD-based robust multivariate regression [Rousseeuw et al., 2010]; ◮ the MCD is used as loss function to choose the smoothing matrix Λ (in an iterative manner).

Illustration: Housing Data (1968-1996)

Real bivariate time series of monthly data (housing starts and completions).
◮ A startup period of length 10 and the complete series as training sample yields the following smoothing matrix:

$\Lambda = \begin{pmatrix} 0.68 & 0.04 \\ 0.04 & 0.62 \end{pmatrix}$

◮ Redoing this example with DetMCD gives the exact same smoothing matrix.
◮ Time: FASTMCD took 1 hour and 52 minutes, whereas DetMCD only needed 22 minutes.
◮ The speed-up becomes even more important for higher-dimensional time series.

Conclusions

◮ DetMCD is a new algorithm which ◮ is typically more robust than FASTMCD and needs even less time ◮ is deterministic in that it does not use any random subsets ◮ is permutation invariant and close to affine equivariant ◮ allows the analysis to be run for many values of h without much additional computation. ◮ We illustrated DetMCD in the contexts of PCA and time series analysis. ◮ Many other methods that (in)directly rely on the MCD may also benefit from the DetMCD approach, such as ◮ robust canonical correlation ◮ robust regression with continuous and categorical regressors ◮ robust errors-in-variables regression ◮ robust calibration ◮ on-line applications or procedures that require the MCD to be computed many times.

Selected references

M. Hubert, P.J. Rousseeuw and T. Verdonck (2011). A deterministic algorithm for the MCD. Submitted.
M. Hubert, P.J. Rousseeuw and T. Verdonck (2009). Robust PCA for skewed data and its outlier map. Computational Statistics and Data Analysis, 53: 2264-2274.
S. Serneels and T. Verdonck (2008). Principal component analysis for data containing outliers and missing elements. Computational Statistics and Data Analysis, 52: 1712-1727.
P.J. Rousseeuw and K. Van Driessen (1999). A Fast Algorithm for the Minimum Covariance Determinant Estimator. Technometrics, 41: 212-223.
M. Hubert, P.J. Rousseeuw and K. Vanden Branden (2005). ROBPCA: a new approach to robust principal component analysis. Technometrics, 47: 64-79.
C. Croux, S. Gelper and K. Mahieu (2010). Robust exponential smoothing of multivariate time series. Computational Statistics and Data Analysis, 54: 2999-3006.
