Robust Covariance Estimation for Financial Applications
Tim Verdonck, Mia Hubert, Peter Rousseeuw
Department of Mathematics, K.U.Leuven
August 30, 2011

Contents
1 Introduction: Robust Statistics
2 Multivariate Location and Scatter Estimates
3 Minimum Covariance Determinant Estimator (MCD): FAST-MCD algorithm; DetMCD algorithm
4 Principal Component Analysis
5 Multivariate Time Series
6 Conclusions
7 Selected references

Introduction: Robust Statistics

Real data often contain outliers, and most classical methods are highly influenced by them.

What is robust statistics?
Robust statistical methods try to fit the model imposed by the majority of the data. They aim to find a 'robust' fit, similar to the fit we would have found without the outliers (the observations deviating from the robust fit). This also allows for outlier detection: a robust estimate applied to all observations is comparable to the classical estimate applied to the outlier-free data set.

Robust estimator
A good robust estimator combines high robustness with high efficiency.
◮ Robustness: being less influenced by outliers.
◮ Efficiency: being precise at uncontaminated data.

Univariate Scale Estimation: Wages data set
6000 households with a male head earning less than USD 15000 annually in 1966, classified into 39 demographic groups (we concentrate on the variable AGE).
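The three scale estimators compared on this data set (SD, the scaled IQR, and the MAD; the latter two carry consistency factors for the normal model) can be sketched in Python. The data below are simulated stand-ins, not the actual wages data:

```python
import numpy as np

def sd(x):
    # Classical standard deviation (denominator n - 1).
    return np.std(x, ddof=1)

def iqr_scale(x):
    # 0.7413 = 1 / (2 * Phi^{-1}(0.75)): makes the IQR consistent
    # for the standard deviation at the normal model.
    q1, q3 = np.percentile(x, [25, 75])
    return 0.7413 * (q3 - q1)

def mad_scale(x):
    # 1.4826 = 1 / Phi^{-1}(0.75): consistency factor for the MAD.
    med = np.median(x)
    return 1.4826 * np.median(np.abs(x - med))

rng = np.random.default_rng(0)
clean = rng.normal(loc=40.0, scale=1.0, size=1000)          # hypothetical AGE-like values
contaminated = np.concatenate([clean, np.full(50, 90.0)])   # ~5% gross outliers

# On clean data all three estimators are close to the true scale of 1;
# under contamination the SD explodes while the MAD barely moves.
print(sd(clean), iqr_scale(clean), mad_scale(clean))
print(sd(contaminated), mad_scale(contaminated))
```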
◮ Standard Deviation (SD): $\sqrt{\frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2} = 4.91$
◮ Interquartile Range (IQR): $0.74\,(x_{(\lfloor 0.75n \rfloor)} - x_{(\lfloor 0.25n \rfloor)}) = 0.91$
◮ Median Absolute Deviation (MAD): $1.48\,\operatorname{med}_i |x_i - \operatorname{med}_j x_j| = 0.96$

Measures of robustness: Breakdown Point
The breakdown point of a scale estimator S is the smallest fraction of observations to be contaminated such that S ↑ ∞ or S ↓ 0.

Scale estimator   Breakdown point
SD                $1/n \approx 0$
IQR               25%
MAD               50%

Note that when the breakdown value of an estimator is ε, this does not imply that a contamination proportion of less than ε leaves the estimator unaffected.

Measures of robustness: Influence Function
A specific type of contamination is point contamination,
$F_{\varepsilon,y} = (1 - \varepsilon)F + \varepsilon \Delta_y$
with $\Delta_y$ the Dirac measure at y.

Influence Function (Hampel, 1986)
The influence function measures how T(F) changes when contamination is added at y:
$IF(y; T, F) = \lim_{\varepsilon \to 0} \frac{T(F_{\varepsilon,y}) - T(F)}{\varepsilon}$
where T(·) is the functional version of the estimator.
◮ The IF is a local measure of robustness, whereas the breakdown point is a global measure.
◮ We prefer estimators that have a bounded IF.

[Figure: influence function examples (Hampel, 1986).]

Multivariate Location and Scatter Estimates

[Figure: scatterplot of bivariate data with true correlation ρ = 0.990; the classical estimate gives ρ̂ = 0.779, while the robust estimate gives ρ̂_MCD = 0.987.]

[Figure: boxplots of the marginals.]
In the multivariate setting, outliers cannot be detected by simply applying outlier detection rules to each variable separately.
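A minimal illustration of this point, on synthetic, strongly correlated bivariate normal data (not the data from the slide): a point can lie inside the observed range of each marginal yet be a clear multivariate outlier once the covariance structure is taken into account.

```python
import numpy as np

rng = np.random.default_rng(1)
# Strongly correlated bivariate normal sample.
cov = np.array([[1.0, 0.99], [0.99, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=500)

# A point that is unremarkable on each coordinate separately,
# but violates the correlation structure (first coordinate high, second low).
outlier = np.array([2.0, -2.0])

# Coordinate-wise check: inside the observed range of each marginal.
print(X[:, 0].min() <= outlier[0] <= X[:, 0].max(),
      X[:, 1].min() <= outlier[1] <= X[:, 1].max())

# Mahalanobis distance of the point relative to the sample.
xbar = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))
d = outlier - xbar
md = np.sqrt(d @ S_inv @ d)
print(md)  # far beyond the sqrt(chi2_{2,0.975}) cutoff of about 2.72
```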
Only by correctly estimating the covariance structure can we detect the outliers.

Classical Estimator
Data: $X_n = \{x_1, \ldots, x_n\}$ with $x_i \in \mathbb{R}^p$. Model: $x_i \sim N_p(\mu, \Sigma)$. More generally, we can assume that the data are generated from an elliptical distribution, i.e. a distribution whose density contours are ellipses.
The classical estimators for µ and Σ are the empirical mean and covariance matrix
$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$
$S_n = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})(x_i - \bar{x})'.$
Both are highly sensitive to outliers:
◮ zero breakdown value
◮ unbounded IF.

Tolerance Ellipsoid
The boundary contains the x-values with constant Mahalanobis distance to the mean,
$MD_i = \sqrt{(x_i - \bar{x})' S_n^{-1} (x_i - \bar{x})}.$
Classical tolerance ellipsoid:
$\{x \mid MD(x) \le \sqrt{\chi^2_{p,0.975}}\}$
with $\chi^2_{p,0.975}$ the 97.5% quantile of the $\chi^2$ distribution with p degrees of freedom. We expect (in large samples) that 97.5% of the observations belong to this ellipsoid, so we can flag observation $x_i$ as an outlier if it does not belong to the tolerance ellipsoid.

[Figure: classical tolerance ellipsoid for the example.]

Minimum Covariance Determinant Estimator (MCD)
◮ Estimator of multivariate location and scatter [Rousseeuw, 1984].
◮ Raw MCD estimator:
◮ Choose h between ⌊(n + p + 1)/2⌋ and n.
◮ Find the h < n observations whose classical covariance matrix has the lowest determinant:
$H_0 = \operatorname{argmin}_H \det(\operatorname{cov}(x_i \mid i \in H)).$
◮ $\hat{\mu}_0$ is the mean of those h observations:
$\hat{\mu}_0 = \frac{1}{h} \sum_{i \in H_0} x_i.$
◮ $\hat{\Sigma}_0$ is the covariance matrix of those h observations, multiplied by a consistency factor:
$\hat{\Sigma}_0 = c_0 \operatorname{cov}(x_i \mid i \in H_0).$
◮ Reweighted MCD estimator:
◮ Compute the initial robust distances
$d_i = D(x_i, \hat{\mu}_0, \hat{\Sigma}_0) = \sqrt{(x_i - \hat{\mu}_0)' \hat{\Sigma}_0^{-1} (x_i - \hat{\mu}_0)}.$
◮ Assign weights $w_i = 0$ if $d_i > \sqrt{\chi^2_{p,0.975}}$, else $w_i = 1$.
◮ Compute the reweighted mean and covariance matrix:
$\hat{\mu}_{\mathrm{MCD}} = \frac{\sum_{i=1}^{n} w_i x_i}{\sum_{i=1}^{n} w_i}$
$\hat{\Sigma}_{\mathrm{MCD}} = c_1 \left( \sum_{i=1}^{n} w_i (x_i - \hat{\mu}_{\mathrm{MCD}})(x_i - \hat{\mu}_{\mathrm{MCD}})' \right) \Big/ \sum_{i=1}^{n} w_i.$
◮ Compute the final robust distances and assign new weights $w_i$.

Outlier detection
For outlier detection, recompute the robust distances (now based on the MCD):
$RD_i = \sqrt{(x_i - \hat{\mu}_{\mathrm{MCD}})' \hat{\Sigma}_{\mathrm{MCD}}^{-1} (x_i - \hat{\mu}_{\mathrm{MCD}})}.$
Flag observation $x_i$ as an outlier if $RD_i > \sqrt{\chi^2_{p,0.975}}$. This is equivalent to flagging the observations that do not belong to the robust tolerance ellipsoid
$\{x \mid RD(x) \le \sqrt{\chi^2_{p,0.975}}\}.$

[Figure: robust tolerance ellipsoid (based on the MCD) for the example.]

Properties of the MCD
◮ Robust:
◮ breakdown point from 0 up to 50%
◮ bounded influence function [Croux and Haesbroeck, 1999].
◮ Positive definite.
◮ Affine equivariant: given X, the MCD estimates satisfy
$\hat{\mu}(XA + 1_n v') = \hat{\mu}(X) A + v'$
$\hat{\Sigma}(XA + 1_n v') = A' \hat{\Sigma}(X) A$
for all nonsingular matrices A and all constant vectors v.
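The reweighting step described above can be sketched in NumPy. This assumes raw estimates (mu0, Sigma0) are already available from some raw MCD fit; for brevity the consistency factor c1 is set to 1, whereas real implementations estimate it:

```python
import numpy as np
from scipy.stats import chi2

def reweight_mcd(X, mu0, Sigma0, c1=1.0):
    """One reweighting step: reject points whose raw robust distance exceeds
    sqrt(chi2_{p,0.975}), then recompute mean and covariance from the rest."""
    n, p = X.shape
    diff = X - mu0
    # Initial robust distances d_i based on the raw estimates.
    d = np.sqrt(np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma0), diff))
    w = (d <= np.sqrt(chi2.ppf(0.975, df=p))).astype(float)
    mu = (w[:, None] * X).sum(axis=0) / w.sum()
    diff = X - mu
    Sigma = c1 * (w[:, None] * diff).T @ diff / w.sum()
    return mu, Sigma, w

# Toy check: clean Gaussian data plus gross outliers; the raw estimates are
# taken (for illustration only) from the clean half.
rng = np.random.default_rng(2)
clean = rng.normal(size=(200, 2))
X = np.vstack([clean, np.full((10, 2), 8.0)])
mu, Sigma, w = reweight_mcd(X, clean.mean(axis=0), np.cov(clean, rowvar=False))
print(w[-10:])  # the gross outliers all receive weight 0
```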
⇒ the data may be rotated, translated or rescaled without affecting the outlier detection diagnostics.
◮ Not very efficient: improved by the reweighting step.
◮ Computation: FAST-MCD algorithm [Rousseeuw and Van Driessen, 1999].

FAST-MCD algorithm
Computation of the raw estimates for n ≤ 600:
◮ For m = 1 to 500:
◮ Draw a random subset of size p + 1.
◮ Apply two C-steps: compute the robust distances
$d_i = D(x_i, \hat{\mu}, \hat{\Sigma}) = \sqrt{(x_i - \hat{\mu})' \hat{\Sigma}^{-1} (x_i - \hat{\mu})},$
take the h observations with the smallest robust distances, and compute the mean and covariance matrix of this h-subset.
◮ Retain the 10 h-subsets with the lowest covariance determinant.
◮ Apply C-steps on these 10 subsets until convergence.
◮ Retain the h-subset with the lowest covariance determinant.

Notes on the FAST-MCD algorithm
◮ A C-step never increases the determinant of the covariance matrix.
◮ As there are only a finite number of h-subsets, convergence to a (local) minimum is guaranteed.
◮ The algorithm is not guaranteed to yield the global minimum. The fixed number of initial (p+1)-subsets (500) is a compromise between robustness and computation time.
◮ Implementations of the FAST-MCD algorithm are widely available:
◮ R: packages robustbase and rrcov
◮ Matlab: LIBRA toolbox and the PLS toolbox of Eigenvector Research
◮ SAS: PROC ROBUSTREG
◮ S-plus: built-in function cov.mcd.

Example: Animals data set
Logarithm of body and brain weight for 28 animals. Outlier detection based on the MCD correctly indicates the outliers.
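A single C-step from the loop above can be sketched as follows (a plain NumPy sketch on synthetic data, not the reference implementation). By construction, iterating C-steps cannot increase the determinant of the h-subset covariance:

```python
import numpy as np

def c_step(X, subset, h):
    """One C-step: from the current h-subset, compute its mean and covariance,
    rank all points by robust distance, and return the h closest points."""
    mu = X[subset].mean(axis=0)
    S = np.cov(X[subset], rowvar=False)
    diff = X - mu
    d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)
    return np.argsort(d2)[:h]

rng = np.random.default_rng(3)
# Synthetic data: 95 clean points and 5 shifted outliers.
X = np.vstack([rng.normal(size=(95, 3)), rng.normal(loc=6.0, size=(5, 3))])
n, p = X.shape
h = (n + p + 1) // 2

subset = rng.choice(n, size=h, replace=False)  # a random starting h-subset
det_start = np.linalg.det(np.cov(X[subset], rowvar=False))

for _ in range(10):  # iterate C-steps until convergence (at most 10 here)
    new = c_step(X, subset, h)
    if set(new) == set(subset):
        break
    subset = new

det_final = np.linalg.det(np.cov(X[subset], rowvar=False))
print(det_final <= det_start)
```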
Example: Animals data set (continued)
In dimension p > 2, a scatterplot and tolerance ellipsoid cannot be drawn. To expose the differences between a classical and a robust analysis, a distance-distance plot can be made, plotting the robust distances RD_i against the classical Mahalanobis distances MD_i. Outlier detection based on the MCD correctly indicates the outliers.

DetMCD algorithm
A deterministic algorithm for the MCD [Hubert, Rousseeuw and Verdonck, 2010].
◮ Idea: compute several 'robust' h-subsets, based on
◮ robust transformations of the variables
◮ robust estimators of multivariate location and scatter.
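In practice one would rely on an existing FAST-MCD implementation. The distance-distance comparison above can be sketched with scikit-learn's MinCovDet (assuming scikit-learn is available; the data are synthetic, echoing the animals example, and note that sklearn's mahalanobis method returns squared distances):

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import EmpiricalCovariance, MinCovDet

rng = np.random.default_rng(4)
# A clean cloud plus a small cluster of gross outliers.
X = np.vstack([rng.normal(size=(50, 2)),
               rng.normal(loc=5.0, scale=0.3, size=(6, 2))])

emp = EmpiricalCovariance().fit(X)
mcd = MinCovDet(random_state=0).fit(X)

md = np.sqrt(emp.mahalanobis(X))   # classical Mahalanobis distances MD_i
rd = np.sqrt(mcd.mahalanobis(X))   # robust distances RD_i based on the MCD
cutoff = np.sqrt(chi2.ppf(0.975, df=X.shape[1]))

# A distance-distance plot would show rd against md; observations with
# rd above the cutoff are flagged as outliers.
flagged = rd > cutoff
print(flagged[-6:])  # the planted outlier cluster is flagged
```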