Moving Block Bootstrap for Kernel Smoothing in Trend Analysis*
Michele La Rocca, Cira Perna** Università degli Studi di Salerno, [email protected], [email protected]
Abstract: In this paper we propose the use of the moving block bootstrap for the construction of confidence intervals for the deterministic trend of a time series estimated by a kernel approach. The performance of the proposed procedure is evaluated through a simulation experiment.
Keywords: Bootstrap, Kernel smoothing, Time series, Trend analysis.
1. Introduction
Let $Y_1, \ldots, Y_n$ be a nonstationary time series with a trend characterised by a deterministic function. The series can be decomposed as a signal plus noise model:

$$Y_t = s(t) + \varepsilon_t, \qquad t = 1, 2, \ldots, n,$$

where $s(t)$ represents the trend and $\varepsilon_t$ is a stationary noise process with zero mean. Under the assumption that the trend is smooth, nonparametric techniques, such as kernel smoothing, can be used to estimate the function $s(\cdot)$. For the construction of confidence intervals the asymptotic normal theory is well developed, but greater accuracy can be achieved by resampling techniques. Bootstrap approximations to the limiting distribution of kernel smoothers have been successfully applied when $\varepsilon_t$ is a white noise process (Härdle and Marron, 1991). More recently, for the case in which $\varepsilon_t$ admits an AR($\infty$) representation, Bühlmann (1998) proposed a sieve bootstrap approach. If the noise process is strongly nonlinear (i.e. it does not admit an AR($\infty$) representation) this method is not asymptotically consistent and the moving block bootstrap (MBB) is superior (Bühlmann, 1999).

The aim of this paper is to propose the use of the MBB for constructing confidence intervals for $s(\cdot)$. The approach has the advantage of being very general, since it only assumes some mixing conditions on the stationary noise process and no linear representation is required. This bootstrap procedure is particularly useful when it is difficult, as in the case of kernel smoothing, to identify the structure of the noise process.

The paper is organised as follows. In the next section we focus on the use of kernel smoothers for trend estimation and report some asymptotic results. In section 3 we describe and discuss the MBB technique in trend analysis for the construction of pointwise confidence intervals. Finally, in section 4 we report some results of a simulation study in which we compare the MBB with the asymptotic normal theory.

* Paper supported by MURST 98 "Modelli statistici per l'analisi delle serie temporali".
** The work is the joint responsibility of the two authors; C. Perna wrote sections 1 and 2, M. La Rocca wrote sections 3 and 4.
2. Kernel smoothers for trend analysis
Let $s(t) = m_0(t/n)$, where $m_0$ is a smooth real function on $[0, 1]$. We consider the Nadaraya-Watson kernel smoother defined as:

$$\hat{m}(x) = m(x; h, Y) = (nh)^{-1} \sum_{t=1}^{n} K\!\left(\frac{x - t/n}{h}\right) Y_t .$$
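As a concrete illustration, the following minimal Python sketch evaluates this estimator at a single point. The Gaussian kernel (anticipating the Normal kernel used in the simulations of section 4), the function name `kernel_smooth` and the toy data are our own choices, not part of the paper.

```python
import numpy as np

def kernel_smooth(y, x, h):
    """m(x; h, Y) = (nh)^{-1} sum_t K((x - t/n)/h) Y_t with a Gaussian kernel K."""
    n = len(y)
    t = np.arange(1, n + 1) / n                            # equispaced design points t/n
    u = (x - t) / h
    weights = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)     # Gaussian kernel values
    return weights @ y / (n * h)

# toy usage: a noisy linear trend, smoothed at x = 0.5
rng = np.random.default_rng(0)
n = 140
y = 2.0 - 5.0 * np.arange(1, n + 1) / n + rng.normal(scale=0.5, size=n)
print(kernel_smooth(y, x=0.5, h=0.1))
```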
The kernel function $K$ is a symmetric probability density and the bandwidth $h$ is such that $h = O(n^{-1/5})$. Under some regularity conditions (Robinson, 1983) we have

$$(nh)^{1/2}\left(\hat{m}(x) - m_0(x)\right) \xrightarrow{\ d\ } N\!\left(\beta(x), \sigma^2\right),$$

where $\beta(x) = \lim_{n\to\infty} (nh)^{1/2} b(x)$ and $\sigma^2 = \sigma_\varepsilon^2 \int K^2(x)\,dx$, with $\sigma_\varepsilon^2$ the variance of the noise process and $b(x) = \tfrac{1}{2} h^2 m_0''(x) \int x^2 K(x)\,dx$.

An approximate confidence interval of nominal level $1-\alpha$ based on this limiting normal distribution is $\hat{m}(x) \pm \hat{\sigma}_n(x) z_{\alpha/2}$, where $z_\alpha$ is the quantile of order $\alpha$ of the standard normal distribution and $\hat{\sigma}_n^2(x)$ is an estimate of $\operatorname{Var}[\hat{m}(x)]$. A bias corrected confidence interval is given by $\hat{m}(x) - \hat{b}(x) \pm \hat{\sigma}_n(x) z_{\alpha/2}$, where $\hat{b}(x)$ is an estimate of the bias $b(x)$.

An alternative approach can be based on bootstrap schemes. They offer the advantage of higher order accuracy with respect to the asymptotic normal approximation and they are also able to correct for the bias of kernel smoothers (Härdle and Marron, 1991). Here we propose to use the MBB scheme for its wider range of applications: it gives consistent procedures under very general and minimal conditions. Moreover, it is a genuine nonparametric bootstrap method, which seems the best choice when dealing with nonparametric estimates. In our context, no specific and explicit structure for the noise has to be assumed. This is particularly useful in the kernel setting, where the specification of the smoothing parameter can heavily affect the structure of the residuals.
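A hedged sketch of the normal-approximation interval (without bias correction) is given below. The Gaussian kernel, the residual-based estimate of the noise variance on a trimmed region, and all function names are our own illustrative choices rather than the paper's prescriptions.

```python
import numpy as np
from statistics import NormalDist

def kernel_smooth(y, x, h):
    """m(x; h, Y) with a Gaussian kernel, as in the previous sketch."""
    n = len(y)
    t = np.arange(1, n + 1) / n
    u = (x - t) / h
    return (np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)) @ y / (n * h)

def normal_ci(y, x, h, alpha=0.10, delta=0.1):
    """Interval m_hat(x) +/- sigma_hat_n(x) * z, with z the (1 - alpha/2) normal quantile.
    The variance estimate follows sigma^2 = sigma_eps^2 * int K^2, so Var(m_hat) is
    approximated by sigma_eps^2 * int K^2 / (n h); no bias correction is applied."""
    n = len(y)
    m_hat = kernel_smooth(y, x, h)
    # noise variance estimated from residuals on the interior region [delta, 1 - delta]
    interior = np.arange(int(n * delta) + 1, int(n * (1 - delta)) + 1)
    fitted = np.array([kernel_smooth(y, t / n, h) for t in interior])
    sigma2_eps = np.var(y[interior - 1] - fitted)
    int_K2 = 1.0 / (2.0 * np.sqrt(np.pi))     # int K(u)^2 du for the Gaussian kernel
    se = np.sqrt(sigma2_eps * int_K2 / (n * h))
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return m_hat - z * se, m_hat + z * se
```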
3. The moving block bootstrap procedure
Let $Y = (Y_1, \ldots, Y_n)$ be the observed time series. The bootstrap procedure runs as follows.

Step 1. Fix a pilot bandwidth $\tilde{h}$ and compute the estimates $\tilde{s}(t) = m(t/n; \tilde{h}, Y)$ for $t \in \mathcal{T}_n = \{n\delta + 1, \ldots, (1-\delta)n\}$, with $0 < \delta < 0.5$. Here the kernel smoother is used only in the region $[\delta, 1-\delta]$ to avoid the edge effects typical of kernel estimators. A pilot bandwidth $\tilde{h}$ is necessary for a successful approximation of the limiting distribution of $\hat{m}(x)$, which requires estimates of the asymptotic bias as well. Explicit estimates of this quantity can be avoided by over-smoothing, that is by choosing a pilot bandwidth of larger order than $n^{-1/5}$, i.e. $\tilde{h} \gg n^{-1/5}$.

Step 2. Compute the residuals $\tilde{\varepsilon}_t = Y_t - \tilde{s}(t)$ for $t \in \mathcal{T}_n$.

Step 3. Fix a block length $l \ll \operatorname{card}(\mathcal{T}_n)$ and form the overlapping blocks of $l$ consecutive residuals $B_i = (\tilde{\varepsilon}_{n\delta + i}, \ldots, \tilde{\varepsilon}_{n\delta + i + l - 1})$, $i = 1, 2, \ldots, \operatorname{card}(\mathcal{T}_n) - l + 1$. Let $B^*_1, B^*_2, \ldots, B^*_p$ be iid draws from $\{B_1, B_2, \ldots, B_{\operatorname{card}(\mathcal{T}_n) - l + 1}\}$ and construct the bootstrap residuals $(\varepsilon^*_{n\delta+1}, \ldots, \varepsilon^*_{(1-\delta)n})$ by concatenating the resampled blocks. Generate the bootstrap observations by setting $Y^*_t = \tilde{s}(t) + \varepsilon^*_t$ for $t \in \mathcal{T}_n$. If $\operatorname{card}(\mathcal{T}_n)$ is a multiple of $l$ then $p = \operatorname{card}(\mathcal{T}_n)/l$, otherwise $p = \lfloor \operatorname{card}(\mathcal{T}_n)/l \rfloor + 1$ and only a portion of the $p$-th block is used.

Step 4. Compute $m^*(x) = m(x; h, Y^*)$.

Step 5. Approximate the distribution of $\hat{m}(x) - m_0(x)$ with the bootstrap distribution of $m^*(x) - \tilde{m}(x)$, where $\tilde{m}(x) = m(x; \tilde{h}, Y)$.

As usual, the bootstrap distribution can be approximated through a Monte Carlo approach by repeating steps 3-4 $B$ times. The empirical distribution function of these $B$ replicates can be used to approximate the bootstrap distribution of $\hat{m}(x)$.

Based on the non pivotal quantity $\hat{m}(x) - m_0(x)$, an approximate bootstrap confidence interval of nominal level $1-\alpha$ is given by $\left[\hat{m}(x) - \hat{c}_{1-\alpha/2},\ \hat{m}(x) - \hat{c}_{\alpha/2}\right]$, where $\hat{c}_\alpha$ is the quantile of order $\alpha$ of the bootstrap distribution.
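To make Steps 1-5 and the resulting interval concrete, here is a minimal Python sketch of the procedure. It assumes a Gaussian kernel, keeps the observations outside the trimmed region unchanged when refitting, and uses illustrative defaults for $\delta$, the block length and $B$; none of these specific choices, nor the function names, come from the paper.

```python
import numpy as np

def kernel_smooth(y, x, h):
    """m(x; h, Y) with a Gaussian kernel (our choice of K)."""
    n = len(y)
    t = np.arange(1, n + 1) / n
    u = (x - t) / h
    return (np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)) @ y / (n * h)

def mbb_ci(y, x, h, h_pilot, block_len, alpha=0.10, delta=0.1, B=999, seed=0):
    """MBB confidence interval for m_0(x) following Steps 1-5 and the basic
    (non pivotal) interval [m_hat - c_{1-alpha/2}, m_hat - c_{alpha/2}]."""
    rng = np.random.default_rng(seed)
    n = len(y)
    # T_n: interior time points n*delta + 1, ..., (1 - delta)*n (1-based)
    idx = np.arange(int(np.ceil(n * delta)) + 1, int(np.floor(n * (1 - delta))) + 1)
    grid = idx / n
    N = len(idx)

    # Step 1: over-smoothed pilot estimate of the trend on the interior region
    s_tilde = np.array([kernel_smooth(y, xi, h_pilot) for xi in grid])
    # Step 2: residuals on the interior region
    eps = y[idx - 1] - s_tilde

    # Step 3: all overlapping blocks of block_len consecutive residuals
    blocks = np.array([eps[i:i + block_len] for i in range(N - block_len + 1)])
    p = int(np.ceil(N / block_len))            # number of blocks per bootstrap replicate

    m_hat = kernel_smooth(y, x, h)             # estimate from the original series
    m_tilde = kernel_smooth(y, x, h_pilot)     # centring value of Step 5

    boot = np.empty(B)
    for b in range(B):
        chosen = rng.integers(0, len(blocks), size=p)      # iid block draws
        eps_star = blocks[chosen].ravel()[:N]              # keep only the first N values
        y_star = y.copy()                                  # outside T_n: original data (a simplification)
        y_star[idx - 1] = s_tilde + eps_star               # bootstrap observations on T_n
        boot[b] = kernel_smooth(y_star, x, h) - m_tilde    # Steps 4-5: centred replicate

    c_lo, c_hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return m_hat - c_hi, m_hat - c_lo
```

As discussed in Step 1, the pilot bandwidth passed as `h_pilot` should over-smooth, i.e. be of larger order than $n^{-1/5}$.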
4. Monte Carlo results and some concluding remarks
To study the characteristics of the proposed moving block procedure in terms of coverage probabilities, and to compare it with the classical asymptotic normal approximations, a small Monte Carlo experiment was performed. We considered the same trend model $m_0(x) = 2 - 5x + 5\exp\{-100(x - 0.5)^2\}$, with $x \in [0, 1]$, as in Bühlmann (1998). As noise structures we used an ARMA(1,1) and an EXPAR(2) with iid innovations distributed as a Student-t with 6 degrees of freedom, scaled so that $\operatorname{Var}(\varepsilon_t) = 1$ (a data-generating sketch is given below). All the simulations are based on 1000 Monte Carlo runs and 999 bootstrap replicates. We fixed $n = 140$ and $1 - \alpha = 0.90$. The coverage probability is measured at $x = 1/2$, the peak of the trend function. We considered a Normal kernel function with smoothing parameters $h$ and $\tilde{h}$ varying according to Table 1.

In all the cases the bootstrap outperforms the normal based procedure, and in most cases it outperforms the normal based procedure with bias correction as well (see Table 2). The specification of the smoothing parameter is crucial both for the bootstrap based confidence intervals and for the normal based ones, although the bootstrap method seems to be overall less sensitive to it. Moreover, a proper choice of the smoothing parameter seems to be more important than that of the block length in the moving block procedure. Our numerical results confirm that the pilot bandwidth $\tilde{h}$ used in the MBB should be larger than $h$.
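For reference, here is a sketch of how one Monte Carlo series could be generated under the ARMA(1,1) design. The ARMA coefficients and the burn-in length are illustrative assumptions, since the actual parameter values are not reported in this section; the function name is ours.

```python
import numpy as np

def simulate_series(n=140, phi=0.5, theta=0.5, df=6, seed=0):
    """Trend of the experiment plus ARMA(1,1) noise with Student-t(6) innovations,
    rescaled so that Var(eps_t) = 1; phi and theta are illustrative values only."""
    rng = np.random.default_rng(seed)
    x = np.arange(1, n + 1) / n
    trend = 2.0 - 5.0 * x + 5.0 * np.exp(-100.0 * (x - 0.5) ** 2)

    # iid Student-t innovations standardised to unit variance
    z = rng.standard_t(df, size=n + 100) / np.sqrt(df / (df - 2))
    # ARMA(1,1) recursion with a burn-in of 100 observations
    e = np.zeros(n + 100)
    for t in range(1, n + 100):
        e[t] = phi * e[t - 1] + z[t] + theta * z[t - 1]
    # rescale by the theoretical ARMA(1,1) standard deviation so that Var(eps_t) = 1
    sd = np.sqrt((1 + 2 * phi * theta + theta**2) / (1 - phi**2))
    eps = e[100:] / sd
    return x, trend + eps, trend

x, y, trend = simulate_series()
```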
Table 1. Bandwidth configurations: $h = k_1 \cdot 0.044\, n^{-1/5}$ and $\tilde{h} = h$ or $\tilde{h} = k_2 h^{5/9}$.

             C1    C2    C3    C4    C5    C6    C7    C8    C9    C10   C11   C12
h            .016  .016  .016  .016  .033  .033  .033  .033  .066  .066  .066  .066
h̃ (pilot)    .016  .051  .102  .204  .033  .075  .150  .300  .066  .110  .220  .440
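As a quick check, the bandwidth values in Table 1 can be reproduced (up to rounding) from the formulas in the caption with $k_1 \in \{1, 2, 4\}$ and $k_2 \in \{0.5, 1, 2\}$ read off the table; the short snippet below is ours.

```python
n = 140
for k1 in (1, 2, 4):
    h = k1 * 0.044 * n ** (-1 / 5)
    pilots = [h] + [k2 * h ** (5 / 9) for k2 in (0.5, 1, 2)]
    print(f"h = {h:.3f}  pilot bandwidths:", " ".join(f"{p:.3f}" for p in pilots))
```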
Table 2. Pointwise empirical coverage at $x = 0.5$. Noise models: ARMA(1,1) and EXPAR(2) with innovations distributed as Student-t with 6 degrees of freedom. N: normal based confidence interval; NBC: bias corrected normal based confidence interval; BT: MBB confidence interval.

ARMA(1,1)
       l    C1    C2    C3    C4    C5    C6    C7    C8    C9    C10   C11   C12
N           .313   -     -     -   .453   -     -     -   .082   -     -     -
NBC         .140  .316  .310  .307  .385  .500  .512  .470  .568  .582  .284  .115
BT     2    .458  .670  .850  .951  .461  .607  .751  .830  .160  .086  .089  .068
       4    .411  .719  .913  .975  .455  .703  .829  .937  .198  .144  .194  .217
       6    .406  .723  .924  .977  .450  .728  .866  .963  .222  .192  .297  .377
       8    .296  .804  .948  .988  .449  .750  .886  .968  .239  .213  .358  .497
       10   .404  .728  .923  .973  .444  .739  .891  .965  .251  .242  .417  .596
       12   .398  .731  .925  .977  .443  .745  .886  .970  .250  .259  .449  .672

EXPAR(2)
       l    C1    C2    C3    C4    C5    C6    C7    C8    C9    C10   C11   C12
N           .174   -     -     -   .508   -     -     -   .164   -     -     -
NBC         .155  .153  .177  .172  .278  .424  .495  .516  .451  .614  .393  .208
BT     2    .337  .633  .852  .954  .468  .666  .756  .805  .313  .195  .180  .145
       4    .277  .697  .931  .989  .461  .730  .824  .898  .357  .274  .320  .324
       6    .267  .704  .931  .991  .448  .748  .857  .948  .372  .334  .424  .464
       8    .181  .779  .959  .997  .426  .757  .882  .966  .379  .366  .480  .550
       10   .266  .706  .930  .990  .425  .757  .879  .965  .382  .386  .512  .619
       12   .270  .704  .937  .987  .430  .755  .883  .966  .379  .395  .535  .649
References
Bühlmann, P. (1998) Sieve Bootstrap for Smoothing in Nonstationary Time Series, The Annals of Statistics, 26, 48-83.
Bühlmann, P. (1999) Bootstrap for Time Series, Research Report n. 87, ETH, Zürich.
Härdle, W., Marron, J.S. (1991) Bootstrap Simultaneous Error Bars for Nonparametric Regression, The Annals of Statistics, 19, 778-796.
Robinson, P.M. (1983) Nonparametric Estimation for Time Series Analysis, Journal of Time Series Analysis, 4, 185-207.