Sieve Bootstrap-Based Prediction Intervals for GARCH Processes

SIEVE BOOTSTRAP-BASED PREDICTION INTERVALS FOR GARCH PROCESSES

by Garrett Tresch

A capstone project submitted in partial fulfillment of graduating from the Academic Honors Program at Ashland University, April 2015

Faculty Mentor: Dr. Maduka Rupasinghe, Assistant Professor of Mathematics
Additional Reader: Dr. Christopher Swanson, Professor of Mathematics

ABSTRACT

Time series analysis deals with observing a variable (interest rates, exchange rates, rainfall, etc.) at regular intervals of time. The main objectives of time series analysis are to understand the underlying processes and the effects of external variables in order to predict future values. Time series methodologies have wide applications in business fields where mathematical analysis is necessary. Generalized Autoregressive Conditional Heteroscedastic (GARCH) models are extensively used in finance and econometrics to model empirical time series in which the current variation of an observation, known as its volatility, depends upon past observations and past variations. Drawbacks of the existing methods for obtaining prediction intervals include the assumption that the orders associated with the GARCH process are known, and the heavy computational time involved in fitting numerous GARCH processes. This paper proposes a novel and computationally efficient method for the creation of future prediction intervals using the Sieve Bootstrap, a promising resampling procedure for Autoregressive Moving Average (ARMA) processes. This bootstrapping technique remains efficient when computing future prediction intervals for both the returns and the volatilities of GARCH processes, and it avoids extensive computation and parameter estimation. Both the included Monte Carlo simulation study and the exchange rate application demonstrate that the proposed method works very well under normally distributed errors.

Table of Contents
Abstract
Section 1: Introduction
Section 2: The Sieve Bootstrap Procedure
Section 3: The Simulation Study and Application
    Exchange Rate Case Study
Section 4: Conclusion
References
Author’s Biography

Introduction

[Figure 1: The Inflation-Adjusted Price of the S&P 500; a strong example of a time series]

The sieve bootstrap technique (referred to as SB throughout) was first proposed by Buhlmann (1997). This procedure utilizes the general concepts of bootstrapping, an estimation technique based on resampling in which data points are drawn at random from the original series, collected into another series, and then returned to the pool so that they can be drawn again. The main idea of this technique is that repeating this process many times provides insight into the underlying behavior of the process, as well as the range of variation the process could display, together with calculated likelihoods.
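As a simple illustration of this resampling idea, the following minimal sketch (not taken from the paper; the data, sample size, and number of repetitions are arbitrary) shows how the ordinary bootstrap approximates the sampling distribution of a statistic such as the mean by repeatedly drawing from the observed data with replacement:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical observed data (e.g., a series of returns); purely illustrative.
sample = rng.normal(loc=0.0, scale=1.0, size=200)

n_boot = 1000                      # number of bootstrap repetitions
boot_means = np.empty(n_boot)

for b in range(n_boot):
    # Draw a bootstrap sample of the same size, with replacement
    resample = rng.choice(sample, size=sample.size, replace=True)
    boot_means[b] = resample.mean()

# The spread of the recomputed statistic approximates its sampling variability
lower, upper = np.percentile(boot_means, [2.5, 97.5])
print(f"95% bootstrap interval for the mean: ({lower:.3f}, {upper:.3f})")
```

Each repetition mimics redrawing a sample from the underlying process, so the spread of the recomputed statistic reflects the variation the process could plausibly display.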
The sieve bootstrap is a variation on this process in which sieves, or collections of linearly weighted autoregressive processes, are used to approximate the underlying process and, similarly, to gather vital statistical information regarding the original data. Autoregressive processes are those in which the current value is a linearly weighted combination of previous values of the same series. These will be described in further detail later in this paper. To simulate models that depend on previous values, the data will be presented in the form of a time series, where time is the factor that determines order. A time series is a set of data in which the explanatory variable is a sequence of fixed time intervals that are paired with, and often analyzed against, one or more response variables. Thus, the analysis of time series can be considered a subset of data analysis and statistics that serves to explain the effects of the often elusive procession of time on one or many dependent variables. As can be imagined, there are countless applications for this approach including, but not limited to, the extensive study of financial markets (Figure 1).

To learn more about the behavior of these applications, regression analysis can be performed on the data sets. Using the information gained, models can then be fitted to explain the nuances of the particular series and its accompanying statistics. To allow for further understanding of the data at hand, an error component is often included within the general model. These error terms are calculated by differencing the actual time series with the fitted model estimates. These errors can be, and within this paper are, referred to as the residuals of the series.

As previously mentioned, the SB technique involves the sampling of the residuals of a fitted autoregressive, or AR(p), model in which p is the order, where it is assumed that p increases with the sample size n (Buhlmann, 1997). This order can be considered the maximum number of references back to previous data, or the largest time span into the past over which there is a linear impact on the present value. An autoregressive model is a stochastic process wherein future values can be constructed by using the model formulation as a recursion with estimated coefficients that weight previous data points. In a general sense, AR(p) refers to p previous values at different time lags, or units away from the current time, each having a different linear effect on current and future values. The order p represents a backshift of a total of p time lags; therefore, the model will contain p weighted coefficients. In practice, when an AR(p) model is fitted to a time series, the first p values are removed, since they have no previous terms to weight and, therefore, a model representation of these values cannot be created. Thus, residuals can be estimated for the (p+1)th through nth values and subsequently resampled via the bootstrap method.
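The sketch below illustrates, in simplified form, how these resampled residuals can be turned into a prediction interval. It conditions on a single AR(p) fit rather than refitting the autoregression to every bootstrap series, as the full sieve bootstrap procedure does, and the order p, forecast horizon, and number of bootstrap replicates are illustrative choices rather than values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ar(x, p):
    """Fit an AR(p) model by ordinary least squares.

    Returns the intercept, the p autoregressive coefficients, and the
    residuals for observations p+1, ..., n (the first p values have no
    complete set of lagged predictors and are dropped).
    """
    n = len(x)
    # Design matrix of lagged values: row for time t holds x[t-1], ..., x[t-p]
    X = np.column_stack([x[p - i - 1 : n - i - 1] for i in range(p)])
    X = np.column_stack([np.ones(n - p), X])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return coef[0], coef[1:], resid

def sieve_bootstrap_pi(x, p, h=1, n_boot=500, level=0.95):
    """Residual-resampling prediction interval for the value h steps ahead."""
    c, phi, resid = fit_ar(x, p)
    resid = resid - resid.mean()          # center the residuals before resampling
    forecasts = np.empty(n_boot)
    for b in range(n_boot):
        history = list(x[-p:])            # last p observed values
        for _ in range(h):
            eps = rng.choice(resid)       # bootstrap innovation
            new = c + np.dot(phi, history[::-1]) + eps
            history.append(new)
            history.pop(0)
        forecasts[b] = history[-1]
    alpha = (1 - level) / 2
    return np.percentile(forecasts, [100 * alpha, 100 * (1 - alpha)])

# Illustrative use on a simulated AR(2) series
x = np.zeros(300)
for t in range(2, 300):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
print(sieve_bootstrap_pi(x, p=6, h=5))
```

Centering the residuals before resampling keeps the bootstrap innovations mean-zero, mirroring the assumption on the true errors, and because only residuals are resampled the cost grows only linearly in the number of bootstrap replicates.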
In previous, yet fairly recent, articles ranging from 2002 to 2004, Alonso et al. discuss obtaining prediction intervals using the SB method and an underlying ARMA process.¹ From this point, the method has consistently been improved in a variety of studies, from the inclusion of an inflation factor for the prediction intervals (Mukhopadhyay and Samaranayake, 2010) to extensions to other model structures, including Fractionally Integrated Autoregressive Moving Average (FARIMA) models (Rupasinghe and Samaranayake, 2012-2014). In recent years the SB has been extended to the study of Autoregressive Conditional Heteroscedastic (ARCH) and Generalized Autoregressive Conditional Heteroscedastic (GARCH) processes, the primary models of interest in this study, which will be examined further in the following sections.

All in all, the SB resampling technique has been applied to many models due to its lack of dependence on the underlying structure. Regardless of whether the fitted model follows an AR, MA, ARMA, or FARIMA structure, a new AR model is always fitted, making the actual order, structure, and error distribution of each of these processes unimportant for the creation of the distribution of statistics. It is also worth mentioning that the sieve bootstrap remains a rather cost-effective method with low computation time. Perhaps, in today’s fast-paced and technology-heavy world, this is the most appealing of all its qualities.

¹ An ARMA(p, q) process utilizes the linear dependence on previous values described for the AR process, but also incorporates a moving average component in which the data can depend on previous errors that are independently and identically distributed. The formulation is
$$X_t = \sum_{i=1}^{p} \phi_i X_{t-i} + \sum_{j=1}^{q} \theta_j \varepsilon_{t-j} + \varepsilon_t,$$
where $X_t$ is the time series at time $t$, the $\phi_i$ and $\theta_j$ are calculated coefficients, and the $\varepsilon_{t-j}$ are the previous errors.

In a variety of different fields there are time series that display attributes of changing variance. For example, financial market data often contain periods of high and low volatility depending on the confidence of consumers, the state of world affairs, and various other influential factors. To account for this, ARCH models were theorized by Engle (1982), in which changing variance is taken into account by treating volatility as a linear function of past squared returns. This method was later expanded to GARCH structures by Bollerslev (1986) by adding a linear moving average component in which previous volatilities provide a basis for present and future values of the series. In general notation, a time series $\{X_t\}$ is said to be a GARCH(p, q) process if it serves as a solution to the following equations:
$$X_t = \sigma_t \varepsilon_t \tag{1}$$
and
$$\sigma_t^2 = \omega + \sum_{i=1}^{q} \alpha_i X_{t-i}^2 + \sum_{j=1}^{p} \beta_j \sigma_{t-j}^2, \tag{2}$$
where $\{\varepsilon_t\}$ is a sequence of independent, identically distributed (i.i.d.) random variables with a mean of zero and unit variance; the volatility process $\{\sigma_t\}$ is a stochastic process that is assumed to be independent of $\{\varepsilon_t\}$; and $\omega$, the $\alpha_i$, and the $\beta_j$ are unknown parameters satisfying
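To make equations (1) and (2) concrete, the following minimal sketch simulates a GARCH(1, 1) process with standard normal errors. The parameter values (omega = 0.05, alpha = 0.1, beta = 0.85) are illustrative assumptions, not values taken from the paper's simulation study.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_garch_1_1(n, omega=0.05, alpha=0.1, beta=0.85, burn_in=500):
    """Simulate a GARCH(1,1) series following equations (1)-(2):

    X_t = sigma_t * eps_t,
    sigma_t^2 = omega + alpha * X_{t-1}^2 + beta * sigma_{t-1}^2,

    with eps_t i.i.d. standard normal. Parameter values are illustrative only.
    """
    total = n + burn_in
    x = np.zeros(total)
    sigma2 = np.zeros(total)
    sigma2[0] = omega / (1.0 - alpha - beta)   # unconditional variance as a start value
    x[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, total):
        sigma2[t] = omega + alpha * x[t - 1] ** 2 + beta * sigma2[t - 1]
        x[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return x[burn_in:], sigma2[burn_in:]       # discard burn-in so start values wash out

returns, vol2 = simulate_garch_1_1(1000)
print(returns.std(), np.sqrt(vol2.mean()))
```

Since alpha + beta < 1 here, the simulated volatility fluctuates around a finite long-run level, producing the clusters of high and low variance described above.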
