Trend Analysis

TREND ANALYSIS
November 16, 2005 – San Francisco, CA
Jointly sponsored by the California Department of Health Services, Maternal and Child Health Branch; the Perinatal Advisory Council: Leadership, Advocacy, and Consultation (PAC/LAC); and the Family Health Outcomes Project, UCSF.

Family Health Outcomes Project Staff
– Geraldine Oliva, MD, MPH – Director
– Linda Remy, PhD – Senior Statistician
– Brianna Gass, MPH – MCAH Coordinator
– Nadia Thind, MPH – Research Associate
– Suzanne Ezrre – Web Administrator
– Mary Tran – Training and Meeting Coordinator

TODAY'S AGENDA
By the end of the training, participants should have the introductory skills needed for trend analysis. At the end of the training, the participant will be able to:
1. Describe, analyze, and interpret a trend
2. Interpret data from the new FHOP MCAH indicator spreadsheets
3. Identify trend results within and across jurisdictions

9:30 am – Coffee and Registration
10:00 am – Welcome and Introductions (Geraldine Oliva, MD, MPH)
10:15 am – Session 1. Trend Analysis for Local Public Health Surveillance: Basic Concepts and Added Value (Geraldine Oliva, MD, MPH)
11:30 am – Session 2. Title V Indicators: The FHOP Databooks (Linda Remy, PhD)
12:00 pm – Lunch
1:00 pm – Title V Indicators: The FHOP Databooks, continued (Linda Remy, PhD)
2:00 pm – Session 3. Working in Small Groups
3:30 pm – Wrap-Up and Conclusions

Trend Analysis for Local Public Health Surveillance: Basic Concepts and Added Value
Geraldine Oliva, MD, MPH, UCSF Family Health Outcomes Project – November 16, 2005

Today's Agenda
– Overview of current policies and thinking that generate an impetus for trend analysis
– Review of basic analytical and statistical concepts used in trend analysis
– Review of the new indicator databook methods and their utility
– Practice interpreting databook spreadsheets from counties and preparing an analysis

Today's Training Objectives
By the end of this training, participants should be able to:
– Describe why trend analysis is important
– Articulate the definitions of terms used in trend analysis
– Articulate the basic analytic methods used for a variety of situations
– Evaluate longitudinal trends in county-level MCAH indicators
– Interpret health disparity trends for different age and race/ethnic groups

Fiscal and Policy Changes Affecting MCAH
– National budget deficits are resulting in cuts to basic social and health services for poor families
– State Title V Block Grant funds will be reduced as of July 2005
– Some uses of local funds as matching funds for public health nursing functions are being challenged

New Expectations from Federal Funders
– The CDC has begun a major initiative on using evidence-based prevention strategies
– This requires that state and local governments either use an evidence-based model or conduct an evaluation that provides new evidence on a program's cost and effectiveness
– Federal agencies are increasingly interested in a program's impact on the smaller population it serves, rather than in the typical descriptive epidemiology that monitors health indicators in the entire population

New Emphases at California DHHS
– Increasing focus on planning at the local level
– Increased focus on local solutions
– Explosion of programs using GIS to examine local health issues

Challenges for Local Health Jurisdictions
– Determine where to allocate reduced resources
– Monitor the results of program cuts
– Advocate for local resources to sustain threatened programs
– Advocate at the state level for general fund monies to sustain programs

Analytic Implications
A longitudinal perspective is required at the local level:
– To assess the impact of changes in policy and funding among different populations
– To assess the impact of new programs as they are implemented among different populations
– To assess the impact as more or fewer clients are served over time
– To assess the impact of an identified risk or outcome

Statistical Implications
– To observe changes over time at the local level, techniques such as time series analysis or forecasting may be used
– Time series methods are also used when examining the impact of a risk factor on an outcome, e.g., insurance status and access indicators
– Trend analysis can also be used to control simultaneously for the overall trends in two measures when the focus is the association between the two factors rather than the trend itself (see the sketch below)
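A brief illustration of that last point: one simple way to control for the shared time trend in two measures is to include a time term alongside the risk factor in a regression. The sketch below is not part of the original training materials; the indicator names, the rates, and the use of the statsmodels library are illustrative assumptions.

```python
# Hedged sketch: association between an access indicator and insurance
# coverage while controlling for the overall time trend in both measures.
# All rates below are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "year": range(1996, 2006),
    # hypothetical % of births with first-trimester prenatal care
    "first_trimester_care": [74, 75, 77, 78, 80, 81, 81, 83, 84, 85],
    # hypothetical % of women of reproductive age with insurance
    "insured": [78, 79, 81, 83, 84, 85, 86, 88, 89, 90],
})

# Code time as 0, 1, 2, ... so the intercept is the rate at the start
# of the study period.
df["time"] = df["year"] - df["year"].min()

# The "time" term absorbs the shared upward trend, so the "insured"
# coefficient reflects the association between the two measures beyond
# what their common trends explain.
model = smf.ols("first_trimester_care ~ insured + time", data=df).fit()
print(model.params)      # intercept and coefficients for insured and time
print(model.pvalues)     # p-values for each term
print(model.conf_int())  # 95% confidence intervals
```

With only ten annual data points the estimates will be imprecise, which is one reason the sample-size issues discussed later in the training matter.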
Added Value of Trend Analysis
– The pattern of change in an indicator over time (e.g., gradual change, spikes, or curves) can provide information on the impact of exposures, new technology, or policy changes
– Allows comparison of trends among small geographic areas and regions
– Allows comparison of patterns among different population subgroups
– Can assist in making future projections

Definition of Terms for Regression Analysis for Trends
– Slope – the average rate of change over time, where zero = flat, (+) = an increasing trend, and (-) = a downward trend
– Intercept – the rate at the start of the study period, i.e., where the fitted line crosses the y-axis

[Figure: Slope example – fertility rate per 1,000 women aged 15-44, 1991-2000. Fitted trends: state y = 82.8 - 1.5x; county y = 70.3 - 0.75x.]

[Figure: Change in the intercept – percent low birth weight newborns (< 2,500 g), 1991-2000. Fitted trends: county y = 7.5 - 0.13x; state y = 5.9 + 0.04x.]

Definition of Terms for Regression Analysis for Trends (continued)
– P-value – the probability of obtaining a slope (or a difference between groups) at least as large as the one estimated if there were no true trend; used both for the significance of an indicator's trend and for differences from comparison groups
– Standard error – the measure of uncertainty in the estimated slope and intercept, used to construct confidence intervals around them
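These quantities come directly from a regression of the indicator's rate on time. The sketch below shows one way to obtain them; it is not from the training materials, the county rates are invented (loosely echoing the fertility example above), and the use of statsmodels is an assumption.

```python
# Hedged sketch: estimating the slope, intercept, standard errors, and
# p-values for a county-level trend.  The rates are invented.
import numpy as np
import statsmodels.api as sm

years = np.arange(1991, 2001)
rate = np.array([70.3, 69.8, 68.9, 68.5, 67.2, 66.8, 65.9, 65.5, 64.4, 63.8])

# Code time as 0..9 so the intercept is the rate at the start (1991).
X = sm.add_constant(years - years.min())
fit = sm.OLS(rate, X).fit()

print(fit.params)     # [intercept, slope]: starting rate and average annual change
print(fit.bse)        # standard errors of the intercept and slope
print(fit.pvalues)    # the slope's p-value tests whether the trend is flat
print(fit.conf_int()) # 95% confidence intervals for both terms
```

A slope whose confidence interval includes zero should be read as "no detectable trend," which happens often with ten or fewer annual points; that is the motivation for the sample-size discussion that follows.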
Factors Impacting Analysis
– Sample size and the number of time periods being examined
– Presence of outliers
– Availability of numerators and denominators over time
– Confounding – changes over time in factors related to the indicator of interest

Small Numbers and Data Points
– For large samples at the state, federal, and large-county level, simple linear plots describing changes are treated as essentially error free
– For the typical county, city, or district, unstable rates due to small numbers require tests of statistical significance and other statistical techniques
– The same small-numbers problems occur when looking at the individuals affected by a particular program or policy

Sample Size Issues
– Reaching the minimum number of events for a stable rate may require statistical techniques such as aggregating or averaging years
– Many years of data may be required to obtain the at least three usable data points needed for a trend analysis
– One size does not fit all: the approach differs between one jurisdiction and another, and between one indicator and another

[Example slides: Sample Size Issues – Raw Data; Sample Size Issues – Plot of Data; Small-Numbers Methods – Data Aggregation; Small-Numbers Methods – Data Averaging; Small-Numbers Methods – Regression Techniques.]

Presence of Outliers
Outliers can be the result of real events or of a statistical problem:
– When looking at communicable disease, there may be years with spikes due to a new flu strain
– When small numbers are used to generate rates, a spike may not reflect a real change
– In an area with a relatively small population, missing or inaccurate data may cause spikes that are the result of problems at the data-collection level

Availability of Numerators and Denominators
– Numerator issues arise where data come from surveys done only periodically, e.g., CHIS
– Denominator issues:
– Between census years, estimates must come from vendors who use multiple data sets to make projections
– Geographic boundaries may change, e.g., ZIP codes appear and disappear
– Intercensal samples for race/ethnic subgroups are often too small to generate reliable rates

Changes Over Time in Confounding Factors
– Policies affecting access to care can affect rates of service utilization, e.g., first-trimester prenatal care, teen pregnancy
– Socio-demographic changes in the population of a small area may be a main determinant of changes in an indicator
– Technological advances may drastically change outcomes, e.g., better survival of preterm infants

[Figure: Monitoring the Impact of Policy Changes – a 1983-2000 timeline of coverage expansions (MediCal expansions for infants and young children at rising percentages of the Federal Poverty Level, 1984-1989; CHDP; AIM at 200-300% FPL; Healthy Families at 100-200% FPL; SCHIP for children under 19 at 200% FPL) shown against payor changes (PPS and DRG implementation, MediCal and private-sector managed care reforms, employers dropping low-wage coverage and insurance) and legislative events (EMTALA, emergency services for the undocumented, recession, residency requirements, Prop 187, asset tests waived then dropped, Medicaid for immigrants, IRCA, welfare reform).]

[Figure: Likelihood of Hospital Admission by Race/Ethnicity, 1983-1998 (White, Hispanic, Black, Asian).]

[Figure: ACS rate per 10,000 by payor (public vs. private), 1983-1998.]

Selecting an Appropriate Statistical Strategy
– Define the purpose and context of the analysis, i.e., comparison groups or the dates of specific events or policy changes
– Are there enough cases to do a trend analysis? That is, should I manipulate the data (e.g., aggregate or average years) to do a trend, or should I use case studies instead? A small sketch of the aggregation approach follows.
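As a rough illustration of the data aggregation and averaging methods referred to above, the sketch below pools numerators and denominators over rolling three-year windows before computing rates. It is not from the training materials: the counts are invented and the three-year window is an assumption; the appropriate window depends on the jurisdiction and the indicator.

```python
# Hedged sketch: stabilizing small-number rates by pooling numerators and
# denominators over rolling 3-year windows before examining the trend.
# All counts below are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "year": range(1994, 2004),
    "events": [4, 7, 3, 9, 5, 6, 2, 8, 5, 4],                      # e.g., annual event counts
    "population": [950, 930, 960, 970, 940, 955, 965, 980, 975, 990],
})

# Single-year rates per 1,000 are unstable when events are this sparse.
df["rate"] = 1000 * df["events"] / df["population"]

# Pool the counts over a centered 3-year window, then recompute the rate.
# Pooling counts (rather than averaging rates) keeps years with larger
# populations properly weighted in the denominator.
df["events_3yr"] = df["events"].rolling(3, center=True).sum()
df["pop_3yr"] = df["population"].rolling(3, center=True).sum()
df["rate_3yr"] = 1000 * df["events_3yr"] / df["pop_3yr"]

print(df[["year", "rate", "rate_3yr"]].round(2))
```

Averaging the single-year rates is an alternative, and a regression fitted to the pooled rates is one way to reach the at-least-three usable data points the training calls for; whichever route is taken, the choice should be revisited for each jurisdiction and indicator.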