Forecasting Using Time Series Techniques
Forecasting a time series simply means predicting the future by understanding the past. A simple model is fitted to a given time series and then used to predict future behaviour.
For example, suppose you own a retail stationery business in LA and are concerned about consistent losses over the last 10 years. You have taken remedial actions to boost sales, including an increased advertising budget and off-season discounts. Even after implementing these strategies, you worry that the decline in sales will continue in the near future.
What should you do now?
You can apply time series analysis to the past 10 years' data for your stationery business and forecast the future pattern of sales based on past trends.
Time series forecasting involves the use of a model to predict future values (e.g. sales projections) based on previously observed data and patterns. It is used in statistics, weather forecasting, earthquake prediction, etc.
Time series are complex in nature because each observation is influenced by, or dependent upon, one or more previous observations. Random error is therefore also carried from one observation to the next. These influences are called autocorrelation, and extracting these autocorrelation elements from the data is the real challenge of time series analysis.
Time series data violate the assumption of independence of errors. As we saw earlier, Type I error rates increase substantially when autocorrelation is present. Because of this autocorrelation, we cannot use multiple linear regression techniques; instead, we use the other methods briefed below.
In this session, we will try to understand various forecasting techniques and analyse a case study involving forecasting so as to make the process more comprehensible.
We can forecast data using the following techniques:
• Simple Moving Average Method
• Weighted Moving Average Method
• Simple Exponential Smoothing Method
• Double Exponential Smoothing Method
• Triple Exponential Smoothing Method
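As a quick sketch of the first two techniques in the list, the pure-Python functions below forecast the next value of a series with a simple and a weighted moving average. The sales figures and weights are illustrative, not from the case study.

```python
# Sketch of simple and weighted moving average forecasts.
# The sales figures and weights below are made up for demonstration.

def simple_moving_average(series, window):
    """Forecast the next value as the mean of the last `window` observations."""
    return sum(series[-window:]) / window

def weighted_moving_average(series, weights):
    """Forecast the next value as a weighted mean of the most recent
    observations; `weights` are listed oldest-to-newest and should sum to 1."""
    recent = series[-len(weights):]
    return sum(w * y for w, y in zip(weights, recent))

sales = [120, 132, 128, 141, 150, 147]

print(simple_moving_average(sales, window=3))           # mean of last 3 periods
print(weighted_moving_average(sales, [0.2, 0.3, 0.5]))  # newest period weighted most
```

The weighted variant is often preferred when recent observations are believed to carry more information about the next period than older ones.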
The exponential smoothing methods mentioned above are in fact special cases of ARIMA models. Single exponential smoothing corresponds to an ARIMA(0,1,1) model; double exponential smoothing corresponds to an ARIMA(0,2,2) model; and triple exponential smoothing corresponds to an ARIMA(0,3,3) model.
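The single-smoothing case of this correspondence can be checked numerically. The sketch below (pure Python, with illustrative data and an illustrative alpha) runs the single exponential smoothing recursion alongside the ARIMA(0,1,1)-style error-correction recursion with MA coefficient theta = -(1 - alpha), and the two produce identical one-step forecasts.

```python
# Numeric check that single exponential smoothing and an ARIMA(0,1,1)-style
# error-correction recursion give identical one-step-ahead forecasts.
# The data series and alpha are illustrative choices.

def ses_forecasts(series, alpha, init):
    """One-step-ahead forecasts from single exponential smoothing."""
    level, out = init, []
    for y in series:
        out.append(level)                  # forecast made before seeing y
        level = alpha * y + (1 - alpha) * level
    return out

def arima011_forecasts(series, theta, init):
    """One-step forecasts via y_hat[t+1] = y[t] + theta * e[t]."""
    fcast, out = init, []
    for y in series:
        out.append(fcast)
        e = y - fcast                      # one-step forecast error
        fcast = y + theta * e
    return out

data = [10.0, 12.0, 11.5, 13.0, 12.5]
alpha = 0.3
a = ses_forecasts(data, alpha, init=data[0])
b = arima011_forecasts(data, theta=-(1 - alpha), init=data[0])
print(all(abs(x - y) < 1e-12 for x, y in zip(a, b)))  # True
```

Algebraically, y[t] + theta*e[t] with theta = -(1 - alpha) rearranges to alpha*y[t] + (1 - alpha)*y_hat[t], which is exactly the smoothing recursion.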
In exponential smoothing methods, a smoothing weight parameter is used. Generally, smaller smoothing weights are appropriate for series with a slowly changing trend, while larger weights are appropriate for volatile series with a rapidly changing trend. The weight of an observation is a geometric (exponential) function of the number of periods that the observation extends into the past relative to the current period.
The weight function is:

weight = w(1 − w)^(t − x)

where 'x' is the observation number of the past observation, 't' is the current observation number, and 'w' is the weighting constant.
The traditional smoothing methods are simple and computationally inexpensive. They were widely used in the 1960s, before ARIMA models were developed.
However, there is a caveat to the usage of the above techniques as briefed in the following section.
Nowadays, the focus is on the usage of ARIMA models as against the traditional smoothing methods because of the following reasons:
• Automatic computation of Smoothing Parameter
- In exponential smoothing forecasts, the weighting parameter is manually specified by the user rather than estimated from the data. In ARIMA modelling, by contrast, the optimal smoothing weight is automatically computed as the estimate of the moving-average parameter of the ARIMA model.
• ARIMA is a relatively more accurate model
- For double exponential smoothing, the ARIMA method computes the optimal pair of smoothing weights; for triple exponential smoothing, it computes the optimal three smoothing weights. Most implementations of the traditional exponential smoothing method instead use the same smoothing weight at each stage of smoothing, which is why ARIMA modelling is generally more accurate.
• Problem of setting starting smoothed value
- The ARIMA method takes care of the problem of choosing the starting smoothed value, setting it in a statistically optimal way.
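The first point above, estimating the smoothing weight from the data rather than specifying it by hand, can be sketched with a simple grid search that picks the alpha minimising one-step-ahead squared forecast error. This is only an illustration: ARIMA software estimates the parameter more formally, typically via maximum likelihood. The data series is illustrative.

```python
# Sketch of data-driven selection of the smoothing weight: choose the
# alpha that minimises the sum of squared one-step forecast errors.
# (ARIMA estimation does this more formally via maximum likelihood.)

def one_step_sse(series, alpha):
    """Sum of squared one-step forecast errors for a given alpha."""
    level, sse = series[0], 0.0
    for y in series[1:]:
        sse += (y - level) ** 2
        level = alpha * y + (1 - alpha) * level
    return sse

def best_alpha(series, grid_size=99):
    """Pick alpha from an evenly spaced grid of candidates in (0, 1)."""
    candidates = [(i + 1) / (grid_size + 1) for i in range(grid_size)]
    return min(candidates, key=lambda a: one_step_sse(series, a))

data = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]
alpha = best_alpha(data)
print(round(alpha, 2))
```

A coarse grid like this is cheap and transparent, though a likelihood-based optimiser will generally find a slightly better value.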