The appropriate methodology for determining how to make the variance of the
errors homogeneous (homoscedastic) requires a sequence of tests.
Example 1 ... A few outliers in your time series residuals can incorrectly
lead you to conclude that the variance of the residuals is non-constant. If
you incorporate intervention variables (dummy variables), these can often
yield a resultant error series with constant variance. Sometimes these
intervention variables are seasonal pulses and/or level shifts in the mean
of the residuals. The basis for this is Tiao's work on Intervention
Detection.
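A minimal sketch of the intervention-variable idea in Python with numpy. The break date, shift size, and simulated data below are my own hypothetical choices; Tiao-style intervention detection would locate the break automatically rather than assume it known:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
y = rng.normal(0.0, 1.0, n)   # well-behaved residuals...
y[60:] += 5.0                 # ...contaminated by a level shift at t = 60

# Step (level-shift) intervention dummy: 0 before the break, 1 after
step = (np.arange(n) >= 60).astype(float)
X = np.column_stack([np.ones(n), step])

# Regress the residuals on the intervention variable
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
adjusted = y - X @ beta

# The raw series looks badly behaved only because of the shift;
# once the intervention is removed, the variance is back near 1.0
print(y.var(), adjusted.var())
```

In practice the candidate break dates and pulse/step shapes are searched over, not assumed, and the same mechanics handle seasonal-pulse dummies.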
Example 2 ... Oftentimes there are a few discrete changes in variance, which
can be remedied by Generalized Least Squares, where a weighting
transformation is applied, yielding a set of errors with homogeneous
variance. The pioneering work of R. Tsay suggests how to detect these
discrete points in time.
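A sketch of the weighting transformation in Python with numpy. The variance break at t = 100 is assumed known here for brevity; Tsay's contribution is detecting such points:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
t = np.arange(n, dtype=float)
sigma = np.where(t < 100, 1.0, 3.0)   # hypothetical variance break at t = 100
y = 2.0 + 0.05 * t + rng.normal(0.0, 1.0, n) * sigma

X = np.column_stack([np.ones(n), t])

# First pass: ordinary least squares, then estimate each regime's error scale
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
r = y - X @ b_ols
s1, s2 = r[:100].std(), r[100:].std()

# Weighting transformation: divide each observation by its regime's scale
w = np.where(t < 100, 1.0 / s1, 1.0 / s2)
b_gls, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
r_w = (y - X @ b_gls) * w

# Cross-regime variance ratio: far from 1 before weighting, near 1 after
print(r[100:].var() / r[:100].var(), r_w[100:].var() / r_w[:100].var())
```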
Example 3 ... If the parameters of the model are time-varying, or differ
from one region of time to another, this can often lead to a mis-diagnosis
of non-constant variance. Validate the assumption of parameter constancy
with the Chow test. This will oftentimes yield parameters that produce
constant variance.
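The Chow test can be sketched in a few lines of numpy. The break date at t = 60 and the simulated slope change are hypothetical; the statistic compares one pooled fit against separate fits on each sub-sample:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120
t = np.arange(n, dtype=float)
# simulated data whose slope changes sign at the (assumed) break t = 60
y = np.where(t < 60, 1.0 + 0.1 * t, 7.0 - 0.1 * (t - 60)) + rng.normal(0.0, 0.5, n)

def rss(X, y):
    # residual sum of squares from a least-squares fit
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b
    return r @ r

X = np.column_stack([np.ones(n), t])
k = X.shape[1]                                  # parameters per regime
rss_pooled = rss(X, y)                          # one model over the whole sample
rss_split = rss(X[:60], y[:60]) + rss(X[60:], y[60:])

# Chow F statistic: large values reject constancy of the parameters
F = ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))
print(F)
```

Compare F against an F(k, n - 2k) critical value; a significant value says to model the regimes separately before concluding anything about the variance.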
Example 4 ... If the mean of the residuals is zero everywhere (or at least
not significantly different from zero), the parameters of the model are
roughly constant, and there are no evident structural break-points in the
variance of the residuals, then you might consider a Box-Cox test, which can
often suggest the appropriate power transformation (logs, reciprocals,
square roots, etc.) that, upon incorporation into the model, yields
residuals with apparently constant variance.
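The Box-Cox idea can be sketched by maximizing the profile log-likelihood over a grid of candidate powers. Everything below is a hand-rolled illustration on hypothetical lognormal data (for which the log, lambda = 0, is the "right" answer); library routines such as scipy.stats.boxcox do the same search:

```python
import numpy as np

def boxcox_loglik(x, lam):
    # Profile log-likelihood of the Box-Cox transform (requires x > 0);
    # lambda = 0 is the log, lambda = -1 the reciprocal, 0.5 the square root
    z = np.log(x) if abs(lam) < 1e-8 else (x ** lam - 1.0) / lam
    return -len(x) / 2.0 * np.log(z.var()) + (lam - 1.0) * np.log(x).sum()

rng = np.random.default_rng(3)
x = np.exp(rng.normal(0.0, 1.0, 500))   # hypothetical lognormal series

lams = np.linspace(-2.0, 2.0, 81)
best = lams[np.argmax([boxcox_loglik(x, lam) for lam in lams])]
print(best)
```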
Example 5 ... If, after testing all of the above, one takes the residuals
and squares them, one can then build an ARIMA model for these squared
residuals. This suggests a case where the variance of the residuals is
itself a random variable. This is often referred to as a GARCH model.
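The squared-residual diagnostic of Example 5 can be sketched with a simulated ARCH(1) series. The coefficients below are hypothetical, and Engle's LM test or a full GARCH fit would be the formal next step:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
a0, a1 = 0.2, 0.5        # hypothetical ARCH(1) coefficients
e = np.zeros(n)
for i in range(1, n):
    sigma2 = a0 + a1 * e[i - 1] ** 2   # today's variance depends on yesterday's shock
    e[i] = np.sqrt(sigma2) * rng.normal()

# The squared residuals behave like an AR process: check lag-1 autocorrelation
e2 = e ** 2
c = e2 - e2.mean()
rho1 = (c[1:] * c[:-1]).sum() / (c * c).sum()
print(rho1)   # markedly positive autocorrelation points to ARCH/GARCH effects
```

The residuals themselves are serially uncorrelated; it is only their squares that carry the AR structure, which is exactly the signature that motivates modeling the variance.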
The literature is full of unnecessary power transformations that could have
been more appropriately dealt with by the methods of Examples 1, 2, and 3.
For more on this subject, please join me in NYC in June at the 27th Annual
ISF Conference, where I will be delivering a 3-hour pre-conference seminar
on EXACTLY this topic.
Regards
Dave Reilly
Automatic Forecasting Systems
http://www.autobox.com
215-675-0652 in the colonies
-----Original Message-----
From: A UK-based worldwide e-mail broadcast system mailing list
[mailto:[log in to unmask]] On Behalf Of Leo Guelman
Sent: Monday, May 07, 2007 3:26 PM
To: [log in to unmask]
Subject: Time Series - Query
Hi,
Can anyone refer me to the right methodology for selecting the appropriate
transformation for a time series (such as the logarithmic transformation)?
When should I apply logs to the data in a time series context?
In some cases I saw that a simple difference, delta Xt, is used to achieve
stationarity in the series, but sometimes also log Xt - log Xt-1?
Thanks for any comments.
Regards,
Leo.