I would appreciate it if someone could help me with this
minor (?!) statistical problem.
I have two time series, A and B. When the two series
are plotted over time using two Y axes (one for each
variable), the fit is close to perfect. However, when
they are plotted on a single axis, the fit is not as
good. How can I adjust the time series mathematically
so that the good fit is captured in a regression
analysis?
The axis for time series A spans 150 to -150, and the
axis for time series B spans -45 to 35, where:
  0 on A's Y axis corresponds to -5 on B's Y axis,
  150 on A's Y axis corresponds to -45 on B's Y axis,
  -150 on A's Y axis corresponds to 35 on B's Y axis.
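The three correspondences above define an affine map between the two axes, so one way to put both series on a common scale is to rescale A onto B's axis before regressing. A minimal sketch (assuming Python with NumPy; the function name `a_to_b_scale` is just illustrative):

```python
import numpy as np

# The stated anchor points on A's axis and their B-axis equivalents:
#   A = 150  -> B = -45
#   A = -150 -> B = 35
# These define an affine map B = slope * A + intercept.
slope = (35 - (-45)) / (-150 - 150)   # = -4/15
intercept = -45 - slope * 150         # = -5, matching "0 on A equals -5 on B"

def a_to_b_scale(a):
    """Rescale values from A's Y axis onto B's Y axis."""
    return slope * np.asarray(a, dtype=float) + intercept

# Example: the three anchor points map onto roughly -45, -5, and 35.
rescaled = a_to_b_scale([150, 0, -150])
```

After this rescaling, A and B live on the same axis, and an ordinary regression of B on the rescaled A should reflect the visual fit seen with two axes.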
Best Regards,
Elsa Larsson