Steve:
> I can get a good fit with any regression model by
> simply exhausting the degrees of freedom.
Yeah, no doubt. The problem is that the significance of those added variables
would undoubtedly drop so low as to make them entirely redundant. In any of
the multiple regression work that I have done, at the very best I have only
been able to obtain perhaps 5 independent variables with significance
great enough to make the regression useful for anything. Adding more
variables simply does not work unless they are truly independent. In
forestry there are relatively few variables that are truly independent of
the others: crown closure is a function of density, density is a function
of age and of light and moisture, tree height is a function of age, etc.
This is why redundancy, that is, piling more variables into a multiple
regression, does not increase the significance.
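The forestry point can be sketched numerically. In this hypothetical Python/NumPy
example (made-up numbers, not real forestry data), "density" is constructed to be
largely a function of "age"; adding the redundant variable inflates the standard
error and shrinks the t-statistic of the predictor that actually matters:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: density is largely a function of age, so the two
# predictors are highly correlated (illustrative numbers only).
age = rng.uniform(10, 100, n)
density = 0.8 * age + rng.normal(0, 2, n)
height = 0.5 * age + rng.normal(0, 5, n)  # height truly depends on age alone

def t_stats(X, y):
    """OLS with an intercept; returns the t-statistic of each coefficient."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta / se

t_age_alone = t_stats(age[:, None], height)[1]
t_age_with_density = t_stats(np.column_stack([age, density]), height)[1]
# age's t-statistic shrinks sharply once the redundant variable enters
print(t_age_alone, t_age_with_density)
```

The fit barely improves, but the collinearity spreads the same explanatory
power across two coefficients, so neither looks significant on its own.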
How do you exhaust the degrees of freedom? That is a new expression for me.
Using up degrees of freedom does not reduce the variation in the sample.
> That is, if I keep putting in
> enough variables (especially lags of the dependent variable) I can get a
> great fit to the data. However, there is nothing to ensure the
> predictive power of the model.
>
>Steve
>
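Steve's overfitting point can be seen in a small simulation (Python/NumPy,
made-up numbers): in-sample R-squared keeps climbing as regressors are added,
while the fit to fresh data from the same process does not. Polynomial terms
of one variable stand in here for the extra variables or lags:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)      # true model uses one variable
y_new = 2.0 * x + rng.normal(size=n)  # fresh draw for a predictive check

def r2(y_obs, y_hat):
    """Coefficient of determination."""
    ss_res = np.sum((y_obs - y_hat) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

results = {}
for k in (1, 5, 20):
    # k polynomial terms of x stand in for piling on extra regressors
    X = np.column_stack([np.ones(n)] + [x ** p for p in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    results[k] = (r2(y, X @ beta), r2(y_new, X @ beta))
    print(k, results[k])
```

With 20 terms and 30 observations the in-sample fit is nearly perfect, yet the
model is fitting the noise, so it predicts the fresh data worse, which is
exactly why a good fit alone says nothing about predictive power.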