A colleague using multiple logistic regression would like to have:
(1) an overall measure of the explanatory power of the model, analogous
to the proportion of variance explained (R squared) in linear regression, and
(2) a way to compare the contributions of two independent variables when
both (and possibly other variables as well) are in the model, analogous to
incremental R squared in linear regression.
SPSS produces a statistic called R, but the formula for it seems
unreasonable to me because R can be zero even when the logistic regression
coefficient is nonzero. Let U = (W - 2K)/(-2L), where K is the degrees of
freedom for the (categorical or continuous) independent variable, W is its
Wald statistic (a chi-square statistic used for significance testing), and
L is the log likelihood of a model with only an intercept. Then R is plus
or minus the square root of U if U >= 0, and (here is what I don't like)
zero if U < 0.
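To make the objection concrete, here is a minimal sketch of the formula as described above (function and variable names are mine, and the sign of R would in practice be taken from the coefficient; this returns the magnitude only):

```python
import math

def spss_r(wald, df, loglik_intercept):
    """Compute |R| for one predictor, per the formula described above.

    wald             -- W, the predictor's Wald chi-square statistic
    df               -- K, degrees of freedom for the predictor
    loglik_intercept -- L, log likelihood of the intercept-only model
                        (negative, so -2L is positive)
    """
    u = (wald - 2 * df) / (-2 * loglik_intercept)
    # Here is the objectionable part: U < 0 is truncated to R = 0,
    # even though the predictor's coefficient may be nonzero.
    return math.sqrt(u) if u >= 0 else 0.0

# Any predictor with W < 2K gets R = 0 regardless of its coefficient:
print(spss_r(wald=1.5, df=1, loglik_intercept=-100.0))  # U < 0, so R = 0.0
print(spss_r(wald=8.0, df=1, loglik_intercept=-100.0))  # U > 0, R > 0
```

So whenever the Wald statistic falls below twice its degrees of freedom, R is reported as exactly zero, which is the behavior I find unreasonable.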
Any suggestions or illuminating comments would be appreciated.
T. Robert Harris [log in to unmask]
Department of Mathematics
University of North Dakota
Grand Forks ND 58202-8376 701-777-2427
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%