Hi Erika
On 15 December 2012 09:34, Erika Baker <[log in to unmask]> wrote:
> Hi all,
>
> I am going round in circles with this and would really appreciate any insight!
>
> I am running a logistic regression with 7 IVs, N = 214. The model is significant and the R2 is .45. However, the individual variables show some odd findings and I've not been able to work out a solution...
>
Don't use R2 in logistic regression, and if you do, don't call it R2.
If you must have some equivalent, calculate the c-statistic. (There
are at least four versions of R2 for logistic regression; they're all
different, and none of them is a proportion of variance explained.)
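If it helps to see what the c-statistic actually is: it's the probability that a randomly chosen case with the outcome gets a higher predicted probability than a randomly chosen case without it, with ties counting half - equivalently, the area under the ROC curve. A minimal pure-Python sketch, with made-up outcomes and fitted probabilities:

```python
# c-statistic (concordance index): for every (event, non-event) pair,
# count it concordant if the event got the higher predicted probability;
# ties count half.  This equals the area under the ROC curve.
def c_statistic(y, p):
    pairs = concordant = tied = 0
    for yi, pi in zip(y, p):
        if yi != 1:
            continue
        for yj, pj in zip(y, p):
            if yj != 0:
                continue
            pairs += 1
            if pi > pj:
                concordant += 1
            elif pi == pj:
                tied += 1
    return (concordant + 0.5 * tied) / pairs

# Invented outcomes and fitted probabilities from some model:
y = [1, 1, 1, 0, 0, 0]
p = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2]
print(c_statistic(y, p))   # 8 of 9 pairs concordant
```

0.5 means the model discriminates no better than coin-flipping; 1.0 means perfect separation. Most packages will report this for you (SPSS via the ROC procedure, for instance), so the loop above is only to show what the number means.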
> 1) 3 variables have a b = .000, S.E = .000 and Exp(b) = 1, yet one of these is significant. To my understanding this shows there is no relationship, so how am I finding significance?
>
I suspect you have predictor variables with very, very large
variances. Divide them by some large number. (Logistic regression
coefficients describe the effect of a one-unit change in the
predictor. If you look at the probability that a person is retired
given that they are one second older, the effect will be tiny, but
probably significant. If you used years, the effect would be larger
and have the same p-value; if you used decades, the effect would be
larger again, but still have the same p-value.)
[It's possible, if you have very large or very small variances, for
the program to run into computational problems because of rounding.
One rule of thumb is to keep your variances within about 10x of each
other - so if the smallest is 1, make sure that the largest isn't
greater than 10.]
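To see the rescaling point in action, here's a rough pure-Python sketch. fit_logistic is a toy Newton-Raphson fit written just for illustration (your package does the equivalent internally), and the age data are invented:

```python
import math

def sigmoid(t):
    # numerically safe logistic function
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    et = math.exp(t)
    return et / (1.0 + et)

def fit_logistic(x, y, iters=50):
    """Toy Newton-Raphson fit of logit(p) = b0 + b1*x.
    Returns (slope, standard error of slope, z-statistic)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0   # score and information
        for xi, yi in zip(x, y):
            p = sigmoid(b0 + b1 * xi)
            w = p * (1.0 - p)
            g0 += yi - p
            g1 += (yi - p) * xi
            h00 += w
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01       # Newton step via 2x2 solve
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    se = math.sqrt(h00 / det)             # SE of the slope
    return b1, se, b1 / se

# Invented data: age in days (huge variance) and a binary outcome.
days = [7300, 9125, 10950, 14600, 18250, 21900, 25550, 29200]
y    = [0,    0,    0,     1,     0,     1,     1,     1]
decades = [d / 3650.0 for d in days]      # same ages, rescaled

b_days, se_days, z_days = fit_logistic(days, y)
b_dec,  se_dec,  z_dec  = fit_logistic(decades, y)

print(b_days, b_dec)   # slope is 3650 times larger in decades
print(z_days, z_dec)   # but the z-statistics (and p-values) match
```

The slope and its standard error both scale by the same factor of 3650, so the z-statistic, and hence the p-value, is unchanged - which is why a b that prints as .000 can still be significant.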
> 2) 2 variables are significant, however the relationship is in the opposite direction from when I did a bivariate test, also they do not fall in the confidence parameters. Is there anything I can do about this?
>
When adding variables to a regression flips the sign relative to a
bivariate test, this is called a suppressor effect. Usually it's a
pain, sometimes it's interesting. It's happening because your
predictors are highly correlated. Consider dropping them or combining
them.
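To see how a suppressor effect arises, here's a small made-up example using ordinary linear regression (the same thing happens in logistic regression): x2 is nearly a copy of x1, the outcome rewards x1 and penalises x2, and the bivariate slope for x2 comes out positive even though its coefficient in the joint model is -1.

```python
def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    # sample covariance
    ma, mb = mean(a), mean(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / (len(a) - 1)

# x2 is x1 plus a little noise; y = 2*x1 - x2 exactly.
x1 = [1, 2, 3, 4, 5, 6, 7, 8]
e  = [0.2, -0.1, 0.1, -0.2, 0.2, -0.1, 0.1, -0.2]
x2 = [a + b for a, b in zip(x1, e)]
y  = [2 * a - b for a, b in zip(x1, x2)]

# Bivariate slope of y on x2 alone: positive.
b_biv = cov(x2, y) / cov(x2, x2)

# Multiple regression of y on x1 and x2 (centered 2x2 normal equations).
s11, s22, s12 = cov(x1, x1), cov(x2, x2), cov(x1, x2)
s1y, s2y = cov(x1, y), cov(x2, y)
det = s11 * s22 - s12 ** 2
b1 = (s22 * s1y - s12 * s2y) / det
b2 = (s11 * s2y - s12 * s1y) / det

print(b_biv, b2)   # b_biv is positive, b2 is negative: the sign flips
```

On its own, x2 is a good proxy for x1 (which drives y upward), so its bivariate slope is positive; once x1 is in the model, only x2's own negative contribution is left. That's the suppressor pattern in miniature.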
What do you mean by "do not fall in the confidence parameters"? That's
weird, can you post the output?
> Apologies if I'm missing something basic here, but I am going round in circles with it! The text I have states the problems I'm having but without a solution, so I would be so grateful if anyone could give me any insight?
>
Nothing basic being missed. Regression is complex, logistic regression
even more so.
One tip to try to simplify logistic regression is to do a regular
linear regression in the same way. The results won't be right, but
they will be similar, and linear regression is easier to understand,
so it can help you identify what went wrong. In addition, there are
better diagnostics for linear regression - e.g. you can't get
collinearity diagnostics out of logistic regression (in most programs)
but you can get them from linear regression - and that doesn't matter,
because collinearity depends only on the predictors, so the
diagnostics are the same in both flavours of regression.
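As a concrete example of a collinearity diagnostic from the linear-regression side: the variance inflation factor (VIF). In general it comes from regressing each predictor on all the others, but with only two predictors it reduces to 1/(1 - r^2), where r is their correlation. The numbers below are invented:

```python
def mean(v):
    return sum(v) / len(v)

def corr(a, b):
    # Pearson correlation
    ma, mb = mean(a), mean(b)
    sab = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    saa = sum((ai - ma) ** 2 for ai in a)
    sbb = sum((bi - mb) ** 2 for bi in b)
    return sab / (saa * sbb) ** 0.5

x1 = [1, 2, 3, 4, 5, 6, 7, 8]
x2 = [1.1, 2.0, 3.2, 3.9, 5.1, 6.0, 7.2, 7.9]   # nearly a copy of x1

r = corr(x1, x2)
vif = 1.0 / (1.0 - r ** 2)
print(r, vif)   # r close to 1, so the VIF is enormous
```

A common rule of thumb flags VIFs above 10; near-duplicate predictors like these give a VIF in the hundreds, which is exactly the situation that produces unstable, sign-flipping coefficients.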
Hope that helps,
Jeremy