I'm not a stats brain by any stretch of the imagination but I've used MR a
few times. As I understand it, the r square value tells you how good a
predictor your model is, that is, how much variance is accounted for, as you said.
But you'd want to know how many predictor variables you put into that
regression to get your .167. If you put only 2 in, and they are both coming
out as significant predictors, and together they account for 16.7% of the
variance, then that's reasonable - people report models like that in
articles. But if you put 8 variables in, and none of them were independently
sig, then it's not meaning so much, because the more you chuck in the more
likely you are to get them accounting for some variance somewhere! I had a
sig model recently with 4 predictors (not strictly a regression but still it
serves to illustrate my point) - 2 of the predictors were sig, but when I
did it as a regression, the r square was only 0.006! Thus although sig, the
predictors weren't really telling me anything about my DV.
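If it helps, here's a rough sketch in Python of the two points above, using the
standard textbook formulas for adjusted r square and the overall model F. All
the numbers here are made up for illustration (they aren't from anyone's actual
data): it shows how the same raw r square shrinks once you adjust for more
predictors, and how a big enough sample can make even a tiny r square (like my
0.006) come out with a significant F.

```python
def adjusted_r2(r2, n, p):
    # Adjusted R^2 penalises R^2 for the number of predictors p,
    # given sample size n: 1 - (1 - R^2)(n - 1)/(n - p - 1)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

def f_statistic(r2, n, p):
    # Overall model F for a regression with p predictors:
    # F = (R^2 / p) / ((1 - R^2) / (n - p - 1))
    return (r2 / p) / ((1 - r2) / (n - p - 1))

# Made-up numbers: the same raw R^2 of .20 looks much worse once
# adjusted for 8 predictors than for 2 (n = 50 in both cases).
print(adjusted_r2(0.20, 50, 2))  # about 0.166
print(adjusted_r2(0.20, 50, 8))  # about 0.044

# With a huge (hypothetical) sample, even R^2 = 0.006 gives an F
# above the roughly 2.37 critical value for (4, large) df at alpha = .05.
print(f_statistic(0.006, 2000, 4))  # about 3.01
```

So a significant F with a low adjusted r square mostly tells you the sample was
big enough to detect a real but very small effect.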
Hope this helps (and I hope I'm not telling you wrong - no doubt someone
will correct me!)
----- Original Message -----
From: "Davies, Nicola" <[log in to unmask]>
To: <[log in to unmask]>
Sent: Thursday, March 29, 2007 1:53 PM
Subject: Multiple regression and Adjusted R Square
Hi All,
In a multiple regression, say, I get an adjusted r square of .167. Does
that mean that the predictors account for only 17% of the variance, which
would be low? Ideally, am I wanting a large adjusted r square, such as .838?
Also, is it possible to have a low adjusted r square whilst also having a
significant F value in the ANOVA? What would such a situation tell the
researcher?
Kind Regards,
Nicola Davies,
BSc; MSc Comm.; PhD Candidate
Liaison Officer for the DHP Postgraduate Subcommittee