I'm writing some software to fit Generalized Linear Models, and I finally have an implementation of the basic GLM algorithm that computes the correct parameter estimates using the matrix formula:
B = (Xt W X)^-1 Xt W Z
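For context, the core of my implementation looks roughly like the following numpy sketch. (I've used the logistic link purely for illustration, and names like irls_logistic are my own, not from any library.)

import numpy as np

def irls_logistic(X, y, n_iter=25, tol=1e-8):
    # Iteratively reweighted least squares, repeatedly applying
    # B = (Xt W X)^-1 Xt W Z until the coefficients stop changing.
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta                    # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))   # inverse logit link
        w = mu * (1.0 - mu)               # IRLS weights
        z = eta + (y - mu) / w            # working response Z
        XtW = X.T * w                     # same as X.T @ diag(w)
        beta_new = np.linalg.solve(XtW @ X, XtW @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new, w
        beta = beta_new
    return beta, w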
The problem is that I'm not clear on how to calculate the standard errors. I've tried using the deviance (i.e. -2*LogLikelihood):
B_errors = diagonal of the matrix given by:
sqrt[ (Xt W X)^-1 * (-2*LogLikelihood / (n - p)) ]
where:
p - the number of parameters (6: 5 IVs plus the intercept)
n - the number of observations (189)
Xt - the transpose of the matrix X
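In code terms, the step I'm doing is essentially this (a numpy sketch, with X, W, n, and p as above, and a variable deviance holding -2*LogLikelihood):

# The standard-error step as I've written it (this is the part
# that comes out about 10% too large):
cov_unscaled = np.linalg.inv(X.T @ W @ X)   # (Xt W X)^-1
phi_hat = deviance / (n - p)                # -2*LogLikelihood / (n - p)
B_errors = np.sqrt(np.diag(cov_unscaled) * phi_hat)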
The results I'm getting are about 10% larger than the correct standard errors, although the relative pattern across coefficients is right, which is encouraging. There's probably something really obvious I'm missing, but not coming from a stats background I can't see it.
Any advice would be appreciated.
Regards,
Darren.