Hi All,
I have been trying different model selection methods and
have a question about the Newton-Raftery (harmonic mean) method for computing the marginal distribution of the data:
 
P(Data) = [ int P(Data| Parameters)^{-1} dP(Parameters|Data) ]^{-1}.
 
The situation I am looking at is a conjugate normal-inverse-gamma regression, where P(Data) is known to be a t-distribution.
 
The NR approximation usually indicates the correct model, but it consistently over-estimates the known P(Data),
regardless of the sample size, the number of MCMC iterations, and the prior parameters.
 
My null hypothesis is that I have a bug, but other approximations, such as
int P(Data|Parameter) dP(Parameter), Gelfand & Dey, Chib & Carlin, and reversible jump, get very close to P(Data).
The literature on NR states that the method is consistent in the number of MCMC iterations.
In this example, however, the NR approximation stabilizes at the wrong value as a function of the number of iterations (I only went out to 100K).
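For comparison, here is the same kind of check for the Gelfand & Dey variant mentioned above, again on a toy conjugate normal-mean model of my own (an assumption for illustration, not the regression). Gelfand & Dey replace the bare reciprocal likelihood with g(theta)/[P(Data|theta)P(theta)] for any normalized density g, which tames the heavy tails:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy conjugate model (illustrative assumption): y_i ~ N(theta, s2),
# prior theta ~ N(m0, t2), posterior theta | y ~ N(mn, vn).
n, s2, m0, t2 = 20, 1.0, 0.0, 4.0
y = rng.normal(0.5, np.sqrt(s2), size=n)
vn = 1.0 / (n / s2 + 1.0 / t2)
mn = vn * (y.sum() / s2 + m0 / t2)

def lognorm(x, mean, var):
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (x - mean) ** 2 / var

def loglik(theta):
    return -0.5 * n * np.log(2 * np.pi * s2) - 0.5 * ((y - theta) ** 2).sum() / s2

log_py_exact = loglik(mn) + lognorm(mn, m0, t2) - lognorm(mn, mn, vn)

# Gelfand & Dey: 1/p(y) = E_post[ g(theta) / (p(y|theta) p(theta)) ]
# for ANY normalized density g.  Here g is a normal fitted to the draws;
# in this conjugate case it nearly matches the posterior, so the ratio
# is almost constant and the estimate is stable.
S = 50_000
draws = rng.normal(mn, np.sqrt(vn), size=S)
g_mean, g_var = draws.mean(), draws.var()
terms = np.array([lognorm(t, g_mean, g_var) - loglik(t) - lognorm(t, m0, t2)
                  for t in draws])
M = terms.max()  # stabilized log-sum-exp
log_py_gd = -(M + np.log(np.mean(np.exp(terms - M))))

print(f"exact log P(Data):        {log_py_exact:.3f}")
print(f"Gelfand-Dey log P(Data):  {log_py_gd:.3f}")
```

The only change from the harmonic mean is the tuning density g in the numerator, which is presumably why this estimator lands near the known P(Data) while NR does not.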
 
Any insights?
 
Thanks
Peter
 
 
-------------------------------------------------------------------
This list is for discussion of modelling issues and the BUGS software. For help with crashes and error messages, first mail [log in to unmask]

To mail the BUGS list, mail to [log in to unmask] Before mailing, please check the archive at www.jiscmail.ac.uk/lists/bugs.html Please do not mail attachments to the list.

To leave the BUGS list, send LEAVE BUGS to [log in to unmask] If this fails, mail [log in to unmask], NOT the whole list