Hi,

I am a new user of WinBUGS and I am trying to perform an MCMC analysis of the model error in GLS regional regression. Below I describe the problem in full, along with some alternatives I am considering for implementation in two cases (model error large and small). My question is how to implement this in WinBUGS. I tried, but trap messages appear due to the numerical computation of a matrix inverse during simulation. Does anyone have a similar example that I could use as a basis for my code, or just a few hints?
Thanks

Eduardo


We write the general model

y = X b + D + E

where

D ~ N(0, S) are the sampling errors, which arise because y consists of estimators, and
E ~ N(0, g I) are the independent model errors.

Sometimes the model is written

y = X b + N

where


N ~ N(0, L(g)) with L(g) = S + g I

To make this easy to implement in WinBUGS, or any MCMC code, we precompute the matrix B so that

B B^T = S

and

Inv(B) = C
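The precomputation of B and C can be done outside WinBUGS in any scripting language and then loaded as data, so that no matrix inversion is needed during the simulation. Here is a minimal pure-Python sketch: B is the lower-triangular Cholesky factor of S, and C is its inverse by forward substitution. The function names and the 2x2 S are made up for illustration; in practice you would use your actual sampling covariance matrix (and, e.g., R's chol() and solve()).

```python
import math

def cholesky(S):
    """Lower-triangular B with B B^T = S (S symmetric positive definite)."""
    n = len(S)
    B = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(B[i][k] * B[j][k] for k in range(j))
            if i == j:
                B[i][j] = math.sqrt(S[i][i] - s)
            else:
                B[i][j] = (S[i][j] - s) / B[j][j]
    return B

def invert_lower(B):
    """C = Inv(B) for lower-triangular B, by forward substitution."""
    n = len(B)
    C = [[0.0] * n for _ in range(n)]
    for j in range(n):
        for i in range(j, n):
            rhs = (1.0 if i == j else 0.0) - sum(B[i][k] * C[k][j] for k in range(i))
            C[i][j] = rhs / B[i][i]
    return C

# Hypothetical 2x2 sampling covariance of the y estimators
S = [[0.040, 0.012],
     [0.012, 0.090]]
B = cholesky(S)
C = invert_lower(B)
```

B and C (and, for the small-g formulations below, Inv(S) = C^T C) would then be passed to WinBUGS as data.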


PROBLEM
The data are given as y (the skews), X (the matrix of covariates), and S (the specified covariance of the y estimators). We wish to compute the posterior distributions of:

the regression coefficients b
the model error variance g
the unobserved innovations Vi and Ei

In the case wherein g I is large compared to S, the problem is like OLS regression and D is a small correction. In the opposite case, when g is small, E is a very small correction and D will be large. I think this affects how we should set up the problem.

MCMC #1: Formulation for large g.
Write the problem as

D = B V
E = y - X b - D

The likelihood function is based upon

(V)i = Vi ~ N(0, 1)
(E)i = Ei ~ N(0, g)

MCMC generates the posterior distribution of V, E, b and g.
We need priors for g and b:

g ~ Gamma(0.1, 0.1)
b ~ N(0, big variance)

Because B B^T is small compared to g I, the values of Vi can bounce around without doing great harm to the likelihood of the Ei values, because D is small. In a similar way, the data may not really tell us much about V, because the overwhelming source of error is E. Thus the E values will essentially equal y - X b, as they would in OLS regression.
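For what it is worth, formulation #1 might look something like this in WinBUGS (an untested sketch; n, p, and the node names are mine, and y, X, and the precomputed B are assumed to be loaded as data). Note that dnorm in BUGS is parameterized by precision, so the Gamma(0.1, 0.1) prior below is placed on tau = 1/g rather than on g itself, which is a slightly different prior than the one stated above:

```
model {
  for (i in 1:n) {
    V[i] ~ dnorm(0, 1)                  # iid innovations
    D[i] <- inprod(B[i, ], V[])         # D = B V, so D ~ N(0, S)
    mu[i] <- inprod(X[i, ], b[]) + D[i]
    y[i] ~ dnorm(mu[i], tau)            # E[i] = y[i] - mu[i] ~ N(0, g)
    E[i] <- y[i] - mu[i]                # monitor the model errors
  }
  tau ~ dgamma(0.1, 0.1)                # precision of the model error
  g <- 1 / tau                          # model error variance
  for (j in 1:p) { b[j] ~ dnorm(0, 1.0E-6) }  # vague prior on b
}
```

Because B is computed outside WinBUGS, no matrix is inverted during the simulation, which may avoid the trap messages.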

MCMC #2: Formulation for large g.
A compact representation of the model above, without explicit Ei values.
The likelihood function is based upon

(V)i = Vi ~ N(0, 1)
y - X b - B V ~ N(0, g I)

MCMC generates the posterior distribution of V, b and g. Need priors on b and g.

MCMC #3: Formulation for SMALL g.
Take the fundamental equation y = X b + B V + E and solve for V to obtain:

(E)i = Ei ~ N(0, g)
(C [y - X b - E])i ~ N(0, 1)

Compute posteriors for g, b, and Ei. Here Ei is viewed as a small correction to the GLS model with iid errors V, which are defined implicitly. Need priors on b and g.
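One way formulation #3 might be expressed in WinBUGS (again an untested sketch with hypothetical names) is to treat the small corrections Ei as stochastic nodes and write y - X b - E ~ N(0, S) with dmnorm, where TauS = Inv(S) is precomputed outside WinBUGS and loaded as data together with y, X, and C, so no inverse is computed during simulation. The logical node V[] also recovers the implicit innovations:

```
model {
  for (i in 1:n) {
    E[i] ~ dnorm(0, tau)                # small corrections, variance g
    mu[i] <- inprod(X[i, ], b[]) + E[i]
    resid[i] <- y[i] - mu[i]
    V[i] <- inprod(C[i, ], resid[])     # implicit V = C (y - X b - E)
  }
  y[1:n] ~ dmnorm(mu[], TauS[, ])       # y - X b - E ~ N(0, S)
  tau ~ dgamma(0.1, 0.1)                # precision tau = 1/g
  g <- 1 / tau
  for (j in 1:p) { b[j] ~ dnorm(0, 1.0E-6) }
}
```

Monitoring V[] here also gives the explicit-Vi quantities of the formulation that follows.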

MCMC #4: Formulation for small g. An equivalent alternative with explicit Vi.

V = C [y - X b - E]

The likelihood function is based upon

(V)i = Vi ~ N(0, 1)
(E)i = Ei ~ N(0, g)

MCMC generates the posterior distribution of V, E, b and g. Need priors on b and g.