Hi everybody,
I am a geodesist, and I want to use Bayesian statistics in a geodetic
parameter-estimation problem.
There is a nonlinear relationship between the observations and the parameters:
F_i(X1, Y1, X2, Y2, ..., Xu, Yu) = l(i) + V(i)   {i = 1, 2, ..., n}
The least-squares process minimizes the errors V(i) with respect to the observations.
n = number of observations
u = number of unknown points
l(i) = observation
V(i) = observation error
X(j), Y(j) = unknown parameters, the Cartesian coordinates of each point
{j = 1, 2, ..., u}
n > 2u, i.e. more equations than unknowns
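For concreteness, here is a minimal sketch of one possible observation model of this form (a hypothetical plane distance network of my own invention, not taken from your problem), where each F_i is the distance from an unknown point to a known station:

```python
import numpy as np

# Hypothetical example: n = 3 distance observations to u = 1 unknown
# point, so there are 2u = 2 parameters and n > 2u holds.
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # known station coordinates

def F(X):
    """F_i(X1, Y1): distance from the unknown point X = (X1, Y1) to station i."""
    return np.linalg.norm(X - stations, axis=1)
```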
In the general case, the least-squares method is used to solve this type
of nonlinear equation system after it has been linearized with a first-order
Taylor series expansion. This technique is referred to as the Gauss-Newton method.
The matrix notation for the mathematical description of this problem:
* n x 1 observation vector L containing the l(i)
* 2u x 1 vector DX of corrections to the unknown X and Y coordinates
(u = number of unknown points)
* 2u x 1 initial-guess vector DX0 containing the approximate coordinates X0
and Y0; the initial guesses can have any finite real value, but the iteration
converges faster if they are close to the solution
* n x 1 misclosure vector DL containing the differences between each
observation and the corresponding equation evaluated at the initial guess:
l'(i) = l(i) - F_i(X01, Y01, ..., X0u, Y0u)   {i = 1, 2, ..., n}
* n x 2u Jacobian matrix (design matrix) A containing the partial
derivatives of each equation with respect to each unknown, evaluated at the
initial guess
* n x n weight matrix P, square and symmetric, with one row per equation.
The main diagonal contains the weights of the individual equations, while the
off-diagonal entries express the dependencies of the equations on one
another; if all of the observations are independent, P is diagonal
* n x n cofactor matrix Q = inverse(P)
* n x 1 observation-error vector V containing the V(i) {i = 1, 2, ..., n}
The LSQ estimate for this problem is
DX = inverse(A^T P A) (A^T P DL),
after which the coordinates are updated, X = X0 + DX, and the process is
iterated until convergence.
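As a sketch (function and variable names are my own, assuming NumPy), the Gauss-Newton iteration above might look like:

```python
import numpy as np

def gauss_newton(F, jacobian, l, P, X0, max_iter=20, tol=1e-10):
    """Iterate DX = inverse(A^T P A) (A^T P DL) and update X = X + DX."""
    X = np.asarray(X0, dtype=float).copy()
    for _ in range(max_iter):
        DL = l - F(X)                          # misclosure vector l'(i)
        A = jacobian(X)                        # n x 2u design matrix
        N = A.T @ P @ A                        # normal-equation matrix
        DX = np.linalg.solve(N, A.T @ P @ DL)  # correction vector
        X = X + DX
        if np.max(np.abs(DX)) < tol:           # stop once corrections vanish
            break
    return X
```

Note that np.linalg.solve is used rather than forming the inverse explicitly, which is numerically safer but mathematically the same estimate.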
I want to adapt Markov chain Monte Carlo (MCMC) integration to this problem.
Can anybody give me advice on how to do this?
X1, Y1, ..., Xu, Yu : unknown parameters
l1, l2, ..., ln : data
P(X1, Y1, ..., Xu, Yu | l1, l2, ..., ln) = posterior of the unknown
parameters?
P(l1, l2, ..., ln | X1, Y1, ..., Xu, Yu) = likelihood of the data?
And what should the prior of the parameters be?
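One common route is a random-walk Metropolis sampler: with a Gaussian error model V ~ N(0, sigma^2 Q), the log-likelihood is -V^T P V / (2 sigma^2) with V = F(X) - l, and the log-posterior is that plus the log-prior (for example a flat or wide Gaussian prior on the coordinates). A minimal sketch, assuming NumPy and with all names my own:

```python
import numpy as np

def metropolis(log_post, X0, n_samples=5000, step=0.1, seed=0):
    """Random-walk Metropolis sampler for the posterior P(X | l)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X0, dtype=float).copy()
    lp = log_post(X)
    samples = np.empty((n_samples, X.size))
    for k in range(n_samples):
        Xp = X + step * rng.standard_normal(X.size)  # propose a random move
        lpp = log_post(Xp)
        if np.log(rng.uniform()) < lpp - lp:         # accept with MH probability
            X, lp = Xp, lpp
        samples[k] = X                               # store current state
    return samples
```

Since this is the BUGS list: if the F_i can be expressed in the BUGS model language, the same model could in principle be set up there directly; the sketch above only shows the generic mechanism.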
I am looking forward to your advice.
With my best regards,
Emine TANIR
===========================================
Emine Tanir
Guest Researcher
Institut für Statistik und Wahrscheinlichkeitstheorie
Technische Universität Wien
Wiedner Hauptstrasse 8-10
A-1040 Wien
==========================================
Department of Statistics and Probability Theory
Vienna University of Technology
Wiedner Hauptstrasse 8-10
A-1040 Wien
-------------------------------------------------------------------
This list is for discussion of modelling issues and the BUGS software.