------------- "Paul, David A" <[log in to unmask]> -------------
We are currently modelling air pollutants using two disparate
datasets. The first is derived from monitoring sites located in
several concentrated groupings throughout the northeastern United
States. There are 250 sites in total, and we are working with
monthly averages for January, so we have 250 observations from
these monitors.
The second dataset was produced by an atmospheric model developed
at (I believe) MIT, and consists of predictions on a regular
lattice of 2184 sites.
Because the predictive data reside on a regular lattice, the
monitoring data must be aligned with the lattice. We have chosen
to average the monitoring data within those cells that contain
monitors.
A frequentist method would use these averages without regard to
the inherent variability of the monitoring data. The Bayesian
paradigm allows us to capture this variability by specifying that
Y.Monitor[i,j] ~ dnorm(mu.Monitor[i], tau.Monitor[i]) for the
j = 1,...,n[i] observations in the ith grid cell, where the
precision tau.Monitor[i] is indexed by cell, reflecting the
number of monitors within that cell.
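For concreteness, here is a minimal sketch of how such a
specification might be written out in WinBUGS. The ragged-array
handling (a rectangular Y.Monitor matrix padded with NA) and the
priors on theta, tau.mu, and tau.Monitor are illustrative
assumptions on my part, not necessarily the model we are fitting:

  model {
    for (i in 1:N.grid) {
      for (j in 1:n[i]) {
        # monitor-level observations within grid cell i
        Y.Monitor[i, j] ~ dnorm(mu.Monitor[i], tau.Monitor[i])
      }
      # cell-level mean and cell-specific precision
      mu.Monitor[i] ~ dnorm(theta, tau.mu)
      tau.Monitor[i] ~ dgamma(0.001, 0.001)
    }
    # vague hyperpriors (placeholders)
    theta ~ dnorm(0.0, 1.0E-6)
    tau.mu ~ dgamma(0.001, 0.001)
  }

Here Y.Monitor would be supplied as an N.grid by max(n) matrix
with NA in the unused positions; WinBUGS ignores entries beyond
the loop bound n[i], which is given as data.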
In the course of trying to fit a simple SAR model to the predictive
model data, I have found, using two Wintel boxes with different
processors, that 512 MB of RAM is insufficient to fit these models
under Windows 2000 Professional. As soon as these machines exceed
the physical RAM limit and dip into virtual memory, WinBUGS freezes
and I have to use the Task Manager to get back to the desktop. On a
two-year-old machine running NT 4.0 with 512 MB of RAM, I have been
able to run WinBUGS with no problems, getting beautiful posteriors
for the parameters of interest. In one case, total memory (virtual
plus physical) went as high as 800 MB, and I was still able to open
other applications (such as Excel) after WinBUGS was done.
After discussing this with one of my system administrators, I was
told that the newer Windows platforms are not as good at handling
programs that take a garbage-collection approach to memory
management. I have no way of verifying this, but I have no reason
to doubt him. With processor speeds as high as they are, the
original motivation for garbage collection is fading, and it seems
that Microsoft is putting less effort into handling memory
efficiently. I am continuing to use NT 4.0 and will probably keep
using it. I don't think that asking WinBUGS to handle approximately
2400 observations is too much. This certainly does not constitute a
"huge" dataset, especially given the amount of information being
stored on RAID arrays these days. Based on my recent experience,
NT 4.0 appears to be more stable and to handle the memory
requirements of WinBUGS "better" than Windows 2000 Professional.
Regards,
David Alan Paul, Ph.D.
Battelle Memorial Institute - SDAS
[log in to unmask]
614-424-3176
614-424-4611 (fax)
http://www.battelle.org/statistics