Friday 11 February, Room 642, Department of Mathematics, 6th floor
Huxley Building
180 Queen's Gate
South Kensington, London
Program:
1:30 -- 2:30 Paul Cohen, University of Massachusetts at Amherst
Analyzing the Structure of Robot Perceptual Data
2:30 -- 3:00 Coffee Break (served in the same room - 642)
3:00 -- 4:00 Jon Foster, University of Southampton
Bayesian Estimation of Small Population Frequencies
Abstracts of the talks:
Analyzing the Structure of Robot Perceptual Data
Paul Cohen
University of Massachusetts at Amherst.
The Pioneer 1 mobile robot updates its perceptions of the world
at roughly 5 Hz. At each cycle it produces a set of propositions that
describe the current state of the robot and its environment. With
practice and knowledge of what these propositions mean, humans can
interpret time series of state descriptions and reconstruct what the
robot was doing. How might the robot do the same thing? How might it
interpret time series of propositions to build models of what it is
doing? I will describe work on clustering time series of propositions,
then show why the underlying Markov model of state transitions is
inadequate, and introduce work on elucidating state descriptions based
on the statistical dependencies between propositions.
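As background to the talk, the "Markov model of state transitions" it mentions can be sketched minimally: given a sequence of discrete state labels observed at each perceptual cycle, estimate the first-order transition probabilities by counting. The state names below are illustrative toy data, not taken from the Pioneer 1 system.

```python
from collections import defaultdict

def transition_matrix(states):
    """Estimate first-order Markov transition probabilities
    from a sequence of discrete state labels."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    probs = {}
    for a, row in counts.items():
        total = sum(row.values())
        probs[a] = {b: n / total for b, n in row.items()}
    return probs

# Toy run of robot states, one label per ~5 Hz update cycle
run = ["idle", "move", "move", "bump", "idle",
       "move", "move", "move", "bump"]
P = transition_matrix(run)
```

A model like this captures only pairwise state-to-state frequencies, which is one way to see why it can be inadequate for describing what the robot is doing over longer stretches of time.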
Bayesian Estimation of Small Population Frequencies
Jon Foster
University of Southampton.
The assessment of disclosure risk in the release of
categorical census microdata depends on the estimation of
population frequencies for cells of a contingency
table where the cell count is small (typically one).
I will describe an approach to smoothed
estimation of cell frequencies, based on Bayesian
model averaging.
The set of models over which to average is determined
by considering the structure and associated symmetries
of the table. Hence, for example, for a two-way
cross-classification, the model class depends on whether
the two factors have identical labels (square table) or not.
Prior distributions for model parameters may also
largely be determined by symmetry considerations, leaving
a small number of smoothing parameters to be specified.
Efficient computation is possible using a combination
of MCMC, Laplace approximations and numerical quadrature.
I will present an example illustrating the benefits of this
approach to disclosure risk assessment over methods
that do not take the structure of the data into account.
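For readers unfamiliar with Bayesian smoothing of small cell counts, here is a much simpler illustration than the model-averaging approach of the talk: posterior-mean cell probabilities under a single symmetric Dirichlet prior on a multinomial table. The counts and the prior parameter are made up for the example.

```python
def dirichlet_smoothed(counts, alpha=0.5):
    """Posterior-mean cell probabilities for a multinomial
    contingency table under a symmetric Dirichlet(alpha) prior:
    (n_i + alpha) / (N + alpha * K)."""
    total = sum(counts)
    k = len(counts)
    return [(n + alpha) / (total + alpha * k) for n in counts]

cells = [1, 0, 3, 6]  # small cells (counts of 0 or 1) drive disclosure risk
probs = dirichlet_smoothed(cells, alpha=0.5)
```

The smoothing pulls estimates for rare cells away from zero; the talk's approach goes further by averaging over a class of models chosen from the table's structure and symmetries rather than fixing one prior.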
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%