--------------------------------------------------------------------
"Mini Workshop on ABC methods for Stochastic Epidemic Models"
Wednesday December 9, 2009 @ University of Nottingham
--------------------------------------------------------------------
There will be a half-day workshop on Approximate Bayesian Computation
(ABC) methodology with application to Stochastic Epidemic Models
organised by the Epidemics group in the Division of Statistics, School
of Mathematical Sciences, University of Nottingham.
There has been an increasing interest in ABC methods over the last few
years. The purpose of this meeting is to bring together researchers who
have applied (or intend to apply) such methods in epidemic modelling
and also to identify further research directions in this area.
There will be three talks (provisional programme below). The meeting is
free and open to everyone, but it would be helpful if those interested in
attending could email Theo Kypraios ([log in to unmask])
in advance.
The meeting will take place in room C18 of the Pope Building,
University Park Campus, University of Nottingham. Maps and directions
are available at
http://www.nottingham.ac.uk/about/visitorinformation/mapsanddirections/mapsanddirections.aspx
http://www.nottingham.ac.uk/about/documents/universityparkcampusmap.pdf
best wishes,
Theo.
Programme
-------------
14:00 -- 15:00: Dennis Prangle (Lancaster): Choice of Summary Statistics
for Approximate Bayesian Computation.
15:00 -- 16:00: TJ McKinley (Cambridge): ABC for temporal epidemic models
16:00 -- 16:30: Break
16:30 -- 17:30: Marc Baguelin (Health Protection Agency): Inference of
transmission in an outbreak of equine influenza through simulated
likelihood.
Abstracts
----------
Prangle
--------
To be efficient, ABC methods require low dimensional but informative
summary statistics for the specific model of interest. However, little
guidance exists on how to choose these. This talk describes a novel
generic methodology for constructing summary statistics. Theoretical
work justifying this approach is sketched and an example application is
described and related to epidemiological applications. Other
consequences of the theory are also discussed: the best region in
which to accept simulated summary statistics, and how adding noise can
improve the behaviour of ABC algorithms.
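As background for how summary statistics enter ABC, here is a minimal
rejection-ABC sketch (not from the talk) on a hypothetical toy model:
exponential waiting times with unknown rate, with the sample mean as the
low-dimensional summary statistic.

```python
import random

def simulate(theta, n, rng):
    """Toy model (an assumption for illustration): n exponential
    inter-event times with rate theta."""
    return [rng.expovariate(theta) for _ in range(n)]

def summary(data):
    """Low-dimensional summary statistic: the sample mean."""
    return sum(data) / len(data)

def abc_rejection(observed, n_samples, epsilon, rng):
    """Rejection ABC: draw theta from the prior, simulate data, and
    keep theta whenever the simulated summary lies within epsilon of
    the observed summary."""
    s_obs = summary(observed)
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(0.1, 10.0)        # flat prior on (0.1, 10)
        s_sim = summary(simulate(theta, len(observed), rng))
        if abs(s_sim - s_obs) < epsilon:      # accept within tolerance
            accepted.append(theta)
    return accepted

rng = random.Random(1)
observed = simulate(2.0, 200, rng)            # synthetic "data", true rate 2
posterior = abc_rejection(observed, 100, 0.05, rng)
print(sum(posterior) / len(posterior))        # posterior mean, near the true rate
```

The tolerance epsilon and the choice of summary statistic together
determine how well the accepted draws approximate the true posterior,
which is exactly the trade-off the talk addresses.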
McKinley
---------
For large-scale, non-linear dynamic systems, such as those used to
describe epidemic spread, calculating the likelihood function can often
be computationally prohibitive, or in some cases intractable. However,
simulating from these models - for a given set of parameters - is often
more straightforward. Approximate Bayesian Computation provides a method
for inference about the parameters of a system by replacing the
likelihood with approximations generated from repeated model
simulations. Here we explore recent advances in ABC, which embed these
model simulation steps into both Markov chain Monte Carlo (ABC-MCMC) and
Sequential Monte Carlo (ABC-SMC) algorithms. We compare the accuracy and
efficiency of the approximation routines to a gold-standard of
data-augmented MCMC and illustrate our results on a real-life outbreak
of Ebola Haemorrhagic Fever in the Democratic Republic of Congo.
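The ABC-MCMC embedding mentioned above can be sketched as follows (a
toy illustration, not the talk's implementation): each random-walk
proposal is accepted only if a fresh simulation lands within the
tolerance of the observed summary; with a flat prior and a symmetric
proposal, the usual Metropolis-Hastings ratio is one, so the tolerance
check is the whole acceptance step.

```python
import random

def simulate(theta, n, rng):
    """Toy stochastic model (an assumption for illustration):
    n exponential waiting times with rate theta."""
    return [rng.expovariate(theta) for _ in range(n)]

def summary(data):
    """Sample mean as the summary statistic."""
    return sum(data) / len(data)

def abc_mcmc(observed, n_iter, epsilon, step, rng):
    """ABC-MCMC: a random-walk chain on theta in which a proposal is
    accepted only when a new simulation falls within epsilon of the
    observed summary (flat prior, symmetric proposal)."""
    s_obs = summary(observed)
    theta = 1.5                               # arbitrary starting value
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)   # symmetric random-walk proposal
        if 0.0 < prop < 10.0:                 # stay inside the prior's support
            s_sim = summary(simulate(prop, len(observed), rng))
            if abs(s_sim - s_obs) < epsilon:  # simulation-based acceptance
                theta = prop
        chain.append(theta)
    return chain

rng = random.Random(7)
observed = simulate(2.0, 200, rng)            # synthetic "data", true rate 2
chain = abc_mcmc(observed, 2000, 0.1, 0.3, rng)
burned = chain[500:]                          # discard burn-in
print(sum(burned) / len(burned))              # should settle near the true rate
```

ABC-SMC replaces the single chain with a population of particles pushed
through a decreasing sequence of tolerances, which is typically more
efficient when the initial tolerance would otherwise reject almost
everything.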
Baguelin
--------
We developed two simulated-likelihood-based methods to estimate the
within- and between-patch (household, yard, etc.) transmission rates in
two-levels-of-mixing models of the spread of influenza, using two
distributions (fitted exponential and empirical) for the latent and
infectious periods. In the first method, we use an approximation of the
final outcome with small ϵ to estimate both distributions jointly. We
then simplify this estimation when within-household/yard transmission
predominates, by showing that local transmission can be estimated from
the exact final sizes of local sub-epidemics; this estimate is then
used to infer the global transmission.
We initially applied this methodological framework to a metapopulation
model of the spread of equine influenza among thoroughbred horses,
parameterised with data from a 2003 outbreak in Newmarket, UK; the
number of horses initially susceptible was derived from a threshold
theorem and a published statistical model. These methods can, however,
be applied to any two-levels-of-mixing model, such as human household
models, provided that good-quality data are available.
This message has been checked for viruses but the contents of an attachment
may still contain software viruses, which could damage your computer system:
you are advised to perform your own checks. Email communications with the
University of Nottingham may be monitored as permitted by UK legislation.