A reminder about the meeting at Reading on Friday. Full details below.
On 20 Nov 2013, at 23:48, Richard Everitt <[log in to unmask]> wrote:
> The full schedule and abstracts for the afternoon meeting on Bayesian Computation at Reading on the 6th December have been finalised. Registration is free and all are welcome.
>
> The schedule is:
>
> 1330-1400 Posters
>
> 1400-1445 Mark Girolami (UCL)
> 1445-1515 Richard Everitt (Reading)
> 1515-1540 Dalia Chakrabarty (Warwick/Leicester)
>
> 1540-1610 Coffee and posters
>
> 1610-1655 Mark Beaumont (Bristol)
> 1655-1725 Dennis Prangle (Reading)
> 1725-1750 Dan Lawson (Bristol)
> 1750-1815 Yordan Raykov (Aston)
>
> 1815- Pub
>
> The meeting is in room 107 in the Palmer Building (building 26 on the map at http://goo.gl/AtV6rU). The university is easily accessed by bus from the railway station (see http://goo.gl/Ybe9AB for further details). The abstracts for all of the talks follow. Hope to see you there!
>
> Speaker: Prof. Mark Girolami
> Title: Probabilistic Integration of Differential Equations for Exact Bayesian Uncertainty Quantification
> Abstract: Taking a Bayesian approach to inverse problems is challenging. This talk presents general methodology to explicitly characterise the mismatch between the finite-dimensional approximation of the forward model and the infinite-dimensional solution by a well-defined probability measure in Hilbert space. Furthermore, this measure provides a means of obtaining probabilistic solutions of the forward model, resolving issues related to characterising uncertainty in the solutions of differential equations, including chaotic systems and the multiplicity of solutions in, e.g., boundary value problems. The probabilistic solutions are employed in the quantification of input uncertainty, thus resolving problems with optimistic estimates of system uncertainty when solving the inverse problem.
>
> Speaker: Dr. Richard Everitt
> Title: Evidence estimation for Markov random fields
> Abstract: Markov random field models are used widely in computer science, statistical physics, spatial statistics and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to an intractable likelihood function. Several methods have been developed that permit exact, or close to exact, simulation from the posterior distribution, but estimating the marginal likelihood, or evidence, of these models remains challenging in general. We introduce a new Monte Carlo method for estimating the evidence in such cases.
>
> Speaker: Dr. Dalia Chakrabarty
> Title: Bayesian Inference on Dark Matter Distribution in Real Galaxies Using a New Distribution-free Test of Hypothesis
> Abstract: https://dl.dropboxusercontent.com/u/11248994/reading_abs.pdf
>
> Speaker: Prof. Mark Beaumont
> Title: Detecting selection with approximate Bayesian computation
> Abstract: In population genetics the parameters describing genetic variation at each site in the genome can often be regarded as conditionally independent, with, for example, mutation rate or selection coefficients drawn from some common distribution that we wish to characterise. This hierarchical structure is potentially problematic for inference based on Approximate Bayesian computation (ABC), because one may be interested in both the hyper-parameters and also the parameters describing, for example, each of many loci, requiring a large number of summary statistics. A general method is described for addressing these problems efficiently, and is applied to detect natural selection in the genome.
>
> Speaker: Dr. Dennis Prangle
> Title: Speeding ABC inference using early-stopping simulations
> Abstract: In fields such as biology and the social sciences, dealing with large modern datasets often requires complicated models whose likelihood functions cannot easily be numerically evaluated. This makes statistical inference of the model parameters difficult by standard methods. There has been much recent study of "likelihood-free" approaches, which favour parameter values for which simulated data from the model is similar to the observed data. This talk concentrates on Approximate Bayesian computation (ABC), which puts this approach into a Bayesian framework.
> A bottleneck in likelihood-free methods is the time taken to run simulations from the model. This talk considers the strategy of stopping simulations early if it looks unlikely that they will produce a good match to the observations. An ABC algorithm is proposed which does this without altering the target distribution. Results on how to tune the algorithm and practical examples are also presented.
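> The basic rejection sampler underlying ABC can be sketched in a few lines. The toy example below (an illustration only, not the early-stopping algorithm from the talk) infers the mean of a Gaussian by simulation alone: prior draws are kept when their simulated summary statistic lands within a tolerance eps of the observed one. The model, summary statistic, prior and tolerance are all assumptions made for this toy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: data are 50 draws from N(theta, 1). We pretend the likelihood
# is unavailable and infer theta purely by simulating from the model.
observed = rng.normal(2.0, 1.0, size=50)

def summary(x):
    # A single summary statistic (a modelling choice for this toy example).
    return x.mean()

def simulate(theta, n=50):
    return rng.normal(theta, 1.0, size=n)

# Rejection ABC: draw theta from a (wide, assumed) uniform prior, simulate
# data, and accept theta when the simulated summary is within eps of the
# observed summary.
prior_draws = rng.uniform(-10, 10, size=100_000)
eps = 0.1
accepted = [t for t in prior_draws
            if abs(summary(simulate(t)) - summary(observed)) < eps]

# Accepted draws approximate the ABC posterior for theta.
posterior_mean = np.mean(accepted)
```

> In real applications the choice of summaries, distance and tolerance is substantive, and each accepted draw costs a full model simulation; the early-stopping idea in the talk targets exactly that cost, by abandoning simulations that already look unlikely to be accepted.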
>
> Speaker: Dr. Dan Lawson
> Title: Scalable Bayesian Computation using emulation
> Abstract: As the size of datasets grows, the majority of interesting models become inaccessible because they scale quadratically or worse. For some problems fast algorithms exist that converge to the desired model of interest, but this is rarely the case - we often really want to use a complex model. Beyond simply discarding data, how do we make the model run? We describe a framework in which statistical emulators can be substituted for part of the likelihood. By careful construction of (a) a decision framework to decide which data to compute the full likelihood for, (b) the choice of a sub-quadratic-cost emulator, and (c) integration with the full model, we show that there are conditions under which the emulated Bayesian model can be consistent with the full model, and that the full model is recovered as the amount of emulation decreases. We specify the details of the framework for models of general similarity matrices, and give an example of a Bayesian clustering model for genetics data. This allows us in principle to cluster "all the genomes in the world" at sub-quadratic computational cost, and we describe a tempered MCMC-like algorithm for finding the maximum a posteriori state that can be implemented on parallel architectures.
>
> Speaker: Yordan Raykov
> Title: Simplified deterministic algorithms for regularised nonparametric hierarchical hard clustering
> Abstract: https://dl.dropboxusercontent.com/u/11248994/Simplified%20deterministic%20algorithms%20for%20regularized%20nonparametric.pdf
You may leave the list at any time by sending the command
SIGNOFF allstat
to [log in to unmask], leaving the subject line blank.