Greetings, and apologies for cross-posting...
We are announcing the following courses, which are scheduled to take place at the Statistical Services Centre in November 2014. Summary information is given below. For registration forms, please see http://www.reading.ac.uk/ssc/ or email [log in to unmask], providing your address and/or fax number.
****************************************************************************
Day 1: Bayesian Modeling, Inference and Prediction; 25 November 2014
****************************************************************************
Day 2: Bayesian Hierarchical Modeling; 26 November 2014
****************************************************************************
Day 3: Bayesian Model Specification; 27 November 2014
****************************************************************************
Day 4: Practical Bayesian Nonparametric Analysis; 28 November 2014
****************************************************************************
Prices: 370 GBP for any one day; 695 GBP for any two days; 995 GBP for any three days; 1285 GBP for all four days.
A 30% academic discount is available for these courses, and a special discount of 50% will be given to students for these specific courses. Please indicate that you wish to apply for a discount when you register, together with information supporting your eligibility. [Terms and conditions apply.]
Please view http://www.reading.ac.uk/ssc/n/Short%20Courses/bayesianmodelling.htm for more details. Further information is also given below.
**********************************************************************************************************************
Course outline
**************
This is a short course that offers you flexibility: you can take
* the first day, an introductory course on Bayesian modeling, inference and prediction;
* the second day, an intermediate-level course on Bayesian hierarchical modeling;
* the third day, an intermediate-level course on Bayesian model specification;
* the fourth day, an introduction to practical Bayesian non-parametric (BNP) analysis and why it is helpful to know how to reason in a BNP manner;
* or any two days, any three days or all four days.
Day 1 of this course will:
*********************
(1) Compare and contrast the frequentist and Bayesian conceptions of probability, highlighting the strengths and weaknesses of both;
(2) Review maximum-likelihood fitting of statistical models;
(3) Show you how to obtain Bayesian solutions to inferential and predictive problems analytically and in closed form (when such solutions are available); and
(4) Introduce you to simulation-based Bayesian model-fitting using Markov chain Monte Carlo (MCMC) methods, in the freeware packages WinBUGS and R, when closed-form solutions are not available.
Day 2 of this course will:
*********************
(1) Introduce Bayesian hierarchical modeling via meta-analysis, the study of how information can be combined across experiments to provide better summaries than those obtained by examining one experiment at a time;
(2) Discuss the critical role played by the choice of prior distributions in Bayesian hierarchical models;
(3) Illustrate the use of latent variables (random effects) as an approach to describing unexplained heterogeneity; and
(4) Explore two in-depth case studies involving random-effects Poisson regression and mixed-effects logistic regression.
Day 3 of this course will:
*********************
(1) Provide an overview of the process of Bayesian model specification;
(2) Introduce five basic principles -- the Calibration Principle, the Modeling-As-Decision Principle, the Prediction Principle, the Inference-Versus-Decision Principle and Cromwell's Rule (Parts 1 and 2) -- and show you how they inform good Bayesian model building and model criticism; and
(3) Introduce you to Calibration Cross-Validation, Bayes factors, BIC, DIC and log scoring (in WinBUGS and R) as methods for finding good Bayesian models.
Day 4 of this course will:
*********************
(1) Introduce Bayesian non-parametric modeling as a principled approach to solving the basic problem of well-calibrated quantification of model uncertainty;
(2) Provide a thorough and practical grounding in the use of Dirichlet-process priors and Polya-tree priors for placing distributions, in scientifically-meaningful ways, on cumulative distribution functions (CDFs) on the real line; and
(3) Offer a complete review of available freeware in R and WinBUGS for performing practical Bayesian non-parametric analyses.
All four days of the course will be based on a series of practical real-world case studies.
***********************************************************************************************************************
Who should attend?
******************
Statisticians, biostatisticians, epidemiologists, data analysts, data-miners, and machine-learning specialists who wish to broaden and deepen:
(a) their understanding of Bayesian methods and
(b) their toolkits for using Bayesian models to find meaningful patterns, arrive at statistically sound inferences and make better decisions.
Some graduate coursework in statistics (or an allied field such as biostatistics, epidemiology or machine learning) will provide sufficient mathematical background for participants. To get the most out of the course, participants should be comfortable with hearing the course presenter discuss:
(a) differentiation and integration of functions of several variables and
(b) discrete and continuous probability distributions (joint, marginal, and conditional) for several variables at a time. All necessary concepts will be approached in a sufficiently intuitive manner that rustiness on these topics will not prevent understanding of the key ideas.
The first day of the course will assume no previous exposure to Bayesian ideas or methods. Participants interested in entering the course on the second, third or fourth day should ideally have had exposure to the ideas on the days preceding their entry day.
***********************************************************************************************************************
How you will benefit
*******************
You will:
(1) Gain a deeper understanding of maximum-likelihood-based methods and when they can be expected to behave in a sub-optimal manner;
(2) Broaden and deepen your facility in the fitting and interpretation of Bayesian models to solve important problems in science, public policy and business; and
(3) Learn how to write your own programs in WinBUGS and R to fit Bayesian models in your own work.
***********************************************************************************************************************
Course content
***************
Day 1: Bayesian Modeling, Inference and Prediction
******************************************
* Background and basics: strengths and weaknesses of the classical, frequentist and Bayesian probability paradigms
* Sequential learning via Bayes' Theorem
* Coherence as a form of internal calibration
* Bayesian decision theory via maximization of expected utility
* Review of frequentist modeling and maximum-likelihood inference
* Exchangeability as a Bayesian concept parallel to frequentist independence
* Prior, posterior, and predictive distributions
* Bayesian conjugate analysis of binary outcomes, and comparison with frequentist modeling
* Conjugate analysis of integer-valued outcomes (Poisson modeling)
* Conjugate analysis of continuous outcomes (Gaussian modeling)
* Multivariate unknowns and marginal posterior distributions
* Introduction to simulation-based computation, including rejection sampling and Markov chain Monte Carlo (MCMC) methods
* MCMC implementation strategies.
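To give a flavour of the conjugate analysis of binary outcomes listed above, here is a minimal sketch of a Beta-Binomial posterior update. This is not course material: the data are invented, and the sketch is in Python, whereas the course itself works in R and WinBUGS.

```python
import random

# Invented data for illustration: y successes in n binary trials
y, n = 7, 10

# Uniform Beta(1, 1) prior; conjugacy gives a Beta posterior in closed form
a_post, b_post = 1 + y, 1 + (n - y)

# Closed-form posterior mean
post_mean = a_post / (a_post + b_post)
print(f"posterior: Beta({a_post}, {b_post}), mean = {post_mean:.3f}")

# Simulation-based check, in the spirit of the course's MCMC material:
# draw from the posterior and compare the sample mean to the exact value
random.seed(1)
draws = [random.betavariate(a_post, b_post) for _ in range(100_000)]
print(f"simulation-based mean = {sum(draws) / len(draws):.3f}")
```

The exact and simulation-based answers agree here because the posterior is available in closed form; the course's MCMC methods target exactly the cases where it is not.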
Day 2: Bayesian Hierarchical Modeling
********************************
* Bayesian hierarchical modeling
* Hierarchical modeling with latent variables as an approach to mixture modeling
* Bayesian fixed- and random-effects meta-analysis
* Prior distributions in Bayesian hierarchical modeling
* Bayesian fitting of random-effects and mixed models
* Comparison of likelihood-based and Bayesian methods for fitting hierarchical models: circumstances in which likelihood-based fitting can be poorly calibrated.
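The shrinkage at the heart of the random-effects meta-analysis topic above can be sketched in a few lines under a normal-normal hierarchical model. This is an illustrative sketch only: the study estimates and standard errors are invented, the between-study standard deviation is fixed rather than given a prior (the course treats its prior distribution in detail), and the sketch is in Python rather than the course's R/WinBUGS.

```python
# Invented meta-analysis data: per-study effect estimates and standard errors
y = [2.8, 0.5, 1.9, 1.2]
se = [0.9, 1.1, 0.8, 1.0]
tau = 1.0  # between-study SD, fixed here for simplicity

# Precision-weighted pooled mean under the normal-normal hierarchical model
w = [1.0 / (s**2 + tau**2) for s in se]
mu_hat = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

# Conditional posterior means of the study effects shrink toward mu_hat:
# noisier studies (larger se) are shrunk more
B = [s**2 / (s**2 + tau**2) for s in se]  # shrinkage factors in (0, 1)
theta = [(1 - b) * yi + b * mu_hat for b, yi in zip(B, y)]
print(f"pooled mean = {mu_hat:.3f}")
print("shrunken study effects:", [round(t, 3) for t in theta])
```

Each shrunken estimate lies between its raw study estimate and the pooled mean, which is the "borrowing strength" that makes the combined summary better than one-experiment-at-a-time analysis.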
Day 3: Bayesian Model Specification
******************************
* The big picture in Bayesian model specification
* The Calibration Principle, the Modeling-As-Decision Principle, the Prediction Principle, the Inference-Versus-Decision Principle, and Cromwell's Rule (Parts 1 and 2)
* Model expansion as a tool for improving Bayesian modeling: embedding a deficient model in a larger class of models of which it's a special case
* Methods for finding good Bayesian models: Calibration Cross-Validation, Bayes factors, BIC, DIC and log scores
* A generic Bayesian model-search algorithm
* False positive/false negative trade-offs in comparing {Bayes factors, BIC} and {DIC, log scores} on their ability to correctly discriminate between models.
Day 4: Practical Bayesian Nonparametric Analysis
****************************************
* How de Finetti's representation theorems for real-valued exchangeable outcomes lead directly to the need to place prior distributions on CDFs
* How such priors can be constructed with Dirichlet-process priors, Polya trees and Dirichlet-process mixture modeling, and how this can be done in a scientifically meaningful way
* A thorough review, with many practical examples, of currently available freeware in R and WinBUGS for performing successful Bayesian non-parametric (BNP) analyses
* The relationship between the frequentist bootstrap and Dirichlet-process posterior distributions; this connection permits BNP analyses to be performed at clock speed-ups of more than 30 to 1 when compared with standard BNP computing methods
* How to parallelize the bootstrap in R, to achieve even greater speed-ups in performing approximate BNP analyses.
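One simple instance of the bootstrap/Dirichlet-process connection above is the Bayesian bootstrap, which resamples by drawing Dirichlet(1, ..., 1) weights over the observed data. The sketch below (in Python, with invented data; the course materials use R and WinBUGS) generates draws from the implied posterior of the mean:

```python
import random

random.seed(2)
data = [3.1, 0.4, 1.7, 2.2, 0.9, 2.8]  # invented sample for illustration

def bayesian_bootstrap_mean(x):
    # Dirichlet(1, ..., 1) weights via normalized Exp(1) draws; each call
    # yields one posterior draw of the mean under the Bayesian bootstrap
    g = [random.expovariate(1.0) for _ in x]
    total = sum(g)
    return sum(gi / total * xi for gi, xi in zip(g, x))

draws = [bayesian_bootstrap_mean(data) for _ in range(20_000)]
post_mean = sum(draws) / len(draws)
print(f"posterior mean of the mean = {post_mean:.3f}")
```

Because each posterior draw is just a weighted average of the data, the loop is embarrassingly parallel, which is what makes the parallelization mentioned in the last bullet straightforward.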
**********************************************************************************************************************
Guest Presenter
****************
David Draper is a Professor of Statistics in the Department of Applied Mathematics and Statistics at the University of California, Santa Cruz (USA); he is also the Senior Director of the Center of Excellence in Statistical Research at eBay Research Labs in San Jose, CA.
He is a Fellow of the American Association for the Advancement of Science, the American Statistical Association (ASA), the Institute of Mathematical Statistics, and the Royal Statistical Society; from 2001 to 2003 he served as the President-Elect, President, and Past President of the International Society for Bayesian Analysis (ISBA).
He is the author or co-author of about 145 contributions to the methodological and applied statistical literature, including articles in the Journal of the Royal Statistical Society (Series A, B and C), the Journal of the American Statistical Association, the Annals of Applied Statistics, Bayesian Analysis, Statistical Science, the New England Journal of Medicine, and the Journal of the American Medical Association; his 1995 JRSS-B article on assessment and propagation of model uncertainty has been cited almost 1,300 times, and taken together his publications have been cited more than 9,900 times.
His research is in the areas of Bayesian inference and prediction, model uncertainty and empirical model-building, hierarchical modeling, Markov Chain Monte Carlo methods, and Bayesian nonparametric methods, with applications mainly in medicine, health policy, education, environmental risk assessment and eCommerce.
His short courses have received Excellence in Continuing Education Awards from the American Statistical Association on two occasions, corresponding to days 1 and 2 of this course.
He has won or been nominated for major teaching awards everywhere he has taught (the University of Chicago; the RAND Graduate School of Public Policy Studies; the University of California, Los Angeles; the University of Bath (UK); and the University of California, Santa Cruz).
He has a particular interest in the exposition of complex statistical methods and ideas in the context of real-world applications.
Course Materials
****************
Please note that there will not be any printed notes for this course; a web link to the materials (PDF files with course notes, and .txt files containing data sets and R and WinBUGS code) will be provided to registered participants several weeks before the course takes place. Web links to the 2013 materials (days 1 to 3 only) can be found below:
Day one - http://users.soe.ucsc.edu/~draper/Reading-2013-Day-1.html
Day two - http://users.soe.ucsc.edu/~draper/Reading-2013-Day-2.html
Day three - http://users.soe.ucsc.edu/~draper/Reading-2013-Day-3.html
Location
*********
The Statistical Services Centre at the Whiteknights campus, University of Reading, is in a prime location in the South-East of England and has excellent transport links. The University is close to the M4 motorway, allowing easy access by car. Reading's railway station has high-speed links to and from London Paddington, as well as regular services to and from other cities around the UK. There are direct services to and from both London Heathrow and London Gatwick Airports. For further details view: http://www.reading.ac.uk/ssc/n/visiting.htm.
Statistical Services Centre
University of Reading
Harry Pitt Building
Whiteknights Road
Reading RG6 6FN
UK
e-mail: [log in to unmask]
Tel: +44(0)118 378 8689
Fax: +44(0)118 378 8458
www.reading.ac.uk/ssc
-------------------------------------------------------------------
This list is for discussion of modelling issues and the BUGS software.
For help with crashes and error messages, first mail [log in to unmask]
To mail the BUGS list, mail to [log in to unmask]
Before mailing, please check the archive at www.jiscmail.ac.uk/lists/bugs.html
Please do not mail attachments to the list.
To leave the BUGS list, send LEAVE BUGS to [log in to unmask]
If this fails, mail [log in to unmask], NOT the whole list