
Department of Mathematical Sciences

# MATH3341/4031 Bayesian Statistics III/IV

This course provides an overview of theoretical, algorithmic, and practical aspects of the Bayesian approach to statistical inference, introduced for simple models in Statistical Concepts II. Prior knowledge and statistical models for data are combined to provide a flexible and powerful probabilistic approach to knowledge representation. The Bayesian approach has become an important practical tool for data analysis and modelling of complex situations, thanks largely to the use of Markov chain Monte Carlo methodology made possible by the steady increase in power of readily available computers.

The first term will discuss the rational basis for Bayesian inference and decision-making, the role of exchangeability, exponential families and conjugate families of prior distributions. It will then turn to the need for more complex models, and to the theory and practice of building and representing such models for realistic situations.

The second term will continue the study of models with large numbers of variables and complex dependencies, and will then examine the powerful computational techniques that have rendered such models useful in practice. In particular, it will study Markov chain Monte Carlo methods for generating random values from multivariate distributions.
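To illustrate the idea of generating random values from a multivariate distribution by Markov chain simulation, the following sketch (not part of the course materials; all names are illustrative) runs a two-component Gibbs sampler for a bivariate normal distribution with correlation `rho`, alternating draws from the two full conditional distributions:

```python
import random

def gibbs_bivariate_normal(n_samples, rho=0.8, seed=0):
    """Gibbs sampler for a bivariate normal with correlation rho.

    Alternates draws from the full conditionals
    X | Y = y ~ N(rho * y, 1 - rho^2) and
    Y | X = x ~ N(rho * x, 1 - rho^2),
    whose equilibrium distribution is the joint bivariate normal.
    """
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5  # conditional standard deviation
    x = y = 0.0
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw X given current Y
        y = rng.gauss(rho * x, sd)  # draw Y given new X
        samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(20000)
```

After a short burn-in, the sample correlation of the draws settles near `rho`, even though each step only ever samples one coordinate at a time.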

## Outline of Course

Aim: To provide an overview of the theoretical basis for Bayesian statistics and of practical Bayesian statistical methodology together with important applications.

### Term 1

• Review: Bayesian paradigm, conditional independence and conjugacy, manipulation of multivariate probability distributions.
• Foundations: rational basis for Bayesian inference and decision theory, exchangeability, parametric modelling.
• Exponential families: regular exponential families, canonical representation, maximum entropy, sufficiency, conjugacy, expectation.
• Hierarchical modelling: motivation, latent variables, random effects, conjugacy and semi-conjugacy.

### Term 2

• Bayesian graphical modelling: directed acyclic and undirected graphs, Bayesian and Markov networks, moral graph, separation theorem.
• Computation: Monte Carlo, Markov chain Monte Carlo (Markov chains, equilibrium distribution, Gibbs sampling), Metropolis-Hastings (Metropolis random walk, independence sampler, Gibbs sampling), other algorithms.
• Practicalities: specification of prior beliefs, analysis and interpretation of Markov chain Monte Carlo output.
• Model comparison: Bayes factors, criteria for model choice.
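The Metropolis random walk mentioned above can be sketched in a few lines. This is an illustrative toy (not course material) targeting a standard normal density: propose a Gaussian step and accept it with probability `min(1, pi(x') / pi(x))`, which requires the target density only up to a normalising constant.

```python
import math
import random

def metropolis_normal(n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler targeting a standard normal.

    Proposes x' = x + Normal(0, step) and accepts with probability
    min(1, pi(x') / pi(x)), where pi is the unnormalised N(0, 1) density.
    """
    rng = random.Random(seed)
    log_pi = lambda v: -0.5 * v * v  # log unnormalised N(0, 1) density
    x, samples = 0.0, []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)
        # Accept/reject on the log scale for numerical stability.
        if math.log(rng.random()) < log_pi(prop) - log_pi(x):
            x = prop
        samples.append(x)  # rejected proposals repeat the current state
    return samples

draws = metropolis_normal(20000)
mean = sum(draws) / len(draws)
```

Because rejected proposals repeat the current state, the chain's equilibrium distribution is the target; the choice of `step` trades off acceptance rate against how quickly the chain explores, one of the practicalities of analysing MCMC output listed above.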

### Prerequisites

For details of prerequisites, corequisites, excluded combinations, teaching methods, and assessment details, please see the Faculty Handbook: MATH3341, MATH4031.