# MATH 653: Bayesian Inference

Credits: 10

Tutor: Christian Rohrbeck

Outline: The module presents the key tools for Bayesian inference. It will cover:

• Key differences and similarities between the classical and Bayesian approaches;
• Derivation of conjugate priors, posterior distributions, predictive distributions and marginal likelihoods for: a) one-parameter likelihoods from the exponential family; b) likelihoods from Gaussian or regression models;
• Elicitation of priors;
• Laplacian and Jeffreys' priors for objective Bayesian inference;
• Importance and rejection sampling;
• Bayesian decision theory.
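As a taste of the conjugate-prior material above, here is a minimal sketch of the Beta-Binomial case: a Beta(a, b) prior on a Binomial success probability yields a Beta posterior in closed form, along with the posterior predictive probability and the marginal likelihood. (The module's computing is done in R; this standalone illustration uses Python with only the standard library, and the function names are ours, not the module's.)

```python
# Conjugate Beta-Binomial update: prior Beta(a, b), likelihood Binomial(n, theta),
# observed y successes in n trials. Illustrative sketch only.
import math

def beta_fn(a, b):
    """Beta function B(a, b), computed via log-gamma for numerical stability."""
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

def beta_binomial_update(a, b, y, n):
    """Return posterior parameters, posterior mean, predictive probability of
    success on one future trial, and the marginal likelihood p(y)."""
    a_post, b_post = a + y, b + (n - y)            # conjugate update
    post_mean = a_post / (a_post + b_post)         # E[theta | y]
    predictive = post_mean                         # P(next trial succeeds | y)
    marginal = math.comb(n, y) * beta_fn(a_post, b_post) / beta_fn(a, b)
    return a_post, b_post, post_mean, predictive, marginal

# Uniform prior Beta(1, 1) with 7 successes in 10 trials -> posterior Beta(8, 4);
# under a uniform prior the marginal likelihood is 1/(n + 1) for every y.
a_post, b_post, post_mean, pred, marg = beta_binomial_update(1, 1, 7, 10)
```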

Objectives: This module aims to introduce the Bayesian view of statistics, stressing its philosophical contrasts with classical statistics, its facility for incorporating information beyond the data into the analysis, and its coherent approach to inference and model selection.

Learning Outcomes:
On successful completion of this module students will be able to:

• For any one-parameter likelihood from the exponential family, or a two-parameter Gaussian or regression likelihood:
  • identify and interpret the conjugate prior;
  • calculate and interpret the posterior distribution;
  • calculate and interpret the posterior predictive distribution;
  • derive, calculate and interpret the marginal likelihood;
• For a given likelihood and prior, simulate from the posterior and predictive distributions and interpret the results;
• Distinguish between situations where a subjective and where an objective Bayesian approach is appropriate;
• Formulate Laplacian and Jeffreys' priors, and derive and illustrate their properties;
• Calculate and interpret the Bayes rule and Bayes risk for a variety of loss functions;
• Use Bayes risk to derive the cost of an observational or experimental procedure;
• Recognise when an importance sampling estimate is necessary, derive its properties, and implement it efficiently in R;
• Show that rejection sampling draws from the correct distribution, and implement it so that the acceptance rate is maximised;
• State key differences and similarities between the Bayesian and the classical approaches to statistical inference and decision making;
• Construct a mature and well-structured statistical report around a scientific question and a set of measurements.
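The rejection-sampling outcome above can be sketched briefly: to sample a target density f with a proposal density g, choose the smallest envelope M with f ≤ M·g, since the acceptance rate is 1/M. Below, a Beta(2, 2) target (density 6x(1−x) on [0, 1]) is sampled with a Uniform(0, 1) proposal and the tight envelope M = 1.5. (The module uses R; this is an illustrative standard-library Python sketch with names of our choosing.)

```python
# Rejection sampling for a Beta(2, 2) target with a Uniform(0, 1) proposal.
# The tight envelope M = sup f = 1.5 maximises the acceptance rate 1/M = 2/3.
import random

def rejection_sample_beta22(n, seed=0):
    rng = random.Random(seed)
    M = 1.5                              # sup of the target density 6x(1-x)
    samples = []
    while len(samples) < n:
        x = rng.random()                 # proposal draw ~ Uniform(0, 1)
        u = rng.random()                 # uniform for the acceptance test
        if u * M <= 6 * x * (1 - x):     # accept with probability f(x)/(M*g(x))
            samples.append(x)
    return samples

draws = rejection_sample_beta22(20000)
mean = sum(draws) / len(draws)           # should be close to E[X] = 0.5
```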

Core texts:

• P. Hoff (2009), A First Course in Bayesian Statistical Methods, Springer.
• P.M. Lee (1997), Bayesian Statistics: An Introduction, 2nd edition, Edward Arnold.

Assessment: Assessment will be through a combination of summer exam (80%) and exercises (20%).

Contact hours: There will be a mixture of lectures, tutorials and computer workshops totalling approximately 25 contact hours. In addition, private study will make up the majority of the learning hours.