732A91 Bayesian Learning
The course aims to give a solid introduction to the Bayesian approach to statistical inference, with a view towards applications in data mining and machine learning. After an introduction to the subjective probability concept that underlies Bayesian inference, the course moves on to the mathematics of prior-to-posterior updating in basic statistical models, such as the Bernoulli, normal and multinomial models. Linear regression and spline regression are also analyzed using a Bayesian approach. The course subsequently shows how complex models can be analyzed with simulation methods like Markov Chain Monte Carlo (MCMC). Bayesian prediction and marginalization of nuisance parameters are explained, and introductions to Bayesian model selection and Bayesian decision theory are also given.
- Introduction to subjective probability and the basic ideas behind Bayesian inference
- Prior-to-posterior updating in basic statistical models, such as the Bernoulli, normal and multinomial models (a minimal code sketch follows this list)
- Bayesian analysis of linear and nonlinear regression models
- Shrinkage, variable selection and other regularization priors
- Bayesian analysis of more complex models with simulation methods, e.g. Markov Chain Monte Carlo (MCMC)
- Bayesian prediction and marginalization of nuisance parameters
- Introduction to Bayesian model selection
- Introduction to Bayesian decision theory
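As a first taste of prior-to-posterior updating, here is a minimal R sketch for the Bernoulli model (the hyperparameters and data are invented for illustration, not course data): with s successes in n trials and a Beta(a, b) prior, the posterior is Beta(a + s, b + n - s).
a <- 2; b <- 2 # prior hyperparameters Beta(a, b), example values
s <- 14; n <- 20 # hypothetical data: 14 successes in 20 trials
thetaGrid <- seq(0.001, 0.999, length = 1000)
prior <- dbeta(thetaGrid, a, b)
posterior <- dbeta(thetaGrid, a + s, b + n - s) # conjugate Beta update
plot(thetaGrid, posterior, type = "l", xlab = "theta", ylab = "density")
lines(thetaGrid, prior, lty = 2) # dashed curve is the prior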
Intended audience and admission requirements
This course is given primarily for students on the Master's programme in Statistics and Data Mining. It is also offered to Master's students in other subjects and to interested Ph.D. students (with a more advanced examination).
Students admitted to the Master's programme in Statistics and Data Mining fulfill the admission requirements for the course.
Students not admitted to the Master's programme in Statistics and Data Mining should have passed:
- an intermediate course in probability and statistical inference
- a basic course in mathematical analysis
- a basic course in linear algebra
- a basic course in programming
The TimeEdit schedule for the course is available here.
The labs should be done in pairs, whereas the project is individual work.
All labs and the project should be submitted as PDFs through LISAM.
Module 1 - The Bayesics
Lecture 1: Basic concepts. Likelihood. The Bernoulli model. The Gaussian model.
Read: BDA Ch. 1, 2.1-2.5 | Slides
Code: Beta density | Bernoulli model | One-parameter Gaussian model
Lecture 2: Conjugate priors. The Poisson model. Prior elicitation. Noninformative priors.
Read: BDA Ch. 2.6-2.9 | Slides
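As a complement to the reading, a minimal sketch of a conjugate update for the Poisson model (the counts and the Gamma hyperparameters are invented for illustration): with data y1, ..., yn and a Gamma(alpha, beta) prior, the posterior is Gamma(alpha + sum(y), beta + n).
y <- c(3, 5, 2, 4, 6) # hypothetical count data
alpha <- 2; beta <- 1 # Gamma(alpha, beta) prior, rate parameterization
alphaPost <- alpha + sum(y) # posterior shape
betaPost <- beta + length(y) # posterior rate
thetaDraws <- rgamma(10000, shape = alphaPost, rate = betaPost) # posterior simulation
quantile(thetaDraws, c(0.025, 0.975)) # 95% equal-tail credible interval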
Lecture 3: Multi-parameter models. Marginalization. Multinomial model. Multivariate normal model.
Read: BDA Ch. 3. | Slides
Code: Two-parameter Gaussian model | Prediction with two-parameter Gaussian model | Multinomial model
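The multinomial model with a Dirichlet(alpha_1, ..., alpha_K) prior has a Dirichlet posterior with the counts added to the prior parameters. Base R has no Dirichlet sampler, but normalized Gamma draws give one; the sketch below uses invented counts and a uniform prior.
y <- c(38, 27, 18, 17) # hypothetical category counts
alphaPost <- rep(1, 4) + y # uniform Dirichlet prior plus counts
nDraws <- 5000
thetaDraws <- sapply(alphaPost, function(a) rgamma(nDraws, shape = a))
thetaDraws <- thetaDraws / rowSums(thetaDraws) # each row is a Dirichlet draw
colMeans(thetaDraws) # posterior means of the category probabilities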
Math exercises 1: One-parameter models.
Problem set 1 | Solutions to Problems 2 and 4 | Solutions to Problems 1 and 3 (Problem 3 is marked as Problem 2 in this solution)
Lab 1: Exploring posterior distributions in one-parameter models by simulation and direct numerical evaluation.
Lab 1 | LISAM Submission
Module 2 - Bayesian Regression and Classification
Lecture 4: Prediction. Making decisions.
Read: BDA Ch. 9.1-9.2. | Slides
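A hedged sketch of simulating a posterior predictive distribution (all numbers invented): for the Gaussian model with known sigma and posterior theta | y ~ N(muPost, tauPost^2), draw theta first and then a new observation given theta.
muPost <- 3.2; tauPost <- 0.4 # hypothetical posterior mean and sd of theta
sigma <- 1.5 # known data standard deviation
nDraws <- 10000
thetaDraws <- rnorm(nDraws, muPost, tauPost) # step 1: draw the parameter
yPred <- rnorm(nDraws, thetaDraws, sigma) # step 2: draw new data given theta
hist(yPred, breaks = 50, main = "Posterior predictive distribution")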
Lecture 5: Linear Regression. Nonlinear regression. Regularization priors.
Read: BDA Ch. 14 and Ch. 20.1-20.2 | Slides
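A minimal sketch of Bayesian linear regression with a ridge-type regularization prior beta ~ N(0, sigma2/lambda * I), treating sigma2 as known to keep the example short; the data are simulated and lambda = 4 is an arbitrary choice.
set.seed(1)
n <- 100; p <- 3
X <- cbind(1, matrix(rnorm(n * (p - 1)), n, p - 1)) # design matrix with intercept
y <- X %*% c(1, 2, -1) + rnorm(n) # simulated data with sigma2 = 1
sigma2 <- 1; lambda <- 4
OmegaPost <- t(X) %*% X + lambda * diag(p) # posterior precision (scaled by 1/sigma2)
muPost <- solve(OmegaPost, t(X) %*% y) # posterior mean, shrunk toward zero
covPost <- sigma2 * solve(OmegaPost) # posterior covariance
betaDraws <- t(muPost[, 1] + t(chol(covPost)) %*% matrix(rnorm(p * 1000), p, 1000))
colMeans(betaDraws) # posterior draws average to roughly muPost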
Lecture 6: Classification. Posterior approximation. Logistic regression. Naive Bayes.
Read: BDA Ch. 16.1-16.3 | Slides
Code: Logistic and Probit Regression
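The posterior approximation part of this lecture can be sketched as a Laplace (normal) approximation: find the posterior mode of the logistic regression coefficients with optim and use the inverse of the negative Hessian as the approximate posterior covariance. The data are simulated and the N(0, 10^2 I) prior is an illustrative choice, not necessarily the course's.
set.seed(2)
n <- 200
X <- cbind(1, rnorm(n)) # intercept plus one covariate
y <- rbinom(n, 1, 1 / (1 + exp(-X %*% c(-0.5, 1)))) # simulated binary responses
tau <- 10 # prior: beta ~ N(0, tau^2 I)
logPost <- function(b, y, X, tau) {
  eta <- X %*% b
  sum(y * eta - log(1 + exp(eta))) + sum(dnorm(b, 0, tau, log = TRUE))
}
opt <- optim(c(0, 0), logPost, y = y, X = X, tau = tau, method = "BFGS",
             control = list(fnscale = -1), hessian = TRUE) # maximize the log posterior
postMode <- opt$par # posterior mode
postCov <- solve(-opt$hessian) # approximate posterior covariance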
Math exercises 2: Predictive distributions and decisions.
Problem set 2 | Solutions to Problems 1-3
Lab 2: Polynomial regression and classification with logistic regression
Lab 2 | Linköping temperature data | Women work data | LISAM Submission
Module 3 - More Advanced Models, MCMC and Variational Bayes
Lecture 7: Bayesian computations. Monte Carlo simulation. Gibbs sampling. Data augmentation.
Read: BDA Ch. 10-11 | Slides
Code: Gibbs sampling for a bivariate normal | Gibbs sampling for a mixture of normals
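To give a flavor of Gibbs sampling before looking at the linked code, here is a minimal sampler for a bivariate normal with unit variances and correlation rho (rho = 0.8 is an arbitrary illustration); the full conditionals are x1 | x2 ~ N(rho*x2, 1 - rho^2) and symmetrically for x2.
rho <- 0.8
nIter <- 5000
draws <- matrix(0, nIter, 2)
x2 <- 0 # initial value
for (i in 1:nIter) {
  x1 <- rnorm(1, rho * x2, sqrt(1 - rho^2)) # draw x1 given x2
  x2 <- rnorm(1, rho * x1, sqrt(1 - rho^2)) # draw x2 given x1
  draws[i, ] <- c(x1, x2)
}
cor(draws) # off-diagonal should be close to rho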
Lecture 8: MCMC and Metropolis-Hastings
Read: BDA Ch. 11 | Slides
Code: Simulating Markov Chains
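A random-walk Metropolis sketch for a generic log density (here a standard normal target stands in for a posterior; the proposal standard deviation of 1 is a tuning choice):
logTarget <- function(x) dnorm(x, log = TRUE) # stand-in for a log posterior
nIter <- 10000
x <- numeric(nIter) # chain starts at 0
for (i in 2:nIter) {
  prop <- rnorm(1, x[i - 1], 1) # symmetric random-walk proposal
  logRatio <- logTarget(prop) - logTarget(x[i - 1])
  x[i] <- if (log(runif(1)) < logRatio) prop else x[i - 1] # accept or stay
}
c(mean(x), var(x)) # should approach 0 and 1 for the standard normal target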
Lecture 9: Variational Bayes. Stan.
Read: BDA Ch. 13.7 and RStan vignette | Slides
Code: RStan - Bernoulli model | RStan - Logistic regression | RStan - Logistic regression with random effects | RStan - Poisson model
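For orientation, a minimal RStan run for the Bernoulli model, in the spirit of the linked examples but with invented data (requires the rstan package; the uniform Beta(1, 1) prior is an illustrative choice):
library(rstan)
stanCode <- "
data { int<lower=0> N; int<lower=0, upper=1> y[N]; }
parameters { real<lower=0, upper=1> theta; }
model { theta ~ beta(1, 1); y ~ bernoulli(theta); }
"
dataList <- list(N = 10, y = c(1, 1, 0, 1, 0, 1, 1, 0, 1, 1)) # made-up data
fit <- stan(model_code = stanCode, data = dataList, iter = 2000, chains = 4)
print(fit) # posterior summary for theta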
Math exercises 3: Comparing Bayesian and frequentist inference. Posterior approximation. Naive Bayes.
Problem set 3 | Sketchy solutions (better versions will come soon) | Some, but not all, solutions typed up for this exercise
Lab 3: Gibbs sampling for the normal model, mixture of normals and probit regression.
Lab 3 | Rainfall data | LISAM Submission
Module 4 - Model Inference and Variable Selection
Lecture 10: Bayesian model comparison
Read: BDA Ch. 7 | Slides
Code: Comparing models for count data
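The marginal likelihood that underlies Bayesian model comparison is available in closed form for the Poisson model with a Gamma(alpha, beta) prior; the function below is a sketch with invented counts, comparing two prior settings through a Bayes factor.
logMargLik <- function(y, alpha, beta) { # Poisson likelihood, Gamma(alpha, beta) prior
  n <- length(y); s <- sum(y)
  alpha * log(beta) - lgamma(alpha) + lgamma(alpha + s) -
    (alpha + s) * log(beta + n) - sum(lgamma(y + 1))
}
y <- c(0, 2, 1, 3, 2, 1, 0, 2) # hypothetical count data
logBF <- logMargLik(y, 1, 1) - logMargLik(y, 10, 2) # two competing prior settings
exp(logBF) # Bayes factor in favor of the first model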
Lecture 11: Computing the marginal likelihood, Bayesian variable selection, model averaging.
Read: Article on variable selection for additional reading | Slides
Lecture 12: Model evaluation and course summary.
Read: BDA 6.1-6.4 | Slides
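Model evaluation in the spirit of BDA Ch. 6 can be sketched as a posterior predictive check: simulate replicated data sets from the posterior predictive distribution and compare a test statistic with its observed value. The Gaussian model with known sigma, a flat prior on theta, and invented data keep the sketch short.
y <- c(2.1, 3.4, 2.8, 5.9, 3.1, 2.5) # hypothetical data
sigma <- 1 # known sd; a flat prior gives theta | y ~ N(mean(y), sigma^2/n)
nRep <- 5000
Tobs <- max(y) # test statistic: the sample maximum
Trep <- replicate(nRep, {
  theta <- rnorm(1, mean(y), sigma / sqrt(length(y))) # draw the parameter
  max(rnorm(length(y), theta, sigma)) # statistic in replicated data
})
mean(Trep >= Tobs) # posterior predictive p-value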
Math exercises 4: Model comparison.
Problem set 4 | Solutions to problem set 4
Lab 4: Metropolis-Hastings for Poisson regression.
Lab 4 | eBay data | LISAM Submission
Course literature
- Bayesian Data Analysis by Gelman, Carlin, Stern, Dunson, Vehtari and Rubin, Chapman & Hall, third edition. The book's web site can be found here.
- My slides.
The examination for the course Bayesian Learning, 6 credits, consists of:
- Written reports on the four computer labs (3 credits)
- Computer exam (3 credits)
The following material will be made available in the computer exam system in the folder given_files:
- Slides from all 12 lectures in PDF format
- The four computer lab exercises in PDF format
- Four pages with distributions from the appendix of the course book
- A page with useful probability and math results
- A base R cheat sheet and an R Markdown cheat sheet
- Your solutions to the four computer labs that you submitted through LISAM (see below)
In addition, you will be able to install any R package from CRAN.
Here is the LISAM submission for the exam on 2017-08-16. Remember to submit your lab solutions before the deadline, 2017-08-13 at 6 PM.
Here is information about how to log in to and out of the exam system: Step-by-step tutorial for the computer exam system
The exam system includes a so-called communication client. This page contains information about the communication client. The communication client serves three basic purposes:
- To ask the teacher questions, and for the teacher to send out information to students during the exam. I will also visit the exam room regularly, but do use the communication client to ask questions. To submit a question, just click on the button marked by a dotted blue rectangle in this picture
- To get the Client ID (Klient ID in Swedish). This is the area marked by a dashed red rectangle in this picture. This is your personal identification number, which you should write on every page of the papers that you submit for the math questions. Do not write your name on the papers, as that would reveal your identity to the grading teacher.
- To submit your (electronic) solutions to the exam. You have only a single submission for the whole exam. The exam is submitted by clicking the button marked by a solid green rectangle in this picture and following the instructions. Note that the system will tell you that the exam has been submitted but will not separately confirm that it was received; this is expected, and your solution has in fact been received.
- There are three basic folders for the students:
- /home/student_tilde (home folder where the student can save files)
- /home/student_tilde/given_files (the folder where files distributed to everyone are placed, for example the exam itself or slides)
- /home/student_tilde/my_given_files (the folder where files distributed to individual students are placed, for example the student's lab solutions)
- You may have to copy files (e.g. data files) from given_files to your home folder, or read them using the full path /home/student_tilde/given_files.
- Most students will save plots directly from RStudio using point-and-click from the menus, but you can also save a plot to a png file using these commands:
png("myplot.png") # open a png graphics device; the file name is just an example
plot(rnorm(4)) # change this to the plot commands you want
dev.off() # close the device to write the file
The course TDDE07 will be graded on the (U,3,4,5) scale.
Here is a picture that shows the percentage of the maximum score required for each grade (732A91 to the left, TDDE07 to the right).
Old exams with solutions
R and other resources
- The main page with links to downloads for the programming language R
- RStudio - a very nice development environment for R.
- Short introduction to R | A little longer introduction | John Cook's intro to R for programmers.
- Informative clickable chart with relations between distributions: http://www.johndcook.com/distribution_chart.html.
- Learning about the prior-to-posterior mapping:
- The Feynman technique to learning: 5 min Youtube video.
Page responsible: Mattias Villani
Last updated: 2017-08-14