Date | Lesson | Reading | Video | Slides | Slides (pdf) | Lab | HW

Module 0.1: Course overview

WEEK 1

MODULE 1: INTRODUCTION TO BAYESIAN INFERENCE

Mon, June 29
- Module 1.1: Building blocks of Bayesian inference
- Module 1.2: Probability review
- Lab 1: R review

MODULE 2: ONE-PARAMETER MODELS

Tues, June 30
- Module 2.1: Conjugacy; Beta-Bernoulli and Beta-Binomial models
- Module 2.2: Operationalizing data analysis; selecting priors
- Homework 1

Wed, July 1
- Drop/Add for Term II ends
- Module 2.3: Marginal likelihood and posterior prediction
- Module 2.4: Truncated priors and the inverse CDF method
- Lab 2: The Beta-Binomial model

Thurs, July 2
- Module 2.5: Frequentist vs. Bayesian intervals
- Module 2.6: Loss functions and Bayes risk

Fri, July 3
- Independence Day holiday observed

WEEK 2

Mon, July 6
- Module 2.7: Gamma-Poisson model I
- Module 2.8: Gamma-Poisson model II; finding conjugate distributions
- Lab 3: The Poisson model and posterior predictive checks

Tues, July 7
- Quiz I
- Homework 2

MODULE 3: MONTE CARLO AND MULTIPARAMETER MODELS

Wed, July 8
- Module 3.1: Monte Carlo approximation and sampling
- Module 3.2: Rejection sampling; importance sampling
- Lab 4: Prior selection and model reparameterization

Thurs, July 9
- Module 3.3: The normal model: introduction and motivating examples
- Module 3.4: The normal model: conditional inference for the mean

Fri, July 10
- Module 3.5: The normal model: joint inference for mean and variance
- Module 3.5b: The normal model: joint inference for mean and variance (illustration)
- Module 3.6: Noninformative and improper priors

WEEK 3

Mon, July 13
- Module 3.7: MCMC and Gibbs sampling I
- Module 3.8: MCMC and Gibbs sampling II
- Lab 5: Truncated data

Tues, July 14
- Module 3.9: MCMC and Gibbs sampling III
- Module 3.10: MCMC and Gibbs sampling IV
- Module 3.11: Discussion session exercise
- Homework 3

MODULE 4: MULTIVARIATE DATA

Wed, July 15
- Module 4.1: Multivariate normal model I
- Module 4.2: Multivariate normal model II
- Lab 6: Gibbs sampling with block updates

Thurs, July 16
- Review for midterm exam

Fri, July 17
- Midterm exam

WEEK 4

Mon, July 20
- Module 4.3: Multivariate normal model III
- Module 4.4: Multivariate normal model IV
- Lab 7: Introduction to Hamiltonian Monte Carlo

Tues, July 21
- Module 4.5: Missing data and imputation I
- Module 4.6: Missing data and imputation II
- Homework 4

MODULE 5: HIERARCHICAL MODELS

Wed, July 22
- Module 5.1: Hierarchical normal models with constant variance: two groups
- Module 5.2: Hierarchical normal models with constant variance: two groups (illustration)
- Module 5.3: Hierarchical normal models with constant variance: multiple groups
- No lab

Thurs, July 23
- Module 5.4: Hierarchical normal modeling of means and variances
- Module 5.5: Hierarchical normal modeling of means and variances (illustration)

MODULE 6: BAYESIAN LINEAR REGRESSION

Fri, July 24
- Module 6.1: Bayesian linear regression
- Module 6.2: Bayesian linear regression (illustration)

WEEK 5

Mon, July 27
- Last day to withdraw with a W
- Module 6.3: Bayesian linear regression: weakly informative priors
- Module 6.4: Bayesian hypothesis testing
- Lab 8: Hierarchical modeling

Tues, July 28
- Module 6.5: Bayesian model selection
- Module 6.6: Bayesian model selection (illustration)
- Homework 5

Wed, July 29
- Quiz II
- Lab 9: Bayesian (generalized) linear regression models

MODULE 7: METROPOLIS AND METROPOLIS-HASTINGS

Thurs, July 30
- Module 7.1: The Metropolis algorithm
- Module 7.2: Metropolis in action

Fri, July 31
- Module 7.3: The Metropolis-Hastings algorithm
- Module 7.4: Metropolis within Gibbs

MODULE 8: CATEGORICAL DATA AND MIXTURE MODELS

WEEK 6

Mon, Aug 3
- Module 8.1: The multinomial model
- Module 8.2: Finite mixture models: univariate categorical data
- Lab 10: Metropolis-Hastings

Tues, Aug 4
- Module 8.3: Finite mixture models: univariate continuous data
- Module 8.4: Finite mixture models: univariate continuous data (illustration)

Wed, Aug 5
- Module 8.5: Finite mixture models: multivariate categorical data
- Module 8.6: Finite mixture models: multivariate continuous data
- No lab

Thurs, Aug 6
- Course wrap-up and review for final exam

Fri, Aug 7 - Sun, Aug 9
- Final exam period