(Old Version - 2019 Fall)
STAT 4010
Bayesian Learning
Class Information
Class time: Th 1530-1815
Location: LSB LT5
Outline (Updated: 18 Nov 2019)
Instructor
Name: Kin Wai CHAN
Email: kinwaichan@cuhk.edu.hk
Office: LSB 115
Tel: 3943 7923
Office hour: I have an open-door policy. Feel free to drop by anytime to ask questions. (Of course, you may also make an appointment with me if you want a longer meeting.)
Teaching Assistants
Email: 1009633883@link.cuhk.edu.hk
Office: LSB 141
Tel: 3943 9306
Description
This course introduces basic Bayesian inference procedures and philosophy, emphasizing both conceptual foundations and implementation. It covers conjugate families of distributions, Bayesian credible regions, the Jeffreys prior, Markov chain Monte Carlo, the Bayes factor, the Bayesian information criterion, imputation, Bayesian linear-regression models, model averaging, hierarchical models, and empirical Bayes models. Hands-on implementation of estimation and inference procedures in R will be demonstrated in interactive sessions.
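As a small taste of the conjugate-family computations mentioned above, here is a minimal R sketch of Beta-Binomial conjugacy; the prior and data values are illustrative, not taken from the course materials:

```r
# Beta-Binomial conjugacy: with a Beta(a, b) prior on theta and
# y successes in n Bernoulli trials, the posterior is Beta(a + y, b + n - y).
a <- 1; b <- 1          # uniform Beta(1, 1) prior (illustrative)
y <- 7; n <- 10         # observed data (illustrative)
post_a <- a + y         # = 8
post_b <- b + n - y     # = 4
post_mean <- post_a / (post_a + post_b)       # posterior mean of theta
ci <- qbeta(c(0.025, 0.975), post_a, post_b)  # 95% equal-tailed credible interval
```

The same closed-form update underlies several of the conjugate pairs covered in the notes.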
Textbooks
Self-contained lecture notes are the main reference. Complementary textbooks include
(Major) Hoff, P.D. (2009). A First Course in Bayesian Statistical Methods. Springer. (Free online access via CUHK library)
(Minor) Wasserman, L. (2004). All of Statistics: A Concise Course in Statistical Inference. Springer.
(Minor) Albert, J. (2007). Bayesian Computation with R. Springer.
Learning outcomes
Upon finishing the course, students are expected to
distinguish between frequentist and Bayesian methods, and identify the pros and cons of each;
derive posterior distribution from commonly-used prior and sampling distributions;
perform Markov chain Monte Carlo for Bayesian inference in R;
build simple Bayesian models for solving real problems; and
select appropriate Bayesian tools for different statistical tasks, e.g., estimation, testing, model selection, prediction, etc.
Assessment and Grading
There are three main assessment components, plus a bonus component.
a (out of 100) is the average score of approximately eight assignments, with the lowest two scores dropped;
m (out of 100) is the score of the mid-term exam; and
f (out of 100) is the score of the final exam.
b (out of 10) is a bonus, given to students who actively participate in class.
The total score (out of 100) is given by
T = min{100, 0.3a + 0.2 max(m, f) + 0.5f + b}
Your letter grade will be in the A range if T ≥ 85, at least in the B range if T ≥ 65, and at least in the C range if T ≥ 55.
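The grading rule above can be written directly in R; the scores passed in below are made-up inputs for illustration:

```r
# Total score per the formula T = min{100, 0.3a + 0.2 max(m, f) + 0.5f + b}
total_score <- function(a, m, f, b = 0) {
  min(100, 0.3 * a + 0.2 * max(m, f) + 0.5 * f + b)
}
total_score(90, 70, 80, 5)  # 0.3*90 + 0.2*80 + 0.5*80 + 5 = 88
```

Note that the mid-term enters only through max(m, f), so a final-exam score that exceeds the mid-term replaces it in that term.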
Syllabus
Introduction: [Microlearning]
history, philosophy, examples.
Prior distribution: [Microlearning]
subjective prior, non-informative prior, conjugate prior, maximum entropy prior.
Point estimation: [Microlearning]
decision theory, Bayes estimator, admissible estimator, minimax estimator.
Testing and region estimation: [Microlearning]
hypothesis test, Bayes factor & BIC, credible region.
Theoretical justification: [Microlearning]
de Finetti’s theorem, Bayesian consistency, Bernstein–von Mises theorem.
Posterior computation: [Microlearning]
Metropolis–Hastings algorithm, Gibbs sampler.
Bayesian nonparametrics: [Microlearning]
Dirichlet process, Bayesian bootstrap.
Bayesian regression: [Microlearning]
linear model, model averaging, discussion of LASSO and ridge.
Missing data problems: [Microlearning]
multiple imputation, causal inference.
Hierarchical and empirical Bayes: [Microlearning]
hierarchical Bayes models, empirical Bayes models, Stein effects.
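The posterior-computation topic above centres on the Metropolis–Hastings algorithm, which can be sketched in a few lines of R. The standard-normal target below is an arbitrary choice for illustration, not an example from the notes:

```r
# Random-walk Metropolis-Hastings targeting N(0, 1) (illustrative target).
set.seed(1)
log_target <- function(x) -x^2 / 2          # log density, up to a constant
n_iter <- 5000
x <- numeric(n_iter)
x[1] <- 0                                   # starting value
for (t in 2:n_iter) {
  prop <- x[t - 1] + rnorm(1, sd = 1)       # symmetric random-walk proposal
  log_alpha <- log_target(prop) - log_target(x[t - 1])
  x[t] <- if (log(runif(1)) < log_alpha) prop else x[t - 1]
}
mean(x)  # sample mean should be near 0; sd(x) near 1
```

Because the proposal is symmetric, the acceptance ratio reduces to a ratio of target densities, which is why the target only needs to be known up to a normalizing constant.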
Lecture Notes (Draft)
(All rights reserved by the authors. Redistribution in any form is strictly prohibited.)
Front matter
Contents
Instructions
Part 1: Basics of Bayesian Inference
Chapter 1: Introduction
Chapter 2: Prior distribution
Chapter 3: Point estimation
Chapter 4: Hypothesis testing
Chapter 5: Region estimation
Part 2: Theory and Computation
Chapter 6: Theoretical justification
Chapter 7: Posterior Computation (R-code) --- Major updates: In Examples 7.8 and 7.9, the acceptance probabilities are corrected.
Part 3: Applications
Chapter 8: Bayesian Regression (R-code)
Chapter 9: Missing data (R-code)
Chapter 10: Hierarchical and empirical Bayes
Appendices (Optional)
Appendix A: Basic Mathematics (Optional)
Appendix B: Basic probability and statistics (Optional) --- for students who want to review; read Chapters 1 and 2 in STAT 4003
Appendix C: Basic Programming in R (Optional) --- for students who want to review; read Lectures 2 and 3 in RMSC 1101
Appendix D: Why Isn't Everyone a Bayesian? --- Efron (1986) (Optional)
Are you a Bayesian or a frequentist?
Result (of our class) on Lecture 1 (5 Sep 2019)
Bayesian: 40.8% (20/49)
Frequentist: 59.2% (29/49)
Result (of our class) on Lecture 7 (17 Oct 2019)
Bayesian: 55.6% (10/18)
Frequentist: 44.4% (8/18)
Result (of our class) on 10 Jan 2020
Bayesian: 90.9% (10/11)
Frequentist: 9.1% (1/11)