STAT 4010 - Bayesian Learning
2023 Spring
Class Information
Class time: Th 0930-1215
Location: MMW LT2
E-learning platform: http://www1.sta.cuhk.edu.hk/
Instructor
Name: Kin Wai (Keith) CHAN
Email: kinwaichan@cuhk.edu.hk
Office: LSB 115
Tel: 3943 7923
Office hour: Open-door policy
Teaching Assistants
Office: LSB G32
Tel: 3943 8535
Email: sudi@link.cuhk.edu.hk
Office: LSB 143
Tel: 3943 1747
Description
This course introduces basic Bayesian inference procedures and philosophy, emphasizing both conceptual foundations and implementation. It covers conjugate families of distributions, Bayesian credible regions, Jeffreys' prior, Markov chain Monte Carlo, Bayes factors, the Bayesian information criterion, imputation, Bayesian linear-regression models, model averaging, hierarchical models, and empirical Bayes models. Hands-on implementation of estimation and inference procedures in R will be demonstrated in interactive sessions.
Textbooks
A set of self-contained lecture notes is the main reference. Complementary textbooks include:
(Major) Hoff, P.D. (2009). A First Course in Bayesian Statistical Methods. Springer. (Free online access via CUHK library)
(Minor) Wasserman, L. (2004). All of Statistics: A Concise Course in Statistical Inference. Springer.
(Minor) Albert, J. (2007). Bayesian Computation with R. Springer.
Learning outcomes
Upon finishing the course, students are expected to
distinguish between frequentist and Bayesian methods, and identify the pros and cons of each;
derive posterior distribution from commonly-used prior and sampling distributions;
perform Markov chain Monte Carlo for Bayesian inference in R;
build simple Bayesian models for solving real problems; and
select appropriate Bayesian tools for different statistical tasks, e.g., estimation, testing, model selection, prediction, etc.
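As a taste of the second outcome (deriving a posterior from a common prior and sampling model), here is a minimal R sketch of Beta-Binomial conjugacy; the prior parameters and data below are made-up numbers for illustration only:

```r
# Beta-Binomial conjugacy: with a Beta(a, b) prior on a success
# probability and y successes in n Bernoulli trials, the posterior
# is Beta(a + y, b + n - y). (Hypothetical numbers for illustration.)
a <- 1; b <- 1                           # Beta(1, 1) prior, i.e., uniform on [0, 1]
y <- 7; n <- 10                          # observed 7 successes in 10 trials
post_a <- a + y                          # posterior shape parameters
post_b <- b + n - y
post_mean <- post_a / (post_a + post_b)  # posterior mean = 8/12
ci95 <- qbeta(c(0.025, 0.975), post_a, post_b)  # 95% equal-tailed credible interval
```

Closed-form updates like this are the starting point; MCMC (Part 2 of the notes) handles models where no such conjugate form exists.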
Assessment and Grading
There are three main assessment components, plus a bonus component.
a (out of 100) is the average score of approximately eight assignments with the lowest two scores dropped;
m (out of 100) is the score of mid-term assessment;
f (out of 100) is the score of final assessment; and
b (out of 2) is the bonus points, which will be given to students who actively participate in class.
The total score t (out of 100) is given by
t = min{100, 0.3a + 0.2 max(m, f) + 0.5f + b}
If min(t, f) < 30, the final letter grade will be handled on a case-by-case basis. Otherwise, your letter grade will be in the A range if t ≥ 85, at least in the B range if t ≥ 65, and at least in the C range if t ≥ 55.
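The total-score formula can be checked with a couple of lines of R; the scores plugged in below are made-up numbers for illustration only:

```r
# Total score per the formula above: t = min{100, 0.3a + 0.2 max(m, f) + 0.5f + b}
# (a = assignments, m = mid-term, f = final, b = bonus; example inputs are hypothetical)
total_score <- function(a, m, f, b = 0) {
  min(100, 0.3 * a + 0.2 * max(m, f) + 0.5 * f + b)
}
total_score(90, 70, 80, 1)  # 27 + 16 + 40 + 1 = 84
```

Note that a strong final can fully replace the mid-term in the 0.2 max(m, f) term, and the min with 100 caps the bonus.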
Important note: For the most up-to-date information, please always refer to the course outline announced by the course instructor on Blackboard, which shall prevail over the above information in case of any discrepancy.
Lecture Notes
(All rights reserved by the authors. Re-distribution by any means is strictly prohibited.)
Front matters
Instructions
Part 1: Basics of Bayesian Inference
Chapter 1: Introduction
Chapter 2: Prior distribution
Chapter 3: Point estimation
Chapter 4: Hypothesis testing
Chapter 5: Region estimation
Part 2: Theory and Computation
Chapter 6: Theoretic justification
Chapter 7: Posterior computation [R-code]
Part 3: Applications
Chapter 8: Bayesian regression [R-code]
Chapter 9: Missing data [R-code]
Chapter 10: Hierarchical and empirical Bayes [R-code]
Appendixes (Optional)
Appendix A: Basic Mathematics (Optional) --- for students who want to review; read Chapter A in STAT 4003
Appendix B: Basic probability and statistics (Optional) --- for students who want to review; read Chapters 1 and 2 in STAT 4003
Appendix C: Basic Programming in R (Optional) --- for students who want to review; read Lectures 2 and 3 in RMSC 1101
Appendix D: Why Isn't Everyone a Bayesian? --- Efron (1986) (Optional)
Special topics that may be covered: Bayesian nonparametrics, fiducial inference, convergence diagnostics for MCMC, ...