Class time: Th 0930-1215
Location: ERB 407
Outline: 2025Spring_S4010_outline.pdf
Password: <See Blackboard>
Name: Kin Wai (Keith) CHAN
Email: kinwaichan@cuhk.edu.hk
Office: LSB 115
Tel: 3943 7923
Office hour: Open-door policy
Office: LSB G32
Tel: 3943 8535
Email: 1155125610@link.cuhk.edu.hk
Office: LSB 130
Tel: 3943 7939
This course introduces the basic Bayesian inference procedures and philosophy, emphasizing both conceptual foundations and implementation. It covers conjugate families of distributions, Bayesian credible regions, Jeffreys' prior, Markov chain Monte Carlo, Bayes factors, the Bayesian information criterion, imputation, Bayesian linear-regression models, model averaging, hierarchical models, and empirical Bayes models. Hands-on implementation of estimation and inference procedures in R will be demonstrated in interactive sessions.
There is no prerequisite course, but knowledge of probability, statistics, and programming at the level of STAT 2001, 2005, and 2006 is highly recommended.
Self-contained lecture notes are the main source of reference. Complementary textbooks include
(Major) Hoff, P.D. (2009). A First Course in Bayesian Statistical Methods. Springer. (Free online access via CUHK library)
(Minor) Wasserman, L. (2004). All of Statistics: A Concise Course in Statistical Inference. Springer.
(Minor) Albert, J. (2007). Bayesian Computation with R. Springer.
Upon finishing the course, students are expected to
distinguish between frequentist and Bayesian methods, and identify their pros and cons;
derive posterior distributions from commonly used prior and sampling distributions;
perform Markov chain Monte Carlo for Bayesian inference in R;
build simple Bayesian models for solving real problems; and
select appropriate Bayesian tools for different statistical tasks, e.g., estimation, testing, model selection, prediction, etc.
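As a taste of the second outcome above, the simplest conjugate update can be sketched in a few lines. The course demonstrations are in R; the sketch below uses Python for brevity, and the prior parameters and data are hypothetical, chosen only for illustration.

```python
# Conjugate Beta-Binomial update: prior Beta(a, b) on a success probability,
# data y successes in n trials. The posterior is Beta(a + y, b + n - y).
a, b = 1.0, 1.0   # uniform (flat) prior, a hypothetical choice
n, y = 20, 14     # hypothetical data: 14 successes in 20 trials

a_post, b_post = a + y, b + n - y        # conjugate posterior parameters
post_mean = a_post / (a_post + b_post)   # posterior mean of the success probability

print(a_post, b_post)        # 15.0 7.0
print(round(post_mean, 4))   # 0.6818
```

The posterior mean 15/22 ≈ 0.68 sits between the prior mean 0.5 and the sample proportion 0.7, a shrinkage effect discussed in the lectures.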
There are three main assessment components, plus a bonus component.
a (out of 100) is the average score of approximately eight assignments, with the lowest two scores dropped;
m (out of 100) is the score of the mid-term exam;
f (out of 100) is the score of the final exam; and
b (out of 2) is the bonus score, awarded to students who actively participate in class.
The total score t (out of 100) is given by
t = min{100, 0.3a + 0.2 max(m, f) + 0.5f + b}
If min(t, f ) < 30, the final letter grade will be handled on a case-by-case basis. Otherwise, your letter grade will be in the A range if t ≥ 85, at least in the B range if t ≥ 65, at least in the C range if t ≥ 55.
Important note: For the most updated information, please always refer to the course outline announced by the course instructor on Blackboard, which shall prevail over the above information if there is any discrepancy.
Front matters
Click (S4010/lecture) to download all lecture notes and codes (or click the individual links below). The lecture notes may be updated from time to time.
Part 1: Basics of Bayesian Inference
Part 2: Theory and Computation
Part 3: Applications
Appendices (Optional)
Appendix A: Basic Mathematics (Optional) --- for students who want to review; read Appendix A in STAT3005.
Appendix B: Basic probability (Optional) --- for students who want to review; read Appendix B in STAT3005.
Appendix C: Basic Statistics (Optional) --- for students who want to review; read Appendix C in STAT3005.
Appendix D: Basic programming in R (Optional) --- for students who want to review; read Lectures 2 and 3 in RMSC 1101.
Appendix E: Why Isn't Everyone a Bayesian? --- Efron (1986) (Optional)
Special topics that may be covered: Bayesian nonparametrics, fiducial inference, convergence diagnostics for MCMC, ...
Click (S4010/2025Spring/L) to download all in-class notes and recordings (if any).
Lecture 1 (9 Jan): Introduction to Bayesian philosophy, posterior calculation
Lecture 2 (16 Jan): Example of posterior calculation, why flat prior fails, noninformative prior
Lecture 3 (23 Jan): Proofs of invariant priors, conjugate priors, informative priors, weakly informative priors
Lecture 4 (6 Feb): Exponential family mixture conjugate prior, Bayesian estimators, decision theory (luck simulation)
Lecture 5 (13 Feb): Bayes estimator, weighted posterior mean, posterior quantile, admissibility, uniqueness of BE
Lecture 6 (20 Feb): Admissible estimators, minimax estimators, proof of Chapter 3
Lecture 7 (27 Feb): Proof of Theorem 3.7, Bayesian testing, Bayes factor
Lecture 8 (13 Mar): Three problems of Bayesian testing, region estimation, highest posterior density
Lecture 9 (20 Mar): Asymptotic theory for Bayesian analysis, classical and MC methods for posterior computation
Lecture 10 (27 Mar): MH algorithm and examples
Lecture 11 (3 Apr): Convergence diagnostics for MCMC, Gibbs, MH-within-Gibbs
Lecture 12 (10 Apr): Multiple linear regression, BF, BIC and AIC, spike-and-slab prior, LASSO
Lecture 13 (17 Apr): Missing-data mechanisms, full Bayesian approach, EM, MI, empirical Bayes
Remark: In-class notes and recordings (if any) will be uploaded within one week after the lecture.
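The Metropolis-Hastings algorithm covered in Lecture 10 can be sketched compactly. The course demonstrations are in R; the toy example below uses Python, and the target density, step size, and iteration count are hypothetical tuning choices for illustration only.

```python
import math
import random

random.seed(42)

def log_target(x):
    # Unnormalized log-density of the target; a standard normal as a toy example.
    return -0.5 * x * x

def metropolis_hastings(n_iter=5000, step=1.0, x0=0.0):
    # Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    # probability min(1, target(x') / target(x)). The symmetric proposal
    # makes the Hastings correction ratio equal to 1.
    x, chain = x0, []
    for _ in range(n_iter):
        prop = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_target(prop) - log_target(x):
            x = prop  # accept the proposal; otherwise keep the current state
        chain.append(x)
    return chain

chain = metropolis_hastings()
mean = sum(chain) / len(chain)  # should be near 0 for a long enough chain
```

Convergence diagnostics (Lecture 11), such as trace plots and multiple chains from dispersed starting points, are needed before trusting summaries like this sample mean.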
Click (S4010/2025Spring/A) to download all assignments.
Assignment 1: Concept of Bayesian model, posterior calculation, simple coding exercises --- Due: 4 Feb (Tue) @ 1800
Assignment 2: Different types of priors, other types of invariant priors --- Due: 15 Feb (Sat) @ 1800
Assignment 3: Bayes estimators, multinomial distribution with Dirichlet prior --- Due: 10 Mar (Mon) @ 1800
Assignment 4: Region estimators and testing, application in horse-racing --- Due: 31 Mar (Mon) @ 1800
Assignment 5: Theoretical comparison between Bayesian and frequentist methods --- Due: 7 Apr (Mon) @ 1800
Assignment 6: MCMC algorithms and convergence diagnostics --- Due: 18 Apr (Fri) @ 1800
Assignment 7: Gibbs sampler and missing data problem --- Due: 25 Apr (Fri) @ 1800
Click (S4010/2025Spring/Q) to download all quizzes.
Quiz 1: calculation of posterior
Quiz 2: calculation of prior
Quiz 3: Bayes estimator
Cumulative over the years (poll conducted in the first class):
* Frequentist: 33/81 = 41%
* Bayesian: 48/81 = 59%
Based on our class this year:
* Frequentist: 18/38 = 47%
* Bayesian: 20/38 = 53%
Date: 14 March (Friday)
Start Time: 6:30 pm
Duration: 2 hours
Location: LSB G25--G27 (Computer Lab)
Scope: Chapters 1--3
Instructions: Please read the first page of the real question paper. Some highlights are listed below:
Read the instructions carefully before doing the exam.
Complete the exam on your own.
Internet access and any form of generative AI are not allowed.
Formulas and hints (if any) are provided after the last question.
Only one A4-sized, double-sided, self-prepared cheat sheet is allowed.
Mock:
Mock 1: 2024 Spring M
Mock 2: 2023 Spring M
Mock 3: 2022 Spring M
Mock 4: 2021 Spring M
Date: 29 April (Tue)
Start Time: 9:30 am
Duration: 3 hours
Location: LSB G25--G27 (Computer Lab)
Scope: Chapters 1--10
Instructions: Please read the first page of the real question paper. Some highlights are listed below:
Read the instructions carefully before doing the exam.
Complete the exam on your own.
Internet access and any form of generative AI are not allowed during the exam.
Formulas and hints (if any) are provided after the last question.
Only two A4-sized, double-sided, self-prepared cheat sheets are allowed.
Mock:
Quick review exercise for each chapter. (Recording from 2021 Spring)
Mock 1: 2024 Spring F
Mock 2: 2023 Spring F
Mock 3: 2022 Spring F
Mock 4: 2021 Spring F
Mock 5: 2019 Fall F
Real exam structure overview: Please refer to the Blackboard announcement.