Prior knowledge constantly updated by empirical observations: this essence of Bayesian thinking provides a natural, intuitive, and, more importantly, mathematically sound and probabilistically principled way to characterize the process of learning. With some of its key ideas, formulated in Bayes' Theorem, dating back to the 18th century, Bayesian inference is one of the oldest schools of statistics (more than a century older than the Frequentist school!). Yet it was not until recent developments in sampling algorithms and computational power that Bayesian inference experienced a revival. Bayesian and Bayesian-based methods, with their flexibility in modeling the (generative) process of data, their interpretability through posterior probability statements, and their coherent principles for incorporating empirical evidence a priori, play key roles in modern data analysis, especially for "big data" with enhanced complexity and connectivity.
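In symbols, this updating is exactly Bayes' Theorem: for a parameter $\theta$ and observed data $y$,
$$p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{p(y)} \;\propto\; p(y \mid \theta)\, p(\theta),$$
where $p(\theta)$ is the prior, $p(y \mid \theta)$ the likelihood, and $p(\theta \mid y)$ the posterior.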
A First Course in Bayesian Statistical Methods, by Peter D. Hoff (2009).
Bayesian Data Analysis (3rd edition), by Andrew Gelman, John B. Carlin, Hal S. Stern, David B. Dunson, Aki Vehtari, and Donald B. Rubin (2013).
Statistical Rethinking: A Bayesian Course with Examples in R and Stan, by Richard McElreath (2016).
Click here for the syllabus.
Lectures 2-4: Beta-Binomial model: [slides1] [slides2] [slides3] (see the Beta-Binomial sketch after this list)
Lectures 5-: Gamma-Poisson model and Monte Carlo: [slides1] (see the Gamma-Poisson sketch after this list)
more upon request
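As a quick illustration of the Beta-Binomial material in Lectures 2-4, here is a minimal sketch of the conjugate update in Python; the Beta(1, 1) prior and the data below are hypothetical placeholders, not taken from the course slides:

```python
from scipy import stats

# Beta-Binomial conjugate update: with a Beta(a, b) prior on the
# success probability theta and y successes in n Bernoulli trials,
# the posterior is Beta(a + y, b + n - y) -- no sampling needed.
a, b = 1.0, 1.0   # hypothetical uniform Beta(1, 1) prior
n, y = 20, 14     # hypothetical data: 14 successes in 20 trials

posterior = stats.beta(a + y, b + n - y)
print(f"posterior mean: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
```

Because the Beta prior is conjugate to the Binomial likelihood, the posterior is available in closed form here; no simulation is required.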
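Similarly, a minimal sketch of the Gamma-Poisson update from Lecture 5, with posterior summaries approximated by Monte Carlo draws; the prior, the counts, and the probability query below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gamma-Poisson conjugate update: with a Gamma(a, b) prior (shape a,
# rate b) on the Poisson rate lambda, observing counts y_1, ..., y_n
# gives the posterior Gamma(a + sum(y), b + n).
a, b = 2.0, 1.0                # hypothetical Gamma(2, 1) prior
y = np.array([3, 5, 1, 4, 2])  # hypothetical Poisson counts
a_post, b_post = a + y.sum(), b + len(y)

# Monte Carlo: approximate posterior summaries by sampling.
# NumPy's gamma sampler uses scale = 1 / rate.
draws = rng.gamma(shape=a_post, scale=1.0 / b_post, size=100_000)
print(f"posterior mean of lambda ~ {draws.mean():.3f}")
print(f"P(lambda > 3 | data) ~ {(draws > 3).mean():.3f}")
```

Here the posterior is again available in closed form, but the Monte Carlo step shows how quantities such as tail probabilities can be estimated directly from posterior draws, which is the approach that carries over to models without conjugate structure.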