Week 1 (Jan 6): Introduction to Bayes' theorem, the importance of the prior, coin flipping, and "what is the probability of this datum?". Drawn mainly from Ch. 1 and Ch. 2 (section 2.1) in Sivia, and a little from Ch. 1 in the Gregory book.
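For a first taste of the material, here is a minimal sketch (not from the readings; the grid resolution, priors, and toss counts are illustrative choices) of the coin-flipping example: a grid-based posterior for the bias-weighting H of a coin under two different priors.

```python
import numpy as np

# Grid over the coin's bias-weighting H = P(heads).
H = np.linspace(0.0, 1.0, 501)

# Two illustrative priors: flat (uniform) and one peaked near a fair coin.
prior_flat = np.ones_like(H)
prior_fair = np.exp(-0.5 * ((H - 0.5) / 0.05) ** 2)

# Data: n tosses, r heads. Likelihood is binomial: H^r (1-H)^(n-r).
n, r = 12, 9
likelihood = H**r * (1.0 - H) ** (n - r)

for name, prior in [("flat prior", prior_flat), ("fair-coin prior", prior_fair)]:
    posterior = likelihood * prior
    posterior /= posterior.sum() * (H[1] - H[0])  # normalize on the grid
    print(f"{name}: posterior peaks at H = {H[np.argmax(posterior)]:.3f}")
```

With the same 9-heads-in-12 data, the flat prior lets the posterior follow the likelihood, while the fair-coin prior pulls the peak back toward H = 0.5, which is exactly the role of the prior discussed this week.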
Week 2 (Jan 13): More on the Gaussian approximation and its limitations (Sivia p. 23-26), the lighthouse problem and variable transformations (Sivia p. 29-34, 68-69).
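As an illustration of the lighthouse problem, the sketch below (a toy implementation; the true position, offshore distance, and grid are invented for the example) simulates flash arrival positions on the shore. The uniform-azimuth variable transformation turns these into Cauchy-distributed data, and a grid posterior recovers the alongshore position alpha with beta assumed known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Lighthouse at alongshore position alpha_true, distance beta offshore.
alpha_true, beta = 1.0, 2.0

# Flashes are emitted at uniform random azimuths, so the arrival positions
# x on the shore follow a Cauchy distribution: x = alpha + beta * tan(theta).
theta = rng.uniform(-np.pi / 2, np.pi / 2, size=200)
x = alpha_true + beta * np.tan(theta)

# Grid posterior for alpha (beta known, flat prior on alpha).
alpha = np.linspace(-5, 5, 1001)
log_like = np.sum(
    np.log(beta / (np.pi * (beta**2 + (x[:, None] - alpha) ** 2))), axis=0
)
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum() * (alpha[1] - alpha[0])

print(f"posterior peak: alpha = {alpha[np.argmax(posterior)]:.3f} (true: {alpha_true})")
print(f"sample mean of x (misleading for Cauchy data): {x.mean():.3f}")
```

The second print line shows why this example matters: the sample mean is a poor estimator for Cauchy-distributed data, while the Bayesian posterior peaks close to the true position.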
Week 3 (Jan 20): Common probability density functions (PDFs): binomial (Sivia p. 103-108), Poisson (Sivia p. 121, 35-39), and Gaussian (normal) distributions. The Central Limit Theorem (Sivia p. 121-124, 61-63).
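A quick numerical demonstration of the Central Limit Theorem (sample sizes and counts here are arbitrary): averages of draws from a decidedly non-Gaussian uniform distribution rapidly develop the Gaussian width the theorem predicts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Average n draws from a uniform distribution and compare the scatter of the
# mean with the CLT prediction: std = sqrt(1/12) / sqrt(n).
for n in (1, 2, 10, 50):
    means = rng.uniform(0, 1, size=(100_000, n)).mean(axis=1)
    print(f"n={n:3d}: sample std = {means.std():.4f}, "
          f"CLT prediction = {np.sqrt(1 / 12 / n):.4f}")
```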
Week 4 (Jan 27): Maximum Likelihood ("Optimal") Estimation, covariance matrices, and ellipsoidal confidence contours (Sivia p. 43-49, 65-67). We will also discuss how to estimate measurement errors from the data themselves and how to account for "bad" data (large outliers) in a Bayesian framework. See Sivia sections 3.3 and 8.2 for more on these two topics.
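The sketch below illustrates the week's core ideas on a toy straight-line fit (slope, intercept, and noise level are made up): for a Gaussian likelihood, maximum likelihood reduces to least squares, and the parameter covariance matrix, whose off-diagonal term tilts the ellipsoidal confidence contours, follows from the design matrix.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated straight-line data y = m*x + c with Gaussian noise of known sigma.
m_true, c_true, sigma = 2.0, 1.0, 0.5
x = np.linspace(0, 10, 30)
y = m_true * x + c_true + rng.normal(0, sigma, x.size)

# For a Gaussian likelihood the ML solution is least squares: minimize chi^2.
A = np.column_stack([x, np.ones_like(x)])      # design matrix
theta, *_ = np.linalg.lstsq(A, y, rcond=None)  # best-fit (m, c)

# Covariance matrix = inverse of the Hessian of chi^2 / 2; its off-diagonal
# element encodes the (anti)correlation between slope and intercept.
cov = np.linalg.inv(A.T @ A / sigma**2)

print(f"m = {theta[0]:.3f} +/- {np.sqrt(cov[0, 0]):.3f}")
print(f"c = {theta[1]:.3f} +/- {np.sqrt(cov[1, 1]):.3f}")
print(f"correlation(m, c) = {cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1]):.3f}")
```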
Week 5 (Feb 3): The Markov chain Monte Carlo (MCMC) method. For supplementary materials see the attachment below (nr_mcmc.pdf), an excerpt from the book Numerical Recipes: The Art of Scientific Computing, 3rd Edition (Press et al. 2007). It provides a concise, mathematically motivated explanation of how the MCMC method works and includes a worked example.
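Here is a minimal sketch of the Metropolis algorithm, the simplest MCMC variant, sampling a toy 2-D correlated Gaussian posterior (the target, step size, and chain length are illustrative choices, not drawn from the readings).

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy target: a 2-D correlated Gaussian log-posterior.
COV_INV = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))

def log_post(theta):
    return -0.5 * theta @ COV_INV @ theta

# Metropolis algorithm: propose a symmetric random step, accept with
# probability min(1, posterior ratio).
n_steps, step = 50_000, 0.7
chain = np.empty((n_steps, 2))
theta = np.zeros(2)
lp = log_post(theta)
accepted = 0
for i in range(n_steps):
    prop = theta + step * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
        accepted += 1
    chain[i] = theta

burn = chain[n_steps // 4:]  # discard the first quarter as burn-in
print(f"acceptance rate: {accepted / n_steps:.2f}")
print(f"posterior mean: {burn.mean(axis=0)} (expected ~[0, 0])")
print(f"sample correlation: {np.corrcoef(burn.T)[0, 1]:.2f} (expected ~0.8)")
```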
Week 6 (Feb 10): More on MCMC (see week 5 for associated reading).
Week 7 (Feb 17): Nested Sampling and model selection. For reading resources see Sivia Ch. 4 and sections 9.1 through 9.3. (Optional: sections 9.4 through 9.6 for more advanced applications of nested sampling.)
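To make the algorithm concrete, here is a bare-bones nested sampling sketch for a toy 1-D evidence calculation (the prior range, likelihood, and live-point count are invented, and the brute-force constrained sampling and the neglect of the final live points are simplifications real codes avoid).

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy problem: uniform prior on theta in [-5, 5], unit Gaussian likelihood.
def log_like(theta):
    return -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)

n_live, n_iter = 200, 1200
live = rng.uniform(-5, 5, n_live)   # live points drawn from the prior
live_logL = log_like(live)

log_Z = -np.inf                     # running evidence estimate
log_X = 0.0                         # log of the enclosed prior mass
for i in range(n_iter):
    worst = np.argmin(live_logL)
    logL_min = live_logL[worst]
    # Prior mass shrinks by ~1/n_live per iteration: X_i = exp(-i/n_live).
    log_X_new = -(i + 1) / n_live
    log_w = np.log(np.exp(log_X) - np.exp(log_X_new))  # shell width
    log_Z = np.logaddexp(log_Z, logL_min + log_w)
    log_X = log_X_new
    # Replace the worst point with a prior draw subject to L > L_min
    # (brute-force rejection; real codes use smarter constrained sampling).
    while True:
        new = rng.uniform(-5, 5)
        if log_like(new) > logL_min:
            break
    live[worst], live_logL[worst] = new, log_like(new)

# Analytic evidence: ~1 (Gaussian integral) times the prior density 1/10.
print(f"nested sampling log Z = {log_Z:.3f}, analytic ~ {np.log(1.0 / 10):.3f}")
```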
Week 8 (Feb 24): Discussion of advanced MCMC techniques, including ensemble samplers, parallel tempering, and rejection sampling.
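Of the techniques listed, rejection sampling is the simplest to demonstrate; the sketch below draws from a toy bimodal target using a Gaussian envelope (the target, proposal, and bound M are made up for the example).

```python
import numpy as np

rng = np.random.default_rng(5)

# Rejection sampling: draw from a target density p(x) via an envelope
# M*q(x) >= p(x). Target: a bimodal Gaussian mixture (unnormalized).
def p(x):
    return np.exp(-0.5 * (x - 2) ** 2) + 0.7 * np.exp(-0.5 * (x + 2) ** 2)

def q(x):   # proposal density: N(0, 3^2)
    return np.exp(-0.5 * (x / 3) ** 2) / (3 * np.sqrt(2 * np.pi))

M = 10.5    # safely above sup p(x)/q(x), which is ~9.7 for these choices

x = rng.normal(0, 3, size=200_000)   # draws from the proposal
keep = rng.uniform(size=x.size) < p(x) / (M * q(x))
samples = x[keep]

print(f"acceptance rate: {keep.mean():.3f}")
print(f"sample mean: {samples.mean():.3f}")  # mixture mean (2 - 0.7*2)/1.7 ~ 0.35
```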
Week 9 (Mar 2): Gaussian processes.
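As a preview, here is a compact Gaussian-process regression sketch using the standard GP predictive equations, mean = Ks^T (K + sn^2 I)^-1 y and var = Kss - Ks^T (K + sn^2 I)^-1 Ks (see, e.g., Rasmussen & Williams, Ch. 2); the kernel hyperparameters, noise level, and test function are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(6)

def rbf_kernel(a, b, amp=1.0, scale=1.0):
    """Squared-exponential (RBF) covariance function."""
    return amp**2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / scale**2)

# Noisy training data from a smooth function.
x = np.sort(rng.uniform(0, 10, 15))
y = np.sin(x) + rng.normal(0, 0.1, x.size)
x_star = np.linspace(0, 10, 200)     # prediction grid

sigma_n = 0.1                        # assumed known noise level
K = rbf_kernel(x, x) + sigma_n**2 * np.eye(x.size)
K_s = rbf_kernel(x, x_star)
K_ss = rbf_kernel(x_star, x_star)

L = np.linalg.cholesky(K)            # numerically stable solve via Cholesky
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = K_s.T @ alpha                 # predictive mean
v = np.linalg.solve(L, K_s)
var = np.diag(K_ss) - np.sum(v**2, axis=0)  # predictive variance

i = np.argmin(np.abs(x_star - 5.0))
print(f"GP prediction at x=5: {mean[i]:.3f} +/- {np.sqrt(var[i]):.3f} "
      f"(true sin(5) = {np.sin(5):.3f})")
```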