Instructor: Yizhe Zhu, yizhezhu@usc.edu
Office Hours: Wednesday 12-2 pm at KAP 464B
TA: Rundong Ding, rundongd@usc.edu
TA Office Hours: Monday 9am-10am and Wednesday 5-7pm at Math Center (KAP 263)
Class schedule: MWF 11:00-11:50 am at KAP 156
Prerequisite: Math 505A or Math 407 or Math 408
Course Description: This is an introductory course in mathematical statistics for PhD students.
Topics: Parametric families of distributions, sufficiency. Estimation: methods of moments, maximum likelihood, unbiased estimation. Comparison of estimators, optimality, information inequality, asymptotic efficiency, EM algorithm.
Textbook:
George Casella and Roger L. Berger. Statistical Inference. Second Edition.
Optional: Robert Keener. Theoretical Statistics.
Exam dates:
Midterm 1: Wednesday, February 26, 11:00-11:50 am, KAP 156
Midterm 2: Wednesday, April 9, 11:00-11:50 am, KAP 156
Final: Wednesday, May 7, 11 am-1 pm, KAP 156
Homework: Assignments will be posted on Brightspace.
Course Schedule: Below is a tentative schedule, to be updated as the semester progresses.
Week 1
January 13: Parametric models, definition of exponential families
January 15: Examples, canonical form, curved exponential families
January 17: Derivatives of the log-partition function, cumulants, convexity
Week 2
January 20: No class (Martin Luther King Jr. Day)
January 22: Sufficient statistic
January 24: Factorization theorem
Week 3
January 27: Factorization theorem for exponential families, minimal sufficient statistic
January 29: Minimal sufficiency examples, ancillary statistic
January 31: Complete statistic, Basu's theorem
Week 4
February 3: Bahadur's theorem, method of moments
February 5: Maximum likelihood estimators, MLE for linear regression and logistic regression
February 7: Examples of MLEs, invariance property of MLEs
Week 5
February 10: Bayes estimators
February 12: Conjugate families for the exponential family
February 14: Dirichlet distribution, maximum a posteriori (MAP) estimation
Week 6
February 17: No class (Presidents' Day)
February 19: Mean squared error of an estimator, bias-variance tradeoff
February 21: Best unbiased estimators, UMVUE, Cramér–Rao bound
Week 7
February 24: Cramér–Rao bound for i.i.d. samples
February 26: Midterm 1
February 28: Cramér–Rao bound continued, Rao–Blackwell theorem
Week 8
March 3: Rao–Blackwell theorem continued
March 5: Lehmann–Scheffé theorem
March 7: Complete sufficient statistic and UMVUE
Week 9
March 10: Loss function, Bayes risk
March 12: Bayes estimators
March 14: Expectation–maximization algorithm
Week 10
March 17: No class (spring break)
March 19: No class
March 21: No class
Week 11
March 24: Gaussian mixture models, EM algorithm
March 26: EM algorithm analysis
March 28: Jackknife, bootstrap resampling
Week 12
March 31: Modes of convergence, consistency of estimators, law of large numbers
April 2: Convergence in distribution, central limit theorem
April 4: Berry–Esseen theorem, Hoeffding's inequality, the delta method
Week 13
April 7: First- and second-order delta methods, multivariate delta method
April 9: Midterm 2
April 11: Consistency
Week 14
April 14: Consistency of MLE, bias versus consistency, asymptotic efficiency
April 16: Asymptotic normality of MLE, asymptotic efficiency
April 18: Asymptotic normality of MLE, superefficiency
Week 15
April 21: Superefficiency, asymptotic relative efficiency
April 23: Asymptotic normality of median and quantiles
April 25: ARE of median and mean
Week 16
April 28: Robustness
April 30: M-estimators
May 2: Review