Basics of probability theory. Measurable spaces. Probability spaces. Conditional probability. Random variables. Properties of the distribution function. Discrete, absolutely continuous, and singular components of distribution functions. Transformation of distributions and densities under a change of variables. Random vectors. Independence of events, σ-algebras, and random variables; connections among these notions. Integrals of random variables on probability spaces. Integrals of multivariate random variables. Independence, uncorrelatedness, and Gaussianity. Characteristic functions. Conditional distribution and expectation given an event. Conditional distributions and densities. Zero-one laws and laws of large numbers. The Central Limit Theorem.
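As a quick numerical illustration of the Central Limit Theorem listed above, the following sketch standardizes sample means of i.i.d. Uniform(0,1) draws; the sample sizes and seed are arbitrary choices, not part of the course material.

```python
import numpy as np

# Central Limit Theorem sketch: standardized means of i.i.d. Uniform(0,1)
# samples should be approximately standard normal for large n.
rng = np.random.default_rng(0)

n, reps = 1000, 20000                  # sample size and number of replications
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)   # mean and std of Uniform(0,1)

samples = rng.uniform(0.0, 1.0, size=(reps, n))
z = (samples.mean(axis=1) - mu) / (sigma / np.sqrt(n))  # standardized means

# Empirically, z should behave like N(0,1): mean near 0, variance near 1.
print(round(z.mean(), 2), round(z.var(), 2))
```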
Random samples drawn from a given distribution function. Statistics as the process of correlating data with parameters. The Bayesian vs. the deterministic approach.
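One standard way to extract a random sample from a given distribution function is the inverse-CDF (probability integral transform) method; the sketch below uses the Exponential distribution purely as an illustrative choice.

```python
import numpy as np

# Inverse-CDF sampling sketch: if U ~ Uniform(0,1) and F is a continuous CDF,
# then F^{-1}(U) is distributed according to F.
rng = np.random.default_rng(1)
lam = 2.0  # illustrative rate for the Exponential(lam) distribution

u = rng.uniform(size=100_000)
x = -np.log(1.0 - u) / lam   # F^{-1}(u) for F(x) = 1 - exp(-lam * x)

# The empirical mean should approximate the theoretical mean 1/lam = 0.5.
print(round(x.mean(), 2))
```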
The deterministic approach. Estimators as statistics: bias and variance. The tradeoff between bias and minimum variance. The Fisher Information Matrix and the Cramér-Rao inequality. The Kullback-Leibler pseudo-distance between density functions and its connections with the Fisher matrix. Parameter indistinguishability and identifiability: a characterization based on the Fisher matrix. Maximum Likelihood. Maximum Likelihood estimators for Gaussian random samples. The invariance principle for Maximum Likelihood estimators. Weighted least squares. Linear statistical models. Maximum Likelihood estimators for the linear statistical model with Gaussian noise. Maximum Likelihood estimators revisited in the geometric framework. Linear estimators with minimal error covariance for the linear statistical model with non-Gaussian noise. Statistical models with vector-valued data. Recursive WLS estimators.
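The recursive WLS estimator mentioned above can be sketched for the linear model y = Hθ + e with unit-variance noise; the regressors, true parameter, and seed below are made up for illustration, and the recursion should reproduce the batch least-squares solution.

```python
import numpy as np

# Recursive least-squares sketch for y = H theta + e (unit-variance noise assumed).
rng = np.random.default_rng(2)

theta_true = np.array([1.0, -2.0])           # illustrative true parameter
H = rng.normal(size=(200, 2))                # illustrative regressor matrix
y = H @ theta_true + 0.1 * rng.normal(size=200)

# Recursion: start from a large prior covariance and process one row at a time.
theta = np.zeros(2)
P = 1e6 * np.eye(2)
for row, yk in zip(H, y):
    h = row.reshape(-1, 1)
    K = P @ h / (1.0 + h.T @ P @ h)          # gain vector
    theta = theta + K.flatten() * (yk - row @ theta)
    P = P - K @ h.T @ P                      # error covariance update

# Compare against the batch least-squares estimate.
theta_batch = np.linalg.lstsq(H, y, rcond=None)[0]
print(np.allclose(theta, theta_batch, atol=1e-3))
```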
The Bayesian approach. Cost functions for the Bayesian estimation problem. Conditional expectation of Gaussian random vectors. Linear Minimum Variance Estimators. Geometric formulation of the linear Minimum Error Variance problem. Block diagonalization of the covariance matrix, Schur complements and the Matrix Inversion Lemma. Bayesian estimation for the linear model. Comparisons of Bayesian and Maximum Likelihood estimators. The recursive algorithm for the linear Minimum Error Variance estimator. The algorithm for the Kalman predictor. The algorithm for the Kalman filter. The Extended Kalman Filter.
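The Kalman filter's predict/correct structure can be sketched in the scalar case x_{k+1} = a·x_k + w_k, y_k = x_k + v_k; all constants and noise variances below are illustrative assumptions, not values from the course.

```python
import numpy as np

# Scalar Kalman filter sketch: alternate time update (predict) and
# measurement update (correct), propagating the error variance p.
rng = np.random.default_rng(3)

a, q, r = 0.95, 0.1, 0.5   # illustrative dynamics and noise variances
T = 300

# Simulate the state trajectory and noisy measurements.
x = np.zeros(T)
for k in range(1, T):
    x[k] = a * x[k - 1] + np.sqrt(q) * rng.normal()
y = x + np.sqrt(r) * rng.normal(size=T)

xhat, p = 0.0, 1.0
est = np.zeros(T)
for k in range(T):
    if k > 0:                        # time update (predict)
        xhat, p = a * xhat, a * a * p + q
    K = p / (p + r)                  # Kalman gain
    xhat = xhat + K * (y[k] - xhat)  # measurement update (correct)
    p = (1.0 - K) * p
    est[k] = xhat

# The filtered estimate should track the state better than raw measurements.
mse_filter = np.mean((est - x) ** 2)
mse_meas = np.mean((y - x) ** 2)
print(mse_filter < mse_meas)
```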