IE 3094 (Pitt). Markov Decision Processes (Spring 2023, Fall 2025, Graduate)
Probability Basics: Review of Random variables, Conditional probability and expectation, and Markov Processes.
Finite Horizon MDP: Dynamic programming backward induction and its optimality, with applications to inventory control, linear quadratic control, optimal stopping, and shortest path problems.
Finite Horizon Partially Observable MDP: Converting POMDP to MDP of Information State, Sufficient Statistics, Evolution of conditional distribution, DP equation as a function of sufficient statistics, Application to Sequential Hypothesis Testing.
Infinite Horizon Discounted Cost MDP: Bellman equation as a fixed-point equation, uniqueness of its solution, value iteration, and necessary and sufficient conditions for optimality. Applications to multi-armed bandits and the Gittins index.
Infinite Horizon Unbounded and Undiscounted Cost MDP: Bellman equation and value iteration with applications to inventory control, sequential analysis (change detection and sequential hypothesis testing), and linear quadratic control.
Introduction to Reinforcement Learning: Stochastic approximation and Q-learning.
Text: Lecture Notes; Dynamic Programming and Optimal Control, Vols. I and II (Bertsekas); Partially Observed Markov Decision Processes (Vikram Krishnamurthy); Neuro-Dynamic Programming (Bertsekas and Tsitsiklis).
IE 2084 (Pitt). Stochastic Processes (Fall 2023, 2024, Graduate)
Probability Basics: Probability axioms and properties, Random variables, Conditional probability and expectation, Optimal prediction and classification, Markov's, Chebyshev's, and Chernoff's inequalities.
IID Processes: Notions of convergence of random variables, Borel-Cantelli Lemma, Rapid convergence, Weak and strong laws of large numbers, Characteristic function, Central limit theorem.
Martingales: Basic properties of martingales, discrete stochastic integral, stopping times, stopped martingales, optional sampling theorem, submartingale inequality, upcrossing inequality, martingale convergence theorem, likelihood ratio process.
Markov Processes: Basic properties, transition probability, stationary distribution, recurrent and transient states, positive and null recurrent, communicating states, aperiodicity, DTMC convergence.
Text: Lecture Notes, Probability with Martingales (David Williams), Probability-1 and Probability-2 (Shiryaev).
IE 1082 (Pitt). Probabilistic Methods in Operations Research (Spring 2026, Undergraduate)
Probability Review: Probability axioms and properties, random variables, independence, conditional probability, law of large numbers, and central limit theorem. Jupyter Notebook to illustrate LLN and CLT.
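A sketch of what such a notebook cell might contain (the Uniform(0, 1) choice, sample sizes, and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Law of large numbers: the sample mean of iid Uniform(0, 1) draws
# approaches the true mean 0.5 as the sample size grows.
for n in (10, 1_000, 100_000):
    print(n, rng.uniform(size=n).mean())

# Central limit theorem: standardized sums of iid draws are
# approximately N(0, 1); note Var(Uniform(0, 1)) = 1/12.
n, trials = 1_000, 5_000
sums = rng.uniform(size=(trials, n)).sum(axis=1)
z = (sums - n * 0.5) / np.sqrt(n / 12.0)
print(np.mean(np.abs(z) < 1.96))  # close to 0.95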
Discrete-time Markov chains: DTMC transient and steady state analysis. Limiting pmf. Jupyter Notebook to simulate DTMC and calculate transient and steady state pmfs.
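A sketch of the DTMC notebook's computations, on a hypothetical three-state chain (the transition matrix is made up for illustration):

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Transient pmf after n steps: repeatedly left-multiply by P.
pmf = np.array([1.0, 0.0, 0.0])
for _ in range(50):
    pmf = pmf @ P

# Steady-state pmf: normalized left eigenvector for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Simulation: empirical visit frequencies also approach pi.
rng = np.random.default_rng(1)
state, counts = 0, np.zeros(3)
for _ in range(100_000):
    state = rng.choice(3, p=P[state])
    counts[state] += 1
print(pmf, pi, counts / counts.sum())
```

For this irreducible aperiodic chain the transient pmf, the eigenvector computation, and the simulated visit frequencies all agree, which is the convergence story the unit tells.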
Exponential Density and Poisson Processes: Minimum of exponentials, superposition of Poisson processes.
Continuous-Time Markov chains: CTMC characterization, transient and steady state analysis. Applications to various birth and death processes, including queueing models. Jupyter Notebook to simulate a CTMC and calculate transient and steady-state probabilities.
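A sketch of such a CTMC notebook for one birth-death example, a finite-capacity single-server queue (the arrival and service rates and the capacity are made up):

```python
import numpy as np

# Hypothetical M/M/1/4 queue: arrivals at rate lam, services at rate mu.
lam, mu, N = 1.0, 1.5, 4

# Generator matrix Q of the birth-death process (rows sum to 0).
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i < N:
        Q[i, i + 1] = lam
    if i > 0:
        Q[i, i - 1] = mu
    Q[i, i] = -Q[i].sum()

# Steady state: solve pi Q = 0 together with pi summing to 1.
A = np.vstack([Q.T, np.ones(N + 1)])
b = np.zeros(N + 2)
b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

# Simulation: exponential holding time in each state, then a jump
# chosen proportionally to the off-diagonal rates.
rng = np.random.default_rng(0)
state, time_in = 0, np.zeros(N + 1)
for _ in range(100_000):
    rate = -Q[state, state]
    time_in[state] += rng.exponential(1.0 / rate)
    state = rng.choice(N + 1, p=np.clip(Q[state], 0.0, None) / rate)
print(pi, time_in / time_in.sum())
```

The long-run fraction of time the simulation spends in each state matches the steady-state probabilities from the generator, tying the simulation to the analysis.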
Renewal Theory: Renewal theorem and renewal reward theorem.
Markov Decision Processes: Dynamic programming algorithm with applications to LQG and Inventory control. Jupyter Notebook to execute the dynamic programming iterations for finite and infinite horizons for inventory control.
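A finite-horizon backward-induction sketch for inventory control of the kind such a notebook would run (all capacities, costs, and the demand pmf below are made up for illustration):

```python
import numpy as np

# Hypothetical inventory problem: stock levels 0..M, per-unit order
# cost c, holding cost h, lost-sale penalty p, iid demand on {0, 1, 2}.
M, T = 5, 10
c, h, p = 1.0, 0.5, 4.0
demand, dpmf = np.array([0, 1, 2]), np.array([0.3, 0.4, 0.3])

V = np.zeros(M + 1)                # terminal cost is zero
for t in range(T):                 # backward induction over the horizon
    V_new = np.full(M + 1, np.inf)
    policy = np.zeros(M + 1, dtype=int)
    for x in range(M + 1):
        for u in range(M - x + 1):           # order within capacity
            y = x + u
            nxt = np.maximum(y - demand, 0)  # post-demand stock
            short = np.maximum(demand - y, 0)  # unmet demand
            cost = c * u + dpmf @ (h * nxt + p * short + V[nxt])
            if cost < V_new[x]:
                V_new[x], policy[x] = cost, u
    V = V_new
print(V)       # expected cost-to-go from each starting stock level
print(policy)  # first-stage order quantities
```

Switching the loop to run until successive value functions stop changing gives the infinite-horizon variant mentioned in the unit.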
Queueing models: Queueing nomenclature, PASTA, Little's law.
Text: Lecture Notes based on the book Introduction to Modeling and Analysis of Stochastic Systems by V. G. Kulkarni.
IE 1070 (Pitt). Probability, Random Variables, Distributions (Fall 2022, 2023, 2024, 2025, Undergraduate)
Probability Foundations: Set theory, Probability axioms and properties, Probability and combinatorics, Conditional probability and independence.
Random Variables: Random variable definition, Probability mass function (PMF), Probability density function (PDF), Cumulative distribution function (CDF), Expected value, Variance, Moments.
Important PMFs and PDFs: Important PMFs (Bernoulli, Binomial, Multinomial, Geometric, Negative Binomial, Poisson). Important PDFs (Uniform, Exponential, Gamma, Weibull, Normal/Gaussian).
Multiple Random Variables: Joint and conditional PMF/PDF, Conditional expectation, Independent random variables.
Central Limit Theorem: Properties of expectation, Law of large numbers, Central limit theorem.
Text: Lecture Notes
EE 5263 (UTSA). Machine Learning (Fall 2018, 2019, 2020, 2021, Graduate)
Mathematics of machine learning: vector space and linear algebra, convex optimization, gradient and stochastic gradient methods, probability theory, and basic statistical theory (hypothesis testing, maximum likelihood estimation).
Machine learning basics: concepts of training, testing, and validation, supervised learning (regression and classification), unsupervised learning, dimensionality reduction, and reinforcement learning.
Advanced topics (time-permitting): Advanced statistical inference methods and theory, high-dimensional statistics, machine learning theory.
Text: Lecture Notes and http://web.stanford.edu/~hastie/ElemStatLearn/
EE 5263 (UTSA). Statistical Inference (Fall 2020, Graduate)
1. Probability Theory Basics:
1. Probability triple, random variables, expected value, marginal and joint distribution, independence.
2. Calculations for standard random variables (Gaussian, Poisson, Binomial, Exponential, Geometric, Uniform).
3. Conditional distribution and conditional expectation.
4. Markov, Chebyshev, and Chernoff inequalities.
5. Weak law of large numbers and Central limit theorem.
6. Introduction to Weak Convergence.
2. Prediction Theory Basics:
1. Optimal minimum mean squared error predictor.
2. Optimal misclassification error predictor.
3. Connections to machine learning, Bayesian inference and stochastic filtering.
4. Linear Discriminant Analysis.
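The central fact of this unit, that the conditional mean minimizes mean squared error, can be illustrated numerically; the model below (Y equal to X squared plus Gaussian noise) and all its numbers are illustrative:

```python
import numpy as np

# Illustrative model: Y = X^2 + noise, X ~ Uniform(-1, 1).
# The MMSE predictor of Y given X is E[Y | X] = X^2; any other
# function of X, e.g. the best linear one, incurs a larger MSE.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200_000)
y = x ** 2 + rng.normal(scale=0.1, size=x.size)

mse_cond = np.mean((y - x ** 2) ** 2)   # conditional-mean predictor
a, b = np.polyfit(x, y, 1)              # best linear predictor
mse_lin = np.mean((y - (a * x + b)) ** 2)
print(mse_cond, mse_lin)  # conditional mean wins
```

The gap is large here because X squared is uncorrelated with X on a symmetric interval, so the best linear predictor is essentially a constant.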
3. M-Estimation, Z-estimation, and Maximum Likelihood Estimation
1. Proofs of Consistency.
2. Proofs of asymptotic normality.
3. Connections to supervised machine learning.
4. Existence of MLE for exponential family of distributions.
4. Unbiased Estimation
1. Cramer-Rao lower bound (CRLB) and generalized CRLB.
2. Exponential family and achievability of CRLB.
3. Asymptotic efficiency of MLE.
5. Mixture Models and EM Algorithm
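A compact EM sketch for a two-component one-dimensional Gaussian mixture; the data-generating parameters (means -2 and 3, unit variances, equal weights) and the initialization are made up:

```python
import numpy as np

# Simulate data from a two-component Gaussian mixture.
rng = np.random.default_rng(0)
n = 4_000
z = rng.random(n) < 0.5
x = np.where(z, rng.normal(-2, 1, n), rng.normal(3, 1, n))

# EM iterations from a rough initialization.
w, mu, var = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: responsibility of component 0 for each point
    # (the shared 1/sqrt(2*pi) constant cancels in the ratio).
    p0 = w * np.exp(-(x - mu[0]) ** 2 / (2 * var[0])) / np.sqrt(var[0])
    p1 = (1 - w) * np.exp(-(x - mu[1]) ** 2 / (2 * var[1])) / np.sqrt(var[1])
    r = p0 / (p0 + p1)
    # M-step: reestimate weight, means, variances from responsibilities.
    w = r.mean()
    mu = np.array([(r * x).sum() / r.sum(),
                   ((1 - r) * x).sum() / (1 - r).sum()])
    var = np.array([(r * (x - mu[0]) ** 2).sum() / r.sum(),
                    ((1 - r) * (x - mu[1]) ** 2).sum() / (1 - r).sum()])
print(w, mu, var)  # close to 0.5, (-2, 3), (1, 1)
```

Each iteration is a closed-form maximization of the expected complete-data log-likelihood, which is why no numerical optimizer is needed.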
6. Decision Theory: Bayes and minimax rules and connections between them.
7. Minimax Function Estimation:
1. Lower bound on minimax estimation using hypothesis testing.
2. Linear nonparametric estimators for density estimation.
3. Linear nonparametric estimators for regression function estimation.
4. Minimax rate optimality of linear nonparametric estimators.
Text: Lecture Notes, Mathematical Statistics (Bickel and Doksum), Asymptotic Statistics (Aad van der Vaart), Statistical Inference for Engineers and Data Scientists (Moulin and Veeravalli), Introduction to Nonparametric Estimation (Tsybakov).
EE 3533 (UTSA). Probability and Stochastic Processes (Spring 2019, 2020, 2021, 2022, Fall 2021, Undergraduate)
Probability Foundations: Introduction to set theory, sample space, and probability axioms.
Conditional Probability: Bayes theorem, total probability theorem, and independence.
Discrete Random Variables: Probability mass function (pmf), expectation, functions of a random variable, conditional pmf, independent random variables.
Continuous Random Variables: Cumulative distribution function, probability density function, convolution.
Three Important Theorems: Law of large numbers, central limit theorem, and Markov inequality.
Text: Lecture Notes and ECE 313 Notes by Prof. Bruce Hajek (UIUC).