Randomness appears everywhere in daily life, from the weather to the stock market and the lottery. Probability theory is the branch of mathematics that analyzes random events and quantities. It is one of the major areas of pure mathematics and one of the most widely applied. Probability is of fundamental importance in the natural and social sciences, computer science, medicine, engineering, finance, and many other fields.
Topics to be covered include sample spaces and events, basic combinatorics, conditional probability and independence, discrete and continuous random variables and their distributions, expectations and variances, jointly distributed random variables, distributions and expectations of functions of random variables, conditional expectations, Chebyshev’s inequality, the weak and strong laws of large numbers, moment generating functions, the central limit theorem, and Poisson processes. The culmination of this course is the central limit theorem, which says that when the sample size is large enough and the population has finite variance, the sample mean is approximately normally distributed, regardless of the population’s distribution. It explains why so many quantities in practice look like the “bell curve.”
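As a quick preview of that result (not part of the posted notes), here is a minimal Python sketch that simulates the central limit theorem; the choice of an exponential population, the sample size, the number of replications, and the use of NumPy are illustrative assumptions, not course requirements.

```python
# Minimal CLT simulation: sample means from a skewed population look normal.
# Illustrative sketch only; parameters below are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

n_samples = 10_000   # number of independent samples drawn
n = 100              # size of each sample

# Population: Exponential(1), heavily skewed, with mean 1 and variance 1.
samples = rng.exponential(scale=1.0, size=(n_samples, n))

# Standardize each sample mean: (Xbar - mu) / (sigma / sqrt(n)).
z = (samples.mean(axis=1) - 1.0) / (1.0 / np.sqrt(n))

# If the CLT holds, roughly 95% of standardized means fall within +/- 1.96,
# and z should have mean near 0 and standard deviation near 1.
print("fraction within +/- 1.96:", np.mean(np.abs(z) < 1.96))
print("mean of z:", z.mean(), " std of z:", z.std())
```

Running this should print a fraction close to 0.95 and a mean and standard deviation close to 0 and 1, matching the standard normal prediction even though the underlying population is far from normal.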
Note 1 (Sec 1.1 - 1.4): Sample Space / Set Theory / Axiom of Probability
Note 2 (Sec 2.1 - 2.2): Equally Likely Outcomes / Combinatorics
Note 3 (Sec 2.3 - 2.4): Applications to Probability / The Binomial Formula
Note 4 (Sec 3.1 - 3.3): Conditional Probability / Bayes' Formula
Note 5 (Sec 3.4 - 3.5): Independence / Probability of "At Least 1"
Note 6 (Sec 4.1 - 4.3): Discrete Random Variables / Bernoulli Distribution
Note 7 (Sec 4.3 - 4.4): Geometric and Binomial Distributions / Sampling With and Without Replacement
Note 8 (Sec 4.5 - 4.6): Expectations of Discrete Random Variables
Note 9 (Sec 5.1 - 5.4): Distribution and Density Functions / Expectation / The Uniform Distribution
Note 10 (Sec 5.5 - 5.7): The Exponential Distribution / Functions of Continuous Random Variables
Note 12 (Sec 6.2 - 6.3): The Poisson Approximation to the Binomial Distribution / The Poisson Process
Note 13 (Sec 6.4 - 6.5): Connections among Poisson, Exponential, and Uniform / The Poisson Renewal Process
Note 14 (Sec 6.5 - 8.1): The Poisson Renewal Process (continued) / The Gamma Distribution / Joint Distributions of Discrete Random Variables
Note 15 (Sec 8.2 - 8.3): Independent Discrete Random Variables / Joint Distributions of Continuous Random Variables
Note 16 (Sec 8.4 - 8.5): Independent Continuous Random Variables / The Sum of Continuous Independent Random Variables
Note 17 (Sec 8.6 - 8.9): n Independent Random Variables / Conditional Densities
Note 18 (Sec 9.1): Expectations of Functions of Random Variables
Note 19 (Sec 9.2 - 9.3): Expectations of Sums and Independent Products / The Variance
Note 20 (Sec 9.4): The Covariance and Correlation Coefficient
Note 21 (Sec 10.1 - 10.2): The Normal Distribution / Properties of Normally Distributed Random Variables
Note 23 (Sec 10.6): Chebyshev's Inequality and the Law of Large Numbers
Note 24 (Sec 11.1 - 11.2): Moment Generating Functions / Proof of the Central Limit Theorem