This page presents some of my notes on probability theory, along with some of its applications. For my other notes, please visit my main page. All of these notes were written while I was at Cornell University.

#### Probability and Random Processes for Electrical Engineering (Selected Topics)

- Introduction to Probability for Electrical Engineering (draft)

Set theory, Combinatorics, Delta function, Classical probability, Algebra and sigma-algebra, Kolmogorov axioms, Probability measure, Discrete and continuous random variables, Independence, Expectation and inequalities, Transform method (probability generating function, moment generating function, one-sided Laplace transform, characteristic function), Conditional probability and expectation, Convergence, Law of large numbers, Central limit theorem, Functions of random variables (transformations), Random vectors.

Last update: Jan 23, 2008.
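Two of the limit theorems listed above lend themselves to a quick numerical check. The following sketch (my own illustration, not taken from the notes) verifies the law of large numbers and the central limit theorem for i.i.d. Uniform(0, 1) samples; the sample sizes and trial counts are arbitrary choices.

```python
import math
import random
import statistics

random.seed(0)

# Law of large numbers: the sample mean of Uniform(0, 1) draws
# approaches the true mean 1/2 as the sample size grows.
n = 100_000
samples = [random.random() for _ in range(n)]
sample_mean = statistics.fmean(samples)
print(f"sample mean: {sample_mean:.4f} (true mean: 0.5)")

# Central limit theorem: the standardized sum
# (S_k - k*mu) / (sigma * sqrt(k)) is approximately N(0, 1),
# so about 95% of its realizations fall within +/- 1.96.
mu, sigma = 0.5, math.sqrt(1 / 12)  # mean and std of Uniform(0, 1)
m, k = 2000, 500                    # m trials, k summands per trial
inside = 0
for _ in range(m):
    s = sum(random.random() for _ in range(k))
    z = (s - k * mu) / (sigma * math.sqrt(k))
    inside += abs(z) <= 1.96
print(f"fraction within 1.96 std: {inside / m:.3f} (expected ~0.95)")
```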

An older version of this article was posted at scribd.com.

- Poisson Random Variables and Processes (Summary)

Poisson distribution, Convergence to the Poisson Law, Compound Poisson Distribution, Random Sum and Filtered Process, Exponential Distribution, Homogeneous Poisson Process (HPP), Non-homogeneous Poisson Process (NHPP), Poisson Limit for Superposition of Processes, Filtered Poisson Process (FPP), Poisson Approximation in Total Variation Distance and Relative Entropy, Poisson Process in General Space.

- Lower bound on Variance Estimation for Integer-valued Samples

Many experiments involve counting the number of occurrences of some event of interest, and the mean and variance of the resulting collection of integer-valued samples are then estimated. Of course, the estimated variance is non-negative. Interestingly, given a specific average, the estimated variance is not only non-negative: it is lower-bounded by a non-negative function of the average that is strictly positive almost everywhere.

- Point Processes (Summary)
- Poisson Limit of Superposed Processes with Application to Neuroscience
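The variance lower bound mentioned above can be illustrated numerically. In the sketch below (my own illustration; the exact form of the bound in the note may differ), theta denotes the fractional part of the sample mean, and the population variance of integer-valued samples is checked against the standard bound theta * (1 - theta), which is strictly positive except at integer means.

```python
import math
import random
import statistics

random.seed(1)

def frac_bound(mean: float) -> float:
    """Lower bound theta * (1 - theta), where theta is the fractional
    part of the mean; zero exactly when the mean is an integer."""
    theta = mean - math.floor(mean)
    return theta * (1 - theta)

# Check the bound on many random integer-valued samples.
for _ in range(1000):
    n = random.randint(2, 50)
    xs = [random.randint(0, 10) for _ in range(n)]
    mean = statistics.fmean(xs)
    pvar = statistics.pvariance(xs)  # population variance (ddof = 0)
    assert pvar >= frac_bound(mean) - 1e-12
print("population variance >= theta * (1 - theta) held in all trials")
```

Since the unbiased sample variance is n / (n - 1) times the population variance, the same bound holds for it as well.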

#### (Measure-theoretic) Probability Theory

#### Information-Theoretic Identities

The concepts of entropy and mutual information are often included in probability books. Entropy was the main topic of my first assignment in the graduate-level probability theory course given by Eugene B. Dynkin. Of course, relative entropy (the Kullback-Leibler distance) is also an important concept in probability; in fact, most of the standard probability distributions can be characterized as maximum-entropy distributions under appropriate moment constraints. Here, I collect several useful identities and inequalities involving information-theoretic quantities. The concept of the I-measure is also presented. My PhD research under Prof. T. Berger and joint work with Jun Chen also gave me an opportunity to study the concept of directed information.
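As a small example of the kind of identity collected in these notes (my own illustration, not taken from them): for any pmf p on an n-point alphabet, D(p || u) = log n - H(p), where u is the uniform pmf. Non-negativity of relative entropy then gives H(p) <= log n, i.e., the uniform distribution is the maximum-entropy distribution on a finite alphabet.

```python
import math
import random

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl(p, q):
    """Relative entropy D(p || q) in nats (assumes q > 0 wherever p > 0)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

random.seed(2)
n = 8
w = [random.random() for _ in range(n)]
p = [wi / sum(w) for wi in w]   # a random pmf on n points
u = [1 / n] * n                 # the uniform pmf

# Identity: D(p || u) = log n - H(p), hence H(p) <= log n.
lhs = kl(p, u)
rhs = math.log(n) - entropy(p)
assert math.isclose(lhs, rhs)
assert entropy(p) <= math.log(n)
print(f"D(p||u) = {lhs:.6f}, log n - H(p) = {rhs:.6f}")
```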

- Information-theoretic identities, part 1
- Directed Information and Channel with Feedback

Key words: discrete memoryless channel, discrete channel used without feedback, discrete memoryless channel used without feedback, Massey's directed information, Kramer's causal conditioning

#### Measure Theory

For those who want to learn more about measure theory, my notes can be found below.

- Sets
- Extended Real Number Systems
- Algebra (field), Sigma-Algebra, Borel sigma-algebra
- Dynkin's Pi-lambda theorem, Monotone class theorem
- Total Variation

to be continued...