Advanced Probability, Linear Algebra and Optimization Techniques (FIC 504)
August-December 2025
10:00-11:00 (Monday) and 11:00-13:00 (Friday)
The objective of this course is to introduce the mathematical tools necessary for developing new algorithms in cybersecurity and machine learning.
Textbooks:
Marc Peter Deisenroth, A. Aldo Faisal, Cheng Soon Ong, Mathematics for Machine Learning, Cambridge University Press, 2020.
Sheldon Ross, A First Course in Probability, 8th Edition, Pearson, 2006.
Kenneth M. Hoffman, Ray Kunze, Linear Algebra, 2nd Edition, Pearson.
Reference Books:
J. Medhi, Stochastic Processes, 3rd Edition, New Age International, 2009.
S. M. Ross, Stochastic Processes, 2nd Edition, Wiley, 1996.
Stephen H. Friedberg, Arnold J. Insel, Lawrence E. Spence, Linear Algebra, 4th Edition, Pearson, 2006.
Topics covered:
Lecture 1: Began with some standard set-theory notation. Introduced the basic notions of probability theory (for example, random experiments, sample spaces, and events). Studied and explained the basic axioms of probability theory.
Suggested reading: Sheldon Ross, sections 2.1, 2.2, 2.3, 2.4, 2.5.
Lecture note: 1-basic notions of probability theory.pdf
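A minimal Python sketch (not from the lecture notes; the fair-die sample space is an assumed example) checking the probability axioms on a finite sample space with equally likely outcomes:

    from fractions import Fraction

    sample_space = {1, 2, 3, 4, 5, 6}        # one roll of a fair die

    def P(event):
        """Probability of an event (a subset of the sample space), equally likely outcomes."""
        return Fraction(len(event & sample_space), len(sample_space))

    A = {2, 4, 6}    # "even outcome"
    B = {5, 6}       # "outcome at least 5"

    assert P(sample_space) == 1                  # axiom: P(S) = 1
    assert P(A) >= 0 and P(B) >= 0               # axiom: non-negativity
    assert P(A | B) == P(A) + P(B) - P(A & B)    # inclusion-exclusion, derived from the axioms
    print(P(A), P(B), P(A | B))                  # 1/2 1/3 2/3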
Lecture 2: Studied one of the most fundamental concepts in probability theory: conditional probability. Given that an event B of a random experiment is known to have occurred, how should we update the prior probability P(A) of an event A to the conditional probability P(A|B)?
Suggested reading: Sheldon Ross, sections 3.1, 3.2, 3.3.
Lecture note: 2-conditional probability.pdf
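A minimal Python sketch (illustrative only; the two-dice experiment is an assumed example) computing P(A|B) = P(A and B) / P(B) by counting equally likely outcomes:

    from fractions import Fraction
    from itertools import product

    sample_space = set(product(range(1, 7), repeat=2))   # 36 equally likely outcomes

    def P(event):
        return Fraction(len(event), len(sample_space))

    A = {w for w in sample_space if sum(w) == 8}          # "sum is 8"
    B = {w for w in sample_space if w[0] == 3}            # "first die shows 3"

    P_A_given_B = P(A & B) / P(B)
    print(P(A), P_A_given_B)    # prior 5/36, conditional 1/6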
Lecture 3: Independent events, total probability laws, and Bayes' Theorem.
Suggested reading: Sheldon Ross, sections 3.3, 3.4, 3.5.
Lecture note: 3-independent events.pdf
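A minimal Python sketch of the total probability law and Bayes' theorem; the diagnostic-test numbers below are hypothetical, chosen only to illustrate the formulas:

    p_D = 0.01                 # prior P(D), D = "has the disease"
    p_pos_given_D = 0.95       # sensitivity P(+|D)
    p_pos_given_notD = 0.05    # false-positive rate P(+|not D)

    # Law of total probability over the partition {D, not D}:
    # P(+) = P(+|D)P(D) + P(+|not D)P(not D)
    p_pos = p_pos_given_D * p_D + p_pos_given_notD * (1 - p_D)

    # Bayes' theorem: P(D|+) = P(+|D)P(D) / P(+)
    p_D_given_pos = p_pos_given_D * p_D / p_pos
    print(p_pos, p_D_given_pos)   # ~0.059 and ~0.161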
Homework HW-1.pdf
Lecture 4: To analyse random experiments, we usually focus on some numerical aspects of the experiments; these numerical quantities are called "random variables". Studied discrete and continuous random variables and learned the probability mass function (PMF) of a discrete random variable.
Suggested reading: Sheldon Ross, sections 4.1, 4.2.
Lecture note: 4-random variable.pdf
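A minimal Python sketch (illustrative; X = sum of two fair dice is an assumed example) building the PMF of a discrete random variable from the underlying sample space:

    from fractions import Fraction
    from itertools import product
    from collections import Counter

    outcomes = list(product(range(1, 7), repeat=2))       # equally likely pairs
    counts = Counter(a + b for a, b in outcomes)           # how often each value of X occurs
    pmf = {k: Fraction(c, len(outcomes)) for k, c in sorted(counts.items())}

    assert sum(pmf.values()) == 1      # a PMF must sum to 1
    print(pmf[7])                      # 1/6, the most likely value of the sum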
Lecture 5: Introduced probability distributions and studied some special distributions, for example X ~ Bernoulli(p), X ~ Geometric(p), and X ~ Binomial(n, p).
Suggested reading: Sheldon Ross, section 4.6.
Lecture note: 5-special distributions.pdf
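A minimal Python sketch (not part of the notes) writing the PMFs of these special distributions directly from their closed forms; the parameter values in the examples are assumed:

    from math import comb

    def bernoulli_pmf(k, p):            # X ~ Bernoulli(p), k in {0, 1}
        return p if k == 1 else 1 - p

    def geometric_pmf(k, p):            # X ~ Geometric(p), k = 1, 2, ... (trial of first success)
        return (1 - p) ** (k - 1) * p

    def binomial_pmf(k, n, p):          # X ~ Binomial(n, p), k = 0, ..., n
        return comb(n, k) * p**k * (1 - p) ** (n - k)

    print(binomial_pmf(3, 10, 0.5))     # P(3 heads in 10 fair coin tosses) ~ 0.117
    print(geometric_pmf(4, 0.5))        # P(first head on the 4th toss) = 0.0625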
Lecture 6: Introduced the cumulative distribution function (CDF). Although the PMF is one way to describe the distribution of a discrete random variable, it cannot be defined for continuous random variables. The CDF is another way to describe the distribution of a random variable; its advantage is that it is defined for every random variable (discrete, continuous, or mixed).
Lecture note: 6-cumulative distribution function.pdf
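A minimal Python sketch (illustrative; the Binomial and Exponential examples are assumptions, not taken from the lecture) showing that the same definition F_X(x) = P(X <= x) applies to a discrete and a continuous random variable:

    from math import comb, exp

    def binomial_cdf(x, n, p):          # discrete: sum the PMF over k <= x
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(0, int(x) + 1))

    def exponential_cdf(x, lam):        # continuous: F(x) = 1 - exp(-lam*x) for x >= 0
        return 1 - exp(-lam * x) if x >= 0 else 0.0

    print(binomial_cdf(5, 10, 0.5))     # P(at most 5 heads in 10 tosses) ~ 0.623
    print(exponential_cdf(1.0, 2.0))    # ~ 0.865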
Lecture 7: Expectation of discrete random variables and linearity of expectation.
Suggested reading: Sheldon Ross, section 4.3.
Lecture note: 7-expectations.pdf
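A minimal Python sketch (assumed two-dice example) computing expectations by averaging over equally likely outcomes and checking linearity of expectation:

    from fractions import Fraction
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))   # equally likely pairs

    def E(g):
        """Expectation of g(outcome) under equally likely outcomes."""
        return Fraction(sum(g(w) for w in outcomes), len(outcomes))

    E_X = E(lambda w: w[0])              # first die: 7/2
    E_Y = E(lambda w: w[1])              # second die: 7/2
    E_sum = E(lambda w: w[0] + w[1])     # sum of the two dice: 7
    assert E_sum == E_X + E_Y            # linearity of expectation
    print(E_X, E_sum)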
Lecture 8: Functions of discrete random variables, the Law of the Unconscious Statistician (LOTUS) for discrete random variables, and variance.
Suggested reading: Sheldon Ross, sections 4.4, 4.5.
Lecture note: 8-LOTUS-variance of discrete random variables.pdf
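A minimal Python sketch (the fair-die PMF is an assumed example) applying LOTUS with g(x) = x^2 to obtain the variance Var(X) = E[X^2] - (E[X])^2:

    from fractions import Fraction

    pmf = {k: Fraction(1, 6) for k in range(1, 7)}      # PMF of a fair die

    E_X  = sum(k * p for k, p in pmf.items())           # 7/2
    E_X2 = sum(k**2 * p for k, p in pmf.items())        # LOTUS with g(x) = x^2: 91/6
    var  = E_X2 - E_X**2                                 # 35/12
    print(E_X, E_X2, var)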
Homework HW-2.pdf