Mathematics for Machine Learning
Before diving into machine learning, whether you are a beginner or an experienced professional seeking a career change, you should be familiar with a few mathematical concepts, such as probability distributions, statistics, linear algebra and matrices, regression, geometry, dimensionality reduction, and vector calculus. These ideas are applied widely in machine learning. For instance: what do we do in ML? Using training data as a basis, we build a prediction model (an algorithm or classifier) and use it to forecast fresh data. To assess the validity of that model, we employ a confusion matrix, which is predicated on conditional probability, a fundamental mathematical concept. Comprehending such mathematical ideas makes the inner workings of ML models far easier to follow.
The field of study known as "machine learning" aims to enable computers to learn without being explicitly programmed. The fundamental principles of machine learning are expressed within a machine learning model through mathematics.
Thus, mathematics plays a vital role in machine learning.
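To make the confusion-matrix remark above concrete, here is a minimal sketch (the counts are invented for illustration) showing that the usual confusion-matrix metrics are just conditional probabilities:

```python
# Hypothetical confusion-matrix counts (illustrative only).
tp, fp, fn, tn = 40, 10, 5, 45

# Precision = P(actually positive | predicted positive)
precision = tp / (tp + fp)

# Recall = P(predicted positive | actually positive)
recall = tp / (tp + fn)

# Accuracy = P(prediction is correct)
accuracy = (tp + tn) / (tp + fp + fn + tn)
```

Each metric conditions on a different event, which is exactly the conditional-probability idea the article refers to.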
Linear Algebra and Matrices
Content
• Vectors and Matrices:
    (a). Matrix Introduction
    (b). Matrix Addition:
        • Matrix Addition using NumPy Arrays
    (c). Matrix Multiplication:
        • Matrix Multiplication using Python
        • Matrix Manipulation using NumPy Arrays
    (d). Inverse of a Matrix:
        • Evaluating Inverse using NumPy Arrays
    (e). Transpose of a Matrix:
        • Evaluating Transpose using NumPy Arrays
    (f). Properties of Matrix
    (g). Determinant
    (h). Trace
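The matrix operations listed above can be sketched in a few lines of NumPy (a minimal example on a 2×2 matrix, not tied to any particular dataset):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

S = A + B                  # element-wise matrix addition
P = A @ B                  # matrix multiplication
A_T = A.T                  # transpose
A_inv = np.linalg.inv(A)   # inverse (A must be non-singular)
det_A = np.linalg.det(A)   # determinant: 1*4 - 2*3 = -2
tr_A = np.trace(A)         # trace: 1 + 4 = 5
```

Multiplying `A` by its inverse returns the identity matrix, which is a quick sanity check on `np.linalg.inv`.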
• System of Linear Equations:
    (a). System of Linear Equations
    (b). Solving Linear Equations using Gaussian Elimination
    (c). LU Decomposition of Linear Equations
    (d). Matrix Inversion
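As a small illustration of solving a linear system, the sketch below feeds a hand-made 2×2 system to `np.linalg.solve` (which uses an LU-based LAPACK routine internally):

```python
import numpy as np

# The system  2x + y = 5,  x + 3y = 10  in matrix form A @ v = b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

v = np.linalg.solve(A, b)  # v = [x, y]
```

Substituting back, x = 1 and y = 3 satisfy both equations.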
• Matrix Factorization:
    (a). Gram-Schmidt Process
    (b). QR Decomposition
    (c). Cholesky Decomposition
    (d). Singular Value Decomposition
    (e). Matrix Factorization
    (f). Diagonalization
    (g). Eigenvalues and Eigenvectors
    (h). Eigenspace
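Two of the factorizations above, eigendecomposition and SVD, can be sketched with NumPy on a small diagonal matrix (chosen so the answers are easy to verify by eye):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# Eigendecomposition: A @ v = lam * v for each eigenpair.
eigvals, eigvecs = np.linalg.eig(A)

# Singular value decomposition: A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A)
A_rebuilt = U @ np.diag(s) @ Vt
```

For this matrix the eigenvalues and singular values coincide (2 and 3), and multiplying the SVD factors back together reconstructs `A`.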
• Vector Spaces:
    (a). Vector Operations
    (b). Vector Spaces and Subspaces
    (c). Basis and Dimension
• Row Echelon Form
• Linear Mappings
• Least Squares and Curve Fitting
• Affine Spaces
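Least squares and curve fitting, listed above, reduce to solving an overdetermined linear system; a minimal sketch with `np.linalg.lstsq` on points that lie exactly on y = 2x + 1:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])   # exactly y = 2x + 1

X = np.column_stack([x, np.ones_like(x)])        # design matrix [x | 1]
coeffs, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
slope, intercept = coeffs
```

With noisy data the same call returns the line minimising the sum of squared residuals rather than an exact fit.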
Geometry for Machine Learning
Geometry is the branch of mathematics that deals with the forms, angles, measurements, and proportions of everyday objects.
Content
• Vector Norms
• Inner, Outer, and Cross Products
• Distance Between Two Points:
    (a). Distance Measures:
        • Euclidean Distance
        • Manhattan Distance
        • Minkowski Distance
        • Chebyshev Distance
• Similarity Measures:
    • Cosine Similarity
    • Jaccard Similarity
    • Pearson Correlation Coefficient
    • Kendall Rank Correlation Measure
    • Pearson Product-Moment Correlations
    • Spearman's Rank Correlation Measure
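Several of the distance and similarity measures above are one-liners in NumPy; a minimal sketch on two hand-picked vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 6.0, 8.0])

euclidean = np.linalg.norm(a - b)   # L2 norm of the difference
manhattan = np.abs(a - b).sum()     # L1 distance
chebyshev = np.abs(a - b).max()     # L-infinity distance
cosine_sim = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
```

Note that Euclidean, Manhattan, and Chebyshev distances are the Minkowski distance with p = 2, p = 1, and p → ∞ respectively.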
• Orthogonality and Orthogonal Projections:
    • Orthogonality and Orthonormal Vectors
    • Orthogonal Projections
    • Rotations
• Geometric Algorithms:
    • Nearest Neighbor Search
    • Voronoi Diagrams
    • Delaunay Triangulation
    • Geometric Intersection and Proximity Queries
• Constraints and Splines
• Box-Cox Transformations:
    • Box-Cox Transformation using Python
• Fourier Transformation:
    • Properties of the Fourier Transform
    • Inverse Fast Fourier Transformation
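The Fourier transform and its inverse can be sketched with NumPy's FFT routines on a synthetic signal (a pure 5 Hz sine sampled at 100 Hz, chosen so the spectral peak is easy to predict):

```python
import numpy as np

fs = 100                             # sampling rate in Hz
t = np.arange(100) / fs              # 1 second of samples
signal = np.sin(2 * np.pi * 5 * t)   # pure 5 Hz sine

spectrum = np.fft.fft(signal)
freqs = np.fft.fftfreq(len(signal), d=1 / fs)

# Strongest positive-frequency bin should sit at 5 Hz.
half = len(signal) // 2
peak_freq = freqs[np.argmax(np.abs(spectrum[:half]))]

# Inverse FFT recovers the original signal.
recovered = np.fft.ifft(spectrum).real
```

The forward transform concentrates all the energy in the 5 Hz bin, and the inverse transform reproduces the time-domain samples.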
Calculus for Machine Learning
Calculus is the branch of mathematics concerned with the study of continuous change. It is also known as infinitesimal calculus or "the calculus of infinitesimals," and the analysis of the continuous change of functions is known as classical calculus.
Content
• Differentiation:
    (a). Implicit Differentiation
    (b). Inverse Trigonometric Functions Differentiation
    (c). Logarithmic Differentiation
    (d). Partial Differentiation
    (e). Advanced Differentiation
• Mathematical Intuition Behind Gradients and Their Usage:
    (a). Implementation of Gradients using Python
    (b). Optimization Techniques using Gradient Descent
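Gradient descent itself fits in a few lines; a minimal sketch minimising f(x) = (x - 3)², whose gradient is f'(x) = 2(x - 3), so the iterates should converge to x = 3:

```python
def grad(x):
    # Gradient of f(x) = (x - 3)**2
    return 2 * (x - 3)

x = 0.0               # starting point
learning_rate = 0.1
for _ in range(200):  # repeatedly step against the gradient
    x -= learning_rate * grad(x)
```

Each step multiplies the remaining error by 0.8, so after 200 steps `x` is within numerical noise of the minimiser.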
• Higher-Order Derivatives
• Multivariate Taylor Series
• Applications of Derivatives:
    (a). Application of Derivatives - Maxima and Minima
    (b). Absolute Minima and Maxima
    (c). Constrained Optimization
    (d). Unconstrained Optimization
    (e). Constrained Optimization - Lagrange Multipliers
    (f). Newton's Method
• Uni-variate Optimization
• Multivariate Optimization
• Convex Optimization
• Lagrange's Interpolation
• Area Under Curve
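Newton's method, listed among the topics above, can be sketched in pure Python for the root of f(x) = x² - 2, i.e. √2, using the update x ← x - f(x)/f'(x):

```python
def f(x):
    return x * x - 2

def f_prime(x):
    return 2 * x

x = 1.0            # initial guess
for _ in range(8): # Newton iteration: x <- x - f(x)/f'(x)
    x -= f(x) / f_prime(x)
```

Convergence is quadratic: the number of correct digits roughly doubles each iteration, so eight steps are already far more than needed for machine precision.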
Statistics for Machine Learning
Statistics is the branch of applied mathematics concerned with the collection, tabulation, analysis, interpretation, and presentation of numerical data.
Statistical Inference
Content
• Mean, Standard Deviation, and Variance:
    • Calculating Mean, Standard Deviation, and Variance using NumPy Arrays
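A minimal NumPy sketch of these three statistics (the data values are arbitrary, chosen so the answers come out round):

```python
import numpy as np

data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

mean = data.mean()
variance = data.var()          # population variance (divides by n)
std_dev = data.std()           # population standard deviation
sample_var = data.var(ddof=1)  # sample variance (divides by n - 1)
```

The `ddof=1` variant applies Bessel's correction, which matters when the data are a sample rather than the whole population.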
• Sample Error and True Error
• Bias vs Variance and Its Trade-Off
• Hypothesis Testing:
    (a). T-test
    (b). Paired T-test
    (c). p-value
    (d). F-test
    (e). z-test
• Confidence Intervals
• Correlation and Covariance
• Correlation Coefficient
• Covariance Matrix
• Normal Probability Plot
• Q-Q Plot
• Residuals Leverage Plot
• Robust Correlations
• Hypothesis Testing:
    (a). Null and Alternative Hypotheses
    (b). Type 1 and Type 2 Errors
    (c). p-value Interpretation
    (d). Parametric Hypothesis Testing:
        • T-test
        • Paired Samples t-test
        • ANOVA Test
    (e). Non-Parametric Hypothesis Testing:
        • Mann-Whitney U Test
        • Wilcoxon Signed-Rank Test
        • Kruskal-Wallis Test
        • Friedman Test
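A two-sample t-test of the kind listed above can be sketched with SciPy (assuming `scipy` is installed); the two groups are drawn with a deliberate 0.5 shift in means so the null hypothesis should be rejected:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(loc=0.0, scale=1.0, size=500)
group_b = rng.normal(loc=0.5, scale=1.0, size=500)

# Independent two-sample t-test: H0 says the two population means are equal.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
reject_h0 = p_value < 0.05
```

With 500 samples per group and a half-standard-deviation shift, the test has very high power, so the small p-value is expected rather than lucky.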
• Theory of Estimation:
    (a). Difference between Estimators and Estimation
    (b). Methods of Estimation:
        • Method of Moments
        • Bayesian Estimation
        • Least Squares Estimation
        • Maximum Likelihood Estimation
    (c). Likelihood Function and Log-Likelihood Function
    (d). Properties of Estimators:
        • Unbiasedness
        • Consistency
        • Sufficiency
        • Completeness
        • Robustness
• Confidence Intervals
Regression for Machine Learning
Regression is a statistical process for estimating the relationship between a dependent (criterion) variable and one or more independent (predictor) variables.
Content
• Parameter Estimation
• Bayesian Linear Regression
• Quantile Linear Regression
• Normal Equation in Linear Regression
• Maximum Likelihood as Orthogonal Projection
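The normal equation mentioned above, θ = (XᵀX)⁻¹Xᵀy, can be sketched directly in NumPy on points lying exactly on y = 2x + 1:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])          # exactly y = 2x + 1

X = np.column_stack([np.ones_like(x), x])   # bias column plus feature
theta = np.linalg.inv(X.T @ X) @ X.T @ y    # (X^T X)^{-1} X^T y
```

In practice `np.linalg.lstsq` or `np.linalg.solve` is preferred over forming the explicit inverse, which is numerically fragile for ill-conditioned design matrices.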
Probability and Distributions for Machine Learning
A probability distribution is a statistical function that describes all the possible values a random variable can take and how likely each value is.
Content
• Probability
• Chance and Probability
• Addition Rule for Probability
• Law of Total Probability
• Bayes' Theorem
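The law of total probability and Bayes' theorem combine naturally in the classic diagnostic-test calculation; the numbers below are assumed for illustration, not taken from real data:

```python
p_d = 0.01           # prior P(disease)
p_pos_d = 0.99       # P(positive | disease), the test's sensitivity
p_pos_not_d = 0.05   # P(positive | no disease), the false-positive rate

# Law of total probability: P(positive) over both disease states.
p_pos = p_pos_d * p_d + p_pos_not_d * (1 - p_d)

# Bayes' theorem: posterior P(disease | positive).
p_d_pos = p_pos_d * p_d / p_pos
```

Despite the accurate test, the posterior is only about 1/6, because the disease is rare; this is exactly the kind of base-rate reasoning Bayes' theorem formalises.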
• Discrete Probability Distributions:
    (a). Discrete Uniform Distribution
    (b). Bernoulli Distribution
    (c). Binomial Distribution
    (d). Poisson Distribution
• Continuous Probability Distributions:
    (a). Continuous Uniform Distribution
    (b). Exponential Distribution
    (c). Normal Distribution
    (d). Beta Distribution:
        • Beta Distribution of the First Kind
        • Beta Distribution of the Second Kind
    (e). Gamma Distribution
• Sampling Distributions:
    (a). Chi-Square Distribution
    (b). F-Distribution
    (c). t-Distribution
• Central Limit Theorem:
    • Implementation of the Central Limit Theorem
• Law of Large Numbers
• Change of Variables / Inverse Transformation
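The central limit theorem can be demonstrated in a few lines: averaging many Uniform(0, 1) samples produces means that cluster normally around 0.5 with spread σ/√n (a simulation sketch with an arbitrary seed):

```python
import numpy as np

rng = np.random.default_rng(42)

# 10,000 samples of size 50 from Uniform(0, 1), averaged per row.
sample_means = rng.uniform(0, 1, size=(10_000, 50)).mean(axis=1)

grand_mean = sample_means.mean()            # should be close to 0.5
spread = sample_means.std()                 # should be close to sigma/sqrt(n)
theoretical_spread = (1 / 12 / 50) ** 0.5   # Uniform(0,1) has variance 1/12
```

Plotting a histogram of `sample_means` would show the familiar bell shape even though the underlying distribution is flat.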
Dimensionality Reduction for Machine Learning
Dimensionality reduction is a technique for reducing the number of input variables in training data.
Content
• Introduction to Dimensionality Reduction
• Projection Perspective in Machine Learning
• Eigenvector Computation and Low-Rank Approximations
• Mathematical Intuition Behind PCA:
    • PCA Implementation in Python
• Latent Variable Perspective
• Mathematical Intuition Behind LDA:
    • Implementation of Linear Discriminant Analysis (LDA)
• Mathematical Intuition Behind GDA:
    • Implementation of Generalized Discriminant Analysis (GDA)
• Mathematical Intuition Behind the t-SNE Algorithm:
    • Implementation of the t-SNE Algorithm
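PCA ties the linear algebra and statistics topics above together: centre the data, form the covariance matrix, eigendecompose it, and project onto the leading eigenvectors. A minimal sketch on synthetic data whose third feature is an exact combination of the first two (so only two directions carry variance):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:, 2] = X[:, 0] + X[:, 1]              # redundant third feature

Xc = X - X.mean(axis=0)                  # centre the data
cov = np.cov(Xc, rowvar=False)           # 3x3 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

components = eigvecs[:, -2:]             # top-2 principal directions
X_reduced = Xc @ components              # projected 200x2 data
```

Because the third feature is linearly dependent, the smallest eigenvalue is (numerically) zero: dropping that direction loses no information, which is the core idea of PCA-based dimensionality reduction.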