Definition of ML: Definitions by Arthur Samuel and Tom Mitchell
Scope and Application of ML
ML Algorithms
Supervised Learning:
Regression and Classification
Linear Regression with one variable:
Model representation
Cost function
Gradient descent for linear regression
Learning rate
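The gradient-descent topics above can be sketched in a few lines of Python. This is an illustrative example only (the data, learning rate, and variable names are our own, not from the syllabus): it fits h(x) = w*x + b by repeatedly stepping down the gradient of the squared-error cost.

```python
# Gradient descent for linear regression with one variable (illustrative sketch).
# Model: h(x) = w*x + b; cost J = (1/2m) * sum((h(x_i) - y_i)^2).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]           # generated from y = 2x + 1

w, b = 0.0, 0.0
alpha = 0.05                        # learning rate
m = len(xs)

for _ in range(5000):
    # Partial derivatives of J with respect to w and b.
    grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / m
    grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) / m
    w -= alpha * grad_w
    b -= alpha * grad_b
```

With a suitable learning rate the parameters converge close to the generating values w = 2, b = 1; too large a rate makes the updates diverge, which is the point of the "Learning rate" topic.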
Unsupervised Learning:
Cocktail party problem
Reinforcement Learning and Recommender Systems
Linear Algebra:
Inverse and Transpose of Matrices
Significance of Eigenvalues and Eigenvectors
Singular Value Decomposition
Linear Regression with Multiple Variables:
Multiple Features
Polynomial Regression
Normal Equation
Non-invertibility
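The normal equation theta = (X^T X)^(-1) X^T y can be worked out by hand for one feature plus an intercept, since X^T X is then a 2x2 matrix. The sketch below uses our own illustrative data; the determinant check shows where non-invertibility enters.

```python
# Normal equation for one feature plus intercept, via explicit 2x2 algebra.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]           # y = 2x + 1 exactly

m = len(xs)
# Entries of X^T X for a design matrix X with rows [1, x_i].
a = m                               # sum of 1*1
b = sum(xs)                         # sum of x_i
c = sum(x * x for x in xs)          # sum of x_i^2
# Entries of X^T y.
p = sum(ys)
q = sum(x * y for x, y in zip(xs, ys))

det = a * c - b * b                 # X^T X is non-invertible when det == 0
theta0 = (c * p - b * q) / det      # intercept
theta1 = (a * q - b * p) / det      # slope
```

Because the data are exactly linear, the solution is recovered exactly (theta0 = 1, theta1 = 2), with no iteration and no learning rate, which is the trade-off against gradient descent discussed in this module.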
Logistic Regression for Classification:
Hypothesis representation
Decision boundary
Multiclass classification
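The hypothesis representation and decision boundary topics can be illustrated with a one-dimensional logistic regression, trained by gradient descent on the log-loss. The data and parameter values below are our own illustrative choices.

```python
import math

# Logistic regression for a 1-D binary problem (illustrative sketch).
# Hypothesis: h(x) = sigmoid(w*x + b); decision boundary where w*x + b = 0.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

w, b = 0.0, 0.0
alpha = 0.5
m = len(xs)

for _ in range(5000):
    grad_w = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / m
    grad_b = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / m
    w -= alpha * grad_w
    b -= alpha * grad_b

boundary = -b / w       # the point where h(x) = 0.5
```

The learned boundary falls between the two classes (near x = 2.5 for this symmetric data). Multiclass classification reuses this machinery one-vs-all: one such classifier per class, predicting the class whose h(x) is largest.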
Neural Networks:
Representation:
Non-linear hypothesis
Neurons and the brain
Weights, effective inputs, threshold, and activation function
The McCulloch-Pitts Neuron Model:
Single-layer neural network
Multi-layer neural network
Learning:
General learning rule for one neuron
Perceptron and Delta Learning Rules
Backpropagation Algorithm
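The general learning rule for one neuron can be sketched with the perceptron rule on the AND function (a linearly separable problem, so convergence is guaranteed). The inputs, targets, and hyperparameters below are our own illustrative choices.

```python
# Perceptron learning rule on the AND function (illustrative sketch).
# Weight update: w <- w + eta * (target - output) * x, applied per example.
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 0, 0, 1]

w = [0.0, 0.0]
b = 0.0
eta = 0.1

def predict(x):
    # Threshold (step) activation on the effective input w.x + b.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(25):     # epochs; AND is linearly separable, so this converges
    for x, t in zip(inputs, targets):
        err = t - predict(x)
        w[0] += eta * err * x[0]
        w[1] += eta * err * x[1]
        b += eta * err
```

The delta rule replaces the step activation's error with the derivative-weighted error of a differentiable activation, and backpropagation chains that same idea through multiple layers.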
Evaluating Hypotheses:
Estimating hypothesis accuracy
Sample error and true error
Basics of Sampling Theory:
The Binomial Distribution
Estimators, Bias, and Variance
Confidence Intervals
Central Limit Theorem:
Deriving confidence intervals
Hypothesis testing
Paired t-test for Comparing Learning Algorithms
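The confidence-interval topic reduces to one formula: with r misclassifications in n test examples, the Normal approximation to the Binomial gives a two-sided interval around the sample error. The counts below are illustrative, not from the syllabus.

```python
import math

# 95% confidence interval for the true error from a measured sample error,
# using the Normal approximation to the Binomial (illustrative numbers).
n = 200                 # test examples
r = 30                  # misclassified examples
sample_error = r / n    # 0.15
sd = math.sqrt(sample_error * (1 - sample_error) / n)
z95 = 1.96              # z-value for a 95% two-sided interval
lo, hi = sample_error - z95 * sd, sample_error + z95 * sd
```

Here the interval is roughly (0.10, 0.20): with 95% confidence the true error lies within about five percentage points of the measured 15%. Larger n shrinks sd and tightens the interval.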
Support Vector Machines (SVM):
Large margin classification
Applications of SVM
Linear and Non-linear SVM
Kernels
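The kernels topic centers on similarity functions such as the Gaussian (RBF) kernel, which lets a linear large-margin classifier draw non-linear boundaries. A minimal sketch (function name and parameters are our own):

```python
import math

# Gaussian (RBF) kernel used in non-linear SVMs (illustrative sketch).
# k(x, l) = exp(-||x - l||^2 / (2 * sigma^2)): similarity of x to landmark l.
def rbf_kernel(x, l, sigma=1.0):
    sq_dist = sum((xi - li) ** 2 for xi, li in zip(x, l))
    return math.exp(-sq_dist / (2 * sigma ** 2))
```

The kernel equals 1 when x coincides with the landmark and decays toward 0 with distance; sigma controls how quickly the similarity falls off, and hence how wiggly the resulting decision boundary can be.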
Dimensionality Reduction:
Data Compression
Principal Component Analysis (PCA)
Choosing the number of principal components
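Choosing the number of principal components usually comes down to the fraction of variance the leading eigenvalues retain. For two features the eigenvalues of the covariance matrix have a closed form, so the idea fits in a short sketch (data values are our own, chosen to be strongly correlated):

```python
import math

# Choosing the number of principal components by variance retained
# (illustrative 2-D sketch using the closed form for 2x2 eigenvalues).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.1, 1.9, 3.2, 3.8]           # strongly correlated with xs

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
# Sample covariance matrix [[a, b], [b, c]].
a = sum((x - mx) ** 2 for x in xs) / (n - 1)
c = sum((y - my) ** 2 for y in ys) / (n - 1)
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

# Eigenvalues of a symmetric 2x2 matrix.
disc = math.sqrt((a - c) ** 2 + 4 * b ** 2)
lam1 = (a + c + disc) / 2
lam2 = (a + c - disc) / 2

retained = lam1 / (lam1 + lam2)     # fraction of variance kept by 1 component
```

Because the two features are nearly collinear, a single component retains well over 95% of the variance, so the data compress to one dimension with little loss; the common rule of thumb is to keep the smallest k with retained variance above 0.95 or 0.99.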
Unsupervised Learning and Anomaly Detection:
k-Means Algorithm:
Optimization objective
Choosing the number of clusters
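The k-means optimization objective (minimizing the sum of squared distances to assigned centroids) is driven by two alternating steps. A one-dimensional sketch with our own illustrative data:

```python
# One-dimensional k-means (Lloyd's algorithm), illustrative sketch.
# Objective: minimize the sum of squared distances to assigned centroids.
points = [1.0, 1.2, 0.8, 5.0, 5.2, 4.8]
centroids = [0.0, 6.0]              # initial guesses for k = 2

for _ in range(10):
    # Assignment step: each point joins its nearest centroid's cluster.
    clusters = [[] for _ in centroids]
    for p in points:
        j = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
        clusters[j].append(p)
    # Update step: move each centroid to the mean of its cluster.
    centroids = [sum(c) / len(c) if c else centroids[i]
                 for i, c in enumerate(clusters)]
```

Both steps can only decrease the objective, so the algorithm converges (here to centroids near 1.0 and 5.0). Choosing k is the separate problem the next topic addresses, e.g. by plotting the final objective against k.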
Hierarchical Agglomeration
Scope of Anomaly Detection:
Gaussian Distribution
Developing and Evaluating an Anomaly Detection System
Anomaly Detection vs. Supervised Learning
Choosing features
Multivariate Gaussian Distribution
Anomaly detection using Multivariate Gaussian Distribution
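The Gaussian anomaly-detection topics above follow one recipe: fit a density to normal examples, then flag any point whose density falls below a threshold. A univariate sketch with our own illustrative data and threshold:

```python
import math

# Density-based anomaly detection with a univariate Gaussian (illustrative).
# Fit mu and sigma^2 on normal examples; flag x as anomalous when p(x) < epsilon.
train = [9.8, 10.1, 10.0, 9.9, 10.2]
mu = sum(train) / len(train)
var = sum((x - mu) ** 2 for x in train) / len(train)

def p(x):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

epsilon = 0.05      # threshold, normally tuned on a labelled cross-validation set
```

A point near the training data (e.g. 10.05) has high density and passes, while a far-off point (e.g. 25.0) is flagged. With several features the per-feature densities are multiplied, and the multivariate Gaussian generalizes this by also modeling correlations between features.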
Ethem Alpaydin, Introduction to Machine Learning, Second Edition, The MIT Press, Cambridge, Massachusetts, London, England, 2010.
Tom M. Mitchell, Machine Learning, McGraw Hill Education, 2017.
Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2011.
Trevor Hastie, Robert Tibshirani, Jerome H. Friedman, The Elements of Statistical Learning, Springer, 2nd Edition, 2009.
Max Kuhn and Kjell Johnson, Applied Predictive Modeling, Springer, 2013.
Sebastian Raschka and Vahid Mirjalili, Python Machine Learning, Packt Publishing, 2017.
Module 1: Definition and Applications of Machine Learning (6 Lectures)
Lecture 1: Introduction to Machine Learning (ML)
Lecture 2: Overview of ML Algorithms
Lecture 3: Supervised Learning – Regression and Classification
Lecture 4: Cost Function and Gradient Descent for Linear Regression
Lecture 5: Learning Rate and Optimization in Gradient Descent
Lecture 6: Introduction to Unsupervised Learning
Module 2: Linear Algebra and Advanced Regression (8 Lectures)
Lecture 7: Basics of Linear Algebra
Lecture 8: Eigenvalues, Eigenvectors, and their Significance
Lecture 9: Singular Value Decomposition
Lecture 10: Linear Regression with Multiple Variables
Lecture 11: Normal Equation and Non-invertibility
Lecture 12: Introduction to Logistic Regression for Classification
Lecture 13: Decision Boundary in Logistic Regression
Lecture 14: Multiclass Classification
Module 3: Neural Networks (7 Lectures)
Lecture 15: Introduction to Neural Networks
Lecture 16: McCulloch-Pitts Neuron Model
Lecture 17: Multi-layer Neural Networks
Lecture 18: Learning in Neural Networks
Lecture 19: Perceptron Learning Rule
Lecture 20: Delta Learning Rule
Lecture 21: Backpropagation Algorithm
Module 4: Hypothesis Evaluation and Statistical Theory (7 Lectures)
Lecture 22: Evaluating Hypotheses
Lecture 23: Sample Error vs. True Error
Lecture 24: Basics of Sampling Theory
Lecture 25: Estimators: Bias and Variance
Lecture 26: Confidence Intervals and the Central Limit Theorem
Lecture 27: Hypothesis Testing
Lecture 28: Paired t-Test for Comparing Learning Algorithms
Module 5: SVM, Dimensionality Reduction, and Anomaly Detection (7 Lectures)
Lecture 29: Support Vector Machines (SVM)
Lecture 30: Linear and Non-linear SVM, Kernels
Lecture 31: Dimensionality Reduction
Lecture 32: k-Means Algorithm
Lecture 33: Hierarchical Agglomeration and Scope of Anomaly Detection
Lecture 34: Gaussian Distribution and Anomaly Detection System Development
Lecture 35: Multivariate Gaussian Distribution for Anomaly Detection
Review Paper: 15 marks
Assignments: 5 marks
Seminar: 5 marks
Quiz(es): 5 marks
Internal Examination: 10 marks
CO: 5
Content (40 marks)
Accuracy (10 marks): Information is factually correct, well-researched.
Relevance (10 marks): Content is directly related to the seminar topic and objectives.
Depth (10 marks): Presentation covers the topic comprehensively, including background information and current trends.
Originality (10 marks): The presentation provides unique insights or a novel approach to the topic.
Organization (20 marks)
Structure (10 marks): Clear introduction, body, and conclusion; logical flow of ideas.
Pacing (10 marks): Time is well-managed, with neither rushed nor excessively slow segments.
Audio and Voice Delivery (20 marks)
Clarity (10 marks): Speaker articulates clearly, with good diction and appropriate volume.
Engagement (10 marks): Speaker uses tone variation and pauses effectively to maintain interest.
Visual Aids (10 marks)
Quality (5 marks): Slides or visual aids are legible, aesthetically pleasing, and free from excessive text.
Usefulness (5 marks): Visual aids enhance understanding of the topic and are relevant to the content discussed.
Understanding and Knowledge (10 marks)
Grasp of Topic (5 marks): Speaker demonstrates a strong understanding of the subject matter.
Responses to Hypothetical Questions (5 marks): Speaker anticipates and addresses potential questions in the presentation.
Technical Quality (10 marks)
Video/Audio Quality (10 marks): The audio is clear without background noise, and the video (if any visual elements are present) is steady and well-lit.
Please upload the slides and the videos. Do not create a separate folder for each student. Format for filename: <RollNo>_<Name>_<Slides/Video>
Submit your assignments here. Create a separate folder for each student. Folder name should be <YourFirstName>_<YourRollNo>
All questions carry equal marks.
Q1: CO2, Q2: CO3, Q3: CO3, Q4: CO2, Q5: CO5