Formal Language & Automata Theory
Unit-1
Importance of Automata Theory, Central Concepts of Automata Theory, Introduction to DFA and NFA, Acceptance of a String by a DFA, Acceptance of a String by an NFA, Design of DFAs, Design of NFAs, Conversion of NFA to DFA, Introduction to NFA with ε-Transitions, Conversion of NFA with ε-Transitions to NFA without ε-Transitions. Minimization of DFA, Introduction to Mealy and Moore Machines, Design of Mealy and Moore Machines, Conversion of Mealy to Moore Machines and Moore to Mealy Machines, Applications and Limitations of Finite Automata.
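As a quick illustration of string acceptance by a DFA, the sketch below simulates a transition table directly in Python; the language (binary strings ending in 0), the state names and the transitions are invented for illustration only.

    # Minimal DFA simulator: accepts binary strings that end in '0'.
    # States, alphabet and transition table are an illustrative example.
    dfa = {
        "start": "q0",
        "accept": {"q1"},
        "delta": {                      # delta[(state, symbol)] -> next state
            ("q0", "0"): "q1", ("q0", "1"): "q0",
            ("q1", "0"): "q1", ("q1", "1"): "q0",
        },
    }

    def accepts(dfa, word):
        state = dfa["start"]
        for symbol in word:
            state = dfa["delta"][(state, symbol)]
        return state in dfa["accept"]

    print(accepts(dfa, "1010"))   # True:  ends in 0
    print(accepts(dfa, "101"))    # False: ends in 1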
Unit-2
Introduction to Regular Expressions, Regular Sets, Identity Rules, Equivalence of Two Regular Expressions, Conversion of Regular Expressions to NFA with ε-Transitions, Conversion of DFA to Regular Expression. Pumping Lemma for Regular Languages, Applications of the Pumping Lemma, Closure Properties of Regular Languages, Applications of Regular Expressions.
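A practical way to experiment with the membership question for a regular language is Python's re module; note that formal regular expressions use only union, concatenation and the Kleene star, while re adds many extra operators. The pattern below is an illustrative example, not part of the syllabus.

    # L = (0|1)*0 : binary strings ending in 0
    import re

    pattern = re.compile(r"^(0|1)*0$")
    for w in ["1010", "101", "0", ""]:
        print(repr(w), bool(pattern.match(w)))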
Unit-3
Chomsky Hierarchy, Regular Grammar, Left-Linear Grammar, Right-Linear Grammar, Conversion of Finite Automata to Regular Grammars and Regular Grammars to Finite Automata, Context Free Grammar, Construction of CFGs for Languages, Determining the Language of a Grammar, Leftmost and Rightmost Derivations, Parse Trees. Ambiguous Grammars, Simplification of Context Free Grammars (Elimination of Useless Symbols, ε-Productions and Unit Productions), Normal Forms (Chomsky Normal Form and Greibach Normal Form).
Unit-4
Pumping Lemma for CFLs, Applications of the Pumping Lemma for CFLs, Closure Properties of CFLs, Applications of Context Free Grammars, Introduction to Pushdown Automata, Model, Graphical Notation, Instantaneous Description, Language Acceptance of Pushdown Automata (Acceptance by Empty Stack and by Final State), Design of Pushdown Automata for CFLs. Deterministic and Non-Deterministic Pushdown Automata, Conversion of Pushdown Automata to Context Free Grammars, Conversion of Context Free Grammars to Pushdown Automata, Applications of Pushdown Automata.
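The stack-based recognizer below is a PDA-style sketch for the illustrative language L = { a^n b^n : n >= 0 }: push a marker for every a, pop one for every b, and accept when the input is consumed and the stack is empty (acceptance by empty stack).

    def accepts_anbn(word):
        stack = []
        i = 0
        while i < len(word) and word[i] == "a":   # push phase
            stack.append("A")
            i += 1
        while i < len(word) and word[i] == "b":   # pop phase
            if not stack:
                return False
            stack.pop()
            i += 1
        return i == len(word) and not stack       # input consumed, stack empty

    for w in ["", "ab", "aabb", "aab", "ba"]:
        print(repr(w), accepts_anbn(w))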
Unit-5
Introduction to Turing Machines, Representation of Turing Machines (Instantaneous Descriptions, Transition Tables and Transition Diagrams), Design of Turing Machines. Types of Turing Machines, Church’s Thesis, Universal Turing Machine, Introduction to Decidable and Undecidable Problems, Halting Problem of Turing Machines, Post’s Correspondence Problem, Modified Post’s Correspondence Problem, Introduction to the Classes P and NP, NP-Hard and NP-Complete Problems.
Compiler Design
Unit-1
Language Processors: Introduction to Language Processing, Structure of a Compiler, The Science of Building a Compiler, Compiler-Construction Tools.
Lexical Analysis: The Role of Lexical Analysis, Input Buffering, Specification of Tokens, Recognition of Tokens, The Lexical Analyzer Generator LEX.
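To make the token-recognition topics concrete, here is a toy scanner written with Python regular expressions; the token classes and the sample input are invented, and a generated scanner (e.g. from LEX) would instead be driven by a full token specification.

    import re

    token_spec = [
        ("NUMBER", r"\d+"),
        ("ID",     r"[A-Za-z_]\w*"),
        ("OP",     r"[+\-*/=]"),
        ("SKIP",   r"\s+"),
    ]
    master = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in token_spec))

    def tokenize(source):
        for m in master.finditer(source):
            if m.lastgroup != "SKIP":
                yield (m.lastgroup, m.group())

    print(list(tokenize("count = count + 1")))
    # [('ID', 'count'), ('OP', '='), ('ID', 'count'), ('OP', '+'), ('NUMBER', '1')]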
Unit-2
Syntax Analysis: The Role of a Parser, CFG (Definition of CFG, Derivations and Parse Trees, Ambiguity), Writing a Grammar (Eliminating Ambiguity, Elimination of Left Recursion and Left Factoring in CFG), Bottom-up Parsing (Shift Reduce Parsing). Top-down Parsing (Recursive-Descent Parsing and Predictive Parsing).
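A minimal recursive-descent parser over the illustrative grammar E -> T { (+|-) T }, T -> F { (*|/) F }, F -> number | ( E ) (with left recursion already removed) might look like the sketch below; it evaluates the expression as it parses, purely for demonstration.

    import re

    def parse(expr):
        tokens = re.findall(r"\d+|[+\-*/()]", expr)
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def eat(expected=None):
            nonlocal pos
            tok = tokens[pos]
            if expected is not None and tok != expected:
                raise SyntaxError(f"expected {expected}, got {tok}")
            pos += 1
            return tok

        def factor():                       # F -> number | ( E )
            if peek() == "(":
                eat("(")
                value = expression()
                eat(")")
                return value
            return int(eat())

        def term():                         # T -> F { (*|/) F }
            value = factor()
            while peek() in ("*", "/"):
                op = eat()
                value = value * factor() if op == "*" else value // factor()
            return value

        def expression():                   # E -> T { (+|-) T }
            value = term()
            while peek() in ("+", "-"):
                op = eat()
                value = value + term() if op == "+" else value - term()
            return value

        return expression()

    print(parse("2 + 3 * (4 - 1)"))         # 11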
Unit-3
Introduction to LR Parsers (Simple LR Parsing), More Powerful LR Parsers (Canonical LR and LALR Parsing), Using Ambiguous Grammars, Error Recovery in LR Parsers, The Parser Generator YACC.
Syntax-Directed Translation: Syntax Directed Definitions (Inherited and Synthesized Attributes, Evaluating an SDD at the Nodes of a Parse Tree), Evaluation Order of SDDs (Dependency Graphs, Ordering the Evaluation of Attributes, S-Attributed Definitions and L-Attributed Definitions), Applications of Syntax-Directed Translation (Construction of Syntax Trees), Syntax Directed Translation Schemes (Postfix Translation Schemes, Parser-Stack Implementation of Postfix SDTs).
Unit-4
Intermediate Code Generation: Variants of Syntax Trees (DAGs for Expressions, The Value-Number Method for Constructing DAGs), Three-Address Code (Addresses and Instructions, Quadruples, Triples), Type Checking (Rules for Type Checking and Type Conversion).
Code Optimization: The Principal Sources of Optimization, Introduction to Basic Blocks and Flow Graphs, Optimization of Basic Blocks, Introduction to Data-Flow Analysis.
Unit-5
Code Generation: Issues in the Design of a Code Generator, The Target Language, A Simple Code Generator, Code Generation from DAG, Peephole Optimization, Register Allocation and Assignment.
Runtime Environments: Storage Organization, Stack Allocation of Space, Heap Management, Symbol Tables (Symbol Table Per Scope, Use of Symbol Tables).
Python Programming
Unit-1
Introduction: Introduction to Python, Program Development Cycle, Input, Processing, and Output, Displaying Output with the Print Function, Comments, Variables, Reading Input from the Keyboard, Performing Calculations, Operators. Type conversions, Expressions, More about Data Output.
Data Types and Expressions: Strings, Assignment, and Comments, Numeric Data Types and Character Sets, Using Functions and Modules.
Decision Structures and Boolean Logic: if, if-else, if-elif-else Statements, Nested Decision Structures, Comparing Strings, Logical Operators, Boolean Variables.
Repetition Structures: Introduction, while loop, for loop, Input Validation Loops, Nested Loops.
Unit-2
Strings and Text Files: Accessing Characters and Substrings in Strings, Strings and Number Systems, String Methods, Text Files.
Data structures:
Lists - creating a list, accessing, slicing and other operations
Tuples - creating a tuple, accessing and other operations
Sets - creating a set, modifying, removing and other operations
Dictionaries - creating a dictionary, accessing keys and values and other operations
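A short tour of the four collection types listed above (the sample values are arbitrary):

    languages = ["C", "Java", "Python"]          # list: ordered, mutable
    languages.append("Go")
    print(languages[1:3])                        # slicing -> ['Java', 'Python']

    point = (3, 4)                               # tuple: ordered, immutable
    x, y = point                                 # unpacking
    print(x, y)

    seen = {"alice", "bob"}                      # set: unordered, no duplicates
    seen.add("alice")                            # already present, no effect
    seen.discard("bob")
    print(seen)

    marks = {"alice": 82, "bob": 74}             # dictionary: key -> value
    marks["carol"] = 91
    for name, score in marks.items():
        print(name, score)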
Unit-3
Design with Functions: Functions as Abstraction Mechanisms, Problem Solving with Top-Down Design, Design with Recursive Functions, Case Study: Gathering Information from a File System, Managing a Program’s Namespace, Higher-Order Functions.
Modules: Modules, Standard Modules, Packages.
Unit-4
File Operations: Reading config files in Python, Writing log files in Python, Understanding the read functions read(), readline() and readlines(), Understanding the write functions write() and writelines(), Manipulating the file pointer using seek(), Programming using file operations.
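The read/write/seek calls named above in one minimal sketch (the file name demo.log is just a placeholder):

    with open("demo.log", "w") as f:
        f.write("first line\n")
        f.writelines(["second line\n", "third line\n"])

    with open("demo.log", "r") as f:
        print(f.readline(), end="")   # one line
        print(f.readlines())          # remaining lines as a list
        f.seek(0)                     # move the file pointer back to the start
        print(f.read(), end="")       # the whole file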
Object Oriented Programming: Concepts of classes, objects and instances, Constructors, class attributes and destructors, Inheritance, method overriding and operator overloading, Adding and retrieving dynamic attributes of classes.
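An illustrative class hierarchy covering a constructor, inheritance, operator overloading and a dynamically added attribute (the account example is invented):

    class Account:
        def __init__(self, owner, balance=0):         # constructor
            self.owner = owner
            self.balance = balance

        def __add__(self, other):                      # operator overloading
            return Account(f"{self.owner}+{other.owner}",
                           self.balance + other.balance)

        def __repr__(self):
            return f"Account({self.owner!r}, {self.balance})"

    class SavingsAccount(Account):                     # inheritance
        def __init__(self, owner, balance=0, rate=0.05):
            super().__init__(owner, balance)
            self.rate = rate

    a = Account("asha", 1000)
    s = SavingsAccount("ravi", 2000)
    joint = a + s                                      # calls Account.__add__
    joint.branch = "Main"                              # dynamic attribute added at runtime
    print(joint, joint.branch)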
Unit-5
Errors and Exceptions: Syntax Errors, Exceptions, Handling Exceptions, Raising Exceptions, User-defined Exceptions, Defining Clean-up Actions
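Raising, catching, a user-defined exception class and a clean-up clause in one place (the bank-style example is invented):

    class InsufficientFunds(Exception):
        """User-defined exception."""

    def withdraw(balance, amount):
        if amount > balance:
            raise InsufficientFunds(f"need {amount}, have {balance}")
        return balance - amount

    try:
        withdraw(100, 250)
    except InsufficientFunds as err:
        print("rejected:", err)
    finally:
        print("transaction closed")   # clean-up action, runs either way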
Graphical User Interfaces: The Behaviour of Terminal-Based Programs and GUI-Based Programs, Coding Simple GUI-Based Programs, Other Useful GUI Resources.
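The smallest possible GUI program, using Tkinter from the standard library (the window title and widget texts are arbitrary):

    import tkinter as tk

    root = tk.Tk()
    root.title("Hello GUI")
    tk.Label(root, text="Hello, world!").pack(padx=20, pady=10)
    tk.Button(root, text="Quit", command=root.destroy).pack(pady=10)
    root.mainloop()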
Python Programming Lab
Input & Output and Decision structures in Python
Strings in Python
Lists, Tuples, Dictionaries and Sets
Functions in Python
Text and File Handling
Classes
Exception Handling
Graphical user interface: Tkinter
Machine Learning
Unit-1
Introduction to machine learning - Basic concepts, designing a learning system, Issues in machine learning, Types of machine learning, A Machine Learning Sampler
The ingredients of machine learning - Tasks: the problems that can be solved with machine learning, Models: the output of machine learning, Features: the workhorses of machine learning.
Preliminaries - The curse of dimensionality, Overfitting, Training, Test and Validation sets, The confusion matrix, The accuracy metrics: Accuracy, sensitivity, specificity, precision, recall, F1 measure, ROC curve, Unbalanced datasets, Naïve Bayes Classifier, Some basic statistics: variance, covariance, bias-variance tradeoff.
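The accuracy metrics listed above can all be read off a 2x2 confusion matrix; the counts below are made-up numbers purely for illustration.

    tp, fp, fn, tn = 40, 10, 5, 45

    accuracy    = (tp + tn) / (tp + tn + fp + fn)
    precision   = tp / (tp + fp)
    recall      = tp / (tp + fn)          # also called sensitivity
    specificity = tn / (tn + fp)
    f1          = 2 * precision * recall / (precision + recall)

    print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
          f"recall={recall:.2f} specificity={specificity:.2f} f1={f1:.2f}")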
Unit-2
Tree Models - Decision Trees.
Linear Models - The least-squares method: Univariate linear regression, Logistic Regression, Support vector machines.
Distance Based Models - Introduction, Nearest Neighbours classification, Distance Based Clustering, Hierarchical Clustering.
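A nearest-neighbour classification sketch, assuming scikit-learn is installed; the Iris dataset, the 70/30 split and k = 3 are arbitrary illustrative choices.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    knn = KNeighborsClassifier(n_neighbors=3)
    knn.fit(X_train, y_train)
    print("test accuracy:", accuracy_score(y_test, knn.predict(X_test)))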
Unit-3
Features: Kinds of feature, Feature transformations: Thresholding and discretization, Normalization, Incomplete Features, Feature construction and selection.
Model ensembles: Bagging, random forests, Boosting: AdaBoost, Gradient Boosting, XGBoost.
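Bagging- and boosting-style ensembles in miniature, assuming scikit-learn (XGBoost is a separate library and is not shown); the dataset and estimator counts are illustrative.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier

    X, y = load_breast_cancer(return_X_y=True)

    for model in (RandomForestClassifier(n_estimators=100, random_state=0),
                  AdaBoostClassifier(n_estimators=100, random_state=0)):
        scores = cross_val_score(model, X, y, cv=5)
        print(type(model).__name__, "mean CV accuracy:", round(scores.mean(), 3))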
Unit-4
Dimensionality Reduction: PCA, LDA
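Projecting a dataset onto its first two principal components, assuming scikit-learn; the Iris data is just an example.

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA

    X, y = load_iris(return_X_y=True)
    X_2d = PCA(n_components=2).fit_transform(X)
    print(X.shape, "->", X_2d.shape)   # (150, 4) -> (150, 2)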
Model Evaluation and Optimization: Cross Validation, Grid Search, Regularization
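A grid search over the regularization strength of a ridge regressor, scored by 5-fold cross-validation (assumes scikit-learn; the dataset and the alpha grid are illustrative).

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import GridSearchCV

    X, y = load_diabetes(return_X_y=True)

    search = GridSearchCV(Ridge(),
                          param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]},
                          cv=5)
    search.fit(X, y)
    print("best alpha:", search.best_params_,
          "best CV score:", round(search.best_score_, 3))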
Unit-5
Neurons, NNs, Linear Discriminants: The Neuron, Neural Networks, The perceptron, Multilayer perceptrons: Going forwards, Going backwards: Backpropagation of error, Multilayer perceptron in practice, Examples of using MLP.
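A single perceptron trained with the classic update rule w <- w + eta * (target - output) * x, here on the AND function (the learning rate and epoch count are arbitrary; NumPy assumed).

    import numpy as np

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    t = np.array([0, 0, 0, 1])                 # AND targets
    X_bias = np.hstack([X, np.ones((4, 1))])   # fold the bias into the weights
    w = np.zeros(3)
    eta = 0.1

    for _ in range(20):                        # a few epochs suffice here
        for x, target in zip(X_bias, t):
            output = 1 if np.dot(w, x) > 0 else 0
            w += eta * (target - output) * x

    print("weights:", w)
    print("predictions:", [1 if np.dot(w, x) > 0 else 0 for x in X_bias])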
Reinforcement Learning: Overview, Example, Markov Decision Process, Values, Back on Holiday: Using reinforcement learning, Uses of Reinforcement Learning.
Machine Learning Lab
1. Vector addition.
2. Data pre-processing: handling missing values, handling categorical data, bringing features to the same scale, and selecting meaningful features.
3. Regression model.
4. Write a program to implement the KNN classifier and logistic regression for binary classification and multiclass classification.
5. Write a program for data clustering (K-means) and evaluate the clustering model.
6. Ensemble learning, grid search, and learning and validation curves.
7. Compressing data via dimensionality reduction: PCA, LDA.
8. Model Evaluation and Optimization: K-fold cross-validation.
9. Write a program to reduce the variance of a linear regression model using Lasso and Ridge regularization.
10. Perceptron for digits.
11. Feed-Forward Network for wheat seed dataset.
12. Write a program to implement a neural network for regression.
13. Write a program to save and load a trained machine-learning model.
Deep Learning
Unit-1
Fundamental Concepts of Machine Learning
Historical Trends in Deep Learning, Machine Learning Basics: Learning Algorithms (Supervised and Unsupervised Learning), Linear Algebra for Machine Learning, Testing, Cross-Validation, Dimensionality Reduction, Overfitting and Underfitting, Hyperparameters and Validation Sets, Bias, Variance, Regularization.
Unit-2
Deep Feedforward Networks
Deep feedforward networks: Introduction, Gradient-Based Learning, Various Activation Functions, Error Functions, Differentiation Algorithms, Regularization for Deep Learning, Early Stopping, Dropout.
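A tiny feedforward network sketch, assuming TensorFlow/Keras is installed; the random data, layer sizes, dropout rate and early-stopping patience are placeholder choices.

    import numpy as np
    from tensorflow import keras

    X = np.random.rand(200, 20).astype("float32")      # toy inputs
    y = np.random.randint(0, 2, size=(200,))           # toy binary labels

    model = keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dropout(0.2),                      # dropout regularization
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # early stopping halts training when the validation loss stops improving
    stop = keras.callbacks.EarlyStopping(patience=3, restore_best_weights=True)
    model.fit(X, y, validation_split=0.2, epochs=50, batch_size=32,
              callbacks=[stop], verbose=0)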
Unit-3
Convolutional Neural Networks and Sequence Modeling
Convolutional Networks: Convolutional operation, Motivation, Pooling, Normalization.
Sequence Modeling: Recurrent Neural Networks, The Long Short-Term Memory.
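A sequence-model sketch with an LSTM layer, again assuming TensorFlow/Keras; the vocabulary size, embedding width and unit count are placeholder values.

    from tensorflow import keras

    model = keras.Sequential([
        keras.Input(shape=(None,), dtype="int32"),                # variable-length word-index sequences
        keras.layers.Embedding(input_dim=10000, output_dim=32),   # indices -> dense vectors
        keras.layers.LSTM(32),                                    # long short-term memory layer
        keras.layers.Dense(1, activation="sigmoid"),               # e.g. positive/negative label
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()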
Unit-4
Autoencoders and Optimization Algorithms
Autoencoders: Undercomplete Autoencoders, Denoising Autoencoders. Optimization for Deep Learning: Gradient Descent, Stochastic Gradient Descent, Mini-Batch Gradient Descent, Adagrad, RMSProp, Adam.
Unit-5
More Deep Learning Architectures & Applications: AlexNet, ResNet, Transfer Learning.
Deep Generative Models: Boltzmann Machines, Restricted Boltzmann Machines. Sentiment Analysis using LSTM, Image Segmentation.
Deep Learning Lab
1. Implement multilayer perceptron algorithm for MNIST Handwritten Digit Classification.
2. Implement one hot encoding of words or characters.
3. Apply data augmentation techniques on images.
4. Implement word embeddings for IMDB dataset.
5. Design a neural network for classifying movie reviews (Binary Classification) using IMDB dataset.
6. Design a neural network for classifying news wires (Multi-class Classification) using the Reuters dataset.
7. Design a neural network for predicting house prices using Boston Housing Price dataset.
8. Build a Convolutional Neural Network for MNIST Handwritten Digit Classification.
9. Build a Convolutional Neural Network for simple image (Dogs and Cats) Classification. Study the effect of batch normalization and dropout on the performance of the CNN.
10. Use a pre-trained Convolutional Neural Network (VGG16) for image classification.
11. Implement a Recurrent Neural Network for IMDB movie review classification problem.
Artificial Intelligence and Machine Learning Lab
1. Implement Breadth First Search using Python.
2. Implement Depth First Search using Python.
3. Implement A* algorithm using Python. (Ex: find the shortest path)
4. Implement AO* algorithm using Python. (Ex: find the shortest path)
5. Apply the following pre-processing techniques on real-life datasets:
a. Attribute selection
b. Handling Missing Values
c. Discretization
d. Elimination of Outliers
6. Demonstrate dimensionality reduction using PCA.
7. Apply KNN algorithm for classification and regression.
8. Demonstrate decision tree algorithm for a classification problem and perform parameter tuning for better results.
9. Apply Random Forest algorithm for classification and regression.
10. Demonstrate Naïve Bayes Classification algorithm.
11. Apply Support Vector algorithm for classification.
12. Demonstrate Linear Regression and Logistic Regression models.
13. Apply Lasso and Ridge Regularization to reduce the variance of a linear regression model
14. Implement the K-means algorithm and apply it to the data you selected. Evaluate performance by measuring the sum of the Euclidean distance of each example from its class center. Test the performance of the algorithm as a function of the parameters.
15. Develop a mini project that applies machine learning concepts and algorithms to address a practical real-world problem