The course focused on giving an introduction to the statistical approach to automatic speech recognition. It began with building an acoustic model parameterized by hidden Markov models (solving the likelihood problem with the forward algorithm, the decoding problem with the Viterbi trellis, and the learning problem with Expectation-Maximization and the Baum-Welch algorithm). Next, to combine the acoustic model with the remaining components of a statistical ASR system, Weighted Finite-State Transducers were introduced (along with determinization and minimization). Following the acoustic model, probabilistic language models were covered, along with smoothing techniques to deal with unseen N-grams (Laplace smoothing, add-alpha smoothing, Good-Turing estimation, backoff and interpolation, Katz back-off smoothing and Kneser-Ney smoothing).
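To make the decoding step concrete, here is a minimal Viterbi-trellis sketch for a toy two-state HMM in Python; the transition and emission matrices are invented purely for illustration and are not from the course.

```python
import numpy as np

# A minimal Viterbi decoding sketch for a toy 2-state HMM.
# The probabilities below are made up purely for illustration.
states = ["A", "B"]
pi = np.array([0.6, 0.4])                  # initial state probabilities
trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])             # trans[i, j] = P(state j | state i)
emit = np.array([[0.5, 0.4, 0.1],
                 [0.1, 0.3, 0.6]])         # emit[i, o] = P(observation o | state i)
obs = [0, 1, 2, 1]                         # an example observation sequence

# Build the trellis of best log-probabilities and backpointers.
delta = np.log(pi) + np.log(emit[:, obs[0]])
backptr = []
for o in obs[1:]:
    scores = delta[:, None] + np.log(trans)          # shape: (prev_state, cur_state)
    backptr.append(scores.argmax(axis=0))
    delta = scores.max(axis=0) + np.log(emit[:, o])

# Backtrace the most likely state sequence.
path = [int(delta.argmax())]
for bp in reversed(backptr):
    path.append(int(bp[path[-1]]))
path.reverse()
print([states[s] for s in path])
```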
The second half of the course focused on developing end-to-end ASR systems, including hybrid neural network-HMM systems, Tandem systems, Time Delay Neural Networks and Recurrent Neural Networks. Apart from these, the Listen, Attend and Spell architecture and Connectionist Temporal Classification were also covered. The course ended with searching and decoding techniques, including the beam search algorithm, lattices and multi-pass decoding; acoustic feature analysis; and pronunciation models (with a special focus on grapheme-to-phoneme conversion and the wav2vec speech representation).
My notes summarize the contents of this course.
This course focused on covering emerging techniques in the field of machine learning. It began with an introduction to graphical models, including both Bayesian Networks and Markov Random Fields. Next came inference on graphical models using techniques such as variable elimination and message passing on junction trees, followed by learning graphical models, along with hidden variables, variational approximation and variational autoencoders.
The second half of the course began with an in-depth study of sampling techniques, including importance sampling, forward sampling in Bayesian Networks and Markov Chain Monte Carlo (Gibbs Sampling, the Metropolis-Hastings Algorithm, Langevin Monte Carlo). The remaining topics included Generative Adversarial Networks, Gaussian Processes and the Gaussian Copula, coherent probabilistic aggregate queries for long-horizon forecasts, Normalizing Flows, Causality and Energy-based models.
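As a flavour of the MCMC material, below is a minimal Metropolis-Hastings sketch that samples from a one-dimensional standard normal target with a Gaussian random-walk proposal; the target and hyperparameters are arbitrary illustrative choices, not anything specific to the course.

```python
import numpy as np

# A minimal Metropolis-Hastings sketch: sample from an unnormalized 1-D target
# (a standard normal density, chosen only for illustration) via a random walk.
rng = np.random.default_rng(0)

def log_target(x):
    return -0.5 * x**2          # log of an unnormalized N(0, 1) density

samples, x = [], 0.0
for _ in range(10_000):
    proposal = x + rng.normal(scale=1.0)               # random-walk proposal
    log_accept = log_target(proposal) - log_target(x)  # symmetric proposal => no correction
    if np.log(rng.uniform()) < log_accept:
        x = proposal
    samples.append(x)

print(np.mean(samples), np.var(samples))   # should be near 0 and 1
```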
My notes crisply summarize the content covered in this course.
The course was structured into four modules: processes, memory, concurrency and the file system. The first module comprised an introduction to the process abstraction and the implementation of system calls in a simple operating system (xv6). This module involved an in-depth study of process execution mechanisms, process scheduling policies and inter-process communication mechanisms; its assignments included implementing system calls in xv6 and writing a basic shell program. The second module focused on virtual and physical memory management; it also introduced paging, and the corresponding assignment involved implementing demand paging in xv6. The second half of the course began with threads and concurrency, where we implemented an equivalent of semaphores using condition variables. The course ended with file system implementation, input-output with files and file-directory structures.
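As an illustration of that concurrency exercise, here is a rough sketch of a counting semaphore built from a lock and a condition variable; it is written in Python for brevity rather than the C used in the xv6 assignments, and is only a sketch of the idea.

```python
import threading

# A sketch of a counting semaphore built from a condition variable,
# in the spirit of the concurrency assignment.
class Semaphore:
    def __init__(self, value=0):
        self.value = value
        self.cond = threading.Condition()

    def wait(self):                     # "P" / down
        with self.cond:
            while self.value == 0:      # re-check the predicate after every wakeup
                self.cond.wait()
            self.value -= 1

    def signal(self):                   # "V" / up
        with self.cond:
            self.value += 1
            self.cond.notify()
```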
The course kicked off with several fundamental algorithms for Image Enhancement (using Grey Level Transformations & different types of Filtering), followed by an introduction to Image Segmentation (using Mean Shift and Edge/Corner Detection Algorithms). The first half of the course ended with a deep dive into Fourier Analysis (the Fourier Transform, its applications & sampling). The second half has taken off with the Discrete Fourier Transform (including the FFT algorithm & its use in filtering images). I look forward to the upcoming course content, including Principal Component Analysis, its applications in Face Recognition & Singular Value Decomposition (SVD), followed by Image Restoration & various denoising techniques. The last phase of the course will involve familiarization with Image & Video Compression, Color Image Processing & Demosaicing.
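To illustrate the DFT-based filtering idea, here is a small sketch of frequency-domain low-pass filtering with NumPy's FFT routines; the random array stands in for an image and the cutoff radius is an arbitrary choice.

```python
import numpy as np

# A sketch of frequency-domain low-pass filtering with the 2-D DFT.
image = np.random.rand(128, 128)             # stand-in for a greyscale image

F = np.fft.fftshift(np.fft.fft2(image))      # DFT with the zero frequency centred

# Keep only a disc of low frequencies around the centre (cutoff chosen arbitrarily).
rows, cols = image.shape
y, x = np.ogrid[:rows, :cols]
mask = (y - rows // 2) ** 2 + (x - cols // 2) ** 2 <= 20 ** 2

filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```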
The course began with an introduction to Random Variables, Axiomatic Probability, Conditional Probability and Bayes' Formula, followed by extensive analysis of several Cumulative Distribution Functions. Next, we took a deep dive into the Correlation of Random Variables, the Law of Large Numbers and one of the most fundamental theorems in Data Science: the Central Limit Theorem. We derived the equations for the Sample Mean and Variance, Confidence Intervals and the t-distribution, and learnt about Order Statistics and Likelihood functions. Having learnt the basics of Probability and Statistics, the first half of the course ended with a deep dive into Data Visualization and Exploratory Data Analysis in Python. Moving on, the second half of the course aims at covering Linear and Logistic Regression, Supervised Machine Learning, Deep Learning, Software Engineering, Graphical User Interface Programming and Parallel Query Processing.
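As a quick illustration of the Central Limit Theorem and a t-based confidence interval, here is a small NumPy/SciPy sketch; the exponential population and sample sizes are arbitrary choices for demonstration only.

```python
import numpy as np
from scipy import stats

# Central Limit Theorem: means of repeated samples from a skewed (exponential)
# population cluster roughly normally around the population mean.
rng = np.random.default_rng(1)
means = [rng.exponential(scale=2.0, size=50).mean() for _ in range(5_000)]
print(np.mean(means), np.std(means))     # centred near 2, with small spread

# A 95% t-based confidence interval for the mean of one such sample.
sample = rng.exponential(scale=2.0, size=50)
ci = stats.t.interval(0.95, len(sample) - 1,
                      loc=sample.mean(), scale=stats.sem(sample))
print(ci)
```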
The course began with an introduction to Machine Learning, including the types of learning algorithms, modelling basics, data types and data scales. We covered Set Theory, Probability and Statistics, Linear Algebra, and Dissimilarity and Similarity Measures. Next, we did a rigorous derivation of Linear Regression (coefficient estimation, assessing the accuracy of the coefficients and the estimates, and the concepts of the t-statistic and p-value). These concepts were then extended to Multiple Linear Regression, hypothesis testing in multi-linear regression (the F-statistic) and non-linear regression, ending with a discussion of the potential problems of Linear Regression. The latter half of the course is designed to cover Classification and Clustering algorithms, Decision Trees, Random Forests, Support Vector Machines, Bayesian LDA & QDA, Big Data Analytics and techniques for storing and processing Big Data.
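To make the coefficient-estimation step concrete, here is a minimal ordinary-least-squares sketch using the normal equations on synthetic data; the true coefficients and noise level are invented for illustration, and the t-statistics are computed as in the course's derivation.

```python
import numpy as np

# Ordinary least squares via the normal equations on synthetic data
# (true intercept 1.0 and slope 2.0, chosen only for illustration).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 1.0 + 2.0 * x + rng.normal(scale=1.0, size=100)

X = np.column_stack([np.ones_like(x), x])            # design matrix with intercept
beta = np.linalg.solve(X.T @ X, X.T @ y)             # (X'X)^{-1} X'y

residuals = y - X @ beta
sigma2 = residuals @ residuals / (len(y) - 2)        # unbiased noise-variance estimate
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
t_stat = beta / se                                   # t-statistic for each coefficient
print(beta, t_stat)
```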
Today's computing systems are increasingly adaptive and autonomous: they are akin to intelligent, decision-making "agents". With its roots in artificial intelligence and machine learning, this course covers the foundational principles of designing such agents. Broadly, the topics covered include: (1) agency, intelligence, and learning; (2) exploration and multi-armed bandits; (3) Markov Decision Problems and planning; (4) reinforcement learning; (5) multi-agent systems and multi-agent learning; and (6) case studies. The course adopted a "hands-on" approach, with programming assignments designed to highlight the relationship between theory and practice. No doubt, this is one of the best courses offered by the CSE Department of IIT Bombay! The contents of this course are openly available here.
This course introduced us to the mathematical foundations of machine learning. The first topic covered was Linear Regression: loss functions (least squares and likelihood), priors/regularization (Bayesian Regression, Ridge and LASSO), optimization of the loss function and regularizer, and the Gradient Descent Algorithm. Next, we moved on to Supervised Learning: linear classification (Perceptrons & Logistic Regression), non-linear classification (Neural Networks and Support Vector Classifiers), deep learning (convolution, recurrence & LSTMs), bagging, boosting and feature selection. The course ended with Unsupervised Learning: k-means, k-medians and Hierarchical Clustering. The key factor distinguishing this course from standard ML courses was the emphasis on rigorous mathematical derivations of all the concepts covered, which is the primary reason why this course is one of my favs!
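As a small illustration of the loss-plus-regularizer framing, here is a gradient-descent sketch for ridge-regularized least squares on synthetic data; the data, regularization strength and learning rate are arbitrary choices, not the course's assignment.

```python
import numpy as np

# Gradient descent on ridge-regularized least squares: minimize
# 0.5 * mean((Xw - y)^2) + 0.5 * lam * ||w||^2 on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = np.array([1.5, -2.0, 0.0, 0.5, 3.0])
y = X @ w_true + rng.normal(scale=0.1, size=200)

lam, lr = 0.1, 0.01
w = np.zeros(5)
for _ in range(2_000):
    grad = X.T @ (X @ w - y) / len(y) + lam * w   # gradient of loss + regularizer
    w -= lr * grad
print(w)                                          # close to (a shrunk version of) w_true
```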
Design and Analysis of Algorithms is a step up for those interested in learning how people came up with these algorithms in the first place. The motivation for this course is learning to devise algorithms for any problem you may encounter and to optimize the running time by analysing the algorithm you have come up with. Also, the data structures we studied in the Data Structures and Algorithms course come up as natural necessities for the algorithms we finally design, which is quite beautiful. The various topics covered in the course include augmenting the Binary Search Tree to obtain a Balanced Binary Search Tree, basic principles of algorithm design and recursive algorithms, designing and analysing algorithms for standard problems often encountered, Dynamic Programming, Greedy Algorithms, and an introduction to the NP, NP-Hard and NP-Complete classes of problems.
The motivation for this course is learning to apply algorithms to suitably access, store and later update data, which requires quite a lot of visualisation and reasoning about the various data structures, thereby improving our logical thinking as well. The course also aims to familiarise the student with fundamental data structures like arrays, binary search trees, heap trees, vectors, maps, 2-3 Trees, Tries, hash tables, undirected and directed graphs, doubly linked lists and stacks. The algorithms introduced include merging and sorting, heapsort and quicksort as advanced sorting techniques, Dijkstra's and Bellman-Ford's shortest path algorithms, the fast multipole method, and insertion, deletion and access operations on advanced data structures. These topics are important building blocks for studying further organisational structures, as they help in understanding them more deeply.
I have always been craving courses that delve into the mathematical roots of Machine Learning, so this course was an obvious choice for me. The various topics covered in this course include the Perceptron, feed-forward networks and the multi-layer perceptron, Boltzmann Machines, Hopfield Networks, state-based networks like Recurrent Neural Networks and LSTM Networks, Convolutional Neural Networks, bidirectional networks, Transfer Learning, structural networks for structured prediction, attention-based networks, autoencoders for dimension reduction and embedding, Generative Adversarial Networks, Deep Gaussian Processes, Deep Bayesian nets, Deep Search Models, Deep Reinforcement Learning, Deep Neural Recommenders, non-convex optimization tools for deep networks and several applications covering operations research, computer vision and natural language processing.
This is the CS101 course which is taken by all freshies. This course provided us with an entry-level foundation in computer programming. The goals of taking up this course are to develop my programming ability and to improve proficiency in applying computing fundamentals to our respective fields of study. Topics include an overview of high-level languages, an introduction to the C/C++ library, basic data types, function definitions and declarations, conditional and iteration statements, array and string manipulation, recursive programming, an introduction to searching and sorting, and an introduction to structures and pointers. In summary, the basic aim is to learn how to program in C/C++ at a level where we are eventually able to write programs that help solve our everyday engineering, science and technology problems.
This is the first course of the Deep Learning Specialization by Andrew Ng on Coursera. In this course, we studied the foundational concepts of Neural Networks and Deep Learning. By the end, I was familiar with the significant technological trends driving the rise of Deep Learning; I was fully equipped to build, train, and apply fully connected deep neural networks; implement efficient (vectorized) neural networks; identify key parameters in a neural network's architecture; and apply deep learning to my own applications. The course made us understand the capabilities, challenges, and consequences of deep learning and prepared us to participate in the development of leading-edge AI technology.
This is the second course of the Deep Learning Specialization. In this course, we opened the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, we had learnt best practices for setting up train, development and test sets and analyzing bias/variance when building deep learning applications; were able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence; and implement a neural network in TensorFlow.
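For flavour, here is a sketch of the Adam update rule on a toy one-dimensional quadratic, using the commonly quoted default decay rates; it is only an illustration of the algorithm, not the course's implementation.

```python
import numpy as np

# The Adam update rule on a toy objective f(w) = (w - 3)^2.
def grad(w):
    return 2 * (w - 3.0)

w, m, v = 0.0, 0.0, 0.0
beta1, beta2, lr, eps = 0.9, 0.999, 0.1, 1e-8
for t in range(1, 501):
    g = grad(w)
    m = beta1 * m + (1 - beta1) * g           # first-moment (momentum-like) estimate
    v = beta2 * v + (1 - beta2) * g**2        # second-moment (RMSprop-like) estimate
    m_hat = m / (1 - beta1**t)                # bias correction
    v_hat = v / (1 - beta2**t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)
print(w)                                      # converges towards 3.0
```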
This is the third course of the Deep Learning Specialization. In this course, we learnt how to build a successful machine learning project and got to practice decision-making as a machine learning project leader. By the end, we were able to diagnose errors in a machine learning system; prioritize strategies for reducing errors; understand complex ML settings, such as mismatched training/test sets, and comparing to and/or surpassing human-level performance; and apply end-to-end learning, transfer learning, and multi-task learning. This is also a standalone course for learners who have basic machine learning knowledge. This course draws on Andrew Ng's experience building and shipping many deep learning products. It provides the "industry experience" that one might otherwise get only after years of experience.
This is the fourth course of the Deep Learning Specialization. In this course, we understood how computer vision has evolved and became familiar with its exciting applications such as autonomous driving, face recognition, reading radiology images, and more. By the end, we were able to build a convolutional neural network, including recent variations such as residual networks; apply convolutional networks to visual detection and recognition tasks; and use neural style transfer to generate art and apply these algorithms to a variety of image, video, and other 2-dimensional and 3-dimensional data.
This is the fifth and final course of the Deep Learning Specialization. In this course, we became familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more. By the end, we were able to build and train Recurrent Neural Networks (RNNs) and commonly-used variants such as GRUs and LSTMs; apply RNNs to Character-level Language Modeling; gain experience with natural language processing and Word Embeddings, and use Hugging Face tokenizers and transformer models to solve different NLP tasks such as NER and Question Answering. Overall, this specialization provided a pathway to gain the knowledge and skills to apply ML to my work, level up my technical career, and take a definitive step in the world of AI.
Blockchain is a revolutionary technology that enables peer-to-peer transfer of digital assets without any intermediaries and is predicted to be just as impactful as the Internet. This course provides a broad overview of the essential concepts of blockchain technology – by initially exploring the Bitcoin protocol and then the Ethereum protocol – to lay the foundation necessary for developing applications and programming; more specifically, it prepares learners to program on the Ethereum blockchain. It provided an understanding and working knowledge of foundational blockchain concepts, a skill set for designing and implementing smart contracts, methods for developing decentralized applications on the blockchain, and information about ongoing industry-wide blockchain frameworks.
This course equipped me with the knowledge needed to create nodes on my personal Ethereum blockchain, create accounts, unlock accounts, mine, transact, transfer Ethers, and check balances. I also learnt about the decentralized peer-to-peer network, an immutable distributed ledger and the trust model that defines a blockchain. This course enabled me to explain basic components of a blockchain (transaction, block, block header, and the chain), its operations (verification, validation, and consensus model), underlying algorithms, and essentials of trust (hard fork and soft fork). The content includes the hashing and cryptography foundations indispensable to blockchain programming, which is the focus of two subsequent specialization courses, Smart Contracts and Decentralized Applications (Dapps).
This course covers Cloud Computing concepts with an emphasis on live demo sessions. Concepts are covered to provide insight into how Cloud Computing and Virtualization relate to IaaS (Infrastructure as a Service) and SaaS (Software as a Service). After course completion, a clear picture of Cloud Computing and Virtualization emerged. The contents of this course included an introduction to Cloud Computing (definition, concept, characteristics, models and types of cloud), Computer Networks (LAN, MAN, WAN, IP addressing, network devices, NAT & PAT, routers & default gateways), Virtualization (basics, hypervisors, virtualization tools, virtual machine creation), a live demo session on setting up Infrastructure as a Service (IaaS) & a live demo session on using Software as a Service (SaaS).
This short course on Introduction to Structured Query Language (SQL) at Kaggle is a crash course for anyone who wants to learn SQL for working with databases, using Google BigQuery. The course begins with an overview of the workflow for handling big datasets with BigQuery and SQL, introducing projects, databases and tables. It then teaches the foundational components of all SQL queries: SELECT, FROM & WHERE; how to get more interesting insights directly using SQL: GROUP BY, HAVING & COUNT; ordering the results to focus on the most important data for a particular use case: ORDER BY; organizing the query for better readability, which is especially important for complicated queries: AS & WITH; and joining data, which is a very common practice in almost all real-world applications: JOIN.
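As an illustration of those clauses, here is a toy query run from Python against an in-memory SQLite database; SQLite is only a stand-in for BigQuery here, and the table and data are invented for demonstration.

```python
import sqlite3

# A tiny table of taxi trips, then a query exercising SELECT/FROM/WHERE,
# GROUP BY/HAVING, aggregation and ORDER BY.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trips (city TEXT, fare REAL);
    INSERT INTO trips VALUES ('Mumbai', 120), ('Mumbai', 80), ('Delhi', 200), ('Delhi', 40);
""")

rows = conn.execute("""
    SELECT city, COUNT(*) AS n_trips, AVG(fare) AS avg_fare
    FROM trips
    WHERE fare > 50
    GROUP BY city
    HAVING COUNT(*) >= 1
    ORDER BY avg_fare DESC
""").fetchall()
print(rows)
```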
This course covers all the essential mathematics that an engineer might need in any course, including the Eigenvalue Problem (solution procedure and applications), Scalar and Vector Field Theory (divergence, gradient, curl, the Laplacian, and the Divergence and Stokes theorems), linear differential equations of second and higher order (solution of homogeneous and non-homogeneous equations with and without constant coefficients), Power Series Solutions (the Method of Frobenius, Legendre, Gamma and Bessel functions), the Laplace Transform (properties and application to the solution of differential equations), Heaviside and Dirac-delta functions, the Fourier Transform (Fourier series of a periodic function, Sturm-Liouville theory, the Fourier integral, the Fourier Transform), the Wave Equation (separation of variables, d'Alembert's solution), Complex Integral Calculus (complex integration, Cauchy's theorem, Cauchy's integral formula) and the Residue Theorem (complex series and Taylor series, Laurent series, classification of singularities).
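For reference, the Laplace transform and the derivative property that makes it useful for solving differential equations are:

```latex
\mathcal{L}\{f\}(s) \;=\; \int_{0}^{\infty} e^{-st}\, f(t)\, dt,
\qquad
\mathcal{L}\{f'\}(s) \;=\; s\,\mathcal{L}\{f\}(s) - f(0)
```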
Numerical Analysis is a very interesting course, given its wide range of applications. The various topics covered include interpolation by polynomials, divided differences, the error of the interpolating polynomial, piecewise linear and cubic spline interpolation; numerical integration, composite rules, error formulae; solution of a system of linear equations, implementation of Gaussian elimination, partial pivoting, row echelon form, LU factorization, the Cholesky method, ill-conditioning, norms; solution of a nonlinear equation, the bisection and secant methods, Newton's method, rate of convergence; solution of a system of non-linear equations; numerical solution of ordinary differential equations, Euler and Runge-Kutta methods, multi-step methods, predictor-corrector methods, order of convergence; finite difference methods, numerical solutions of elliptic, parabolic, and hyperbolic partial differential equations; the eigenvalue problem, the power method, the QR method; and exposure to software packages like IMSL subroutines and MATLAB.
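As a small example of the root-finding material, here is a Newton's-method sketch for a single nonlinear equation; the function f(x) = x^3 - 2 is an arbitrary choice whose root is the cube root of 2.

```python
# Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)
f = lambda x: x**3 - 2
df = lambda x: 3 * x**2

x = 1.0                      # starting guess
for _ in range(10):
    x -= f(x) / df(x)
print(x, 2 ** (1 / 3))       # the iterate converges to 2^(1/3)
```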
Linear Algebra is another very basic course, which is a prerequisite for many courses in almost all departments. The various topics covered in this course include vectors, the notion of linear independence and dependence, the linear span of a set of vectors, vector subspaces, basis of a vector subspace, systems of linear equations, matrices and Gauss elimination, row space, null space, and column space, rank of a matrix, determinants and rank of a matrix in terms of determinants, abstract vector spaces, linear transformations, matrix of a linear transformation, change of basis and similarity, rank-nullity theorem, inner product spaces, Gram-Schmidt process, orthonormal bases, projections and least-squares approximation, eigenvalues and eigenvectors, characteristic polynomials, eigenvalues of special matrices, algebraic and geometric multiplicity, diagonalization by similarity transformations, spectral theorem for real symmetric matrices, application to quadratic forms.
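To illustrate the Gram-Schmidt process mentioned above, here is a short classical Gram-Schmidt sketch that orthonormalizes the columns of a random matrix; the matrix is arbitrary.

```python
import numpy as np

# Classical Gram-Schmidt: orthonormalize the columns of A.
A = np.random.default_rng(0).normal(size=(5, 3))
Q = np.zeros_like(A)
for j in range(A.shape[1]):
    v = A[:, j].copy()
    for i in range(j):
        v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # subtract projections onto earlier q's
    Q[:, j] = v / np.linalg.norm(v)

print(np.round(Q.T @ Q, 6))                  # should be the 3x3 identity matrix
```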
The topics covered in this course include exact equations, integrating factors and Bernoulli equations, Orthogonal trajectories, Lipschitz condition, Picard's theorem, examples on non-uniqueness, linear differential equations generalities, linear dependence and Wronskians, the dimensionality of space of solutions, Abel-Liouville formula, linear ODEs with constant coefficients, the characteristic equations, Cauchy-Euler equations, method of undetermined coefficients, method of variation of parameters, Laplace transform generalities and shifting theorems.
This is the first mathematics course that I undertook in my undergrad life at IIT Bombay. The course contents included a review of limits, continuity, differentiability, the Mean Value Theorem, Taylor's Theorem, maxima and minima; Riemann Integrals, the Fundamental Theorem of Calculus, improper integrals, applications to area and volume; convergence of sequences and series, power series; partial derivatives, gradient and directional derivatives, the chain rule, maxima and minima, Lagrange multipliers; double and triple integration, Jacobians and the change of variables formula; parametrization of curves and surfaces, vector fields, line and surface integrals; divergence and curl, and the theorems of Green, Gauss, and Stokes.
The course is intended to cover the basics of astronomy and astrophysics, with particular emphasis on a research-like experience through hands-on projects. The course is divided into four broad areas: stars and the solar system (classification of stars, the stellar life cycle, planets and orbits, distance scales, etc.), the tools of astronomy (telescopes, detectors, sensitivity, coordinate systems, etc.), observations and analysis (the course project - starting from planning observations to reducing data, analysis, report and presentation), and the big picture (special topics like cosmology, radio astronomy, high energy astronomy, gravitational waves, etc.). We were privileged to propose for observing time on the GROWTH-India telescope, and we were taught the basics of data analysis.
Introduction to Special Theory of Relativity is one of the most interesting & mind-boggling courses that I have done in my entire undergrad life. The course kicked off with a discussion of the problems with Classical Physics and the postulates of the Special Theory of Relativity. A significant amount of time was spent by Prof. Rentala in explaining the concepts and showing simulations, which triggered even more brainstorming and interesting questions and discussions during the lectures. The Galilean and Lorentz transformations were discussed at length, along with examples of length contraction and time dilation. The velocity transformation was derived in the lectures and the definitions of space-like and time-like intervals were introduced. The concepts of causality, the Doppler effect in special relativity and Minkowski space were discussed. The latter half of the course saw even more extensive mathematics in the discussions of the velocity and momentum-energy four-vectors, the momentum-energy transformation, the concept of a zero rest mass particle, force in relativity, Newton's second law in relativity and an elementary idea of the transformation of electric and magnetic fields and the current density four-vector.
This course, along with Introduction to Special Relativity, is a hard prerequisite for the course General Theory of Relativity. It began with the concept of phase diagrams and a brief review of Newton's laws of motion. Next, we discussed frames of reference, rotating frames, and centrifugal and Coriolis forces. Free and constrained motion was also taken up. The heart of the course was the derivation of and problem solving with D'Alembert's principle, Lagrange's equation of the first kind, the Lagrangian formulation, Hamilton's equations of motion and variational principles. I followed Classical Mechanics by Goldstein for this course very religiously, and strongly recommend everyone taking up this course to do so.
This is one of the first physics courses which I took in my undergrad life. The course begins with a review of vector calculus; all of these concepts are covered in the Calculus course in great depth. The various topics covered include electric potential, properties of conductors; the Poisson and Laplace equations, the uniqueness theorem, boundary value problems, separation of variables, the method of images (an extremely interesting concept!), multipoles, polarization and bound charges, Gauss's law in the presence of dielectrics, linear dielectrics; divergence and curl of the magnetic field, the vector potential, magnetization, bound currents, Ampere's law in magnetic materials, the magnetic field; Faraday's law, motional emf, energy in magnetic fields, the displacement current, Maxwell's equations; electromagnetic waves in vacuum and media, and the energy and momentum of EM waves. The beauty of this course lies in the interesting problems. Introduction to Electrodynamics by David Griffiths is an awesome book for this course.
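For reference, the Maxwell's equations that the course builds up to (in SI units, differential form) are:

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```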
This is also one of the first physics courses which I took in my undergrad life. The course begins with an introduction to the Classical Equipartition Theorem, Kinetic Theory, black body radiation, the Photoelectric Effect and Compton Scattering. Next, the course hops on to the concept of wave packets, the de Broglie wavelength and the experiments demonstrating the wave properties of the electron. Next, a heuristic derivation of the Schrödinger Equation is done, and then the concepts of the free particle, the particle-in-a-box problem, the finite square well, bound vs. unbound states and the superposition principle of eigenstates are discussed. Next, the scattering problem is discussed with reflection and transmission coefficients. After discussing Quantum Tunnelling and the Simple Harmonic Oscillator, the course moves on to the concept of degeneracy with a brief overview of the Hydrogen atom problem. The second half of the course is an introduction to Statistical Physics: Pauli's exclusion principle, micro-states and macro-states, and Classical (Maxwell-Boltzmann) and Quantum statistics [Bose-Einstein (BE) and Fermi-Dirac (FD)].
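For reference, the energy levels of the particle-in-a-box problem (an infinite square well of width L) discussed in the course are:

```latex
E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad n = 1, 2, 3, \dots
```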
This is undoubtedly the most interesting and involving course of the entire B. Tech curriculum of Mechanical Engineering. This is the course where we got to apply all our knowledge from the past three years in reverse-engineering existing machines and developing one of our own in teams of 7. My team reverse-engineered a bicycle bell and a table fan. We presented our work on the topics "Computational Tools in Mechanics" and "Mechanical Joints". For the past month, our team has been working on developing a Tactical Unmanned Ground Vehicle for surveillance, to obviate the inherent risks to soldiers during the conduct of various operations. The bot should be capable of manoeuvring in difficult terrains, navigating using existing mapping of the region, working in coherence with existing technology and knowledge, having an origami design and 360-degree monitoring capability, carrying rescue aid in case of hostages, providing 2-way communication for negotiations, detecting grenades and landmines with self-protection ability, and pointing and shooting with friend-and-foe distinction and the ability to recoil from ammunition.
The course began with a discussion of energy resources, heat engines, a recap of the first law for closed and open systems, and the classification of cycles as open/closed, refrigeration/power, multi-component/single-component, internal combustion/external combustion, etc. Detailed analyses of performance parameters such as net work, thermal efficiency, heat rate, specific fuel consumption, work ratio, specific output, mean effective pressure, volumetric efficiency, COP and refrigeration effect were carried out for all cycles. We learnt general stoichiometry and the definitions of terms (rich and lean mixtures), heat of formation, heat of reaction, calorific value of a fuel, estimation methods for calorific values, exhaust gas analysis and the Orsat apparatus. Detailed derivations of the Otto cycle, Diesel cycle, dual cycle, air-standard cycles and actual cycles, and the p-θ diagram were done in class. The Brayton and Rankine cycles, with explanations of the various terms and modifications, were discussed in great detail, along with the analysis of feed water heaters, moisture separators, vapour compression, reverse Brayton cycles and vapour absorption cycles. The course also covered psychrometry and reciprocating, rotary and centrifugal compressors.
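For reference, the air-standard Otto-cycle efficiency that those derivations lead to is:

```latex
\eta_{\text{Otto}} = 1 - \frac{1}{r^{\gamma - 1}}, \qquad
r = \frac{V_{\text{max}}}{V_{\text{min}}} \ (\text{compression ratio}), \quad
\gamma = \frac{c_p}{c_v}
```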
Industrial Engineering and Operations Research are core components of every industry. This course gives an introduction to various optimization scenarios and the algorithms deployed to handle them. The topics covered in Operations Research included Convex Analysis, Mathematical Programming Formulations, Linear Programming (geometry and forms), Farkas' Lemma, Linear Programming Duality, the Simplex Algorithm and Initial Basic Feasible Solutions. The industrial engineering component of the course comprised Quality Control Mechanisms, Production Control Mechanisms, Inventory Control Mechanisms and Failure Control Mechanisms.
Kinematics & Dynamics of Machines is a very important course for CAD enthusiasts. The course began with an introduction to particle kinematics and dynamics in multiple coordinate systems, various mechanisms, degrees-of-freedom computation and inversions of mechanisms. Next, we hopped on to the planar kinematic analysis of mechanisms, graphical techniques for velocity and acceleration analysis, dynamics of systems of particles and rigid body dynamics. The second half of the course kicked off with programming ODEs in MATLAB, cam analysis, the fundamental law of gearing, terminology and problems in gearing, and epicyclic and other gear trains. We also discussed the Lagrange formulation for general n-DOF systems. The course ended with a detailed analysis of single-DOF vibrations, damped and free vibrations, forced vibrations, two-DOF systems, the matrix form of the vibration equation and the eigenvalue solution, modal frequencies and mode shapes.
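To illustrate the eigenvalue solution for modal frequencies and mode shapes, here is a small NumPy sketch for a two-DOF spring-mass system; the mass and stiffness values are arbitrary illustrative numbers (and the sketch is in Python rather than the MATLAB used in the course).

```python
import numpy as np

# Modal analysis of a 2-DOF system: solve K x = w^2 M x.
M = np.diag([1.0, 2.0])                       # mass matrix (kg)
K = np.array([[300.0, -100.0],
              [-100.0, 100.0]])               # stiffness matrix (N/m)

eigvals, modes = np.linalg.eig(np.linalg.inv(M) @ K)
omegas = np.sqrt(np.sort(eigvals.real))       # natural frequencies (rad/s)
print(omegas)
print(modes.real)                             # columns are (unsorted) mode shapes
```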
This course provides a glimpse of the Controls field; in a way, it shows what SysCon does in a nutshell. If you liked the field, you can also consider picking up some Controls electives from SysCon. The various topics covered in the course include Block Diagrams, feedback and feed-forward systems, Adaptive Cruise Control (ACC), Fourier and Laplace Transforms, LTI Systems, BIBO Stability, LCC ODEs, linear control design, and Gain and Phase Margin. The last part of the course, on Linear Controller Design, is extremely interesting, irrespective of whether you really like the overall theme. As Prof. Shashikanth (the founder of SEDEMAC and one of MIT's top global under-35 innovators) said, most of those problems come straight from the automotive industry. Even the initial tutorial problems, daunting as they seem, were very realistic, and you will eventually appreciate the effort put into designing them.
The Mechanical Engineering curriculum at IIT Bombay has two core courses on Manufacturing Processes. The first course introduces casting processes (analysis of melting, pouring and solidification phenomena; design of the pattern, core, feeder and gating system; casting defects and inspection) and bulk and sheet forming processes (rolling, forging, extrusion and drawing; sheet metal working; loads, friction and lubrication; forming defects and inspection). The second course introduces material removal processes: the mechanics of machining, tool geometry and materials, chip formation, tool temperature, tool wear, tool life, surface finish, machinability and optimization of machining processes. We also learnt about modern machining processes: EDM, ECM and laser machining. The course ended with a discussion of the principles of assembly engineering, the theory of dimensional chains, and fully interchangeable and selective assembly.
Broadly, this course covers conduction, convection and radiation. The course began with an introduction to Conduction (one-dimensional, later expanded to two and three-dimensional cases), Steady-State Conduction and Transient Conduction. Next, we were introduced to Convection, Forced External Flow, Forced Internal Flow, Natural Convection, Boiling, Condensation and Heat Exchangers. The course concluded on the topic Radiation: Processes, Properties and Radiation Exchange Between Surfaces.
This course is a continuation of the course on Solid Mechanics. The various topics covered in this course include Shear Stresses in Beams, Torsion of circular rods, Axisymmetric problems, Deflections in beams, Unsymmetrical bending of straight beams, Shear stresses in thin-walled beams, Torsion in thin-walled beams, Stresses and Deflections in initially curved bars, Energy Methods, Elastic instability and buckling of beams, Rotating disc and rod problems.
This course covered the Dynamic Characteristics of Measurement Systems, which included a revision of statistics, the general structure of a measurement system, the study of first and second order systems subjected to various inputs (step, ramp, impulse, sinusoidal) and a basic Fourier Series implementation; Temperature Measurement: thermocouples, thermometers, resistance temperature detectors; Pressure Measurement: manometers, pressure gauges, diaphragms, Pitot tubes; Flow Rate Measurement: differential pressure devices (orifice meters, Venturi meters), vortex flowmeters, etc.; strain gauges; and data sampling and thin films.
This course is a prerequisite for many advanced and interesting courses, such as Strength of Materials, Machine Design, Stress Analysis, Fatigue, Fracture and Failure Analysis, Fracture Mechanics and Vibro-Acoustics. Hence, this course is an obvious choice for many undergrads, just like me. The course kicked off with the analysis of forces and moments in deformable solids, in order to equip us with the fundamentals of mechanics. The next set of fundamental concepts included stress, strain and their relationships. The equilibrium equations were derived along with the strain-displacement relations. Combined stress analysis was done using Mohr's circle for stress and strain rosettes. Problems based on simple beams, bending moments, shear forces and stresses in beams, torsion and energy methods occupied a huge fraction of the course. The course ended with a discussion of the theories of failure.
The course began with an introduction to thermodynamics. The definitions of system, surroundings, boundaries and properties, and the classification of systems, were discussed. The various topics covered include equilibrium, processes, interactions, the thermodynamic definition of work, adiabatic boundaries, systems, processes and work, the First Law, the Zeroth Law, isothermal states, empirical temperature, the ideal gas, equations of state, an introduction to steam tables, the Van der Waals gas, the critical state, the steady-flow energy equation, the Second Law, the Kelvin-Planck and Clausius statements, the Carnot theorem, thermodynamic temperature, the Carnot engine, the Clausius inequality, the definition and evaluation of entropy, and Availability and Exergy.
The various topics covered include the continuum approximation, fluid pressure, Pascal's law, manometers, pressure forces, buoyancy, floating, stable/unstable equilibrium of submerged/floating bodies and accelerating fluid systems; fluid dynamics: control mass, control volume, stream & path lines, integral equations, the Reynolds Transport Theorem, integral mass conservation, the continuity equation, integral momentum balance, integral energy balance, the Bernoulli form of energy conservation and non-inertial frames for the control volume; fluid kinematics: rotation of fluid particles, angular deformations, linear deformations, the incompressible fluid condition and the differential forms of the conservation of mass, momentum and energy equations; viscosity, pressure measurement in moving fluids, the Pitot tube, Stokes' law, the Navier-Stokes equations, viscosity analysis, developed flow, pipe flow, laminar and turbulent flow, the non-dimensional form of the Navier-Stokes equations, the Reynolds number, the Darcy friction factor, head loss due to friction & fittings, boundary layer theory, boundary layer separation and drag force.
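For reference, the Bernoulli form of energy conservation mentioned above (for steady, incompressible, inviscid flow) is:

```latex
\frac{p}{\rho} + \frac{V^2}{2} + g z = \text{constant along a streamline}
```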