Elementary Computational Mathematics (RTP-1)
Besides the preliminaries and the necessary foundation chapters, the notes mostly draw on my earlier blog posts and on computational topics introduced in the seminar. The objective is to provide the training needed for research in computational mathematics. The notes also include a special chapter on computational inverse problems and guidelines for programming. A tailored version of this project has been incorporated into the coursework for MATH 5630/6630.
Elementary Applied Mathematics (RTP-5)
It covers the basic principles of applied mathematics, including asymptotic analysis, the calculus of variations, dynamical systems, and functional analysis, with selected additional topics appended. The objective is to help first-year graduate students prepare for future research. Part of this project has been tailored to the coursework of MATH 7000/7010.
Elementary Deep Learning (RTP-6)
The notes cover the mathematical foundations of deep learning, including classical ridge approximation, the Kolmogorov-Arnold Theorem, approximation theory for deep neural networks, convex optimization, momentum methods, Lyapunov functions, and training dynamics across different scaling regimes. Most topics are accessible to first-year graduate students in either mathematics or engineering. Part of this project has been tailored to the coursework of MATH 7970.
Qualification Problem Set in Applied/Computational Mathematics [in preparation]
The problems are simplified versions of examples and theorems drawn from recent papers.
The projects are posted at the career fair.