I am Akansha, an Assistant Professor in the Department of Mathematics at Manipal Institute of Technology (MIT), Manipal, India. I completed my Ph.D. in Mathematics from IIT Bombay in 2022, where my research focused on developing non-linear approximation algorithms for non-smooth functions.
Prior to my Ph.D., I earned my MSc in Mathematics and Computing from IIT Guwahati. During that period, I was deeply interested in pursuing research at the intersection of Algebra, Topology, and Functional Analysis, and I worked on Banach algebras for my MSc research project under the guidance of Dr. Anjan Chakraborty. My academic journey took a new direction when I joined IIT Bombay for my Ph.D. under the supervision of Prof. S. Baskar. His research centered on hyperbolic PDEs, whose solutions are known to exhibit jump discontinuities, so I shifted my focus toward developing numerical schemes for such PDEs. This led me to design non-linear approximation algorithms for non-smooth (specifically, piecewise-smooth) functions, which in turn played a crucial role in the numerical schemes for scalar conservation laws I developed during my Ph.D.
Before joining IIT Guwahati, I completed my BSc at Garhwal University. Interestingly, my initial academic interests leaned toward Chemistry: I attempted the IIT JAM examination in Chemistry twice before discovering my passion for Mathematics, which has since become my primary field of interest.
My exploration did not stop there. After I submitted my Ph.D. thesis in March 2022, my supervisor encouraged me to explore Deep Learning, highlighting its close connection with non-linear approximation techniques. My journey in Deep Learning research began in November 2022, with the invaluable support of Dr. Karmvir Singh Phogat, a close friend of my loving husband, Vivek.
Currently, my research interests span Deep Learning, Graph Representation Learning, Uncertainty Quantification, and Approximation Theory. I am particularly focused on improving the performance of Graph Neural Networks (GNNs) by addressing key limitations such as out-of-distribution (OOD) generalization on graphs and over-squashing (detailed insights are available in the Research Interests section).
If you have reached this point, I appreciate your time and interest in my academic journey. I am passionate about research and always eager to collaborate. If you are interested in being part of my research endeavors, please feel free to reach out. I would be more than happy to connect and collaborate with you.
My current research focuses on:
Agentic AI: Mostly graph-based agentic AI
Domain-empowered LLMs
Graphs and Matrices: Mostly nut graphs and their connections with matrices and groups
Graph Neural Networks (GNNs): Enhancing GNN robustness through curvature-based rewiring, uncertainty quantification, and model calibration. I am also interested in leveraging LLMs to enhance graph-related tasks.
Deep Learning: Developing improved architectures for structured data, emphasizing interpretability and reliability.
Approximation Theory: I am mostly interested in rational approximation and its applications in spectral graph neural networks.
Uncertainty Quantification: Leveraging conformal prediction and probabilistic models for reliable decision-making.
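To give a concrete flavour of the conformal prediction techniques mentioned above, here is a minimal, illustrative sketch of split conformal prediction for classification. It is not code from any of my papers: the probabilities, labels, and miscoverage level are random placeholders, and NumPy is the only assumed dependency.

```python
import numpy as np

def split_conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for classification.

    cal_probs:  (n_cal, K) predicted class probabilities on a held-out calibration set
    cal_labels: (n_cal,)   true labels for the calibration set
    test_probs: (n_test, K) predicted probabilities for new points
    alpha:      target miscoverage rate (e.g. 0.1 -> roughly 90% coverage)
    """
    n = len(cal_labels)
    # Non-conformity score: 1 minus the probability assigned to the true class
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Conformal quantile with the finite-sample correction
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    qhat = np.quantile(scores, min(q_level, 1.0), method="higher")
    # Prediction set for each test point: all classes whose score stays below the threshold
    return [np.where(1.0 - p <= qhat)[0] for p in test_probs]

# Toy usage with random placeholder probabilities
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(3), size=200)
cal_labels = rng.integers(0, 3, size=200)
test_probs = rng.dirichlet(np.ones(3), size=5)
print(split_conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1))
```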
I am enthusiastic about collaborations and open to exploring new ideas in any area.
My Google Scholar profile: https://scholar.google.com/citations?user=CjGtzA0AAAAJ&hl=en&oi=sra
Shamim Yazdani, S. Akansha, Nripsuta Saxena, Zichong Wang, Zhipeng Yin, and Wenbin Zhang (2025). A Comprehensive Survey of Image and Video Generative AI: Recent Advances, Variants, and Applications. Accepted in Journal of Big Data (Q1, IF: 6.4).
S. Akansha (2025). Over-Squashing in Graph Neural Networks: A Comprehensive Survey. Neurocomputing, 642 (Q1, IF: 6.5). https://doi.org/10.1016/j.neucom.2025.130389
S. Akansha (2025). Piecewise Padé-Chebyshev reconstruction of bivariate piecewise smooth functions. BIT Numerical Mathematics, 65 (Q1, IF: 1.8). https://doi.org/10.1007/s10543-025-01059-8
S. Akansha and Aditya Subramanium (2025). Exploring Chebyshev Polynomial Approximations: Error Estimates for Functions of Bounded Variation. AIMS Mathematics, 15(4) (Q1, IF: 1.8). https://doi.org/10.3934/math.2025398
S. Akansha (2024). Piecewise Nonlinear Approximation for Non-Smooth Functions. Results in Applied Mathematics, 23 (Q2). https://doi.org/10.1016/j.rinam.2024.100491
S. Akansha (2024). Decay estimate of bivariate Chebyshev coefficients for functions with limited smoothness. Results in Applied Mathematics, 22 (Q2). https://doi.org/10.1016/j.rinam.2024.100449
S. Akansha (2024). Mitigating Gibbs phenomenon: A localised Padé-Chebyshev approach and its conservation law applications. Results in Nonlinear Analysis, 7 (Q1). https://doi.org/10.31838/rna/2024.07.03.003
S. Akansha (2023). Addressing the impact of localized training data in graph neural networks. IEEE 7th International Conference on Computer Applications in Electrical Engineering - Recent Advances (CERA 2023). https://doi.org/10.1109/CERA59325.2023.10455308
At Manipal Institute of Technology
MAT 2401: Mathematics DAMP IV (BSC UG)
taught in Spring 2025
MAT 5302: Applied Linear Algebra (UG)
taught in Spring 2025, 2024, 2023 - approx. student rating (4.42/5)
MAT 2231: Engineering Mathematics IV
taught in Spring 2025, 2024
MAT 2122: Engineering Mathematics III
taught in Fall 2024 - approx. student rating (4.49/5)
MAT 5143: Computational Linear Algebra with Python Lab (MSC PG)
taught in Fall 2024 - approx. student rating (4.4/5)
MAT 5130: Computational Methods and Applied Linear Algebra (MTech PG)
taught in Fall 2024, 2023
MAT 5134: Linear Algebra (MSC PG)
taught in Fall 2023, 2022 - approx. student rating (4.43/5)
MAT 1203: Numerical Analysis (BSC UG)
taught in Spring 2023
Postgraduate
S. Surabhi, M.Sc. (2026), MIT Manipal
Title: Practical and Ethical Challenges of Large Language Models in Education
Amitesh Puri, M.Sc. (2025), MIT Manipal
Title: Limitations of Deep Graph Neural Networks
Details: Investigated uncertainty quantification in six state-of-the-art (SOTA) GNN models under conditional data shifts for node classification tasks, employing metrics such as Earth Mover's Distance, KL-divergence, JS-divergence, CMD, and MMD to quantify and analyze these shifts.
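As a rough illustration of the kind of shift measures used in this project, the sketch below computes two of them (an RBF-kernel MMD and the JS-divergence). The embeddings, kernel bandwidth, and histograms are placeholders rather than values from the actual study, and only NumPy is assumed.

```python
import numpy as np

def mmd_rbf(X, Y, gamma=1.0):
    """Squared Maximum Mean Discrepancy between samples X and Y with an RBF kernel."""
    def k(A, B):
        # Pairwise squared Euclidean distances -> RBF kernel matrix
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p, q = np.asarray(p, float) + eps, np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy usage: compare embeddings of "training-region" vs "shifted" nodes (placeholders only)
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(100, 16))   # e.g. embeddings of nodes near the training data
Y = rng.normal(0.5, 1.2, size=(100, 16))   # e.g. embeddings of distant/shifted nodes
print("MMD^2:", mmd_rbf(X, Y, gamma=0.1))
print("JS   :", js_divergence([0.2, 0.5, 0.3], [0.4, 0.4, 0.2]))
```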
Simran, M.Sc. (2024), MIT Manipal
Title: A Study on Message-Passing Graph Neural Networks (MPNN)
Details: Analyzed and compared various MPNN models, highlighting their key features and limitations in node and graph classification tasks.
Undergraduate
Aaryan Panigrahi and Tanish Sunilkumar, B.Tech 3rd Year, Mini project in Mathematics, MIT Manipal
Title: A Study on Spectral Graph Neural Networks
Details: Conducting an introductory study on spectral GNNs and their use cases.
Students Outside MIT
Sarbartha Sankar Mallick, B.Tech DS, West Bengal
Details: Investigating non-conformity scoring methods in conformal prediction, specifically designed for graph learning tasks.
Tushar Chaturvedi, Research Scholar, IIT Mandi
Details:
Omendra Gangwar, Research Scholar, IIT Guwahati
Raviteja Bommireddy, B.Tech AI, IIITD & M Kancheepuram