Welcome to the Param-Intelligence (𝝅) Lab!
The Param-Intelligence (𝝅) Lab, located in the Department of Aerospace Engineering at WPI and led by Prof. Ameya D. Jagtap, is at the forefront of developing trustworthy and scalable scientific machine learning algorithms. The lab's research interests encompass scientific machine learning, physics- and data-driven methods, trustworthy machine learning (ML) algorithms for scientific computation, and parallel algorithms. In particular, the lab tackles a range of problems in computational physics, including inverse problems, high-dimensional problems, fractional and non-local PDEs, and stochastic PDEs.
Our research is dedicated to the development of explainable, trustworthy, and robust machine learning (ML) and deep learning (DL) algorithms tailored for scientific discovery and engineering applications. We focus on advancing scientific machine learning methodologies by integrating physics-informed modeling with data-driven approaches to ensure physical consistency, generalizability, and interpretability. Our goal is to bridge the gap between theoretical innovations and real-world implementations, enabling the reliable deployment of AI tools across scientific disciplines and industrial sectors. Through this, we aim to foster a deeper understanding of complex systems while promoting the responsible and effective use of artificial intelligence in science and engineering.
Announcing our new preprint, An Approximation Theory Perspective on Machine Learning. Our goal is to spark a more rigorous and inspiring theoretical dialogue, one that places approximation theory at the heart of modern ML. Here is the link: [arXiv].
Prof. Jagtap will visit VNIT Nagpur on June 30th, 2025 to deliver an invited talk.
Title: Rethinking Scientific Computing: Are Neural Surrogates the Real Deal?
Announcing our new preprint, Anant-Net: a scalable and interpretable neural surrogate for high-dimensional PDEs, led by Ph.D. student Sidharth S. Menon. This work presents an efficient and accurate framework for solving high-dimensional PDEs. Here is the link: [arXiv].
Prof. Jagtap will visit IISER Pune on July 3rd, 2025 to deliver an invited talk.
Title: Scientific Computing with Neural Surrogates: Hype or Hope?
List of Top 2% Scientists in the World by Stanford University, USA!
Honored to be included among the top 2% of scientists worldwide! This achievement would not have been possible without the dedication and hard work of my incredible collaborators and students. Thank you all for your invaluable contributions!
H.N. Mhaskar, E. Tsoukanis, and Ameya D. Jagtap, An Approximation Theory Perspective on Machine Learning, arXiv preprint, arXiv:2506.02168, 2025. [arXiv]
Sidharth S. Menon and Ameya D. Jagtap, Anant-Net: Breaking the curse of dimensionality with scalable and interpretable neural surrogate for high-dimensional PDEs, arXiv preprint, arXiv:2505.03595, 2025. [arXiv]
Jassem Abbasi, Ameya D. Jagtap, Ben Moseley, Aksel Hiorth, and Pal Østebø Andersen, Challenges and Advancements in Modeling Shock Fronts with Physics-Informed Neural Networks: A Review and Benchmarking Study, arXiv preprint, arXiv:2503.17379, 2025. [arXiv]
Jassem Abbasi, Ben Moseley, Takeshi Kurotori, Ameya D. Jagtap, Anthony R. Kovscek, Aksel Hiorth, and Pal Østebø Andersen, History-Matching of Imbibition Flow in Multiscale Fractured Porous Media Using Physics-Informed Neural Networks (PINNs), Computer Methods in Applied Mechanics and Engineering, Vol. 437 (2025) 117784. [Journal]
S. Brahmachary, S.M. Joshi, A. Panda, K. Koneripalli, A. Kumar Sagotra, H. Patel, A. Sharma, Ameya D. Jagtap, and K. Kalyanaraman, Large Language Model-Based Evolutionary Optimizer: Reasoning with elitism, Neurocomputing, Vol. 622 (2025) 129272. [Journal]