Welcome to the Param-Intelligence Lab!
The Param-Intelligence Lab, located in the Department of Aerospace Engineering at WPI and led by Prof. Ameya D. Jagtap, is at the forefront of developing trustworthy and scalable scientific machine learning algorithms. The lab's research interests encompass scientific machine learning, physics- and data-driven methods, trustworthy machine learning (ML) algorithms for scientific computation, and parallel algorithms. In particular, the lab is interested in tackling a range of problems in computational physics, including inverse problems, high-dimensional problems, fractional and non-local PDEs, and stochastic PDEs.
Our research is dedicated to the development of explainable, trustworthy, and robust machine learning (ML) and deep learning (DL) algorithms tailored for scientific discovery and engineering applications. We focus on advancing scientific machine learning methodologies by integrating physics-informed modeling with data-driven approaches to ensure physical consistency, generalizability, and interpretability. Our goal is to bridge the gap between theoretical innovations and real-world implementations, enabling the reliable deployment of AI tools across scientific disciplines and industrial sectors. Through this, we aim to foster a deeper understanding of complex systems while promoting the responsible and effective use of artificial intelligence in science and engineering.
Recognized Among the World's Top 2% Scientists by Stanford and Elsevier in 2024 and 2025
In 2025, Dr. Ameya D. Jagtap was once again recognized among the world's top 2% scientists by Stanford University and Elsevier, with his global rank improving from 203 last year to 112 this year.
Congratulations to Ph.D. student Sidharth S. Menon on his first publication in Computer Methods in Applied Mechanics and Engineering (Elsevier)
Sidharth S. Menon and Ameya D. Jagtap, Anant-Net: Breaking the curse of dimensionality with scalable and interpretable neural surrogate for high-dimensional PDEs, Computer Methods in Applied Mechanics and Engineering, Vol. 447 (2025) 118403. [Journal] [Arxiv]
A new preprint alert: On Scientific Foundation Models: Rigorous Definitions, Key Applications, and a Survey, Available at SSRN: https://ssrn.com/abstract=5409063 or http://dx.doi.org/10.2139/ssrn.5409063 [Link]
Led by Ph.D. students Sidharth S. Menon and Trishit Mondal at WPI, together with wonderful collaborators from Shell: Dr. Shuvayan Brahmachary, Dr. Aniruddha Panda, Dr. Subodh M. Joshi, and Dr. Kaushic Kalyanaraman.
Paper accepted in Neurocomputing, Elsevier: Jassem Abbasi, Ameya D. Jagtap, Ben Moseley, Aksel Hiorth, and Pål Østebø Andersen, Challenges and Advancements in Modeling Shock Fronts with Physics-Informed Neural Networks: A Review and Benchmarking Study, Neurocomputing, Vol. 657 (2025) 131440. [Journal] [Arxiv]
A new preprint alert: BubbleONet: A Physics-Informed Neural Operator for High-Frequency Bubble Dynamics, Y. Zhang, L. Cheng, A. Gnanaskandan, Ameya D. Jagtap, arXiv preprint, arXiv:2508.03965, 2025. [Arxiv]. Led by Ph.D. student Yunhao Zhang. This paper presents BubbleONet, a physics-informed operator learning surrogate for efficiently simulating high-frequency bubble dynamics governed by the Rayleigh–Plesset and Keller–Miksis equations.
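For context, the Rayleigh–Plesset dynamics that BubbleONet learns to emulate can be simulated classically with a standard ODE solver. The sketch below is a minimal, hypothetical baseline (not the paper's method or parameters): it integrates the Rayleigh–Plesset equation for a micron-scale air bubble in water under a polytropic gas law, with illustrative physical constants chosen by us.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative physical parameters for water (assumed, not from the paper)
rho = 998.0       # liquid density [kg/m^3]
sigma = 0.0725    # surface tension [N/m]
mu = 1.0e-3       # dynamic viscosity [Pa s]
p_inf = 101325.0  # constant far-field pressure [Pa]
R0 = 10e-6        # equilibrium bubble radius [m]
kappa = 1.4       # polytropic exponent for the gas
p_g0 = p_inf + 2 * sigma / R0  # gas pressure at equilibrium radius

def rayleigh_plesset(t, y):
    """Rayleigh-Plesset equation written as a first-order system y = (R, Rdot)."""
    R, Rdot = y
    p_g = p_g0 * (R0 / R) ** (3 * kappa)           # polytropic gas pressure
    p_B = p_g - 2 * sigma / R - 4 * mu * Rdot / R  # pressure at the bubble wall
    Rddot = ((p_B - p_inf) / rho - 1.5 * Rdot**2) / R
    return [Rdot, Rddot]

# Start slightly compressed so the bubble oscillates about its equilibrium radius
sol = solve_ivp(rayleigh_plesset, (0.0, 5e-6), [0.9 * R0, 0.0],
                method="LSODA", rtol=1e-9, atol=1e-12)
print(f"final radius: {sol.y[0, -1]:.3e} m")
```

Even this single trajectory is stiff and high-frequency (the natural oscillation period of a 10 μm bubble is on the order of microseconds), which is precisely what makes a fast learned surrogate attractive when many such simulations are needed.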
Announcing our new preprint, An Approximation Theory Perspective on Machine Learning. Our goal is to spark a more rigorous and inspiring theoretical dialogue, one that places approximation theory at the heart of modern ML. Here is the link [Arxiv].
Prof. Jagtap will visit VNIT Nagpur on June 30th, 2025 to deliver an invited talk.
Title: Rethinking Scientific Computing: Are Neural Surrogates the Real Deal?
Announcing our new preprint, Anant-Net: a scalable and interpretable neural surrogate for high-dimensional PDEs, led by Ph.D. student Sidharth S. Menon. This work presents an efficient and accurate framework for solving high-dimensional PDEs. Here is the link [Arxiv].
Prof. Jagtap will visit IISER Pune on July 3rd, 2025 to deliver an invited talk.
Title: Scientific Computing with Neural Surrogates: Hype or Hope?
On Scientific Foundation Models: Rigorous Definitions, Key Applications, and a Survey, Sidharth S. Menon, Trishit Mondal, Shuvayan Brahmachary, Aniruddha Panda, Subodh M. Joshi, Kaushic Kalyanaraman, Ameya D. Jagtap. Available at SSRN: https://ssrn.com/abstract=5409063 or http://dx.doi.org/10.2139/ssrn.5409063 [Link]
BubbleONet: A Physics-Informed Neural Operator for High-Frequency Bubble Dynamics, Y. Zhang, L. Cheng, A. Gnanaskandan, Ameya D. Jagtap, arXiv preprint, arXiv:2508.03965, 2025. [Arxiv]
H. N. Mhaskar, E. Tsoukanis, Ameya D. Jagtap, An Approximation Theory Perspective on Machine Learning, arXiv preprint, arXiv:2506.02168, 2025. [Arxiv]
Sidharth S. Menon and Ameya D. Jagtap, Anant-Net: Breaking the curse of dimensionality with scalable and interpretable neural surrogate for high-dimensional PDEs, Computer Methods in Applied Mechanics and Engineering, Vol. 447 (2025) 118403. [Journal] [Arxiv]
Jassem Abbasi, Ameya D. Jagtap, Ben Moseley, Aksel Hiorth, and Pål Østebø Andersen, Challenges and Advancements in Modeling Shock Fronts with Physics-Informed Neural Networks: A Review and Benchmarking Study, Neurocomputing, Vol. 657 (2025) 131440. [Journal] [Arxiv]
Jassem Abbasi, Ben Moseley, Takeshi Kurotori, Ameya D. Jagtap, Anthony R. Kovscek, Aksel Hiorth, and Pål Østebø Andersen, History-Matching of Imbibition Flow in Multiscale Fractured Porous Media Using Physics-Informed Neural Networks (PINNs), Computer Methods in Applied Mechanics and Engineering, Vol. 437 (2025) 117784. [Journal]
S. Brahmachary, S. M. Joshi, A. Panda, K. Koneripalli, A. Kumar Sagotra, H. Patel, A. Sharma, Ameya D. Jagtap, K. Kalyanaraman, Large Language Model-Based Evolutionary Optimizer: Reasoning with elitism, Neurocomputing, Vol. 622 (2025) 129272. [Journal]