PROJECTS
Deep and Classical Finite Element Methods
Spring 2025
Course: MTH 656 – Finite Element Method, Oregon State University
Studied stable and convergent Deep Finite Element models for 2D/3D problems, including plates with circular holes and rock drill booms. Demonstrated up to 99% error reduction versus existing PINNs and significant speedups over classical FEM. Built MATLAB solvers for 1D/2D Poisson equations using linear/quadratic elements and verified convergence rates through L2 and H1 norms.
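The 1D Poisson workflow above can be sketched as follows (a minimal Python/NumPy stand-in for the project's MATLAB solvers, with a simple lumped-load quadrature I chose for brevity): solve -u'' = π²sin(πx) with homogeneous Dirichlet data, measure the discrete L2 error against the exact solution sin(πx), and estimate the convergence rate, which should be near 2 for linear elements.

```python
import numpy as np

def solve_poisson_fem(n):
    """Piecewise-linear FEM for -u'' = pi^2 sin(pi x), u(0)=u(1)=0,
    on a uniform mesh with n interior nodes; exact solution sin(pi x)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)                    # interior nodes
    K = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h  # stiffness
    f = np.pi**2 * np.sin(np.pi * x) * h              # lumped load vector
    u = np.linalg.solve(K, f)
    return np.sqrt(h * np.sum((u - np.sin(np.pi * x))**2))  # discrete L2 error

e1, e2 = solve_poisson_fem(32), solve_poisson_fem(64)
rate = np.log2(e1 / e2)    # should approach 2 for linear elements
```

Halving the mesh size roughly quarters the L2 error, which is the second-order behavior the project verified.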
Parameter Estimation and Sensitivity Analysis for the Nonlinear Maxwell–Duffing System Using FDTD
Winter 2025
Course: MTH 655 – Computational Methods for Inverse Problems and Optimization
Implemented an FDTD solver in MATLAB to simulate nonlinear electromagnetic wave propagation governed by Maxwell–Duffing equations. Solved an inverse problem using nonlinear least squares to estimate key parameters, and performed sensitivity analysis with Jacobian approximations.
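The inverse-problem loop can be illustrated with a toy forward model (a damped oscillation I substitute here, since the full Maxwell–Duffing FDTD solver is too large to inline): Gauss–Newton nonlinear least squares with a finite-difference Jacobian, the same sensitivity approximation described above.

```python
import numpy as np

t = np.linspace(0.0, 5.0, 100)
true_p = np.array([0.7, 3.0])                    # (damping, frequency)

def forward(p):
    return np.exp(-p[0] * t) * np.cos(p[1] * t)  # toy forward model

data = forward(true_p)                           # synthetic "measurements"

def jacobian(p, eps=1e-6):
    """Sensitivity matrix dF/dp via forward differences."""
    J = np.empty((t.size, p.size))
    base = forward(p)
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = eps
        J[:, j] = (forward(p + dp) - base) / eps
    return J

p = np.array([0.6, 2.9])                         # initial guess near truth
for _ in range(20):                              # Gauss-Newton iterations
    r = forward(p) - data
    J = jacobian(p)
    p = p - np.linalg.solve(J.T @ J, J.T @ r)
```

The Jacobian columns double as sensitivity profiles: their relative magnitudes show which parameter the observations constrain best.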
Modified Physics-Informed Neural Networks for the Korteweg–de Vries (KdV) Equation
Winter 2025
Course: AI 535 – Deep Learning
Designed a modified PINN framework that conserves mass, energy, and Hamiltonian structure for the KdV equation. Achieved high accuracy (L2 errors ≈ 10⁻⁶) using a 3-layer PyTorch model with exponential time-weighting and trapezoidal integration.
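The conservation penalties can be checked outside the network: for KdV in the form u_t + 6uu_x + u_xxx = 0, mass ∫u dx and energy ∫u² dx are invariants, and the exact soliton lets one verify the trapezoidal-rule quadrature used inside such a loss (a NumPy sketch, not the project's PyTorch code).

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal rule, the quadrature behind the conservation penalties."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def soliton(x, t, c=1.0):
    """Exact soliton of u_t + 6 u u_x + u_xxx = 0."""
    return 0.5 * c / np.cosh(0.5 * np.sqrt(c) * (x - c * t))**2

x = np.linspace(-30.0, 30.0, 2001)
mass = [trapezoid(soliton(x, t), x) for t in (0.0, 1.0)]       # -> 2*sqrt(c)
energy = [trapezoid(soliton(x, t)**2, x) for t in (0.0, 1.0)]  # -> (2/3)*c**1.5
```

In a conservation-aware loss, |mass(t) − mass(0)| and |energy(t) − energy(0)| become penalty terms alongside the PDE residual.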
Energy-Stable Modeling of the Transport Equation Coupled with Darcy Flow
Fall 2024
Course: MTH 654 – Computational Solutions to Nonlinear Coupled PDEs
Built structure-preserving numerical models for coupled advection–diffusion–reaction systems and Darcy flow. Used operator splitting, finite volume methods, and PINNs.
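The operator-splitting idea can be sketched on a scalar advection–diffusion–reaction equation u_t + au_x = Du_xx − ku (a toy 1D periodic problem with parameters I chose; in the project the Darcy solve supplies the velocity field): advect with upwinding, diffuse explicitly, then integrate the reaction exactly.

```python
import numpy as np

a, D, k = 1.0, 0.01, 0.5          # advection speed, diffusivity, reaction rate

def split_step(u, dt, dx):
    """One Lie-splitting step: advect (upwind), diffuse (explicit), react."""
    u = u - a * dt / dx * (u - np.roll(u, 1))                          # advection
    u = u + D * dt / dx**2 * (np.roll(u, -1) - 2 * u + np.roll(u, 1))  # diffusion
    return u * np.exp(-k * dt)    # exact reaction sub-step

n = 200
dx = 1.0 / n
dt = 0.25 * dx**2 / D             # within the explicit-diffusion stability limit
x = np.arange(n) * dx
u = np.exp(-200.0 * (x - 0.5)**2) # initial pulse on a periodic domain
m0 = u.sum() * dx                 # initial mass
for _ in range(100):
    u = split_step(u, dt, dx)
mT = u.sum() * dx                 # mass decays exactly like exp(-k t)
```

The structure-preserving point: the advection and diffusion sub-steps conserve mass on the periodic grid, so total mass decays exactly at the reaction rate, a discrete analogue of the continuous balance law.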
HPC Simulation of Electromagnetic Wave Propagation in Nonlinear Dispersive Insulators
Winter 2025
Developed a parallelized FDTD solver using MPI and OpenMP for large-scale simulation of wave propagation in nonlinear media. Validated results against theoretical models to ensure numerical accuracy.
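The domain-decomposition pattern behind the MPI parallelization can be demonstrated serially (a toy heat-equation stencil I use as a stand-in; the real code exchanges halos with MPI calls before every FDTD update): two subdomains with one-cell ghost regions reproduce the monolithic update exactly.

```python
import numpy as np

def heat_step(u):
    """Serial 3-point heat-equation stencil; endpoints held fixed."""
    out = u.copy()
    out[1:-1] = u[1:-1] + 0.4 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return out

u = np.sin(np.linspace(0.0, np.pi, 64))
serial = heat_step(u)                # monolithic update, for reference

# Decompose into two "ranks", each holding one ghost cell; the slice overlap
# plays the role of the halo exchange (done by message passing in the real
# code, and repeated before every step so the ghosts never go stale).
left = u[:33].copy()                 # owns cells 0..31, ghost = cell 32
right = u[31:].copy()                # owns cells 32..63, ghost = cell 31
combined = np.concatenate([heat_step(left)[:32], heat_step(right)[1:]])
```

Bitwise agreement between the decomposed and serial updates is exactly the kind of validation check the project relied on before scaling up.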
Collaborative Research: Compatible Discretizations for Nonlinear Optical Phenomena
Summer 2024, Brown University
Investigated conservation laws (mass, energy, momentum) in coupled nonlinear Schrödinger equations. Developed energy-stable numerical schemes and integrated PINNs to model soliton interactions and preserve physics.
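Mass conservation for a single nonlinear Schrödinger component can be checked with a split-step Fourier integrator (a standard scheme I use here for illustration, not the project's compatible discretization): both sub-steps multiply by unit-modulus factors, so ∫|u|² dx is preserved to round-off.

```python
import numpy as np

n, L = 256, 40.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
kk = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)    # spectral wavenumbers
u = 1.0 / np.cosh(x)                             # bright-soliton initial data
dt = 1e-3
mass0 = np.sum(np.abs(u)**2) * (L / n)
for _ in range(1000):                            # i u_t + u_xx + 2|u|^2 u = 0
    u = u * np.exp(2j * dt * np.abs(u)**2)       # nonlinear sub-step
    u = np.fft.ifft(np.exp(-1j * kk**2 * dt) * np.fft.fft(u))  # linear sub-step
mass1 = np.sum(np.abs(u)**2) * (L / n)           # unchanged to round-off
```

Tracking discrete invariants like this over long soliton-interaction runs is how the energy-stable schemes were assessed.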
Stability and Dispersion Analysis of (2,2M) KF Schemes for Lorentz Media
Spring 2024
Analyzed high-order finite-difference KF schemes for solving Maxwell’s equations in dispersive Lorentz media. Derived stability conditions using von Neumann analysis and evaluated numerical dispersion and dissipation.
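The von Neumann step can be illustrated on the simplest member of the family, the (2,2) leapfrog scheme for the 1D wave equation (dispersive Lorentz terms omitted): the amplification factor g solves g² − 2(1 − 2ν²sin²(θ/2))g + 1 = 0 with Courant number ν = cΔt/Δx, and the scheme is stable (|g| ≤ 1) iff ν ≤ 1.

```python
import numpy as np

def max_growth(nu, n_theta=721):
    """Largest |g| over wavenumbers theta for Courant number nu = c*dt/dx."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta)
    b = 1.0 - 2.0 * nu**2 * np.sin(theta / 2.0)**2
    s = np.emath.sqrt(b**2 - 1.0)        # complex-safe root of the quadratic
    return float(np.maximum(np.abs(b + s), np.abs(b - s)).max())
```

Below the CFL limit both roots sit on the unit circle (pure dispersion, no dissipation); above it one root escapes and the scheme blows up.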
Sentiment Classification Using Word2Vec and Linear Models
Fall 2024
Course: AI 534 – Machine Learning
Built binary sentiment classifiers using Word2Vec embeddings and models including Perceptron, SVM, and Logistic Regression. Applied cosine similarity, embedding clustering, and dimensionality reduction for interpretability.
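The similarity measure underlying the embedding analysis is compact enough to state directly (a generic sketch; the project applied it to pretrained Word2Vec vectors):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

Because it normalizes away vector length, cosine similarity compares word meanings by direction alone, which is why it pairs naturally with embedding clustering.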
Regression Modeling for Housing Price Prediction
Fall 2024
Course: AI 534 – Machine Learning
Created predictive models using polynomial regression, Ridge, and Lasso. Tuned preprocessing and feature selection to minimize RMSLE and optimize model performance.
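The tuning target can be written down explicitly (a generic implementation of the metric, not code from the project):

```python
import numpy as np

def rmsle(y_true, y_pred):
    """Root mean squared logarithmic error; log1p keeps zero targets valid."""
    return float(np.sqrt(np.mean((np.log1p(y_pred) - np.log1p(y_true))**2)))
```

Working on log1p-transformed prices penalizes relative rather than absolute errors, which suits the wide price range of housing data.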
Feature Engineering and k-NN Optimization
Fall 2024
Course: AI 534 – Machine Learning
Engineered binary features and optimized k-NN classifiers under various distance metrics. Evaluated generalization performance using cross-validation.
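A minimal version of the classifier under study, with the distance metric as a swappable choice (toy data in the test; the project's feature engineering and cross-validation are omitted):

```python
import numpy as np

def knn_predict(Xtr, ytr, Xte, k=3, metric="euclidean"):
    """Binary k-NN with a pluggable distance metric and majority vote."""
    diff = Xte[:, None, :] - Xtr[None, :, :]     # pairwise differences
    if metric == "euclidean":
        D = np.sqrt((diff**2).sum(-1))
    else:                                        # "manhattan"
        D = np.abs(diff).sum(-1)
    idx = np.argsort(D, axis=1)[:, :k]           # k nearest training points
    return (ytr[idx].mean(axis=1) > 0.5).astype(int)
```

Swapping the metric changes which neighbors are "near", which is exactly the axis of variation the project evaluated with cross-validation.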
Algorithmic Agents and Search Heuristics in AI Systems
Spring 2024
Simulated intelligent agents using A* search, rule-based logic, and constraint solvers for tasks such as Sudoku and the 15-puzzle. Also implemented CNN architectures for vision-based tasks.
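The A* component can be sketched on the 8-puzzle (the 3×3 cousin of the 15-puzzle, chosen to keep the example short), with Manhattan distance as the admissible heuristic:

```python
import heapq

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)   # 0 marks the blank tile

def manhattan(state):
    """Sum of tile distances from their goal positions (admissible)."""
    d = 0
    for i, v in enumerate(state):
        if v:                        # skip the blank
            d += abs(i // 3 - (v - 1) // 3) + abs(i % 3 - (v - 1) % 3)
    return d

def neighbors(state):
    """States reachable by sliding one tile into the blank."""
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def astar(start):
    """Return the length of a shortest solution."""
    open_heap = [(manhattan(start), 0, start)]
    best = {start: 0}
    while open_heap:
        f, g, state = heapq.heappop(open_heap)
        if state == GOAL:
            return g
        for nxt in neighbors(state):
            if nxt not in best or g + 1 < best[nxt]:
                best[nxt] = g + 1
                heapq.heappush(open_heap, (g + 1 + manhattan(nxt), g + 1, nxt))
    return None
```

Because Manhattan distance never overestimates the remaining moves, A* returns optimal solutions while expanding far fewer states than uninformed search.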
Custom Neural Network for CIFAR-10 Binary Classification
Winter 2025
Built a one-hidden-layer neural network in Python to classify a 2-class subset of CIFAR-10. Used ReLU in the hidden layer, sigmoid at output, and trained with binary cross-entropy and SGD with momentum.
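The described architecture fits in a short NumPy sketch (trained here on toy 2D data rather than CIFAR-10, with layer sizes and hyperparameters I picked for the example): ReLU hidden layer, sigmoid output, binary cross-entropy, and SGD with momentum.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # toy separable labels

W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1));  b2 = np.zeros(1)
params = [W1, b1, W2, b2]
vel = [np.zeros_like(q) for q in params]    # momentum buffers
lr, mu = 0.1, 0.9

def forward(X):
    h = np.maximum(0.0, X @ W1 + b1)                 # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))         # sigmoid output
    return h, p.ravel()

losses = []
for _ in range(200):
    h, p = forward(X)
    losses.append(float(-np.mean(y * np.log(p + 1e-12)
                                 + (1 - y) * np.log(1 - p + 1e-12))))
    dz2 = (p - y)[:, None] / len(y)                  # BCE-through-sigmoid grad
    dh = (dz2 @ W2.T) * (h > 0)                      # backprop through ReLU
    grads = [X.T @ dh, dh.sum(0), h.T @ dz2, dz2.sum(0)]
    for q, v, g in zip(params, vel, grads):          # SGD with momentum
        v *= mu
        v -= lr * g
        q += v

_, p_hat = forward(X)
accuracy = float(np.mean((p_hat > 0.5) == (y > 0.5)))
```

The clean gradient dz2 = p − y is the payoff of pairing sigmoid outputs with binary cross-entropy, which keeps the backward pass this short.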
ResNet-14 for CIFAR-10 Classification
Winter 2025
Course: AI 535 – Deep Learning
Implemented ResNet-14 (with n = 2) in PyTorch for CIFAR-10 classification. Used SGD with momentum for training and visualized performance metrics in TensorBoard, gaining insight into residual learning and training dynamics.
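The residual idea at the core of the architecture reduces to one line (a dense NumPy stand-in for the PyTorch convolutional block): the output is ReLU(x + F(x)), so when the learned branch F contributes nothing, the block is an identity map.

```python
import numpy as np

def residual_block(x, W1, W2):
    """Dense stand-in for a residual block: output = ReLU(x + F(x))."""
    h = np.maximum(0.0, x @ W1)         # learned branch F: layer + ReLU
    return np.maximum(0.0, x + h @ W2)  # identity shortcut, then ReLU
```

That identity path is why deep residual stacks train stably: gradients reach early layers through the shortcut even when the learned branches are still near zero.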