Image Restoration
FPGA-based Hardware Accelerators
Learning Algorithms for Deep Learning
TinyML for Mobile and Edge Computing
In-Memory Computing Architectures
February 2026: Paper accepted at INTERMAG 2026.
December 2025: Work on an atmospheric scattering model-based image restoration architecture accepted at IEEE Signal Processing Letters.
November 2025: Work on a degradation-aware all-in-one image restoration network accepted at the Journal of Visual Communication and Image Representation.
May 2025: Work on DSHE-MRAM-based in-memory computing for image segmentation accepted at APL Electronic Devices.
May 2025: Application of an Ising machine to image denoising accepted at Physica Scripta.
March 2025: Work on a spatial-frequency-domain non-local attention-based restoration network accepted at IJCNN'25.
March 2025: Paper selected for the CVPR'25 Workshop: New Trends in Image Restoration and Enhancement (NTIRE).
February 2025: Started a research internship at Samsung R&D Institute, Bangalore.
December 2024: Work on approximation-aware training and its compute-in-memory (CiM) architecture accepted at the IEEE Open Journal of Nanotechnology.
November 2024: Work on ML models for estimating magnetic parameters accepted at the IEEE Open Journal of Nanotechnology.
April 2024: Paper selected for the CVPR'24 Workshop: New Trends in Image Restoration and Enhancement (NTIRE).
March 2024: Filed a patent on Feature Enhancement with Channel Fusion in Convolution.
February 2024: Paper accepted at IEEE Transactions on Electron Devices on predicting the formation of AFM skyrmions with ML.
June 2023: Two papers accepted at IEEE NMDC 2023 Conference.
May 2023: Two papers accepted at IEEE NANO 2023 Conference.
May 2023: Two papers accepted at SPIE Optics+Photonics 2023 Conference.
March 2023: Received Prime Minister Research Fellowship.
March 2023: Paper accepted at the IEEE Open Journal of Nanotechnology on an SOT-based MAAP unit that merges pooling and the activation function.
January 2023: Paper accepted at IEEE Transactions on Electron Devices on a hybrid multilevel STT/DSHE memory architecture for training CNNs.
October 2022: Two papers accepted at SPIE Photonics West 2023 Conference.
~IEEE Signal Processing Letters
We present a physics-guided unified deep learning framework for single image restoration, addressing dehazing, deraining, and low-light enhancement in a single model. The approach combines atmospheric model-based estimation with a novel DiffConv feature extractor that captures both low- and high-frequency details efficiently. A grayscale prior is leveraged to reduce noise and color artifacts, while a refinement stage enhances structural consistency. The model is optimized for efficiency through re-parameterization, achieving improved restoration quality (PSNR/SSIM) with lower computational cost.
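The atmospheric model underlying this framework can be sketched in a few lines. This is a minimal NumPy illustration of the scattering equation I = J·t + A·(1 − t) and its inversion, not the paper's learned estimator; the airlight A and transmission t are assumed known here, whereas the actual network estimates them.

```python
import numpy as np

def synthesize_haze(J, A, t):
    """Atmospheric scattering model: I = J*t + A*(1 - t)."""
    return J * t + A * (1.0 - t)

def dehaze(I, A, t, t_min=0.1):
    """Invert the model to recover the scene radiance J."""
    t = np.maximum(t, t_min)  # clamp t to avoid amplifying noise where t -> 0
    return (I - A) / t + A

# Toy example: constant airlight, uniform transmission.
rng = np.random.default_rng(0)
J = rng.uniform(0, 1, size=(8, 8, 3))  # clean image
t = np.full((8, 8, 1), 0.6)            # transmission map
A = 0.9                                # global airlight
I = synthesize_haze(J, A, t)
J_hat = dehaze(I, A, t)
print(np.allclose(J, J_hat))           # True: inversion is exact when A, t are known
```

In practice A and t must be predicted from the hazy input, which is exactly where the learned components of the framework come in.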
~IJCNN 2025
We propose SFCola-Net, a spatial–frequency collaborative attention network for robust image restoration under challenging conditions such as fog, rain, and haze. Unlike existing methods that rely solely on local or non-local attention, our approach integrates both to better capture fine details and long-range dependencies. By jointly restoring features in spatial and frequency domains and incorporating a patch-wise non-local attention mechanism, the model effectively handles severe degradation and complex textures, leading to improved image quality for downstream vision tasks like autonomous driving.
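To make the patch-wise non-local idea concrete, here is a generic NumPy sketch of self-attention over non-overlapping patches of a feature map. It is not SFCola-Net's actual module: learned query/key/value projections are omitted (tokens attend to themselves directly) to keep the example self-contained.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def patchwise_nonlocal_attention(feat, patch=4):
    """Self-attention over non-overlapping patches of an (H, W, C) feature map.

    Each patch is flattened into one token, so the attention cost scales with
    (H*W / patch**2)**2 instead of (H*W)**2 for pixel-wise non-local attention.
    """
    H, W, C = feat.shape
    ph, pw = H // patch, W // patch
    # Split into patches: (ph*pw, patch*patch*C), one token per patch.
    tokens = (feat.reshape(ph, patch, pw, patch, C)
                  .transpose(0, 2, 1, 3, 4)
                  .reshape(ph * pw, patch * patch * C))
    d = tokens.shape[1]
    attn = softmax(tokens @ tokens.T / np.sqrt(d))  # (N, N) patch affinities
    out = attn @ tokens                             # aggregate long-range context
    # Fold tokens back to the (H, W, C) layout.
    return (out.reshape(ph, pw, patch, patch, C)
               .transpose(0, 2, 1, 3, 4)
               .reshape(H, W, C))

x = np.random.default_rng(1).normal(size=(8, 8, 3))
y = patchwise_nonlocal_attention(x, patch=4)
print(y.shape)  # (8, 8, 3)
```

The patch-level tokens capture long-range dependencies cheaply, while fine detail is left to local (spatial-domain) branches in the full network.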
~ICASSP 2026
We propose an efficient instance-level image retrieval framework designed for resource-constrained devices. A lightweight student model computes query descriptors on-device, while a powerful teacher model generates gallery descriptors offline. To bridge the gap between the two, we employ a combination of relational knowledge distillation and intermediate feature alignment. Additionally, a lightweight re-ranking strategy using local matching and descriptor refinement (DBA and QE) improves robustness to variations such as viewpoint, scale, and illumination, delivering accurate and efficient retrieval performance.
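Relational knowledge distillation matches the *relations* between embeddings rather than the embeddings themselves. The sketch below implements the standard distance-wise RKD loss in NumPy as an illustration; the exact loss combination in the paper may differ, and the `student`/`teacher` arrays are synthetic stand-ins.

```python
import numpy as np

def pairwise_dist(x):
    """Euclidean distance matrix for a batch of embeddings (N, D)."""
    sq = (x ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * x @ x.T
    return np.sqrt(np.maximum(d2, 0.0))

def rkd_distance_loss(student, teacher):
    """Distance-wise relational KD: match mean-normalized pairwise distances."""
    ds, dt = pairwise_dist(student), pairwise_dist(teacher)
    mask = ~np.eye(len(ds), dtype=bool)  # drop the zero diagonal
    ds = ds / ds[mask].mean()            # normalization makes relations scale-invariant
    dt = dt / dt[mask].mean()
    diff = ds[mask] - dt[mask]
    # Smooth-L1 (Huber) penalty, as commonly used for RKD.
    return np.where(np.abs(diff) < 1, 0.5 * diff ** 2, np.abs(diff) - 0.5).mean()

rng = np.random.default_rng(0)
t = rng.normal(size=(16, 128))            # teacher descriptors
print(round(rkd_distance_loss(2.0 * t, t), 6))  # 0.0: uniform scaling preserves relations
```

Because only relative geometry is matched, the student's descriptors need not share the teacher's dimensionality or scale, which is what lets a much lighter on-device model remain compatible with teacher-encoded gallery descriptors.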
~IEEE Open Journal of Nanotechnology
This work proposes approximation-aware training, in which groups of weights are approximated by a differentiable approximation function, yielding a new weight matrix composed of the approximation function's coefficients (AFC). The network is trained with backpropagation to minimize the loss with respect to the AFC matrix; linear and quadratic approximation functions preserve accuracy even at high compression rates. The work further develops a compute-in-memory architecture for inference with such approximate neural networks, mapping the AFC directly onto crossbar arrays. This reduces the number of crossbars, lowering area and energy consumption, and integrating magnetic random-access memory-based devices further reduces latency and energy.
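The compression mechanism can be illustrated numerically. In this NumPy sketch each group of weights is replaced by a quadratic function of the within-group index, so only the coefficients (the AFC) are stored; a least-squares fit stands in for the backpropagation-based coefficient training described above, and all sizes are illustrative.

```python
import numpy as np

def fit_afc(weights, group_size, degree=2):
    """Approximate each group of weights with a degree-d polynomial of the
    within-group index; only the coefficients (AFC) are kept."""
    groups = weights.reshape(-1, group_size)
    idx = np.arange(group_size)
    V = np.vander(idx, degree + 1, increasing=True)  # design matrix [1, i, i^2]
    # Least-squares coefficients per group: shape (num_groups, degree + 1).
    afc, *_ = np.linalg.lstsq(V, groups.T, rcond=None)
    return afc.T, V

def reconstruct(afc, V):
    """Rebuild the full weight vector from the stored coefficients."""
    return (afc @ V.T).reshape(-1)

rng = np.random.default_rng(0)
w = np.sort(rng.normal(size=64))          # smoothly varying weights compress well
afc, V = fit_afc(w, group_size=8, degree=2)
w_hat = reconstruct(afc, V)
print(w.size / afc.size)                  # ~2.67: 8 weights -> 3 coefficients per group
```

Since inference only ever needs the AFC, the crossbar arrays in the CiM architecture store the (much smaller) coefficient matrix instead of the full weights.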
~CVPR'24 Workshop: New Trends in Image Restoration and Enhancement (NTIRE)
This work presents a novel two-stage architecture for restoring images degraded by rain, haze, blur, and other environmental factors. We propose a Fourier prior to improve the generalization ability of image restoration models, based on a key observation: degradation can be effectively mitigated by replacing the Fourier amplitude of a degraded image with that of its clean counterpart, indicating that phase preserves background structures while amplitude carries the degradation. Accordingly, the model comprises a Phase Refinement Unit and an Amplitude Refinement Unit, which restore the phase and amplitude of degraded images, respectively.
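The amplitude-swap observation is easy to verify directly. This NumPy sketch keeps a degraded image's Fourier phase but substitutes the clean image's amplitude; the toy degradation (a positive affine intensity change) is an assumption for the demo, chosen because it leaves the phase of every non-DC frequency intact, so the swap recovers the clean image exactly.

```python
import numpy as np

def swap_amplitude(degraded, clean):
    """Combine the degraded image's phase with the clean image's amplitude."""
    F_deg, F_cln = np.fft.fft2(degraded), np.fft.fft2(clean)
    phase = np.angle(F_deg)   # background structure lives in the phase
    amp = np.abs(F_cln)       # degradation information lives in the amplitude
    return np.real(np.fft.ifft2(amp * np.exp(1j * phase)))

rng = np.random.default_rng(0)
clean = rng.uniform(size=(16, 16))
haze = np.clip(clean * 0.5 + 0.4, 0, 1)   # toy global degradation (phase-preserving)
restored = swap_amplitude(haze, clean)
print(np.allclose(restored, clean))       # True for this degradation
```

Real degradations perturb both components, and clean amplitudes are unavailable at test time, which is why the two-stage model learns to refine phase and amplitude separately rather than copying them.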