Samsung R&D Institute India, Bengaluru
A PhD Intern in the Beyond 5G Team; research interests include AI/ML in wireless communication, particularly in the context of channel generation.
Matlab, Python, C, LaTeX
Large Foundation Models, Generative AI, AI for 6G, Multimodal Learning, Computer Vision, Distributed & Decentralized AI, Stochastic Gradient Descent, Machine Learning, Deep Learning, Internet of Things
transformers, PyTorch, Hugging Face, torchvision, OpenCV, pandas, NumPy, Matplotlib, seaborn, scikit-learn, SionnaRT
Parth Sharma and Pyari Mohan Pradhan
25 April 2025 • 2024 34th International Conference on Computer Theory and Applications (ICCTA)
Precise segmentation of brain tumors is essential in medical imaging, supporting early diagnosis, personalized treatment planning, and effective monitoring of tumor development and response to therapies. While centralized deep learning models like U-Net are highly effective for this task, they raise substantial privacy concerns when healthcare organizations need to exchange confidential patient information. To address these concerns, the BrainDiffU-Net framework is introduced, a diffusion-based U-Net model designed for decentralized learning environments. In this framework, each medical institution independently trains a model on its local dataset and exchanges model updates with neighboring institutions, ensuring data privacy without requiring a central server. The diffusion process allows model parameters to be shared securely across the network, facilitating collaborative learning across multiple institutions while preserving privacy. Simulations conducted on the Low-Grade Glioma (LGG) MRI Segmentation Dataset demonstrate that BrainDiffU-Net achieves segmentation performance comparable to centralized models and significantly outperforms non-cooperative strategies in terms of average loss and Dice coefficient. These results show that BrainDiffU-Net provides an optimal balance between segmentation accuracy and privacy preservation, making it a viable solution for decentralized medical applications. The source code is available at https://github.com/Parth-nXp/BrainDiffU-Net.
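A minimal PyTorch sketch (not the released BrainDiffU-Net code) of the decentralized diffusion step the abstract describes: each node performs a local "adapt" update on its private data, then "combines" its parameters with those of its graph neighbors. The model, loss, topology, and mixing weights below are illustrative placeholders.

```python
import torch
import torch.nn as nn

def local_adapt(model, batch, lr=1e-3):
    """One local training step (the 'adapt' phase) on a node's private data."""
    x, y = batch
    loss = nn.functional.binary_cross_entropy_with_logits(model(x), y)
    loss.backward()
    with torch.no_grad():
        for p in model.parameters():
            p -= lr * p.grad
            p.grad = None
    return loss.item()

def diffuse_combine(models, neighbors, weights):
    """'Combine' phase: every node averages parameters with its neighbors.

    neighbors[k] lists neighbor indices of node k (including k itself);
    weights[k][l] are combination coefficients that sum to one per node.
    """
    snapshots = [[p.detach().clone() for p in m.parameters()] for m in models]
    with torch.no_grad():
        for k, model in enumerate(models):
            for i, p in enumerate(model.parameters()):
                p.copy_(sum(weights[k][l] * snapshots[l][i] for l in neighbors[k]))

# Toy example: 3 "institutions" on a line topology, with a tiny stand-in for a U-Net.
models = [nn.Linear(16, 1) for _ in range(3)]
neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}
weights = {0: {0: 0.5, 1: 0.5}, 1: {0: 1/3, 1: 1/3, 2: 1/3}, 2: {1: 0.5, 2: 0.5}}
for _ in range(10):
    for m in models:
        local_adapt(m, (torch.randn(8, 16), torch.rand(8, 1)))
    diffuse_combine(models, neighbors, weights)
```

Only parameter snapshots cross institutional boundaries in this sketch; raw patient data never leaves a node, which is the privacy property the framework relies on.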
Parth Sharma and Pyari Mohan Pradhan
03 March 2025 • 2024 IEEE International Conference on Advanced Networks and Telecommunications Systems (ANTS)
Estimation of sparse parameters in the presence of non-Gaussian noise is a challenge in Internet of Things (IoT) networks. This paper introduces a novel diffusion β-divergence based block proportionate normalized least mean squares (DβBPNLMS) algorithm tailored for distributed estimation in IoT networks. The proposed algorithm incorporates the β-divergence measure to enhance robustness against non-Gaussian noise. Further, it employs block proportionate normalization to reduce computational complexity while maintaining high estimation accuracy. The algorithm also utilizes a coefficient re-ordering strategy and uniform weighting within blocks to efficiently prioritize significant coefficients. An adapt-then-combine (ATC) diffusion strategy is implemented to improve robustness and convergence speed in distributed parameter estimation. The performance of the proposed DβBPNLMS algorithm is validated through simulation studies on two critical IoT applications: acoustic echo path estimation and urban microcell wireless channel estimation. The results demonstrate that the proposed algorithm significantly outperforms state-of-the-art algorithms, achieving lower steady-state error across various noise conditions, thereby proving its effectiveness and practical applicability in real-time IoT networks.
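A minimal NumPy sketch of the adapt-then-combine (ATC) diffusion strategy that underlies the proposed DβBPNLMS. For brevity it uses a plain NLMS adapt step and omits the β-divergence weighting, block-proportionate gain matrix, and coefficient re-ordering; the network size, sparsity pattern, and combination weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 5, 32                          # number of IoT nodes, filter length
w_true = np.zeros(M)
w_true[rng.choice(M, 4, replace=False)] = rng.standard_normal(4)  # sparse parameter
C = np.full((N, N), 1.0 / N)          # combination matrix (uniform weights, fully connected)
W = np.zeros((N, M))                  # local estimates, one row per node
mu, eps = 0.5, 1e-6

for _ in range(2000):
    psi = np.empty_like(W)
    for k in range(N):                # adapt: each node updates from its own measurement
        u = rng.standard_normal(M)
        d = u @ w_true + 0.01 * rng.standard_normal()   # Gaussian noise for simplicity
        e = d - u @ W[k]
        psi[k] = W[k] + mu * e * u / (u @ u + eps)      # NLMS adapt step
    W = C @ psi                       # combine: weighted average of neighbors' intermediates

print("steady-state MSD:", np.mean(np.sum((W - w_true) ** 2, axis=1)))
```

In the actual algorithm the adapt step would replace the squared-error gradient with a β-divergence based one (for impulsive-noise robustness) and scale each block of coefficients with proportionate gains; the ATC structure shown here stays the same.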
Parth Sharma and Pyari Mohan Pradhan
03 March 2025 • 2024 IEEE International Conference on Advanced Networks and Telecommunications Systems (ANTS)
As the demand for seamless connectivity across distributed networks increases, traditional federated learning (FL) models struggle to maintain accuracy and efficiency in dynamic environments. Conventional approaches, such as Random Fourier Features-based Kernel Least Mean Squares (RFF-KLMS), often fail to adapt to changing data distributions, leading to performance degradation. To address this issue, the Dynamic Fourier Federated Learning (DFFL) algorithm is introduced, incorporating adaptive Fourier features that iteratively evolve to better align with current data patterns. Simulation results demonstrate that the DFFL algorithm significantly enhances the convergence rate of FL models, ensuring reliable and seamless connectivity in real-world applications where data characteristics are continuously changing.
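A minimal NumPy sketch (assumptions, not the paper's implementation) of the federated RFF-KLMS baseline that DFFL builds on: each client runs kernel LMS in a random Fourier feature space and a server periodically averages the clients' weights. DFFL's adaptive evolution of the Fourier features is not reproduced here; the target function, dimensions, and step size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
D, d, clients, mu = 100, 4, 3, 0.2          # RFF dimension, input dim, #clients, step size
Omega = rng.standard_normal((D, d))          # Fourier frequencies (Gaussian-kernel approximation)
b = rng.uniform(0, 2 * np.pi, D)

def rff(x):
    """Random Fourier feature map z(x) approximating a Gaussian kernel."""
    return np.sqrt(2.0 / D) * np.cos(Omega @ x + b)

f_true = lambda x: np.sin(x @ np.ones(d))    # unknown nonlinear target (illustrative)
W = np.zeros((clients, D))                   # per-client weights in feature space

for _ in range(50):                          # federated rounds
    for k in range(clients):
        for _ in range(20):                  # local KLMS updates on private samples
            x = rng.standard_normal(d)
            z = rff(x)
            e = f_true(x) - W[k] @ z
            W[k] += mu * e * z
    W[:] = W.mean(axis=0)                    # server aggregation (federated averaging)

x_test = rng.standard_normal((200, d))
mse = np.mean((np.array([f_true(x) for x in x_test]) -
               np.array([W[0] @ rff(x) for x in x_test])) ** 2)
print("test MSE:", mse)
```

In DFFL, the frequency matrix Omega would additionally be updated over the rounds so the feature space tracks the changing data distribution, which is what this static baseline cannot do.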