At the NIH, I use the Biowulf high-performance computing (HPC) cluster to accelerate computationally intensive workflows related to FLIM image analysis, behavioral video processing, and in vivo fluorescence imaging. These large-scale datasets require efficient parallel processing and scalable analytical pipelines. We also employ AlphaFold2 and AlphaFold3 (Google DeepMind) for structure prediction and validation to inform the rational design and optimization of protein-based biosensors.
Our team is interested in developing machine learning–based algorithms to quantify and classify complex neural datasets, integrating multimodal data from imaging and behavior. In parallel, I build computational models of neural dynamics and animal behavior to investigate mechanisms of learning and adaptation, particularly in the context of acute and chronic substance exposure. These approaches enable a systems-level understanding of how molecular signaling and circuit activity give rise to behavior.
With an extensive engineering background, I have worked on the development of integrated electronics, implantable device technology, and bio-optical systems. These technologies include implantable physical biosensors, multifunctional biomaterials, Raman spectroscopy imaging, signal and image processing algorithms, and artificial neural networks (ANNs) for biological applications. In my previous biomedical and electrical engineering projects, I developed on-chip CMOS biosensors, implantable optical biosensing systems, and micro- and nano-electromechanical systems (M/NEMS). My current research interest is the integration of electrical, optical, and computational approaches to create novel, highly sensitive neural monitoring modalities that enable innovative neuroscience research.