ACADEMIC INTEREST

 

Human-embedded systems for neurorehabilitation and surgery

EDUCATION

 

University of Southern California                                                              Expected: December 2025

 

Ph.D. in Biomedical Engineering - Machine learning research in Interactive Neurorehabilitation (INR)

 

Virginia Tech                                                                                                          August 2019 – May 2022 (Transferred)

 

Ph.D. in Biomedical Engineering - Machine learning research in Interactive Neurorehabilitation

 

Bangladesh University of Engineering and Technology                    February 2013-September 2017

 

B.S. in Electrical and Electronic Engineering

Thesis: Beat Tracking with Empirical Mode Decomposition-based Tempo Estimation and Dynamic Programming

 

RESEARCH EXPERIENCE

 

Shirley Ryan AbilityLab | Contractor

Chicago, Illinois: May 2024 – December 2024

    Collaborated with clinicians from the Shirley Ryan AbilityLab to design a dashboard that represents a list of movement impairments, their associated quantifiable clusters of kinematics, and a summary of assessment profiles for stroke patients.

    Collaborated with Dr. Arun and his team to understand lower-body kinematics and the process of extracting them using Xsens sensors and OpenSim. Currently developing models that extract high-quality kinematics from fewer sensors.

Interactive Neurorehabilitation Lab | Data Scientist  

Los Angeles, California: August 2022 – December 2025

Blacksburg, Virginia: August 2019 – May 2022

    Collaborated with the Valero Lab to create hand models for understanding hand-object interactions in ARAT exercises

    Developed a Hierarchical Bayesian model to explain the relationship among different layers of movement quality assessment

    Collaborated with the ML team on data analysis, human skeleton extraction with OpenPose, and temporal segmentation of stroke patients' actions with a Hierarchical Bayesian model

Implemented an R-CNN-based human-object interaction algorithm for stroke survivors' movement quality assessment

Developed a better understanding of stroke patients' physical and cognitive impairments through collaboration with physical therapists

Mentored and supervised a multidisciplinary team of undergraduate and Ph.D. students in machine learning research and data analysis

DSP Research Lab | Applied Deep Learning Researcher                          

Dhaka, Bangladesh: April 2018 – February 2019

Proposed and implemented a novel deep-learning architecture for improved ultrasound image reconstruction

    Extracted temporal features from 3D tissue displacement data with a recurrent architecture

    Improved image reconstruction quality by 4–5% by incorporating RNN and 3D-CNN components

    Detected musical beats using empirical mode decomposition

Bangladesh University of Engineering and Technology | Research Engineer

Dhaka, Bangladesh: September 2017 – April 2018

Initiated patient data collection and experiments with multiple CIRS phantoms in a clinical setup in collaboration with radiologists

    Conducted research on the simulation of ultrasound wave propagation and provided clinical support for patient report writing

    Performed analytic Gaussian modeling on ultrasound tissue displacement data, achieving a 6–9% improvement over existing algorithms

 

TEACHING EXPERIENCE

 

University of Southern California | IDSN 543 Augmented Intelligence

January 2025 – May 2025

Designed the course content and syllabus for undergraduate- and graduate-level students.

The course trains students to understand the models and algorithms applied in neurorehabilitation and how they connect to multimodal data.

PROJECTS

 

Cyber Human System for Upper Extremity Stroke Survivors

USC, USA: May 2022 – present

  Capture data of stroke patients performing different rehab exercises using multiple cameras and sensors

   Analyze the data using trained models and generate automated assessment scores and recommendations for expert clinicians

Hand Vein Detection with BPM Estimation Using a Pulse Sensor

BUET, Bangladesh: January 2017 – December 2017

  Decomposed raw sensor data into EMG and ECG signals using Empirical Mode Decomposition

  Estimated BPM and vein coordinates from the decomposed sensor data with 78% accuracy

Hand Vein Detection Using Near-Infrared Imaging

BUET, Bangladesh: January 2017 – December 2017

  Administered project management and front-end development for a mechanical hand that injects medicine and extracts blood from veins

  Introduced a real-time optimization algorithm for vein detection with 92% accuracy

PATENTS

 

Apparatus, methods, and computer products for deep learning-based shear wave imaging (Provisional patent application number: 62816344)

 

GOOGLE SCHOLAR

 

Profile: Tamim Ahmed; citations: 58

 

WEBSITE

 

..\Users\tamim\Desktop\Tamim Ahmed, PhD Candidate.html

 

GRANTS

 

·          National Science Foundation (Grant Number: 2014499), 2021–2025

·          National Institute on Disability, Independent Living, and Rehabilitation Research (Grant Number: 90REGE0010), 2019–2025

·          Applied to J&J MedTech with the proposal "Enhancing surgeons by doing event detection during surgery," 2025

 

HONORS

 

 

NIH and NSF grant-funded Ph.D. (2019–present)

Pratt Fellowship (2019–2020)

Best Innovative Project, ICCIT, 2017

Talent Pool Scholarship, Bangladesh (2013-2017)

 

 

 

PUBLICATIONS

 

    Tamim Ahmed, “A multi-view automated human activity assessment tool for stroke rehabilitation”, under preparation

    Tamim Ahmed, “OpenSim validated 3D reconstructed kinematics from multi-camera setup in stroke rehabilitation”, under preparation

    Jisoo Lee, Tamim Ahmed, "Automatic Temporal Segmentation for Post-Stroke Rehabilitation: Key Point Detection and Temporal Segmentation Approach for Small Datasets," accepted at WACV 2025

  T. Ahmed, T. Rikakis, J. Lee, and P. Turaga, "A Multi-camera Data Segmentation and Imputation Block for Cyber-Human Assessment of Movement in Upper Extremity Stroke Rehabilitation," under review in IEEE Journal of Biomedical and Health Informatics (JBHI), 2024.

    T. Ahmed, T. Rikakis, S. Khan, and A. Kelliher, "Data Acquisition Through Participatory Design for Automated Rehabilitation," under preparation for CSCW 2025.

  T. Ahmed, T. Rikakis, A. Kelliher, and S. L. Wolf, "A Hierarchical Bayesian Model for Cyber-Human Assessment of Movement in Upper Extremity Stroke Rehabilitation," IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2024, doi: 10.1109/TNSRE.2024.3450008.

  Ahmed, Tamim, et al. (2024). "Advances in Computer Vision for Home-Based Stroke Rehabilitation." In Computer Vision: Challenges, Trends, and Opportunities (1st ed.). Chapman and Hall/CRC. https://doi.org/10.1201/9781003328957

  Ahmed, Tamim, et al. 2023. ASAR Dataset and Computational Model for Affective State Recognition During ARAT Assessment for Upper Extremity Stroke Survivors. In Companion Publication of the 25th International Conference on Multimodal Interaction (ICMI '23 Companion). Association for Computing Machinery, New York, NY, USA, 11–15. https://doi.org/10.1145/3610661.3617154

  Tamim Ahmed, Thanassis Rikakis, Aisling Kelliher, and Francisco J. Valero-Cuevas, "Low-cost Capture and Analysis of Movement Quality and Functionality for Adaptive Therapy of Upper Extremity Stroke Survivors," NSF DARE Conference poster, 2023.

  Ahmed, Tamim, et al. "Automated movement assessment in stroke rehabilitation." Frontiers in Neurology (2021): 1396.

  J. Clark, S. Zilevu, T. Ahmed, A. Kelliher, S. Yeshala, S. Garrison, C. Garcia, O. Menezes, M. Seth, and T. Rikakis, "Hybrid Workflow Process for Home Based Movement Capture," forthcoming at ACM IMX 2021.

  Kelliher, Aisling, et al. "Towards Standardized Processes for Physical Therapists to Quantify Patient Rehabilitation." Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 2020.

  Ahmed, Tamim, et al. "Real-time injecting device with automated robust vein detection using near-infrared camera and live video." Global Humanitarian Technology Conference (GHTC), San Jose, CA, 2017.

  Ahmed, Tamim, et al. "Auto-HRID: Automated Heart Rate Monitoring and Injecting Device with Precise Vein Detection." IEEE International WIE Conference on Electrical and Computer Engineering (WIECON-ECE), India, 2017.

 

SKILLS

 

Technical

Machine Learning

Deep Learning

Data Analysis

Signal Processing

Image Processing

Data Observation and Interpretation

 

Personal & Workplace

Learning from observation

Proactive

Collaborative

Team Player

 

Programming Languages

Python

MATLAB

 

Frameworks

Keras

PyTorch

NumPy

scikit-learn

OpenPose

R-CNN

 

LEADERSHIP & VOLUNTARY WORK

 

 

Seminar on Ultrasound Elastography | Presenter

Dhaka, Bangladesh: April 2018

 

WIECON-ECE conference | Presenter

Dehradun, India: December 2017

 

Inter University Robotics Competition | Head Organizer

Dhaka, Bangladesh: August 2017

 

ICCIT | Team Project Showcase

Dhaka, Bangladesh: August 2017

 

COLLABORATION

Carilion Clinic                                                       

Collaborated with clinicians from Carilion Clinic to develop

·          Rubric for movement quality assessment for the SARAH system

·          Rubric for movement quality assessment for the SARAH video segmentation

·          Annotation tool for rating the captured data

Shirley Ryan AbilityLab (SR Lab)

Collaborating with clinicians from SR Lab to develop

·          Rubric for movement quality assessment for the ARAT system

·          Capture tool for recording stroke survivors performing ARAT exercises

·          Calibration tool for camera and activity space calibration

 

Virginia Tech

Collaborating with two PhD students to

·          Develop a fusion model for a multimodal movement quality assessment tool

·          Implement wireless data transfer for Activities of Daily Living (ADL) capture

·          Develop annotation tool for SARAH video rating

Arizona State University

Collaborating with a PhD student to

·          Label SARAH and ARAT objects using CVAT

·          Train an object detection model for SARAH and ARAT

·          Develop a multimodal hybrid segmentation model for segmenting SARAH and ARAT-captured videos

MENTORING

 

  I mentored an M.S. student who was the lead designer of the UI used for annotating the ARAT videos. She needed to understand the biomechanics of the segments and how the different camera views relate to them. Under the supervision of Dr. Rikakis and Dr. Kelliher, I created a rubric defining the duration of the segments and their relationship to the different camera views.

  We hired an undergraduate student to annotate the ARAT videos using our annotation tool. I taught her to use the tool for segmentation and to apply the rubrics to understand the biomechanics of the segments.

  I also mentored a volunteer collaborator who designed the database used to store the annotations. Since he was new to the field, he needed help understanding the data structure and the inter- and intra-relationships between its components.

  I mentored an M.S. student in developing the ARAT capture system, which is, to our knowledge, the world's first ARAT data capture system using RGB cameras.