For our updated project goal—after receiving feedback from TAs and a medical specialist—we have decided to focus on administering insulin injections. To support this, the Stretch robot must detect a stationary frame that the patient uses to position their arm, so that it can safely localize the injection site and apply appropriate pressure during the injection.
To improve detection accuracy, we are taking the following steps:
We are 3D printing a stationary frame to help guide the patient’s arm placement. This frame serves as a reference structure for the robot and minimizes variability in arm positioning.
An ArUco marker will be affixed to the frame to support reliable detection, since we already used ArUco markers successfully for glucometer detection in Post #8.
As a backup, we will consider using a brightly colored armband (e.g., neon green or orange) in case marker tracking becomes unstable due to motion blur or occlusion.
Feasibility:
These solutions are low-cost, lightweight, and non-invasive, making them practical in a medical assistance context. ArUco markers are well-supported in ROS2 via OpenCV libraries, and HSV thresholding can be applied for color-based segmentation if we fall back to the armband.
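To make this concrete, here is a minimal sketch of both detection paths using OpenCV's Python bindings (assuming OpenCV ≥ 4.7 for the `ArucoDetector` API); the marker dictionary and the HSV bounds are placeholder assumptions we would tune on the real setup:

```python
import cv2
import numpy as np

# ArUco detection (OpenCV >= 4.7 API); the 4x4_50 dictionary is an assumption.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def detect_frame_marker(bgr_image):
    """Return the pixel center of the first detected marker, or None."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    return corners[0][0].mean(axis=0)  # (x, y) center of the marker

# Backup path: HSV threshold for a neon-green armband. The bounds below are
# placeholders that would need tuning under our actual lighting conditions.
def detect_armband(bgr_image, lower=(40, 80, 80), upper=(80, 255, 255)):
    """Return the centroid of the thresholded mask, or None if empty."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```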
To further reduce the risk of incorrect localization, we will integrate human supervision:
The robot will display its camera feed through a teleoperation interface designed for the patient's needs.
A caregiver or the user can click on the upper arm region to confirm or adjust the proposed injection site (see the sketch after this list).
Alternatively, we may leverage tools like Meta’s Segment Anything, which can segment a target region from a user-drawn bounding box or a single clicked point with minimal effort.
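As a sketch of how the click-to-confirm step could work, the snippet below uses a plain OpenCV window and mouse callback; our actual teleoperation interface may differ, so treat the window name and key bindings as assumptions:

```python
import cv2

clicked_point = None  # set by the mouse callback

def on_click(event, x, y, flags, param):
    global clicked_point
    if event == cv2.EVENT_LBUTTONDOWN:
        clicked_point = (x, y)

def confirm_site(frame, proposed_xy):
    """Show the feed with the proposed site circled; a click adjusts it,
    Enter accepts it, Esc aborts. Returns a pixel (x, y) or None."""
    global clicked_point
    clicked_point = None
    display = frame.copy()
    cv2.circle(display, proposed_xy, 8, (0, 0, 255), 2)
    cv2.namedWindow("confirm site")
    cv2.setMouseCallback("confirm site", on_click)
    while True:
        cv2.imshow("confirm site", display)
        key = cv2.waitKey(30) & 0xFF
        if clicked_point is not None:   # caregiver adjusted the site
            cv2.destroyWindow("confirm site")
            return clicked_point
        if key == 13:                   # Enter accepts the proposal
            cv2.destroyWindow("confirm site")
            return proposed_xy
        if key == 27:                   # Esc aborts
            cv2.destroyWindow("confirm site")
            return None
```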
Feasibility:
This approach builds on our teleoperation interface. It is especially helpful in supervised environments and aligns with standard safety practices in healthcare. While it introduces some overhead, the additional layer of verification significantly increases reliability and safety.
Our long-term goal is to detect the upper arm autonomously, without environmental modifications or human input.
We plan to explore pose estimation frameworks such as MediaPipe, OpenPose, or BlazePose to detect body landmarks, including the upper arm.
These models can be adapted to work on seated patients with minimal movement.
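A rough sketch of the idea using MediaPipe's legacy Solutions API: take the shoulder and elbow landmarks and use their midpoint as a first estimate of the upper-arm region (the midpoint heuristic is our assumption, not part of MediaPipe):

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def upper_arm_midpoint(bgr_image, left_side=True):
    """Estimate the upper-arm midpoint as the mean of the shoulder and
    elbow landmarks; returns normalized (x, y) or None if no pose found."""
    with mp_pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks is None:
        return None
    lm = results.pose_landmarks.landmark
    shoulder = lm[mp_pose.PoseLandmark.LEFT_SHOULDER if left_side
                  else mp_pose.PoseLandmark.RIGHT_SHOULDER]
    elbow = lm[mp_pose.PoseLandmark.LEFT_ELBOW if left_side
               else mp_pose.PoseLandmark.RIGHT_ELBOW]
    return ((shoulder.x + elbow.x) / 2, (shoulder.y + elbow.y) / 2)
```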
Challenges:
Variability due to lighting, clothing, body type, and posture.
The Stretch robot’s camera may also be a limiting factor, since its field of view is not wide enough for robust pose estimation.
Planned Steps:
Prototype detection using off-the-shelf models.
Collect a small custom dataset of upper-arm injection postures if needed.
Evaluate model accuracy and inference latency on the Stretch robot’s onboard compute system.
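For the latency evaluation, a simple harness like the hypothetical one below, run over recorded frames on the robot's onboard computer, should be enough for a first comparison across models:

```python
import time

def mean_latency_ms(detect_fn, frames, warmup=5):
    """Average per-frame latency of a detector over recorded frames,
    with a short warm-up so model loading is not counted."""
    for f in frames[:warmup]:
        detect_fn(f)
    start = time.perf_counter()
    for f in frames:
        detect_fn(f)
    return (time.perf_counter() - start) / len(frames) * 1000.0
```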
In case of perception failure, the system will:
Fall back to human-in-the-loop confirmation (highest safety).
Revert to marker-based or color-based detection (quick recovery).
Abort the injection and notify the user/caregiver for manual intervention.
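Tying the fallbacks together, the decision cascade might look like the sketch below; `detect_frame_marker`, `detect_armband`, and `confirm_site` are the hypothetical helpers sketched earlier in this post, and `notify_caregiver` is a placeholder stub:

```python
def notify_caregiver(message):
    # Placeholder: in practice this alert would surface in the teleop UI.
    print(f"[ALERT] {message}")

def recover_injection_site(frame):
    """Apply the fallbacks above in order; returns a pixel (x, y) or None."""
    # 1) Highest safety: let the caregiver confirm or adjust a default
    #    proposal (here, simply the image center) via the teleop interface.
    h, w = frame.shape[:2]
    site = confirm_site(frame, (w // 2, h // 2))
    if site is not None:
        return site
    # 2) Quick recovery: marker-based detection, then color-based.
    site = detect_frame_marker(frame)
    if site is None:
        site = detect_armband(frame)
    if site is not None:
        return tuple(int(v) for v in site)
    # 3) Abort the injection and escalate to a human.
    notify_caregiver("Perception failed: aborting injection.")
    return None
```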