TIMS: A Tactile Internet-Based Manipulation System with Haptic Guidance for Surgical Training

Authors: Jialin Lin, Xiaoqing Guo, Wen Fan, Wei Li, Yuanyi Wang, Jiaming Liang, Weiru Liu, Lei Wei, Dandan Zhang

Abstract

Microsurgery involves the dexterous manipulation of delicate tissues and fragile structures, such as small blood vessels and nerves, under a microscope. To overcome the limited precision of the human hand, robotic systems have been developed to assist surgeons in performing complex microsurgical tasks with greater precision and safety. However, the steep learning curve of robot-assisted microsurgery (RAMS) and the shortage of well-trained surgeons pose significant challenges to its widespread adoption. The development of a versatile training system for RAMS is therefore necessary and can bring tangible benefits to both surgeons and patients.

In this paper, we present a Tactile Internet-Based Micromanipulation System (TIMS) built on a ROS-Django web-based architecture for microsurgical training. The system provides tactile feedback to operators via a wearable tactile display (WTD), while real-time data is transmitted over the internet through the ROS-Django framework. In addition, TIMS integrates haptic guidance to 'guide' trainees along a desired trajectory provided by expert surgeons. Learning from demonstration based on Gaussian Process Regression (GPR) is used to generate this desired trajectory.

User studies were also conducted to verify the effectiveness of our proposed TIMS, comparing users' performance with and without tactile feedback and/or haptic guidance.

Code is now available at: https://github.com/leen-coding/TIMS_PRIVATE.git

 Motivation:




Contribution:



System design

 Key features:

Novice-friendly; visual-tactile mixed feedback; digitalized operation

Haptic guidance

Haptic guidance allows the operator to manipulate the robot along a predetermined trajectory. If the operator’s manipulation deviates from the predetermined trajectory, the Geomagic Touch robot produces a force to pull the operator’s hand back onto the correct path.
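As a sketch of this pull-back behaviour (not the exact controller used in TIMS), the corrective force can be modelled as a virtual spring toward the nearest point on the reference trajectory; the stiffness and deadband values below are illustrative assumptions:

```python
import math

def guidance_force(tool_pos, trajectory, stiffness=0.5, deadband=0.5):
    """Spring-like corrective force pulling the stylus back to the path.

    tool_pos: current (x, y, z) of the surgical tool / stylus.
    trajectory: list of (x, y, z) waypoints from the expert demonstration.
    stiffness and deadband are illustrative, not the values used in TIMS.
    """
    # Find the nearest waypoint on the reference trajectory.
    nearest = min(trajectory, key=lambda p: math.dist(p, tool_pos))
    deviation = math.dist(nearest, tool_pos)
    if deviation <= deadband:
        return (0.0, 0.0, 0.0)  # within tolerance: render no force
    # Force proportional to the deviation, directed back toward the path.
    return tuple(stiffness * (n - t) for n, t in zip(nearest, tool_pos))
```

In practice such a force command would be sent to the Geomagic Touch at the haptic rendering rate; the deadband avoids vibration when the operator is already close to the path.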

Method:
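The source describes the method only at a high level. As a hedged illustration of how a GPR-based desired trajectory could be generated from demonstrated waypoints, the following pure-Python sketch regresses position on time with an RBF kernel; the length scale and noise level are assumed hyperparameters, and real demonstrations would be multi-dimensional:

```python
import math

def rbf(a, b, length_scale=0.5):
    """Squared-exponential (RBF) kernel; length_scale is an assumed value."""
    return math.exp(-((a - b) ** 2) / (2.0 * length_scale ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gpr_predict(ts, ys, t_star, noise=1e-6):
    """GPR posterior mean at t_star, given demonstrated waypoints (ts, ys)."""
    K = [[rbf(ti, tj) + (noise if i == j else 0.0)
          for j, tj in enumerate(ts)] for i, ti in enumerate(ts)]
    alpha = solve(K, list(ys))  # alpha = (K + noise*I)^{-1} y
    return sum(rbf(t_star, ti) * ai for ti, ai in zip(ts, alpha))
```

Sampling `gpr_predict` on a dense time grid yields a smooth desired trajectory that interpolates the expert demonstration, which the haptic guidance controller can then track.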





Tactile feedback

Tactile feedback is rendered to the operator through the ROS-Django framework once the surgical tool touches the eyeball. In this system, a camera installed at the end of the micro robot captures a side view of the eyeball, showing whether the surgical tool has made contact with the patient's eye. The captured image is then converted into tactile information by a trained object-detection and image-classification model.
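The two-stage conversion can be sketched as follows; `detect_tool` and `classify_contact` are placeholders for the trained detection and classification networks, which the text does not name:

```python
def crop(frame, box):
    """Crop a region (x0, y0, x1, y1) out of a row-major image."""
    x0, y0, x1, y1 = box
    return [row[x0:x1] for row in frame[y0:y1]]

def image_to_tactile(frame, detect_tool, classify_contact):
    """Two-stage visual-to-tactile conversion (sketch).

    detect_tool and classify_contact stand in for the trained
    object-detection and image-classification models, respectively.
    Returns True when the tool is judged to be touching the eyeball.
    """
    box = detect_tool(frame)  # stage 1: locate the tool-tip region
    if box is None:
        return False          # no tool in view: no contact signal
    # stage 2: classify the cropped region as contact / no contact
    return bool(classify_contact(crop(frame, box)))
```

Restricting the classifier to the detected tool-tip region keeps the contact decision focused on the part of the image where contact can actually occur.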

The in-house wearable tactile display (WTD) is built around a pneumatically actuated tactile actuator array. It consists of two film layers made from a pliable material, thermoplastic polyurethane (TPU). Each array contains 16 inflatable actuators arranged in a 2D pattern; the actuators are air pockets with cylindrical chambers 3 mm in diameter. The display measures approximately 30 mm by 20 mm, matching the size of a human fingertip. By injecting air into an actuator's chamber, the user feels a tactile sensation generated by the inflated actuator.
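A minimal sketch of a command for this array is given below; the 4x4 layout and the uniform, intensity-scaled inflation are assumptions, since the text only states that the 16 actuators form a 2D pattern:

```python
def actuator_pattern(intensity, rows=4, cols=4):
    """Inflation command for the 16-chamber WTD array (sketch).

    The 4x4 layout and uniform inflation are assumptions; levels are
    normalized from 0.0 (fully deflated) to 1.0 (fully inflated).
    """
    level = max(0.0, min(1.0, intensity))  # clamp to the valid range
    return [[level] * cols for _ in range(rows)]
```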

In TIMS, once the surgical tool touches the eyeball, the follower side sends a signal to the in-house WTD through the ROS-Django framework, rendering tactile feedback to the user.

System evaluation

To demonstrate the effectiveness of the proposed TIMS, we invited five participants to a user study. The experimental tasks were line following and needle insertion.

Each participant was required to complete the tasks under four conditions to evaluate the effectiveness of tactile feedback and haptic guidance, namely:

• No Feedback (NF),

• With Tactile Feedback (TF),

• With Haptic Guidance (HG),

• With Tactile Feedback and Haptic Guidance (TF&HG).


Poor performance was observed under the No Feedback condition. When Tactile Feedback or Haptic Guidance was introduced, participants' task performance improved significantly, with Haptic Guidance showing greater improvement than Tactile Feedback. 




To assess the efficacy of haptic guidance, we conducted a microsurgical skill assessment.


We hypothesized that subjects would improve after training with the proposed system under haptic guidance. We used standardized metrics to evaluate the effectiveness of haptic guidance for microsurgical skill training, quantifying skill improvement after each task performed with haptic guidance. Specifically, operators performed trajectory following and needle insertion ten times with tactile feedback. We visualize the learning curve of one participant as an example of the value of haptic guidance in microsurgical training. After three rounds of haptic guidance, the trainee's performance quickly exceeded the average; it then continued to improve and stabilized after the fourth round of guidance.
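The text does not define the metric behind the learning curve. One common choice for trajectory-following tasks, assumed here purely for illustration, is the root-mean-square deviation of the executed path from the reference:

```python
import math

def path_rmse(executed, reference):
    """Root-mean-square deviation of an executed path from the reference.

    This metric is an assumption: the summary mentions standardized
    metrics but does not specify them. Each executed point is scored
    by its distance to the nearest reference point.
    """
    errors = [min(math.dist(p, q) for q in reference) for p in executed]
    return math.sqrt(sum(e * e for e in errors) / len(errors))
```

Plotting such a score against the training round index would yield a learning curve like the one described above, with lower values indicating better path-following performance.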



Visualization of the quantitative evaluation results under different experimental settings. 'NF', 'TF', and 'HG' denote 'No Feedback', 'Tactile Feedback', and 'Haptic Guidance', respectively.




Visualization of the trainee learning curve based on Microsurgical Skill Assessment.

Conclusion

In conclusion, we developed TIMS, a system that combines TF and HG. Concretely:


• We used ROS-Django as the communication layer to connect devices over the internet, enabling teleoperation, digital training, and mixed visual-tactile feedback.


• Through Django, we built a web interface for real-time interaction, where operators can view monitoring video, real-time robot trajectories, and robot status.


• We used deep learning methods, namely object detection and image classification, to convert visual information into tactile information.


• We developed an implementation of haptic guidance that provides operators with real-time force feedback based on the current surgical tool position and the trajectory optimized through GPR.


• Finally, we demonstrated the system's usability and the necessity of integrating TF and HG through user studies.


In the future, we aim to optimize the trajectory generation method by implementing advanced imitation learning to generate trajectories from visual observations and to incorporate trustworthiness features. Furthermore, we plan to integrate Mixed Reality technology into the TIMS framework to provide trainees with an immersive operating experience. Additionally, more comprehensive evaluations will be offered to users; for example, the 'Structured Assessment of Robotic Microsurgical Skills' scoring framework will be integrated into the training system. We will invite surgeons to participate in user studies and to perform more complicated tasks such as microvascular anastomosis, membrane peeling, etc.