Open-source High-precision Autonomous Suturing Framework With Visual Guidance
Hongbin Lin*, Bin Li*, Yunhui Liu and Kwok Wai Samuel Au
(* means equal contribution)
Autonomous surgery has attracted increasing attention for its potential to revolutionize robotic patient care, yet it remains a distant and challenging goal. In this paper, we propose an image-based framework for high-precision autonomous suturing. We first develop an algebraic geometric algorithm for accurate needle pose estimation, then design a corresponding keypoint-based calibration network for joint-offset compensation, and finally plan and control the suture trajectory. Our solution ranked first among all competitors in the AccelNet Surgical Robotics Challenge. The source code is available on this web page to accelerate future research on autonomous surgery.
[Videos: Challenge 2; Challenge 3, Case 1 (no calibration error); Challenge 3, Case 2 (no calibration error); Challenge 3, Case 3 (no calibration error)]
Here we summarize our results on the testbench of the AccelNet Challenge 2021-2022:
Challenge 1:
The overall score of our method ranges from 0.002 to 0.015 simulation units.
Challenge 2:
We first calibrated PSM2 and then performed needle picking and needle insertion. Our calibration error for each joint of the PSM has a mean of 0.2 degrees and a standard deviation of 0.2 degrees. We completed Challenge 2 in around 25 seconds, within the requirements of the task evaluation. Check out the video above to see the performance.
Challenge 3:
We first calibrated PSM1 and PSM2 and then performed needle insertion, extraction, picking, and re-grasping.
Since the task was very complex, we finished Challenge 3 in around 4 to 5 minutes.
Unfortunately, we only succeeded in Challenge 3 under the no-calibration-error and very-small-calibration-error conditions. Although our calibration achieves small errors (0.2 ± 0.2 degrees), the task requires extremely high calibration accuracy and sometimes fails when the residual error is larger (say, 0.3 degrees for some joints).
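To make the accuracy requirement concrete, here is a rough sketch of how residual joint offsets propagate to the instrument tip under a small-angle approximation. The offset values and the lever arm below are illustrative assumptions, not measured quantities from our system:

import numpy as np

# Illustrative residual joint offsets after calibration (degrees); assumed values.
residual_offsets_deg = np.array([0.3, 0.1, 0.2, 0.3, 0.1, 0.2])

# Rough lever arm from a joint axis to the needle tip (metres); an assumed
# order-of-magnitude value, not a measured dVRK parameter.
lever_arm_m = 0.1

# Small-angle approximation: tip error ~= angle (rad) * lever arm.
tip_error_mm = np.deg2rad(residual_offsets_deg) * lever_arm_m * 1000.0
print(tip_error_mm)  # a 0.3-degree offset alone contributes ~0.5 mm of tip error

At sub-millimetre suturing tolerances, even a few tenths of a degree per joint can therefore make the difference between success and failure.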
Our results are fully reproducible using our code.
Here we want to discuss the challenges and limitations to help researchers understand the problem and explore future directions.
Joint position controller is too "soft" in simulation:
We found that the low-level PID joint-position controllers are too soft, resulting in large errors between the desired and actual joint positions of the PSMs (even when there is no calibration error!). This is purely due to the low gains of the PID controllers.
To solve this problem, we add a high-level PID controller in our framework to minimize this error. When tuning its parameters, there is a tradeoff between high accuracy and small oscillation. In Challenges 2 and 3, high accuracy is required while small oscillation is acceptable, so we biased the tuning toward high accuracy within acceptable oscillation. You will see some small oscillations of the PSMs in the videos of Challenges 2 and 3. A minimal sketch of this outer loop follows.
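This sketch assumes a CRTK-style joint interface (measured_jp / servo_jp); the interface names and gains are illustrative assumptions, not our released implementation:

import numpy as np

class OuterLoopPID:
    """Outer-loop PID that offsets the commanded setpoint to cancel the
    steady-state error left by the soft low-level joint controllers."""

    def __init__(self, kp, ki, kd, dt, n_joints=6):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros(n_joints)
        self.prev_err = np.zeros(n_joints)

    def command(self, q_desired, q_measured):
        err = q_desired - q_measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        # The correction is applied to the setpoint rather than as a torque,
        # so it simply drives the low-level PID harder toward the true target.
        return q_desired + self.kp * err + self.ki * self.integral + self.kd * deriv

# Hypothetical usage with a CRTK-style arm handle `psm`:
# pid = OuterLoopPID(kp=0.8, ki=0.1, kd=0.01, dt=0.01)
# while np.linalg.norm(q_desired - psm.measured_jp()) > tol:
#     psm.servo_jp(pid.command(q_desired, psm.measured_jp()))

Raising the integral gain shrinks the steady-state error, but it is also the main source of the small oscillations mentioned above.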
On the real robot, thanks to the da Vinci Research Kit (dVRK), this kind of error can be minimized by the dVRK's low-level control. In that case, the high-level PID controller might not be necessary.
Things that we cannot do:
Here we want to discuss the limitations of our method:
We cannot do online calibration.
From our point of view, online calibration is necessary for pose-estimation-based control with the dVRK. Many factors (including cable tension, sensor drift, etc.) lead to time-variant joint measurement errors. We recommend that readers who want to explore online calibration for the dVRK read this paper.
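For illustration only, here is one plausible form an online calibration scheme could take: a recursive estimator that tracks slowly drifting joint offsets from the discrepancy between encoder readings and an external (e.g., vision-based) estimate of the joint state. This is an assumed scheme, not a method from this work or from the paper referenced above:

import numpy as np

class OnlineOffsetEstimator:
    def __init__(self, n_joints=6, alpha=0.05):
        self.offset = np.zeros(n_joints)
        self.alpha = alpha  # smoothing factor: small values give slow, stable updates

    def update(self, q_encoder, q_vision):
        # Instantaneous offset sample from the discrepancy between encoder
        # readings and a vision-based estimate of the same joint state.
        sample = q_encoder - q_vision
        # An exponential moving average tracks slow drift (cable tension,
        # sensor drift) while rejecting per-frame vision noise.
        self.offset = (1.0 - self.alpha) * self.offset + self.alpha * sample
        return self.offset

The compensated joint reading is then q_encoder minus the current offset estimate, updated continuously instead of once before the task.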
We may not be able to track the needle in real time.
We cannot track the needle in real time for high-accuracy pose estimation. We believe online needle pose estimation would be valuable for reducing the required time and increasing the robustness of an autonomous suturing system.
Our next steps:
We learned a lot from participating in the AccelNet Challenge 2021-2022. Achieving autonomous suturing is extremely hard, even for tasks that seem simple from a human's perspective.
We are exploring learning-based approaches to autonomous suturing tasks on the testbench of the AccelNet Challenge. Stay tuned for our upcoming publications.
We won the first-place prize in the 2022 AccelNet Surgical Robotics Challenge.
@inproceedings{hlin2022Accel,
title={{Open-source High-precision Autonomous Suturing Framework With Visual Guidance}},
author={Lin, Hongbin and Li, Bin and Liu, Yunhui and Au, Kwok Wai Samuel},
booktitle={IEEE Int. Conf. Intell. Robots and Syst. (IROS) Workshop on ``A Panacea Or An Alchemy? Benefits And Risks Of Robot Learning In Medical Applications''},
year={2022}
}