Robotics-assisted surgery with AR-guided navigation offers clear benefits over conventional open surgery, including higher precision, less pain, lower infection risk, and shorter recovery times. This project aims to facilitate robotics-assisted surgery by providing autonomous robot control for surgical tool tracking and surgical assistance based on visual feedback.
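As a rough illustration of visual-feedback tool tracking, the sketch below applies a classical image-based visual servoing law to a single detected tool-tip feature. The feature detector, depth estimate, and gain are assumptions made for illustration; this is not the project's actual control pipeline.

```python
import numpy as np

def tool_tip_servo(curr_xy, goal_xy, depth, gain=0.5):
    """One image-based visual servoing step toward a desired tool-tip location.

    curr_xy, goal_xy: normalized image coordinates (x, y) of the tool tip,
                      i.e. pixel coordinates mapped through the camera intrinsics.
    depth:            estimated depth Z of the tip along the optical axis (m).
    Returns a 6-DoF camera-frame velocity twist [vx, vy, vz, wx, wy, wz].
    """
    x, y = curr_xy
    Z = depth
    # Interaction (image Jacobian) matrix for a single point feature.
    L = np.array([
        [-1.0 / Z, 0.0,       x / Z,  x * y,       -(1.0 + x**2),  y],
        [0.0,      -1.0 / Z,  y / Z,  1.0 + y**2,  -x * y,        -x],
    ])
    error = np.asarray(curr_xy) - np.asarray(goal_xy)
    # Classical IBVS law: drive the feature error to zero.
    return -gain * np.linalg.pinv(L) @ error
```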
A snake's forward locomotion and sidewinding are aided by a non-uniform distribution of ground contact. In this project, we developed a novel control strategy to achieve non-uniform ground contact for robotic snake locomotion by leveraging body compliance and tension. A new version of the robot is under development to incorporate onboard processing and a vision system. This project aims to develop an intelligent snake robot that can autonomously navigate in the wild.
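The project's strategy relies on body compliance and tension rather than purely kinematic shaping, but the idea of non-uniform contact can be sketched at the gait level as a sidewinding wave whose lifting (dorsal) amplitude is weighted differently along the body. All parameter names and values below are illustrative assumptions, not the robot's actual controller.

```python
import numpy as np

def sidewinding_angles(t, n_joints=12, spatial_freq=2 * np.pi / 6,
                       temporal_freq=np.pi, amp_lateral=0.6, amp_dorsal=0.3,
                       phase_offset=np.pi / 2, contact_weight=None):
    """Joint angles (rad) for a sidewinding gait with non-uniform ground contact.

    Even joints bend laterally and odd joints dorsally; the dorsal wave lifts
    body segments, and per-joint weights skew which segments stay in contact.
    """
    idx = np.arange(n_joints)
    if contact_weight is None:
        # Example: lift more toward the tail so contact concentrates near the head.
        contact_weight = np.linspace(0.5, 1.5, n_joints)
    lateral = amp_lateral * np.sin(spatial_freq * idx + temporal_freq * t)
    dorsal = amp_dorsal * contact_weight * np.sin(
        spatial_freq * idx + temporal_freq * t + phase_offset)
    # Interleave the two waves over alternating joints.
    return np.where(idx % 2 == 0, lateral, dorsal)
```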
A robot's capability for physical interaction is a critical component of fluent human-robot collaboration. However, physical interaction is not always available due to hardware limitations and safety concerns. As an alternative, we develop a virtual admittance control framework that introduces augmented reality-based human-robot interaction, using an RGB-D camera and a hand motion tracking algorithm.
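The core of such a framework can be summarized by the standard admittance relation M x_dd + D x_d + K x = f, where f is a virtual interaction force inferred from the AR hand-tracking interface rather than measured by a force/torque sensor. Below is a minimal sketch of that update loop under those assumptions; the gains and the way the virtual force is produced are placeholders, not the project's tuned implementation.

```python
import numpy as np

class VirtualAdmittance:
    """Per-axis virtual admittance: M x_dd + D x_d + K x = f_virtual.

    The virtual force would come from an AR interface (e.g. hand pose from an
    RGB-D camera) instead of a force/torque sensor; here it is just an input.
    """
    def __init__(self, mass=2.0, damping=20.0, stiffness=50.0, dt=0.01):
        self.M, self.D, self.K, self.dt = mass, damping, stiffness, dt
        self.offset = np.zeros(3)   # displacement from the reference pose
        self.vel = np.zeros(3)

    def step(self, f_virtual, ref_pos):
        """Integrate the admittance dynamics one step; return the commanded position."""
        acc = (np.asarray(f_virtual) - self.D * self.vel - self.K * self.offset) / self.M
        self.vel += acc * self.dt            # semi-implicit Euler integration
        self.offset += self.vel * self.dt
        return np.asarray(ref_pos) + self.offset
```

In use, f_virtual could be set proportional to the tracked hand's displacement from the end-effector whenever the hand is within an interaction region, so the robot yields to the "push" exactly as it would to a physical force.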
Bone position localization from intra-operative X-ray images is necessary for various surgical applications, including autonomous robotic surgery and image-guided surgical navigation. When a pre-operative CT image is available, localization can be achieved through 2D/3D registration between the X-ray images and the pre-operative CT image. We develop an AI-based algorithm for fast 2D/3D registration between CT and X-ray images for real-time assistance in the operating room.
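For context, the sketch below shows the classical intensity-based formulation of 2D/3D registration that an AI-based method would accelerate or initialize: search over a 6-DoF pose so that a digitally reconstructed radiograph (DRR) of the CT matches the X-ray. The render_drr callable is a hypothetical placeholder for a DRR renderer; this is a baseline sketch, not the project's learning-based algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def ncc(a, b):
    """Normalized cross-correlation between two images (higher = more similar)."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def register_2d3d(ct_volume, xray, render_drr, init_pose):
    """Intensity-based 2D/3D registration baseline.

    render_drr(ct_volume, pose) is a user-supplied (hypothetical) DRR renderer;
    pose = (tx, ty, tz, rx, ry, rz). Returns the pose maximizing image similarity.
    """
    def cost(pose):
        drr = render_drr(ct_volume, pose)
        return -ncc(drr, xray)               # maximize similarity = minimize -NCC

    result = minimize(cost, np.asarray(init_pose, dtype=float),
                      method="Powell", options={"xtol": 1e-3})
    return result.x
```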
Continuum surgical robots are well suited to minimally invasive surgery thanks to their lightweight, slender structures. Our research includes developing new continuum mechanisms, modeling and controlling existing continuum robots, and optimal motion planning for safe operation.
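As a modeling example, a single continuum section is commonly described under the constant-curvature assumption, which maps arc parameters (curvature, bending-plane angle, arc length) to a tip pose. The sketch below implements that standard forward kinematics; it is illustrative and not specific to our mechanisms.

```python
import numpy as np

def rotz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def roty(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def constant_curvature_fk(kappa, phi, length):
    """Tip pose of one continuum section under the constant-curvature assumption.

    kappa: curvature (1/m), phi: bending-plane angle (rad), length: arc length (m).
    Returns the 4x4 homogeneous transform from the section base to its tip.
    """
    T = np.eye(4)
    if abs(kappa) < 1e-9:
        T[:3, 3] = [0.0, 0.0, length]        # straight section
        return T
    theta = kappa * length                    # total bending angle
    T[:3, :3] = rotz(phi) @ roty(theta) @ rotz(-phi)
    T[:3, 3] = [np.cos(phi) * (1.0 - np.cos(theta)) / kappa,
                np.sin(phi) * (1.0 - np.cos(theta)) / kappa,
                np.sin(theta) / kappa]
    return T
```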