Traditional intraoperative imaging instruments display images on 2D monitors, which can draw the surgeon's attention away from the patient and compromise hand-eye coordination. We have invented and developed a wearable optical imaging and AR display system (i.e., the Smart Goggle) for guiding surgeries. Real-time 3D fluorescence image guidance can be visualized in AR. In addition, multiscale imaging involving in vivo microscopy and multimodal imaging using ultrasound can also be integrated. Clinical translation of our systems is ongoing at multiple leading hospitals, including the Cleveland Clinic.
Surgeons need surgical navigation and intraoperative imaging for accurate surgery. However, current clinical surgical navigation systems have many drawbacks, including the need for fiducial markers, the lack of real-time registration updates, and the lack of functional imaging. We have developed a novel system that delivers simultaneous real-time multimodal optical imaging, CT-to-optical image registration, and dynamic, fiducial-free surgical navigation. The system integrates real-time intraoperative optical imaging with dynamic CT-based surgical navigation, offering complementary functional and structural information for surgical guidance. To the best of our knowledge, this is the first report of a system capable of concurrent intraoperative fluorescence imaging and dynamic, real-time, CT-based navigation in which the multiple modalities are accurately registered.
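To illustrate the kind of rigid alignment underlying CT-to-optical registration, the sketch below shows a minimal least-squares (Kabsch) fit between two 3D point sets, assuming paired correspondences between the CT frame and the optical camera frame are already available. This is a generic textbook step for illustration only, not our fiducial-free, dynamically updated registration method.

```python
# Minimal illustrative sketch (assumed inputs): rigid registration of a CT
# point set onto an optical point set via the Kabsch algorithm.
import numpy as np

def rigid_register(ct_pts: np.ndarray, optical_pts: np.ndarray):
    """Return rotation R and translation t such that R @ p_ct + t ~ p_optical."""
    ct_c = ct_pts.mean(axis=0)
    opt_c = optical_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (ct_pts - ct_c).T @ (optical_pts - opt_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = opt_c - R @ ct_c
    return R, t

# Toy usage: recover a known rotation/translation from synthetic points.
rng = np.random.default_rng(0)
ct = rng.normal(size=(50, 3))
theta = np.deg2rad(30)
true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
true_t = np.array([5.0, -2.0, 1.0])
optical = ct @ true_R.T + true_t
R, t = rigid_register(ct, optical)
print(np.allclose(R, true_R), np.allclose(t, true_t))  # True True
```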
We have investigated and developed deep-learning-based methods for medical image segmentation and multimodal image registration. These methods enable automatic spine segmentation. In addition, we have developed a novel method for multimodal optical image registration.
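As a brief illustration of how automatic segmentations are typically evaluated against manual labels, the sketch below computes the Dice similarity coefficient on toy binary masks. This is a standard, generic metric and not a description of our specific network architecture or training pipeline.

```python
# Illustrative sketch: Dice overlap between a predicted segmentation mask
# (e.g., a vertebra) and a manual reference mask.
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks of the same shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return float(2.0 * intersection / (pred.sum() + truth.sum() + eps))

# Toy 2D example: two slightly offset square masks overlap with Dice ~ 0.9.
pred = np.zeros((64, 64), dtype=bool);  pred[20:40, 20:40] = True
truth = np.zeros((64, 64), dtype=bool); truth[22:42, 20:40] = True
print(f"Dice = {dice_coefficient(pred, truth):.3f}")
```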
We are collaborating with surgeons from multiple leading hospitals to translate our systems into IRB-approved clinical studies. The technology we have developed can be broadly applied across surgical subspecialties and clinical use cases. We intend to bring our technologies from the benchtop to the patient's bedside.