Touchless Interaction and Control of Medical Images in Surgery

Project description and objectives:

The number of graphical systems considering touchless interaction and control of medical images in surgical settings has recently grown significantly. In particular, two new gesture-based input devices have contributed substantially to this growth: the Kinect sensor (based on an infrared laser projector and a horizontally displaced infrared camera) and the Leap Motion controller (which emits a cone of infrared light and tracks hand and finger positions with a pair of cameras). Both devices enable touchless interaction with medical images in surgery and offer important advantages, such as maintaining sterility within the operating theatre. However, direct control over image manipulation and navigation during surgery requires a much richer set of options beyond select, rotate, pan, and zoom. Using these two input devices, the project aims to create a graphical system that serves as a supporting tool for 3D gesture control and touchless interaction during a specific surgical procedure. We also plan to evaluate the system systematically on real case studies.
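To make the gesture-to-command mapping concrete, the sketch below shows one way a touchless viewer might translate recognized gestures into image-view transforms such as pan, zoom, and rotate. The gesture names (`pinch`, `swipe_x`, `circle`) and the `ViewState` structure are hypothetical placeholders, not part of the Kinect or Leap Motion APIs; an actual implementation would receive events from the device SDK.

```python
from dataclasses import dataclass

@dataclass
class ViewState:
    """Hypothetical camera state for a medical image viewer."""
    zoom: float = 1.0
    pan_x: float = 0.0
    pan_y: float = 0.0
    rotation_deg: float = 0.0

def apply_gesture(state: ViewState, gesture: str, amount: float) -> ViewState:
    """Map a recognized hand gesture to a view transform.

    Gesture names are illustrative only; a real system would map
    events reported by the Kinect or Leap Motion SDK instead.
    """
    if gesture == "pinch":        # fingers apart/together -> zoom in/out
        state.zoom = max(0.1, state.zoom * (1.0 + amount))
    elif gesture == "swipe_x":    # horizontal hand swipe -> pan
        state.pan_x += amount
    elif gesture == "swipe_y":    # vertical hand swipe -> pan
        state.pan_y += amount
    elif gesture == "circle":     # circular finger motion -> rotate
        state.rotation_deg = (state.rotation_deg + amount) % 360.0
    return state

view = ViewState()
apply_gesture(view, "pinch", 0.5)    # zoom becomes 1.5
apply_gesture(view, "swipe_x", 10.0) # pan_x becomes 10.0
apply_gesture(view, "circle", 90.0)  # rotation becomes 90 degrees
```

A dispatcher of this kind keeps device-specific gesture recognition separate from the viewer's transform logic, which would also make it easier to extend beyond the basic select/rotate/pan/zoom set that the project identifies as insufficient.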

Supported by: