Hand bone segmentation and animation

Registration-based segmentation with articulated model from multi-postural magnetic resonance images for hand bone motion animation

Hsin-Chen Chen^a, I-Ming Jou^c, Chien-Kuo Wang^d, Fong-Chin Su^e, Yung-Nien Sun^a,b,*

a Department of Computer Science and Information Engineering, National Cheng Kung University, No. 1, University Road, Tainan 701, Taiwan, ROC
b Department of Computer Science and Information Engineering, National Pingtung Institute of Commerce, No. 51, Min Sheng E. Road, Pingtung 900, Taiwan, ROC
c Department of Orthopedics, College of Medicine, National Cheng Kung University, No. 138, Sheng Li Road, Tainan 704, Taiwan, ROC
d Department of Radiology, National Cheng Kung University Hospital, No. 138, Sheng Li Road, Tainan 704, Taiwan, ROC
e Institute of Biomedical Engineering, National Cheng Kung University, No. 1, University Road, Tainan 701, Taiwan, ROC

Automatic segmentation of hand bones

FIG. 1. Five grasping postures. (a) Neutral posture (posture 1); (b)-(d) intermediate postures (postures 2-4) between the neutral and cage postures; (e) cage posture (posture 5); (f)-(j) sagittal cross sections selected from the MR volumetric images of postures 1 to 5, respectively.

FIG. 2. Segmentation results of hand bones from the validation postural images using the proposed method. The top row shows the surfaces of the segmented bones. The middle and bottom rows show the segmented bone contours overlaid on the corresponding cross-sectional images.

FIG. 3. Segmentation results of hand bones from the ball-grabbing postural image.

FIG. 4. Segmentation results of hand bones from multi-postural images of different subjects.

Hand motion animations

1. Motion sequence interpolation from neutral posture to posture 3 and then from posture 3 to cage posture

2. Motion sequence interpolation from neutral posture to ball-grabbing posture

3. Motion sequence interpolation from neutral posture to cage posture

4. Dynamic MR image simulation of 3D hand motion

5. Middle finger active motion sequence generated by applying the motion data of a trigger finger patient, tracked by an optical motion capture system, to the proposed hand model
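The interpolated motion sequences listed above (items 1-3) can be sketched, under stated assumptions, by blending per-bone rigid transforms between two registered postures: spherical interpolation for the rotations and linear interpolation for the translations. The Python sketch below is illustrative only; the function and variable names (interpolate_bone, R_neutral, R_cage, etc.) are not from the original paper.

```python
# Minimal sketch: generate intermediate animation frames for one bone by
# blending its rigid transform between two registered postures.
# Assumption: each posture stores a 3x3 rotation matrix and a 3-vector
# translation per bone, obtained from the registration step.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_bone(R0, t0, R1, t1, num_frames=10):
    """Yield (rotation matrix, translation) pairs blending posture 0 -> posture 1."""
    key_rots = Rotation.from_matrix(np.stack([R0, R1]))
    slerp = Slerp([0.0, 1.0], key_rots)           # spherical interpolation of rotations
    for s in np.linspace(0.0, 1.0, num_frames):
        R_s = slerp(s).as_matrix()                # intermediate rotation
        t_s = (1.0 - s) * t0 + s * t1             # linear blend of translations
        yield R_s, t_s

# Example: animate one bone from a neutral posture to a cage-like posture
# (hypothetical transforms used purely for illustration).
R_neutral, t_neutral = np.eye(3), np.zeros(3)
R_cage = Rotation.from_euler("z", 40, degrees=True).as_matrix()
t_cage = np.array([5.0, -2.0, 1.0])
frames = list(interpolate_bone(R_neutral, t_neutral, R_cage, t_cage, num_frames=5))
```

Applying this per bone, with the joint hierarchy constraining each child bone's transform, yields smooth in-between frames such as the neutral-to-cage sequence; the paper's own articulated-model constraints are not reproduced here.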