Automatic 3D reconstruction from DICOM data for Neurosurgical simulations


Category: Volume Visualization

Residents of the Neurosurgery Department at the University of Illinois Hospital are trained on surgical procedures using haptics-based virtual reality simulations. Producing such simulations takes a long time and still depends on partners at the College of Engineering. This project aimed at developing methods to solve these problems through automation.

This project was completed as part of a course in Virtual Manufacturing. Aimed at the challenges involved in haptics-based neurosurgical simulations, it introduced a method to rapidly build independent 3D models of parts of the head (a rough reconstruction sketch follows the list below):

a) Skull

b) Ventricles

c) Brain

d) Outer skin
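
As a rough illustration of the reconstruction step, the sketch below stacks a DICOM series into a volume and extracts a surface mesh with marching cubes. The directory name, the Hounsfield threshold used for the skull, and the overall structure are assumptions made for illustration; this is not the project's exact pipeline.

    # Minimal sketch: DICOM series -> triangle mesh (assumed pipeline, not the project's code)
    import numpy as np
    import pydicom
    from pathlib import Path
    from skimage import measure

    def load_ct_volume(dicom_dir):
        """Read a CT series, sort slices by position, and convert to Hounsfield units."""
        slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array for s in slices]).astype(np.float32)
        # Rescale raw pixel values to Hounsfield units using the DICOM rescale tags.
        return volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)

    def extract_mesh(volume, threshold):
        """Run marching cubes at a fixed iso-value to get mesh vertices and faces."""
        verts, faces, normals, values = measure.marching_cubes(volume, level=threshold)
        return verts, faces

    hu = load_ct_volume("head_ct/")  # hypothetical directory of .dcm files
    skull_verts, skull_faces = extract_mesh(hu, threshold=300.0)  # ~300 HU as a rough bone threshold (assumption)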


Automation was achieved by means of a simple algorithm developed for the project. That algorithm in turn uses the snake (active contour) algorithm to obtain the exact shape of each structure.
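
For reference, a snake can be run on a single slice with scikit-image's active contour implementation. The sketch below is only illustrative: the slice, the circular initialization, and the parameter values are assumptions, not the settings used in the project.

    # Illustrative snake (active contour) on one 2D slice; parameter values are guesses.
    import numpy as np
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    def run_snake(slice_2d, center_rc, radius, n_points=200):
        """Fit an active contour starting from a circle around an assumed center."""
        t = np.linspace(0, 2 * np.pi, n_points)
        init = np.column_stack([center_rc[0] + radius * np.sin(t),
                                center_rc[1] + radius * np.cos(t)])  # (row, col) points
        smoothed = gaussian(slice_2d, sigma=2, preserve_range=True)
        # alpha and beta control contour tension and rigidity.
        return active_contour(smoothed, init, alpha=0.015, beta=10.0, gamma=0.001)

    # contour = run_snake(hu[60], center_rc=(256, 256), radius=120)  # hypothetical slice and seed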

Algorithm in short

1) Split the mesh into 8 octants, numbered 1 to 8.
2) Calculate the voxel density in each octant.
3) Express each octant's voxel density as a percentage of the whole mesh.
4) Place the n seed bubbles in the octants with the n highest voxel densities.
5) Proceed to segmentation.
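
A minimal sketch of the seed-placement logic is given below. It assumes the mesh has already been voxelized into a binary NumPy volume and that each bubble is placed at the centroid of its octant's occupied voxels; both are assumptions made for illustration.

    # Sketch of octant-based seed placement on a binary voxel volume (assumed representation).
    import numpy as np

    def place_bubbles(voxels, n_bubbles):
        """Return centroid coordinates for the n octants with the highest voxel density."""
        zc, yc, xc = (d // 2 for d in voxels.shape)
        octants = []  # (occupied-voxel count, (z-slice, y-slice, x-slice)) per octant
        for zs in (slice(0, zc), slice(zc, None)):
            for ys in (slice(0, yc), slice(yc, None)):
                for xs in (slice(0, xc), slice(xc, None)):
                    octants.append((voxels[zs, ys, xs].sum(), (zs, ys, xs)))
        total = voxels.sum()
        # Voxel density of each octant as a percentage of the whole mesh.
        densities = [100.0 * count / total for count, _ in octants]
        top = np.argsort(densities)[::-1][:n_bubbles]
        seeds = []
        for i in top:
            _, (zs, ys, xs) = octants[i]
            zz, yy, xx = np.nonzero(voxels[zs, ys, xs])
            # Centroid of occupied voxels, shifted back to whole-volume coordinates.
            seeds.append((zz.mean() + (zs.start or 0),
                          yy.mean() + (ys.start or 0),
                          xx.mean() + (xs.start or 0)))
        return seeds

The returned seed positions would then initialize the snake segmentation described above.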

A few screenshots from the project



Video: "Untitled" by arun rakesh yoganandan on Vimeo.