These were some experiments I tried in viewing medical data in 3D, given only the scan slices, ~circa 2010. Most modern equipment does much better and some can display 3D views, but this was still a fun exercise in seeing what I could do with flat slices and custom tools.
A number of years ago, I had my sinuses scanned (to find out why there was no air getting through them). Do you have any idea how easy it is to understand scans if you're not a medical professional? The answer is: not easy at all.
The first hurdle at the time was simply getting the scans into a computer. That meant calling in a favour from a friend at work who had a large-format scanner for the X-ray-like transparency I was given, and then figuring out how to scan the transparent, reflective bugger!
Jump ahead to two years ago: I fell awkwardly while ice-skating, and managed to break my left shoulder. I have a couple of pins and a screw in there now, all for just a really small piece of bone - actually part of the shoulder blade.
By the time I got the shoulder scans (in a beautiful digital format that I didn't have to worry about scanning) I already had a solution, which I present here.
The first challenge was getting it all working in 3D. Luckily for me, the distance between the slices and the scale are printed on the slides, so laying everything out in 3D at the right scale wasn't too hard. I thought a simple stack of images in any modelling software might be okay, but I was way wrong: I needed transparency.
Long story short, I managed to manually apply transparency (supported in .png image files) to the images, keeping only the white sections and making the black invisible. Remember how 3D websites were supposed to be the next big thing in the 1990s? No? You're not alone. Anyway, just as HTML describes web pages, VRML (.wrl) files are simple text descriptions of a 3D scene, and though they are dinosaurs now, the same concept lives on in modern X3D files. From a few examples and a little hacking, I managed to make a hand-coded .x3d file, and stumbled upon the answer! The problems: all of the image processing took time, and as you can see, the contrast was all wrong and the images had text and other overlay nonsense. And see how my teeth don't seem to progress normally? It turns out that the images weren't aligned!
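For a flavour of what such a hand-coded file looks like, here is a minimal sketch of an X3D scene stacking two textured slices; the filenames and the 0.3-unit spacing are made up for illustration (the real spacing came off the printed slides):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<X3D profile='Interchange' version='3.0'>
  <Scene>
    <!-- One textured rectangle per scan slice, offset in Z by the
         slice spacing. PNG alpha makes the black areas invisible. -->
    <Transform translation='0 0 0.0'>
      <Shape>
        <Appearance><ImageTexture url='"slice_000.png"'/></Appearance>
        <IndexedFaceSet solid='false' coordIndex='0 1 2 3 -1'>
          <Coordinate point='-1 -1 0, 1 -1 0, 1 1 0, -1 1 0'/>
        </IndexedFaceSet>
      </Shape>
    </Transform>
    <Transform translation='0 0 0.3'>
      <Shape>
        <Appearance><ImageTexture url='"slice_001.png"'/></Appearance>
        <IndexedFaceSet solid='false' coordIndex='0 1 2 3 -1'>
          <Coordinate point='-1 -1 0, 1 -1 0, 1 1 0, -1 1 0'/>
        </IndexedFaceSet>
      </Shape>
    </Transform>
    <!-- ...and so on, one Transform per remaining slice. -->
  </Scene>
</X3D>
```

`solid='false'` makes each rectangle visible from both sides, which matters once you start orbiting the stack.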
I developed a mask for the images to get rid of the overlays, then aligned them to each other using video tools and a script normally meant for removing camera shake (I'm pretty handy with video processing, you know).
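The core trick behind those de-shake tools is translation estimation by phase correlation, which can be sketched in a few lines of NumPy. This is my own reconstruction of the idea, not the actual script I used:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the (dy, dx) translation of img relative to ref using
    FFT phase correlation, i.e. img ~= np.roll(ref, (dy, dx), axis=(0, 1)).
    This is the same idea camera de-shake filters are built on."""
    F_ref = np.fft.fft2(ref)
    F_img = np.fft.fft2(img)
    # Normalised cross-power spectrum: its inverse FFT peaks at the shift.
    cross = np.conj(F_ref) * F_img
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap each coordinate into the signed range [-N/2, N/2).
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))
```

Once the per-slice shifts are known, aligning is just rolling (or cropping) each image by the negative of its shift.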
After playing with the contrast settings, the vital step was applying transparency. I didn't have the tools to do this automatically, and if you've had any experience with ImageMagick you'll know it's probably possible to script, but you could spend a month fiddling before it worked out right. A big thank-you to my friend Lachlan Patrick for GreyToAlpha.exe, a small command-line tool which he created just for this task!
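The underlying operation is simple once you have decent image tooling: map each pixel's brightness to its opacity. Here is a NumPy sketch of that mapping; it is my guess at what GreyToAlpha.exe does, and the real tool may differ in details:

```python
import numpy as np

def grey_to_alpha(grey):
    """Turn an 8-bit greyscale slice into RGBA where brightness becomes
    opacity: black pixels go fully transparent, white stays solid.
    (An assumption about GreyToAlpha.exe's behaviour, not its actual code.)"""
    grey = np.asarray(grey, dtype=np.uint8)
    rgba = np.empty(grey.shape + (4,), dtype=np.uint8)
    rgba[..., :3] = grey[..., np.newaxis]  # keep the grey as the colour
    rgba[..., 3] = grey                    # alpha = brightness
    return rgba
```

Saving the result as .png preserves the alpha channel, which is exactly what the X3D textures need.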
The final result was awesome! I've been using FluxPlayer to view my files; it's quick, free, and standards-compliant, and unlike most other model viewers it uses DirectX instead of OpenGL, which means you can not just view the models on screen, you can actually see them in 3D on a stereoscopic display.
The face-on images weren't the only set that I had, by the way; I also had another set taken from below. Combining the two was... unsuccessful. Too hard to get right, unfortunately.
I used all of the same techniques when I got much better data for my broken shoulder.
Interesting trivia: A broken shoulder is no fun, but really not so bad compared to having my tonsils out for example, or really really bad sunburn. It helps that I'm right-handed and I broke my left one, and that I could still use my hand. Many thanks to the staff at the Mater hospital!
The full shoulder model is wonderfully detailed, as you can see. Fluids show up extra-bright, so you can see the blood vessels in my skin above the layers of fat and muscle. The shoulder is surprisingly complex, though! I had to crop the images and use only a subset of them before I could find the main part of the joint inside, and I inverted the images too (so on the right, fluids are dark and bone is bright). That second model is very compelling.
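The inversion step is nothing exotic; on 8-bit slices it is just the per-pixel complement, something along these lines:

```python
import numpy as np

def invert_slice(slice_u8):
    """Complement an 8-bit image so fluids render dark and bone bright."""
    return 255 - np.asarray(slice_u8, dtype=np.uint8)
```

Running the inverted slices back through the grey-to-alpha step then makes bone, rather than fluid, the opaque part of the model.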
The diagnostic clinicians are amazing: my surgeon spotted the problem at a single glance. Less impressive were the early X-rays and the rather impolite woman who did the ultrasound, all of which sent me home with a diagnosis of a simple dislocation. I had to dig deep for the right views to see what had really happened, which you can see below. I've added an annotated image at the right to show you what you're looking at.
There was a good result for me: A successful surgery and full use of my left arm!