This project was really just an excuse for me to finally figure out what these HistoPyramid things were and how they work, something I had been obsessed with ever since seeing the "Bunny of Doom" video.
Very cool stuff: they've got real-time isosurface generation on the GPU, plus an awesome particle-system flame effect. All I ever got working was PointList generation and a very simple particle-system explosion. Anyway, I'm finally getting around to posting the notes I took for this project.
First, a quick overview of the process I used. The pictures are all shamelessly lifted from GPU Point List Generation through HistogramPyramids, because they look so much nicer than the debug images I was generating, and I was pretty deficient at taking screenshots during development.
Slice - Use the GPU to render a 3D model into 2D volume slices.
Compose - Compose the individual 2D volume slices into a single 2D "lattice" texture.
Thresholding - Use a discriminator function to assign each texel a binary active/inactive state.
Reductions - Use shaders to create a custom GPU based reduction of the lattice texture to generate the HistoPyramid data structure (similar to mipmapping).
Points - Traverse the HistoPyramid in a top-down fashion to generate a 2D texture listing the active cells' 3D coordinates.
Explosion - Apply appropriate forces to the generated particles to simulate an explosion effect.
Slicing, Composition, and Thresholding were effectively performed in a single step, and I think I did this slightly differently than the author. What I found worked best was to create and fully generate a mipmapped texture first, and then render the slices directly to its base level. If I did not mipmap the texture before rendering to its base level, subsequent mipmap levels were not guaranteed to be allocated correctly.
Slicing is performed by pushing the near and far clipping planes through the model as we render the results to the base level of the mipmap texture. Use the glOrtho command.
void glOrtho(GLdouble left,   GLdouble right,
             GLdouble bottom, GLdouble top,
             GLdouble zNear,  GLdouble zFar);
OpenGL Gotcha #1: The matrix glOrtho produces negates the Z component (the camera looks down the negative Z axis), so keep that in mind when setting your near and far planes (as you might have guessed, I made this mistake).
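To make the slicing step concrete, here is a small CPU-side sketch of how the per-slice near/far planes could be computed. The helper name and the "distance in front of the camera" parameterization are my assumptions, not the author's code; the key point is that glOrtho treats positive zNear/zFar as distances along the negative Z axis (Gotcha #1), so you work in distances rather than raw eye-space z.

```c
#include <assert.h>

/* Hypothetical helper: compute the zNear/zFar pair for slice i of
 * num_slices, pushing a slab of thickness (depth / num_slices)
 * through a model whose front face sits dist_to_model units in
 * front of the camera.  Positive values are what glOrtho expects:
 * it places the planes at z = -zNear and z = -zFar in eye space. */
static void slice_planes(double dist_to_model, double depth,
                         int num_slices, int i,
                         double *z_near, double *z_far)
{
    double thickness = depth / num_slices;
    *z_near = dist_to_model + i * thickness;
    *z_far  = dist_to_model + (i + 1) * thickness;
}
```

Each slice is then rendered with glOrtho(left, right, bottom, top, z_near, z_far), clipping the model to just that slab.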
OpenGL Gotcha #2: So we don't lose back-facing polygons as we render our slices, be sure to disable culling in OpenGL.
glDisable(GL_CULL_FACE);
Composition is achieved by manipulating the OpenGL viewport so that each slice of the model is rendered to a separate tile of the base level of the mipmap texture.
void glViewport(GLint   x,     GLint   y,
                GLsizei width, GLsizei height);
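The tiling arithmetic is simple enough to sketch. This helper is hypothetical (the author doesn't show the layout code); it assumes a square base texture divided into an n-by-n grid of tiles, laid out row-major from the bottom-left to match OpenGL's window origin:

```c
#include <assert.h>

/* Hypothetical helper: compute the glViewport rectangle that maps
 * slice i into an n x n grid of tiles on the base level of the
 * lattice texture (tex_size pixels square). */
static void tile_viewport(int tex_size, int tiles_per_side, int i,
                          int *x, int *y, int *w, int *h)
{
    int tile = tex_size / tiles_per_side;
    *x = (i % tiles_per_side) * tile;   /* column within the grid */
    *y = (i / tiles_per_side) * tile;   /* row within the grid    */
    *w = tile;
    *h = tile;
}
```

Before rendering slice i, you would call glViewport(x, y, w, h) with the computed rectangle.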
Thresholding is accomplished implicitly in the fragment shader by setting the alpha value to 1 for every fragment that is processed (these become the active cells).
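For a volume sampled from a scalar field rather than a rasterized model, the discriminator from step 3 would be an explicit comparison. A trivial CPU analogue (names mine, not the author's):

```c
#include <assert.h>

/* Hypothetical discriminator: a texel is active (1) when its scalar
 * value reaches the iso level, inactive (0) otherwise.  In the
 * rasterization approach above this is implicit -- the fragment
 * shader writes alpha = 1 for every fragment it processes. */
static int discriminate(float value, float iso_level)
{
    return value >= iso_level ? 1 : 0;
}
```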
HistoPyramid Reduction, i.e. custom mipmapping, is accomplished by successively binding each level of the mipmapped HistoPyramid texture to a framebuffer and rendering to it with a custom shader that takes the previous mipmap level as input.
OpenGL Gotcha #3: Be sure to disable GL_BLEND so that values are actually saved to the alpha channel of your textures. Also, disable GL_DEPTH_TEST.
glDisable(GL_BLEND);
glDisable(GL_DEPTH_TEST);
OpenGL Gotcha #4: When passing the previous mipmap level into your shader, make sure OpenGL accesses the texture data from the correct level by setting the GL_TEXTURE_BASE_LEVEL and GL_TEXTURE_MAX_LEVEL texture parameters.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, mipmap_level);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, mipmap_level);
PointList Generation
OpenGL Gotcha #5: Remember when we set the GL_TEXTURE_BASE_LEVEL and GL_TEXTURE_MAX_LEVEL texture parameters? Well, don't even think about trying to get data back out of the texture without setting those parameters to span your entire range of levels. You also have to set GL_TEXTURE_MIN_FILTER to GL_NEAREST_MIPMAP_NEAREST; I have no idea why.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, num_levels);
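The top-down traversal itself works like this (this is my reading of the paper's algorithm, sketched on the CPU, not the author's exact shader): for each output key k in [0, total), descend from the 1x1 apex toward the base, and at each level pick the 2x2 child whose count the remaining key falls into, subtracting the counts of the children skipped along the way. The texel reached at the base level is the k-th active cell.

```c
#include <assert.h>

/* levels[0] is the 1x1 apex; levels[num_levels-1] is the base.
 * Level L is (1 << L) texels on a side, stored row-major. */
static void traverse(int **levels, int num_levels, int key,
                     int *out_x, int *out_y)
{
    int x = 0, y = 0;
    for (int L = 1; L < num_levels; ++L) {
        int size = 1 << L;
        int *lvl = levels[L];
        x *= 2; y *= 2;                    /* descend to 2x2 children */
        int order[4][2] = {{0,0}, {1,0}, {0,1}, {1,1}};
        for (int c = 0; c < 4; ++c) {
            int cx = x + order[c][0], cy = y + order[c][1];
            int count = lvl[cy * size + cx];
            if (key < count) { x = cx; y = cy; break; }
            key -= count;                  /* skip this child's cells */
        }
    }
    *out_x = x;
    *out_y = y;
}
```

Running this once per key fills the PointList texture with the 2D base-level coordinates of the active cells, which the slice/tile layout then maps back to 3D.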
Explosion
Ironically, though it was the goal of the project, I actually spent the least time on the explosion. I basically just generated a bunch of vertices (rendered as point sprites) and used a vertex displacement shader to place them at the locations stored in the PointList texture, i.e. at points on the surface of the model. I then generated random initial (outward) velocities, which I stored in a velocity texture, and updated the PointList and velocity textures in a shader as I would in a normal particle system. I know the authors did much fancier stuff with transform feedback buffers, but I had sort of run out of time at that point. Reading the spec on those is still on my to-do list.
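The per-frame update is a plain Euler step. This CPU sketch is an analogue of what the shader does to the PointList and velocity textures (structure and names are mine, not the author's):

```c
#include <assert.h>

typedef struct {
    float pos[3];   /* stored in the PointList texture */
    float vel[3];   /* stored in the velocity texture  */
} Particle;

/* One Euler integration step: advance position by velocity, then
 * apply gravity along -Y so the debris arcs back down. */
static void update_particle(Particle *p, float dt, float gravity)
{
    for (int i = 0; i < 3; ++i)
        p->pos[i] += p->vel[i] * dt;
    p->vel[1] -= gravity * dt;
}
```

The initial velocities just point outward from the model's surface, which is what gives the explosion its look.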