Post date: Oct 06, 2008 5:19:04 AM
So I'm posting this for posterity, and because I'm pretty sure I've run into this before. If you're going to use gl_Normal, please, for the love of God, remember that gl_Normal is a vec3. gl_Normal is a vec3.
I know your program is absolutely lousy with vec4s all over the place, but that doesn't matter, because gl_Normal is a vec3. You can assign it to a vec4 if you want; go ahead, be my guest. I don't really care, and the compiler doesn't care. The compiler won't even complain, not even with a warning. It's totally fine, as long as you don't want any output and are cool with your shader doing absolutely nothing.
Maybe you have multiple render targets? Well, don't worry: none of them will work if you assign gl_Normal to anything but a vec3. So consider this a public service announcement: gl_Normal is a vec3, and for fuck's sake, don't ever assign it to anything else.
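For anyone who lands here from a search, here's roughly what I mean. This is just a minimal sketch of an old-style (GLSL 1.10, fixed-function attributes) vertex shader, not my actual shader; the commented-out line is the kind of assignment that bit me.

    void main()
    {
        // WRONG (on the driver in question): gl_Normal is a vec3, and stuffing
        // it straight into a vec4 compiled without so much as a warning, but
        // the shader then produced no output at all.
        // vec4 normal = gl_Normal;

        // RIGHT: treat it as the vec3 it is, and widen it explicitly if you
        // really need a vec4 (w = 0.0 for a direction).
        vec3 normal  = gl_NormalMatrix * gl_Normal;
        vec4 normal4 = vec4(normal, 0.0);

        gl_Position  = gl_ModelViewProjectionMatrix * gl_Vertex;
        gl_FrontColor = vec4(normal * 0.5 + 0.5, 1.0);
    }

The vec4(gl_Normal, 0.0) form is the explicit widening the compiler should arguably have forced me to write in the first place.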
UPDATE: I actually looked around for a while for a place to submit a bug report on this, but couldn't find anything. So I settled on sending a sarcastic bug report/email to a friend of mine who'd just completed an internship with the Nvidia Windows OpenGL driver group. Of course, he couldn't do anything about it, but more interesting was his statement that "Nvidia doesn't really care about GLSL," which I guess is understandable, since they'd rather promote Cg as the shading language of choice, but it's still kinda sad, as it leaves those of us who use GLSL somewhat helpless when it comes to getting bugs like this fixed.
I hope more of the same doesn't happen with regard to OpenCL and CUDA.