Inconsistency #1: signed and unsigned integers
I made the following enum and allocated an array TextureObjects[NumTextures] to keep track of textures in the future.
enum Textures {
    testTexture,
    NumTextures
};
I perform the following and receive a black texture.
void init()
{
    ...
    glGenTextures(NumTextures, &TextureObjects[testTexture]);
    glBindTexture(GL_TEXTURE_2D, TextureObjects[testTexture]);
    glUniform1i(glGetUniformLocation(mProgram2, "gSampler"), TextureObjects[testTexture]);
}
void paintGL()
{
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, TextureObjects[testTexture]);
}
The culprit, after looking up each function in the OpenGL SDK man pages, is that the second argument to glUniform1i is a signed integer (GLint), whereas glBindTexture takes an unsigned integer (GLuint). The enum's underlying type is left up to the compiler, so I specify the enum to use unsigned int.
enum Textures : unsigned int {
    testTexture,
    NumTextures
};
I replace the glUniform1i call with glUniform1i(glGetUniformLocation(mProgram2, "gSampler"), 0), and now it works as expected; the sampler uniform expects the texture unit index (0 for GL_TEXTURE0), not the texture object name. This doesn't change the fact that some arguments take unsigned integers and others signed ones.
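For reference, here is a minimal sketch of the corrected setup. It assumes the same mProgram2 program and TextureObjects array as above; the actual image upload and filtering setup are placeholders.

GLuint TextureObjects[NumTextures];

void init()
{
    glGenTextures(NumTextures, TextureObjects);              // fills the whole array
    glActiveTexture(GL_TEXTURE0);                             // select texture unit 0
    glBindTexture(GL_TEXTURE_2D, TextureObjects[testTexture]);
    // ... upload pixel data with glTexImage2D and set filtering parameters here ...

    glUseProgram(mProgram2);                                  // program must be current for glUniform*
    // The sampler uniform is set to the texture unit index (0), not the texture object name.
    glUniform1i(glGetUniformLocation(mProgram2, "gSampler"), 0);
}

void paintGL()
{
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, TextureObjects[testTexture]);
    // ... draw calls ...
}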
Bug #1: Intel OpenGL driver or VS2013 compiler bug?
I get undefined texture behavior when I specify two glVertexAttribPointer calls, one for positions and one for texture coordinates, sourcing from a single VBO, if the vertex data behind the (void *) pointer is dynamically allocated or stored in a std::vector. If I separate texture and position coordinates into two VBOs, it works as expected.
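For context, this is roughly the single-VBO, interleaved layout that misbehaves on my setup; the vertex struct and attribute indices here are assumptions for illustration. Per the spec, once a VBO is bound the (void *) argument is a byte offset into that buffer, not a client-memory pointer.

#include <cstddef>   // offsetof
#include <vector>

struct Vertex {
    float position[3];
    float texCoord[2];
};

std::vector<Vertex> vertices;   // filled at runtime, heap-backed storage

GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, vertices.size() * sizeof(Vertex),
             vertices.data(), GL_STATIC_DRAW);

// Attribute 0: position, attribute 1: texture coordinates, both from the same VBO,
// distinguished only by stride and byte offset.
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                      (void *)offsetof(Vertex, position));
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                      (void *)offsetof(Vertex, texCoord));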
Bug #2: On the Intel HD4000 with the latest driver, the overloaded texture() function in GLSL does not accept a sampler2DShadow as its first argument, despite the compiler giving no error for declaring a sampler2DShadow uniform.
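For illustration, a minimal fragment shader (embedded here as a C++ string; the uniform and varying names are placeholders) that exercises the overload in question. texture(sampler2DShadow, vec3) is standard GLSL 3.30, with the .z component as the reference depth for the comparison.

// Minimal GLSL 3.30 fragment shader exercising texture() with a sampler2DShadow.
const char *shadowFragSrc = R"(
#version 330 core
uniform sampler2DShadow gShadowMap;
in vec3 shadowCoord;          // xy = texture coords, z = reference depth
out vec4 fragColor;
void main()
{
    // Declaring the sampler2DShadow uniform compiles fine; this overload of
    // texture() is the call the driver rejects.
    float lit = texture(gShadowMap, shadowCoord);
    fragColor = vec4(vec3(lit), 1.0);
}
)";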