OpenGL Vertex Attributes - Normalisation
- by Daniel
Alas, I have searched and found no definitive answer.
When would you normalize vertex data in OpenGL using the following command:
glVertexAttribPointer(index, size, type, normalized, stride, pointer);
I.e., when would normalized == GL_TRUE? In what situations, and why would you choose to let the GPU do the conversion instead of preprocessing the data yourself?
All the examples I have ever seen have this set to GL_FALSE, and I personally cannot see a use for it. But Khronos aren't stupid, so it must be there for something useful (and probably common).
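
For concreteness, the only use I can imagine is something like the sketch below (assuming a context, a bound VAO, and a loader such as GLAD are already set up; the vertex layout, attribute locations, and the name setup_attributes are placeholders of my own):

#include <glad/glad.h>   /* assuming GLAD; any loader would do */
#include <stddef.h>      /* for offsetof */

/* Interleaved vertex: 2 floats of position + 3 unsigned bytes of colour. */
struct Vertex { GLfloat pos[2]; GLubyte colour[3]; };

void setup_attributes(void)
{
    struct Vertex verts[] = {
        { { -0.5f, -0.5f }, { 255,   0,   0 } },
        { {  0.5f, -0.5f }, {   0, 255,   0 } },
        { {  0.0f,  0.5f }, {   0,   0, 255 } },
    };

    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

    /* Positions: floats with normalized == GL_FALSE, as in every example --
       the values reach the shader exactly as stored. */
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, sizeof(struct Vertex),
                          (void *)offsetof(struct Vertex, pos));
    glEnableVertexAttribArray(0);

    /* Colours: unsigned bytes with normalized == GL_TRUE -- as I read the
       docs, 0..255 should arrive in the shader as floats in [0.0, 1.0],
       without me dividing by 255 in a preprocessing pass. */
    glVertexAttribPointer(1, 3, GL_UNSIGNED_BYTE, GL_TRUE, sizeof(struct Vertex),
                          (void *)offsetof(struct Vertex, colour));
    glEnableVertexAttribArray(1);
}

Is that the intended use case, or is there something more common that I am missing?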