For each vertex I use two floats for the position and four unsigned bytes for the color.
I want to store all of them in one array, so I tried casting those four unsigned bytes to one float, but I am unable to do that correctly...
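What I mean by "casting" is reusing the four color bytes as the raw bit pattern of one float, so that positions and colors can share a single float array. A minimal sketch of that idea (the helper name packColor is mine, and plain float/unsigned char stand in for the GL typedefs):

#include <cstring>

// Copies the four color bytes into a float's storage; this only transfers
// the bit pattern, no arithmetic conversion takes place.
float packColor(unsigned char r, unsigned char g, unsigned char b, unsigned char a) {
    unsigned char bytes[4] = { r, g, b, a };
    float f;
    std::memcpy(&f, bytes, sizeof f);
    return f;
}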
In the end, my tests boiled down to this:
GLfloat vertices[] = { 1.0f, 0.5f,   0, 1.0f,   0, 0 }; // three vertices, two floats each
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(GLfloat),
                      vertices);
// VER1 - draws a red triangle
// unsigned char colors[] = { 0xff, 0, 0, 0xff,   0xff, 0, 0, 0xff,   0xff, 0, 0, 0xff };
// glEnableVertexAttribArray(1);
// glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 4 * sizeof(GLubyte),
//                       colors);
// VER2 - draws a greenish triangle (not "pure" green)
// float f = 255 << 24 | 255; // Hex: 0xff0000ff
// float colors2[] = { f, f, f };
// glEnableVertexAttribArray(1);
// glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 4 * sizeof(GLubyte),
//                       colors2);
// VER3 - draws a red triangle
int i = 255 << 24 | 255; // Hex: 0xff0000ff
int colors3[] = { i, i, i };
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 4 * sizeof(GLubyte),
                      colors3);
glDrawArrays(GL_TRIANGLES, 0, 3);
The above code draws one simple red triangle. My question is: why do versions 1 and 3 work correctly, while version 2 draws a greenish triangle?
The hex values are the ones I read by watching the variables in the debugger. They are equal for versions 2 and 3 - so what causes the difference?
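In case the methodology matters, here is how the raw in-memory bytes could be dumped from the program itself instead of relying on the debugger view (a sketch; dumpBits is my name for the helper):

#include <cstdio>
#include <cstring>

// Prints the object's first four bytes as one 32-bit hex number,
// in whatever byte order the CPU stores them.
void dumpBits(const void* p) {
    unsigned int u;
    std::memcpy(&u, p, sizeof u);
    std::printf("0x%08x\n", u);
}

// dumpBits(&i); // the int from VER3
// dumpBits(&f); // the float from VER2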