Learning OpenGL GLSL - VAO buffer problems?
- by Bleary
I've just started digging through OpenGL and GLSL, and I've stumbled on something I can't get my head around. I've stepped back to loading a simple cube and using a simple shader on it, but the result is triangles drawn incorrectly and/or missing, as the screenshots below show. I had this code working perfectly on meshes before attempting the move to VAOs, so none of the code for storing the vertices and indices has changed.
http://i.stack.imgur.com/RxxZ5.jpg
http://i.stack.imgur.com/zSU50.jpg
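For context, the storage that feeds the upload code below looks roughly like this (a simplified sketch rather than my exact loader code; VERTEX_STRIDE is the floats-per-vertex constant and TRIANGLE_VERTEX_COUNT is 3 indices per triangle):
// Rough sketch of the storage the upload code assumes (not the exact loader code):
// VERTEX_STRIDE floats per vertex, TRIANGLE_VERTEX_COUNT (3) indices per triangle.
GLfloat* lVertices = new GLfloat[lPolygonVertexCount * VERTEX_STRIDE];
unsigned int* lIndices = new unsigned int[lPolygonCount * TRIANGLE_VERTEX_COUNT];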
What I have for creating the VAO and buffers is this:
// Create the vertex array object
glGenVertexArrays(1, &vaoID);
// Create the vertex buffer objects
glGenBuffers(VBO_COUNT, mVBONames);
glBindVertexArray(vaoID);
// Upload the vertex attributes to the GPU
glBindBuffer(GL_ARRAY_BUFFER, mVBONames[VERTEX_VBO]);
// Copy data into the buffer object
glBufferData(GL_ARRAY_BUFFER, lPolygonVertexCount*VERTEX_STRIDE*sizeof(GLfloat), lVertices, GL_STATIC_DRAW);
glEnableVertexAttribArray(pos);
glVertexAttribPointer(pos, 3, GL_FLOAT, GL_FALSE, VERTEX_STRIDE*sizeof(GLfloat), 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, mVBONames[INDEX_VBO]);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, lPolygonCount*sizeof(unsigned int), lIndices, GL_STATIC_DRAW);
glBindVertexArray(0);
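In case it helps, here's a sanity check that can run right after the setup above to confirm what glBufferData actually allocated (a rough sketch; the printf formatting is just for illustration):
// Sanity-check sketch: read back the sizes the driver actually allocated.
// Rebind the VAO first, since the element array binding lives in VAO state;
// the GL_ARRAY_BUFFER binding is global, so rebind that explicitly too.
glBindVertexArray(vaoID);
glBindBuffer(GL_ARRAY_BUFFER, mVBONames[VERTEX_VBO]);
GLint lVertexBytes = 0, lIndexBytes = 0;
glGetBufferParameteriv(GL_ARRAY_BUFFER, GL_BUFFER_SIZE, &lVertexBytes);
glGetBufferParameteriv(GL_ELEMENT_ARRAY_BUFFER, GL_BUFFER_SIZE, &lIndexBytes);
printf("vertex buffer: %d bytes, index buffer: %d bytes\n", lVertexBytes, lIndexBytes);
glBindVertexArray(0);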
And the code for drawing the mesh:
glBindVertexArray(vaoID);
glUseProgram(shader->programID);
GLsizei lOffset = mSubMeshes[pMaterialIndex]->IndexOffset*sizeof(unsigned int);
const GLsizei lElementCount = mSubMeshes[pMaterialIndex]->TriangleCount*TRIANGLE_VERTEX_COUNT;
glDrawElements(GL_TRIANGLES, lElementCount, GL_UNSIGNED_SHORT, reinterpret_cast<const GLvoid*>(lOffset));
// Drawing as GL_POINTS instead shows all the vertices are in the correct places!?
//glPointSize(10.0f);
//glDrawElements(GL_POINTS, lElementCount, GL_UNSIGNED_SHORT, 0);
glUseProgram(0);
glBindVertexArray(0);
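One thing I'm second-guessing while posting this: the indices are stored and sized as unsigned int, but the draw call passes GL_UNSIGNED_SHORT. Should it instead be the following, so the type matches the storage (same offset arithmetic, only the type changed)?
// The type passed to glDrawElements has to match how the indices were uploaded;
// they were stored as unsigned int, which would mean GL_UNSIGNED_INT here.
glDrawElements(GL_TRIANGLES, lElementCount, GL_UNSIGNED_INT,
               reinterpret_cast<const GLvoid*>(lOffset));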
My eyes have become bleary looking at this today, so any thoughts or a fresh set of eyes would be greatly appreciated.