Calculating vertex normals on the GPU
- by Etan
I have a height-map sampled on a regular grid, stored in an array. Now I want to use the normals at the sampled vertices for a smoothing algorithm. The way I'm currently doing it is as follows:
For each vertex, generate triangles to all of its neighbours. Using the 1-neighbourhood, this gives eight neighbours (and eight adjacent triangles) for every vertex except those at the borders:
+---+---+
| \ | / |
+---o---+
| / | \ |
+---+---+
For each adjacent triangle, calculate its normal by taking the cross product of the two edge vectors from the centre vertex to the neighbours.
Since the triangles all have the same area when projected onto the xy-plane, I then simply average the eight normals and store the result for the vertex.
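For concreteness, here is a minimal CPU sketch of this procedure. It assumes unit grid spacing and row-major storage; the names (vertexNormals, height, normals) are placeholders rather than my actual code:

    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };

    static Vec3 cross(const Vec3& a, const Vec3& b) {
        return { a.y * b.z - a.z * b.y,
                 a.z * b.x - a.x * b.z,
                 a.x * b.y - a.y * b.x };
    }

    static Vec3 normalize(const Vec3& v) {
        float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return { v.x / len, v.y / len, v.z / len };
    }

    // Average the unit normals of the eight 1-neighbourhood triangles
    // around every interior vertex; `normals` must hold cols * rows entries.
    void vertexNormals(const std::vector<float>& height, int cols, int rows,
                       std::vector<Vec3>& normals) {
        // Neighbour offsets in counter-clockwise order around the vertex.
        const int dx[8] = { 1, 1, 0, -1, -1, -1,  0,  1 };
        const int dy[8] = { 0, 1, 1,  1,  0, -1, -1, -1 };
        for (int y = 1; y < rows - 1; ++y) {
            for (int x = 1; x < cols - 1; ++x) {
                float h0 = height[y * cols + x];
                Vec3 sum = { 0.0f, 0.0f, 0.0f };
                for (int i = 0; i < 8; ++i) {
                    int j = (i + 1) % 8;
                    // Edge vectors from the centre vertex to two consecutive
                    // neighbours; their cross product is the triangle normal.
                    Vec3 e1 = { (float)dx[i], (float)dy[i],
                                height[(y + dy[i]) * cols + (x + dx[i])] - h0 };
                    Vec3 e2 = { (float)dx[j], (float)dy[j],
                                height[(y + dy[j]) * cols + (x + dx[j])] - h0 };
                    // CCW ordering makes the cross product point upwards;
                    // equal projected areas justify the unweighted average.
                    Vec3 n = normalize(cross(e1, e2));
                    sum.x += n.x; sum.y += n.y; sum.z += n.z;
                }
                normals[y * cols + x] = normalize(sum);
            }
        }
    }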
However, as my data grows larger, this approach takes too much time, and I would prefer to do it on the GPU in shader code. Is there an easy method, perhaps one where I store my height-map as a texture?
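To make the question concrete, here is a rough sketch of the kind of thing I imagine, written in CUDA rather than actual shader code (normalsKernel and heightTex are placeholder names): the height-map bound as a texture object, and one thread per interior vertex doing the same eight-triangle averaging as above:

    // Same eight-triangle averaging as on the CPU, one thread per
    // interior vertex, height-map bound as a CUDA texture object.
    __global__ void normalsKernel(cudaTextureObject_t heightTex,
                                  float3* normals, int cols, int rows) {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x < 1 || y < 1 || x >= cols - 1 || y >= rows - 1) return;

        // Neighbour offsets in counter-clockwise order around the vertex.
        const int dx[8] = { 1, 1, 0, -1, -1, -1,  0,  1 };
        const int dy[8] = { 0, 1, 1,  1,  0, -1, -1, -1 };

        // Fetch at unnormalized texel coordinates (+0.5 for texel centres).
        float h0 = tex2D<float>(heightTex, x + 0.5f, y + 0.5f);
        float3 sum = make_float3(0.0f, 0.0f, 0.0f);
        for (int i = 0; i < 8; ++i) {
            int j = (i + 1) % 8;
            float hi = tex2D<float>(heightTex, x + dx[i] + 0.5f, y + dy[i] + 0.5f);
            float hj = tex2D<float>(heightTex, x + dx[j] + 0.5f, y + dy[j] + 0.5f);
            float3 e1 = make_float3((float)dx[i], (float)dy[i], hi - h0);
            float3 e2 = make_float3((float)dx[j], (float)dy[j], hj - h0);
            // Triangle normal via the cross product, normalized before
            // accumulating so the result is an unweighted average.
            float3 n = make_float3(e1.y * e2.z - e1.z * e2.y,
                                   e1.z * e2.x - e1.x * e2.z,
                                   e1.x * e2.y - e1.y * e2.x);
            float invLen = rsqrtf(n.x * n.x + n.y * n.y + n.z * n.z);
            sum.x += n.x * invLen;
            sum.y += n.y * invLen;
            sum.z += n.z * invLen;
        }
        float invLen = rsqrtf(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
        normals[y * cols + x] = make_float3(sum.x * invLen,
                                            sum.y * invLen,
                                            sum.z * invLen);
    }

Whether something along these lines (or its GLSL equivalent in a fragment or compute shader) is a sensible way to do it is exactly what I'm asking.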