Calculating vertex normals on the GPU

Posted by Etan on Game Development | Published 2011-06-22

I have a height-map sampled on a regular grid, stored in an array. Now I want to use the normals at the sampled vertices for a smoothing algorithm. The way I'm currently doing it is as follows:

  1. For each vertex, generate triangles to all of its neighbours. Using the 1-neighbourhood, this yields eight neighbours for every vertex except those at the borders.

     +---+---+
     | \ | / |
     +---o---+
     | / | \ |
     +---+---+
    
  2. For each adjacent triangle, calculate its normal by taking the cross product of the two edge vectors that run from the centre vertex to the triangle's other two corners.

  3. As the triangles all have the same size when projected onto the xy-plane, I simply average the eight normals and store the result for this vertex (see the sketch after this list).
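
Roughly, my current CPU loop looks like the following C# sketch (simplified; it assumes a row-major float array of heights, unit grid spacing, z pointing up, and XNA's Vector3; all names are illustrative, not my actual code):

    using Microsoft.Xna.Framework; // Vector3

    static class HeightMapNormals
    {
        // The eight 1-neighbourhood offsets, ordered counter-clockwise;
        // consecutive pairs span the eight triangles in the figure above.
        static readonly int[,] Fan =
        {
            { 1, 0 }, { 1, 1 }, { 0, 1 }, { -1, 1 },
            { -1, 0 }, { -1, -1 }, { 0, -1 }, { 1, -1 },
        };

        public static Vector3[] ComputeNormals(float[] heights, int width, int height)
        {
            var normals = new Vector3[width * height];

            for (int y = 0; y < height; y++)
            for (int x = 0; x < width; x++)
            {
                Vector3 o = Position(heights, width, x, y);
                Vector3 sum = Vector3.Zero;

                for (int i = 0; i < 8; i++)
                {
                    int j = (i + 1) % 8; // next neighbour in the fan
                    int ax = x + Fan[i, 0], ay = y + Fan[i, 1];
                    int bx = x + Fan[j, 0], by = y + Fan[j, 1];

                    // Border vertices: skip triangles that leave the grid.
                    if (ax < 0 || ax >= width || ay < 0 || ay >= height ||
                        bx < 0 || bx >= width || by < 0 || by >= height)
                        continue;

                    // Step 2: face normal = cross product of the two edges.
                    sum += Vector3.Cross(
                        Position(heights, width, ax, ay) - o,
                        Position(heights, width, bx, by) - o);
                }

                // Step 3: normalizing the sum averages the equal-area normals.
                sum.Normalize();
                normals[y * width + x] = sum;
            }
            return normals;
        }

        static Vector3 Position(float[] heights, int width, int x, int y)
        {
            return new Vector3(x, y, heights[y * width + x]);
        }
    }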

However, as my data grows larger, this approach takes too much time, and I would prefer to do it on the GPU in shader code. Is there an easy method, perhaps one where I could store my height-map as a texture?
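
What I have in mind on the XNA side is something like the sketch below (an assumption, not code I have working): upload the heights as a single-channel float texture so a vertex shader can fetch any neighbour. The "HeightMap" and "TexelSize" parameter names are hypothetical names for my own effect, not built-in XNA names.

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    static class HeightMapUpload
    {
        public static Texture2D Upload(
            GraphicsDevice device, Effect effect,
            float[] heights, int width, int height)
        {
            // One float per texel; SetData expects row-major order,
            // which matches the layout of my height array.
            var texture = new Texture2D(
                device, width, height, false, SurfaceFormat.Single);
            texture.SetData(heights);

            // Hypothetical parameters of my normal-computing effect.
            effect.Parameters["HeightMap"].SetValue(texture);
            effect.Parameters["TexelSize"].SetValue(
                new Vector2(1f / width, 1f / height));
            return texture;
        }
    }

With unit grid spacing, the shader could then approximate each normal by central differences, normalize(float3(h[x-1,y] - h[x+1,y], h[x,y-1] - h[x,y+1], 2)), instead of building the triangle fan at all. Is this the right direction?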

