Good morning,
I'm working through implementing the spherical billboards technique outlined in this paper. I'm trying to write a shader that calculates the distance from the camera to everything in the scene and stores the result in a texture, but I keep getting either a completely black or a completely white texture.
Here are my questions:
I assume the position that's automatically sent to the vertex shader by Ogre is in object space?
The GPU interpolates the output position from the vertex shader before it reaches the fragment shader. Does it do the same for my depth output, or do I need to move that calculation into the fragment shader?
Is there a way to debug shaders? I get no compile errors, but I'm not sure my parameters are being passed into the shaders correctly.
Here's my shader code:
void DepthVertexShader(
    float4 position : POSITION,
    uniform float4x4 worldViewProjMatrix,
    uniform float3 eyePosition,
    out float4 outPosition : POSITION,
    out float Depth : TEXCOORD0
)
{
    // position is in object space
    // outPosition is in clip space
    outPosition = mul( worldViewProjMatrix, position );

    // distance from camera to vertex; eyePosition must also be in
    // object space (e.g. Ogre's camera_position_object_space)
    Depth = length( eyePosition - position.xyz );
}
void DepthFragmentShader(
    float Depth : TEXCOORD0,
    uniform float fNear,
    uniform float fFar,
    out float4 outColor : COLOR
)
{
    // remap depth to [0,1] between the clip planes (clamped outside
    // them) and invert, so near fragments are white and far are black
    float fColor = 1.0 - smoothstep( fNear, fFar, Depth );
    outColor = float4( fColor, fColor, fColor, 1.0 );
}
fNear and fFar are the scene's near and far clip plane distances.