GLSL shader render to texture not saving alpha value

I am rendering to a texture with a GLSL shader and then feeding that texture into a second shader. In the first pass I use the RGB channels to carry color data, and I want to use the alpha channel to carry a floating point number that the second shader will use as an extra input. The problem is that when the second shader samples the texture, the alpha value is always 1.0. I tested this in the following way:

At the end of the first shader I write:

gl_FragColor = vec4(r, g, b, 0.1); // alpha deliberately set to 0.1

and then in the second shader I read the value from the first pass with something along the lines of

vec4 f = texture2D(previous_tex, pos);
if (f.a != 1.0) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
    return;
}

No pixels in my output are black, whereas if I change the above code to read

gl_FragColor = vec4(r, g, 0.1, 1.0); // notice I'm now sending 0.1 in the blue channel

and in the second shader

vec4 f = texture2D(previous_tex, pos);
if (f.b != 1.0) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
    return;
}

All the appropriate pixels are black. So for some reason, when I write an alpha value other than 1.0 in the first shader and render to a texture, the second shader still sees it as 1.0.

Before rendering to the texture I call glDisable(GL_BLEND);
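Here is a quick sanity check I can run (just a sketch; it assumes an active GL context and stdio.h): query how many alpha bits the currently bound framebuffer actually stores. If it reports 0, anything written to gl_FragColor.a is silently discarded and samples come back as 1.0.

GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);           // bit depth of the bound draw buffer's alpha channel
printf("framebuffer alpha bits: %d\n", alphaBits);  // 0 would explain the symptom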

It seems pretty clear that the problem has to do with OpenGL handling alpha values in some way that isn't obvious to me, since the blue channel works exactly as I expect, and I figured someone out there will instantly see the problem.
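One thing I now suspect is the internal format of the texture attached to my FBO. Below is a minimal sketch of an allocation that should preserve alpha (tex, width, and height are placeholder names, not my actual code). As far as I understand, if the texture were allocated with GL_RGB instead of GL_RGBA, the driver would drop gl_FragColor.a on write and return 1.0 on every read, which matches exactly what I am seeing.

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
// GL_RGBA8 keeps the alpha channel; with GL_RGB the alpha written
// by gl_FragColor.a would be discarded and read back as 1.0.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);  // or the EXT equivalents on older drivers

One more caveat: even with GL_RGBA8, the 0.1 I write comes back quantized to 8 bits (roughly 0.098), so a test like f.a != 1.0 still works, but comparing against 0.1 exactly would not. And glColorMask would also need its alpha argument set to GL_TRUE, though that is the default.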
