Fragment shaders on a texture

Hello Stack Overflow.

I am trying to add some post-processing capabilities to a program. The rendering is done using OpenGL. I just want to allow the program to load some home-made fragment shaders and use them on the video stream.

I wrote a little shader using "OpenGL Shader Builder" that just converts a texture to grayscale. The shader works well in Shader Builder, but I can't make it work in the main program: the screen stays all black.

Here is the setup:

@implementation PluginGLView

- (id)initWithCoder:(NSCoder *)coder
{
    const GLubyte *strExt;

    if ((self = [super initWithCoder:coder]) == nil)
        return nil;

    glLock = [[NSLock alloc] init];
    if (nil == glLock) {
        [self release];
        return nil;
    }

    // Init pixel format attribs
    NSOpenGLPixelFormatAttribute attrs[] =
    {
        NSOpenGLPFAAccelerated,
        NSOpenGLPFANoRecovery,
        NSOpenGLPFADoubleBuffer,
        0
    };

    // Get pixel format from OpenGL
    NSOpenGLPixelFormat *pixFmt = [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];
    if (!pixFmt)
    {
        NSLog(@"No Accelerated OpenGL pixel format found\n");

        NSOpenGLPixelFormatAttribute attrs2[] =
        {
            NSOpenGLPFANoRecovery,
            0
        };

        // Fall back to an unaccelerated pixel format
        pixFmt = [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs2];
        if (!pixFmt) {
            NSLog(@"No OpenGL pixel format found!\n");

            [self release];
            return nil;
        }
    }

    [self setPixelFormat:[pixFmt autorelease]];

    /*
    long swapInterval = 1;
    [[self openGLContext]
            setValues:&swapInterval
            forParameter:NSOpenGLCPSwapInterval];
    */
    [glLock lock];
    [[self openGLContext] makeCurrentContext];

    // Check for the Apple texture extensions we rely on
    strExt = glGetString(GL_EXTENSIONS);
    texture_range  = gluCheckExtension((const unsigned char *)"GL_APPLE_texture_range", strExt) ? GL_TRUE : GL_FALSE;
    texture_hint   = GL_STORAGE_SHARED_APPLE;
    client_storage = gluCheckExtension((const unsigned char *)"GL_APPLE_client_storage", strExt) ? GL_TRUE : GL_FALSE;
    rect_texture   = gluCheckExtension((const unsigned char *)"GL_EXT_texture_rectangle", strExt) ? GL_TRUE : GL_FALSE;

    // Set up some basic OpenGL state
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // Load, attach, and link the fragment shader
    shader = LoadShader(GL_FRAGMENT_SHADER, "/Users/alexandremathieu/fragment.fs");
    program = glCreateProgram();
    glAttachShader(program, shader);
    glLinkProgram(program);
    glUseProgram(program);

    [NSOpenGLContext clearCurrentContext];
    [glLock unlock];

    image_width = 1024;
    image_height = 512;
    image_depth = 16;

    image_type = GL_UNSIGNED_SHORT_1_5_5_5_REV;
    image_base = (GLubyte *)calloc(((IMAGE_COUNT * image_width * image_height) / 3) * 4, image_depth >> 3);
    if (image_base == nil) {
        [self release];
        return nil;
    }

    // Create and load textures for the first time
    [self loadTextures:GL_TRUE];

    // Init fps timer
    //gettimeofday(&cycle_time, NULL);

    drawBG = YES;

    // Call for a redisplay
    noDisplay = YES;
    PSXDisplay.Disabled = 1;
    [self setNeedsDisplay:YES];

    return self;
}
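
One thing worth noting: I don't currently check the program's link status after glLinkProgram, so if linking were failing silently, that could also explain a black screen. A minimal check, using only standard GL calls (this is not in my current code), would be something like:

// Hypothetical link-status check, not in the code above
GLint linked = GL_FALSE;
glGetProgramiv(program, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE) {
    char infoLog[1024];
    GLsizei length = 0;
    glGetProgramInfoLog(program, sizeof(infoLog), &length, infoLog);
    NSLog(@"Shader program failed to link: %s", infoLog);
}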

And here is the "render screen" function, which basically... renders the screen.

- (void)renderScreen
{
    int bufferIndex = whichImage;

    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, bufferIndex+1);

    glUseProgram(program);
    int loc=glGetUniformLocation(program, "texture");
    glUniform1i(loc,bufferIndex+1);

    glTexSubImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, 0, 0, image_width, image_height, GL_BGRA, image_type, image[bufferIndex]);

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f);
        glVertex2f(-1.0f, 1.0f);

        glTexCoord2f(0.0f, image_height);
        glVertex2f(-1.0f, -1.0f);

        glTexCoord2f(image_width, image_height);
        glVertex2f(1.0f, -1.0f);

        glTexCoord2f(image_width, 0.0f);
        glVertex2f(1.0f, 1.0f);
    glEnd();

    [[self openGLContext] flushBuffer];
    [NSOpenGLContext clearCurrentContext];
    //[glLock unlock];

}

And finally, here's the shader:

uniform sampler2DRect texture;

void main() {
    vec4 color, texel;
    color = gl_Color;
    texel = texture2DRect(texture, gl_TexCoord[0].xy);
    color *= texel;
    // Begin Shader
    float gray = 0.0;
    gray += (color.r + color.g + color.b) / 3.0;
    color = vec4(gray, gray, gray, color.a);
    // End Shader
    gl_FragColor = color;
}
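
As an aside, the straight average works, but grayscale conversions often weight the channels by perceived luminance instead. In GLSL that would be a one-line change (the Rec. 601 weights below are the usual choice, not something from my code):

// weighted luma instead of a plain average
float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));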

The loading and using of shaders works, since I am able to turn the screen all red with this shader:

void main(){
    gl_FragColor=vec4(1.0,0.0,0.0,1.0);
}

If the shader contains a syntax error, I get an error message from the LoadShader function, and so on. If I remove the use of the shader, everything works normally.

I think the problem comes from the "passing the texture as a uniform parameter" part. But these are my very first steps with OpenGL, so I can't be sure of anything.
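
From what I have read, the value passed to glUniform1i for a sampler is supposed to be the index of the texture unit the texture is bound to, not the texture object's name. If the sampler ends up pointing at a unit with nothing bound to it, it samples black, which would match what I see. If that is right, the binding in renderScreen would presumably need to look something like this (using texture unit 0):

// Bind the rectangle texture to texture unit 0...
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_RECTANGLE_EXT, bufferIndex + 1);

// ...and point the sampler uniform at unit 0,
// not at the texture object's name (bufferIndex + 1).
glUseProgram(program);
int loc = glGetUniformLocation(program, "texture");
glUniform1i(loc, 0);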

Don't hesitate to ask for more info.

Thank you Stack O.
