Drawing a texture with an alpha channel doesn't work -- draws black

Posted by DevDevDev on Stack Overflow, 2010-03-23

Filed under: opengl-es | opengl

I am modifying GLPaint to use a different background -- white, in this case. The existing stamp that GLPaint uses assumes a black background, so I made a new stamp image with an alpha channel. When I draw on the canvas, though, the stamp still comes out black. What gives? At draw time I just bind the texture and that part works, so something must be wrong in this initialization.
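
For reference, drawing is essentially the stock GLPaint path. The sketch below is condensed from Apple's renderLineFromPoint:toPoint: (it uses the sample's own kBrushPixelStep constant and viewFramebuffer/viewRenderbuffer ivars) and is not my exact code, but it shows what "drawing" means here: the bound brush texture is stamped as point sprites along the stroke.

- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end
{
    // Condensed from Apple's GLPaint sample -- a sketch, not my exact code.
    static GLfloat *vertexBuffer = NULL;
    static NSUInteger vertexMax = 64;
    NSUInteger vertexCount = 0, count, i;

    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

    if (vertexBuffer == NULL)
        vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));

    // Interpolate extra points between start and end so the stroke is continuous.
    count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) +
                            (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
    for (i = 0; i < count; ++i) {
        if (vertexCount == vertexMax) {
            vertexMax = 2 * vertexMax;
            vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
        }
        vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
        vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
        vertexCount += 1;
    }

    // The brush texture set up in initWithCoder: is still bound, so each point
    // is rendered as a textured point sprite.
    glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
    glDrawArrays(GL_POINTS, 0, vertexCount);

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}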

Here is the photo: [screenshot not included]

- (id)initWithCoder:(NSCoder*)coder 
{
    CGImageRef brushImage;
    CGContextRef brushContext;
    GLubyte *brushData;
    size_t width, height;

    if (self = [super initWithCoder:coder]) 
    {
        CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;

        eaglLayer.opaque = YES;
        // In this application, we want to retain the EAGLDrawable contents after a call to presentRenderbuffer.
        eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithBool:YES], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];

        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

        if (!context || ![EAGLContext setCurrentContext:context]) {
            [self release];
            return nil;
        }

        // Create a texture from an image
        // First create a UIImage object from the data in an image file, and then extract the Core Graphics image
        brushImage = [UIImage imageNamed:@"test.png"].CGImage;

        // Get the width and height of the image
        width = CGImageGetWidth(brushImage);
        height = CGImageGetHeight(brushImage);

        // Texture dimensions must be a power of 2. If you write an application that allows users to supply an image,
        // you'll want to add code that checks the dimensions and takes appropriate action if they are not a power of 2.
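        // (For example, a simple power-of-two test:
        //     BOOL widthIsPOT  = (width != 0)  && ((width  & (width  - 1)) == 0);
        //     BOOL heightIsPOT = (height != 0) && ((height & (height - 1)) == 0);
        // The brush image used here is assumed to already satisfy this.)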

        // Make sure the image exists
        if(brushImage) 
        {
            // Allocate a buffer for the RGBA pixel data
            brushData = (GLubyte *) calloc(width * height * 4, sizeof(GLubyte));
            // Draw the image into a bitmap context; kCGImageAlphaPremultipliedLast
            // stores the color channels premultiplied by the alpha channel
            brushContext = CGBitmapContextCreate(brushData, width, height, 8, width * 4, CGImageGetColorSpace(brushImage), kCGImageAlphaPremultipliedLast);
            CGContextDrawImage(brushContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), brushImage);
            CGContextRelease(brushContext);
            // Generate and bind the brush texture, then upload the RGBA data
            glGenTextures(1, &brushTexture);
            glBindTexture(GL_TEXTURE_2D, brushTexture);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, brushData);
            // The data has been copied into the texture, so the buffer can be freed
            free(brushData);
        }

        // Set up OpenGL states
        // Use an orthographic projection that matches the view's bounds
        glMatrixMode(GL_PROJECTION);
        CGRect frame = self.bounds;
        glOrthof(0, frame.size.width, 0, frame.size.height, -1, 1);
        glViewport(0, 0, frame.size.width, frame.size.height);
        glMatrixMode(GL_MODELVIEW);

        glDisable(GL_DITHER);
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_BLEND);
        // Source is weighted by its alpha, destination by one minus the destination alpha
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA);
        // Render GL_POINTS as textured point sprites using the bound brush texture
        glEnable(GL_POINT_SPRITE_OES);
        glTexEnvf(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
        // Size each point sprite relative to the brush image
        glPointSize(width / kBrushScale);
    }
    return self;
}
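
One thing I am not sure about: CGBitmapContextCreate with kCGImageAlphaPremultipliedLast produces premultiplied pixel data, and I don't know whether glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA) is the right pairing for that. The blend setup usually recommended for premultiplied alpha looks like this (just a guess on my part; I have not verified that it changes the black output):

glEnable(GL_BLEND);
// The color channels are already multiplied by alpha, so weight the source by 1
// and fade the destination by one minus the source alpha.
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);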

