Why am I not getting an sRGB default framebuffer?

Posted by Aaron Rotenberg on Game Development, 2014-04-21

I'm trying to make my OpenGL Haskell program gamma-correct by making appropriate use of sRGB framebuffers and textures, but I'm running into issues making the default framebuffer sRGB.

Consider the following Haskell program, compiled for 32-bit Windows using GHC and linked against 32-bit freeglut:

import Foreign.Marshal.Alloc(alloca)
import Foreign.Ptr(Ptr)
import Foreign.Storable(Storable, peek)
import Graphics.Rendering.OpenGL.Raw
import qualified Graphics.UI.GLUT as GLUT
import Graphics.UI.GLUT(($=))

main :: IO ()
main = do
    (_progName, _args) <- GLUT.getArgsAndInitialize
    -- Request an sRGB-capable default framebuffer from freeglut.
    GLUT.initialDisplayMode $= [GLUT.SRGBMode]
    _window <- GLUT.createWindow "sRGB Test"

    -- To prove that I actually have freeglut working correctly.
    -- This will fail at runtime under classic GLUT.
    GLUT.closeCallback $= Just (return ())

    -- Ask GL to convert linear output to sRGB on write, then query the
    -- color encoding of the default framebuffer's front-left buffer.
    glEnable gl_FRAMEBUFFER_SRGB
    colorEncoding <- allocaOut $ glGetFramebufferAttachmentParameteriv
        gl_FRAMEBUFFER gl_FRONT_LEFT gl_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING
    print colorEncoding

-- Run an out-parameter-style GL call and return the value it writes out.
allocaOut :: Storable a => (Ptr a -> IO b) -> IO a
allocaOut f = alloca $ \ptr -> do
    f ptr
    peek ptr

On my desktop (Windows 8 64-bit with a GeForce GTX 760 graphics card), this program prints 9729, a.k.a. gl_LINEAR, indicating that the default framebuffer uses a linear color encoding even though I explicitly requested an sRGB window. The rendering results of the actual program I'm trying to write bear this out: everything looks washed out, because my linear color values aren't being converted to sRGB before being written to the framebuffer.
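For reference, the conversion I expected gl_FRAMEBUFFER_SRGB to trigger on write is the standard sRGB encoding, sketched below as a plain Haskell function (this helper is just for illustration; it isn't part of the program above):

-- The linear-to-sRGB transfer function from the sRGB spec, roughly a
-- 1/2.2 gamma curve. When gl_FRAMEBUFFER_SRGB is enabled and the
-- framebuffer has an sRGB color encoding, the hardware applies this
-- conversion to linear color values as they are written.
linearToSRGB :: Double -> Double
linearToSRGB c
    | c <= 0.0031308 = 12.92 * c
    | otherwise      = 1.055 * c ** (1 / 2.4) - 0.055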

On the other hand, on my laptop (Windows 7 64-bit with an Intel graphics chip), the program prints 0 (huh?), and I get an sRGB default framebuffer whether I request one or not! And on both machines, if I manually create a non-default framebuffer bound to an sRGB texture (roughly as in the sketch below), the program correctly prints 35904, a.k.a. gl_SRGB.
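Here is a minimal sketch of that non-default framebuffer setup, reusing allocaOut from above (the 256x256 texture size is arbitrary, nullPtr comes from Foreign.Ptr, and I've omitted the completeness check for brevity):

makeSRGBFramebuffer :: IO GLuint
makeSRGBFramebuffer = do
    -- Create a texture with an sRGB internal format.
    tex <- allocaOut $ glGenTextures 1
    glBindTexture gl_TEXTURE_2D tex
    glTexImage2D gl_TEXTURE_2D 0 (fromIntegral gl_SRGB8_ALPHA8)
        256 256 0 gl_RGBA gl_UNSIGNED_BYTE nullPtr
    -- Attach it as the color buffer of a fresh framebuffer object.
    fbo <- allocaOut $ glGenFramebuffers 1
    glBindFramebuffer gl_FRAMEBUFFER fbo
    glFramebufferTexture2D gl_FRAMEBUFFER gl_COLOR_ATTACHMENT0
        gl_TEXTURE_2D tex 0
    return fbo

Querying gl_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING for gl_COLOR_ATTACHMENT0 with this framebuffer bound is what prints 35904 on both machines.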

Why am I getting different results on different hardware? Am I doing something wrong? How can I get an sRGB framebuffer consistently on all hardware and target OSes?
