What is the PIXELFORMATDESCRIPTOR parameter in SetPixelFormat() used for?
- by Mads Elvheim
Usually when setting up OpenGL contexts, I've simply filled out a PIXELFORMATDESCRIPTOR structure with the necessary information, called ChoosePixelFormat(), and then called SetPixelFormat() with the matching pixelformat index that ChoosePixelFormat() returned. I've always passed the initial descriptor along to SetPixelFormat() without giving much thought to why.
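For reference, this is roughly how that classic path looks; the field values here are only illustrative defaults, not anything special about my code:

#include <windows.h>
#include <GL/gl.h>

static BOOL set_format_classic(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;
    pfd.cDepthBits   = 24;
    pfd.cStencilBits = 8;

    int iPixelFormat = ChoosePixelFormat(hdc, &pfd);
    if (iPixelFormat == 0)
        return FALSE;

    /* The same descriptor is passed straight through, unexamined. */
    return SetPixelFormat(hdc, iPixelFormat, &pfd);
}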
But now I use wglChoosePixelFormatARB() instead of ChoosePixelFormat(), because I need extended traits like sRGB support and multisampling. It takes an attribute list of integers, much like GLX on Linux, rather than a PIXELFORMATDESCRIPTOR structure. So, do I really have to fill in a descriptor for SetPixelFormat() to use? What does SetPixelFormat() use the descriptor for when it already has the pixelformat index? Why do I have to specify the same pixelformat attributes in two different places? And which one takes precedence: the attribute list passed to wglChoosePixelFormatARB(), or the PIXELFORMATDESCRIPTOR attributes passed to SetPixelFormat()?
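And this is roughly the extended path I mean. The attribute values (sRGB plus 4x MSAA) are just examples, and I'm assuming wglChoosePixelFormatARB was already fetched with wglGetProcAddress() through a dummy context:

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h> /* WGL_*_ARB tokens, PFNWGLCHOOSEPIXELFORMATARBPROC */

static int choose_format_arb(HDC hdc,
                             PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     32,
        WGL_DEPTH_BITS_ARB,     24,
        WGL_SAMPLE_BUFFERS_ARB, 1,
        WGL_SAMPLES_ARB,        4,
        WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB, GL_TRUE,
        0 /* terminator */
    };

    int iPixelFormat = 0;
    UINT numFormats = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &iPixelFormat, &numFormats)
        || numFormats == 0)
        return 0;

    /* SetPixelFormat() still demands a PIXELFORMATDESCRIPTOR here,
       even though iPixelFormat already identifies the format. */
    PIXELFORMATDESCRIPTOR pfd = {0}; /* ...filled with what, exactly? */
    if (!SetPixelFormat(hdc, iPixelFormat, &pfd))
        return 0;
    return iPixelFormat;
}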
Here are the function prototypes, to make the question clearer:
/* Finds the best match based on a PIXELFORMATDESCRIPTOR,
   and returns the pixelformat index. */
int ChoosePixelFormat(HDC hdc, const PIXELFORMATDESCRIPTOR *ppfd);

/* Finds the best matches based on attribute lists of integers and floats,
   and writes up to nMaxFormats pixelformat indices into piFormats,
   best matches first; the number found is returned through nNumFormats.
   Also supports extended pixelformat traits like the sRGB color space,
   floating-point framebuffers and multisampling. */
BOOL wglChoosePixelFormatARB(HDC hdc, const int *piAttribIList,
                             const FLOAT *pfAttribFList, UINT nMaxFormats,
                             int *piFormats, UINT *nNumFormats);

/* Sets the pixelformat based on the pixelformat index. */
BOOL SetPixelFormat(HDC hdc, int iPixelFormat, const PIXELFORMATDESCRIPTOR *ppfd);