How can I make a 32-bit render target with a 16-bit alpha channel in DirectX?

Posted by J Junker on Game Development. Published on 2011-06-21.

I want to create a render target that is 32-bit, with 16 bits each for alpha and luminance. The closest surface formats I can find in the DirectX SDK are:

D3DFMT_A8L8   // 16-bit using 8 bits each for alpha and luminance.
D3DFMT_G16R16F   // 32-bit float format using 16 bits for the red channel and 16 bits for the green channel.

But I don't think either of these will work: D3DFMT_A8L8 doesn't have enough precision, and D3DFMT_G16R16F has no alpha channel (I need alpha available for separate blending).
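
For reference, this is roughly how I'm checking whether a candidate format is even usable as a render target (a minimal sketch assuming the default HAL adapter and an X8R8G8B8 display mode; d3d is my IDirect3D9 pointer):

#include <d3d9.h>

// Rough check: can 'fmt' be used as a texture render target on the
// default HAL adapter with an X8R8G8B8 display format?
bool IsRenderTargetFormatSupported(IDirect3D9 *d3d, D3DFORMAT fmt)
{
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,        // assumed display/adapter format
        D3DUSAGE_RENDERTARGET,  // must be usable as a render target
        D3DRTYPE_TEXTURE,
        fmt);
    return SUCCEEDED(hr);
}

// e.g. IsRenderTargetFormatSupported(d3d, D3DFMT_G16R16F)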

How can I create a render target with 16-bit precision on each channel that allows separate blending for luminance and alpha, without exceeding 32 bits per pixel?
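
For what it's worth, the creation code I'd plug the chosen format into is just the usual render-target texture path (sketch; device, width and height come from my setup, fmt is the format in question):

// Create the render target texture once a suitable format is chosen.
IDirect3DTexture9 *renderTarget = NULL;
HRESULT hr = device->CreateTexture(
    width, height,
    1,                      // one mip level
    D3DUSAGE_RENDERTARGET,  // render target usage
    fmt,                    // the format I'm trying to pick
    D3DPOOL_DEFAULT,        // render targets must live in the default pool
    &renderTarget,
    NULL);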
