How can I make a 32-bit render target with a 16-bit alpha channel in DirectX?
- by J Junker
I want to create a render target that is 32-bit, with 16 bits each for alpha and luminance. The closest surface formats I can find in the DirectX SDK are:
D3DFMT_A8L8 // 16-bit using 8 bits each for alpha and luminance.
D3DFMT_G16R16F // 32-bit float format using 16 bits for the red channel and 16 bits for the green channel.
But I don't think either of these will work: D3DFMT_A8L8 lacks the precision I need, and D3DFMT_G16R16F has no alpha channel (and I need alpha to be blended separately from luminance).
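For reference, this is roughly how I'm probing candidate formats for render-target and blending support before creating anything (a minimal sketch, assuming an already-created IDirect3D9 interface and an X8R8G8B8 display mode; CheckRenderTargetFormat is just a helper name I made up):

#include <d3d9.h>
#include <cstdio>

bool CheckRenderTargetFormat(IDirect3D9 *d3d, D3DFORMAT fmt, const char *name)
{
    // Can this format be created as a render-target texture at all?
    HRESULT hrTarget = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, fmt);

    // Does the hardware also allow alpha blending into it?
    // (Float render targets often fail this query on older cards.)
    HRESULT hrBlend = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, fmt);

    printf("%s: render target %s, blending %s\n", name,
           SUCCEEDED(hrTarget) ? "yes" : "no",
           SUCCEEDED(hrBlend)  ? "yes" : "no");
    return SUCCEEDED(hrTarget);
}

// Usage:
//   CheckRenderTargetFormat(d3d, D3DFMT_A8L8,    "A8L8");
//   CheckRenderTargetFormat(d3d, D3DFMT_G16R16F, "G16R16F");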
How can I create a render target that allows a separate blend state for luminance and alpha, with 16-bit precision on each channel, without exceeding 32 bits per pixel?