Does DirectX implement Triple Buffering?

Posted by Asik on Game Development
Published on 2013-07-02

As AnandTech put it best in a 2009 article:

In render ahead, frames cannot be dropped. This means that when the queue is full, what is displayed can have a lot more lag. Microsoft doesn't implement triple buffering in DirectX, they implement render ahead (from 0 to 8 frames with 3 being the default).

The major difference in the technique we've described here is the ability to drop frames when they are outdated. Render ahead forces older frames to be displayed. Queues can help smoothness and stuttering as a few really quick frames followed by a slow frame end up being evened out and spread over more frames. But the price you pay is in lag (the more frames in the queue, the longer it takes to empty the queue and the older the frames are that are displayed).
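The "default of 3" mentioned above corresponds to DXGI's maximum frame latency setting, which an application can lower. Here is a minimal Direct3D 11 sketch of doing so; the helper name LimitRenderAhead is made up for illustration, and an already-created device is assumed:

#include <windows.h>
#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: cap DXGI's render-ahead queue for an existing
// Direct3D 11 device at one frame instead of the default of three.
HRESULT LimitRenderAhead(ID3D11Device* device)
{
    ComPtr<IDXGIDevice1> dxgiDevice;
    HRESULT hr = device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    if (FAILED(hr))
        return hr;

    // Present() will now block once a single frame is queued, so frames
    // can no longer pile up; this trades some throughput for lower lag.
    return dxgiDevice->SetMaximumFrameLatency(1);
}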

As I understand it, a DirectX swap chain is merely a render-ahead queue, i.e. queued frames cannot be dropped; the longer the chain, the greater the input latency. At the same time, I find it hard to believe that the most widely used graphics API would not implement such fundamental functionality correctly. Is there a way to get proper triple-buffered vertical synchronisation in DirectX?
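For what it's worth, DXGI 1.3 (Windows 8.1) lets an application cap that queue at a single frame and block until the swap chain is ready for new work, which bounds the lag even though it does not literally drop stale frames. A sketch under those assumptions follows; CreateLowLatencySwapChain and RenderOneFrame are illustrative names, and the device, factory and window handle are assumed to exist:

#include <windows.h>
#include <d3d11.h>
#include <dxgi1_3.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical setup: a three-buffer flip-model swap chain with a
// waitable object that signals when the queue has room for a frame.
ComPtr<IDXGISwapChain2> CreateLowLatencySwapChain(
    ID3D11Device* device, IDXGIFactory2* factory, HWND hwnd)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format           = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount      = 3;                                // "triple buffering"
    desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL; // flip model
    desc.Flags            = DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT;

    ComPtr<IDXGISwapChain1> swapChain1;
    if (FAILED(factory->CreateSwapChainForHwnd(
            device, hwnd, &desc, nullptr, nullptr, &swapChain1)))
        return nullptr;

    ComPtr<IDXGISwapChain2> swapChain2;
    if (FAILED(swapChain1.As(&swapChain2)))
        return nullptr;

    // Allow at most one queued frame; with three buffers the display can
    // still scan out one buffer while another is being rendered.
    swapChain2->SetMaximumFrameLatency(1);
    return swapChain2;
}

// Per-frame loop: block until the swap chain can accept new work, so
// rendering starts as late as possible and the presented frame is fresh.
void RenderOneFrame(IDXGISwapChain2* swapChain, HANDLE frameWaitable)
{
    // 'frameWaitable' comes from swapChain->GetFrameLatencyWaitableObject(),
    // obtained once after creating the swap chain.
    WaitForSingleObjectEx(frameWaitable, 1000, TRUE);
    // ... draw the scene here ...
    swapChain->Present(1, 0); // vsync on, queue never deeper than one frame
}

As I read the flip-model documentation, presenting with a sync interval of 0 on such a chain additionally lets a newer frame replace one still waiting in the queue, which comes close to the frame-dropping behaviour the article describes.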
