Hi,
I'd like to generate a movie in real time with an application of my own that does fast screen captures, while part of the screen is occupied by a running 3D application.
I'm aware that several applications already exist for this (like FRAPS or Taksi), and even dedicated DirectShow filters (like UScreenCapture), but I really need to do this with my own external application.
When correctly set up (UScreenCapture + ffdshow), capturing and compressing the full screen does not consume as much CPU as you would expect (about 15%), and does not impair the performance of the 3D app.
The problem with capturing from an external application is that the 3D application loses its Vsync and becomes choppy and difficult to use (the 3D app is only presented on a small part of the screen, the rest being GDI and DirectX).
FRAPS solves this problem by letting you capture only one application at a time (the one with focus). Depending on the technology used (OpenGL, DirectX, GDI), it hooks the Vsync and does its capture (with glReadPixels, ...) without perturbing it.
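For reference, the in-process capture those tools do is roughly the following (a minimal sketch, assuming a hooked SwapBuffers in an OpenGL app; the window handle and the hand-off to the encoder are placeholders of mine):

```cpp
// Minimal sketch of an in-process OpenGL capture, the kind a SwapBuffers hook
// would run once per frame. Not FRAPS' actual code, just the general idea.
#include <windows.h>
#include <GL/gl.h>
#include <vector>

void CaptureBackBuffer(HWND hwnd, std::vector<unsigned char>& pixels)
{
    RECT rc;
    GetClientRect(hwnd, &rc);
    int width  = rc.right  - rc.left;
    int height = rc.bottom - rc.top;

    pixels.resize(width * height * 4);

    // Read the back buffer before it is presented; rows come back bottom-up.
    glReadBuffer(GL_BACK);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, width, height, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels.data());

    // ...hand the buffer to the encoder, then call the real SwapBuffers...
}
```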
Doing this does not solve my problem, since I want the full composed screen image (including 3D and the rest) AND a smooth 3D app.
UScreenCapture seems to use a fast DirectX call to capture the whole screen, but the OpenGL 3D app is still out of sync.
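I'm guessing that "fast DirectX call" is something like IDirect3DDevice9::GetFrontBufferData; here is a minimal sketch of that approach (device creation omitted, the function name and layout are mine, not taken from UScreenCapture):

```cpp
// Sketch of a whole-screen grab through Direct3D 9 (my guess at what such a
// filter does). Assumes an existing windowed IDirect3DDevice9* on adapter 0.
#include <d3d9.h>

HRESULT GrabDesktop(IDirect3DDevice9* device, int screenW, int screenH,
                    IDirect3DSurface9** outSurface)
{
    IDirect3DSurface9* surface = NULL;

    // The destination must be a lockable system-memory surface in A8R8G8B8,
    // sized to the whole desktop when the device is windowed.
    HRESULT hr = device->CreateOffscreenPlainSurface(
        screenW, screenH, D3DFMT_A8R8G8B8, D3DPOOL_SYSTEMMEM, &surface, NULL);
    if (FAILED(hr))
        return hr;

    // Copies the composed front buffer (the whole screen) into the surface.
    hr = device->GetFrontBufferData(0, surface);
    if (FAILED(hr)) {
        surface->Release();
        return hr;
    }

    *outSurface = surface;   // caller can LockRect() to read the pixels
    return S_OK;
}
```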
Doing a BitBlt is too slow and CPU-consuming for real-time 30 fps acquisition (at least under Windows XP; I'm not sure about 7).
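For completeness, this is the GDI path I'm referring to (a bare-bones sketch; the BitBlt plus reading the bits back is what eats the CPU):

```cpp
// Bare-bones GDI screen grab, the approach that turned out too slow for 30 fps.
#include <windows.h>

HBITMAP GrabScreenGDI(int screenW, int screenH)
{
    HDC screenDC = GetDC(NULL);                          // DC for the whole desktop
    HDC memDC    = CreateCompatibleDC(screenDC);
    HBITMAP bmp  = CreateCompatibleBitmap(screenDC, screenW, screenH);
    HGDIOBJ old  = SelectObject(memDC, bmp);

    // SRCCOPY | CAPTUREBLT also grabs layered (transparent) windows,
    // but makes the copy even slower.
    BitBlt(memDC, 0, 0, screenW, screenH, screenDC, 0, 0, SRCCOPY | CAPTUREBLT);

    SelectObject(memDC, old);
    DeleteDC(memDC);
    ReleaseDC(NULL, screenDC);
    return bmp;   // caller reads the pixels with GetDIBits() and deletes the bitmap
}
```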
My question is whether there is a way to achieve my goal with Windows 7 and its brand new DirectX compositing engine.
Windows 7 manages to show live, VSynced, duplicated previews of every app (in the taskbar), so there must be a way to access the currently displayed screen buffer without perturbing the rendering of the 3D OpenGL app?
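As far as I can tell, those taskbar previews go through the DWM thumbnail API, but the catch is that DWM composes the preview directly into your own window and never hands you the pixels. A sketch of what that API looks like (hwndTarget/hwndSource and the helper name are just illustrative):

```cpp
// DWM thumbnail registration, roughly what the Windows 7 taskbar previews use.
// Note: the preview is composed straight into hwndTarget by DWM; there is no
// call here that returns the pixels to the application.
#include <windows.h>
#include <dwmapi.h>
#pragma comment(lib, "dwmapi.lib")

HTHUMBNAIL ShowLivePreview(HWND hwndTarget, HWND hwndSource, RECT dest)
{
    HTHUMBNAIL thumb = NULL;
    if (FAILED(DwmRegisterThumbnail(hwndTarget, hwndSource, &thumb)))
        return NULL;

    DWM_THUMBNAIL_PROPERTIES props = {0};
    props.dwFlags       = DWM_TNP_RECTDESTINATION | DWM_TNP_VISIBLE;
    props.rcDestination = dest;     // where to draw the preview inside hwndTarget
    props.fVisible      = TRUE;

    DwmUpdateThumbnailProperties(thumb, &props);
    return thumb;   // call DwmUnregisterThumbnail(thumb) when done
}
```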
Any other suggestions or technologies?
thank you