How do you set a live video feed as an OpenGL RenderTarget or framebuffer?

Posted by Joe on Stack Overflow
Published on 2010-05-28T18:11:32Z, indexed on 2010/05/28 18:11 UTC

Filed under: opengl | opengl-es
I would like to take a live video feed from one or two video cameras, do split-screen compositing, and render on top of them. How can I capture the video input?

I found some old code that uses pbuffers. Is this still the optimal way of doing it?

I guess a lot depends on the connection interface, whether it is USB, FireWire, or something else?
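The usual pattern is to capture frames with a video API and stream each frame into an OpenGL texture, then draw textured quads. A minimal sketch, assuming OpenCV's `cv::VideoCapture` for capture (it abstracts over USB/FireWire backends); the function and variable names here are illustrative, and a GL context plus a texture allocated with `glTexImage2D` are assumed to exist already:

```cpp
#include <opencv2/opencv.hpp>  // cv::VideoCapture hides the device backend
#include <GL/gl.h>

// Grab the next camera frame and stream it into an existing OpenGL
// texture. Assumes a GL context is current and `tex` was allocated
// earlier with glTexImage2D at the camera's resolution.
void uploadCameraFrame(cv::VideoCapture& cap, GLuint tex) {
    cv::Mat frame;
    if (!cap.read(frame))                    // false if no frame is available
        return;
    cv::cvtColor(frame, frame, cv::COLOR_BGR2RGB);  // OpenCV delivers BGR
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // camera rows are tightly packed
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                    frame.cols, frame.rows,
                    GL_RGB, GL_UNSIGNED_BYTE, frame.data);
}
```

For split screen, give each camera its own texture and draw each on its own quad, then render your overlay geometry on top. On the pbuffer question: pbuffers are a legacy render-to-texture mechanism and have been superseded by framebuffer objects (FBOs); for faster uploads of the camera data itself, a pixel buffer object (PBO) lets the texture transfer happen asynchronously.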

Thanks!
