AVFoundation: Video to OpenGL texture working - How to play and sync audio?

Posted by j00hi on Stack Overflow
Published on 2011-08-27T10:55:02Z

I've managed to load the video track of a movie frame by frame into an OpenGL texture with AVFoundation. I followed the steps described in the answer here: iOS4: how do I use video file as an OpenGL texture?, and took some code from the GLVideoFrame sample from WWDC 2010, which can be downloaded here: http://bit.ly/cEf0rM

How do I play the audio track of the movie in sync with the video? I don't think it would be a good idea to play it in a separate player; I'd rather use the audio track of the same AVAsset.

AVAssetTrack* audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
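One way to get at the audio samples (a sketch, not from the original post; the output settings below are assumptions to be adjusted to the source track) is to attach a second AVAssetReaderTrackOutput for the audio track to the same AVAssetReader that already delivers the video frames, asking it to decode into linear PCM:

```objc
// Sketch: decompress the audio track to 16-bit interleaved linear PCM.
// "assetReader" is assumed to be the AVAssetReader already used for video.
NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    [NSNumber numberWithInt:16],  AVLinearPCMBitDepthKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
    nil];

AVAssetReaderTrackOutput *audioReaderOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack
                                               outputSettings:audioSettings];
if ([assetReader canAddOutput:audioReaderOutput])
    [assetReader addOutput:audioReaderOutput];
```

With this in place, `-copyNextSampleBuffer` on the audio output yields CMSampleBufferRefs containing raw PCM, just as the video output yields pixel buffers.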

In the CADisplayLink callback I retrieve a video frame and its timestamp via

CMSampleBufferRef sampleBuffer = [self.readerOutput copyNextSampleBuffer];
CMTime timestamp = CMSampleBufferGetPresentationTimeStamp( sampleBuffer );

where readerOutput is of type AVAssetReaderTrackOutput*.

How do I get the corresponding audio samples? And how do I play them?
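As a sketch of one possible answer to the first question (assuming a second AVAssetReaderTrackOutput, here given the hypothetical name audioReaderOutput, has been added to the same reader for the audio track): pull audio sample buffers in the same loop and compare their presentation timestamps against the video timestamp, so audio is only handed to the playback path in step with the video:

```objc
// Sketch: keep audio in step with video by comparing timestamps.
// "audioReaderOutput" is a hypothetical AVAssetReaderTrackOutput for the
// audio track; "timestamp" is the current video frame's presentation time.
CMSampleBufferRef audioSampleBuffer = [audioReaderOutput copyNextSampleBuffer];
if (audioSampleBuffer) {
    CMTime audioTime = CMSampleBufferGetPresentationTimeStamp(audioSampleBuffer);
    if (CMTIME_COMPARE_INLINE(audioTime, <=, timestamp)) {
        // Copy these samples into the playback path (e.g. an AudioQueue buffer).
    }
    CFRelease(audioSampleBuffer);
}
```

In practice the audio would be enqueued slightly ahead of the video rather than exactly at it, since audio hardware needs buffered data to play gap-free.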


Edit:

I've looked around a bit, and I think the best option would be to use an AudioQueue from the AudioToolbox framework, following the approach described here: AVAssetReader and Audio Queue streaming problem
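A rough sketch of that approach (names are illustrative and error handling is omitted): read the AudioStreamBasicDescription out of the first decoded audio sample buffer and create an output queue matching it:

```objc
// Sketch: create an AudioQueue matching the decoded PCM format.
// "audioSampleBuffer" is assumed to be the first CMSampleBufferRef copied
// from the audio track's AVAssetReaderTrackOutput;
// "MyAudioQueueOutputCallback" is a hypothetical callback that refills buffers.
CMAudioFormatDescriptionRef format = (CMAudioFormatDescriptionRef)
    CMSampleBufferGetFormatDescription(audioSampleBuffer);
const AudioStreamBasicDescription *asbd =
    CMAudioFormatDescriptionGetStreamBasicDescription(format);

AudioQueueRef audioQueue;
AudioQueueNewOutput(asbd, MyAudioQueueOutputCallback, NULL,
                    NULL, NULL, 0, &audioQueue);
AudioQueueStart(audioQueue, NULL);
```

The queue then pulls data through its callback, which fits the reader-driven model better than AVAudioPlayer's whole-file API.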

There is also an audio player in AVFoundation: AVAudioPlayer. But I don't know exactly how I should pass data to its initWithData: initializer, which expects an NSData object. Furthermore, I don't think it's the best choice for my case, because, as I understand it, a new AVAudioPlayer instance would have to be created for every new chunk of audio samples.

Any other suggestions? What's the best way to play the raw audio samples that I get from the AVAssetReaderTrackOutput?
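To answer that last question in sketch form (assuming an AudioQueueRef, here called audioQueue, has already been created for the track's PCM format): the raw bytes can be copied straight out of the CMSampleBufferRef's block buffer into an AudioQueue buffer and enqueued:

```objc
// Sketch: move PCM bytes from a CMSampleBufferRef into an AudioQueue.
// "audioQueue" is a hypothetical, already-created AudioQueueRef;
// "audioSampleBuffer" came from the audio track's reader output.
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(audioSampleBuffer);
size_t length = CMBlockBufferGetDataLength(blockBuffer);

AudioQueueBufferRef aqBuffer;
AudioQueueAllocateBuffer(audioQueue, (UInt32)length, &aqBuffer);
CMBlockBufferCopyDataBytes(blockBuffer, 0, length, aqBuffer->mAudioData);
aqBuffer->mAudioDataByteSize = (UInt32)length;

// No packet descriptions are needed for constant-bitrate linear PCM.
AudioQueueEnqueueBuffer(audioQueue, aqBuffer, 0, NULL);
```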

