Search Results

Search found 41 results on 2 pages for 'avfoundation'.


  • AVFoundation buffer comparison to a saved image

    - by user577552
    Hi, I am a long time reader, first time poster on StackOverflow, and must say it has been a great source of knowledge for me. I am trying to get to know the AVFoundation framework. What I want to do is save what the camera sees and then detect when something changes. Here is the part where I save the image to a UIImage : if (shouldSetBackgroundImage) { CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); // Create a bitmap graphics context with the sample buffer data CGContextRef context = CGBitmapContextCreate(rowBase, bufferWidth, bufferHeight, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); // Create a Quartz image from the pixel data in the bitmap graphics context CGImageRef quartzImage = CGBitmapContextCreateImage(context); // Free up the context and color space CGContextRelease(context); CGColorSpaceRelease(colorSpace); // Create an image object from the Quartz image UIImage * image = [UIImage imageWithCGImage:quartzImage]; [self setBackgroundImage:image]; NSLog(@"reference image actually set"); // Release the Quartz image CGImageRelease(quartzImage); //Signal that the image has been saved shouldSetBackgroundImage = NO; } and here is the part where I check if there is any change in the image seen by the camera : else { CGImageRef cgImage = [backgroundImage CGImage]; CGDataProviderRef provider = CGImageGetDataProvider(cgImage); CFDataRef bitmapData = CGDataProviderCopyData(provider); char* data = CFDataGetBytePtr(bitmapData); if (data != NULL) { int64_t numDiffer = 0, pixelCount = 0; NSMutableArray * pointsMutable = [NSMutableArray array]; for( int row = 0; row < bufferHeight; row += 8 ) { for( int column = 0; column < bufferWidth; column += 8 ) { //we get one pixel from each source (buffer and saved image) unsigned char *pixel = rowBase + (row * bytesPerRow) + (column * BYTES_PER_PIXEL); unsigned char *referencePixel = data + (row * bytesPerRow) + (column * BYTES_PER_PIXEL); pixelCount++; if ( !match(pixel, referencePixel, matchThreshold) ) { numDiffer++; [pointsMutable addObject:[NSValue valueWithCGPoint:CGPointMake(SCREEN_WIDTH - (column/ (float) bufferHeight)* SCREEN_WIDTH - 4.0, (row/ (float) bufferWidth)* SCREEN_HEIGHT- 4.0)]]; } } } numberOfPixelsThatDiffer = numDiffer; points = [pointsMutable copy]; } For some reason, this doesn't work, meaning that the iPhone detects almost everything as being different from the saved image, even though I set a very low threshold for detection in the match function... Do you have any idea of what I am doing wrong?
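
    A possible direction, sketched under the assumption that the mismatch comes from row strides rather than from the match function: the CGImage backing the saved UIImage is not guaranteed to use the same bytesPerRow as the live camera buffer, so indexing both with the camera's stride can compare misaligned pixels. A minimal, untested sketch that reads the reference image's own stride (referenceBytesPerRow is a name introduced here; everything else is from the code above):

        // Use each buffer's own row stride when indexing into it
        size_t referenceBytesPerRow = CGImageGetBytesPerRow(cgImage);
        for (int row = 0; row < bufferHeight; row += 8) {
            for (int column = 0; column < bufferWidth; column += 8) {
                unsigned char *pixel = rowBase + (row * bytesPerRow) + (column * BYTES_PER_PIXEL);
                unsigned char *referencePixel = (unsigned char *)data
                    + (row * referenceBytesPerRow) + (column * BYTES_PER_PIXEL);
                // ... compare pixel and referencePixel as before ...
            }
        }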

    Read the article

  • Missing AVFoundation.framework

    - by Alex
    Hi, AVFoundation.framework is not where the documentation says it should be. I have iPhone SDK 2.2 installed (I never had previous SDK versions installed) and I can't find that folder under /System/Library/Frameworks. I did find it under the /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS2.2.sdk/System/Library/Frameworks/ folder, but if I add it from that location, the compiler can't find the header files. I tried copying the entire AVFoundation.framework folder to /System/Library/Frameworks, but it still can't find the header files. How can I use the AVFoundation classes? Thanks, Alex

    Read the article

  • AVFoundation: Video to OpenGL texture working - How to play and sync audio?

    - by j00hi
    I've managed to load the video track of a movie frame by frame into an OpenGL texture with AVFoundation. I followed the steps described in the answer here: iOS4: how do I use video file as an OpenGL texture? and took some code from the GLVideoFrame sample from WWDC2010, which can be downloaded here: http://bit.ly/cEf0rM How do I play the audio track of the movie synchronously with the video? I think it would not be a good idea to play it in a separate player, but to use the audio track of the same AVAsset. AVAssetTrack* audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]; I retrieve a video frame and its timestamp in the CADisplayLink callback via CMSampleBufferRef sampleBuffer = [self.readerOutput copyNextSampleBuffer]; CMTime timestamp = CMSampleBufferGetPresentationTimeStamp( sampleBuffer ); where readerOutput is of type AVAssetReaderTrackOutput*. How do I get the corresponding audio samples? And how do I play them? Edit: I've looked around a bit and I think the best approach would be to use AudioQueue from AudioToolbox.framework, using the approach described here: AVAssetReader and Audio Queue streaming problem. There is also an audio player in AVFoundation: AVAudioPlayer. But I don't know exactly how I should pass data to its initWithData: initializer, which expects NSData. Furthermore, I don't think it's the best choice for my case, because a new AVAudioPlayer instance would have to be created for every new chunk of audio samples, as I understand it. Any other suggestions? What's the best way to play the raw audio samples I get from the AVAssetReaderTrackOutput?
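
    One possible direction, assuming assetReader is the AVAssetReader that already owns readerOutput (a name introduced here for illustration): attach a second track output for the audio track to the same reader, pull PCM sample buffers next to the video frames, and feed them to an Audio Queue, using each buffer's presentation timestamp to stay in sync. A rough, untested sketch:

        // Add an audio output to the same reader (must happen before startReading)
        NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey, nil];
        AVAssetReaderTrackOutput *audioOutput =
            [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack
                                                       outputSettings:audioSettings];
        if ([assetReader canAddOutput:audioOutput])
            [assetReader addOutput:audioOutput];

        // In the CADisplayLink callback, next to the video frame:
        CMSampleBufferRef audioBuffer = [audioOutput copyNextSampleBuffer];
        if (audioBuffer) {
            CMTime audioTime = CMSampleBufferGetPresentationTimeStamp(audioBuffer);
            // enqueue the PCM data into an AudioQueue buffer here, scheduled at audioTime
            CFRelease(audioBuffer);
        }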

    Read the article

  • Clever recording using AVFoundation

    - by martin
    Hello, I am working on my master's thesis and I am writing an app for iOS using the AVFoundation framework. I can set up a session myself, attach devices to it, and record video with sound. The main problem is that I need continuous recording (3 hours or longer). After three hours the user will stop recording and choose a time, e.g. 15 minutes (max 30 minutes), and only the last 15 minutes will be stored in the iPhone's memory. Is it possible to 'cut' the video while recording, or should I record it in, e.g., 10-minute blocks and then delete the old video segments and join the last two segments into one larger one? Will performing these operations (stopping recording, starting a new recording, and then joining the two segments) cause lag in the final long video segment? Is there any way to perform this 'clever' recording? Thank you for any ideas.

    Read the article

  • displaying music current time & duration from AVFoundation

    - by msb
    I have a view-based iPhone application that has a single play/pause button that plays an mp3 file. At the time of invoking my doPlayPauseButton: method, I'd like to show the current time and total duration of this mp3 through the AVAudioPlayer instance I've created, called myAudioPlayer. I have placed two labels in the UI and I'm trying to assign the currentTime and duration properties to these labels when playback begins, but my attempts have failed. Here's my play/pause method; any help would be appreciated: -(IBAction) doPlayPauseButton:(UIButton *)theButton { if(myAudioPlayer.playing) { [myActivityIndicatorView stopAnimating]; myActivityIndicatorView.hidden = YES; //I think I need a myAudioPlayer.currentTime call of some sort for my labels. [myAudioPlayer pause]; [theButton setTitle:@"Play" forState:UIControlStateNormal]; [myTimer invalidate]; } else { [myActivityIndicatorView startAnimating]; myActivityIndicatorView.hidden = NO; myTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(doTimer) userInfo:nil repeats:YES]; [myAudioPlayer play]; [theButton setTitle:@"Pause" forState:UIControlStateNormal];
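
    For the label part, AVAudioPlayer already exposes currentTime and duration (both NSTimeInterval), so the timer callback can format them directly. A small sketch, assuming two UILabel outlets named currentTimeLabel and durationLabel (names introduced here):

        - (void) doTimer {
            int current = (int)myAudioPlayer.currentTime;
            int total   = (int)myAudioPlayer.duration;
            currentTimeLabel.text = [NSString stringWithFormat:@"%d:%02d", current / 60, current % 60];
            durationLabel.text    = [NSString stringWithFormat:@"%d:%02d", total / 60, total % 60];
        }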

    Read the article

  • How to play audio file ios

    - by Camus
    I am trying to play an audio file but I can't get it working. I imported the AVFoundation framework. Here is the code: NSString *fileName = [[NSBundle mainBundle] pathForResource:@"Alarm" ofType:@"caf"]; NSURL *url = [[NSURL alloc] initFileURLWithPath:fileName]; NSLog(@"Test: %@ ", url); AVAudioPlayer *audioFile = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL]; audioFile.delegate = self; audioFile.volume = 1; [audioFile play]; I am receiving a "nil string parameter" error. I copied the file to the Supporting Files folder, so the file is there. Can you guys help me? Thanks
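
    The "nil string parameter" message usually means pathForResource:ofType: returned nil, i.e. Alarm.caf is not actually in the built bundle (wrong name, extension, or target membership). A guard like the following, a minimal sketch of the same code, makes the failure visible:

        NSString *fileName = [[NSBundle mainBundle] pathForResource:@"Alarm" ofType:@"caf"];
        if (fileName == nil) {
            NSLog(@"Alarm.caf is not in the app bundle; check spelling and target membership");
            return;
        }
        NSError *error = nil;
        NSURL *url = [NSURL fileURLWithPath:fileName];
        AVAudioPlayer *audioFile = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        if (audioFile == nil) NSLog(@"Player init failed: %@", error);
        [audioFile play];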

    Read the article

  • Problems playing multiple sounds using AVPlayer (NOT AVAudioPlayer)

    - by myetter37
    I'm trying to play a background song in my iPhone game and also have sound effects, using the AVFoundation framework and AVPlayerItem. I've scoured the Internet for help with AVPlayerItem and AVPlayer, but I'm only finding material about AVAudioPlayer. The background song plays fine, but when the character jumps, I have 2 problems: 1) On the initial jump ([player play] inside the jump method), the jump sound effect interrupts the background music. 2) If I try to jump again, the game crashes with the error "AVPlayerItem cannot be associated with more than one instance of AVPlayer". My professor told me to create a new instance of AVPlayer for each sound I want to play, so I'm confused. I'm doing data-driven design, so my sound files are listed in a .txt and then loaded into an NSDictionary. Here's my code: - (void) storeSoundNamed:(NSString *) soundName withFileName:(NSString *) soundFileName { NSURL *assetURL = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:soundName ofType:soundFileName]]; AVURLAsset *mAsset = [[AVURLAsset alloc] initWithURL:assetURL options:nil]; AVPlayerItem *mPlayerItem = [AVPlayerItem playerItemWithAsset:mAsset]; [soundDictionary setObject:mPlayerItem forKey:soundName]; NSLog(@"Sound added."); } - (void) playSound:(NSString *) soundName { // from .h: @property AVPlayer *mPlayer; // from .m: @synthesize mPlayer = _mPlayer; _mPlayer = [[AVPlayer alloc] initWithPlayerItem:[soundDictionary valueForKey:soundName]]; [_mPlayer play]; NSLog(@"Playing sound."); } If I move this line from the second method into the first: _mPlayer = [[AVPlayer alloc] initWithPlayerItem:[soundDictionary valueForKey:soundName]]; the game does not crash, and the background song will play perfectly, but the jump sound effect does not play, even though the console shows "Playing sound." on each jump. Thank you
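
    Since an AVPlayerItem can belong to only one AVPlayer, one workable layout (an untested sketch, with soundEffectPlayers as a hypothetical NSMutableArray that keeps the players alive) is to cache the reusable AVURLAsset and build a fresh item and player per play:

        - (void) storeSoundNamed:(NSString *)soundName withFileName:(NSString *)soundFileName {
            NSURL *assetURL = [NSURL fileURLWithPath:
                [[NSBundle mainBundle] pathForResource:soundName ofType:soundFileName]];
            AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
            [soundDictionary setObject:asset forKey:soundName];   // cache the asset, not the item
        }

        - (void) playSound:(NSString *)soundName {
            AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:
                [soundDictionary objectForKey:soundName]];
            AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:item]; // new player per sound
            [soundEffectPlayers addObject:player];
            [player release];
            [player play];
        }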

    Read the article

  • how to play an audio soundclip when a nib is loaded - welcome screen - xcode

    - by Pavan
    I would like to do the following two things: 1) play an audio file when a nib is loaded, and 2) switch views when the audio file has finished playing. The second part will be easy, as I just need to call the event that initiates the change of view from the delegate method -(void) audioPlayerDidFinishPlaying{ //code to change view } I don't know how to play the audio file when a nib is loaded. Using the AVFoundation framework, after setting up the audio player and the variables associated with it in the appropriate places, I tried the following: - (void)viewDidLoad { [super viewDidLoad]; NSString *soundFilePath = [[NSBundle mainBundle] pathForResource: @"sound" ofType: @"mp3"]; NSURL *fileURL = [[NSURL alloc] initFileURLWithPath: soundFilePath]; AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL: fileURL error:nil]; [fileURL release]; self.player = newPlayer; [newPlayer release]; [player prepareToPlay]; [player setDelegate: self]; [player play]; } This does not play the file, because the viewDidLoad method gets called before the nib is shown, so the audio is never played or heard. What do I need to do so that I can play the audio file AFTER the nib has loaded and is shown on the screen? Can someone please help me? I've been working on this for 3 hours now. Thanks in advance.
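
    One small sketch of the timing fix: start playback in viewDidAppear:, which runs after the nib's view is actually on screen, and keep the setup from viewDidLoad as it is:

        - (void)viewDidAppear:(BOOL)animated {
            [super viewDidAppear:animated];
            [player play];   // player was created and prepared in viewDidLoad
        }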

    Read the article

  • Caching with AVPlayer and AVAssetExportSession

    - by tba
    I would like to cache progressive-download videos using AVPlayer. How can I save an AVPlayer's item to disk? I'm trying to use AVAssetExportSession on the player's currentItem (which is fully loaded). This code is giving me "AVAssetExportSessionStatusFailed (The operation could not be completed)" : AVAsset *mediaAsset = self.player.currentItem.asset; AVAssetExportSession *es = [[AVAssetExportSession alloc] initWithAsset:mediaAsset presetName:AVAssetExportPresetLowQuality]; NSString *outPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"out.mp4"]; NSFileManager *fileManager = [NSFileManager defaultManager]; [fileManager removeItemAtPath:outPath error:NULL]; es.outputFileType = @"com.apple.quicktime-movie"; es.outputURL = [[[NSURL alloc] initFileURLWithPath:outPath] autorelease]; NSLog(@"exporting to %@",outPath); [es exportAsynchronouslyWithCompletionHandler:^{ NSString *status = @""; if( es.status == AVAssetExportSessionStatusUnknown ) status = @"AVAssetExportSessionStatusUnknown"; else if( es.status == AVAssetExportSessionStatusWaiting ) status = @"AVAssetExportSessionStatusWaiting"; else if( es.status == AVAssetExportSessionStatusExporting ) status = @"AVAssetExportSessionStatusExporting"; else if( es.status == AVAssetExportSessionStatusCompleted ) status = @"AVAssetExportSessionStatusCompleted"; else if( es.status == AVAssetExportSessionStatusFailed ) status = @"AVAssetExportSessionStatusFailed"; else if( es.status == AVAssetExportSessionStatusCancelled ) status = @"AVAssetExportSessionStatusCancelled"; NSLog(@"done exporting to %@ status %d = %@ (%@)",outPath,es.status, status,[[es error] localizedDescription]); }]; How can I export successfully? I'm looking into copying mediaAsset into an AVMutableComposition, but haven't had much luck with that either. Thanks! PS: Here are some questions from people trying to accomplish the same thing (but with MPMoviePlayerController): Cache Progressive downloaded content in MPMoviePlayerController Simultaneously stream and save a video? Caching videos to disk after successful preload by MPMoviePlayerController
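
    Before digging deeper, it may be worth checking what this particular asset and session actually support; a preset or output type that is not listed fails in exactly this way, and the .mp4 extension does not match the QuickTime movie type set above. A small diagnostic sketch:

        NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:mediaAsset];
        NSLog(@"compatible presets: %@", presets);
        NSLog(@"supported file types: %@", es.supportedFileTypes);
        // If exporting to QuickTime, keep the container and extension consistent, e.g.
        // es.outputFileType = AVFileTypeQuickTimeMovie with an "out.mov" output path.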

    Read the article

  • How to preserve the aspect ratio of video using AVAssetWriter

    - by Satoshi Nakajima
    I have the following code, which captures video from the camera and stores it as a QuickTime movie file using AVAssetWriter. It works fine, but the aspect ratio is not perfect because the width and height are hardcoded (480 x 320) in the outputSettings for AVAssetWriterInput. I'd rather find out the aspect ratio of the source video and specify the appropriate height (480 x aspect ratio). Does anybody know how to do it? Should I defer the creation of the AVAssetWriterInput until the first sample buffer? // set the sessionPreset to 'medium' self.captureSession = [[AVCaptureSession alloc] init]; self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; ... // create AVCaptureVideoDataOutput self.captureVideo = [[AVCaptureVideoDataOutput alloc] init]; NSString* formatTypeKey = (NSString*)kCVPixelBufferPixelFormatTypeKey; self.captureVideo.videoSettings = @{ formatTypeKey:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] }; [self.captureVideo setSampleBufferDelegate:self queue:dispatch_get_main_queue()]; // create an AVAssetWriter NSError* error = nil; self.videoWriter = [[AVAssetWriter alloc] initWithURL:url fileType:AVFileTypeQuickTimeMovie error:&error]; ... // create AVAssetWriterInput with specified settings NSDictionary* compression = @{ AVVideoAverageBitRateKey:[NSNumber numberWithInt:960000], AVVideoMaxKeyFrameIntervalKey:[NSNumber numberWithInt:1] }; self.videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:@{ AVVideoCodecKey:AVVideoCodecH264, AVVideoCompressionPropertiesKey:compression, AVVideoWidthKey:[NSNumber numberWithInt:480], // required AVVideoHeightKey:[NSNumber numberWithInt:320] // required }]; // add it to the AVAssetWriter [self.videoWriter addInput:self.videoInput];
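
    Deferring the writer input until the first sample buffer arrives is one reasonable way to do this (a rough, untested sketch): read the source dimensions from the pixel buffer in the capture callback and derive the output height from them.

        - (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection {
            if (self.videoInput == nil) {
                CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                size_t srcWidth  = CVPixelBufferGetWidth(pixelBuffer);
                size_t srcHeight = CVPixelBufferGetHeight(pixelBuffer);
                int outWidth  = 480;
                int outHeight = (int)((480.0 * srcHeight) / srcWidth + 0.5); // keep source aspect
                self.videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                    outputSettings:@{ AVVideoCodecKey:AVVideoCodecH264,
                                      AVVideoWidthKey:[NSNumber numberWithInt:outWidth],
                                      AVVideoHeightKey:[NSNumber numberWithInt:outHeight] }];
                [self.videoWriter addInput:self.videoInput];
                [self.videoWriter startWriting];
                [self.videoWriter startSessionAtSourceTime:
                    CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            }
            // ... append the sample buffer to the input as before ...
        }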

    Read the article

  • AVURLAsset cannot load with remote file

    - by Quentin
    Hi, I have a problem using AVURLAsset. NSString * const kContentURL = @ "http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"; ... NSURL *contentURL = [NSURL URLWithString:kContentURL]; AVURLAsset *asset = [AVURLAsset URLAssetWithURL:contentURL options:nil]; [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{ ... NSError *error = nil; AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error]; ... } In the completion block, the status is AVKeyValueStatusFailed and the error message is "Cannot Open". All examples I have seen use a local file, so maybe there is a problem with using a remote file... Regards, Quentin
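
    One thing that may explain it: an .m3u8 playlist is an HTTP Live Streaming resource, not a file-backed movie, so asking AVURLAsset for its tracks can fail even though the stream itself is fine. A hedged sketch of the simpler path, handing the URL straight to a player:

        NSURL *contentURL = [NSURL URLWithString:kContentURL];
        AVPlayer *player = [AVPlayer playerWithURL:contentURL];
        // Observe player.currentItem.status before relying on playback
        [player play];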

    Read the article

  • Getting AveragePower and PeakPower for a Channel in AVAudioRecorder

    - by Biranchi
    Hi all, I am annoyed with this piece of code. I am trying to get averagePowerForChannel and peakPowerForChannel while recording audio, but every time I get 0.0. Below is my code for recording audio: NSMutableDictionary *recordSetting =[[NSDictionary alloc] initWithObjectsAndKeys:[NSNumber numberWithFloat: 22050.0], AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatLinearPCM], AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey, [NSNumber numberWithInt:32],AVLinearPCMBitDepthKey, [NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey, [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey, nil]; recorder1 = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:audioFilePath] settings:recordSetting error:&err]; recorder1.meteringEnabled = YES; recorder1.delegate=self; [recorder1 prepareToRecord]; [recorder1 record]; levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.3f target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES]; - (void)levelTimerCallback:(NSTimer *)timer { [recorder1 updateMeters]; NSLog(@"Peak Power : %f , %f", [recorder1 peakPowerForChannel:0], [recorder1 peakPowerForChannel:1]); NSLog(@"Average Power : %f , %f", [recorder1 averagePowerForChannel:0], [recorder1 averagePowerForChannel:1]); } What is the error in the code?

    Read the article

  • AVAudioRecorder Won't Record On Device

    - by Dyldo42
    This is my method: -(void) playOrRecord:(UIButton *)sender { if (playBool == YES) { NSError *error = nil; NSString *filePath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat: @"%d", [sender tag]] ofType:@"caf"]; NSURL *fileUrl = [NSURL fileURLWithPath:filePath]; AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:&error]; [player setNumberOfLoops:0]; [player play]; } else if (playBool == NO) { if ([recorder isRecording]) { [recorder stop]; [nowRecording setImage:[UIImage imageNamed:@"NormalNormal.png"] forState:UIControlStateNormal]; [nowRecording setImage:[UIImage imageNamed:@"NormalSelected.png"] forState:UIControlStateSelected]; } if (nowRecording == sender) { nowRecording = nil; return; } nowRecording = sender; NSError *error = nil; NSString *filePath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat: @"%d", [sender tag]] ofType:@"caf"]; NSURL *fileUrl = [NSURL fileURLWithPath:filePath]; [sender setImage:[UIImage imageNamed:@"RecordingNormal.png"] forState:UIControlStateNormal]; [sender setImage:[UIImage imageNamed:@"RecordingSelected.png"] forState:UIControlStateSelected]; recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSettings error:&error]; [recorder record]; } } Most of it is self-explanatory; playBool is a BOOL that is YES when it is in play mode. Everything works in the simulator; however, when I run it on a device, [recorder record] returns NO. Does anyone have a clue as to why this is happening?
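
    One likely cause (a hedged guess, but it matches the simulator/device split): the app bundle is read-only on a real device, so recording to a pathForResource: URL fails there even though the simulator tolerates it. A sketch that records into the Documents directory instead:

        NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                 NSUserDomainMask, YES) objectAtIndex:0];
        NSString *recordPath = [docsDir stringByAppendingPathComponent:
                                 [NSString stringWithFormat:@"%d.caf", [sender tag]]];
        NSError *error = nil;
        recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:recordPath]
                                               settings:recordSettings
                                                  error:&error];
        if (![recorder record]) NSLog(@"record failed: %@", error);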

    Read the article

  • Are these AVAudioSettings right?

    - by Dyldo42
    self.recordSettings = [NSMutableDictionary dictionary]; [recordSettings setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey]; [recordSettings setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey]; [recordSettings setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey]; recordSettings is declared in my view's header file and initialised in its viewDidLoad method. For some reason, everything is working in the simulator, but on a device, the [recorder record] method is returning NO. The only theory I have is that something in the recordSettings isn't compatible with an actual device. Any ideas?

    Read the article

  • AVPlayer seeking to a different point after app resume

    - by CGuess
    I have an AVPlayer; the video in it is ~2 seconds long. After the video plays, if the app goes to the background and re-enters the foreground, I need the video to still be shown exactly as it was when the app was exited. The AVPlayer sticks around just fine; however, when I re-enter the app from the background, the video appears to be seeked to the middle of the video. However, if I just play the video, it starts from the beginning, so it doesn't seem like it actually seeked; it is just showing a preview image. I've tried to auto-seek the video to the end on relaunch, but nothing happens. Nothing I can figure out or find in the docs describes this behavior. Any tips on having the video launch either at the beginning or the end?
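
    If the goal is to land exactly on the last frame after the app comes back, seeking with zero tolerance on foregrounding is worth a try (an untested sketch; the notification handler name is just an example):

        - (void)applicationWillEnterForeground:(NSNotification *)note {
            CMTime end = self.player.currentItem.asset.duration;
            [self.player seekToTime:end
                    toleranceBefore:kCMTimeZero
                     toleranceAfter:kCMTimeZero];
        }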

    Read the article

  • iPhone: How many instances of AVAudioPlayer should I have for multiple sounds?

    - by foreyez
    So I'm using AVAudioPlayer to play multiple wav files: about 20 different sounds (each about 1 sec long), and you can think of each being played on a button press. I don't need them all to play simultaneously, i.e., one plays and you press another button to play another one (which stops the currently playing one). What I'm wondering is: should I have multiple instances of AVAudioPlayer (20 of them) and preload the audio files, or should I just use one instance of AVAudioPlayer and, each time a button is pressed, initialize the AVAudioPlayer with the sound URL (or would this be too slow?) Thanks in advance!
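
    Twenty one-second files are cheap to keep around, so pre-creating one prepared player per file and reusing them is a reasonable layout. A sketch, assuming a hypothetical players dictionary and soundFileNames array:

        NSMutableDictionary *players = [[NSMutableDictionary alloc] init];
        for (NSString *name in soundFileNames) {
            NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"wav"];
            AVAudioPlayer *p = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
            [p prepareToPlay];
            [players setObject:p forKey:name];
            [p release];
        }

        // On a button press: stop whatever is playing, rewind the next sound, and start it
        [currentPlayer stop];
        AVAudioPlayer *next = [players objectForKey:tappedName];
        next.currentTime = 0;
        [next play];
        currentPlayer = next;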

    Read the article

  • IPhone avcomposition issue

    - by user346443
    Hi, I'm trying to create a video that shows two videos one after the other, using AVComposition on the iPhone. This code works; however, I can only see one of the videos for the entire duration of the newly created video. - (void) startEdit{ AVMutableComposition* mixComposition = [AVMutableComposition composition]; NSString* a_inputFileName = @"export.mov"; NSString* a_inputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:a_inputFileName]; NSURL* a_inputFileUrl = [NSURL fileURLWithPath:a_inputFilePath]; NSString* b_inputFileName = @"output.mov"; NSString* b_inputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:b_inputFileName]; NSURL* b_inputFileUrl = [NSURL fileURLWithPath:b_inputFilePath]; NSString* outputFileName = @"outputFile.mov"; NSString* outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:outputFileName]; NSURL* outputFileUrl = [NSURL fileURLWithPath:outputFilePath]; if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil]; CMTime nextClipStartTime = kCMTimeZero; AVURLAsset* a_videoAsset = [[AVURLAsset alloc]initWithURL:a_inputFileUrl options:nil]; CMTimeRange a_timeRange = CMTimeRangeMake(kCMTimeZero,a_videoAsset.duration); AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; [a_compositionVideoTrack insertTimeRange:a_timeRange ofTrack:[[a_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil]; nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration); AVURLAsset* b_videoAsset = [[AVURLAsset alloc]initWithURL:b_inputFileUrl options:nil]; CMTimeRange b_timeRange = CMTimeRangeMake(kCMTimeZero, b_videoAsset.duration); AVMutableCompositionTrack *b_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; [b_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil]; AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality]; _assetExport.outputFileType = @"com.apple.quicktime-movie"; _assetExport.outputURL = outputFileUrl; [_assetExport exportAsynchronouslyWithCompletionHandler: ^(void ) { [self saveVideoToAlbum:outputFilePath]; } ]; } - (void) saveVideoToAlbum:(NSString*)path{ if(UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)){ UISaveVideoAtPathToSavedPhotosAlbum (path, self, @selector(video:didFinishSavingWithError: contextInfo:), nil); } } - (void) video: (NSString *) videoPath didFinishSavingWithError: (NSError *) error contextInfo: (void *) contextInfo { NSLog(@"Finished saving video with error: %@", error); } I've posted the whole code as it may help someone else. Shouldn't nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration); [b_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil]; add the second video to the end of the first? Cheers
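
    For what it's worth, one variant that may behave better on export (an untested sketch): insert both clips into a single mutable video track instead of two parallel tracks, so the second clip follows the first on the same track.

        AVMutableCompositionTrack *videoTrack =
            [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                        preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoTrack insertTimeRange:a_timeRange
                            ofTrack:[[a_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:nil];
        [videoTrack insertTimeRange:b_timeRange
                            ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:a_timeRange.duration
                              error:nil];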

    Read the article

  • avmutablecomposition insertEmptyTimeRange

    - by smartfaceweb
    I have created an AVMutableComposition and tried to use insertEmptyTimeRange: to generate 1 minute of silence. This doesn't appear to be working. I have also tried creating an AVMutableCompositionTrack using addMutableTrackWithMediaType:preferredTrackID: and then calling insertEmptyTimeRange: on the track, still with no success. To give some background on my app, I allow users to add audio samples to a timeline and then play back or export, and this is working really well using the AV classes. The problem is that I need to make sure that the audio is exactly 1 minute long (for example). Regardless of the info about my specific app above, is it possible to insert an empty time range into a composition or composition track?
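
    For reference, insertEmptyTimeRange: does exist on both AVMutableComposition and AVMutableCompositionTrack; a minute of empty time would look roughly like the sketch below. One caveat worth testing: empty ranges at the very end of a track may be dropped on export, in which case padding with a silent audio file is a common fallback.

        AVMutableComposition *comp = [AVMutableComposition composition];
        AVMutableCompositionTrack *track =
            [comp addMutableTrackWithMediaType:AVMediaTypeAudio
                              preferredTrackID:kCMPersistentTrackID_Invalid];
        CMTimeRange oneMinute = CMTimeRangeMake(kCMTimeZero, CMTimeMake(60, 1));
        [track insertEmptyTimeRange:oneMinute];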

    Read the article

  • Audio Framework in iPhone

    - by suse
    There are three major frameworks for iPhone audio: the AVFoundation framework, the CoreAudio framework, and the OpenAL library. In turn, the CoreAudio framework has the AudioToolbox framework and the AudioUnit framework. Is this correct? Suppose I import the AVFoundation framework into my project and it in turn needs a feature which is provided by the CoreAudio framework. Can it internally access the features of CoreAudio without my importing the CoreAudio framework into my project?

    Read the article

  • iphone: Help with AudioToolbox Leak: Stack trace/code included here...

    - by editor guy
    Part of this app is a "Scream" button that plays random screams from cast members of a TV show. I have to bang on the app quite a while to see a memory leak in Instruments, but it's there, occasionally coming up (every 45 seconds to 2 minutes.) The leak is 3.50kb when it occurs. Haven't been able to crack it for several hours. Any help appreciated. Instruments says this is the offending code line: [appSoundPlayer play]; that's linked to from line 9 of the below stack trace: 0 libSystem.B.dylib malloc 1 libSystem.B.dylib pthread_create 2 AudioToolbox CAPThread::Start() 3 AudioToolbox GenericRunLoopThread::Start() 4 AudioToolbox AudioQueueNew(bool, AudioStreamBasicDescription const*, TCACallback const&, CACallbackTarget const&, unsigned long, OpaqueAudioQueue*) 5 AudioToolbox AudioQueueNewOutput 6 AVFoundation allocAudioQueue(AVAudioPlayer, AudioPlayerImpl*) 7 AVFoundation prepareToPlayQueue(AVAudioPlayer*, AudioPlayerImpl*) 8 AVFoundation -[AVAudioPlayer prepareToPlay] 9 Scream Queens -[ScreamViewController scream:] /Users/laptop2/Desktop/ScreamQueens Versions/ScreamQueens25/Scream Queens/Classes/../ScreamViewController.m:210 10 CoreFoundation -[NSObject performSelector:withObject:withObject:] 11 UIKit -[UIApplication sendAction:to:from:forEvent:] 12 UIKit -[UIApplication sendAction:toTarget:fromSender:forEvent:] 13 UIKit -[UIControl sendAction:to:forEvent:] 14 UIKit -[UIControl(Internal) _sendActionsForEvents:withEvent:] 15 UIKit -[UIControl touchesEnded:withEvent:] 16 UIKit -[UIWindow _sendTouchesForEvent:] 17 UIKit -[UIWindow sendEvent:] 18 UIKit -[UIApplication sendEvent:] 19 UIKit _UIApplicationHandleEvent 20 GraphicsServices PurpleEventCallback 21 CoreFoundation CFRunLoopRunSpecific 22 CoreFoundation CFRunLoopRunInMode 23 GraphicsServices GSEventRunModal 24 UIKit -[UIApplication _run] 25 UIKit UIApplicationMain 26 Scream Queens main /Users/laptop2/Desktop/ScreamQueens Versions/ScreamQueens25/Scream Queens/main.m:14 27 Scream Queens start Here's .h: #import <UIKit/UIKit.h> #import <AVFoundation/AVFoundation.h> #import <MediaPlayer/MediaPlayer.h> #import <AudioToolbox/AudioToolbox.h> #import <MessageUI/MessageUI.h> #import <MessageUI/MFMailComposeViewController.h> @interface ScreamViewController : UIViewController <UIApplicationDelegate, AVAudioPlayerDelegate, MFMailComposeViewControllerDelegate> { //AudioPlayer related AVAudioPlayer *appSoundPlayer; NSURL *soundFileURL; BOOL interruptedOnPlayback; BOOL playing; //Scream button related IBOutlet UIButton *screamButton; int currentScreamIndex; NSString *currentScream; NSMutableArray *screams; NSMutableArray *personScreaming; NSMutableArray *photoArray; int currentSayingsIndex; NSString *currentButtonSaying; NSMutableArray *funnyButtonSayings; IBOutlet UILabel *funnyButtonSayingsLabel; IBOutlet UILabel *personScreamingField; IBOutlet UIImageView *personScreamingImage; //Mailing the scream related IBOutlet UILabel *mailStatusMessage; IBOutlet UIButton *shareButton; } //AudioPlayer related @property (nonatomic, retain) AVAudioPlayer *appSoundPlayer; @property (nonatomic, retain) NSURL *soundFileURL; @property (readwrite) BOOL interruptedOnPlayback; @property (readwrite) BOOL playing; //Scream button related @property (nonatomic, retain) UIButton *screamButton; @property (nonatomic, retain) NSMutableArray *screams; @property (nonatomic, retain) NSMutableArray *personScreaming; @property (nonatomic, retain) NSMutableArray *photoArray; @property (nonatomic, retain) UILabel *personScreamingField; @property (nonatomic, retain) UIImageView 
*personScreamingImage; @property (nonatomic, retain) NSMutableArray *funnyButtonSayings; @property (nonatomic, retain) UILabel *funnyButtonSayingsLabel; //Mailing the scream related @property (nonatomic, retain) IBOutlet UILabel *mailStatusMessage; @property (nonatomic, retain) IBOutlet UIButton *shareButton; //Scream Button - (IBAction) scream: (id) sender; //Mail the scream - (IBAction) showPicker: (id)sender; - (void)displayComposerSheet; - (void)launchMailAppOnDevice; @end Here's the top of .m: #import "ScreamViewController.h" //top of code has Audio session callback function for responding to audio route changes (from Apple's code), then my code continues... @implementation ScreamViewController @synthesize appSoundPlayer; // AVAudioPlayer object for playing the selected scream @synthesize soundFileURL; // Path to the scream @synthesize interruptedOnPlayback; // Was application interrupted during audio playback @synthesize playing; // Track playing/not playing state @synthesize screamButton; //Press this button, girls scream. @synthesize screams; //Mutable array holding strings pointing to sound files of screams. @synthesize personScreaming; //Mutable array tracking the person doing the screaming @synthesize photoArray; //Mutable array holding strings pointing to photos of screaming girls @synthesize personScreamingField; //Field updates to announce which girl is screaming. @synthesize personScreamingImage; //Updates to show image of the screamer. @synthesize funnyButtonSayings; //Mutable array holding the sayings @synthesize funnyButtonSayingsLabel; //Label that updates with the funnyButtonSayings @synthesize mailStatusMessage; //did the email go out @synthesize shareButton; //share scream via email Next line begins the block with the offending code: - (IBAction) scream: (id) sender { //Play a click sound effect SystemSoundID soundID; NSString *sfxPath = [[NSBundle mainBundle] pathForResource:@"aClick" ofType:@"caf"]; AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:sfxPath],&soundID); AudioServicesPlaySystemSound (soundID); // Because someone may slam the scream button over and over, //must stop current sound, then begin next if ([self appSoundPlayer] != nil) { [[self appSoundPlayer] setDelegate:nil]; [[self appSoundPlayer] stop]; [self setAppSoundPlayer: nil]; } //after selecting a random index in the array (did that in View Did Load), //we move to the next scream on each click. //First check... //Are we past the end of the array? if (currentScreamIndex == [screams count]) { currentScreamIndex = 0; } //Get the string at the index in the personScreaming array currentScream = [screams objectAtIndex: currentScreamIndex]; //Get the string at the index in the personScreaming array NSString *screamer = [personScreaming objectAtIndex:currentScreamIndex]; //Log the string to the console NSLog (@"playing scream: %@", screamer); // Display the string in the personScreamingField field NSString *listScreamer = [NSString stringWithFormat:@"scream by: %@", screamer]; [personScreamingField setText:listScreamer]; // Gets the file system path to the scream to play. 
NSString *soundFilePath = [[NSBundle mainBundle] pathForResource: currentScream ofType: @"caf"]; // Converts the sound's file path to an NSURL object NSURL *newURL = [[NSURL alloc] initFileURLWithPath: soundFilePath]; self.soundFileURL = newURL; [newURL release]; [[AVAudioSession sharedInstance] setDelegate: self]; [[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: nil]; // Registers the audio route change listener callback function AudioSessionAddPropertyListener ( kAudioSessionProperty_AudioRouteChange, audioRouteChangeListenerCallback, self ); // Activates the audio session. NSError *activationError = nil; [[AVAudioSession sharedInstance] setActive: YES error: &activationError]; // Instantiates the AVAudioPlayer object, initializing it with the sound AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL: soundFileURL error: nil]; //Error check and continue if (newPlayer != nil) { self.appSoundPlayer = newPlayer; [newPlayer release]; [appSoundPlayer prepareToPlay]; [appSoundPlayer setVolume: 1.0]; [appSoundPlayer setDelegate:self]; //NEXT LINE IS FLAGGED BY INSTRUMENTS AS LEAKY [appSoundPlayer play]; playing = YES; //Get the string at the index in the photoArray array NSString *screamerPic = [photoArray objectAtIndex:currentScreamIndex]; //Log the string to the console NSLog (@"displaying photo: %@", screamerPic); // Display the image of the person screaming personScreamingImage.image = [UIImage imageNamed:screamerPic]; //show the share button shareButton.hidden = NO; mailStatusMessage.hidden = NO; mailStatusMessage.text = @"share!"; //Get the string at the index in the funnySayings array currentSayingsIndex = random() % [funnyButtonSayings count]; currentButtonSaying = [funnyButtonSayings objectAtIndex: currentSayingsIndex]; NSString *theSaying = [funnyButtonSayings objectAtIndex:currentSayingsIndex]; [funnyButtonSayingsLabel setText: theSaying]; currentScreamIndex++; } } Here's my dealloc: - (void)dealloc { [appSoundPlayer stop]; [appSoundPlayer release], appSoundPlayer = nil; [screamButton release], screamButton = nil; [mailStatusMessage release], mailStatusMessage = nil; [personScreamingField release], personScreamingField = nil; [personScreamingImage release], personScreamingImage = nil; [funnyButtonSayings release], funnyButtonSayings = nil; [funnyButtonSayingsLabel release], funnyButtonSayingsLabel = nil; [screams release], screams = nil; [personScreaming release], personScreaming = nil; [soundFileURL release]; [super dealloc]; } @end Thanks so much for reading this far! Any input appreciated.
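
    Separate from the allocation Instruments is pointing at, one source of accumulating allocations in -scream: that is straightforward to remove: a new SystemSoundID is created on every press and never disposed. A sketch that creates the click once and disposes it in dealloc (clickSoundID would be a new SystemSoundID ivar introduced here):

        if (clickSoundID == 0) {
            NSString *sfxPath = [[NSBundle mainBundle] pathForResource:@"aClick" ofType:@"caf"];
            AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:sfxPath], &clickSoundID);
        }
        AudioServicesPlaySystemSound(clickSoundID);

        // in dealloc:
        AudioServicesDisposeSystemSoundID(clickSoundID);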

    Read the article

  • Problem with playing a sound when a button is clicked.

    - by iSharreth
    In my code: #import "MyViewController.h" -(IBAction)playSound{ AVAudioPlayer *myExampleSound; NSString *myExamplePath = [[NSBundle mainBundle] pathForResource:@"myaudiofile" ofType:@"caf"]; myExampleSound =[[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:myExamplePath] error:NULL]; myExampleSound.delegate = self; [myExampleSound play]; } But it is showing a warning that class MyViewController does not implement AVAudioPlayerDelegate. Can anyone please help? I have included AVFoundation.framework.
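
    The warning itself goes away once the class declares the protocol; a minimal sketch of the header change (and keeping the player in an ivar so it is not released before it finishes playing):

        @interface MyViewController : UIViewController <AVAudioPlayerDelegate> {
            AVAudioPlayer *myExampleSound;   // ivar instead of a local variable
        }
        @end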

    Read the article

  • Problem with memory leaks

    - by user191723
    Sorry, I'm having difficulty formatting the code so it appears correctly here. I am trying to understand the readings I get from running Instruments on my app, which are telling me I am leaking memory. There are a number, quite a few in fact, that get reported from inside Foundation, AVFoundation, CoreGraphics, etc. that I assume I have no control over and so should ignore, such as: Malloc 32 bytes: 96 bytes, AVFoundation, prepareToRecordQueue or Malloc 128 bytes: 128 bytes, CoreGraphics, open_handle_to_dylib_path. Am I correct in assuming these are something the system will resolve? But then there are leaks reported that I believe I am responsible for, such as this call, which is reported as leaking 2.31KB against this line: [self createAVAudioRecorder:frameAudioFile]; Immediately followed by this: -(NSError*) createAVAudioRecorder: (NSString *)fileName { // flush recorder to start afresh [audioRecorder release]; audioRecorder = nil; // delete existing file to ensure we have clean start [self deleteFile: fileName]; VariableStore *singleton = [VariableStore sharedInstance]; // get full path to target file to create NSString *destinationString = [singleton.docsPath stringByAppendingPathComponent: fileName]; NSURL *destinationURL = [NSURL fileURLWithPath: destinationString]; // configure the recording settings NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] initWithCapacity:6]; //****** LEAKING 384 BYTES [recordSettings setObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey: AVFormatIDKey]; //***** LEAKING 32 BYTES float sampleRate = 44100.0; [recordSettings setObject:[NSNumber numberWithFloat: sampleRate] forKey: AVSampleRateKey]; //***** LEAKING 48 BYTES [recordSettings setObject:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey]; int bitDepth = 16; [recordSettings setObject: [NSNumber numberWithInt:bitDepth] forKey:AVLinearPCMBitDepthKey]; //***** LEAKING 48 BYTES [recordSettings setObject:[NSNumber numberWithBool:YES] forKey:AVLinearPCMIsBigEndianKey]; [recordSettings setObject:[NSNumber numberWithBool: NO]forKey:AVLinearPCMIsFloatKey]; NSError *recorderSetupError = nil; // create the new recorder with target file audioRecorder = [[AVAudioRecorder alloc] initWithURL: destinationURL settings: recordSettings error: &recorderSetupError]; //***** LEAKING 1.31KB [recordSettings release]; recordSettings = nil; // check for errors if (recorderSetupError) { UIAlertView *alert = [[UIAlertView alloc] initWithTitle: @"Can't record" message: [recorderSetupError localizedDescription] delegate: nil cancelButtonTitle: @"OK" otherButtonTitles: nil]; [alert show]; [alert release]; alert = nil; return recorderSetupError; } [audioRecorder prepareToRecord]; //***** LEAKING 512 BYTES audioRecorder.delegate = self; return recorderSetupError; } I do not understand why there is a leak, as I release audioRecorder at the start and set it to nil, and I release recordSettings and set it to nil. Can anyone enlighten me please? Thanks

    Read the article

  • play two sounds simultaneously iphone sdk

    - by Asaf Greene
    I am trying to make a small music app on the iPhone. I want to have an octave of a piano that will respond to touches and play the key or keys that the user touches. How would I be able to get two or more sounds to play at the same time so it sounds like a chord? I tried using AVFoundation, but the two sounds just play one after the other.
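
    Two separate, prepared AVAudioPlayer instances can play at the same time; scheduling both on the player clock with playAtTime: keeps them aligned. A small sketch, with urlC and urlE standing in for two note files:

        AVAudioPlayer *noteC = [[AVAudioPlayer alloc] initWithContentsOfURL:urlC error:NULL];
        AVAudioPlayer *noteE = [[AVAudioPlayer alloc] initWithContentsOfURL:urlE error:NULL];
        [noteC prepareToPlay];
        [noteE prepareToPlay];
        NSTimeInterval startAt = noteC.deviceCurrentTime + 0.05;  // small shared lead time
        [noteC playAtTime:startAt];
        [noteE playAtTime:startAt];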

    Read the article
