Search Results

Search found 10 results on 1 page for 'avplayer'.


  • How to reduce iOS AVPlayer start delay

    - by Bernt Habermeier
    Note, for the question below: all assets are local on the device -- no network streaming is taking place. The videos contain audio tracks. I'm working on an iOS application that requires playing video files with minimum delay to start the video clip in question. Unfortunately we do not know which specific video clip is next until we actually need to start it up. Specifically: when one video clip is playing, we will know what the next set of (roughly) 10 video clips is, but we don't know which one exactly until it comes time to 'immediately' play the next clip. What I've done to look at actual start delays is to call addBoundaryTimeObserverForTimes on the video player, with a time period of one millisecond, to see when the video actually started to play, and I take the difference of that time stamp with the first place in the code that indicates which asset to start playing. From what I've seen thus far, I have found that using the combination of AVAsset loading, then creating an AVPlayerItem from that once it's ready, and then waiting for AVPlayerStatusReadyToPlay before I call play, tends to take between 1 and 3 seconds to start the clip. I've since switched to what I think is roughly equivalent: calling [AVPlayerItem playerItemWithURL:] and waiting for AVPlayerItemStatusReadyToPlay before playing. Roughly the same performance. One thing I'm observing is that the first AVPlayer item load is slower than the rest. It seems that pre-flighting the AVPlayer with a short/empty asset before trying to play the first video might be good general practice. [http://stackoverflow.com/questions/900461/slow-start-for-avaudioplayer-the-first-time-a-sound-is-played] I'd love to get the video start times down as much as possible, and I have some ideas of things to experiment with, but would like some guidance from anyone who might be able to help.
    Update: Idea 7, below, as implemented, yields switching times of around 500 ms. This is an improvement, but it'd be nice to get this even faster.
    Idea 1: Use N AVPlayers (won't work). Use ~10 AVPlayer objects and start and pause all ~10 clips; once we know which one we really need, switch to and un-pause the correct AVPlayer, and start all over again for the next cycle. I don't think this works, because I've read there is roughly a limit of 4 active AVPlayers in iOS. Someone asked about this on Stack Overflow and found out about the 4-AVPlayer limit: fast-switching-between-videos-using-avfoundation
    Idea 2: Use AVQueuePlayer (won't work). I don't believe that shoving 10 AVPlayerItems into an AVQueuePlayer would pre-load them all for a seamless start. AVQueuePlayer is a queue, and I think it really only makes the next video in the queue ready for immediate playback. I don't know which one of the ~10 videos we want to play back until it's time to start that one. ios-avplayer-video-preloading
    Idea 3: Load, play, and retain AVPlayerItems in the background (not 100% sure yet -- but not looking good). I'm looking at whether there is any benefit to loading and playing the first second of each video clip in the background (suppressing video and audio output), keeping a reference to each AVPlayerItem, and, when we know which item needs to be played for real, swapping that one in and swapping the background AVPlayer with the active one. Rinse and repeat. The theory is that recently played AVPlayers/AVPlayerItems may still hold some prepared resources, which would make subsequent playback faster. So far I have not seen benefits from this, but I might not have the AVPlayerLayer set up correctly for the background. I doubt this will really improve things, from what I've seen.
    Idea 4: Use a different file format -- maybe one that is faster to load? I'm currently using .m4v (MPEG-4 video, H.264) files. I have not played around with other formats, but it may well be that some formats are faster to decode or get ready than others. Possibly still MPEG-4 video but with a different codec, or maybe QuickTime? Maybe a lossless video format where decoding/setup is faster?
    Idea 5: Combination of a lossless video format and AVQueuePlayer. If there is a video format that is fast to load but where the file size may be insane, one idea might be to pre-prepare the first 10 seconds of each video clip in a version that is bloated but faster to load, and back that up with an asset encoded in H.264. Use an AVQueuePlayer, add the first 10 seconds in the uncompressed file format, and follow that up with the H.264 version, which gets up to 10 seconds of prepare/preload time. So I'd get 'the best' of both worlds: fast start times, but also the benefits of a more compact format.
    Idea 6: Use a non-standard AVPlayer / write my own / use someone else's. Given my needs, maybe I can't use AVPlayer and have to resort to AVAssetReader: decode the first few seconds (possibly writing a raw file to disk), and when it comes to playback, make use of the raw format to play it back fast. This seems like a huge project to me, and if I go about it in a naive way it's unclear, and unlikely, that it would even work better. Each decoded and uncompressed video frame is 2.25 MB. Naively speaking, if we go with ~30 fps for the video, I'd end up with a ~60 MB/s read-from-disk requirement, which is probably impossible, or at least pushing it. Obviously we'd have to do some level of image compression (perhaps native OpenGL ES compression formats via PVRTC)... but that's kind of crazy. Maybe there is a library out there that I can use?
    Idea 7: Combine everything into a single movie asset, and seekToTime. One idea that might be easier than some of the above is to combine everything into a single movie and use seekToTime. The thing is that we'd be jumping all around the place; essentially random access into the movie. I think this may actually work out okay: avplayer-movie-playing-lag-in-ios5
    Which approach do you think would be best? So far, I've not made that much progress in terms of reducing the lag.
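    A minimal sketch of the preload-then-play pattern described above (method names and the self.player property are illustrative, not from the question): load the asset's keys asynchronously, attach the resulting AVPlayerItem to the player, and call play only once the item reports ready.

        // Sketch, assuming ARC and a strong self.player (AVPlayer) property.
        - (void)preloadAndPlayAssetAtURL:(NSURL *)fileURL {
            AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
            NSArray *keys = [NSArray arrayWithObjects:@"playable", @"tracks", nil];
            [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
                dispatch_async(dispatch_get_main_queue(), ^{
                    NSError *error = nil;
                    if ([asset statusOfValueForKey:@"playable" error:&error] != AVKeyValueStatusLoaded) {
                        NSLog(@"asset not ready: %@", error);
                        return;
                    }
                    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
                    [item addObserver:self forKeyPath:@"status" options:0 context:NULL];
                    [self.player replaceCurrentItemWithPlayerItem:item];
                });
            }];
        }

        // KVO callback: start playback the moment the item is ready.
        - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                                change:(NSDictionary *)change context:(void *)context {
            if ([keyPath isEqualToString:@"status"] &&
                [(AVPlayerItem *)object status] == AVPlayerItemStatusReadyToPlay) {
                [self.player play];   // the measured latency is mostly spent before this point
            }
        }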

    Read the article

  • AVPlayer seeking to a different point after app resume

    - by CGuess
    I have an AVPlayer whose video is ~2 seconds long. After the video plays, if the app goes to the background and reenters the foreground, I need the video to still be shown exactly as it was when the app was exited. The AVPlayer sticks around just fine; however, when I reenter the app from the background, the video appears to be seeked to the middle of the video. However, if I just play the video, it starts from the beginning, so it doesn't seem like it actually seeked and is just showing a preview image. I've tried to auto-seek the video to the end on relaunch, but nothing happens. Nothing I can figure out or find in the docs would describe this behavior. Any tips on having the video resume at either the beginning or the end?
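    One way to pin the picture to a known position on resume is to remember the current time when the app resigns active and seek back with zero tolerance when it becomes active again. A hedged sketch follows; the self.player and self.savedTime properties and the notification handlers are assumptions, not the questioner's code.

        // Sketch: save the exact position on resign-active and restore it with
        // zero tolerance, so the displayed frame is not snapped to a keyframe.
        - (void)applicationWillResignActive:(NSNotification *)note {
            self.savedTime = self.player.currentTime;   // CMTime property (illustrative)
        }

        - (void)applicationDidBecomeActive:(NSNotification *)note {
            [self.player seekToTime:self.savedTime
                    toleranceBefore:kCMTimeZero
                     toleranceAfter:kCMTimeZero
                  completionHandler:^(BOOL finished) {
                      // Call -play here only if playback should resume automatically.
                  }];
        }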

    Read the article

  • Caching with AVPlayer and AVAssetExportSession

    - by tba
    I would like to cache progressive-download videos using AVPlayer. How can I save an AVPlayer's item to disk? I'm trying to use AVAssetExportSession on the player's currentItem (which is fully loaded). This code is giving me "AVAssetExportSessionStatusFailed (The operation could not be completed)":

        AVAsset *mediaAsset = self.player.currentItem.asset;
        AVAssetExportSession *es = [[AVAssetExportSession alloc] initWithAsset:mediaAsset presetName:AVAssetExportPresetLowQuality];
        NSString *outPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"out.mp4"];
        NSFileManager *fileManager = [NSFileManager defaultManager];
        [fileManager removeItemAtPath:outPath error:NULL];
        es.outputFileType = @"com.apple.quicktime-movie";
        es.outputURL = [[[NSURL alloc] initFileURLWithPath:outPath] autorelease];
        NSLog(@"exporting to %@", outPath);
        [es exportAsynchronouslyWithCompletionHandler:^{
            NSString *status = @"";
            if( es.status == AVAssetExportSessionStatusUnknown ) status = @"AVAssetExportSessionStatusUnknown";
            else if( es.status == AVAssetExportSessionStatusWaiting ) status = @"AVAssetExportSessionStatusWaiting";
            else if( es.status == AVAssetExportSessionStatusExporting ) status = @"AVAssetExportSessionStatusExporting";
            else if( es.status == AVAssetExportSessionStatusCompleted ) status = @"AVAssetExportSessionStatusCompleted";
            else if( es.status == AVAssetExportSessionStatusFailed ) status = @"AVAssetExportSessionStatusFailed";
            else if( es.status == AVAssetExportSessionStatusCancelled ) status = @"AVAssetExportSessionStatusCancelled";
            NSLog(@"done exporting to %@ status %d = %@ (%@)", outPath, es.status, status, [[es error] localizedDescription]);
        }];

    How can I export successfully? I'm looking into copying mediaAsset into an AVMutableComposition, but haven't had much luck with that either. Thanks! PS: Here are some questions from people trying to accomplish the same thing (but with MPMoviePlayerController): Cache Progressive downloaded content in MPMoviePlayerController; Simultaneously stream and save a video?; Caching videos to disk after successful preload by MPMoviePlayerController
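    A hedged diagnostic sketch (not from the question): before exporting, it can help to check whether the asset is exportable at all and which presets and file types apply, and to keep the output file type consistent with the path extension. An asset backed by a progressive-download URL may simply report that it is not exportable, in which case the export session cannot succeed.

        // Sketch: sanity-check the asset and session before exporting.
        AVAsset *mediaAsset = self.player.currentItem.asset;
        NSLog(@"exportable: %d, compatible presets: %@",
              mediaAsset.exportable,
              [AVAssetExportSession exportPresetsCompatibleWithAsset:mediaAsset]);

        AVAssetExportSession *es =
            [[AVAssetExportSession alloc] initWithAsset:mediaAsset
                                             presetName:AVAssetExportPresetLowQuality];
        NSLog(@"supported file types: %@", es.supportedFileTypes);

        // Keep the container type and the extension consistent.
        NSString *outPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"out.mov"];
        es.outputFileType = AVFileTypeQuickTimeMovie;   // same UTI as @"com.apple.quicktime-movie"
        es.outputURL = [NSURL fileURLWithPath:outPath];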

    Read the article

  • Problems playing multiple sounds using AVPlayer (NOT AVAudioPlayer)

    - by myetter37
    I'm trying to play a background song in my iPhone game and also have sound effects, using the AVFoundation framework and AVPlayerItem. I've scoured the Internet for help with AVPlayerItem and AVPlayer, but I'm only finding stuff about AVAudioPlayer. The background song plays fine, but when the character jumps, I have 2 problems: 1) On the initial jump ([player play] inside the jump method), the jump sound effect interrupts the background music. 2) If I try to jump again, the game crashes with the error "AVPlayerItem cannot be associated with more than one instance of AVPlayer". My professor told me to create a new instance of AVPlayer for each sound I want to play, so I'm confused. I'm doing data-driven design, so my sound files are listed in a .txt and then loaded into an NSDictionary. Here's my code:

        - (void) storeSoundNamed:(NSString *) soundName withFileName:(NSString *) soundFileName {
            NSURL *assetURL = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:soundName ofType:soundFileName]];
            AVURLAsset *mAsset = [[AVURLAsset alloc] initWithURL:assetURL options:nil];
            AVPlayerItem *mPlayerItem = [AVPlayerItem playerItemWithAsset:mAsset];
            [soundDictionary setObject:mPlayerItem forKey:soundName];
            NSLog(@"Sound added.");
        }

        - (void) playSound:(NSString *) soundName {
            // from .h: @property AVPlayer *mPlayer;
            // from .m: @synthesize mPlayer = _mPlayer;
            _mPlayer = [[AVPlayer alloc] initWithPlayerItem:[soundDictionary valueForKey:soundName]];
            [_mPlayer play];
            NSLog(@"Playing sound.");
        }

    If I move this line from the second method into the first:

        _mPlayer = [[AVPlayer alloc] initWithPlayerItem:[soundDictionary valueForKey:soundName]];

    the game does not crash, and the background song plays perfectly, but the jump sound effect does not play, even though the console is showing "Playing sound." on each jump. Thank you
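    A hedged sketch of one way around the crash (property names soundDictionary and activePlayers are illustrative): cache the AVURLAsset rather than the AVPlayerItem, and build a fresh item and player each time a sound fires, so no item is ever attached to two players. The music ducking on the first jump may separately be an audio-session configuration issue; that is an assumption, not something the question confirms.

        // Sketch: cache assets, not items; make a new AVPlayerItem per play.
        - (void)storeSoundNamed:(NSString *)soundName withFileName:(NSString *)ext {
            NSURL *assetURL = [[NSBundle mainBundle] URLForResource:soundName withExtension:ext];
            AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
            [self.soundDictionary setObject:asset forKey:soundName];
        }

        - (void)playSound:(NSString *)soundName {
            AVURLAsset *asset = [self.soundDictionary objectForKey:soundName];
            AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];   // fresh item every time
            AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
            [self.activePlayers addObject:player];   // keep a strong reference while it plays
            [player play];
        }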

    Read the article

  • iOS: AVQueuePlayer/AVPlayerItem 'An AVPlayerItem can occupy only one position in a player's queue at a time.'

    - by JoshDG
    I keep getting this error: 'An AVPlayerItem can occupy only one position in a player's queue at a time.' I NSLog'd the player's items, and none of them seem to be equal. Further, I added this just to be sure:

        if ([player canInsertItem:itemToAdd afterItem:nil])
            [player insertItem:itemToAdd afterItem:nil];

    Since I wasn't sure that would work (there can be two identical items at different memory locations), I wrote a category method to test whether a player contains an item or something identical to it. Yet I'm still getting the error. I've seen several posts of people getting this error with MPMoviePlayerController, but I'm not using that class, just the out-of-the-box AVQueuePlayer. Any ideas on how to fix this?
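    A hedged sketch of one workaround: if the same clip has to be enqueued again while an item built from it is still in the queue, wrap the clip's asset in a brand-new AVPlayerItem instead of re-inserting the existing instance.

        // Sketch: never re-insert an item instance; build a fresh one from its asset.
        AVPlayerItem *freshItem = [AVPlayerItem playerItemWithAsset:itemToAdd.asset];
        if ([player canInsertItem:freshItem afterItem:nil]) {
            [player insertItem:freshItem afterItem:nil];   // nil -> append to the end of the queue
        }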

    Read the article

  • Strange behavior: save video recorded within app?

    - by Josue Espinosa
    I allow the user to record a video within my app, then later play it again. When a user records a video, I save the URL of the video, then play the video later from the saved URL. I save the video both in the Photos app and in my app. If I delete the video within the Photos app, it still plays. After about 7 days, the video gets deleted. I think I am saving to my tmp directory, but I'm not sure. Here is what I am doing:

        -(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
            NSString *mediaType = [info objectForKey: UIImagePickerControllerMediaType];
            [self dismissViewControllerAnimated:YES completion:nil];
            // Handle a movie capture
            if (CFStringCompare ((__bridge_retained CFStringRef) mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
                NSString *moviePath = [NSString stringWithFormat:@"%@",[[info objectForKey:UIImagePickerControllerMediaURL] path]];
                NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
                NSData *videoData = [NSData dataWithContentsOfURL:videoURL];
                _justRecordedVideoURL = [NSString stringWithFormat:@"%@",videoURL];
                AppDelegate *appDelegate = [[UIApplication sharedApplication] delegate];
                _managedObjectContext = [appDelegate managedObjectContext];
                Video *video = [NSEntityDescription insertNewObjectForEntityForName:@"Video" inManagedObjectContext:_managedObjectContext];
                [video setVideoData:videoData];
                [video setVideoURL:[NSString stringWithFormat:@"%@",videoURL]];
                NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
                dateFormatter.dateStyle = NSDateFormatterLongStyle;
                [dateFormatter setDateStyle:NSDateFormatterLongStyle];
                NSDate *date = [dateFormatter dateFromString:[dateFormatter stringFromDate:[NSDate date]]];
                NSString *dateAdded = [dateFormatter stringFromDate:date];
                [video setDate_recorded:dateAdded];
                if(_currentAthlete != nil){
                    [video setWhosVideo:_currentAthlete];
                }
                NSError *error = nil;
                if(![_managedObjectContext save:&error]){
                    //handle dat error
                }
                NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
                NSString *documentsDirectory = [paths objectAtIndex:0];
                NSString *tempPath = [documentsDirectory stringByAppendingFormat:@"/vid1.mp4"];
                BOOL success = [videoData writeToFile:tempPath atomically:NO];
                if(success == FALSE){
                    NSLog(@"Video was not successfully saved.");
                }
                if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
                    UISaveVideoAtPathToSavedPhotosAlbum(moviePath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
                }
            }
        }

    Am I saving it incorrectly? When I go to play the video, it works fine; after a couple of days the video will play without audio, then eventually it will be gone. Any ideas why?
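    A hedged sketch of the likely fix: the URL handed back by UIImagePickerController points into tmp, which the system is free to purge, so copy the recording into Documents and persist only the file name, rebuilding the full path at playback time. The variable names below mirror the question where possible, but the approach itself is an assumption.

        // Sketch: copy the recording out of tmp and store a relative file name.
        NSURL *tmpURL = [info objectForKey:UIImagePickerControllerMediaURL];
        NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                            NSUserDomainMask, YES) objectAtIndex:0];
        NSString *fileName = [tmpURL lastPathComponent];
        NSURL *destURL = [NSURL fileURLWithPath:[documentsDirectory stringByAppendingPathComponent:fileName]];

        NSError *copyError = nil;
        if ([[NSFileManager defaultManager] copyItemAtURL:tmpURL toURL:destURL error:&copyError]) {
            [video setVideoURL:fileName];   // persist the name only; absolute paths can change
        } else {
            NSLog(@"Could not copy recording: %@", copyError);
        }

        // At playback time, rebuild the URL from the stored name:
        // NSURL *playURL = [NSURL fileURLWithPath:
        //     [documentsDirectory stringByAppendingPathComponent:video.videoURL]];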

    Read the article

  • Playing video and audio in iPhone not working...

    - by Scott
    So we have buttons linked up to display images/videos/audio on click, depending on a check we do earlier. That part works fine. It knows which one to play; however, when we click the buttons for video and audio, nothing happens. The image one works fine. The video and audio are being taken from a URL online, they are not local, but everything I've read says this should still be possible. Here is a little snippet of the code where we play the two files:

        if ( [fName hasSuffix:@".png"]) {
            NSLog(@"PICTURE");
            NSURL *url = [NSURL URLWithString: fName];
            UIImage *image = [UIImage imageWithData: [NSData dataWithContentsOfURL:url]];
            self.view = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] applicationFrame]];
            // self.view.backgroundColor = [[UIColor alloc] initWithPatternImage:[UIImage imageNamed:@"MainBG.jpg"]];
            [self.view addSubview:[[UIImageView alloc] initWithImage:image]];
        }
        if ( [fName hasSuffix:@".mp4"]) {
            NSLog(@"VIDEO");
            //NSString *path = [[NSBundle mainBundle] pathForResource:fName ofType:@"mp4"];
            //NSLog(path);
            NSURL *url = [NSURL fileURLWithPath:fName];
            MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:url];
            [player play];
        }
        if ( [fName hasSuffix:@".mp3"]) {
            NSLog(@"AUDIO");
            NSURL *url = [NSURL fileURLWithPath:fName];
            NSData *soundData = [NSData dataWithContentsOfURL:url];
            AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithData:soundData error: nil];
            [avPlayer play];
        }

    See anything wrong? By the way, it compiles and runs, but nothing happens when we hit the button that executes that code.
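    A hedged sketch of what is likely going wrong (assumptions, not confirmed by the question): a remote URL needs +URLWithString: rather than +fileURLWithPath:, the MPMoviePlayerController's view has to be added to the view hierarchy before anything is visible, and both players should be held in strong properties (illustrative names self.moviePlayer and self.audioPlayer) so they are not deallocated before playback starts.

        // Sketch: remote URLs, an on-screen movie view, and strong player references.
        if ([fName hasSuffix:@".mp4"]) {
            NSURL *url = [NSURL URLWithString:fName];   // not fileURLWithPath: for http URLs
            self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
            self.moviePlayer.view.frame = self.view.bounds;
            [self.view addSubview:self.moviePlayer.view];
            [self.moviePlayer play];
        }
        if ([fName hasSuffix:@".mp3"]) {
            NSURL *url = [NSURL URLWithString:fName];
            NSData *soundData = [NSData dataWithContentsOfURL:url];   // blocking; fine for a sketch
            self.audioPlayer = [[AVAudioPlayer alloc] initWithData:soundData error:nil];
            [self.audioPlayer play];
        }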

    Read the article

  • Play Multiple iPod Library Songs On iPhone At The Same Time With Pitch Bending & Other Effects

    - by Dino
    Hi, I have been going at this for the past two weeks and it is driving me crazy. I asked this question a couple of days ago (Extract iPod Library raw PCM samples and play with sound effects) and whilst the answer got me halfway there, I am still stuck. Basically, what I am trying to achieve is to load up multiple songs from the iPod library for playback with effects such as pitch bend, EQ effects, etc. I have gone down the route of AVPlayer and AVAudioPlayer, which are too simple. The only framework I've seen that can play audio with these effects is OpenAL. I have tried a few Objective-C wrappers (Finch and ObjectAL): Finch does not play compressed audio, whilst ObjectAL will only convert it for me if I pass in a URL for the file (which is something I cannot do, because I only have an incompatible iPod library URL). An example of an app that does what I want beautifully is Tap DJ. It can load up songs from the iPod library fast (unlike TouchDJ), and it plays them with all sorts of effects. Any help would be much appreciated.
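    For context, once audio has been decoded to raw PCM and handed to OpenAL (the step the question is stuck on), pitch bending a playing source is a single call. The fragment below only illustrates that call; device/context setup, buffer filling, and the iPod-library decode are omitted, and sourceID is assumed to be a valid, already-playing AL source.

        #import <OpenAL/al.h>

        // Sketch: per-source pitch and gain on an already-playing OpenAL source.
        // AL_PITCH resamples, so pitch and speed change together (turntable-style).
        alSourcef(sourceID, AL_PITCH, 1.5f);   // 1.0 = original pitch, > 1.0 shifts up
        alSourcef(sourceID, AL_GAIN, 0.8f);    // per-source volume, useful for simple mixing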

    Read the article

  • Thread is being killed by the OS

    - by Or.Ron
    I'm currently programming an app that extracts frames from a movie clip. I designed it so that the extraction is done on a separate thread to prevent the application from freezing. The extraction process itself takes a lot of resources, but works fine when used in the simulator. However, there are problems when building it for the iPad. When I perform another action (I'm telling my AV player to play while I extract frames), the thread unexpectedly stops working, and I believe it's being killed. I assume it's because I'm using a lot of resources, but I'm not entirely sure. Here are my questions: 1. How can I tell if/why my thread is stopping? 2. If it's really from over-processing, what should I do? I really need this action to be implemented. Here's some code I'm using to create the thread:

        [NSThread detachNewThreadSelector:@selector(startReading) toTarget:self withObject:nil];

    I'll post any information you need. Thanks so much!
    Update: I'm using GCD now and it populates the threads for me. However, the OS still kills the threads. I know exactly when it is happening: when I call [AVPlayer play], it kills the thread. This issue only happens on the actual iPad and not in the simulator.
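    Regarding question 1 above, a hedged sketch (the reader and trackOutput variables are assumptions standing in for the questioner's AVAssetReader pipeline): rather than guessing why the background work stops, inspect the reader's status and error when the copy loop ends. A reader that gets interrupted on the device will typically end up in AVAssetReaderStatusFailed with an explanatory NSError.

        // Sketch: find out why frame extraction stopped instead of guessing.
        CMSampleBufferRef sample = NULL;
        while ((sample = [trackOutput copyNextSampleBuffer]) != NULL) {
            // ... hand the frame off for processing ...
            CFRelease(sample);
        }
        if (reader.status == AVAssetReaderStatusFailed) {
            NSLog(@"AVAssetReader failed: %@", reader.error);   // the real reason the work stopped
        } else if (reader.status == AVAssetReaderStatusCompleted) {
            NSLog(@"Extraction finished normally.");
        }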

    Read the article
