In my project, I need to copy a chunk of each frame of a video onto one single resulting image.
Capturing the video frames is not a big deal. It would be something like:
// duration is the movie length in seconds.
// frameDuration is 1/fps (for 24 fps, frameDuration = 1/24).
// player is an MPMoviePlayerController.
for (NSTimeInterval i = 0; i < duration; i += frameDuration) {
    UIImage *image = [player thumbnailImageAtTime:i timeOption:MPMovieTimeOptionExact];
    CGRect destinationRect = [self getDestinationRect:i];
    [self drawImage:image inRect:destinationRect fromRect:originRect];

    // UI feedback
    [self performSelectorOnMainThread:@selector(setProgressValue:)
                           withObject:[NSNumber numberWithFloat:i / duration]
                        waitUntilDone:NO];
}
The problem comes when I try to implement the drawImage:inRect:fromRect: method.
I tried using CGImageCreateWithImageInRect on each video frame to extract the chunk of the image. But when the video reaches 12-14 s, my iPhone 4S announces its third memory warning and crashes. I've profiled the app with the Leaks tool, and it found no leak at all...
I'm not very strong in Quartz. Is there a better-optimized way to achieve this?
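Roughly, the method boils down to this (simplified sketch; imageContext stands for the bitmap context I accumulate the final image into):

- (void)drawImage:(UIImage *)image inRect:(CGRect)destinationRect fromRect:(CGRect)originRect {
    // Crop the strip of interest out of the frame...
    CGImageRef strip = CGImageCreateWithImageInRect(image.CGImage, originRect);
    // ...and paint it at its slot in the context holding the resulting image.
    CGContextDrawImage(imageContext, destinationRect, strip);
    CGImageRelease(strip);
}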
Finally, I kept the Quartz part of my code and changed the way I retrieve the images.
Now I use AVFoundation, which is a far faster solution.
// Creating the tools: 1/ the video asset, 2/ the image generator, 3/ the composition, which helps to retrieve video properties.
AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:moviePathURL
                                             options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil]] autorelease];
AVAssetImageGenerator *generator = [[[AVAssetImageGenerator alloc] initWithAsset:asset] autorelease];
generator.appliesPreferredTrackTransform = YES; // if I omit this, the frames are rotated 90° (didn't try in landscape)
AVVideoComposition *composition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:asset];

// Retrieving the video properties
NSTimeInterval duration = CMTimeGetSeconds(asset.duration);
frameDuration = CMTimeGetSeconds(composition.frameDuration);
CGSize renderSize = composition.renderSize;
CGFloat totalFrames = round(duration / frameDuration);

// Selecting each frame we want to extract: all of them.
NSMutableArray *times = [NSMutableArray arrayWithCapacity:totalFrames];
for (int i = 0; i < totalFrames; i++) {
    NSValue *time = [NSValue valueWithCMTime:CMTimeMakeWithSeconds(i * frameDuration, composition.frameDuration.timescale)];
    [times addObject:time];
}
__block int i = 0;
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        int x = round(CMTimeGetSeconds(requestedTime) / frameDuration);
        CGRect destinationStrip = CGRectMake(x, 0, 1, renderSize.height);
        [self drawImage:im inRect:destinationStrip fromRect:originStrip inContext:context];
    }
    else
        NSLog(@"Ouch: %@", error.description);

    i++;
    [self performSelectorOnMainThread:@selector(setProgressValue:) withObject:[NSNumber numberWithFloat:i / totalFrames] waitUntilDone:NO];
    if (i == totalFrames) {
        [self performSelectorOnMainThread:@selector(performVideoDidFinish) withObject:nil waitUntilDone:NO];
    }
};
// Launching the process...
generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.maximumSize = renderSize;
[generator generateCGImagesAsynchronouslyForTimes:times completionHandler:handler];
Even with very long videos it takes time, but it never crashes!
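For the record, context is just a bitmap context holding the final image (one column per frame), and performVideoDidFinish pulls a UIImage out of it. A minimal sketch, where setUpContextWithWidth:height: is a hypothetical helper and context is an instance variable:

// Hypothetical setup helper; context is a CGContextRef instance variable.
- (void)setUpContextWithWidth:(size_t)width height:(size_t)height {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(NULL, width, height,
                                    8,   // bits per component
                                    0,   // let Quartz compute bytes per row
                                    colorSpace,
                                    (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
}

// Called on the main thread once every frame has been drawn.
- (void)performVideoDidFinish {
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    // ... display or save result ...
}

In that sketch, [self setUpContextWithWidth:totalFrames height:renderSize.height] would be called right before generateCGImagesAsynchronouslyForTimes:completionHandler:.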