ios · objective-c · video · ffmpeg · avfoundation

How to do slow motion video in iOS


I need to apply "slow motion" to a video file, along with its audio, between certain frames, and store the ramped video as a new video file.

Ref: http://www.youtube.com/watch?v=BJ3_xMGzauk (watch from 0 to 10s)

From my analysis, I've found that the AVFoundation framework can be helpful.

Ref: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html

Copied and pasted from the above link:

"Editing: AV Foundation uses compositions to create new assets from existing pieces of media (typically, one or more video and audio tracks). You use a mutable composition to add and remove tracks, and adjust their temporal orderings. You can also set the relative volumes and ramping of audio tracks; and set the opacity, and opacity ramps, of video tracks. A composition is an assemblage of pieces of media held in memory. When you export a composition using an export session, it's collapsed to a file. On iOS 4.1 and later, you can also create an asset from media such as sample buffers or still images using an asset writer."

Questions: Can I do "slow motion" on the video/audio file using the AVFoundation framework? Or is there any other package available? If I want to handle audio and video separately, please guide me on how to do that.

Update: Code for the AVAssetExportSession:

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];
    // Remove any existing file at the output path
    [manager removeItemAtPath:outputURL error:nil];

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.inputAsset
                                                                           presetName:AVAssetExportPresetLowQuality];
    exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // Save the exported file to the Saved Photos album
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputURL]
                                        completionBlock:^(NSURL *assetURL, NSError *error) {
                if (error) {
                    NSLog(@"Video could not be saved");
                }
            }];
        } else {
            NSLog(@"error: %@", [exportSession error]);
        }
    }];

Solution

  • You could scale video using the AVFoundation and CoreMedia frameworks. Take a look at the AVMutableCompositionTrack method:

    - (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration;
    

    Sample:

    AVURLAsset *videoAsset = nil; // e.g. self.inputAsset
    
    //create mutable composition
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *videoInsertError = nil;
    BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                             atTime:kCMTimeZero
                                                              error:&videoInsertError];
    if (!videoInsertResult || nil != videoInsertError) {
        //handle error
        return;
    }
    
    //slow down the whole video by a factor of 2.0
    double videoScaleFactor = 2.0;
    CMTime videoDuration = videoAsset.duration;
    
    [compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                               toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
    
    //export
    AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                         presetName:AVAssetExportPresetLowQuality];
    

    (The audio track from videoAsset should probably also be added to mixComposition; a sketch of that step follows.)
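
    A minimal sketch of adding and scaling the audio track, reusing videoAsset, mixComposition, videoDuration, videoScaleFactor, and assetExport from the sample above (outputFileURL is a hypothetical file URL, e.g. one built under the Documents directory as in the question's export code):

    //add the audio track and stretch it by the same factor so it stays in sync with the video
    AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *audioInsertError = nil;
    BOOL audioInsertResult = [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                             atTime:kCMTimeZero
                                                              error:&audioInsertError];
    if (!audioInsertResult || nil != audioInsertError) {
        //handle error
        return;
    }

    //note: stretching audio lowers its pitch unless a time-pitch algorithm is set
    //(e.g. the audioTimePitchAlgorithm property of AVAssetExportSession on iOS 7+)
    [compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                               toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];

    //finish the export
    assetExport.outputURL = outputFileURL; //hypothetical output file URL
    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    [assetExport exportAsynchronouslyWithCompletionHandler:^{
        if (assetExport.status != AVAssetExportSessionStatusCompleted) {
            NSLog(@"export failed: %@", assetExport.error);
        }
    }];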