Tags: ios, video, merge, avfoundation, avassetexportsession

Merging Clips with Different Resolutions


I have a set of video clips that I would like to merge together and then put a watermark on the result.

I am able to do each step individually; however, problems arise when I perform them together.

All clips that will be merged are either 1920x1080 or 960x540.

For some reason, AVAssetExportSession does not composite them correctly when both steps are combined.

Here are the two bugs, based on three different scenarios: Merged Screenshot

This image is a result of:

  • Merging Clips together

As you can see, there is nothing wrong here; the output video produces the desired effect.

However, when I then try to add a watermark, it creates the following issue: Merged and watermarked

This image is a result of:

  • Merging Clips together
  • Putting a watermark on it

BUG 1: Some clips in the video get resized for no apparent reason, while other clips do not.

Merged, watermarked, and edited

This image is a result of:

  • Merging Clips together
  • Resizing clips that are 960x540 to 1920x1080
  • Putting a watermark on it

BUG 2: Now the clips that need to be resized are resized; however, the old, unresized clip is still visible behind them.

Merging/Resizing Code:

- (void)mergeClips {
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];      
    
    AVMutableCompositionTrack *mutableVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    
    AVMutableCompositionTrack *mutableAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    // loop through the list of videos and add them to the track
    CMTime currentTime = kCMTimeZero;
    
    NSMutableArray* instructionArray = [[NSMutableArray alloc] init];
    if (_clipsArray){
        for (int i = 0; i < (int)[_clipsArray count]; i++){
            NSURL* url = [_clipsArray objectAtIndex:i];
            
            AVAsset *asset = [AVAsset assetWithURL:url];
            
            AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
            AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
            
            CGSize size = videoTrack.naturalSize;
            CGFloat widthScale = 1920.0f/size.width;
            CGFloat heightScale = 1080.0f/size.height;
            
            // lines that perform the resizing for this clip
            AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableVideoTrack];
            CGAffineTransform scale = CGAffineTransformMakeScale(widthScale,heightScale);
            CGAffineTransform move = CGAffineTransformMakeTranslation(0,0);
            [layerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:currentTime];
            [instructionArray addObject:layerInstruction];
            
            
            [mutableVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                ofTrack:videoTrack
                                 atTime:currentTime error:nil];
            
            [mutableAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                ofTrack:audioTrack
                                 atTime:currentTime error:nil];
            
            currentTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) + CMTimeGetSeconds(currentTime), asset.duration.timescale);
        }
    }
    
    AVMutableVideoCompositionInstruction * mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, currentTime);
    mainInstruction.layerInstructions = instructionArray;
    
    
    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *lastPostedDayPath = [documentsDirectory stringByAppendingPathComponent:@"lastPostedDay"];
    
    //Check if folder exists, if not create folder
    if (![[NSFileManager defaultManager] fileExistsAtPath:lastPostedDayPath]){
        [[NSFileManager defaultManager] createDirectoryAtPath:lastPostedDayPath withIntermediateDirectories:NO attributes:nil error:nil];
    }
    
    
    NSString *fileName = [NSString stringWithFormat:@"%li_%li_%li.mov", (long)_month, (long)_day, (long)_year];
    
    NSString *finalDayPath = [lastPostedDayPath stringByAppendingPathComponent:fileName];
    
    NSURL *url = [NSURL fileURLWithPath:finalDayPath];
    
    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:finalDayPath];
    if (fileExists){
        NSLog(@"file exists");
        [[NSFileManager defaultManager] removeItemAtURL:url error:nil];
    }
    
    AVMutableVideoComposition *mainComposition = [AVMutableVideoComposition videoComposition];
    
    mainComposition.instructions = [NSArray arrayWithObject:mainInstruction];
    mainComposition.frameDuration = CMTimeMake(1, 30);
    mainComposition.renderSize = CGSizeMake(1920.0f, 1080.0f);
    
    // 5 - Create exporter
    _exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                       presetName:AVAssetExportPresetHighestQuality];
    _exportSession.outputURL=url;
    _exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    _exportSession.shouldOptimizeForNetworkUse = YES;
    _exportSession.videoComposition = mainComposition;
    
    [_exportSession exportAsynchronouslyWithCompletionHandler:^{
        [merge_timer invalidate];
        merge_timer = nil;
        
        switch (_exportSession.status) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed -> Reason: %@, User Info: %@",
                      _exportSession.error.localizedDescription,
                      _exportSession.error.userInfo.description);
                [self showSavingFailedDialog];
                break;
                
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export cancelled");
                [self showSavingFailedDialog];
                
                break;
                
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export finished");
                [self addWatermarkToExportSession:_exportSession];
                
                break;
                
            default:
                break;
        }
    }];
});
}

Once this finishes, I run the output through a second export session that simply adds a watermark.
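
A watermark pass of this kind is typically a second export session whose video composition draws the video layer underneath a CALayer containing the watermark image, via AVVideoCompositionCoreAnimationTool. Below is a minimal sketch assuming that approach; the method name, watermark image, layer frames, and output URL are placeholders, not the actual addWatermarkToExportSession: implementation.

#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>
#import <UIKit/UIKit.h>

// Sketch of a watermark pass: wraps the video in a CALayer tree and exports again.
// "watermark.png", the layer frames, and outputURL are placeholders.
- (void)addWatermarkToAsset:(AVAsset *)asset outputURL:(NSURL *)outputURL {
    AVMutableVideoComposition *videoComposition =
        [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
    CGSize renderSize = videoComposition.renderSize;

    // Layer that renders the video frames.
    CALayer *videoLayer = [CALayer layer];
    videoLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);

    // Layer holding the watermark image, composited above the video.
    CALayer *watermarkLayer = [CALayer layer];
    watermarkLayer.contents = (__bridge id)[UIImage imageNamed:@"watermark.png"].CGImage;
    watermarkLayer.frame = CGRectMake(renderSize.width - 220.0f, 20.0f, 200.0f, 80.0f);
    watermarkLayer.opacity = 0.8f;

    CALayer *parentLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:watermarkLayer];

    videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
        videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                inLayer:parentLayer];

    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.videoComposition = videoComposition;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Watermark export status: %ld", (long)exporter.status);
    }];
}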

Is there something wrong with my code or my process? Is there an easier way to achieve this?

Thank you for your time!


Solution

  • I was able to solve my issue. AVAssetExportSession does not actually create a 'flat' video file of the merged clips, so the watermarking pass still saw the lower-resolution clips and their positions, which caused them to be resized.

    What I did to solve this was to first use AVAssetWriter to merge my clips and create one 'flat' file (a rough sketch follows at the end of this answer). I could then add the watermark without any resizing issue.

    Hope this helps anyone who may come across this problem in the future!
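
    Roughly, the flattening pass reads the merged composition with an AVAssetReader (applying the video composition, so the resize transforms are baked in) and re-encodes the frames with an AVAssetWriter. Below is a minimal video-only sketch, assuming the mixComposition and mainComposition from the question; audio handling and error checking are omitted, and the method name and flatOutputURL are illustrative, not my exact code.

#import <AVFoundation/AVFoundation.h>

// Video-only sketch of the flattening pass. The method and parameter names
// are illustrative; audio and error handling are omitted.
- (void)flattenComposition:(AVComposition *)composition
          videoComposition:(AVVideoComposition *)videoComposition
                 outputURL:(NSURL *)flatOutputURL {
    NSError *error = nil;

    // Reader that renders the composition, baking in the resize transforms.
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:&error];
    AVAssetReaderVideoCompositionOutput *readerOutput =
        [AVAssetReaderVideoCompositionOutput assetReaderVideoCompositionOutputWithVideoTracks:
             [composition tracksWithMediaType:AVMediaTypeVideo] videoSettings:nil];
    readerOutput.videoComposition = videoComposition;
    [reader addOutput:readerOutput];

    // Writer that re-encodes everything into a single 1920x1080 H.264 file.
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:flatOutputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                             AVVideoWidthKey  : @1920,
                                                             AVVideoHeightKey : @1080 }];
    [writer addInput:writerInput];

    [writer startWriting];
    [reader startReading];
    [writer startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t queue = dispatch_queue_create("flatten.video", DISPATCH_QUEUE_SERIAL);
    [writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while (writerInput.readyForMoreMediaData) {
            CMSampleBufferRef sample = [readerOutput copyNextSampleBuffer];
            if (sample) {
                [writerInput appendSampleBuffer:sample];
                CFRelease(sample);
            } else {
                // No more frames: close the file, then run the watermark pass on it.
                [writerInput markAsFinished];
                [writer finishWritingWithCompletionHandler:^{
                    NSLog(@"Flat file written to %@", flatOutputURL);
                }];
                break;
            }
        }
    }];
}

    Once finishWritingWithCompletionHandler: fires, the file at flatOutputURL can be loaded as a plain AVAsset and fed to the watermark export session without triggering the resizing bug.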