ios · objective-c · uiimage · avfoundation · avassetwriter

UIImages exported as movie error


Problem

My AVAssetWriter is failing after appending five or so images to it via an AVAssetWriterInputPixelBufferAdaptor, and I have no idea why.

Details

This popular question helped, but it doesn't cover my case:

How do I export UIImage array as a movie?

Everything works as planned; I even delay the assetWriterInput until it can handle more media. But for some reason, it always fails after five or so images. The images I'm using are frames extracted from a GIF.

Code

Here is my iteration code:

-(void)writeImageData
{
    __block int i = 0;
    videoQueue = dispatch_queue_create("com.videoQueue", DISPATCH_QUEUE_SERIAL);
    [self.writerInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0) usingBlock:^{

        while (self.writerInput.readyForMoreMediaData) {
            if (i >= self.imageRefs.count) {
                [self endSession];
                videoQueue = nil;
                [self saveToLibraryWithCompletion:^{
                    NSLog(@"Saved");
                }];
                break;
            }

            if (self.writerInput.readyForMoreMediaData) {
                CGImageRef imageRef = (__bridge CGImageRef)self.imageRefs[i];
                CVPixelBufferRef buffer = [self pixelBufferFromCGImageRef:imageRef];

                CGFloat timeScale = (CGFloat)self.imageRefs.count / self.originalDuration;
                BOOL accepted = [self.adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, timeScale)];
                CVBufferRelease(buffer);
                if (!accepted) {
                    NSLog(@"Buffer did not add %@, index %d, timescale %f", self.writer.error, i, timeScale);
                } else {
                    NSLog(@"Buffer did nothing wrong");
                }
                i++;
            }
        }
    }];
}

The rest of my code matches the code from the link above; only this method is slightly different:

-(CVPixelBufferRef)pixelBufferFromCGImageRef:(CGImageRef)image
{
   NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
   CVPixelBufferRef pxbuffer = NULL;
   CGFloat width = 640;
   CGFloat height = 640;
   CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width,
                                      height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                      &pxbuffer);

   NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

   CVPixelBufferLockBaseAddress(pxbuffer, 0);
   void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
   NSParameterAssert(pxdata != NULL);

   CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
   CGContextRef context = CGBitmapContextCreate(pxdata, width,
                                             height, 8, 4*width, rgbColorSpace,
                                             kCGImageAlphaNoneSkipFirst);
   NSParameterAssert(context);

   CGContextDrawImage(context, CGRectMake(0, 0, width,
                                       height), image);
   CGColorSpaceRelease(rgbColorSpace);
   CGContextRelease(context);

   CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
   return pxbuffer;

}


Solution

  • One thing that stands out to me is your use of CMTimeMake(i, timeScale) with a CGFloat timeScale.

    You need to calculate the time of each frame properly. Note that CMTimeMake takes two integers (an int64_t value and an int32_t timescale), so passing floating-point values truncates them.

    The second issue is that you aren't using your serial dispatch queue: you create videoQueue but then pass a concurrent global queue to requestMediaDataWhenReadyOnQueue:. :)
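A minimal sketch of the loop with both fixes applied: the serial videoQueue is handed to AVFoundation, and each presentation time is derived from seconds via CMTimeMakeWithSeconds with an explicit integer timescale, so nothing is truncated. The timescale 600 here is a conventional choice, not something from the original question:

```
-(void)writeImageData
{
    videoQueue = dispatch_queue_create("com.videoQueue", DISPATCH_QUEUE_SERIAL);

    __block int i = 0;
    // Fix 2: request media data on the serial queue, not a concurrent global queue.
    [self.writerInput requestMediaDataWhenReadyOnQueue:videoQueue usingBlock:^{
        while (self.writerInput.readyForMoreMediaData) {
            if (i >= self.imageRefs.count) {
                [self endSession];
                videoQueue = nil;
                [self saveToLibraryWithCompletion:^{
                    NSLog(@"Saved");
                }];
                break;
            }

            CGImageRef imageRef = (__bridge CGImageRef)self.imageRefs[i];
            CVPixelBufferRef buffer = [self pixelBufferFromCGImageRef:imageRef];

            // Fix 1: compute frame i's time in seconds, then convert it with an
            // integer timescale. 600 divides common frame rates (24, 25, 30) evenly.
            Float64 seconds = self.originalDuration * (Float64)i / self.imageRefs.count;
            CMTime presentationTime = CMTimeMakeWithSeconds(seconds, 600);

            BOOL accepted = [self.adaptor appendPixelBuffer:buffer
                                       withPresentationTime:presentationTime];
            CVBufferRelease(buffer);
            if (!accepted) {
                NSLog(@"Append failed: %@ at index %d", self.writer.error, i);
            }
            i++;
        }
    }];
}
```

The inner readyForMoreMediaData check is also gone: the while condition already guards every iteration, and the block is re-invoked on videoQueue whenever the input is ready again.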