Tags: objective-c, cocoa, image-processing, cgcontext, vimage

Image quality degrades when scaling with vImageScale_ARGB8888 - Cocoa Objective-C


I am capturing my system's screen with AVCaptureSession and then creating a video file from the captured image buffers. This works fine.
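
For context, the capture pipeline is set up roughly like this (a minimal sketch assuming AVCaptureScreenInput on macOS; the variable and queue names are illustrative, not from my actual code):

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureScreenInput *screenInput = [[AVCaptureScreenInput alloc] initWithDisplayID:CGMainDisplayID()];
    if ([session canAddInput:screenInput]) { [session addInput:screenInput]; }

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Sample buffers must be delivered on a serial queue.
    dispatch_queue_t queue = dispatch_queue_create("capture.queue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:queue];
    if ([session canAddOutput:output]) { [session addOutput:output]; }
    [session startRunning];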

Now I want to scale the image buffers while maintaining the aspect ratio for the video file's dimensions. I have used the following code to scale the images.

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    if (pixelBuffer == NULL) { return; }

    // Same buffer as pixelBuffer; a second CMSampleBufferGetImageBuffer call
    // would be redundant.
    CVImageBufferRef imageBuffer = pixelBuffer;

    size_t finalWidth = 1080;
    size_t finalHeight = 720;

    size_t sourceWidth = CVPixelBufferGetWidth(imageBuffer);
    size_t sourceHeight = CVPixelBufferGetHeight(imageBuffer);

    // Fit the source into the target rect while preserving its aspect ratio.
    CGRect aspectRect = AVMakeRectWithAspectRatioInsideRect(CGSizeMake(sourceWidth, sourceHeight), CGRectMake(0, 0, finalWidth, finalHeight));

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Byte offset of the first scaled row inside the letterboxed destination.
    size_t startY = aspectRect.origin.y;
    size_t yOffSet = finalWidth * startY * 4;

    // A read-only lock is sufficient; the source buffer is not modified.
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // calloc instead of malloc so the letterbox bands are black, not garbage.
    void *destData = calloc(finalHeight * finalWidth, 4);

    vImage_Buffer srcBuffer = { baseAddress, sourceHeight, sourceWidth, bytesPerRow };
    vImage_Buffer destBuffer = { (uint8_t *)destData + yOffSet, aspectRect.size.height, aspectRect.size.width, aspectRect.size.width * 4 };

    vImage_Error err = vImageScale_ARGB8888(&srcBuffer, &destBuffer, NULL, kvImageNoFlags);

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    if (err != kvImageNoError) { free(destData); return; }

    OSType pixelFormat = CVPixelBufferGetPixelFormatType(imageBuffer);

    CVImageBufferRef pixelBuffer1 = NULL;

    // NOTE: with a NULL release callback, destData is leaked; a real
    // implementation should pass a callback that frees it.
    CVReturn result = CVPixelBufferCreateWithBytes(NULL, finalWidth, finalHeight, pixelFormat, destData, finalWidth * 4, NULL, NULL, NULL, &pixelBuffer1);
}

I am able to scale the image with the above code, but the final image is blurry compared to resizing the image with the Preview application. Because of this, the video is not clear.

This works fine if I change the output pixel format to RGB (kCVPixelFormatType_32BGRA) with the code below.

    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

But I want the image buffers in YUV format (the default format for AVCaptureVideoDataOutput), since this reduces the buffer size when transferring it over the network.
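
For reference, explicitly requesting a YUV format would look something like this (a sketch assuming the biplanar 4:2:0 video-range variant; the capture default may differ by platform):

    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };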

Image after scaling:

[screenshot omitted]

Image resized with Preview application:

[screenshot omitted]

I have tried using vImageScale_CbCr8 instead of vImageScale_ARGB8888, but the resulting image didn't contain the correct RGB values.
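
As far as I understand, vImageScale_CbCr8 only applies to the interleaved chroma plane of a biplanar buffer; the luma plane needs vImageScale_Planar8. A minimal sketch, assuming the capture is switched to kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and that src and dst pixel buffers of that format already exist (ScaleBiplanarYUV is an illustrative helper name):

    #import <Accelerate/Accelerate.h>
    #import <CoreVideo/CoreVideo.h>

    static vImage_Error ScaleBiplanarYUV(CVPixelBufferRef src, CVPixelBufferRef dst) {
        CVPixelBufferLockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
        CVPixelBufferLockBaseAddress(dst, 0);

        vImage_Error err = kvImageNoError;
        // Plane 0: luma (Y'), one byte per pixel.
        // Plane 1: interleaved chroma (CbCr), one two-byte pair per chroma sample.
        for (size_t plane = 0; plane < 2 && err == kvImageNoError; plane++) {
            vImage_Buffer s = { CVPixelBufferGetBaseAddressOfPlane(src, plane),
                                CVPixelBufferGetHeightOfPlane(src, plane),
                                CVPixelBufferGetWidthOfPlane(src, plane),
                                CVPixelBufferGetBytesPerRowOfPlane(src, plane) };
            vImage_Buffer d = { CVPixelBufferGetBaseAddressOfPlane(dst, plane),
                                CVPixelBufferGetHeightOfPlane(dst, plane),
                                CVPixelBufferGetWidthOfPlane(dst, plane),
                                CVPixelBufferGetBytesPerRowOfPlane(dst, plane) };
            err = (plane == 0) ? vImageScale_Planar8(&s, &d, NULL, kvImageNoFlags)
                               : vImageScale_CbCr8(&s, &d, NULL, kvImageNoFlags);
        }

        CVPixelBufferUnlockBaseAddress(dst, 0);
        CVPixelBufferUnlockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
        return err;
    }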

I have also noticed there is a function to convert the image format:

    vImage_Error vImageConvert_422YpCbYpCr8ToARGB8888(const vImage_Buffer *src, const vImage_Buffer *dest, const vImage_YpCbCrToARGB *info, const uint8_t permuteMap[4], const uint8_t alpha, vImage_Flags flags);

But I don't know what the values for vImage_YpCbCrToARGB and permuteMap should be, as I don't know anything about image processing.
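
From what I can tell, Accelerate can generate the vImage_YpCbCrToARGB info from a matrix and a pixel-range description. The following is a sketch assuming video-range BT.709 input in the 'yuvs' (kvImage422YpCbYpCr8) layout, with srcBuffer/destBuffer being the YUV source and ARGB destination vImage_Buffers; this may not match every capture configuration:

    vImage_YpCbCrToARGB info;
    // Video-range 8-bit levels: Yp in [16, 235], CbCr in [16, 240], biased at 128.
    vImage_YpCbCrPixelRange pixelRange = { 16, 128, 235, 240, 235, 16, 240, 16 };
    vImageConvert_YpCbCrToARGB_GenerateConversion(kvImage_YpCbCrToARGBMatrix_ITU_R_709_2,
                                                  &pixelRange, &info,
                                                  kvImage422YpCbYpCr8, kvImageARGB8888,
                                                  kvImageNoFlags);

    // Identity permute map keeps the ARGB channel order; 255 = opaque alpha.
    uint8_t permuteMap[4] = { 0, 1, 2, 3 };
    vImage_Error err = vImageConvert_422YpCbYpCr8ToARGB8888(&srcBuffer, &destBuffer, &info, permuteMap, 255, kvImageNoFlags);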

Expected Solution:

How to convert YUV pixel buffers to RGB buffers and back to YUV, or how to scale YUV pixel buffers without affecting the RGB values.


Solution

  • After a lot of searching and going through different questions related to image rendering, I found the code below to convert the pixel format of the image buffers. Thanks to the answer in this link.

    // Create the VideoToolbox transfer session once and reuse it for every frame.
    VTPixelTransferSessionRef pixelTransferSession = NULL;
    VTPixelTransferSessionCreate(kCFAllocatorDefault, &pixelTransferSession);

    CVPixelBufferRef imageBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, sourceWidth, sourceHeight, kCVPixelFormatType_32ARGB, NULL, &imageBuffer);
    VTPixelTransferSessionTransferImage(pixelTransferSession, pixelBuffer, imageBuffer);
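
    Since VTPixelTransferSessionTransferImage also scales when the source and destination dimensions differ, the same session can transfer the processed ARGB buffer back into a YUV buffer. A sketch, assuming the biplanar 4:2:0 video-range format is wanted on the wire:

    // Round trip back to YUV after scaling; yuvBuffer is an illustrative name.
    CVPixelBufferRef yuvBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, finalWidth, finalHeight, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, NULL, &yuvBuffer);
    VTPixelTransferSessionTransferImage(pixelTransferSession, imageBuffer, yuvBuffer);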