Tags: objective-c, swift, uiimage, rgb, cgbitmapcontextcreate

Different UIImage RGB data output in Swift and Objective-C apps


I'm working on an image processing framework and use this code to read RGB data:

if let data = image.cgImage?.dataProvider?.data {
  let dataPtr: UnsafePointer<UInt8> = CFDataGetBytePtr(data)
  let width = Int(image.size.width)
  let height = Int(image.size.height)
  for y in 0..<height {
    for x in 0..<width {
      // 4 bytes per pixel, assumed RGBA component order
      let pixelInfo: Int = ((width * y) + x) * 4
      let r = dataPtr[pixelInfo]
      let g = dataPtr[pixelInfo + 1]
      let b = dataPtr[pixelInfo + 2]
      print("\(r), \(g), \(b)")
    }
  }
}
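
Note: image.size is in points, not pixels, and rows can carry padding, so a variant that indexes by the CGImage's own dimensions and bytesPerRow (a sketch of the same idea, below) avoids both assumptions:

if let cgImage = image.cgImage, let data = cgImage.dataProvider?.data {
  let dataPtr: UnsafePointer<UInt8> = CFDataGetBytePtr(data)
  let bytesPerRow = cgImage.bytesPerRow        // rows may be padded
  let bytesPerPixel = cgImage.bitsPerPixel / 8 // don't assume 4
  for y in 0..<cgImage.height {
    for x in 0..<cgImage.width {
      let pixelInfo = y * bytesPerRow + x * bytesPerPixel
      print("\(dataPtr[pixelInfo]), \(dataPtr[pixelInfo + 1]), \(dataPtr[pixelInfo + 2])")
    }
  }
}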

But if I create a new Swift project and a new Objective-C project and run the same code (using a bridging header in the Objective-C project), I get different results, for example:

5, 36, 20;   24, 69, 48 (Swift)
5, 36, 18;   21, 69, 47 (Objc)

This causes significantly different results in further processing. I've also tried reading the data in Objective-C with CGBitmapContextCreate(), but I get the exact same result. Both apps report the same color space; I've tried setting it manually to DeviceRGB and sRGB without any luck.
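
For debugging, a quick dump of the underlying CGImage's pixel format in both apps (a generic sketch, nothing project-specific) shows whether the two really decode to the same layout:

if let cgImage = image.cgImage {
  print("size: \(cgImage.width)x\(cgImage.height)")
  print("bitsPerPixel: \(cgImage.bitsPerPixel), bytesPerRow: \(cgImage.bytesPerRow)")
  print("alphaInfo: \(cgImage.alphaInfo.rawValue), bitmapInfo: \(cgImage.bitmapInfo.rawValue)")
  print("colorSpace: \(String(describing: cgImage.colorSpace))")
}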

I have to match the Objective-C output with an Android app, which produces exactly the same results as the Swift app.

UPDATE: The second thing I tried was writing different code for Objective-C, but it returns exactly the same result, which doesn't match Swift:

size_t bytesSize = 0;
unsigned char *bytes = [self getBytesFromImage:image dataSize:&bytesSize];

size_t doubleSize = sizeof(double) * bytesSize;
double *doubles = (double *)malloc(doubleSize);

size_t doublesIndex = 0;
size_t counter = 0;
while (counter < bytesSize) {
    // 4 bytes per pixel; read R, G, B and skip the alpha byte
    unsigned char r = bytes[counter];
    unsigned char g = bytes[counter+1];
    unsigned char b = bytes[counter+2];
    counter += 4;
}

- (unsigned char *)getBytesFromImage:(UIImage *)image dataSize:(size_t *)dataSize {
    *dataSize = (size_t)(4 * image.size.width * image.size.height);
    // Caller is responsible for free()ing the returned buffer
    unsigned char *imageData = (unsigned char *)malloc(*dataSize);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGImageRef imageRef = [image CGImage];
    CGContextRef bitmap = CGBitmapContextCreate(imageData, image.size.width, image.size.height, 8, image.size.width * 4, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(bitmap, CGRectMake(0, 0, image.size.width, image.size.height), imageRef);

    CGContextRelease(bitmap);
    CGColorSpaceRelease(colorSpace);

    return imageData;
}
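
Note on the flags above: kCGBitmapByteOrder32Little combined with kCGImageAlphaPremultipliedFirst lays pixels out as B, G, R, A in memory, so bytes[counter] in the loop is actually the blue component. A context that produces the R, G, B, A order the first Swift snippet reads would look roughly like this (a sketch in Swift; the equivalent Objective-C flags are kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big):

if let cgImage = image.cgImage {
  let width = cgImage.width
  let height = cgImage.height
  var pixels = [UInt8](repeating: 0, count: width * height * 4)
  let colorSpace = CGColorSpaceCreateDeviceRGB()
  // premultipliedLast + byteOrder32Big => bytes land in memory as R, G, B, A
  let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue | CGBitmapInfo.byteOrder32Big.rawValue
  pixels.withUnsafeMutableBytes { buffer in
    if let context = CGContext(data: buffer.baseAddress, width: width, height: height,
                               bitsPerComponent: 8, bytesPerRow: width * 4,
                               space: colorSpace, bitmapInfo: bitmapInfo) {
      context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    }
  }
  // pixels[0...2] are now the R, G, B of the top-left pixel
}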

Solution

  • Based on your Swift code, I coded what you see below.

    I ran your code and this code side by side, and the output seems to match perfectly, at least for the image I'm using on my side.

    // Dump some bytes
    - ( void ) test
    {
        UIImage * img = [UIImage imageNamed:@"test"];
        CGDataProviderRef dp = CGImageGetDataProvider( img.CGImage );
        CFDataRef data = CGDataProviderCopyData ( dp );
        CFIndex length = CFDataGetLength ( data );
        NSUInteger i = 0;
        unsigned char rgba [ 4 ];

        // Print all
        // while ( i < length )
        // Print first few
        while ( i < 100 )
        {
            CFDataGetBytes( data, CFRangeMake( i, sizeof( rgba ) ), rgba );
            i += 4;

            NSLog ( @"RGB %d %d %d", rgba[ 0 ], rgba[ 1 ], rgba[ 2 ] );
        }
        // Release only the copied data. dp comes from a Get function, so we
        // don't own it and must not release it here.
        CFRelease ( data );
    }
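
    A quick Swift counterpart of the same dump (assuming the same "test" image in the bundle) reads the bytes through the identical provider, which is a handy sanity check that both languages see the same data:

    // Same dump from Swift, for comparison
    func test() {
        guard let img = UIImage(named: "test"),
              let data = img.cgImage?.dataProvider?.data else { return }
        let length = CFDataGetLength(data)
        var rgba = [UInt8](repeating: 0, count: 4)
        var i = 0
        // Print first few (loop to `length` to print all)
        while i < min(100, length) {
            CFDataGetBytes(data, CFRange(location: i, length: 4), &rgba)
            i += 4
            print("RGB \(rgba[0]) \(rgba[1]) \(rgba[2])")
        }
    }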