
Stored UIImage pixel data into c array, unable to determine array's element count


I initialized the array like so:

CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, bounds);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
unsigned char *rawData = malloc(height * width * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

However, when I tried checking the count through an NSLog, I always get 4 (4/1, specifically).

int count = sizeof(rawData)/sizeof(rawData[0]);
NSLog(@"%d", count);

Yet when I NSLog the value of individual elements, it returns non-zero values.

ex.

CGFloat f1 = rawData[15];

CGFloat f2 = rawData[n]; // where n = image width * height * 4
// I wasn't expecting this to work, since the last element should be at n - 1

Finally, I tried

int n = lipBorder.size.width * lipBorder.size.height * 4 * 2;
// lipBorder holds the image's dimensions; I multiplied by 2 because
// there are 2 pixels for every point on a Retina display
CGFloat f = rawData[n];

This would return different values each time for the same image, (ex. 0.000, 115.000, 38.000).

How do I determine the count / how are the values being stored into the array?


Solution

  • rawData is a pointer to unsigned char; as such, its size is 4 bytes[1]. rawData[0] is an unsigned char; its size is 1 byte. Hence, 4/1.

    You've probably seen this done with arrays before, where it does work as you would expect:

    unsigned char temp[10] = {0};
    NSLog(@"%d", sizeof(temp)/sizeof(temp[0])); // Prints 10
    

    Note, however, that you are dealing with a pointer to unsigned char, not an array of unsigned char - the semantics are different, hence why this doesn't work in your case.

    If you want the size of your buffer, you'll be much better off simply using height * width * 4, since that's what you passed to malloc anyway. If you really must, you could divide that by sizeof(char) or sizeof(rawData[0]) to get the number of elements, but since they're chars you'll get the same number anyway.

    Now, rawData is just a chunk of memory somewhere. There's other memory before and after it. So, if you attempt to do something like rawData[height * width * 4], what you're actually doing is attempting to access the next byte of memory after the chunk allocated for rawData. This is undefined behaviour, and can result in random garbage values being returned[2] (as you've observed), some "unassigned memory" marker value being returned, or a segmentation fault occurring.


    [1]: on a 32-bit iOS device; on a 64-bit device a pointer is 8 bytes, so you would see 8/1 instead.
    [2]: probably whatever value was put into that memory location last time it was legitimately used.