
How to create a bitmap context for an alpha mask image?


I want to draw an alpha mask image in code. Right now I do:

1) Create a bitmap context using CGBitmapContextCreate with a CGColorSpaceCreateDeviceRGB color space and kCGImageAlphaPremultipliedFirst bitmap info.

2) Then I draw into this context, using only grayscale colors like white and black.

3) Then I create a mask image from that context, using CGImageMaskCreate.

Conclusion: I waste a lot of memory! From my understanding, a mask image is grayscale only, right? So why create the context in ARGB in the first place?

How can I create a CGContextRef that is intended for drawing a mask image? My thought is to use CGColorSpaceCreateDeviceGray, but this is where the problems start. This is the exact code I use to create my ARGB bitmap context:

CGContextRef    context = NULL;
CGColorSpaceRef colorSpace;
uint32_t *      bitmapData;

int imageWidth = round(size.width);
int imageHeight = round(size.height);

int bitmapBytesPerRow = (imageWidth * 4);   // 4 bytes per pixel for ARGB
int bitmapByteCount = (bitmapBytesPerRow * imageHeight);

colorSpace = CGColorSpaceCreateDeviceRGB();

bitmapData = malloc(bitmapByteCount);

context = CGBitmapContextCreate(bitmapData,
                                imageWidth,
                                imageHeight,
                                8,  // bits per component
                                bitmapBytesPerRow,
                                colorSpace,
                                kCGImageAlphaPremultipliedFirst);

CGColorSpaceRelease(colorSpace);

I am not sure how to compute the bitmapBytesPerRow for such a context. I assume it would be just imageWidth? And what must I supply for bits per component in CGBitmapContextCreate?

There is CGColorSpaceGetNumberOfComponents(), but it reports only the number of components; it does not tell me how many bytes a component has.
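
My best guess is that I can derive it from numbers I choose myself, since the bits per component are my own decision anyway. A rough sketch of the arithmetic I have in mind (assuming 8 bits per component, and assuming that the alpha channel is not counted by CGColorSpaceGetNumberOfComponents):

size_t bitsPerComponent = 8;   // my own choice, not dictated by the color space
size_t colorComponents  = CGColorSpaceGetNumberOfComponents(colorSpace);   // 3 for RGB, 1 for gray
size_t alphaComponents  = 1;   // 0 when using kCGImageAlphaNone
size_t bytesPerPixel    = (colorComponents + alphaComponents) * (bitsPerComponent / 8);
size_t bytesPerRow      = imageWidth * bytesPerPixel;   // width * 4 for ARGB, width * 1 for plain gray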

Also, what makes me nervous is that the 4 and 8 are hard-coded in my code above. Who says it's always 4 bytes per pixel, and who says it's 8 bits per component? I just took this from various sample code out there. Everyone seems to do it this way, and it works. But is it future-proof? Probably not.
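
One idea to at least verify the hard-coded numbers might be to ask the context itself which layout it ended up with, using the CGBitmapContext accessors. A small sketch of that:

size_t actualBitsPerComponent = CGBitmapContextGetBitsPerComponent(context);
size_t actualBitsPerPixel     = CGBitmapContextGetBitsPerPixel(context);
size_t actualBytesPerRow      = CGBitmapContextGetBytesPerRow(context);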

You would make my day with some great answers. Thanks.


Edit: I found a code snippet, but it is confusing:

CGColorSpaceRef colorSpace2 = CGColorSpaceCreateDeviceGray();
CGContextRef gradientBitmapContext = CGBitmapContextCreate(NULL, 1, reflectRect.size.height,
                                                           8, 0, colorSpace2, kCGImageAlphaNone);

Why 0 for bytes per row? The documentation does not say you can pass 0. Looks wrong.
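
My guess is that passing NULL for the data lets Quartz manage its own buffer, and that a bytesPerRow of 0 then means "calculate it for me". A quick sketch I could use to check what it actually picks (320 x 480 is just an arbitrary example size):

CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();
CGContextRef probe = CGBitmapContextCreate(NULL, 320, 480, 8, 0,
                                           graySpace, kCGImageAlphaNone);
size_t chosenBytesPerRow = CGBitmapContextGetBytesPerRow(probe);   // what Quartz decided, possibly padded
CGColorSpaceRelease(graySpace);
CGContextRelease(probe);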


Solution

  • Those parameters tell the system how to treat the data or memory you supply. You created that memory yourself, so you know what layout you intend. What, if anything, the system might want to do with it behind the scenes is not your immediate problem.

    In this case you'll provide 8 bits per sample with just the one component, and you probably won't want any row padding, in which case your bytesPerRow should indeed be the same as the image width.
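
    For instance, a sketch of how that could look (untested; it reuses imageWidth/imageHeight from your question, the drawing step is omitted, and maskData is just a placeholder name), going from an 8-bit grayscale context straight to CGImageMaskCreate:

    CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();

    size_t maskBytesPerRow = imageWidth;                 // 1 component, 1 byte each, no padding
    uint8_t *maskData = calloc(imageHeight, maskBytesPerRow);

    CGContextRef maskContext =
        CGBitmapContextCreate(maskData,
                              imageWidth,
                              imageHeight,
                              8,                         // bits per component
                              maskBytesPerRow,
                              graySpace,
                              kCGImageAlphaNone);
    CGColorSpaceRelease(graySpace);

    // ... draw into maskContext with grayscale colors here ...

    // Wrap the raw bytes in a data provider and build the mask image from it.
    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, maskData,
                                     maskBytesPerRow * imageHeight, NULL);
    CGImageRef mask = CGImageMaskCreate(imageWidth,
                                        imageHeight,
                                        8,               // bits per component
                                        8,               // bits per pixel
                                        maskBytesPerRow,
                                        provider,
                                        NULL,            // decode array
                                        false);          // shouldInterpolate
    CGDataProviderRelease(provider);
    CGContextRelease(maskContext);
    // maskData must outlive the mask, or be freed through a
    // CGDataProviderReleaseDataCallback rather than a plain free().

    From there the mask should be usable exactly as before, for example with CGContextClipToMask() or CGImageCreateWithMask().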