iphone · ios · uiimage · quartz-graphics · cgimage

Can I edit the alpha mask of a UIImage in a UIImageView without having to move too much memory?


I want to take an image (a brush) and draw it into a displayed image. I only want to affect the alpha channel of that displayed image, and I need to export the result later.

From what I've seen, most directions only really get into some costly-looking operations that don't pan out, i.e. they recommend you draw into an offscreen context, create a CGImage of the mask, and build a new image with CGImageCreateWithMask pretty much every time the brush is applied.
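
For reference, the per-stroke rebuild I mean looks roughly like this (only a sketch; baseImage, brushImage, maskSize, imageView and point are placeholder names, not code from any particular guide):

CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
CGContextRef maskContext = CGBitmapContextCreate(NULL, (size_t)maskSize.width, (size_t)maskSize.height,
                                                 8, 0, gray, kCGImageAlphaNone);
CGColorSpaceRelease(gray);

//stamp the brush into the offscreen mask context at the touch point
//(in practice the previously accumulated mask would also have to be kept around or redrawn)
CGContextDrawImage(maskContext, CGRectMake(point.x - 20, point.y - 20, 40, 40), [brushImage CGImage]);

//snapshot the whole mask and build an entirely new masked image from the original
CGImageRef maskImage = CGBitmapContextCreateImage(maskContext);
CGImageRef maskedImage = CGImageCreateWithMask([baseImage CGImage], maskImage);
imageView.image = [UIImage imageWithCGImage:maskedImage];

CGImageRelease(maskedImage);
CGImageRelease(maskImage);
CGContextRelease(maskContext);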

I already know this is costly, because even just drawing into an offscreen context like that is rather rough on the iPhone.

What I'd like to do is take the UIImage of a UIImageView and manipulate its alpha channel directly. I'm also not doing this pixel by pixel, but with a largish (20px radius) brush that has a softness of its own.


Solution

  • I would not use a UIImageView for this. A normal UIView is enough.

    Just put the image into the layer with:

    UIView *view = ...
    view.layer.contents = (id)image.CGImage;
    

    After that you can make parts of the image transparent by adding a mask to the layer:

    CALayer *mask = [[CALayer alloc] init];
    mask.contents = maskimage.CGImage;
    view.layer.mask = mask;
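
    One detail that is easy to miss: the mask layer needs its own geometry, or it will have a zero-sized frame and mask everything out. Assuming the mask should cover the whole view, something like:

    mask.frame = view.layer.bounds; //the mask is positioned in the coordinate space of the layer it masks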
    

    For a project I did something similar, where I had a brush.png that you could use to reveal an image with a finger... my update-mask function there was:

    - (void)updateMask {
    
        const CGSize size = self.bounds.size;
        const size_t bitsPerComponent = 8;
        const size_t bytesPerRow = size.width; //1 byte per pixel (alpha only)
        BOOL freshData = NO;
        if(NULL == _maskData || !CGSizeEqualToSize(size, _maskSize)) {
            free(_maskData); //free(NULL) is a no-op, so this is safe the first time through
            _maskData = calloc(bytesPerRow * size.height, sizeof(char));
            _maskSize = size;
            freshData = YES;
        }
    
        //drop the layer's reference to the previous mask image so its backing data isn't copied when we draw into the buffer again
        _maskLayer.contents = nil;
        //create a context to draw into the mask
        CGContextRef context = CGBitmapContextCreate(_maskData, size.width, size.height,
                                                     bitsPerComponent, bytesPerRow,
                                                     NULL, //no color space is needed for an alpha-only bitmap
                                                     kCGImageAlphaOnly);
        if(NULL == context) {
            LogDebug(@"Could not create the context");
            return;
        }
    
        if(freshData) {
            //clear to alpha == 0, which means nothing gets revealed yet
            CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height));
        }
    
        CGContextTranslateCTM(context, 0, self.bounds.size.height);
        CGContextScaleCTM(context, 1.0f, -1.0f);
    
        //Draw all the points in the array into a mask
        for (NSValue* pointValue in _pointsToDraw)
        {
            CGPoint point;
            [pointValue getValue:&point];
            //LogDebug(@"location: %@", NSStringFromCGPoint(point));
    
            [self drawBrush:[_brush CGImage] at:point inContext:context];
        }
        [_pointsToDraw removeAllObjects];
    
        //extract an image from it
        CGImageRef newMask = CGBitmapContextCreateImage(context);
    
        //release the context
        CGContextRelease(context);
    
        //now update the mask layer
        _maskLayer.contents = (id)newMask;
        //self.layer.contents = (id)newMask;
        //and release the mask as it's retained by the layer
        CGImageRelease(newMask);
    }
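
    The drawBrush:at:inContext: helper isn't shown above; a minimal sketch of it (just centering the brush stamp on the touch point) could look like this:

    - (void)drawBrush:(CGImageRef)brush at:(CGPoint)point inContext:(CGContextRef)context {
        const CGFloat w = CGImageGetWidth(brush);
        const CGFloat h = CGImageGetHeight(brush);
        //center the brush image on the touch point; its alpha accumulates into the mask
        CGContextDrawImage(context, CGRectMake(point.x - w / 2.0f, point.y - h / 2.0f, w, h), brush);
    }

    As for exporting (which the question asks about): -renderInContext: is known not to apply layer masks, so one option is to composite the original image through the mask yourself. A sketch, where image and maskCGImage stand in for the original image and the mask, and assuming the mask is kept as (or converted to) a DeviceGray image without alpha, since that is what CGContextClipToMask expects:

    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGRect rect = CGRectMake(0, 0, image.size.width, image.size.height);
    //clip to the mask, then draw the original image through it
    //(depending on how the mask was generated it may need a vertical flip first)
    CGContextClipToMask(ctx, rect, maskCGImage);
    [image drawInRect:rect];
    UIImage *exported = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();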