
Create CGImage From CGBitmapContext and Add to UIImageView


I am trying to create a snapshot of a UICollectionViewCell by creating a CGBitmapContext. I am not entirely clear on how to do this or how to use the associated classes, but after a bit of research, I have written the following method, which is called from inside my UICollectionViewCell subclass:

- (void)snapShotOfCell
{
    float scaleFactor = [[UIScreen mainScreen] scale];
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 self.frame.size.width * scaleFactor,
                                                 self.frame.size.height * scaleFactor,
                                                 8,
                                                 self.frame.size.width * scaleFactor * 4,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedFirst);

    CGImageRef image = CGBitmapContextCreateImage(context);
    UIImage *snapShot = [[UIImage alloc]initWithCGImage:image];

    UIImageView *imageView = [[UIImageView alloc]initWithFrame:self.frame];
    imageView.image = snapShot;
    imageView.opaque = YES;
    [self addSubview:imageView];

     CGImageRelease(image);
     CGContextRelease(context);
     CGColorSpaceRelease(colorSpace);
}

The result is that the image does not appear. Upon debugging, I can determine that I have a valid (non-nil) context, CGImage, UIImage, and UIImageView, but nothing appears onscreen. Can someone tell me what I am missing?


Solution

  • You can add this as a category on UIView so that it is accessible from any view:

    - (UIImage*) snapshot
    {
        UIGraphicsBeginImageContextWithOptions(self.frame.size, YES /*opaque*/, 0 /*auto scale*/);
        [self.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage* image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return image;
    }
    

    Then you just need to do [self addSubview:[[UIImageView alloc] initWithImage:self.snapshot]] from your cell object.
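
    As an aside, the reason your original code produced nothing is that the bitmap context was never drawn into: CGBitmapContextCreateImage was called on a context that was still blank. If you wanted to stay with raw Core Graphics, a minimal sketch (untested; note that a bare CGBitmapContext is not flipped the way a UIGraphics context is, so the CTM has to be adjusted) might look like this:

    - (void)snapShotOfCell
    {
        CGFloat scaleFactor = [[UIScreen mainScreen] scale];
        size_t width  = (size_t)(self.bounds.size.width * scaleFactor);
        size_t height = (size_t)(self.bounds.size.height * scaleFactor);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // Passing 0 for bytesPerRow lets Core Graphics pick a suitable value.
        CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0,
                                                     colorSpace,
                                                     kCGImageAlphaPremultipliedFirst);

        // Flip and scale so the layer (UIKit coordinates, y-down) renders
        // right side up at Retina resolution.
        CGContextTranslateCTM(context, 0, height);
        CGContextScaleCTM(context, scaleFactor, -scaleFactor);

        // The step that was missing: actually render the layer into the context.
        [self.layer renderInContext:context];

        CGImageRef image = CGBitmapContextCreateImage(context);
        UIImage *snapShot = [[UIImage alloc] initWithCGImage:image
                                                       scale:scaleFactor
                                                 orientation:UIImageOrientationUp];

        UIImageView *imageView = [[UIImageView alloc] initWithImage:snapShot];
        imageView.frame = self.bounds;
        [self addSubview:imageView];

        CGImageRelease(image);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
    }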

    [EDIT]

    Given the need for asynchronous rendering (totally understandable), this can be achieved using dispatch queues. I think this would work:

    typedef void(^ImageOutBlock)(UIImage* image);
    
    - (void) snapshotAsync:(ImageOutBlock)block
    {
        CGFloat scale = [[UIScreen mainScreen] scale];
        CALayer* layer = self.layer;
        CGRect frame = self.frame;
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^() {
            // UIGraphicsBeginImageContextWithOptions creates and manages its
            // own bitmap context, so no manual CGBitmapContextCreate is needed.
            UIGraphicsBeginImageContextWithOptions(frame.size, YES /*opaque*/, scale);
            [layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage* image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            dispatch_async(dispatch_get_main_queue(), ^() {
                block(image);
            });
        });
    }
    

    [EDIT]

    Calling it from the cell would then look like this:

    - (void) execute
    {
        __weak typeof(self) weakSelf = self;
        [self snapshotAsync:^(UIImage* image) { 
            [weakSelf addSubview:[[UIImageView alloc] initWithImage:image]];
        }];
    }
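
    One more note: if you can target iOS 7 or later, UIKit's built-in snapshotting API avoids manual bitmap handling entirely. A minimal sketch (the method name addUIKitSnapshot is just illustrative):

    - (void) addUIKitSnapshot
    {
        // snapshotViewAfterScreenUpdates: returns a lightweight view that
        // mirrors the receiver's current contents (available since iOS 7).
        UIView* snapshot = [self snapshotViewAfterScreenUpdates:NO];
        [self addSubview:snapshot];
    }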