Tags: performance, asynchronous, uiimageview, uiimage, core-graphics

Setting image property of UIImageView causes major lag


Let me tell you about the problem I am having and how I tried to solve it. I have a UIScrollView that loads subviews as one scrolls from left to right. Each subview contains 10-20 images, each around 400×200. When I scroll from view to view, I experience quite a bit of lag.

After investigating, I discovered that after unloading all the views and trying again, the lag was gone. I figured that the synchronous caching of the images was the cause of the lag, so I created a subclass of UIImageView that loads its images asynchronously. The loading code looks like the following (self.dispatchQueue returns a serial dispatch queue).

- (void)loadImageNamed:(NSString *)name {
    dispatch_async(self.dispatchQueue, ^{
        UIImage *image = [UIImage imageNamed:name];

        dispatch_sync(dispatch_get_main_queue(), ^{
            self.image = image;
        });
    });
}

However, after changing all of my UIImageViews to this subclass, I still experienced lag (I'm not sure if it was lessened or not). I boiled down the cause of the problem to self.image = image;. Why is this causing so much lag (but only on the first load)?

Please help me. =(


Solution

  • EDIT 3: iOS 15 now offers UIImage.prepareForDisplay(completionHandler:) and its async counterpart byPreparingForDisplay().

    imageView.image = await image.byPreparingForDisplay()
    

    or

    image.prepareForDisplay { decodedImage in
        DispatchQueue.main.async {
            imageView.image = decodedImage
        }
    }
    

    EDIT 2: Here is a Swift version that contains a few improvements. (Untested.) https://gist.github.com/fumoboy007/d869e66ad0466a9c246d
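    For reference, the core of the approach can be sketched in modern Swift without dropping to Core Graphics. This is a minimal, untested sketch (it assumes iOS 10+ for UIGraphicsImageRenderer; the decompressed() name is just an illustration, not an API):

    ```swift
    import UIKit

    extension UIImage {
        // Force decoding by drawing the image into an offscreen renderer.
        // The returned image is backed by an already-decompressed bitmap,
        // so assigning it to a UIImageView doesn't stall the main thread.
        func decompressed() -> UIImage {
            let format = UIGraphicsImageRendererFormat()
            format.scale = scale
            return UIGraphicsImageRenderer(size: size, format: format).image { _ in
                draw(at: .zero)
            }
        }
    }
    ```

    As in the Objective-C version below, you would call decompressed() on a background queue and then assign the result to the image view on the main queue.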


    EDIT: Actually, I believe all that is necessary is the following. (Untested.)

    - (void)loadImageNamed:(NSString *)name {
        dispatch_async(self.dispatchQueue, ^{
            // Determine path to image depending on scale of device's screen,
            // fallback to 1x if 2x is not available
            NSString *pathTo1xImage = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
            NSString *pathTo2xImage = [[NSBundle mainBundle] pathForResource:[name stringByAppendingString:@"@2x"] ofType:@"png"];
    
            NSString *pathToImage = ([UIScreen mainScreen].scale == 1 || !pathTo2xImage) ? pathTo1xImage : pathTo2xImage;
    
    
            UIImage *image = [[UIImage alloc] initWithContentsOfFile:pathToImage];
    
            // Decompress image
            if (image) {
                UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    
                [image drawAtPoint:CGPointZero];
    
                image = UIGraphicsGetImageFromCurrentImageContext();
    
                UIGraphicsEndImageContext();
            }
    
    
            // Configure the UI with pre-decompressed UIImage
            dispatch_async(dispatch_get_main_queue(), ^{
                self.image = image;
            });
        });
    }
    

    ORIGINAL ANSWER: It turns out that the culprit wasn't self.image = image; itself. The UIImage loading methods don't decompress and process the image data right away; they defer that work until the view refreshes its display. So the solution was to go a level lower, to Core Graphics, and decompress and process the image data myself. The new code looks like the following.

    - (void)loadImageNamed:(NSString *)name {
        dispatch_async(self.dispatchQueue, ^{
            // Determine path to image depending on scale of device's screen,
            // fallback to 1x if 2x is not available
            NSString *pathTo1xImage = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
            NSString *pathTo2xImage = [[NSBundle mainBundle] pathForResource:[name stringByAppendingString:@"@2x"] ofType:@"png"];
            
            NSString *pathToImage = ([UIScreen mainScreen].scale == 1 || !pathTo2xImage) ? pathTo1xImage : pathTo2xImage;
            
            
            UIImage *uiImage = nil;
            
            if (pathToImage) {
                // Load the image
                CGDataProviderRef imageDataProvider = CGDataProviderCreateWithFilename([pathToImage fileSystemRepresentation]);
                CGImageRef image = CGImageCreateWithPNGDataProvider(imageDataProvider, NULL, NO, kCGRenderingIntentDefault);
                
                
                // Create a bitmap context from the image's specifications
                // (Note: We need to specify kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little
                // because PNGs are optimized by Xcode this way.)
                CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
                CGContextRef bitmapContext = CGBitmapContextCreate(NULL, CGImageGetWidth(image), CGImageGetHeight(image), CGImageGetBitsPerComponent(image), CGImageGetWidth(image) * 4, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
                
                
                // Draw the image into the bitmap context
                CGContextDrawImage(bitmapContext, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
                
                //  Extract the decompressed image
                CGImageRef decompressedImage = CGBitmapContextCreateImage(bitmapContext);
                
                
            // Create a UIImage, preserving the scale of the chosen asset
            // (otherwise a @2x image would display at double size)
            uiImage = [[UIImage alloc] initWithCGImage:decompressedImage scale:(pathToImage == pathTo2xImage ? 2.0 : 1.0) orientation:UIImageOrientationUp];
                
                
                // Release everything
                CGImageRelease(decompressedImage);
                CGContextRelease(bitmapContext);
                CGColorSpaceRelease(colorSpace);
                CGImageRelease(image);
                CGDataProviderRelease(imageDataProvider);
            }
            
            
            // Configure the UI with pre-decompressed UIImage
            dispatch_async(dispatch_get_main_queue(), ^{
                self.image = uiImage;
            });
        });
    }