Tags: ios · swift · cifilter · ciimage

Applying CIFilter to UIImage results in resized and repositioned image


After applying a CIFilter to a photo captured with the camera, the resulting image shrinks and repositions itself.

I assumed that if I could get the original image's size and orientation, the result would scale accordingly, and I pinned the image view to the edges of the screen. However, nothing changes with this approach, and I'm not aware of a way to get the filtered image to scale to the full size of the screen.

func applyBloom() -> UIImage {
  let ciImage = CIImage(image: image) // image is from UIImageView

  let filteredImage = ciImage?.applyingFilter("CIBloom",
                                              withInputParameters: [ kCIInputRadiusKey: 8,
                                                                     kCIInputIntensityKey: 1.00 ])
  let originalScale = image.scale
  let originalOrientation = image.imageOrientation


  if let image = filteredImage {
    let image = UIImage(ciImage: image, scale: originalScale, orientation: originalOrientation)
    return image
  }

  return self.image
}

Picture description: the captured photo and a screenshot of the filtered result; the empty spacing is caused by the image shrinking.



Solution

  • Try something like this. Replace your function with:

    func applyBloom() -> UIImage {
        guard let ciInputImage = CIImage(image: image) else { return image } // image is from UIImageView
        let ciOutputImage = ciInputImage.applyingFilter("CIBloom",
                                                        withInputParameters: [kCIInputRadiusKey: 8,
                                                                              kCIInputIntensityKey: 1.00])
        let context = CIContext()
        // Render at the input image's extent so the output keeps the original size.
        guard let cgOutputImage = context.createCGImage(ciOutputImage, from: ciInputImage.extent) else {
            return image
        }
        return UIImage(cgImage: cgOutputImage)
    }
    
    • I renamed various variables to help explain what's happening.
    • Obviously, depending on your code, some tweaking to optionals and unwrapping may be needed.

    What's happening is this: take the filtered/output CIImage and, using a CIContext, render a CGImage the size of the input CIImage's extent.

    • Be aware that a CIContext is expensive. If you already have one created, you should probably use it.
    • Pretty much, a UIImage size is the same as a CIImage extent. (I say pretty much because some generated CIImages can have infinite extents.)
    • Depending on your specific needs (and your UIImageView), you may want to use the output CIImage extent instead. Usually though, they are the same.
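    The reuse point above can be sketched like this. A minimal sketch, assuming a small wrapper type of your own (the type and property names here are illustrative, not from the original answer): the `CIContext` is created once and shared across every filter call, and the caller passes in whichever extent (input or output) fits its needs.

    ```swift
    import UIKit
    import CoreImage

    final class Filterer {
        // Expensive to create — make it once and reuse it for every render.
        private let context = CIContext()

        // Render a filtered CIImage into a UIImage at the given extent.
        func render(_ ciImage: CIImage, over extent: CGRect) -> UIImage? {
            guard let cgImage = context.createCGImage(ciImage, from: extent) else { return nil }
            return UIImage(cgImage: cgImage)
        }
    }
    ```

    With a shared instance like this, `applyBloom()` reduces to building the filtered `CIImage` and calling `render(_:over:)` with the input image's extent.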

    Last, a suggestion. If you are trying to use a CIFilter to show near-real-time changes to an image (like a photo editor), consider the major performance improvement you'll get by using CIImages and a GLKView over UIImages and a UIImageView. The former uses the device's GPU instead of the CPU.
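    The GPU path can be sketched roughly as follows. This is a minimal, illustrative subclass (the `image` property and initializer setup are assumptions, not code from the answer): a `CIContext` is built on top of the same `EAGLContext` as the `GLKView`, so filtered `CIImage`s are drawn directly by the GPU without a `UIImage` round trip.

    ```swift
    import GLKit
    import CoreImage

    final class LiveFilterView: GLKView {
        private var ciContext: CIContext!

        // The CIImage to display; redraw whenever it changes.
        var image: CIImage? { didSet { setNeedsDisplay() } }

        override init(frame: CGRect) {
            let eaglContext = EAGLContext(api: .openGLES2)!
            super.init(frame: frame, context: eaglContext)
            // Share the EAGL context so Core Image renders straight to this view on the GPU.
            ciContext = CIContext(eaglContext: eaglContext)
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        override func draw(_ rect: CGRect) {
            guard let image = image else { return }
            // drawableWidth/drawableHeight are in pixels, not points.
            let destination = CGRect(x: 0, y: 0, width: drawableWidth, height: drawableHeight)
            ciContext.draw(image, in: destination, from: image.extent)
        }
    }
    ```

    Setting `image` to the output of `applyingFilter(_:withInputParameters:)` on each slider change then keeps the whole filter-and-display loop on the GPU. (On newer systems, `MTKView` with a Metal-backed `CIContext` is the equivalent approach, since GLKit is deprecated.)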