Tags: ios, swift, ciimage, imagefilter

Image view image either not being set or not filtered properly


I have an image view whose image I set either from the camera or from the gallery, and then I apply a filter to it using CIImage. The problem is that when I apply the filter and set the image back to the image view, the image appears offset with a gap. I don't have a problem with a gap as such, but it should be equal on both sides, maintaining the original image's aspect ratio. The filtered capture looks fine when I open the captured image in the Photos app, and the black part is part of the image view itself (checked in the Debug View Hierarchy).

capturingImageController (captures the image and passes it along) -> filterViewController (filters and displays the image)

The weird part is that it doesn't happen for all images, only for those I capture with this app. If I don't filter the image and set it directly, it displays perfectly. So I don't know whether the problem is in the filtering or in the capturing. PS: if I open the same image in the Photos app it looks fine.

This is the code used to add the filter to the image:

 override func viewDidLoad() {
    super.viewDidLoad()
    self.navigationController?.navigationBar.isHidden = true
    self.selectedImageView.image  = image

    //------ If I comment out the filtering part below, the image looks perfect,
    //------ but with the filter applied it is not centered under scaleAspectFill.
    self.selectedImageView.contentMode = .scaleAspectFit
    guard let image = image, let cgimg = image.cgImage else {
        print("imageView doesn't have an image!")
        return
    }
    let coreImage = CIImage(cgImage: cgimg)
    let filter = CIFilter(name: "CISepiaTone")
    filter?.setValue(coreImage, forKey: kCIInputImageKey)
    filter?.setValue(1, forKey: kCIInputIntensityKey)

    if let output = filter?.value(forKey: kCIOutputImageKey) as? CIImage {
        let filteredImage: UIImage = UIImage(ciImage: output)
        selectedImageView?.image = filteredImage
        UIImageWriteToSavedPhotosAlbum(filteredImage, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)            
        self.selectedImageView.contentMode = UIViewContentMode.scaleAspectFill
    } else {
        print("image filtering failed")
    }
    // ------
} 

Code after capturing the image:

public func photoOutput(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                        resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Swift.Error?) {
    if let error = error { self.photoCaptureCompletionBlock?(nil, error) }

    else if let buffer = photoSampleBuffer, let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil),
        var image = UIImage(data: data) {
        if currentCameraPosition == .front {
            image = UIImage(cgImage: image.cgImage!, scale: image.scale, orientation: .leftMirrored)
        }

        self.photoCaptureCompletionBlock?(image, nil)
    }

    else {
        self.photoCaptureCompletionBlock?(nil, CameraControllerError.unknown)
    }
}
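As a side note, this delegate signature and `jpegPhotoDataRepresentation(forJPEGSampleBuffer:previewPhotoSampleBuffer:)` were deprecated in iOS 11. A minimal sketch of the newer `AVCapturePhotoCaptureDelegate` callback, assuming the same `photoCaptureCompletionBlock`, `currentCameraPosition`, and `CameraControllerError` from the question:

```swift
import AVFoundation
import UIKit

// iOS 11+ replacement for the deprecated didFinishProcessingPhoto
// variant that passed CMSampleBuffers.
public func photoOutput(_ output: AVCapturePhotoOutput,
                        didFinishProcessingPhoto photo: AVCapturePhoto,
                        error: Error?) {
    if let error = error {
        self.photoCaptureCompletionBlock?(nil, error)
        return
    }
    // fileDataRepresentation() replaces jpegPhotoDataRepresentation(...).
    guard let data = photo.fileDataRepresentation(),
          var image = UIImage(data: data) else {
        self.photoCaptureCompletionBlock?(nil, CameraControllerError.unknown)
        return
    }
    if currentCameraPosition == .front, let cgImage = image.cgImage {
        // Mirror the front-camera shot, keeping the original scale.
        image = UIImage(cgImage: cgImage,
                        scale: image.scale,
                        orientation: .leftMirrored)
    }
    self.photoCaptureCompletionBlock?(image, nil)
}
```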

I also have to add different filters based on the user swiping left or right, like Instagram, plus crop functionality (square, rectangle, and other ratios) and other features, but first this demo needs to work properly :-( If you know of any tutorial or sample code that does this kind of thing, please share a link. Thanks in advance.

Filter applied to an old picture from the gallery


Solution

  • I was missing only one thing: setting the scale and orientation of the original image. Whenever you apply a filter, store the original image's scale and orientation, then set them back when converting the result to a UIImage:

    let filteredImage = UIImage(cgImage: filteredImageRef!, scale: imgScale!, orientation: imgOrientation!)
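Putting the fix together, here is a hedged sketch of a sepia helper that preserves the input's scale and orientation and renders through a `CIContext`. The function name and the shared context are my additions, not part of the original code:

```swift
import UIKit
import CoreImage

// Reused context: creating a CIContext per filter call is expensive.
private let ciContext = CIContext()

/// Applies CISepiaTone and returns a UIImage that preserves the
/// input's scale and orientation, so UIImageView lays it out correctly.
func sepiaFiltered(_ image: UIImage, intensity: Float = 1.0) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }
    let input = CIImage(cgImage: cgImage)

    guard let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(intensity, forKey: kCIInputIntensityKey)

    guard let output = filter.outputImage,
          // Render to a real CGImage so the result can also be saved with
          // UIImageWriteToSavedPhotosAlbum (a purely CIImage-backed
          // UIImage often cannot be).
          let rendered = ciContext.createCGImage(output, from: output.extent)
    else { return nil }

    // The fix from the answer: carry over scale and orientation.
    return UIImage(cgImage: rendered,
                   scale: image.scale,
                   orientation: image.imageOrientation)
}
```

In `viewDidLoad` you would then set `selectedImageView.image = sepiaFiltered(image)`, and the uneven gap under scaleAspectFill should disappear.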