Tags: ios, swift, kernel, core-image, metal

Convert colours of every pixel in video preview - Swift


I have the following code which displays a camera preview, retrieves a single pixel's colour from the UIImage and converts this value to a 'filtered' colour.

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    connection.videoOrientation = orientation

    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let cameraImage = CIImage(cvImageBuffer: pixelBuffer)

    let typeOfColourBlindness = ColourBlindType(rawValue: "deuteranomaly")

    /* Gets colour from a single pixel - currently 0,0 and converts it into the 'colour blind' version */

    let captureImage = convert(cmage: cameraImage)

    let colour = captureImage.getPixelColour(pos: CGPoint(x: 0, y: 0))

    var redval: CGFloat = 0
    var greenval: CGFloat = 0
    var blueval: CGFloat = 0
    var alphaval: CGFloat = 0

    _ = colour.getRed(&redval, green: &greenval, blue: &blueval, alpha: &alphaval)
    print("Colours are r: \(redval) g: \(greenval) b: \(blueval) a: \(alphaval)")

    let filteredColour = CBColourBlindTypes.getModifiedColour(.deuteranomaly, red: Float(redval), green: Float(greenval), blue: Float(blueval))
    print(filteredColour)

    /* #################################################################################### */

    DispatchQueue.main.async {
        // placeholder for now
        self.filteredImage.image = self.applyFilter(cameraImage: cameraImage, colourBlindness: typeOfColourBlindness!)
    }
}
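Note that creating the AVCaptureVideoDataOutput belongs in one-time session setup rather than in the per-frame callback. A minimal sketch, assuming the delegate is the owning view controller (the method name `setupSession` is an assumption; this will not compile outside such a class):

```swift
import AVFoundation

// One-time capture configuration; the per-frame delegate callback should
// only consume sample buffers. `setupSession` is an assumed name.
func setupSession() {
    let session = AVCaptureSession()
    session.sessionPreset = .high

    guard let camera = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input) else { return }
    session.addInput(input)

    let videoOutput = AVCaptureVideoDataOutput()
    // Deliver frames on a background queue; hop to main only for UI work.
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
    if session.canAddOutput(videoOutput) {
        session.addOutput(videoOutput)
    }

    session.startRunning()
}
```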

Here is where the x: 0, y: 0 pixel value is converted:

import Foundation

enum ColourBlindType: String {
    case deuteranomaly = "deuteranomaly"
    case protanopia = "protanopia"
    case deuteranopia = "deuteranopia"
    case protanomaly = "protanomaly"
}

class CBColourBlindTypes: NSObject {
    class func getModifiedColour(_ type: ColourBlindType, red: Float, green: Float, blue: Float) -> [Float] {
        switch type {
        case .deuteranomaly:
            return [(red*0.80)+(green*0.20)+(blue*0),
                    (red*0.25833)+(green*0.74167)+(blue*0),
                    (red*0)+(green*0.14167)+(blue*0.85833)]
        case .protanopia:
            return [(red*0.56667)+(green*0.43333)+(blue*0),
                    (red*0.55833)+(green*0.44167)+(blue*0),
                    (red*0)+(green*0.24167)+(blue*0.75833)]
        case .deuteranopia:
            return [(red*0.625)+(green*0.375)+(blue*0),
                    (red*0.7)+(green*0.3)+(blue*0),
                    (red*0)+(green*0.3)+(blue*0.7)]
        case .protanomaly:
            return [(red*0.81667)+(green*0.18333)+(blue*0.0),
                    (red*0.33333)+(green*0.66667)+(blue*0.0),
                    (red*0.0)+(green*0.125)+(blue*0.875)]
        }
    }
}
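The four cases above are all 3×3 matrix multiplies, and each row of weights sums to 1, so neutral greys (r == g == b) pass through unchanged. A standalone sketch of the same transform in pure Swift (`Matrix3` and `apply` are assumed names, not part of the original code):

```swift
// The deuteranopia weights from getModifiedColour, written as a 3x3 matrix.
typealias Matrix3 = [[Float]]

let deuteranopiaMatrix: Matrix3 = [
    [0.625, 0.375, 0.0],  // red'
    [0.7,   0.3,   0.0],  // green'
    [0.0,   0.3,   0.7],  // blue'
]

// Multiply the matrix by an [r, g, b] vector: each output channel is a
// weighted mix of the input channels.
func apply(_ m: Matrix3, to rgb: [Float]) -> [Float] {
    m.map { row in zip(row, rgb).map { $0 * $1 }.reduce(0, +) }
}
```

For example, `apply(deuteranopiaMatrix, to: [1, 0, 0])` yields `[0.625, 0.7, 0.0]`, matching the deuteranopia case above.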

The placeholder for now comment refers to the following function:

func applyFilter(cameraImage: CIImage, colourBlindness: ColourBlindType) -> UIImage {

    // TODO: transform pixels for the requested colour-blindness type

    /*      Placeholder code for shifting the hue      */

    // Create a place to render the filtered image
    // (note: creating a CIContext per frame is expensive; reuse one if possible)
    let context = CIContext(options: nil)

    // Hue rotation angle in radians (207 degrees)
    let filterAngle = 207 * Double.pi / 180

    // Parameters for the hue-adjust filter
    let hueParams = [kCIInputAngleKey: filterAngle]

    // Apply the filter to the image
    let filteredImage = cameraImage.applyingFilter("CIHueAdjust", parameters: hueParams)

    // Render the filtered image
    let renderedImage = context.createCGImage(filteredImage, from: filteredImage.extent)

    // Return a UIImage
    return UIImage(cgImage: renderedImage!)
}
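Instead of shifting the hue, the same per-channel weights from CBColourBlindTypes can be applied to the whole frame in one pass with the built-in CIColorMatrix filter. A sketch for the deuteranopia case (`applyDeuteranopia` is an assumed name; note that CIColorMatrix operates in Core Image's working colour space):

```swift
import CoreImage
import UIKit

// Apply the deuteranopia weights to every pixel at once using CIColorMatrix.
// The vector components mirror the rows returned by getModifiedColour.
func applyDeuteranopia(to cameraImage: CIImage, context: CIContext) -> UIImage? {
    let filtered = cameraImage.applyingFilter("CIColorMatrix", parameters: [
        "inputRVector": CIVector(x: 0.625, y: 0.375, z: 0,   w: 0),
        "inputGVector": CIVector(x: 0.7,   y: 0.3,   z: 0,   w: 0),
        "inputBVector": CIVector(x: 0,     y: 0.3,   z: 0.7, w: 0),
        "inputAVector": CIVector(x: 0,     y: 0,     z: 0,   w: 1),
    ])
    guard let cgImage = context.createCGImage(filtered, from: filtered.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```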

And here is my extension for retrieving a pixel colour:

extension UIImage {
    func getPixelColour(pos: CGPoint) -> UIColor {

        guard let cgImage = self.cgImage,
              let pixelData = cgImage.dataProvider?.data,
              let data = CFDataGetBytePtr(pixelData) else {
            return .clear
        }

        // Index with the bitmap's bytesPerRow rather than size.width:
        // size is measured in points, and rows may contain padding bytes.
        let bytesPerPixel = cgImage.bitsPerPixel / 8
        let pixelInfo = cgImage.bytesPerRow * Int(pos.y) + Int(pos.x) * bytesPerPixel

        // Note: camera buffers are commonly BGRA; if the colours look
        // swapped, exchange the red and blue offsets below.
        let r = CGFloat(data[pixelInfo])     / 255.0
        let g = CGFloat(data[pixelInfo + 1]) / 255.0
        let b = CGFloat(data[pixelInfo + 2]) / 255.0
        let a = CGFloat(data[pixelInfo + 3]) / 255.0

        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}

How can I create a filter for the following colour range, for example? [image: example colour range]

I want to take the camera input, replace the colours with their Deuteranopia equivalents, and display the result on the screen in real time, using Swift.

I am using a UIImageView for the image display.


Solution

  • To learn how to filter video capture and display the filtered image in real time, you may want to study Apple's AVCamPhotoFilter sample code, as well as other sources such as this objc.io tutorial.

    In short, using a UIImage for real-time rendering is not a good idea: it is too slow. Use OpenGL (e.g. GLKView) or Metal (e.g. MTKView) instead. The AVCamPhotoFilter code uses MTKView and renders to intermediate buffers, but you can also render a CIImage directly using the appropriate CIContext methods, e.g. for Metal: https://developer.apple.com/documentation/coreimage/cicontext/1437835-render

    In addition, regarding your colour filter: you may want to look at the CIColorCube Core Image filter, as shown here.
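As a concrete starting point for the CIColorCube route, the cube's lookup table can be generated in pure Swift from the deuteranopia weights in the question (`makeDeuteranopiaCubeData` is an assumed name; a real filter would wrap the result in Data and pass it as inputCubeData alongside inputCubeDimension):

```swift
// Build the lookup data for a CIColorCube filter from the deuteranopia
// matrix. Pure computation: no Core Image needed to generate the table.
func makeDeuteranopiaCubeData(dimension: Int) -> [Float] {
    var data = [Float]()
    data.reserveCapacity(dimension * dimension * dimension * 4)
    let n = Float(dimension - 1)
    for b in 0..<dimension {
        for g in 0..<dimension {
            for r in 0..<dimension {
                let rf = Float(r) / n, gf = Float(g) / n, bf = Float(b) / n
                // Deuteranopia rows from CBColourBlindTypes.getModifiedColour
                data.append(rf * 0.625 + gf * 0.375)
                data.append(rf * 0.7   + gf * 0.3)
                data.append(gf * 0.3   + bf * 0.7)
                data.append(1.0) // alpha is left untouched
            }
        }
    }
    return data
}
```

Core Image expects the cube entries ordered with red varying fastest, then green, then blue, which is why red is the innermost loop here.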