I'm learning how to write custom Core Image filters, but I have a problem where the rendered color does not match the value specified in the kernel code when that value is between 0 and 1. (The color is correct when the value is exactly 0 or 1.)
This is my filter code:
import UIKit

class testFilter: CIFilter
{
    // Simply return a half-red color (0.5). CIColorKernel(string:) is failable,
    // so colorKernel is an optional.
    var colorKernel = CIColorKernel(string:
        "kernel vec4 theFilter()" +
        "{ return vec4(0.5, 0.0, 0.0, 1.0); }"
    )

    override var outputImage: CIImage!
    {
        let rect = CGRect(x: 0, y: 0, width: 200, height: 100)
        return colorKernel!.applyWithExtent(rect, arguments: nil)
    }
}
Usage:
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Generate the image from the custom filter.
        let filter = testFilter()
        let ciImage = filter.outputImage
        let image = UIImage(CIImage: ciImage)

        // Add it to the view.
        let v = UIImageView(frame: CGRect(x: 50, y: 50, width: 200, height: 100))
        v.image = image
        view.addSubview(v)
    }
}
In the output image the actual red component value is around 190 (I checked the color value in Photoshop). For a value of 0.5 I thought the correct output would be 255 * 0.5 = 127.5, i.e. roughly 127?
I'm thinking this is because the output is being encoded through an sRGB tone curve while your kernel is producing linear values, so the linear 0.5 ends up around 188 after encoding. You can change the output of your filter to:
return colorKernel!.applyWithExtent(rect, arguments: nil)?
.imageByApplyingFilter("CISRGBToneCurveToLinear", withInputParameters: nil)
to correct the output color. By the way, the channels are UInt8, so the values are integers and the half-red value is 127.
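To see where the ~190 reading comes from, you can push the linear value 0.5 through the standard sRGB encoding function yourself. The sRGBEncode helper below is purely illustrative (it is not part of Core Image), written in the same Swift 2-era style as the code above:
import Foundation

// Standard sRGB transfer function: encode a linear component into its sRGB value.
func sRGBEncode(linear: Double) -> Double {
    return linear <= 0.0031308
        ? 12.92 * linear
        : 1.055 * pow(linear, 1.0 / 2.4) - 0.055
}

let encoded = sRGBEncode(0.5)              // ≈ 0.735
let channel = UInt8(round(encoded * 255))  // 188, close to the ~190 sampled in Photoshop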
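Alternatively, if you render through your own CIContext, you can disable the working color space so the kernel's linear values are written to the bitmap unchanged. This is only a sketch of that approach, not something from the code above; kCIContextWorkingColorSpace with NSNull() is the documented way to turn off color management:
// Turn off color management so linear 0.5 is written as ~127
// instead of being pushed through the sRGB curve to ~188.
let context = CIContext(options: [kCIContextWorkingColorSpace: NSNull()])

let filter = testFilter()
let cgImage = context.createCGImage(filter.outputImage, fromRect: filter.outputImage.extent)
let image = UIImage(CGImage: cgImage)
The trade-off is that this disables color management for everything rendered through that context, whereas the CISRGBToneCurveToLinear fix only affects this one filter's output.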