Tags: ios, swift, core-image, cifilter

CIColorMatrix math doesn't make sense


I'm trying to use CIColorMatrix in Swift on iOS, and I'm coming up with a result that doesn't match my understanding of how it is supposed to work.

The documentation defines it as:

s.r = dot(s, redVector)
s.g = dot(s, greenVector)
s.b = dot(s, blueVector)
s.a = dot(s, alphaVector)
s = s + bias

So let's say I have an input pixel with RGBA values [255, 0, 0, 255] (100% red), and I apply the vector [0.5, 0, 0, 0] as the red vector. Shouldn't that result in a red channel value of about 127?

(255*0.5)+(0*0)+(0*0)+(255*0) = 127.5

For some reason, with these values, CIColorMatrix is giving me a value of 187. Is this not the definition of dot() that they are talking about?
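
Just to sanity-check the arithmetic, here is the same dot product restated in plain Swift (this is only my own check of the math above, using normalized 0-1 values, which is what the filter works with internally):

let s: [Double] = [1.0, 0.0, 0.0, 1.0]              // [255, 0, 0, 255] normalized to 0-1
let redVector: [Double] = [0.5, 0.0, 0.0, 0.0]
let newRed = zip(s, redVector).map(*).reduce(0, +)  // 0.5, i.e. 127.5 out of 255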

Here's a code snippet (the input image is all red).

// Before this, the RGB values are [255, 0, 0]
let vec = CIVector(x: 0.5, y: 0, z: 0, w: 0)
let filter = CIFilter(name: "CIColorMatrix")
filter!.setDefaults()
// The default bias is [0, 0, 0, 0] (setting it explicitly doesn't change the result).
filter!.setValue(myImage, forKey: kCIInputImageKey)
filter!.setValue(vec, forKey: "inputRVector")
let output = filter!.outputImage
// After rendering this output, the RGB values are [187, 0, 0]

What am I missing or misunderstanding here?


Solution

  • All Core Image filters run in the working color space of the CIContext that executes them, and by default that working space is linear (gamma 1.0), not sRGB-encoded. So your sRGB red of 255 is linearized to 1.0, the matrix halves it to 0.5, and the result is encoded back to sRGB on output; 0.5 in linear light encodes to roughly 0.735, or about 187 out of 255, which is exactly the value you're seeing. Filter kernels also work with alpha-unpremultiplied pixel values, while filter output is premultiplied. If you're seeing results you don't expect, check that your working and output color spaces are configured the way you want, and make sure you aren't doing anything surprising with alpha premultiplication.
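
Here's a minimal sketch illustrating the point (my own check, not part of the original answer): the firstPixel helper, the 1x1 test image standing in for myImage, and the context options are assumptions, and the exact byte values can vary slightly by platform.

import CoreImage

// Hypothetical helper (not in the original post): render the top-left pixel
// of `image` through `context` and return its RGBA bytes.
func firstPixel(of image: CIImage, in context: CIContext, colorSpace: CGColorSpace?) -> [UInt8] {
    var bytes = [UInt8](repeating: 0, count: 4)
    context.render(image,
                   toBitmap: &bytes,
                   rowBytes: 4,
                   bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                   format: .RGBA8,
                   colorSpace: colorSpace)
    return bytes
}

// A 1x1 pure red test image, standing in for `myImage` from the question.
let red = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
    .cropped(to: CGRect(x: 0, y: 0, width: 1, height: 1))

let matrix = CIFilter(name: "CIColorMatrix")!
matrix.setDefaults()
matrix.setValue(red, forKey: kCIInputImageKey)
matrix.setValue(CIVector(x: 0.5, y: 0, z: 0, w: 0), forKey: "inputRVector")
let halved = matrix.outputImage!

// Default context: the matrix runs on linearized values and the result is
// re-encoded to sRGB on output, so red should come back around 187.
let managed = CIContext()
print(firstPixel(of: halved, in: managed,
                 colorSpace: CGColorSpace(name: CGColorSpace.sRGB)))

// Context with color management disabled: the matrix is applied directly to
// the sRGB-encoded values, so red should come back around 127 instead.
let unmanaged = CIContext(options: [.workingColorSpace: NSNull()])
print(firstPixel(of: halved, in: unmanaged, colorSpace: nil))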