In my app, I'm trying to implement a motion blur feature that stacks different frames (averaging them) coming from the video output into a single image. The effect I'm trying to obtain is well explained here: https://photographylife.com/image-averaging-technique.
I tried using a custom CIKernel that performs the averaging operation on each color channel, as follows:
float4 makeAverage(sample_t currentStack, sample_t newImage, float stackCount) {
    // Work on straight (unpremultiplied) colors so alpha doesn't skew the average.
    float4 cstack = unpremultiply(currentStack);
    float4 nim = unpremultiply(newImage);
    // Running average: weight the existing stack by its frame count, then fold in the new frame.
    float4 avg = ((cstack * stackCount) + nim) / (stackCount + 1.0);
    return premultiply(avg);
}
You can find more details and the complete code in this question: Problems with frame averaging with Core Image
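For reference, this is roughly how the kernel gets loaded and applied from Swift (a simplified sketch: the library name and wrapper function here are illustrative, the actual code is in the linked question):

import Foundation
import CoreImage

func makeAverageKernel() throws -> CIColorKernel {
    // Assumes the kernel source was compiled into the default Metal library
    // with the Core Image Metal flags.
    let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
    let data = try Data(contentsOf: url)
    return try CIColorKernel(functionName: "makeAverage", fromMetalLibraryData: data)
}

func stacked(_ currentStack: CIImage, with newImage: CIImage,
             count: Double, using kernel: CIColorKernel) -> CIImage {
    // Arguments match the Metal signature: current stack, new frame, frame count.
    return kernel.apply(extent: currentStack.extent,
                        arguments: [currentStack, newImage, count]) ?? currentStack
}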
It works but, after a while, weird patches start to appear in the image, hinting that the color channels are clipping.
Is there a way I could achieve the same result using alpha blending in Core Image? Maybe, instead of doing the stacking operation on the color channels, could I stack subsequent images with a decreasing alpha value?
If so, what would be the procedure/algorithm to do it?
You can accomplish the same thing as a combination of CIColorMatrix and CISourceOverCompositing in a simpler way, just by using the CIMix filter like this:
func makeCompositeImage(stackImage: CIImage, newImage: CIImage?, count: Double) -> CIImage {
    // Blending the new frame over the stack with opacity 1/count keeps the
    // result equal to the running average of all frames seen so far.
    let opacity = 1.0 / count
    return newImage?
        .applyingFilter("CIMix", parameters: [
            kCIInputBackgroundImageKey: stackImage,
            kCIInputAmountKey: opacity
        ]) ?? stackImage
}
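For example, assuming frames arrive one at a time from your capture pipeline (firstFrame and subsequentFrames are illustrative names), you would update the stack like this. With count as the 1-based index of the incoming frame, an opacity of 1/count reproduces the running average, since avg_n = avg_(n-1) * (n-1)/n + frame_n / n:

var stack = firstFrame          // CIImage of the first captured frame
var count = 1.0

for frame in subsequentFrames { // CIImages from the video output
    count += 1                  // count is now the index of the incoming frame
    stack = makeCompositeImage(stackImage: stack, newImage: frame, count: count)
}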
Please check out this app I just published: https://apps.apple.com/us/app/filter-magic/id1594986951. It lets you play with every single filter out there.