I am trying to efficiently compute the Shannon entropy (the -sum(pi * log pi) formula) of an image on macOS/iOS with Swift. I found the Accelerate framework and its vImage functions, which look like what I need, but the documentation is scarce and I got lost in it.
I am creating the vImage buffer like this:
var format = vImage_CGImageFormat(
    bitsPerComponent: 8,
    bitsPerPixel: 8 * 4,
    colorSpace: CGColorSpace(name: CGColorSpace.displayP3)!,
    bitmapInfo: .init(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue))!

let buf = try vImage.PixelBuffer(
    cgImage: cg,
    cgImageFormat: &format,
    pixelFormat: vImage.Interleaved8x4.self)
My idea was to then convert it to a 1-channel grayscale buffer (vImage.PixelBuffer<vImage.Planar8>) with buf.multiply(), according to this page: https://developer.apple.com/documentation/accelerate/converting_color_images_to_grayscale. Then create a histogram from it, and then manually iterate over its 256 values and compute the sum. However, it seems that vImage.PixelBuffer<vImage.Planar8> does not have a histogram() method at all ... while vImage.PixelBuffer<vImage.Interleaved8x4> does.
Can you guide me to the correct way to do it?
From my experience, your thinking is correct, but the process can be streamlined a little. These are the steps I have followed on previous occasions, with some insights from GPT as well.
First, transform the vImage.PixelBuffer from its interleaved colour format to a single-channel grayscale buffer; the multiply-based conversion from the page you linked does exactly that, and the sketch below shows an equivalent route through the C-level API.
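Here is a rough sketch of that first step, building on the buf from your question. It drops down to the C-style vImageMatrixMultiply_ARGB8888ToPlanar8 call via withUnsafePointerToVImageBuffer; the Rec. 709 coefficients, the divisor of 256, and the assumption that the buffer is laid out as XRGB (because of .noneSkipFirst) are my own choices, and the destination-buffer initialiser may be spelled slightly differently on your SDK version, so double-check against your format.

import Accelerate

// Destination for the single-channel grayscale image, same dimensions as the source.
let grayBuffer = vImage.PixelBuffer(size: buf.size,
                                    pixelFormat: vImage.Planar8.self)

// Assumed Rec. 709 luma coefficients, scaled by the divisor so they fit into Int16.
// The leading 0 is for the skipped/padding channel (XRGB layout from .noneSkipFirst).
let divisor: Int32 = 256
let coefficients: [Int16] = [0,
                             Int16(0.2126 * 256),  // red
                             Int16(0.7152 * 256),  // green
                             Int16(0.0722 * 256)]  // blue

buf.withUnsafePointerToVImageBuffer { src in
    grayBuffer.withUnsafePointerToVImageBuffer { dest in
        // Per pixel: dest = (coefficients . src) / divisor, with no pre/post bias.
        _ = vImageMatrixMultiply_ARGB8888ToPlanar8(src,
                                                   dest,
                                                   coefficients,
                                                   divisor,
                                                   nil,   // no pre-bias
                                                   0,     // no post-bias
                                                   vImage_Flags(kvImageNoFlags))
    }
}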
Then use vImageHistogramCalculation_Planar8 to compute the 256-bin histogram; see the snippet below.
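A sketch of the histogram step, assuming the grayBuffer from the previous snippet; vImageHistogramCalculation_Planar8 fills one vImagePixelCount bin per possible 8-bit value.

// 256 bins, one per possible Planar8 pixel value.
var histogram = [vImagePixelCount](repeating: 0, count: 256)

grayBuffer.withUnsafePointerToVImageBuffer { src in
    histogram.withUnsafeMutableBufferPointer { bins in
        _ = vImageHistogramCalculation_Planar8(src,
                                               bins.baseAddress!,
                                               vImage_Flags(kvImageNoFlags))
    }
}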
Finally, normalise the histogram counts into probabilities pi and compute -sum(pi * log(pi)); a short Swift version follows.
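For the last step, plain Swift over the 256 bins from the previous snippet is enough. I use log2 here, which only fixes the unit to bits per pixel; use log for nats if you prefer.

import Foundation

// Total pixel count, used to turn bins into probabilities.
let totalPixels = Double(histogram.reduce(0, +))

// -sum(p * log(p)), skipping empty bins so 0 * log(0) is treated as 0.
let entropy = histogram.reduce(0.0) { (partial, count) -> Double in
    guard count > 0 else { return partial }
    let p = Double(count) / totalPixels
    return partial - p * log2(p)
}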
This is a somewhat rudimentary approach and there are a few refinements you could add, but it covers the necessary steps and produces the entropy value for your use case.
Feel free to reach out if you have any questions.