Tags: ios, xcode, swift, pixel

Getting pixel values in an image


I am calculating the RGB values of pixels in my captured photo. Here is my code:

func getPixelColorAtLocation(context: CGContext, point: CGPoint) -> Color {

    self.context = createARGBBitmapContext(imgView.image!)

    // Raw pointer to the bitmap's pixel buffer
    let data = CGBitmapContextGetData(context)
    let dataType = UnsafePointer<UInt8>(data)

    let offset = 4 * ((Int(imageHeight) * Int(point.x)) + Int(point.y))
    var color = Color()
    color.blue = dataType[offset]
    color.green = dataType[offset + 1]
    color.red = dataType[offset + 2]
    color.alpha = dataType[offset + 3]
    color.point.x = point.x
    color.point.y = point.y
    return color
}

But I am not sure what this line in the code means:

let offset = 4 * ((Int(imageHeight) * Int(point.x)) + Int(point.y))

Any help? Thanks in advance.


Solution

  • An image is a set of pixels stored as one flat block of memory. To get the pixel at point (x, y), you need to calculate its offset into that block.

    If you use dataType[0], there is no offset, because it points to exactly where the pointer begins. If you use dataType[10], you take the 10th element from the beginning of the buffer.

    Because the colour model is RGBA, every pixel occupies 4 bytes, so you multiply by 4. Within a row, x pixels come before yours (that contributes x); to reach the right row, you skip y full rows, which contributes the width of the image multiplied by y:

    offset = 4 * (x + width * y)
    // offset, offset + 1, offset + 2, offset + 3  <- the four channel bytes you need

    (Note that the line in your question computes height * x + y, i.e. with x and y swapped relative to this row-major formula; that only matches a bitmap laid out column by column.)


    Imagine you have one long array of values.

    It becomes clear once you picture a two-dimensional array implemented as a one-dimensional array. I hope that helps.
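To make the mapping concrete, here is a minimal, self-contained Swift sketch of the same idea. The 3×2 image, the `width`/`height` names, and the channel values are invented for illustration; the point is only the row-major offset arithmetic described above:

```swift
// A tiny 3x2 "image": width 3, height 2, four bytes (RGBA) per pixel,
// stored as one flat one-dimensional array.
let width = 3
let height = 2
var pixels = [UInt8](repeating: 0, count: width * height * 4)

// Write pixel (x: 2, y: 1): skip y full rows, then x pixels, times 4 bytes.
let x = 2, y = 1
let offset = 4 * (x + width * y)   // = 4 * (2 + 3 * 1) = 20
pixels[offset]     = 255           // red
pixels[offset + 1] = 128           // green
pixels[offset + 2] = 64            // blue
pixels[offset + 3] = 255           // alpha

// Read the same pixel back with the same formula.
let red   = pixels[offset]
let alpha = pixels[offset + 3]
print(red, alpha)                  // prints "255 255"
```

The same arithmetic works for any channel order (ARGB, BGRA, …); only which byte within the 4-byte group holds which channel changes.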