
Swift Color space different on iPhone 5 compared to iPhone 5s or 5se


I click on an image to return a pixel colour. This works fine, but I have found that it doesn't work on the iPhone 5, although it works on the iPhone 5s, the iPhone SE and all other sizes, and I do not understand why. The following code gives red and blue as nan on the iPhone 5.

//  returns the color data of the pixel at the currently selected point

func getPixelColorAtPoint(_ point: CGPoint) -> UIColor {
    let pixel = UnsafeMutablePointer<CGFloat>.allocate(capacity: 4)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
    if let context = CGContext(data: pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue) {
        context.translateBy(x: -point.x, y: -point.y)
        torsoImage.layer.render(in: context)
    }
    let pColor: UIColor = UIColor(red: CGFloat(pixel[0])/255.0, green: CGFloat(pixel[1])/255.0, blue: CGFloat(pixel[2])/255.0, alpha: CGFloat(pixel[3])/255.0)

    pixel.deallocate(capacity: 4)
    print("Colour touch is \(pColor)")
    return pColor
}

//Colour touch is UIExtendedSRGBColorSpace nan 1.94443e+12 nan 1.48678e-42

Solution

  • CGContext(data: pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, ...)

    creates a graphics context that is backed by four bytes (one byte per color component), so you have to pass a buffer of UInt8 values instead:

    let pixel = UnsafeMutablePointer<UInt8>.allocate(capacity: 4)
    

    What happens in your case is that only the first four bytes of the first CGFloat are set; all remaining data is undefined and may come back as nan or some arbitrary value. Even when you don't get nan, the values will be wrong. This also explains the device difference: CGFloat is a 4-byte Float on the 32-bit iPhone 5 and an 8-byte Double on the 64-bit iPhone 5s and SE, so the same undefined data shows up differently on different devices.
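
    For reference, here is a minimal sketch of the corrected function, assuming Swift 4.1 or later (where deallocate() no longer takes a capacity argument) and using the torsoImage image view from the question:

    func getPixelColorAtPoint(_ point: CGPoint) -> UIColor {
        // Four UInt8 components (RGBA), matching the 1×1, 8-bits-per-component context below.
        let pixel = UnsafeMutablePointer<UInt8>.allocate(capacity: 4)
        defer { pixel.deallocate() }

        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        if let context = CGContext(data: pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue) {
            // Shift the layer so that `point` ends up at the single rendered pixel.
            context.translateBy(x: -point.x, y: -point.y)
            torsoImage.layer.render(in: context)
        }

        // Each component is now a genuine byte in 0...255, so dividing by 255 yields a CGFloat in 0...1.
        return UIColor(red: CGFloat(pixel[0]) / 255.0,
                       green: CGFloat(pixel[1]) / 255.0,
                       blue: CGFloat(pixel[2]) / 255.0,
                       alpha: CGFloat(pixel[3]) / 255.0)
    }

    With a UInt8 buffer the /255.0 conversion is meaningful, and the function returns sensible component values on 32-bit and 64-bit devices alike.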