macos · appkit · nswindowcontroller · nsevent · nscolor

NSReadPixel() always returns nil


I want to create a color picker, so I thought using NSReadPixel would be a good approach to read the color of the pixel under the cursor. What I basically did is this:

import Cocoa

class CustomWindowController: NSWindowController {

    override func mouseMoved(with event: NSEvent) {

        // current mouse position in screen coordinates
        let mouseLocation = NSEvent.mouseLocation
        let pickedColor = NSReadPixel(mouseLocation) // always nil

    }

}

But pickedColor is always nil. Even if I call NSReadPixel with a fixed point (for testing purposes) it still returns nil. What am I missing?
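(For reference: NSReadPixel samples the currently focused view's graphics context and expects a point in that view's coordinate system, so calling it from mouseMoved with a screen coordinate and no focused view plausibly explains the nil result. The following is only a minimal sketch of that constraint in a hypothetical NSView subclass, not a recommended solution: both lockFocus and NSReadPixel are deprecated since macOS 10.14, and this can only sample content the view itself has drawn, which is why the answer below captures the screen instead.)

import Cocoa

class PickerView: NSView {

    override func mouseMoved(with event: NSEvent) {
        // NSReadPixel needs a focused drawing context and a point in the
        // focused view's own coordinate system.
        let pointInView = convert(event.locationInWindow, from: nil)
        if lockFocusIfCanDraw() {
            let pickedColor = NSReadPixel(pointInView)
            unlockFocus()
            print(pickedColor ?? "no color")
        }
    }

}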

EDIT #1

I’ve followed the NSBitmapImageRep / colorAt approach from the answer and noticed that the resulting NSColor is a bit different (in most cases brighter) than it should be (see the screenshot). Do I have to take color spaces into account here? (And how?)

[screenshot: the picked color compared to the actual pixel color]

EDIT #2

Got it to work: bitmap.colorSpaceName = NSDeviceRGBColorSpace does the trick.
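In current Swift, that fix might look roughly like this. This is only a sketch: the capturedPixel parameter stands in for the 1×1 CGImage produced by the screen capture in the answer below, and NSDeviceRGBColorSpace is now spelled NSColorSpaceName.deviceRGB.

import Cocoa

func pickColor(from capturedPixel: CGImage) -> NSColor? {
    let bitmap = NSBitmapImageRep(cgImage: capturedPixel)
    // Interpret the pixel data as device RGB so the picked color is not
    // shifted (brightened) by a color-profile conversion.
    bitmap.colorSpaceName = .deviceRGB   // formerly NSDeviceRGBColorSpace
    return bitmap.colorAt(x: 0, y: 0)
}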


Solution

  • NSReadPixel really doesn't work here. I use this code to get the pixel color:

    NSPoint _point = [NSEvent mouseLocation];
    
    CGFloat x = floor(_point.x);
    CGFloat y = [NSScreen mainScreen].frame.size.height - floor(_point.y);
    // needed because AppKit and Core Graphics use different coordinate systems
    // (AppKit's origin is bottom-left, Core Graphics' is top-left)
    
    CGWindowID windowID = (CGWindowID)[self windowNumber];
    
    // capture a 1x1 image of whatever is on screen below this window
    CGImageRef pixel = CGWindowListCreateImage(CGRectMake(x, y, 1, 1),
                                               kCGWindowListOptionOnScreenBelowWindow,
                                               windowID,
                                               kCGWindowImageNominalResolution);
    
    NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc] initWithCGImage:pixel];
    CGImageRelease(pixel);
    
    NSColor *color = [bitmap colorAtX:0 y:0];
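
Since the question is in Swift, a rough translation of this approach could look like the sketch below. It is not the answerer's code: it assumes the method lives in an NSWindowController (so the window number comes from self.window), uses Swift 4+ API names, and folds in the colorSpaceName fix from EDIT #2.

    import Cocoa

    class CustomWindowController: NSWindowController {

        override func mouseMoved(with event: NSEvent) {
            guard let window = window, let screen = NSScreen.main else { return }

            // AppKit's origin is the bottom-left corner, Core Graphics' is the
            // top-left, so flip the y coordinate before capturing.
            let mouseLocation = NSEvent.mouseLocation
            let x = floor(mouseLocation.x)
            let y = screen.frame.height - floor(mouseLocation.y)

            // Capture a 1x1 image of whatever is on screen below our own window.
            let windowID = CGWindowID(window.windowNumber)
            guard let pixel = CGWindowListCreateImage(CGRect(x: x, y: y, width: 1, height: 1),
                                                      .optionOnScreenBelowWindow,
                                                      windowID,
                                                      .nominalResolution) else { return }

            let bitmap = NSBitmapImageRep(cgImage: pixel)
            bitmap.colorSpaceName = .deviceRGB   // the fix from EDIT #2
            let pickedColor = bitmap.colorAt(x: 0, y: 0)
            print(pickedColor ?? "no color")
        }

    }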