
OpenCV - Reading a 16 bit grayscale image


I'm trying to read a 16 bit grayscale image using OpenCV 2.4 in Python, but it seems to be loading it as 8 bit.

I'm doing:

im = cv2.imread(path, 0)
print im

[[25 25 28 ...,  0  0  0]
[ 0  0  0 ...,  0  0  0]
[ 0  0  0 ...,  0  0  0]
..., 

How do I get it as 16 bit?


Solution

  • Figured it out. In case anyone else runs into this problem:

    im = cv2.imread(path,-1)
    

    Setting the flag to 0 (`cv2.IMREAD_GRAYSCALE`) loads the image as grayscale but converts it to 8 bit. Setting the flag to -1 (`cv2.IMREAD_UNCHANGED`) loads the image as is, preserving the original 16-bit depth.