Tags: matplotlib, scikit-image, colormap

Digitize a colormap


Consider the following image (source: https://fr.mathworks.com/matlabcentral/mlc-downloads/downloads/submissions/34863/versions/15/screenshot.jpg).

I'd like to print it as a grayscale image. I can do the conversion with scikit-image:

from skimage.io import imread
from matplotlib import pyplot as plt
from skimage.color import rgb2gray


# Read the color-mapped screenshot and display a naive grayscale conversion.
img = imread('image.jpg')

plt.grid(which='both')
plt.imshow(rgb2gray(img), cmap=plt.cm.gray)

I get:

[Resulting grayscale image]

which is obviously not what I want.

My question is: Is there a way with scikit-image, or with raw numpy and/or matplotlib, to digitize the image so that I get a 3D array (first dimension: X index, second dimension: Y index, third dimension: value according to the colormap)? Then I could easily change the colormap to one that gives better results when printing in grayscale.


Solution

  • The example below demonstrates a simple way to undo a colormap's value -> RGB mapping.

    import numpy as np  # needed for the array operations below

    def unmap_nearest(img, rgb):
        """img is an image of shape [n, m, 3], and rgb is a colormap of shape [k, 3]."""
        # L1 distance between every pixel and every colormap entry -> shape [k, n, m].
        d = np.sum(np.abs(img[np.newaxis, ...] - rgb[:, np.newaxis, np.newaxis, :]), axis=-1)
        # Index of the closest colormap entry per pixel, normalized to [0, 1].
        i = np.argmin(d, axis=0)
        return i / (rgb.shape[0] - 1)
    

    This function works by taking the RGB value of each pixel and looking up the index of the best-matching color in the colormap. Some trickery with indexing and broadcasting allows for efficient vectorization, at the cost of memory spent on temporary arrays (a small shape check follows the list below):

    • img[np.newaxis, ...] converts the image from shape [n, m, 3] to [1, n, m, 3].

    • rgb[:, np.newaxis, np.newaxis, :] converts the colormap from shape [k, 3] to [k, 1, 1, 3].

    • Subtracting the resulting arrays broadcasts to an array of shape [k, n, m, 3] that contains the difference between each colormap entry k and each pixel n, m for every color component.

    • sum(abs(..), axis=-1) takes the absolute value of the differences and sums over the color components (the last dimension) to get the total difference between every pixel and every colormap entry (array of shape [k, n, m]).

    • i = np.argmin(d, axis=0) finds the index of the minimum element along the first dimension. The result is the index of the best-matching colormap entry for each pixel [n, m].

    • return i / (rgb.shape[0] - 1) finally returns the indices normalized by the colormap size, so that the result lies in the range 0-1.
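
    For concreteness, here is a minimal shape check of those intermediate arrays; the sizes n=4, m=5 and k=256 are arbitrary illustrative values, not taken from the answer:

    import numpy as np

    # Arbitrary illustrative sizes: a 4x5 RGB image and a 256-entry colormap.
    img = np.random.rand(4, 5, 3)
    rgb = np.random.rand(256, 3)

    diff = img[np.newaxis, ...] - rgb[:, np.newaxis, np.newaxis, :]
    print(diff.shape)                 # (256, 4, 5, 3)

    d = np.sum(np.abs(diff), axis=-1)
    print(d.shape)                    # (256, 4, 5)

    i = np.argmin(d, axis=0)
    print(i.shape, i.min(), i.max())  # (4, 5), with indices between 0 and 255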

    [Figure produced by the code below: original, color-mapped, rgb2gray, and reconstructed panels]

    There are a few caveats with this approach:

    1. It cannot reconstruct the original value range (see the rescaling sketch after this list).
    2. It treats every pixel as part of the colormap (i.e. continent contours will also be mapped).
    3. If you use the wrong colormap, it will fail hilariously.
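
    If the value range of the original data is known (for instance, read off the colorbar in the source figure), caveat 1 can be worked around by rescaling the normalized output. A minimal sketch, assuming hypothetical bounds vmin and vmax and the reconstructed array returned by unmap_nearest in the full example below:

    # Hypothetical bounds, e.g. read off the colorbar of the original figure.
    vmin, vmax = 0.0, 18.0

    # unmap_nearest returns values normalized to [0, 1]; map them back to [vmin, vmax].
    rescaled = reconstructed * (vmax - vmin) + vmin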

    Full example:

    import numpy as np
    import matplotlib.pyplot as plt
    from skimage.color import rgb2gray


    def unmap_nearest(img, rgb):
        """img is an image of shape [n, m, 3], and rgb is a colormap of shape [k, 3]."""
        # L1 distance between every pixel and every colormap entry -> shape [k, n, m].
        d = np.sum(np.abs(img[np.newaxis, ...] - rgb[:, np.newaxis, np.newaxis, :]), axis=-1)
        # Index of the closest colormap entry per pixel, normalized to [0, 1].
        i = np.argmin(d, axis=0)
        return i / (rgb.shape[0] - 1)


    # Build a lookup table with the RGB values of the jet colormap (drop the alpha channel).
    cmap = plt.cm.jet
    rgb = cmap(np.linspace(0, 1, cmap.N))[:, :3]

    # Synthetic test data: a 10x10 ramp with values from 0 to 18.
    original = (np.arange(10)[:, None] + np.arange(10)[None, :])

    plt.subplot(2, 2, 1)
    plt.imshow(original, cmap='gray')
    plt.colorbar()
    plt.title('original')

    # Apply the colormap (normalize by the maximum value 18) and drop the alpha channel.
    plt.subplot(2, 2, 2)
    rgb_img = cmap(original / 18)[..., :-1]
    plt.imshow(rgb_img)
    plt.title('color-mapped')

    # Naive grayscale conversion of the color-mapped image: the unwanted result.
    plt.subplot(2, 2, 3)
    wrong = rgb2gray(rgb_img)
    plt.imshow(wrong, cmap='gray')
    plt.title('rgb2gray')

    # Undo the colormap and display the recovered values in grayscale.
    plt.subplot(2, 2, 4)
    reconstructed = unmap_nearest(rgb_img, rgb)
    plt.imshow(reconstructed, cmap='gray')
    plt.colorbar()
    plt.title('reconstructed')

    plt.show()
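
    Applied to the image from the question, an end-to-end sketch could look like the following. This assumes the screenshot is saved as 'image.jpg' (the file name used in the question) and that it was rendered with the jet colormap; per caveat 2, the colorbar and axis labels in the screenshot will be mapped as well:

    from skimage.io import imread

    img = imread('image.jpg') / 255.0          # scale uint8 RGB to the [0, 1] range
    jet = plt.cm.jet(np.linspace(0, 1, plt.cm.jet.N))[:, :3]

    values = unmap_nearest(img[..., :3], jet)  # drop a possible alpha channel

    # Re-plot with a colormap that also works in grayscale, e.g. viridis.
    plt.imshow(values, cmap='viridis')
    plt.colorbar()
    plt.show()

    From values you can then pick any colormap you like, or save the array directly as a grayscale image for printing.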