Matplotlib: how to make imshow read x,y coordinates from other numpy arrays?


When you want to plot a numpy array with imshow, this is what you normally do:

import numpy as np
import matplotlib.pyplot as plt

A=np.array([[3,2,5],[8,1,2],[6,6,7],[3,5,1]]) #The array to plot

im=plt.imshow(A,origin="upper",interpolation="nearest",cmap=plt.cm.gray_r)
plt.colorbar(im)

Which gives us a simple grayscale image of the array, with a colorbar alongside.

In this image, the x and y coordinates are simply extracted from the position of each value in the array. Now, let's say that A is an array of values that refer to some specific coordinates:

real_x=np.array([[15,16,17],[15,16,17],[15,16,17],[15,16,17]])
real_y=np.array([[20,20,20],[21,21,21],[22,22,22],[23,23,23]])

These values are made up just to illustrate my case. Is there a way to force imshow to assign each value in A the corresponding pair of coordinates (real_x, real_y)?

PS: I am not looking to add or subtract some offset to the array-based x and y so that they match real_x and real_y; I am looking for something that reads these values directly from the real_x and real_y arrays. The intended outcome is an image with the real_x values on the x-axis and the real_y values on the y-axis.


Solution

  • Setting the extent

    Assuming you have

    real_x=np.array([15,16,17])
    real_y=np.array([20,21,22,23])
    

    you would set the image extent as

    # half a grid step on each side, so the cells are centered on the coordinates
    dx = (real_x[1]-real_x[0])/2.
    dy = (real_y[1]-real_y[0])/2.
    # extent is given as [left, right, bottom, top] in data coordinates
    extent = [real_x[0]-dx, real_x[-1]+dx, real_y[0]-dy, real_y[-1]+dy]
    plt.imshow(data, extent=extent)  # data is the array to plot (A in the question)
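
    Put together with the question's data, a minimal runnable sketch of the extent approach could look like the following (assuming the evenly spaced 1-D coordinate vectors above; whether origin should be "lower" or "upper" depends on which row of A corresponds to real_y[0]):

    import numpy as np
    import matplotlib.pyplot as plt

    A = np.array([[3,2,5],[8,1,2],[6,6,7],[3,5,1]])  # 4 rows x 3 columns
    real_x = np.array([15,16,17])      # x coordinate of each column
    real_y = np.array([20,21,22,23])   # y coordinate of each row

    dx = (real_x[1]-real_x[0])/2.
    dy = (real_y[1]-real_y[0])/2.
    extent = [real_x[0]-dx, real_x[-1]+dx, real_y[0]-dy, real_y[-1]+dy]

    # origin="lower" draws row 0 of A at real_y[0]
    im = plt.imshow(A, origin="lower", interpolation="nearest",
                    cmap=plt.cm.gray_r, extent=extent)
    plt.colorbar(im)
    plt.show()

    Since extent changes the actual data coordinates of the axes, anything plotted on top afterwards (lines, markers, contours) can use the real coordinates directly.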
    

  • Changing ticklabels

    An alternative would be to just change the ticklabels:

    real_x=np.array([15,16,17])
    real_y=np.array([20,21,22,23])
    plt.imshow(data)
    plt.gca().set_xticks(range(len(real_x)))   # one tick per column
    plt.gca().set_yticks(range(len(real_y)))   # one tick per row
    plt.gca().set_xticklabels(real_x)          # label the ticks with the real coordinates
    plt.gca().set_yticklabels(real_y)
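
    For completeness, the ticklabel variant with the same made-up data might look like this. Note that here the axes stay in array-index coordinates and only the labels change, so any later overlay still has to use indices rather than the real coordinates:

    import numpy as np
    import matplotlib.pyplot as plt

    A = np.array([[3,2,5],[8,1,2],[6,6,7],[3,5,1]])
    real_x = np.array([15,16,17])
    real_y = np.array([20,21,22,23])

    im = plt.imshow(A, origin="upper", interpolation="nearest", cmap=plt.cm.gray_r)
    ax = plt.gca()
    ax.set_xticks(range(len(real_x)))   # one tick per column
    ax.set_yticks(range(len(real_y)))   # one tick per row
    ax.set_xticklabels(real_x)          # show the real coordinates as labels
    ax.set_yticklabels(real_y)
    plt.colorbar(im)
    plt.show()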