Tags: python, matplotlib, standard-deviation

Python - Matplotlib: normalizing y-axis to show multiples of standard deviation


I would like to renormalize my y-axis to show my signal in multiples of sigma (standard deviation). For example, one could then say that at 50 Hz there is a 3-sigma signal, while at 3 Hz there is a 0.5-sigma signal.

I thought using plt.yticks() could be the way to go:

import numpy as np
import matplotlib.pyplot as plt

X = range(0,50,2)
Y = range(0,50,2)

signal_sigma = np.std(Y)

plt.figure()
plt.plot(X, Y)
plt.yticks(np.arange(0, 25*signal_sigma, signal_sigma))
y_labels = [r"${} \sigma$".format(i) for i in range(0, 26)]
plt.ylabel(y_labels)
plt.show()

But this doesn't seem quite right yet. What am I missing?

UPDATE:

This is what I would like to do: see "What does a 1-sigma, a 3-sigma or a 5-sigma detection mean?", the bit right below the probability table.


Solution

  • You want to set the tick labels (the second argument of plt.yticks), which is different from setting the axis label with plt.ylabel:

    import numpy as np
    import matplotlib.pyplot as plt
    
    x = np.arange(1000)
    y = 42 * np.random.randn(1000)
    
    signal_sigma = y.std()
    
    num_sigma = 3
    sigma_values = np.arange(-num_sigma, num_sigma + 1)
    yticks = signal_sigma * sigma_values
    # Use raw strings so '\sigma' is not treated as an (invalid) escape sequence.
    yticklabels = [r'$%d\sigma$' % k if k != 0 else r'$\mu$' for k in sigma_values]
    
    plt.figure()
    plt.plot(x, y)
    plt.yticks(yticks, yticklabels)
    plt.ylabel('the axis label')
    plt.show()
    

    (Resulting plot: the noisy signal with y-axis ticks labeled μ and ±1σ, ±2σ, ±3σ.)
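As an alternative sketch (not part of the original answer), matplotlib's ticker module can place and label the ticks automatically, so the labels stay correct even if you later rescale or pan the axis. MultipleLocator puts a tick at every multiple of sigma, and FuncFormatter converts each tick position back into sigma units:

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.ticker import MultipleLocator, FuncFormatter

rng = np.random.default_rng(0)
y = 42 * rng.standard_normal(1000)
sigma = y.std()

fig, ax = plt.subplots()
ax.plot(y)

# Place a tick at every integer multiple of sigma...
ax.yaxis.set_major_locator(MultipleLocator(sigma))
# ...and label each tick with its value expressed in sigma units.
ax.yaxis.set_major_formatter(
    FuncFormatter(lambda val, pos: r'$%d\sigma$' % round(val / sigma)))
```

This keeps the data itself untouched; only the tick machinery is told to think in units of sigma.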