Tags: python, scipy, normal-distribution, scipy.stats, kurtosis

Kurtosis remains constant over differently shaped normal distributions?


I want to explore normal distributions with the same mean but varying standard deviation (std).

I expected that the kurtosis would change with the std, but in my results the kurtosis stays constant.

What is the issue here?

First, I generated normal distributions with varying std:

    from scipy import stats

    nd_l_std_44 = {}
    for i in range(1, 10):
        # draw 10,000 samples from a normal distribution with mean 0 and std i,
        # using a fixed random state
        nd_std_44 = stats.norm.rvs(loc=0, scale=i, size=10000, random_state=5)
        nd_l_std_44["ndl_std_{i}".format(i=i)] = nd_std_44
    print(nd_l_std_44.keys())

This worked, and I got a dict with a different sample for each key.

I plotted the resulting distributions:

[Figure: plot of the resulting distributions, std 1 to 9, same mean]
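For reference, a minimal sketch of how such a plot could be produced (not the original plotting code; it assumes matplotlib is installed and reuses the nd_l_std_44 dict from above):

    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    for key, sample in nd_l_std_44.items():
        # overlay a normalized histogram per std so the spreads are comparable
        ax.hist(sample, bins=100, density=True, histtype="step", label=key)
    ax.set_xlabel("value")
    ax.set_ylabel("density")
    ax.legend()
    plt.show()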

This is what I expected: the kurtosis looks different, while the mean remains the same. I then calculated the kurtosis in several ways, e.g. with scipy.stats:

    kurt_std_1 = dict()
    for k, v in nd_l_std_44.items():
        # compute the kurtosis of each sample
        kurt_std_1[k] = stats.kurtosis(v, fisher=False)
    print(kurt_std_1)

The problem is that I get the same kurtosis for all distributions. This is also the case with Pandas (a sketch of an equivalent check follows the output below). I expected significantly different kurtosis values for distributions with different std; instead the values are essentially equal.

    {'ndl_std_1': -0.0690005257753592,
     'ndl_std_2': -0.0690005257753592,
     'ndl_std_3': -0.0690005257753592,
     'ndl_std_4': -0.0690005257753592,
     'ndl_std_5': -0.06900052577535831,
     'ndl_std_6': -0.0690005257753592,
     'ndl_std_7': -0.06900052577535876,
     'ndl_std_8': -0.0690005257753592,
     'ndl_std_9': -0.0690005257753592}
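A minimal sketch of an equivalent Pandas check (not necessarily the exact call used; it assumes pandas is installed and reuses the nd_l_std_44 dict from above):

    import pandas as pd

    # Series.kurtosis() returns the (bias-corrected) excess kurtosis per sample
    kurt_pd = {k: pd.Series(v).kurtosis() for k, v in nd_l_std_44.items()}
    print(kurt_pd)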

What is happening here? Help is greatly appreciated.


Solution

  • That is to be expected. As noted in the Wikipedia article on kurtosis, the kurtosis of any univariate normal distribution is 3 (and the excess kurtosis is therefore 0), independent of both the mean and the standard deviation (see the sketch below).
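A minimal sketch illustrating this, assuming scipy is available (the std values chosen are illustrative). It also shows why the values in the question are practically identical rather than merely close: reusing random_state=5 makes every sample a rescaled copy of the same underlying draws, and the sample kurtosis is scale-invariant.

    from scipy import stats

    # The theoretical excess (Fisher) kurtosis of a normal distribution is 0,
    # i.e. kurtosis 3, regardless of the scale (std) parameter.
    for std in (1, 5, 9):
        print(std, stats.norm.stats(loc=0, scale=std, moments="k"))  # 0.0 each time

    # The sample kurtosis is scale-invariant: a rescaled copy of the same draws
    # (which is what reusing random_state=5 produces) gives the same estimate,
    # up to floating-point rounding.
    z = stats.norm.rvs(loc=0, scale=1, size=10000, random_state=5)
    print(stats.kurtosis(z), stats.kurtosis(3 * z))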