I am using lmfit to fit my data to Gaussians. There are three things I am trying to accomplish:
1) understand how the errors are calculated in lmfit
2) include my own calculated errors in lmfit
3) plot the errors within the fit
import numpy as np
from lmfit import Model

def gaussian(x, amp, cen, fwhm):
    # Gaussian parameterized by amplitude, center, and FWHM
    return amp * np.exp(-(x - cen) ** 2 / (2 * (fwhm / 2.35482) ** 2))

def gaussian_fit(x, y, guess=[1, 0, 0, 5], varies=[True, True, True, True]):
    # x, y - the dataset to fit
    c = 299792458  # m/s (not used in the fit itself)
    gmod = Model(gaussian)
    gmod.nan_policy = 'omit'
    print(np.max(y))
    gmod.set_param_hint('amp', value=guess[0], vary=varies[0])
    gmod.set_param_hint('cen', value=guess[1], vary=varies[1])
    gmod.set_param_hint('fwhm', value=guess[2], vary=varies[2])
    gmod.make_params()
    result = gmod.fit(y, x=x, amp=guess[0], cen=guess[1], fwhm=guess[2])
    amp = result.best_values['amp']
    cen = result.best_values['cen']
    fwhm = result.best_values['fwhm']
    #level = result.best_values['level']
    sigma = fwhm / 2.35482
    print(result.fit_report())
    gaussfield = amp * np.sqrt(2 * np.pi * sigma ** 2)
    residual = y - result.best_fit
    print('params:', amp, cen, fwhm, sigma, gaussfield)
    return amp, cen, fwhm, sigma, gaussfield, residual

amp, cen, fwhm, sigma, gaussfield, residual = gaussian_fit(
    xdata, ydata, guess=[.1, 6.9, .02], varies=[True, False, False])
I don't see where the errors are incorporated in the script, so how were they included in the final report? How can I include my own errors instead of the ones from lmfit, and how can I then plot these errors?
First, I would recommend just using lmfit.models.GaussianModel, and not using set_param_hint() -- just be explicit, not clever -- as with:
from lmfit.models import GaussianModel
gmodel = GaussianModel()
params = gmodel.make_params(amplitude=1, center=0, sigma=2)
result = gmodel.fit(y, params, x=x)
print(result.fit_report())
Now, to your questions:
Uncertainties in the best-fit parameters are estimated by looking at how changing the parameter values would change the fit. With chi-square defined as the sum of squares of the residual array, (data - fit) / data_uncertainty, the uncertainty in each parameter (and the parameter-parameter correlations) is estimated as the amount that parameter would have to change to increase chi-square by 1. There are many more in-depth resources available on how non-linear least-squares fitting works.
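For concreteness, here is a minimal sketch of how to read those estimated 1-sigma uncertainties from the fit result, assuming the GaussianModel fit above has already run: each Parameter in result.params carries a stderr attribute, and result.covar holds the covariance matrix.

# print each parameter's value and its estimated 1-sigma uncertainty
for name, par in result.params.items():
    if par.stderr is not None:
        print(f"{name}: {par.value:.5g} +/- {par.stderr:.5g}")
    else:
        print(f"{name}: {par.value:.5g} (uncertainty could not be estimated)")

# covariance matrix for the varying parameters (ordering follows result.var_names)
print(result.var_names)
print(result.covar)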
"How to include my own calculated errors in lmfit". Here, I would guess that you mean the uncertainties in the y data, not the uncertainties in the parameters (like how would you know the uncertainties but not the values?). If you have uncertainties in your data, pass that in as a weights
array to Model.fit
, perhaps as
result = gmodel.fit(y, params, x=x, weights=1.0/dely)
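As a hedged, self-contained sketch of that idea (the synthetic data and the dely array of per-point uncertainties are just assumptions for illustration; scale_covar=False keeps the reported parameter uncertainties tied to those absolute errors rather than rescaling by the reduced chi-square):

import numpy as np
from lmfit.models import GaussianModel

# synthetic data with a known per-point uncertainty
np.random.seed(0)
x = np.linspace(-5, 5, 201)
dely = 0.05 * np.ones_like(x)   # assumed 1-sigma uncertainty of each y value
y = 2.0 * np.exp(-(x - 0.3)**2 / (2 * 1.2**2)) / (1.2 * np.sqrt(2 * np.pi))
y = y + np.random.normal(scale=dely)

gmodel = GaussianModel()
params = gmodel.make_params(amplitude=2, center=0, sigma=1)

# weights = 1/uncertainty so that chi-square uses the measurement errors
result = gmodel.fit(y, params, x=x, weights=1.0/dely, scale_covar=False)
print(result.fit_report())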
How to plot the errors within the fit: you can plot the data and its error bars with matplotlib's errorbar function. If you want to plot the effect of the parameter uncertainties on the expected range of the best-fit curve, you can use delmodel = result.eval_uncertainty(x=x) and then plot x vs result.best_fit + delmodel and result.best_fit - delmodel. Matplotlib's fill_between function is often useful for this.
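A minimal plotting sketch along those lines, reusing result, x, y, and dely from the example above (the styling choices are just placeholders):

import matplotlib.pyplot as plt

# data with its error bars
plt.errorbar(x, y, yerr=dely, fmt='o', ms=3, label='data')

# best-fit curve and the 1-sigma band implied by the parameter uncertainties
delmodel = result.eval_uncertainty(x=x)
plt.plot(x, result.best_fit, '-', label='best fit')
plt.fill_between(x, result.best_fit - delmodel, result.best_fit + delmodel,
                 alpha=0.3, label='uncertainty band')
plt.legend()
plt.show()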