I want to perform a weighted linear fit to extract the parameters m and c in the equation y = mx + c.
The data I want to perform the fit on is:
xdata = [661.657, 1173.228, 1332.492, 511.0, 1274.537]
ydata = [242.604, 430.086, 488.825, 186.598, 467.730]
yerr = [0.08, 0.323, 0.249, 0.166, 0.223]
I would like to use scipy.optimize.curve_fit, but I don't know how to use it when each y data point has an error associated with it.
IIUC, what you are looking for is the sigma keyword argument.
sigma: None or M-length sequence or MxM array, optional
Determines the uncertainty in ydata. If we define residuals as r = ydata - f(xdata, *popt),
then the interpretation of sigma depends on its number of dimensions:
A 1-d sigma should contain values of standard deviations of errors in ydata.
In this case, the optimized function is chisq = sum((r / sigma) ** 2).
None (default) is equivalent of 1-d sigma filled with ones.
Then the code would become:
from scipy.optimize import curve_fit

def func(x, m, c):
    return m * x + c

popt, pcov = curve_fit(func, xdata, ydata, sigma=yerr)
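Since your yerr values look like absolute standard deviations rather than relative weights, you may also want to pass absolute_sigma=True so that the returned covariance matrix is not rescaled by the reduced chi-square; the 1-sigma parameter uncertainties then come straight from its diagonal. A self-contained sketch with your data (the variable names and print formatting are just for illustration):

import numpy as np
from scipy.optimize import curve_fit

xdata = np.array([661.657, 1173.228, 1332.492, 511.0, 1274.537])
ydata = np.array([242.604, 430.086, 488.825, 186.598, 467.730])
yerr = np.array([0.08, 0.323, 0.249, 0.166, 0.223])

def func(x, m, c):
    return m * x + c

# absolute_sigma=True treats yerr as absolute standard deviations,
# so pcov is not rescaled by the reduced chi-square of the fit.
popt, pcov = curve_fit(func, xdata, ydata, sigma=yerr, absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))  # 1-sigma uncertainties on m and c

print("m = %.5f +/- %.5f" % (popt[0], perr[0]))
print("c = %.5f +/- %.5f" % (popt[1], perr[1]))

With the default absolute_sigma=False, sigma only sets relative weights and pcov is scaled so that the reduced chi-square equals one, which changes the reported uncertainties but not the best-fit values of m and c.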