I would like to minimize a function f over two parameters, x and y, using least squares (Levenberg-Marquardt). In Python I can use lmfit as follows:
import numpy as np
import lmfit

params = lmfit.Parameters()
params.add('x', value=0.0, min=-np.pi, max=np.pi)
params.add('y', value=0.0, min=-0.25, max=0.25)

# least squares (Levenberg-Marquardt) is the default method
result = lmfit.minimize(f, params)
x, y = result.params['x'].value, result.params['y'].value
Is there an equivalent in Julia, or what is the best way to achieve this?
Does it have to be Levenberg-Marquardt? If not, you can get what you want using Optim.jl:
using Optim

f(x) = x[1]^2 + x[2]^4

# unconstrained: defaults to Nelder-Mead, minimum expected at (0, 0)
result = optimize(f, [1.0, 2.0])
x, y = Optim.minimizer(result)  # (2.3024075561537708e-5, -0.0009216015268974243)

# box-constrained: minimum expected at (1, 0)
lbounds = [1.0, -0.25]
ubounds = [2.0, 0.25]
result = optimize(f, lbounds, ubounds, [1.5, 0.1])
x, y = Optim.minimizer(result)  # (1.0000000000000002, -2.1978466115000986e-11)
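Box constraints in Optim.jl are handled by Fminbox. If you want to choose the inner algorithm explicitly, a sketch like the following should work (LBFGS is an arbitrary choice of gradient-based inner optimizer here; Optim falls back to finite-difference gradients if you don't supply one):

using Optim

f(x) = x[1]^2 + x[2]^4

# Fminbox wraps a gradient-based inner optimizer to enforce the box constraints
inner = LBFGS()
result = optimize(f, [1.0, -0.25], [2.0, 0.25], [1.5, 0.1], Fminbox(inner))
x, y = Optim.minimizer(result)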
Previous answer:
You could perhaps use the package LsqFit.jl:
using LsqFit

# model with two parameters
@. f(x, p) = p[1] * exp(-x * p[2])

# fake data; true parameters chosen to lie inside the bounds below
xdata = range(0, stop=10, length=20)
ydata = f(xdata, [1.0, 0.2]) + 0.01 * randn(length(xdata))

# lower and upper bounds + initial parameter guess
lb = [-π, -0.25]
ub = [π, 0.25]
p0 = [0.5, 0.1]

# least squares fit (LsqFit uses Levenberg-Marquardt)
fit_bounds = curve_fit(f, xdata, ydata, p0, lower=lb, upper=ub)
p1, p2 = fit_bounds.param
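Since curve_fit uses the Levenberg-Marquardt algorithm, this is the closest match to what lmfit does by default. A minimal sketch for inspecting the result (field names as exposed by LsqFit's fit object; check the docs for your version):

fit_bounds.converged          # whether the optimizer reported convergence
fit_bounds.resid              # residuals at the fitted parameters
sum(abs2, fit_bounds.resid)   # sum of squared residuals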