There are two methods I am experimenting with for minimizing a cost function. The first is optim() and the second is optim_nm() from the optimization package. The problem I am facing is that my error function takes two arguments: the variable parameters and a set of fixed parameters.
optim(par = variableParameters, fn = error_function, par2 = fixedParameters)
optim() handles this well because the first argument is the variable parameters, the second is the function, and any further named arguments (here par2) are forwarded through optim()'s ... to the error function, so I can pass the fixed parameters that way. This works; however, it is slow.
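For illustration, here is a minimal sketch of how that call fits together; the quadratic cost and the starting values are placeholders, not my real error function:

error_function <- function(par, par2) {
  sum((par - par2)^2)  # placeholder cost; the real one is non-differentiable
}

variableParameters <- rep(0, 5)
fixedParameters    <- 1:5

optim(par  = variableParameters,
      fn   = error_function,
      par2 = fixedParameters)  # par2 is forwarded to error_function via optim()'s ...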
optim_nm(fun = error_function, k = 5, start = variable_parameters)
optim_nm() allows me to tune the optimization routine; however, I'm unsure of how to pass the fixed parameters. All the examples in the documentation use only variable parameters.
Both methods implement the Nelder-Mead algorithm, which is robust to non-differentiable error functions; that is what I require. If there are other packages that do this quickly, please do mention them too!
If someone has used these functions, or can better interpret the documentation, I could use your help.
Create a wrapper function that fills in the values for the fixed parameters:
error_function <- function(variableParameters, fixedParameters) {
  ...  # compute the cost from both parameter sets
}

# The wrapper bakes in fixedParameters, so the optimizer only ever
# sees the variable parameters.
wrapper <- function(x) {
  error_function(x, fixedParameters = 3)
}

library(optimization)

optim_nm(fun   = wrapper,
         k     = 5,
         start = initial_parameter_values)
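If the fixed value changes between runs, the wrapper can also be produced by a small function factory instead of hard-coding the 3 (make_wrapper is just an illustrative name):

make_wrapper <- function(fixed) {
  function(x) error_function(x, fixedParameters = fixed)
}

optim_nm(fun = make_wrapper(3), k = 5, start = initial_parameter_values)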
If error_function is expensive to evaluate, you may want to look into Bayesian optimization with the rBayesianOptimization or mlrMBO packages.
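As a rough sketch of what the rBayesianOptimization route could look like (the two-parameter objective, bounds, and iteration counts are made-up placeholders; note that BayesianOptimization() maximizes, so the cost is negated):

library(rBayesianOptimization)

# The objective takes the tuning parameters as named arguments and must
# return a list with a Score to maximize and a Pred component.
objective <- function(x1, x2) {
  list(Score = -wrapper(c(x1, x2)),  # negate the cost so minimization becomes maximization
       Pred  = 0)
}

BayesianOptimization(FUN = objective,
                     bounds = list(x1 = c(-5, 5), x2 = c(-5, 5)),
                     init_points = 10,
                     n_iter = 20)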