Tags: r, optimization, nonlinear-optimization

Optimizing a non-differentiable function in R


There are two methods I am experimenting with to minimize a cost function. The first is optim(), and the second is optim_nm() from the optimization package. The problem I am facing is that my error function takes two arguments:

  1. A list of variable parameters the optimization function needs to modify
  2. A set of fixed parameters

optim(par = variableParameters, fn = error_function, par2 = fixedParameters)

optim handles this well because the first argument is the vector of variable parameters, the second is the function to minimize, and any additional named arguments are passed on to that function, which is where I supply the fixed parameters. This works; however, it is slow.
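
A minimal sketch of this pattern, with a made-up error function and made-up fixed values for illustration: any extra named argument passed to optim() is forwarded to fn.

    # Toy cost function: the extra named argument fixedParameters is
    # forwarded by optim() straight to this function.
    error_function <- function(par, fixedParameters) {
      sum((par - fixedParameters)^2)    # squared distance to the fixed target
    }
    
    optim(par = c(0, 0),                # starting values for the variable parameters
          fn = error_function,
          fixedParameters = c(3, -1),   # fixed parameters, passed through optim's ...
          method = "Nelder-Mead")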

optim_nm(fun = error_function, k = 5, start = variable_parameters)

optim_nm lets me tune the optimization routine; however, I'm unsure how to pass the fixed parameters. All the examples in the documentation use only variable parameters.

Both methods implement the Nelder-Mead algorithm, which is robust for the non-differentiable error functions I am working with. If there are other packages that do this quickly, please mention them too!

If someone has used this, or can better interpret the documentation, I could use your help.

optim_nm documentation

optim documentation


Solution

  • Create a wrapper function that fills in the values for the fixed parameters:

    error_function <- function(variableParameters, fixedParameters) {
      # compute and return the cost; body omitted
      ...
    }
    
    # Wrapper that exposes only the variable parameters to the optimizer;
    # the fixed parameters are baked in (3 is just an example value).
    wrapper <- function(x) {
      error_function(x, fixedParameters = 3)
    }
    
    optim_nm(fun = wrapper,
             k = 5,                            # number of variable parameters
             start = initial_parameter_values)
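
    If you prefer not to define a named wrapper, an anonymous function works the same way (initial_parameter_values is assumed to be defined as before):

    optim_nm(fun = function(x) error_function(x, fixedParameters = 3),
             k = 5,
             start = initial_parameter_values)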
    

    If error_function is expensive to evaluate, you may want to look into Bayesian optimization with the rBayesianOptimization or mlrMBO packages.
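
    As a rough sketch of how rBayesianOptimization could be applied here (the parameter names, bounds, and iteration counts are placeholders, and the package maximizes its objective, so the error is negated):

    library(rBayesianOptimization)
    
    # Hypothetical objective: one argument per parameter named in `bounds`,
    # returning the Score to maximize (the negated error) plus a Pred slot.
    obj <- function(a, b) {
      err <- error_function(c(a, b), fixedParameters = 3)
      list(Score = -err, Pred = 0)
    }
    
    BayesianOptimization(obj,
                         bounds = list(a = c(-5, 5), b = c(-5, 5)),
                         init_points = 10,  # random evaluations to seed the surrogate model
                         n_iter = 20,       # model-guided evaluations
                         acq = "ucb")       # acquisition function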