
Optimization Error: Box constraint optimization (Julia Optim.jl)


I'm trying to run the following code snippet to fit a curve to some empirical data, but I keep getting an error from the optimize() method in the Julia Optim.jl package. I'm using Julia v1.1.0 and have all the required packages installed. The error I keep getting is:

ERROR: LoadError: MethodError: no method matching optimize(::getfield(Main, Symbol("##13#14")), ::Array{Float64,1}, ::Array{Int32,1}, ::Array{Float64,1}, ::Fminbox{LBFGS{Nothing,LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##19#21"))},Float64,getfield(Optim, Symbol("##43#45"))})

Here is my code:

# Loading in dependencies
using Distributions # To use probability & statistics library
using Plots # To visualize results
using Optim # For minimization (curve fitting)

# Empirical data for curve fitting
IM = [1, 2, 3, 4] # x axis variables
pfs = [0.0, 0.0, 0.13, 0.23] # associated probabilities y-axis
n = 1000 # assume this number of independent trials for each x value

# Create functions to evaluate fit between theoretical values and empirical values
theor_vals = x -> cdf.(LogNormal(log(x[1]), x[2]), IM) # Assume lognormal shape and construct CDF with arbitrary fit parameters
likelihood = x -> [pdf(Binomial(n,xx[1]), round(xx[2])) for xx in zip(theor_vals(x),n.*pfs)] # getting likelihood values from binomial distribution for n trials
log_likelihood = x -> log.([xi > 0 ? xi : 1e-30 for xi in likelihood(x)]) # getting log value of likelihood
min_function = x -> -sum(log_likelihood(x)) # summing and switching sign for optimization


# Set inputs for minimization - first index is for the median and second index is for the dispersion (uncertainty)
init_guess = [median(IM), 0.5] # reasonable initial guess
lx = [0.001, 5.0] # lower bound
ux = [5,10] # upper bound

# Using Optim to optimize the objective function and get best curve fit
result = optimize(min_function, lx, ux, init_guess, Fminbox(LBFGS())) # call optimize function
theta, beta_a = result.minimizer # retrieve lognormal fit params

I'm still getting familiar with the Julia language, so it's very possible I'm just not comprehending the documentation properly. Thanks in advance for any help or guidance provided!


Solution

  • You have to fix two things in your code and then it will work:

    1. ux must contain floats: the Int-typed array is what produces the MethodError in your stack trace (note the ::Array{Int32,1} argument), so change its definition to ux = [5.0, 10.0]
    2. init_guess must lie within the optimization bounds; your second component (0.5) is below its lower bound (5.0), so you can e.g. set it to init_guess = (lx + ux) / 2
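    Concretely, the corrected definitions would look like this (lx is repeated from your snippet so the lines stand on their own):

    ```julia
    lx = [0.001, 5.0]           # lower bounds (already Float64)
    ux = [5.0, 10.0]            # upper bounds as floats, matching lx's element type
    init_guess = (lx + ux) / 2  # midpoint of the box, guaranteed to be in-bounds
    ```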

    Given these changes your code runs. Here is the result I got (I have only proposed the changes needed to make your example runnable; I have not checked the optimization specification itself and assume it is correct):

    julia> result = optimize(min_function, lx, ux, init_guess, Fminbox(LBFGS()))
     * Status: success
    
     * Candidate solution
        Minimizer: [5.00e+00, 5.00e+00]
        Minimum:   1.417223e+03
    
     * Found with
        Algorithm:     Fminbox with L-BFGS
        Initial Point: [2.50e+00, 7.50e+00]
    
     * Convergence measures
        |x - x'|               = 8.88e-16 ≰ 0.0e+00
        |x - x'|/|x'|          = 1.26e-16 ≰ 0.0e+00
        |f(x) - f(x')|         = 0.00e+00 ≤ 0.0e+00
        |f(x) - f(x')|/|f(x')| = 0.00e+00 ≤ 0.0e+00
        |g(x)|                 = 8.87e+01 ≰ 1.0e-08
    
     * Work counters
        Seconds run:   0  (vs limit Inf)
        Iterations:    6
        f(x) calls:    2571
        ∇f(x) calls:   2571
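
    As a side note, rather than reaching into result.minimizer directly, you can read the fit out through Optim.jl's accessor functions (a small sketch, assuming the `result` object returned by the call above):

    ```julia
    theta, beta_a = Optim.minimizer(result)  # lognormal fit parameters at the optimum
    neg_ll = Optim.minimum(result)           # objective value (negative log-likelihood) at the minimizer
    Optim.converged(result)                  # whether the convergence criteria were met
    ```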