julia, mathematical-optimization, julia-jump

Instead of defining an objective function, gradient, and Hessian function separately, can you have one function return all?


Is there a way in JuMP to define a function that returns, for example, a tuple containing the objective function and the gradient, which you can pass to `register`?

In the documentation: https://jump.dev/JuMP.jl/stable/manual/nlp/#Register-a-function-and-gradient

I see you can do this:

f(x) = x^2   # objective
∇f(x) = 2x   # gradient
∇²f(x) = 2   # second derivative (the Hessian in 1-D)
model = Model()
register(model, :my_square, 1, f, ∇f, ∇²f)  # register with explicit derivatives
@variable(model, x >= 0)
@NLobjective(model, Min, my_square(x))

i.e. define the functions separately. In my context I'm solving an optimal control problem where one function can compute the objective and gradient in one go. Is there a way, when I set autodiff=false, to write a function that returns the objective and gradient for gradient-based optimization, or all three to do Hessian-based optimization, depending on my needs?

Also, how do you type the symbols like the nabla and nabla squared? Do you need to use a special editor?


Solution

  • Use a closure:

    julia> function fg()
               # cache the last evaluation so f and ∇f share one computation
               last_g = zeros(2)
               last_x = nothing
               last_fx = 0.0
               function update(x...)
                   # recompute only when the evaluation point changes
                   if x != last_x
                       @info "Updating $x"
                       last_fx = x[1]^2 + x[2]^2
                       last_g[1] = 2x[1]
                       last_g[2] = 2x[2]
                       last_x = x
                   end
                   return
               end
               function f(x...)
                   update(x...)
                   return last_fx
               end
               function ∇f(g, x...)
                   update(x...)
                   copy!(g, last_g)
                   return
               end
               return f, ∇f
           end
    fg (generic function with 1 method)
    
    julia> f, ∇f = fg()
    (f, ∇f)
    
    julia> f(1, 2)
    [ Info: Updating (1, 2)
    5
    
    julia> f(1, 2)
    5
    
    julia> f(1, 2)
    5
    
    julia> f(1, 3)
    [ Info: Updating (1, 3)
    10
    
    julia> f(1, 3)
    10
    
    julia> g = zeros(2)
    2-element Vector{Float64}:
     0.0
     0.0
    
    julia> ∇f(g, 1, 2)
    [ Info: Updating (1, 2)
    
    julia> ∇f(g, 1, 2)
    
    julia> f(1, 2)
    5
    
    julia> g
    2-element Vector{Float64}:
     2.0
     4.0