
Does Drake guarantee an optimization will succeed if its constraints are satisfied by the initial guess?


If I create a mathematical program in Drake, with some constraints and some costs, and give it an initial guess that satisfies all the constraints, are all Drake solvers guaranteed to find a solution? And is that solution guaranteed to have a cost less than or equal to the cost of the initial guess?


Solution

  • No, none of Drake's supported solvers guarantees either of these.

    I suppose you have a nonlinear, non-convex optimization problem. For these problems Drake supports the SNOPT, IPOPT, and NLopt solvers. None of these solvers guarantees that the constraints stay satisfied during the iterative optimization process; temporarily violating the constraints can enable the solver to find possibly better solutions. For example, if you have the problem

    min  x
    s.t. x² = 1
    

    and you start from the initial guess x = 1, you would probably hope that the solver finds the better solution x = -1. But to jump from x = 1 to x = -1, the solver has to pass through intermediate values (like x = 0.5, 0, ...) that violate the constraint.

    The solver also doesn't guarantee that the returned solution has a cost less than or equal to the cost of the initial guess; it could end up with a worse solution. In non-convex optimization problems there isn't much we can guarantee. If the solver returns a worse solution than your initial guess, you can simply keep the initial guess, as sketched below.
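
    As a concrete illustration, here is a minimal pydrake sketch of the example above: it solves min x s.t. x² = 1 starting from x = 1, then compares the solver's optimal cost with the cost of the initial guess and keeps the guess if the solver did worse (assuming a recent Drake release where MathematicalProgram and Solve are importable from pydrake.solvers).

        import numpy as np
        from pydrake.solvers import MathematicalProgram, Solve

        # Build the example program: minimize x subject to x² = 1.
        prog = MathematicalProgram()
        x = prog.NewContinuousVariables(1, "x")
        cost = prog.AddCost(x[0])            # linear cost: minimize x
        prog.AddConstraint(x[0] ** 2 == 1)   # nonlinear equality constraint

        # Feasible initial guess x = 1.
        x_guess = np.array([1.0])
        result = Solve(prog, x_guess)

        # Cost of the initial guess, evaluated through the cost binding
        # (here it is trivially x_guess[0] = 1, but the pattern generalizes).
        guess_cost = cost.evaluator().Eval(x_guess)[0]

        # Keep the solver's answer only if it succeeded and is no worse
        # than the initial guess; otherwise fall back to the guess.
        if result.is_success() and result.get_optimal_cost() <= guess_cost:
            x_best = result.GetSolution(x)
        else:
            x_best = x_guess

        print(result.get_solver_id().name(), x_best)

    Depending on which solver Drake picks, this particular problem may well return x = 1 unchanged, since x = 1 is itself a local optimum of the non-convex problem, which is exactly the behavior described above.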