Tags: python, optimization, ipopt, deap, objective-function

Going from non-linear root-finding to multi-objective optimization


To simplify, let's say I can describe a system with

variables x1 and x2,

parameters p1 and p2, and

constraints f(x, p) = 0 and g(x, p) = 0:

For example:

f(x1, x2, p1, p2) = x1^2 * p1 + x1^2 * p2 + x2 = 0

g(x1, x2, p1, p2) = x2^2 * p2 + x1 * p1 = 0

Now, let's say that given the true values of parameters p1 and p2, the root(s) exist. However, in my case the parameters are determined imperfectly, and a non-linear root-finder like scipy.optimize's fsolve is unsuccessful. One could treat the parameters as additional variables and try to find the roots, but increase the numbers of variables and parameters by an order of magnitude, as in my actual system, and the constraints become very difficult to satisfy.
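For reference, the two-equation example above can be handed to fsolve directly; the parameter values below (p1 = p2 = 1) and the initial guess are placeholders for illustration, not values from the question:

```python
from scipy.optimize import fsolve

# residual vector of the two constraints f and g from the example;
# p1 and p2 are assumed placeholder values
def residuals(x, p1, p2):
    x1, x2 = x
    return [x1**2 * p1 + x1**2 * p2 + x2,
            x2**2 * p2 + x1 * p1]

# with well-behaved parameters, fsolve finds a root from a rough guess
root = fsolve(residuals, [1.0, 1.0], args=(1.0, 1.0))
print('root:', root)
print('residuals at root:', residuals(root, 1.0, 1.0))
```

With poorly estimated parameters, this is exactly the call that can fail to converge, which motivates the optimization-based reformulation below.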

Therefore, I've been looking at optimization packages in python that could "solve" my set of non-linear equations. And this is where my lack of understanding of optimization presents a roadblock.

If I understand correctly, as posed, my equations are constraints, meaning they must be satisfied for my design to be successful. However, I've realized that given the imperfect nature of the parameters (and the large number of possible variables), I need one (or several) objective functions to minimize rather than hard constraints.

All the equations that describe my system have equal validity, so I don't think I can simply pick some equations to be objective functions while the rest remain constraints. It seems I need to treat all my equations as objective functions.
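One standard way to make "every equation is an objective" concrete is a least-squares formulation: stack all equations into a residual vector and minimize the sum of squares, so no single equation is privileged. A minimal sketch with scipy.optimize.least_squares, using the toy system and placeholder parameter values (p1 = p2 = 1):

```python
from scipy.optimize import least_squares

# each equation contributes one residual; least_squares minimizes
# 0.5 * sum(r_i**2), so all equations are weighted equally
def residuals(x, p1, p2):
    x1, x2 = x
    return [x1**2 * p1 + x1**2 * p2 + x2,
            x2**2 * p2 + x1 * p1]

sol = least_squares(residuals, [1.0, 1.0], args=(1.0, 1.0))
print('x:', sol.x)
print('cost:', sol.cost)  # cost near zero means a true root was found
```

If the equations are inconsistent because of imperfect parameters, this still returns the best compromise solution, with sol.cost indicating how far the equations are from being satisfied.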

I therefore have two questions:

  1. Is my logic of treating all my equations as objective functions valid, and
  2. which Python package would allow me to minimize these objective functions?

I have looked at cyipopt, casadi, pyomo, and DEAP, but I am a bit lost. I think once my model of my system is better defined, I will know exactly what to look for. However, if code for my simple example can be provided, I would really appreciate it.

P.S.: My model as it stands now has 11 variables and 11*5 parameters (five coefficients representing a 4th-degree polynomial for each variable). I can also add constraints to the variables if necessary in the optimization package.
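As an aside, the "five coefficients per variable" parameters described in the P.S. can be evaluated with numpy's polynomial helper before (or inside) the optimization; the coefficients and evaluation point below are made-up values for illustration only:

```python
import numpy as np

# hypothetical coefficients of one 4th-degree polynomial parameter,
# highest degree first, as np.polyval expects
coeffs = np.array([0.2, -0.1, 0.0, 1.5, 3.0])
t = 0.5  # illustrative evaluation point
p = np.polyval(coeffs, t)
print(p)  # 0.2*0.5**4 - 0.1*0.5**3 + 1.5*0.5 + 3.0 = 3.75
```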


Solution

  • With equations:

    e1 = x1**2 * p1 + x1**2 * p2 + x2
    e2 = x2**2 * p2 + x1 * p1
    

    it is possible to solve the system of equations as hard constraints (equality form):

    m.Equation(e1==0)
    m.Equation(e2==0)
    

    or as soft constraints (minimize objective):

    m.Minimize(e1**2)
    m.Minimize(e2**2)
    

    This approach works in Pyomo, CasADi, and Gekko. Here is a Gekko example that uses both approaches:

    from gekko import GEKKO
    m = GEKKO()
    # two variables with initial guess 2
    x = m.Array(m.Var,2,value=2)
    x1,x2 = x
    # two fixed parameters with value 1
    p = m.Array(m.Param,2,value=1)
    p1,p2 = p
    e1 = x1**2 * p1 + x1**2 * p2 + x2
    e2 = x2**2 * p2 + x1 * p1
    
    # approach #1: hard equality constraints
    m.Equation(e1==0)
    m.Equation(e2==0)
    
    # approach #2: soft constraints (squared residuals as objectives)
    m.Minimize(e1**2)
    m.Minimize(e2**2)
    
    m.options.SOLVER=1  # APOPT solver
    m.solve()
    print('x: ', x)
    print('Objective: ',m.options.OBJFCNVAL)
    

    In this case, the solutions are the same whether one approach or the other (or both) is used. For a large-scale optimization problem, the solver may not drive e1 or e2 exactly to zero with soft constraints alone. Sometimes including both can help the solver find a solution.