An optimization problem with a squared objective solves successfully with IPOPT in Python Gekko.
from gekko import GEKKO
import numpy as np
m = GEKKO()
x = m.Var(); y = m.Param(3.2)
m.Obj((x-y)**2)
m.solve()
print(x.value[0],y.value[0])
However, when I switch to an absolute value objective, either np.abs(x-y) (the NumPy version of abs) or m.abs(x-y) (the Gekko version of abs), the IPOPT solver reports a failed solution. An absolute value approximation m.sqrt((x-y)**2) also fails.
Failed Solution
from gekko import GEKKO
import numpy as np
m = GEKKO()
x = m.Var(); y = m.Param(3.2)
m.Obj(m.abs(x-y))
m.solve()
print(x.value[0],y.value[0])
I understand that gradient-based solvers don't like functions without continuous first and second derivatives, and I suspect that this is the problem with abs(), which has a discontinuous derivative at 0. Is there an alternative to abs() that reliably minimizes an absolute value with gradient-based solvers in Python Gekko?
You can use m.abs2(x-y) instead. It reformulates the absolute value as an MPCC (mathematical program with complementarity constraints), so the non-smooth derivative at zero no longer prevents the gradient-based solver from converging.
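For example, here is a minimal sketch of the failing case from the question with m.abs swapped for m.abs2:
from gekko import GEKKO
m = GEKKO()
x = m.Var(); y = m.Param(3.2)
m.Obj(m.abs2(x-y))  # m.abs2 handles the non-smooth point at zero
m.solve()
print(x.value[0],y.value[0])
The solution should converge to x=3.2 with an objective value near zero, matching the result of the squared-objective version.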