I have an optimization problem with constraints, but the COBYLA solver doesn't seem to respect the constraints I specify.
My optimization problem:
cons = ({'type': 'ineq', 'fun': lambda t: t},)  # all t-values must be non-negative

minimize(lambda t: -stateEst(dict(zip(self.edgeEvents.keys(), t))),
         (0.1,) * len(self.edgeEvents),
         constraints=cons, method='COBYLA')
and stateEst is defined as:
def stateEst(ts):
    # expected cost over all edge and node events, evaluated at the named t-values in ts
    val = 0
    for edge, nextState in self.edgeEvents.iteritems():
        val += edge_probability(self, edge, ts) * estimates[nextState]
        val += node_probability(self, edge.head, ts, edge_list=[edge]) * cost
    for node, nextState in self.nodeEvents.iteritems():
        val += node_probability(self, node, ts) * \
               (estimates[nextState] +
                cost * len([e for e in node.incoming if e in self.compEdges]))
    return val
The probability functions are only defined for positive t-values. The dictionary is necessary because the probabilities are calculated with respect to the 'named' t-values.
When I run this, I notice that COBYLA tries a value of -0.025 for one of the t-values. Why is the optimization not respecting the constraints?
COBYLA is, technically speaking, an infeasible method: its intermediate iterates are not guaranteed to satisfy your constraints. Feasibility only matters at final convergence for this class of algorithms.
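You can observe this with a small self-contained toy problem (a hypothetical objective, not your stateEst): the constrained optimum sits at t = 0, and COBYLA will typically probe negative trial points on its way there.

import numpy as np
from scipy.optimize import minimize

trial_points = []

def f(t):
    trial_points.append(float(t[0]))   # record every point COBYLA evaluates
    return (t[0] + 1.0) ** 2           # unconstrained minimum at t = -1

cons = ({'type': 'ineq', 'fun': lambda t: t},)  # t >= 0, same form as yours

minimize(f, x0=[0.5], constraints=cons, method='COBYLA')
print(min(trial_points))  # typically negative: intermediate iterates violate t >= 0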
An objective function that is not defined everywhere is problematic for such a solver, so you may be forced to switch to a feasible method.
Alternatively, you could generalize your objective by introducing penalties for negative t's, as sketched below. This is problem-dependent, though, and can introduce issues of its own (slower convergence; numeric instability).
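A minimal sketch of that penalty idea, reusing your stateEst and self.edgeEvents; the penalty weight 1e4 and the clamping floor 1e-12 are arbitrary assumptions you would need to tune:

import numpy as np

def penalized_objective(t, weight=1e4):
    t = np.asarray(t)
    # quadratic penalty, exactly zero whenever all t >= 0
    penalty = weight * np.sum(np.minimum(t, 0.0) ** 2)
    # clamp so the probability functions inside stateEst stay defined
    t_safe = np.maximum(t, 1e-12)
    return -stateEst(dict(zip(self.edgeEvents.keys(), t_safe))) + penalty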
Try L-BFGS-B instead: it is limited to bound constraints, but that is all your current problem needs, and it keeps its iterates inside the bounds.
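Adapted to your call, it would look roughly like this; the lower bound of 1e-9 is an assumption, chosen to keep the t-values strictly positive for the probability functions:

from scipy.optimize import minimize

n = len(self.edgeEvents)
bounds = [(1e-9, None)] * n   # strictly positive lower bound, no upper bound

res = minimize(lambda t: -stateEst(dict(zip(self.edgeEvents.keys(), t))),
               (0.1,) * n,
               bounds=bounds,
               method='L-BFGS-B')

Unlike COBYLA, L-BFGS-B keeps its iterates inside the box, so stateEst should never be queried at a negative t.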