python, optimization, constraints, mystic

Optimization & Linear Inequalities with Mystic


I am working with a fairly complex objective function that I am minimizing by varying 4 parameters. A while ago I decided to use the Python framework Mystic, which seamlessly allows me to use penalties for complex inequalities (which I need).

However, Mystic has a less-than-obvious way to assign hard constraints (not penalties and not bound constraints, but linear inequalities between the parameters) and an even less obvious way of handling them.

All four of my parameters have finite lower and upper bounds. I would like to add a linear inequality as a hard constraint, like this:

def constraint(x):  # needs to be <= 0
    return x[0] - 3.0*x[2]

But if I try to use Mystic in this way:

from mystic.solvers import fmin_powell
xopt = fmin_powell(OF, x0=x0, bounds=bounds, constraints=constraint)

Then Mystic insists on calling the objective function to resolve the constraints first, and only then proceeds with the actual optimization; since the objective function value has no impact or effect on the constraint function as defined above, I am not sure why this is happening. The constraint function defined above simply tells Mystic that a region of the hyperparameter search space should be off limits.

I have scoured pretty much all the examples in the Mystic folder, and I stumbled across an alternative way to define a hard constraint: use a penalty function and then call a magic method, "as_constraint", to "convert" it into a constraint. Unfortunately, all those examples go pretty much this way:

from mystic.solvers import fmin_powell
from mystic.constraints import as_constraint
from mystic.penalty import quadratic_inequality

def penalty_function(x): # <= 0.0
    return x[0] - 3.0*x[2]

@quadratic_inequality(penalty_function)
def penalty(x):
    return 0.0

solver = as_constraint(penalty)

result = fmin_powell(OF, x0=x0, bounds=bounds, penalty=penalty)

There is this magic line:

solver = as_constraint(penalty)

I can't see what this line is doing: the solver variable is never used again.

So, to the question: is there any way to define linear inequalities in Mystic that does not involve an expensive pre-solve of the constraints, but simply tells Mystic to exclude certain regions of the search space?

Thank you in advance for any suggestion.

Andrea.


Solution

  • What mystic does is map the space it searches, so you are optimizing over a "kernel-transformed" space (to use machine-learning jargon). You can think of the constraints as applying an operator, if you know what that means. So, y = f(x) under some constraints x' = c(x) becomes y = f(c(x)). This is why the optimizer evaluates the constraints before evaluating the objective.
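
    You can see the composition in a toy sketch (plain Python, not mystic's API): the constraints function maps a candidate x to a feasible x', and the objective is only ever evaluated at x'.

    >>> def c(x):                 # toy transform: enforce x[0] <= 3*x[2]
    ...     x = list(x)
    ...     x[0] = min(x[0], 3*x[2])
    ...     return x
    ...
    >>> def f(x):                 # stand-in objective
    ...     return sum(xi*xi for xi in x)
    ...
    >>> f(c([10.0, 2.0, 1.0]))    # f sees [3.0, 2.0, 1.0], not the raw point
    14.0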

    So you can build a constraint like this:

    >>> import mystic.symbolic as ms
    >>> equation = 'x1 - 3*a*x2 <= 0'
    >>> eqn = ms.simplify(equation, locals=dict(a=1), all=True)
    >>> print(eqn)
    x1 <= 3*x2
    >>> c = ms.generate_constraint(ms.generate_solvers(eqn, nvars=3))
    >>> c([1,2,3])
    [1, 2, 3]
    >>> c([0,100,-100])
    [0, -300.0, -100]
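
    The generated c can then be handed straight to the solver with the constraints keyword, exactly as attempted in the question (a sketch, reusing the OF, x0, and bounds defined there):

    >>> from mystic.solvers import fmin_powell
    >>> xopt = fmin_powell(OF, x0=x0, bounds=bounds, constraints=c)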
    

    Or if you have more than one:

    >>> equation = '''
    ... x1 > x2 * x0     
    ... x0 + x1 < 10
    ... x1 + x2 > 5
    ... '''
    >>> eqn = ms.simplify(equation, all=True)
    >>> print(eqn)
    x1 > -x2 + 5
    x0 < -x1 + 10
    x1 > x0*x2
    >>> import mystic.constraints as mc
    >>> c = ms.generate_constraint(ms.generate_solvers(eqn), join=mc.and_)
    >>> c([1,2,3])
    [1, 3.000000000000004, 3]
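
    Note that x1 was nudged to just above 3, so the mapped point satisfies all three simplified inequalities. A quick sanity check (simply re-evaluating them at the mapped point) confirms this:

    >>> a, b, d = c([1, 2, 3])
    >>> b > -d + 5, a < -b + 10, b > a*d
    (True, True, True)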