Tags: python, scipy, restriction

Not iterable Error with constraints in minimize from scipy.optimize


I just started studying optimization with Python and I am facing an issue: I want to minimize my objective function (obj_fun) using minimize from scipy.optimize. Here is an example:

import numpy as np

def analysis(A):
    N = []
    for i in A:
        N.append(i*3)
    return N

def cons(A):
    N = analysis(A)
    C = []
    for i in len(N):
        if N[i] < 2:
            C.append({'type': 'ineq', 'fun': lambda x: x[0]*N[i]})
        else:
            C.append({'type': 'ineq', 'fun': lambda x: x[0]-N[i]})
    return C

def obj_fun(A):
    """Objective function returns the weight of the structure"""
    w=  0.5*[1*A[0]+2*A[1]+3*A[2]]
    return w

# Initial values
A0 = np.array([0.001 for i in range(0, 3)])

N = analysis(A0)

##  Optimization

bnds = [(1e-6, None) for i in range(len(A0))]

from scipy.optimize import minimize
sol = minimize(obj_fun, x0=A0, method='trust-constr', bounds=bnds, 
constraints=cons)
print(sol)

The whole error I get is:

runfile('C:/Users/Myc/Documents/Python Scripts/example stack.py', wdir='C:/Users/Myc/Documents/Python Scripts')
Traceback (most recent call last):

  File "C:\Users\Myc\Documents\Python Scripts\example stack.py", line 40, in <module>
    sol = minimize(obj_fun, x0=A0, method='trust-constr', bounds=bnds, constraints=cons)

  File "C:\Users\Myc\anaconda3\lib\site-packages\scipy\optimize\_minimize.py", line 605, in minimize
    constraints = standardize_constraints(constraints, x0, meth)

  File "C:\Users\Myc\anaconda3\lib\site-packages\scipy\optimize\_minimize.py", line 825, in standardize_constraints
    constraints = list(constraints)  # ensure it's a mutable sequence

TypeError: 'function' object is not iterable

I know the main problem is how I define the constraints: I could replace constraints=cons with constraints=C1 if I define C1 = cons(A0) before the optimization. However, that wouldn't help me, because I need the analysis function to be executed on every iteration of the optimization in order to update the parameters N for the constraints. How can I define the constraints?


Solution

  • A corrected version of the script:

    def obj_fun(A):
        return 7*A[0]+ 3*A[1]+ 7*A[2]
    
    def analysis(A):
        N = []
        for i in A:
            N.append(i*3)
        return N
    
    def cons(A):
        n = analysis(A)
        C = []
        for i in range(len(A)):
            if n[i] < 4:
                C.append({'type': 'ineq', 'fun': lambda x: x[i]**2 / n[i]})
            else:
                C.append({'type': 'ineq', 'fun': lambda x: x[i] - n[i]})
        return C
    
    
    A0 = [1,2,3]
    C = cons(A0)
    bnds = [(1e-6, None) for i in range(len(A0))]
    
    from scipy.optimize import minimize
    sol = minimize(obj_fun, x0=A0, method='trust-constr', bounds=bnds, constraints=C)
    print(sol)
    

    runs and prints:

    /usr/local/lib/python3.8/dist-packages/scipy/optimize/_hessian_update_strategy.py:182: UserWarning: delta_grad == 0.0. Check if the approximated function is linear. If the function is linear better results can be obtained by defining the Hessian as zero instead of using quasi-Newton approximations.
      warn('delta_grad == 0.0. Check if the approximated '
     barrier_parameter: 0.00016000000000000007
     barrier_tolerance: 0.00016000000000000007
              cg_niter: 15
          cg_stop_cond: 1
                constr: [array([9.00009143]), array([4.57149698e-05]), array([4.57149698e-05]), array([2.38571416e-05, 5.43334162e-05, 9.00004571e+00])]
           constr_nfev: [40, 40, 40, 0]
           constr_nhev: [0, 0, 0, 0]
           constr_njev: [0, 0, 0, 0]
        constr_penalty: 1.0
      constr_violation: 0.0
        execution_time: 0.0873115062713623
                   fun: 63.00065000502843
                  grad: array([7.        , 3.        , 6.99999999])
                   jac: [array([[0.        , 0.        , 2.00001017]]), array([[0., 0., 1.]]), array([[0., 0., 1.]]), array([[1., 0., 0.],
           [0., 1., 0.],
           [0., 0., 1.]])]
       lagrangian_grad: array([1.77635684e-15, 1.55431223e-14, 5.67948534e-14])
               message: '`gtol` termination condition is satisfied.'
                method: 'tr_interior_point'
                  nfev: 40
                  nhev: 0
                   nit: 14
                 niter: 14
                  njev: 10
            optimality: 5.679485337974424e-14
                status: 1
               success: True
             tr_radius: 18734.614693588483
                     v: [array([-1.77775972e-05]), array([-3.49997333]), array([-3.49997333]), array([-7.00000000e+00, -3.00000000e+00, -1.77776895e-05])]
                     x: array([2.38571416e-05, 5.43334162e-05, 9.00004571e+00])
    

    Here is what C contains:

    In [36]: C
    Out[36]: 
    [{'type': 'ineq', 'fun': <function __main__.cons.<locals>.<lambda>(x)>},
     {'type': 'ineq', 'fun': <function __main__.cons.<locals>.<lambda>(x)>},
     {'type': 'ineq', 'fun': <function __main__.cons.<locals>.<lambda>(x)>}]
    

    A0 is used to create the 3 constraint functions.
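    One subtlety worth flagging: all three lambdas in cons close over the same loop variable i, and Python closures are late-binding, so by the time the optimizer calls them every lambda sees i = 2 (visible in the jac output, where all three constraint gradients involve only x[2]). A minimal demonstration, with the usual default-argument fix:

    ```python
    # Late binding: every lambda reads i at call time, after the loop
    # has finished, so all three see i = 2.
    funcs = [lambda x: x[i] for i in range(3)]
    print([f([10, 20, 30]) for f in funcs])        # [30, 30, 30]

    # Binding i as a default argument freezes its value per iteration.
    funcs_fixed = [lambda x, i=i: x[i] for i in range(3)]
    print([f([10, 20, 30]) for f in funcs_fixed])  # [10, 20, 30]
    ```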

    The analysis function just multiplies each element of A by 3.

    In [38]: analysis(A0)
    Out[38]: [3, 6, 9]
    In [39]: A0
    Out[39]: [1, 2, 3]
    In [40]: analysis(A0)
    Out[40]: [3, 6, 9]
    In [41]: np.array(A0)*3
    Out[41]: array([3, 6, 9])
    

    In your latest cons you dropped the range, and you pass cons itself rather than cons(A0). The constraints parameter is supposed to be a list of dicts, as shown in C.
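
    To get what the question actually asks for, with analysis re-run at every iterate so that N is updated, you can move the analysis call inside each constraint function. A sketch under that assumption (make_constraints is a hypothetical helper, not part of the original code; note that the if/else branch makes each constraint non-smooth, which gradient-based methods like trust-constr may handle poorly):

    ```python
    from scipy.optimize import minimize

    def analysis(A):
        # stand-in for the real structural analysis
        return [a * 3 for a in A]

    def obj_fun(A):
        return 7 * A[0] + 3 * A[1] + 7 * A[2]

    def make_constraints(n_vars, threshold=4):
        C = []
        for i in range(n_vars):
            def fun(x, i=i):              # default arg avoids late binding
                n = analysis(x)           # re-evaluated at the current x
                if n[i] < threshold:
                    return x[i] ** 2 / n[i]
                return x[i] - n[i]
            C.append({'type': 'ineq', 'fun': fun})
        return C

    A0 = [1, 2, 3]
    bnds = [(1e-6, None) for _ in A0]
    sol = minimize(obj_fun, x0=A0, method='trust-constr',
                   bounds=bnds, constraints=make_constraints(len(A0)))
    print(sol.x)
    ```

    Each dict's fun now calls analysis on the optimizer's current x instead of on the frozen A0, which is the behavior the question requires.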