Tags: python, constraints, pyomo

Pyomo : summation constraint on row level <= 1


During the setup of a ConcreteModel in Pyomo, I need the following constraint: sum_of_perc[times, companies] <= 1

Currently I have the following code:

m.cons.add(sum(m.key_optimized[t, c] for t, c in itertools.product(m.times, m.companies)) <= 1)

itertools.product lets me iterate over every combination of time and company.

Consider the following hypothetical/random example of the 2-dimensional array, where each row is a t and each element is a c:

[0.1 , 0.4 , 0.0 , 0.5] <= 1
[0.2 , 0.2 , 0.6 , 0.0] <= 1

Currently my constraint forces the sum over the whole 2D matrix to be <= 1, but I want the summation at the t (row) level. For example, my current code produces the following result, which adds up to exactly 1 across the entire matrix:

(0, 0) 0.34
(0, 1) 0.42
(0, 2) 0.0
(0, 3) 0.16
(1, 0) 0.005
(1, 1) 0.075
(1, 2) 0.0
(1, 3) 0.0

Thank you for the assistance!


Solution

  • I think this will answer your question...

First, if you want to replicate what you just did without itertools, it is easy to make the full cross product of two sets. See the first part below.

If you are making a constraint for each member of something, you should immediately think of setting up a function-rule combo: Pyomo will call the function for each member of the set(s) that you provide to the Constraint. See the latter part below.

    In [26]: from pyomo.environ import *                                            
    
    In [27]: m = ConcreteModel('example')                                           
    
    In [28]: m.A = Set(initialize=[1,2,3])                                          
    
    In [29]: m.B = Set(initialize=list('abc'))                                      
    
    In [30]: m.X = Var(m.A, m.B, domain=NonNegativeReals)                           
    
    In [31]: # example constraint for "the whole matrix" like you have now          
    
    In [32]: m.C1 = Constraint(expr=sum(m.X[a,b] for a in m.A for b in m.B)<=1)     
    
    In [33]: m.pprint()                                                             
    3 Set Declarations
        A : Size=1, Index=None, Ordered=Insertion
            Key  : Dimen : Domain : Size : Members
            None :     1 :    Any :    3 : {1, 2, 3}
        B : Size=1, Index=None, Ordered=Insertion
            Key  : Dimen : Domain : Size : Members
            None :     1 :    Any :    3 : {'a', 'b', 'c'}
        X_index : Size=1, Index=None, Ordered=True
            Key  : Dimen : Domain : Size : Members
            None :     2 :    A*B :    9 : {(1, 'a'), (1, 'b'), (1, 'c'), (2, 'a'), (2, 'b'), (2, 'c'), (3, 'a'), (3, 'b'), (3, 'c')}
    
    1 Var Declarations
        X : Size=9, Index=X_index
            Key      : Lower : Value : Upper : Fixed : Stale : Domain
            (1, 'a') :     0 :  None :  None : False :  True : NonNegativeReals
            (1, 'b') :     0 :  None :  None : False :  True : NonNegativeReals
            (1, 'c') :     0 :  None :  None : False :  True : NonNegativeReals
            (2, 'a') :     0 :  None :  None : False :  True : NonNegativeReals
            (2, 'b') :     0 :  None :  None : False :  True : NonNegativeReals
            (2, 'c') :     0 :  None :  None : False :  True : NonNegativeReals
            (3, 'a') :     0 :  None :  None : False :  True : NonNegativeReals
            (3, 'b') :     0 :  None :  None : False :  True : NonNegativeReals
            (3, 'c') :     0 :  None :  None : False :  True : NonNegativeReals
    
    1 Constraint Declarations
        C1 : Size=1, Index=None, Active=True
            Key  : Lower : Body                                                                           : Upper : Active
            None :  -Inf : X[1,a] + X[1,b] + X[1,c] + X[2,a] + X[2,b] + X[2,c] + X[3,a] + X[3,b] + X[3,c] :   1.0 :   True
    
    5 Declarations: A B X_index X C1
    
    In [34]: # OK, now if we want to make the row summation "for each a in m.A"     
    
    In [35]: # using a function-rule combo is easiest...                   
    
    In [36]: def sum_over_b(m, a): 
        ...:     return sum(m.X[a,b] for b in m.B) <= 1 
        ...:                                                                        
    
    In [37]: m.C2 = Constraint(m.A, rule=sum_over_b)                                
    
    In [38]: m.C2.pprint()                                                          
    C2 : Size=3, Index=A, Active=True
        Key : Lower : Body                     : Upper : Active
          1 :  -Inf : X[1,a] + X[1,b] + X[1,c] :   1.0 :   True
          2 :  -Inf : X[2,a] + X[2,b] + X[2,c] :   1.0 :   True
          3 :  -Inf : X[3,a] + X[3,b] + X[3,c] :   1.0 :   True
    