I am working on a linear optimization problem with a set of cities and a set of power plants. Each city has an electricity demand that must be satisfied. However, in certain time periods some cities have no demand from the power plants, because they can produce some of their own electricity. I do not think the specific details matter much, so below is my best description of the issue.
I created the appropriate city and month sets, and the objective function contains the following term:
sum(sum(1/model.monthly_demand[c,t]*model.theta[c] for c in model.cities) for t in model.months)
The issue clearly arises when monthly_demand[c,t] = 0, as I get a division-by-zero error, and I am not sure how to deal with it. Ideally I would like theta[c] to be dropped (effectively set to zero) in that case, but I am unsure how to do this. I tried adding if/else statements inside the sum(), but as far as I understand this is not possible.
I also thought I could define a rule function to pass into the Pyomo objective, with an if statement that sets theta[c] to zero when the monthly demand is zero, but this was not successful.
Another idea was to replace the zero demands with something tiny like 0.000001, but I would like to keep that as a last resort because I think it will cause numerical issues.
You should just make a subset, wherever convenient, of the non-zero elements of demand. I am assuming demand is a parameter, which would make sense, but you didn't specify. Then use that subset as the basis of the summation; this avoids the zero elements entirely.
Note which elements appear in the objective in the model printout.
import pyomo.environ as pyo
# note: ('SF', 0) is omitted; the Param's default=0 fills it in
demand_data = {('LA', 0): 20,
               ('LA', 1): 0,
               ('SF', 1): 15,
               ('NY', 0): 20,
               ('NY', 1): 30}
m = pyo.ConcreteModel('electricity')
m.C = pyo.Set(initialize=['LA', 'SF', 'NY'])
m.T = pyo.Set(initialize=list(range(2)))
# VARS
m.x = pyo.Var(m.C, m.T, domain=pyo.NonNegativeReals)
# PARAMS
m.demand = pyo.Param(m.C, m.T, initialize=demand_data, default=0)
# CONSTRAINTS
# ...
# OBJ
# make a subset of the non-zero demand periods
nonzero_demand_domain = {(c,t) for c in m.C for t in m.T if m.demand[c, t] > 0}
# use that in your objective...
m.obj = pyo.Objective(expr=sum(m.x[c, t] for c, t in nonzero_demand_domain))
m.pprint()
4 Set Declarations
    C : Size=1, Index=None, Ordered=Insertion
        Key  : Dimen : Domain : Size : Members
        None :     1 :    Any :    3 : {'LA', 'SF', 'NY'}
    T : Size=1, Index=None, Ordered=Insertion
        Key  : Dimen : Domain : Size : Members
        None :     1 :    Any :    2 : {0, 1}
    demand_index : Size=1, Index=None, Ordered=True
        Key  : Dimen : Domain : Size : Members
        None :     2 :    C*T :    6 : {('LA', 0), ('LA', 1), ('SF', 0), ('SF', 1), ('NY', 0), ('NY', 1)}
    x_index : Size=1, Index=None, Ordered=True
        Key  : Dimen : Domain : Size : Members
        None :     2 :    C*T :    6 : {('LA', 0), ('LA', 1), ('SF', 0), ('SF', 1), ('NY', 0), ('NY', 1)}

1 Param Declarations
    demand : Size=6, Index=demand_index, Domain=Any, Default=0, Mutable=False
        Key       : Value
        ('LA', 0) :    20
        ('LA', 1) :     0
        ('NY', 0) :    20
        ('NY', 1) :    30
        ('SF', 1) :    15

1 Var Declarations
    x : Size=6, Index=x_index
        Key       : Lower : Value : Upper : Fixed : Stale : Domain
        ('LA', 0) :     0 :  None :  None : False :  True : NonNegativeReals
        ('LA', 1) :     0 :  None :  None : False :  True : NonNegativeReals
        ('NY', 0) :     0 :  None :  None : False :  True : NonNegativeReals
        ('NY', 1) :     0 :  None :  None : False :  True : NonNegativeReals
        ('SF', 0) :     0 :  None :  None : False :  True : NonNegativeReals
        ('SF', 1) :     0 :  None :  None : False :  True : NonNegativeReals

1 Objective Declarations
    obj : Size=1, Index=None, Active=True
        Key  : Active : Sense    : Expression
        None :   True : minimize : x[NY,0] + x[NY,1] + x[LA,0] + x[SF,1]

7 Declarations: C T x_index x demand_index demand obj