Tags: julia, mathematical-optimization, julia-jump

Julia JuMP feasibility slack of constraints


In Julia, using JuMP, I am setting up a simple optimization problem (an MWE; the real problem is much bigger).

model = Model()
set_optimizer(model, MosekTools.Optimizer)
@variable(model, 0 <= x[1:2])
@constraint(model, sum(x) <= 2)
@constraint(model, 1 <= sum(x))
@objective(model, Min, sum(x))
print(model)

Which gives this model:

Min x[1] + x[2]
Subject to
 x[1] + x[2] ≤ 2.0
 -x[1] - x[2] ≤ -1.0
 x[1] ≥ 0.0
 x[2] ≥ 0.0

I optimize this model via optimize!(model).

Now, obviously, the constraint x[1] + x[2] <= 2 is redundant: at the optimum it has a feasibility slack of 1. My goal is to find all constraints with a slack larger than 0 and display those slacks. Then I will delete those constraints from the model.

To this end, I iterate over the constraints which are not variable bounds and print their values.

for (F, S) in list_of_constraint_types(model)
    # Iterate over constraint types
    if F != JuMP.VariableRef  # skip variable bounds
        for ci in all_constraints(model, F, S)
            println(value(ci))
        end
    end
end

However, value(ci) returns the value of the constraint's left-hand-side function, so I get:

1.0
-1.0

I want to instead see the slacks:

1
0

How can I do this? Note that I am not necessarily interested in linear programs, so things like shadow_price are not useful for me.


Based on the accepted answer, I am adding an MWE that solves this problem.

model = Model()
set_optimizer(model, MosekTools.Optimizer)
@variable(model, 0 <= x[1:2])
@constraint(model, sum(x) <= 2)
@constraint(model, 1 <= sum(x))
@constraint(model, 0.9 <= sum(x))
@objective(model, Min, sum(x))
print(model)
optimize!(model)
constraints_to_delete = Any[]
for (F, S) in list_of_constraint_types(model)
    if F != JuMP.VariableRef  # skip variable bounds
        for ci in all_constraints(model, F, S)
            slack = normalized_rhs(ci) - value(ci)
            if slack > 1e-5
                push!(constraints_to_delete, ci)
                println(slack)
                # do not delete here: deleting while iterating is unsafe
            end
        end
    end
end

for c in constraints_to_delete
    delete(model, c)
end

print(model)

Solution

  • Read this (hot off the press) tutorial: https://jump.dev/JuMP.jl/dev/tutorials/linear/lp_sensitivity/.

    Although focused on LPs, it shows how to compute slacks, among other quantities, using normalized_rhs(ci) - value(ci).
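For reference, the pattern from the MWE above can be wrapped in a small helper. This is only a sketch: the function name redundant_constraints and the tol keyword are my own, not part of JuMP, and it assumes all non-bound constraints are scalar constraints for which normalized_rhs is defined (it would error on vector-valued constraints).

```julia
using JuMP

# Hypothetical helper (not JuMP API): after optimize!(model), collect
# scalar constraints (excluding variable bounds) whose slack at the
# current solution exceeds `tol`, returned as (constraint, slack) pairs.
function redundant_constraints(model::Model; tol = 1e-5)
    out = Any[]
    for (F, S) in list_of_constraint_types(model)
        F == JuMP.VariableRef && continue  # skip variable bounds
        for ci in all_constraints(model, F, S)
            # slack = right-hand side minus the constraint function's value
            slack = normalized_rhs(ci) - value(ci)
            slack > tol && push!(out, (ci, slack))
        end
    end
    return out
end
```

Collecting the pairs first and deleting afterwards keeps the iteration safe, mirroring the two-pass structure of the MWE:

```julia
# for (ci, s) in redundant_constraints(model)
#     println(s)
#     delete(model, ci)
# end
```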