Support GenericModel and AbstractModel #93
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@ Coverage Diff @@
##           master      #93   +/-   ##
=======================================
  Coverage   99.89%   99.90%
=======================================
  Files          10       10
  Lines         967     1001   +34
=======================================
+ Hits          966     1000   +34
  Misses          1        1
```
For reference, the following will be needed as an extension to enable the use of InfiniteOpt. Currently, this relies on the master branch of InfiniteOpt and can be included in another PR once InfiniteOpt makes its next release.

```julia
import DisjunctiveProgramming as DP
import InfiniteOpt, JuMP

function InfiniteGDPModel(args...; kwargs...)
    return DP.GDPModel{
        InfiniteOpt.InfiniteModel,
        InfiniteOpt.GeneralVariableRef,
        InfiniteOpt.InfOptConstraintRef
    }(args...; kwargs...)
end

InfiniteLogical(prefs...) = DP.Logical(InfiniteOpt.Infinite(prefs...))

function DP.requires_disaggregation(vref::InfiniteOpt.GeneralVariableRef)
    # infinite parameters do not need to be disaggregated
    return !(vref.index_type <: InfiniteOpt.InfOptParameter)
end

function DP.make_disaggregated_variable(
    model::InfiniteOpt.InfiniteModel,
    vref::InfiniteOpt.GeneralVariableRef,
    name,
    lb,
    ub
)
    prefs = InfiniteOpt.parameter_refs(vref)
    if !isempty(prefs)
        return JuMP.@variable(model, base_name = name, lower_bound = lb, upper_bound = ub,
                              variable_type = InfiniteOpt.Infinite(prefs...))
    else
        return JuMP.@variable(model, base_name = name, lower_bound = lb, upper_bound = ub)
    end
end

function JuMP.add_constraint(
    model::InfiniteOpt.InfiniteModel,
    c::JuMP.VectorConstraint{F, S},
    name::String = ""
) where {F, S <: DP.AbstractCardinalitySet}
    return DP._add_cardinality_constraint(model, c, name)
end

function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{DP._LogicalExpr{M}, S},
    name::String = ""
) where {S, M <: InfiniteOpt.InfiniteModel}
    return DP._add_logical_constraint(model, c, name)
end

function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{DP.LogicalVariableRef{M}, S},
    name::String = ""
) where {M <: InfiniteOpt.InfiniteModel, S}
    error("Cannot define constraint on single logical variable, use `fix` instead.")
end

function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{JuMP.GenericAffExpr{C, DP.LogicalVariableRef{M}}, S},
    name::String = ""
) where {M <: InfiniteOpt.InfiniteModel, S, C}
    error("Cannot add, subtract, or multiply with logical variables.")
end

function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{JuMP.GenericQuadExpr{C, DP.LogicalVariableRef{M}}, S},
    name::String = ""
) where {M <: InfiniteOpt.InfiniteModel, S, C}
    error("Cannot add, subtract, or multiply with logical variables.")
end
```

With the above, we can write infinite GDP models and reformulate them before discretizing, which improves performance:

```julia
using InfiniteOpt, DisjunctiveProgramming, HiGHS

# Initialize the model
model = InfiniteGDPModel(HiGHS.Optimizer)

# Create the infinite variables
I = 1:4
@infinite_parameter(model, t ∈ [0, 1], num_supports = 100)
@variable(model, 0 <= g[I] <= 10, Infinite(t))

# Add the disjunctions and their indicator variables
@variable(model, G[I, 1:2], InfiniteLogical(t))
@constraint(model, [i ∈ I], 0 <= g[i], Disjunct(G[i, 1]))
@constraint(model, [i ∈ I], g[i] <= 0, Disjunct(G[i, 2]))
@disjunction(model, [i ∈ I], G[i, :])

# Add the logical propositions
@variable(model, W, InfiniteLogical(t))
@constraint(model, G[1, 1] ∨ G[2, 1] ∧ G[3, 1] == W := true)
@constraint(model, 𝔼(binary_variable(W), t) >= 0.95)

# Reformulate and solve
optimize!(model, gdp_method = Hull())
```
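Since the review notes that `BigM` was added as the default reformulation method on master, the hull reformulation in the example can also be swapped for Big-M. A minimal sketch, assuming the same `model` as above (whether to pass an explicit M value is left to the solver settings):

```julia
# Big-M reformulation instead of the hull relaxation
optimize!(model, gdp_method = BigM())
```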
Looks good, thanks! I added a default reformulation method (`BigM`) to master. Can you rebase/resolve any conflicts?

Thanks for reviewing; it should be good to go, assuming all the tests pass.
This closes #85. This adds new constructors for `GDPModel` to support `JuMP.AbstractModel`s, `JuMP.GenericModel`s, and other types of variable/constraint references. This paves the way for this package to work with InfiniteOpt. The major changes include:

- The binary variables associated with logical variables are now stored in `GDPData`, since these are commonly used by several reformulation methods, and `binary_variable` is added to access them (this is needed to embed logical variables in algebraic constraints, which InfiniteOpt needs)
- Renamed the `method` keyword in `optimize!` to `gdp_method` for better clarity --> needed for InfiniteOpt
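As a quick illustration of the renamed keyword and the new accessor, a sketch against the standard `GDPModel` API (the model itself is illustrative, not taken from the PR):

```julia
using JuMP, DisjunctiveProgramming

model = GDPModel()
@variable(model, x >= 0)
@variable(model, Y[1:2], Logical)
@constraint(model, x <= 5, Disjunct(Y[1]))
@constraint(model, x >= 5, Disjunct(Y[2]))
@disjunction(model, Y)

# binary_variable returns the binary variable backing a logical variable,
# allowing it to appear in ordinary algebraic constraints
@constraint(model, binary_variable(Y[1]) + binary_variable(Y[2]) == 1)

# with an optimizer attached, the reformulation is now selected
# via the renamed `gdp_method` keyword (formerly `method`)
# optimize!(model, gdp_method = BigM())
```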