
Support GenericModel and AbstractModel #93

Merged Nov 16, 2023 (9 commits)
Conversation

@pulsipher (Collaborator) commented Nov 4, 2023

Closes #85. This PR adds new constructors for GDPModel that support JuMP.AbstractModels, JuMP.GenericModels, and other types of variable/constraint references, paving the way for this package to work with InfiniteOpt.

The major changes include:

  • Parameterizing many of the types by the model, variable, and/or constraint type
  • Storing variable bounds directly in GDPData, since several reformulation methods use them
  • Creating binary variables immediately when logical variables are added, and adding binary_variable to access them (needed to embed logical variables in algebraic constraints, which InfiniteOpt requires)
  • Cleaning up the docs a little more
  • Renaming the method keyword argument of optimize! to gdp_method for clarity (needed for InfiniteOpt)
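As an illustrative sketch of the changes above (the parameterized constructor, binary_variable accessor, and gdp_method keyword are taken from this PR's description; the exact type parameters shown here are an assumption for a plain JuMP model):

```julia
using DisjunctiveProgramming, JuMP, HiGHS

# GDPModel is now parameterized by the model, variable reference,
# and constraint reference types (hypothetical instantiation for plain JuMP):
model = GDPModel{Model, VariableRef, ConstraintRef}(HiGHS.Optimizer)

# Logical variables now create their binary counterparts immediately;
# binary_variable exposes them for use in algebraic constraints.
@variable(model, Y[1:2], Logical)
@constraint(model, binary_variable(Y[1]) + binary_variable(Y[2]) == 1)

# The reformulation keyword is now gdp_method (formerly method).
optimize!(model, gdp_method = BigM())
```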

codecov bot commented Nov 4, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison: base (08878e3) 99.89% vs. head (1e6dbe2) 99.90%.

Additional details and impacted files
@@           Coverage Diff           @@
##           master      #93   +/-   ##
=======================================
  Coverage   99.89%   99.90%           
=======================================
  Files          10       10           
  Lines         967     1001   +34     
=======================================
+ Hits          966     1000   +34     
  Misses          1        1           


@pulsipher pulsipher changed the title DO NOT MERGE YET: Support GenericModel and AbstractModel Support GenericModel and AbstractModel Nov 6, 2023
@pulsipher (Collaborator, Author) commented Nov 7, 2023

For reference, the following extension will be needed to enable the use of InfiniteOpt. It currently relies on the master branch of InfiniteOpt and can be included in another PR once InfiniteOpt makes its next release.

import DisjunctiveProgramming as DP
import InfiniteOpt, JuMP

function InfiniteGDPModel(args...; kwargs...)
    return DP.GDPModel{
        InfiniteOpt.InfiniteModel, 
        InfiniteOpt.GeneralVariableRef, 
        InfiniteOpt.InfOptConstraintRef
        }(args...; kwargs...)
end

InfiniteLogical(prefs...) = DP.Logical(InfiniteOpt.Infinite(prefs...))

# Infinite parameters need no disaggregation; all other variable types do
function DP.requires_disaggregation(vref::InfiniteOpt.GeneralVariableRef)
    return !(vref.index_type <: InfiniteOpt.InfOptParameter)
end

function DP.make_disaggregated_variable(
    model::InfiniteOpt.InfiniteModel,
    vref::InfiniteOpt.GeneralVariableRef,
    name,
    lb,
    ub
    )
    prefs = InfiniteOpt.parameter_refs(vref)
    if !isempty(prefs)
        return JuMP.@variable(model, base_name = name, lower_bound = lb, upper_bound = ub, 
                              variable_type = InfiniteOpt.Infinite(prefs...))
    else
        return JuMP.@variable(model, base_name = name, lower_bound = lb, upper_bound = ub)
    end
end

function JuMP.add_constraint(
    model::InfiniteOpt.InfiniteModel,
    c::JuMP.VectorConstraint{F, S},
    name::String = ""
    ) where {F, S <: DP.AbstractCardinalitySet}
    return DP._add_cardinality_constraint(model, c, name)
end
function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{DP._LogicalExpr{M}, S},
    name::String = ""
    ) where {S, M <: InfiniteOpt.InfiniteModel}
    return DP._add_logical_constraint(model, c, name)
end
function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{DP.LogicalVariableRef{M}, S},
    name::String = ""
    ) where {M <: InfiniteOpt.InfiniteModel, S}
    error("Cannot define constraint on single logical variable, use `fix` instead.")
end
function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{JuMP.GenericAffExpr{C, DP.LogicalVariableRef{M}}, S},
    name::String = ""
    ) where {M <: InfiniteOpt.InfiniteModel, S, C}
    error("Cannot add, subtract, or multiply with logical variables.")
end
function JuMP.add_constraint(
    model::M,
    c::JuMP.ScalarConstraint{JuMP.GenericQuadExpr{C, DP.LogicalVariableRef{M}}, S},
    name::String = ""
    ) where {M <: InfiniteOpt.InfiniteModel, S, C}
    error("Cannot add, subtract, or multiply with logical variables.")
end

With the above, we can write infinite GDP models and reformulate them before discretizing, which improves performance:

using InfiniteOpt, DisjunctiveProgramming, HiGHS

# Initialize the model
model = InfiniteGDPModel(HiGHS.Optimizer)

# Create the infinite variables
I = 1:4
@infinite_parameter(model, t ∈ [0, 1], num_supports = 100)
@variable(model, 0 <= g[I] <= 10, Infinite(t))

# Add the disjunctions and their indicator variables
@variable(model, G[I, 1:2], InfiniteLogical(t))
@constraint(model, [i ∈ I, j ∈ 1:2], 0 <= g[i], Disjunct(G[i, 1]))
@constraint(model, [i ∈ I, j ∈ 1:2], g[i] <= 0, Disjunct(G[i, 2]))
@disjunction(model, [i ∈ I], G[i, :])

# Add the logical propositions
@variable(model, W, InfiniteLogical(t))
@constraint(model, G[1, 1] ∧ G[2, 1] ∧ G[3, 1] == W := true)
@constraint(model, 𝔼(binary_variable(W), t) >= 0.95)

# Reformulate and solve
optimize!(model, gdp_method = Hull())

@pulsipher pulsipher requested a review from hdavid16 November 7, 2023 04:04
src/variables.jl — review thread (outdated, resolved)
@hdavid16 (Owner) left a comment
Looks good, thanks!
I added a default reformulation method (BigM) to master. Can you rebase/resolve any conflicts?

@pulsipher (Collaborator, Author) commented:

> Looks good, thanks! I added a default reformulation method (BigM) to master. Can you rebase/resolve any conflicts?

Thanks for reviewing; it should be good to go once all the tests pass.

@hdavid16 hdavid16 merged commit 7382c45 into master Nov 16, 2023
6 checks passed
@pulsipher pulsipher deleted the abstract-model branch November 16, 2023 21:45
Successfully merging this pull request may close these issues.

Generalize to AbstractModels