[docs] fix a bug in the Benders tutorial #3834

Merged (2 commits) on Oct 2, 2024
19 changes: 10 additions & 9 deletions docs/src/tutorials/algorithms/benders_decomposition.jl
@@ -168,9 +168,8 @@ objective_value(model)

# and the optimal flows are:

-function optimal_flows(x_sol)
-    flows = [(i, j) => value(x[i, j]) for i in 1:n for j in 1:n]
-    return filter!(flow -> last(flow) > 0, flows)
+function optimal_flows(x)
+    return [(i, j) => x[i, j] for i in 1:n for j in 1:n if x[i, j] > 0]
end

monolithic_solution = optimal_flows(value.(y))
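The rewritten helper no longer calls `value` inside the comprehension, so it accepts any numeric matrix (the old version also ignored its `x_sol` argument and queried the enclosing `x` variables directly, which is the bug this PR fixes). A minimal sketch of the new behaviour, using made-up `n` and flow data rather than the tutorial's network:

```julia
# Illustrative data only; the tutorial builds `n` and the solution matrix
# from its own network model.
n = 3
y_sol = [0.0 1.0 0.0; 0.0 0.0 1.0; 0.0 0.0 0.0]

optimal_flows(x) = [(i, j) => x[i, j] for i in 1:n for j in 1:n if x[i, j] > 0]

optimal_flows(y_sol)  # [(1, 2) => 1.0, (2, 3) => 1.0]
```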
@@ -217,7 +216,7 @@ end

# Note that `solve_subproblem` returns a `NamedTuple` of the objective value,
# the optimal primal solution for `y`, and the optimal dual solution for `π`,
-# which we obtained from the reduced cost of the `x` variables.
+# which we obtained from the [`reduced_cost`](@ref) of the `x` variables.

# We're almost ready for our optimization loop, but first, here's a helpful
# function for logging:
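The hunk ends just before the logging helper, so for context on the comment it edits, here is a rough sketch of the kind of `solve_subproblem` the tutorial uses at this point. It assumes `HiGHS` as the solver and reuses the tutorial's `n` (number of nodes) and `G` (capacity matrix), so it is a sketch rather than the file's verbatim code:

```julia
using JuMP, HiGHS  # already loaded earlier in the tutorial

function solve_subproblem(x_bar)
    model = Model(HiGHS.Optimizer)
    set_silent(model)
    # Fix the first-stage decision by declaring `x` with fixed bounds.
    @variable(model, x[i in 1:n, j in 1:n] == x_bar[i, j])
    @variable(model, y[1:n, 1:n] >= 0)
    @constraint(model, [i = 1:n, j = 1:n], y[i, j] <= G[i, j] * x[i, j])
    @constraint(model, [i = 2:n-1], sum(y[i, :]) == sum(y[:, i]))
    @objective(model, Min, -sum(y[1, :]))
    optimize!(model)
    # The dual associated with each fixed `x[i, j]` is its reduced cost.
    return (obj = objective_value(model), y = value.(y), π = reduced_cost.(x))
end
```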
@@ -370,11 +369,11 @@ set_silent(subproblem)
@constraint(subproblem, [i = 1:n, j = 1:n], y[i, j] <= G[i, j] * x_copy[i, j])
@constraint(subproblem, [i = 2:n-1], sum(y[i, :]) == sum(y[:, i]))
@objective(subproblem, Min, -sum(y[1, :]))
+subproblem

-# Our function to solve the subproblem is also slightly different. First, we
+# Our function to solve the subproblem is also slightly different because we
# need to fix the value of the `x_copy` variables to the value of `x` from the
-# first-stage problem, and second, we compute the dual using the
-# [`reduced_cost`](@ref) of `x_copy`:
+# first-stage problem:

function solve_subproblem(model, x)
fix.(model[:x_copy], x)
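The hunk is truncated here. A plausible continuation of the in-place function, written out as a sketch (an assumption about the rest of the file, not part of the diff):

```julia
function solve_subproblem(model, x)
    # Fix the copied first-stage variables to the current first-stage value.
    fix.(model[:x_copy], x)
    optimize!(model)
    # As in the first version, the duals are the reduced costs of the fixed copies.
    return (
        obj = objective_value(model),
        y = value.(model[:y]),
        π = reduced_cost.(model[:x_copy]),
    )
end
```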
@@ -426,9 +425,9 @@ inplace_solution == monolithic_solution
# first-stage values of `x`, the subproblem might be infeasible. The solution is
# to add a Benders feasibility cut:
# ```math
-# v_k + u_k^\top (x - x_k) \le 0.
+# v_k + u_k^\top (x - x_k) \le 0
# ```
-# where $u_k$ is an dual unbounded ray of the subproblem, and $v_k$ is the
+# where $u_k$ is a dual unbounded ray of the subproblem and $v_k$ is the
# intercept of the unbounded ray.
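A rough sketch of how the two cut types might be combined inside the cutting-plane loop; `model`, `x`, and `θ` are the first-stage objects from the tutorial, while the helper name and the fields of `ret` are illustrative stand-ins rather than names taken from this diff:

```julia
ret = solve_subproblem_with_feasibility(subproblem, x_k)
if ret.is_feasible
    # Feasible subproblem: add the usual optimality cut θ ≥ obj + πᵀ(x - x_k).
    @constraint(model, θ >= ret.obj + sum(ret.π .* (x .- x_k)))
else
    # Infeasible subproblem: (v, u) describe the dual unbounded ray, so add
    # the feasibility cut v_k + u_kᵀ(x - x_k) ≤ 0 to cut off x_k.
    @constraint(model, ret.v + sum(ret.u .* (x .- x_k)) <= 0)
end
```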

# As a variation of our example which leads to infeasibilities, we add a
@@ -443,6 +442,7 @@ set_silent(model)
@variable(model, θ >= M)
@constraint(model, sum(x) <= 11)
@objective(model, Min, 0.1 * sum(x) + θ)
+model

# But the subproblem has a new constraint that `sum(y) >= 1`:

@@ -457,6 +457,7 @@ set_attribute(subproblem, "presolve", "off")
@constraint(subproblem, [i = 1:n, j = 1:n], y[i, j] <= G[i, j] * x_copy[i, j])
@constraint(subproblem, [i = 2:n-1], sum(y[i, :]) == sum(y[:, i]))
@objective(subproblem, Min, -sum(y[1, :]))
+subproblem

# The function to solve the subproblem now checks for feasibility, and returns
# the dual objective value and a dual unbounded ray if the subproblem is
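The hunk stops mid-sentence here. A sketch of the kind of feasibility-aware solver the comment is describing, following the in-place pattern above (assumed, not shown in the diff):

```julia
function solve_subproblem_with_feasibility(model, x)
    fix.(model[:x_copy], x)
    optimize!(model)
    if is_solved_and_feasible(model; dual = true)
        return (
            is_feasible = true,
            obj = objective_value(model),
            y = value.(model[:y]),
            π = reduced_cost.(model[:x_copy]),
        )
    end
    # With presolve off, an infeasible subproblem returns a dual infeasibility
    # certificate; its objective value and reduced costs give the unbounded ray.
    return (
        is_feasible = false,
        v = dual_objective_value(model),
        u = reduced_cost.(model[:x_copy]),
    )
end
```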