Ensure compatibility with Plasmo 0.5.4 and static partition example #5

Open · wants to merge 3 commits into base: main
62 changes: 62 additions & 0 deletions examples/optimal_control_static_partition.jl
@@ -0,0 +1,62 @@
# Example demonstrating the use of overlap to solve a long horizon control problem.
# Uses a static partitioning of the graph (useful if KaHyPar is not available).
using Plasmo, Ipopt
using SchwarzOpt

T = 8            # number of time points
d = sin.(1:T)    # a disturbance vector
imbalance = 0.1  # partition imbalance (unused in the static partition)
distance = 2     # expand distance
n_parts = 5      # number of partitions

# Create the model (an optigraph)
graph_schwarz = OptiGraph()

@optinode(graph_schwarz, state[1:T])
@optinode(graph_schwarz, control[1:(T-1)])

for (i, node) in enumerate(state)
    @variable(node, x)
    @constraint(node, x >= 0)
    @objective(node, Min, x^2) # - 2*x*d[i]
end
for node in control
    @variable(node, u)
    @constraint(node, u >= -1000)
    @objective(node, Min, u^2)
end
n1 = state[1]
@constraint(n1, n1[:x] == 0)

for i in 1:(T-1)
    @linkconstraint(graph_schwarz, state[i][:x] + control[i][:u] + d[i] == state[i+1][:x])
end

# create a hypergraph object based on the graph
hypergraph, hyper_map = hyper_graph(graph_schwarz)

# Create an equally sized partition based on the number of time points and number of partitions.
N_node = 2 * T - 1
partition_vector = repeat(1:n_parts, inner=N_node ÷ n_parts)
remaining = N_node % n_parts
if remaining > 0
    # assign any leftover nodes to the first partitions, one each
    partition_vector = [partition_vector; 1:remaining]
end

# apply the partition to the graph
partition = Partition(hypergraph, partition_vector, hyper_map)
apply_partition!(graph_schwarz, partition)

# calculate subproblems using the expansion distance
subs = subgraphs(graph_schwarz)
expanded_subgraphs = Plasmo.expand.(graph_schwarz, subs, distance)
sub_optimizer = optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 0)

# optimize using Schwarz overlapping decomposition
optimizer = SchwarzOpt.optimize!(
    graph_schwarz;
    subgraphs=expanded_subgraphs,
    sub_optimizer=sub_optimizer,
    max_iterations=100,
    mu=100.0,
)
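For the example's parameters the static partition comes out exactly balanced; the construction can be checked with Base alone (no Plasmo needed):

```julia
# Base-only check of the partition vector built in the example (T = 8, n_parts = 5).
T = 8
n_parts = 5
N_node = 2 * T - 1                 # 8 state nodes + 7 control nodes = 15
partition_vector = repeat(1:n_parts, inner=N_node ÷ n_parts)
remaining = N_node % n_parts       # 15 % 5 == 0, so no leftover nodes here
if remaining > 0
    # leftover nodes go to the first partitions, one each
    partition_vector = [partition_vector; 1:remaining]
end
println(partition_vector)          # [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4, 5, 5, 5]
```

Each partition thus receives three consecutive nodes in the optigraph's node ordering (states first, then controls).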
32 changes: 21 additions & 11 deletions src/optimizer.jl
Member

@tso-martin: Are the value and dual functions not working as expected when given the graph as an argument in Plasmo v0.5.4? Removing the graph argument should cause issues when running on multiple threads.

Author

I get errors on each thread when value and dual are called with the graph argument. Example:

 nested task error: KeyError: key MOI.VariableIndex(37) not found
    Stacktrace:
      [1] getindex
        @ xxx\.julia\packages\MathOptInterface\yczX1\src\Utilities\CleverDicts.jl:164 [inlined]
      [2] getindex
        @ xxx\.julia\packages\MathOptInterface\yczX1\src\Utilities\copy\index_map.jl:64 [inlined]
      [3] get(node_pointer::Plasmo.NodePointer, attr::MathOptInterface.VariablePrimal, idx::MathOptInterface.VariableIndex)
        @ Plasmo xxx\SchwarzTryout\dev\Plasmo\src\moi_backend_node.jl:287
      [4] value(graph::OptiGraph, var::VariableRef)
        @ Plasmo xxx\SchwarzTryout\dev\Plasmo\src\optigraph.jl:514
      [5] (::SchwarzOpt.var"#25#27"{OptiGraph})(::Pair{Int64, VariableRef})
        @ SchwarzOpt .\none:0
      [6] iterate(g::Base.Generator, s::Vararg{Any})
        @ Base .\generator.jl:47 [inlined]
      [7] _all(f::Base.var"#384#386", itr::Base.Generator{Dict{Int64, VariableRef}, SchwarzOpt.var"#25#27"{OptiGraph}}, ::Colon)
        @ Base .\reduce.jl:1287
      [8] all
        @ .\reduce.jl:1283 [inlined]
      [9] Dict(kv::Base.Generator{Dict{Int64, VariableRef}, SchwarzOpt.var"#25#27"{OptiGraph}})
        @ Base .\dict.jl:111
     [10] _do_iteration(subproblem_graph::OptiGraph)
        @ SchwarzOpt xxx\SchwarzTryout\dev\SchwarzOpt\src\optimizer.jl:305
     [11] macro expansion
        @ xxx\SchwarzTryout\dev\SchwarzOpt\src\optimizer.jl:414 [inlined]
     [12] (::SchwarzOpt.var"#45#threadsfor_fun#32"{SchwarzOpt.var"#45#threadsfor_fun#31#33"{…}})(tid::Int64; onethread::Bool)
        @ SchwarzOpt .\threadingconstructs.jl:214
     [13] #45#threadsfor_fun
        @ SchwarzOpt .\threadingconstructs.jl:181 [inlined]
     [14] (::Base.Threads.var"#1#2"{SchwarzOpt.var"#45#threadsfor_fun#32"{SchwarzOpt.var"#45#threadsfor_fun#31#33"{…}}, Int64})()
        @ Base.Threads .\threadingconstructs.jl:153

Removing the graph argument worked fine across multiple threads.

Member

Interesting, that is definitely a bug. Without the graph argument, value grabs the solution from the last solve the node was used in, so it can take the wrong one if multiple subgraphs use that node. I will take a look at this and get a fix out. Thanks for checking into this!
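The "last solution wins" behavior described above can be illustrated with a Base-only toy model (hypothetical types and functions, not Plasmo's actual implementation): a variable shared by overlapping subproblems keeps only the value written by whichever subproblem solved most recently, while a per-subproblem lookup stays correct.

```julia
# Toy model of the hazard: shared mutable state vs. a per-subproblem store.
mutable struct ToyVar
    last_value::Float64
end

# per-subproblem solution store (stands in for a graph-aware backend)
solutions = Dict{Symbol,Dict{ToyVar,Float64}}()

function solve!(subproblem::Symbol, vars::Vector{ToyVar}, vals::Vector{Float64})
    solutions[subproblem] = Dict(zip(vars, vals))
    for (v, x) in zip(vars, vals)
        v.last_value = x   # overwrites whatever an earlier subproblem wrote
    end
end

toy_value(v::ToyVar) = v.last_value                                  # graph-free query
toy_value(subproblem::Symbol, v::ToyVar) = solutions[subproblem][v]  # graph-aware query

shared = ToyVar(0.0)
solve!(:sub1, [shared], [1.0])   # first subproblem's solution for the shared node
solve!(:sub2, [shared], [2.0])   # overlapping subproblem solves the same node again
```

After both solves, toy_value(shared) returns 2.0 even when sub1's answer is the one wanted, whereas toy_value(:sub1, shared) still returns 1.0; this is why dropping the graph argument only works if each node's last solve is the intended one.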

@@ -291,19 +291,21 @@ end
 function _do_iteration(subproblem_graph::OptiGraph)
     Plasmo.optimize!(subproblem_graph)
     term_status = termination_status(subproblem_graph)
+    # Create label of subproblem_graph by concatenating labels of optinodes
+    label = join(map(x -> x.label, all_nodes(subproblem_graph)), "_")
     !(
         term_status in [
             MOI.TerminationStatusCode(4),
             MOI.TerminationStatusCode(1),
             MOI.TerminationStatusCode(10),
         ]
-    ) && @warn("Suboptimal solution detected for subproblem with status $term_status")
+    ) && @warn("Suboptimal solution detected for subproblem with status $term_status: $label")

     x_sub = subproblem_graph.ext[:x_map] # primal variables to update
     l_sub = subproblem_graph.ext[:l_map] # dual variables to update

-    xk = Dict(key => value(subproblem_graph, val) for (key, val) in x_sub)
-    lk = Dict(key => dual(subproblem_graph, val) for (key, val) in l_sub)
+    xk = Dict(key => value(val) for (key, val) in x_sub)
+    lk = Dict(key => dual(val) for (key, val) in l_sub)

     return xk, lk
 end
@@ -323,13 +325,20 @@ function _calculate_primal_feasibility(optimizer)
         val = 0
         linkcon = constraint_object(linkref)
         terms = linkcon.func.terms
-        for (term, coeff) in terms
-            node = optinode(term)
-            graph = optimizer.node_subgraph_map[node]
-            subproblem_graph = optimizer.expanded_subgraph_map[graph]
-            val += coeff * value(subproblem_graph, term)
+        try
+            for (term, coeff) in terms
+                node = optinode(term)
+                graph = optimizer.node_subgraph_map[node]
+                subproblem_graph = optimizer.expanded_subgraph_map[graph]
+                val += coeff * value(term)
+            end
+            push!(prf, val - linkcon.set.value)
+        catch
+            println("Error in calculating primal feasibility for linkconstraint: $linkcon")
+            println("linkcon.set: $(linkcon.set)")
+            println("linkcon.func: $(linkcon.func)")
+            println("linkcon.func.terms: $(linkcon.func.terms)")
         end
-        push!(prf, val - linkcon.set.value)
     end
     return prf
 end
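The residual accumulated per link constraint in the hunk above is just the sum of coefficient-weighted variable values minus the constraint's set value. A Base-only sketch with made-up numbers standing in for optinode variables (names and values are hypothetical, not from the source):

```julia
# Per-link-constraint primal residual: sum(coeff * value(term)) - set value.
coeffs = Dict("state1_x" => 2.0, "control1_u" => -1.0)  # linkcon.func.terms analogue
primal = Dict("state1_x" => 1.5, "control1_u" => 0.5)   # current subproblem values
rhs = 2.5                                               # linkcon.set.value analogue
residual = sum(c * primal[t] for (t, c) in coeffs) - rhs
println(residual)   # 0.0 when the link constraint is satisfied
```

A residual of zero for every link constraint means the subproblem solutions agree across the overlap, which is the convergence signal the Schwarz loop monitors.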
@@ -403,7 +412,8 @@ function optimize!(optimizer::Optimizer)

     #Do iteration for each subproblem
     optimizer.timers.update_subproblem_time += @elapsed begin
-        for subproblem_graph in optimizer.subproblem_graphs
+        for (i_sp, subproblem_graph) in enumerate(optimizer.subproblem_graphs)
+            println("Updating subproblem $i_sp")
             _update_subproblem!(optimizer, subproblem_graph)
         end
     end
@@ -457,7 +467,7 @@ function optimize!(optimizer::Optimizer)
             JuMP.set_start_value.(
                 Ref(subproblem),
                 all_variables(subproblem),
-                value.(Ref(subproblem), all_variables(subproblem)),
+                value.(all_variables(subproblem)),
             )
         end
     end