Refactor GenericNonlinearExpr and NonlinearOperator constructors #3489
Conversation
Codecov Report
Patch coverage:
Additional details and impacted files

@@            Coverage Diff             @@
##           master    #3489      +/-   ##
==========================================
- Coverage   98.11%   98.11%   -0.01%
==========================================
  Files          37       37
  Lines        5531     5529       -2
==========================================
- Hits         5427     5425       -2
  Misses        104      104
I like this.
For future reference, here were the functions I removed:

function variable_ref_type(
    ::Type{GenericNonlinearExpr},
    ::AbstractArray{T},
) where {T<:AbstractJuMPScalar}
    return variable_ref_type(T)
end

moi_function(x::AbstractArray) = moi_function.(x)

jump_function(model::GenericModel, x::AbstractArray) = jump_function.(model, x)
function test_nonlinear_operator_vector_args()
    model = Model()
    @variable(model, x[1:2, 1:2])
    op_det = NonlinearOperator(LinearAlgebra.det, :det)
    # Check that the return type is inferrable for both symbolic and
    # numeric matrix arguments:
    @inferred log(op_det(x))
    z = [2.0 1.0; 1.0 2.0]
    @inferred log(op_det(z))
    @objective(model, Min, log(op_det(x)))
    @test isequal_canonical(objective_function(model), log(op_det(x)))
    # The matrix argument is stored as a single array-valued argument of
    # the :det node:
    f = MOI.get(model, MOI.ObjectiveFunction{MOI.ScalarNonlinearFunction}())
    g = MOI.ScalarNonlinearFunction(:det, Any[index.(x)])
    h = MOI.ScalarNonlinearFunction(:log, Any[g])
    @test f ≈ h
    @test moi_function(log(op_det(x))) ≈ h
    return
end
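For context, a minimal sketch of the array behavior those removed fallbacks provided (`moi_function` and `jump_function` are internal JuMP functions; the variable names here are illustrative, and these calls error once the fallbacks are removed):

using JuMP
model = Model()
@variable(model, x[1:2])
# With the array fallbacks above in scope, these broadcast element-wise:
fs = JuMP.moi_function(x)      # equivalent to JuMP.moi_function.(x)
JuMP.jump_function(model, fs)  # equivalent to JuMP.jump_function.(model, fs)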
function GenericNonlinearExpr(head::Symbol, args::Vector{Any})
    index = findfirst(Base.Fix2(isa, AbstractJuMPScalar), args)
    if index === nothing
        error(
            "Unable to create a nonlinear expression because it did not " *
            "contain any JuMP scalars. head = $head, args = $args.",
        )
    end
    return new{variable_ref_type(args[index])}(head, args)
end
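For readers following along, a small sketch of what this inner constructor does (assuming a plain `Model` with the standard `VariableRef`; the names below are illustrative):

using JuMP
model = Model()
@variable(model, x)
# `findfirst` locates the first AbstractJuMPScalar in `args`; its
# `variable_ref_type` supplies the expression's type parameter:
f = GenericNonlinearExpr(:sin, Any[1.0, x])  # a GenericNonlinearExpr{VariableRef}
# With no JuMP scalar in `args`, the error branch above fires instead:
# GenericNonlinearExpr(:sin, Any[1.0])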
I understand the motivation for this. Unfortunately, it will break InfiniteOpt, but I can look into refactoring the workflow. If supporting array arguments necessitates this change, then I support it.
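To make the downstream concern concrete, a hypothetical sketch (these names are invented for illustration, not InfiniteOpt's actual types) of how a package with its own scalar type hooks into the inference path via `variable_ref_type`:

using JuMP
# Hypothetical downstream types, skeletons only:
struct MyVariableRef <: JuMP.AbstractVariableRef end
struct MyScalar <: JuMP.AbstractJuMPScalar end
# Opting into type inference by extending variable_ref_type:
JuMP.variable_ref_type(::MyScalar) = MyVariableRef
# With that method defined, the constructor above can infer the type
# parameter from a `MyScalar` argument; without it, inference fails.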
We can probably just keep this method then. It's not necessary. Let me make some changes.
Try this
I confirmed that this works with InfiniteOpt: infiniteopt/InfiniteOpt.jl#321
Thanks, you beat me to it :)
Alternative for #3451
If we get rid of the "let's check if we can infer the variable type" logic, then a lot of things become much easier.
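As a sketch of the trade-off being discussed (assuming JuMP's documented parameterized constructor `GenericNonlinearExpr{V}`; this is not code from the PR): inference only works when an `AbstractJuMPScalar` is present in the arguments, whereas the explicitly parameterized constructor sidesteps inference entirely.

using JuMP
model = Model()
@variable(model, x)
GenericNonlinearExpr(:log, Any[x])                 # inferred: {VariableRef}
# GenericNonlinearExpr(:log, Any[2.0])             # errors: nothing to infer from
GenericNonlinearExpr{VariableRef}(:log, Any[2.0])  # no inference needed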