Support Function Calls #119
Comments
I have found that functionality for this exists for NLP using @NLconstraint(). It would be interesting to see this extended to GDP:
m = Model(Ipopt.Optimizer)
@variable(m, x, start=1.0)
@variable(m, shiftx)
function call_cpp_function(x, shiftx, shifty)
    # ... (body omitted)
end
function derivative(grad, x, shiftx, shifty)
    # ... (body omitted)
end
register(m, :call_cpp_function, 3, call_cpp_function, derivative, autodiff=false)
@objective(m, Min, y)
print("Value of y: ", value(y))
Hi @lukasscheffold, since DisjunctiveProgramming is a JuMP extension, it inherits all the same features that JuMP has.
This already works with DisjunctiveProgramming. It is a paradigm called function tracing, see https://jump.dev/JuMP.jl/stable/manual/nonlinear/#Function-tracing.
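For illustration, a minimal sketch of function tracing with a GDPModel (the function, variable names, and bounds are illustrative assumptions, not from this thread): the Julia function is called directly on a JuMP variable, and the traced algebraic expression is stored like any hand-written constraint, so no registration is needed.
using JuMP, DisjunctiveProgramming

m = GDPModel()
@variable(m, 0 <= x <= 10)

# A plain Julia function built only from operations JuMP understands.
quadratic(z) = (z + 2)^2 + 1

# Tracing: quadratic(x) evaluates to a quadratic expression in x,
# which is recorded as an ordinary algebraic constraint/objective.
@constraint(m, quadratic(x) <= 26)
@objective(m, Min, quadratic(x))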
Registering functions is the way to add external functions to JuMP models. DisjunctiveProgramming will let you use user-defined operators to build your model; however, once the model is reformulated, you will end up with a MINLP containing user-defined operators, which most (if not all) MINLP solvers do not support. Although, maybe Juniper will... This limitation is not unique to JuMP models; it also applies if you try to use an external function with Pyomo.gdp.
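As a hedged sketch of what operator registration looks like with the current JuMP API (the function, its derivative, and the choice of Ipopt are illustrative assumptions, not taken from this thread):
using JuMP, Ipopt

m = Model(Ipopt.Optimizer)
@variable(m, x, start = 1.0)

# Hypothetical black-box function and its hand-coded first derivative;
# in practice f could wrap a call into an external C++ library.
f(z) = (z + 2.0)^2 + 1.0
df(z) = 2.0 * (z + 2.0)

# Register f as a user-defined operator so JuMP treats it as opaque
# and uses the supplied derivative instead of tracing or autodiff.
@operator(m, op_f, 1, f, df)

@objective(m, Min, op_f(x))
optimize!(m)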
Thank you for your response. I have tried the @operator implementation, first with Ipopt and a regular JuMP model (which works!). Afterwards I proceeded with the GDPModel, using AMPL's Bonmin (which works with Pyomo.gdp and external function calls) and HiGHS. The HiGHS solver throws an error saying that it is not capable of evaluating user-defined nonlinear functions. The AMPL Bonmin solver, however, throws the following error:
Hi @lukasscheffold, this is likely a limitation of how AmplNLWriter.jl (the Julia package) interfaces with Bonmin. Any thoughts, @odow? In any case, the ability of mixed-integer solvers like Bonmin to support user-defined operators is not determined by DisjunctiveProgramming.jl, but rather by the JuMP (or, more specifically, the MathOptInterface) ecosystem and the solver itself. All DisjunctiveProgramming does is use GDP modelling objects to automate the creation of a MILP or MINLP JuMP model. How the resulting MILP/MINLP model is solved is determined solely by JuMP.
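To make that division of labor concrete, here is a rough sketch assuming the current DisjunctiveProgramming.jl API (Logical variables, Disjunct constraint tags, @disjunction, and a gdp_method keyword on optimize!); the variables, bounds, and Big-M reformulation choice are illustrative only, so check the package docs for the exact calls.
using DisjunctiveProgramming, HiGHS

m = GDPModel(HiGHS.Optimizer)
@variable(m, 0 <= x <= 10)
@variable(m, Y[1:2], Logical)            # GDP modelling objects
@constraint(m, x <= 3, Disjunct(Y[1]))
@constraint(m, x >= 7, Disjunct(Y[2]))
@disjunction(m, Y)
@objective(m, Max, x)

# DisjunctiveProgramming reformulates the GDP objects into an ordinary
# MILP (Big-M here); solving that MILP is handled entirely by JuMP + HiGHS.
optimize!(m, gdp_method = BigM())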
AmplNLWriter does not support user-defined functions. |
@lukasscheffold, since this issue is not directly related to DisjunctiveProgramming, I am going to close it. To request support of user-defined operators for Bonmin, you can raise an issue with https://github.com/jump-dev/AmplNLWriter.jl. |
I have no plans to support this feature in AmplNLWriter. |
It would be useful to be able to embed function calls in constraints. Pyomo has similar functionality under its external() module, where you can supply a function call, such as a function or an external library (DLL) call, to calculate a value. This value, along with its derivatives, can then be used in a constraint or logical constraint. I am thinking of functionality like the following:
using DisjunctiveProgramming
using HiGHS
m = GDPModel(HiGHS.Optimizer)
@variable(m, 0 <= a)
function quadratic(x)
    return (x + 2)^2 + 1
end
@constraint(m, 0 == quadratic(a))