Describe the bug 🐞

In some situations (such as needing precise control over Enzyme, or having hand-written Hessians and gradients), I would like to avoid using any `AutoX()` AD backend and instead provide explicit `grad`, `hess`, and `hv` functions.
When I do so, the following warning is emitted:
┌ Warning: The selected optimization algorithm requires second order derivatives, but `SecondOrder` ADtype was not provided.
│ So a `SecondOrder` with SciMLBase.NoAD() for both inner and outer will be created, this can be suboptimal and not work in some cases so
│ an explicit `SecondOrder` ADtype is recommended.
└ @ OptimizationBase ~/.julia/packages/OptimizationBase/gvXsf/src/cache.jl:49
This warning is erroneous, as nowhere in Optimization.jl is it stated that an explicit AD backend is ever necessary.
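(For context, what the warning asks for is an explicit second-order ADtype along these lines, which is precisely the kind of AD configuration I want to avoid; the backend choices here are just an example:)

```julia
using ADTypes

# Forward-over-reverse: the outer backend differentiates the gradient produced by the inner one.
adtype = ADTypes.SecondOrder(AutoForwardDiff(), AutoReverseDiff())
```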
Expected behavior
I would expect this warning to be thrown only if there is no explicit `SecondOrder` ADtype AND no explicit Hessian/HV function was provided.
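A minimal sketch of the kind of setup I mean (the Rosenbrock objective, the hand-written derivatives, and the `Newton()` choice are illustrative stand-ins, not my actual code):

```julia
using Optimization, OptimizationOptimJL, SciMLBase

# Objective plus hand-written in-place derivatives, following the
# grad(G, u, p) / hess(H, u, p) convention of OptimizationFunction.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

function rosen_grad!(G, u, p)
    G[1] = -2 * (p[1] - u[1]) - 4 * p[2] * u[1] * (u[2] - u[1]^2)
    G[2] = 2 * p[2] * (u[2] - u[1]^2)
    return nothing
end

function rosen_hess!(H, u, p)
    H[1, 1] = 2 - 4 * p[2] * (u[2] - 3 * u[1]^2)
    H[1, 2] = -4 * p[2] * u[1]
    H[2, 1] = H[1, 2]
    H[2, 2] = 2 * p[2]
    return nothing
end

# No AutoX() backend: derivatives are supplied explicitly.
optf = OptimizationFunction(rosenbrock, SciMLBase.NoAD();
                            grad = rosen_grad!, hess = rosen_hess!)
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, Newton())  # second-order algorithm, hand-written Hessian
```

The point is just the shape of the call: `SciMLBase.NoAD()` together with explicit `grad`/`hess` keyword arguments, handed to a second-order optimizer.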
Error & Stacktrace ⚠️
┌ Warning: The selected optimization algorithm requires second order derivatives, but `SecondOrder` ADtype was not provided.
│ So a `SecondOrder` with SciMLBase.NoAD() for both inner and outer will be created, this can be suboptimal and not work in some cases so
│ an explicit `SecondOrder` ADtype is recommended.
└ @ OptimizationBase ~/.julia/packages/OptimizationBase/gvXsf/src/cache.jl:49
retcode: Success
u: 2-element Vector{Float64}:
 0.999999999999108
 0.9999999999981819
Environment (please complete the following information):
Output of using Pkg; Pkg.status()
Status `~/sync/Projects/Julia/julia_cal/Project.toml`
[47edcb42] ADTypes v1.11.0
[6e4b80f9] BenchmarkTools v1.5.0
[052768ef] CUDA v5.5.2
[7da242da] Enzyme v0.13.22
[f6369f11] ForwardDiff v0.10.38
[f67ccb44] HDF5 v0.17.2
[b6b21f68] Ipopt v1.7.0
[c3a54625] JET v0.9.12
[0fc2ff8b] LeastSquaresOptim v0.8.6
[33e6dc65] MKL v0.7.0
⌅ [1cead3c2] Manifolds v0.9.18
[15e1cf62] NPZ v0.4.3
[8913a72c] NonlinearSolve v4.2.0
[429524aa] Optim v1.10.0
[3bd65402] Optimisers v0.4.2
[7f7a1694] Optimization v4.0.5
[fd9f6733] OptimizationMOI v0.5.1
[e57b7fff] OptimizationManopt v0.0.4
[4e6fcdb7] OptimizationNLopt v0.3.2
[36348300] OptimizationOptimJL v0.4.1
[42dfb2eb] OptimizationOptimisers v0.3.6
[91a5bcdd] Plots v1.40.9
[37e2e3b7] ReverseDiff v1.15.3
[9f842d2f] SparseConnectivityTracer v0.6.9
[10745b16] Statistics v1.11.1
[2913bbd2] StatsBase v0.34.3
[4297ee4d] SymbolicAnalysis v0.3.0
[0c5d862f] Symbolics v6.22.0
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated`
Output of using Pkg; Pkg.status(; mode = PKGMODE_MANIFEST)
Additional context
I notice this same error is being thrown in the docs as well, such as here.

I guess you could always create your own `AutoManual <: AbstractADType` and implement everything necessary for DifferentiationInterface to work.
Related issue: JuliaDiff/DifferentiationInterface.jl#634
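For reference, the skeleton of that workaround might look something like the following (assuming ADTypes v1's `mode` trait; the DifferentiationInterface operator methods that would actually have to be implemented on top of it are omitted):

```julia
using ADTypes

# A hand-rolled "backend" type that opts into the ADTypes type hierarchy.
struct AutoManual <: ADTypes.AbstractADType end

# Declare a differentiation mode so downstream code can dispatch on it.
ADTypes.mode(::AutoManual) = ADTypes.ForwardMode()
```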