[Feature Request] Make torchopt.optim.Optimizer compatible with pytorch lightning #203
Labels: enhancement
Motivation
Currently, the `torchopt.optim` classes aren't compatible with Lightning's `configure_optimizers`. This is because Lightning doesn't consider them `Optimizable`: to satisfy the `Optimizable` protocol, an optimizer needs `defaults` and `state` attributes. If you simply set those two attributes on the optimizer instance (see the sketch below), then `isinstance(optimizer, Optimizable)` passes and torchopt <> Lightning works like a charm 😍
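A minimal sketch of that workaround, assuming a `torchopt.Adam` optimizer and Lightning's `Optimizable` protocol imported from `lightning.fabric.utilities.types` (the import path can differ between Lightning versions); the model and hyperparameters are placeholders:

```python
import torch
import torchopt
from lightning.fabric.utilities.types import Optimizable  # path may vary by Lightning version

model = torch.nn.Linear(4, 2)
optimizer = torchopt.Adam(model.parameters(), lr=1e-3)

# torchopt optimizers don't expose these attributes, so Lightning's
# runtime-checkable Optimizable protocol rejects them.
print(isinstance(optimizer, Optimizable))  # False

# Setting the two missing attributes on the instance is enough for the
# structural check to pass, so the optimizer can be returned from
# configure_optimizers().
optimizer.defaults = {}
optimizer.state = {}

print(isinstance(optimizer, Optimizable))  # True
```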
Solution
Can we add `defaults` and `state` attributes to the `torchopt.optim.Optimizer` class itself? (A rough sketch of what that could look like is below.)
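For illustration only, here is a hypothetical sketch of what that would amount to, written as a user-side subclass of `torchopt.Adam` rather than a change to torchopt internals; the class name and the attribute values (a dict of the passed hyperparameters for `defaults`, a `defaultdict` for `state`) are assumptions borrowed from `torch.optim.Optimizer`'s conventions:

```python
from collections import defaultdict

import torchopt


class LightningAdam(torchopt.Adam):
    """torchopt.Adam plus the attributes Lightning's Optimizable protocol expects."""

    def __init__(self, params, **kwargs):
        super().__init__(params, **kwargs)
        # Roughly what the feature request asks torchopt.optim.Optimizer to do
        # itself: expose `defaults` and `state` like torch.optim.Optimizer does.
        self.defaults = dict(kwargs)    # hyperparameters passed to the optimizer
        self.state = defaultdict(dict)  # per-parameter state container
```

An instance such as `LightningAdam(model.parameters(), lr=1e-3)` should then pass the `Optimizable` check and be returnable from `configure_optimizers` without the per-instance patching shown above.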
Alternatives
No response
Additional context
No response