decouple the lr scheduler and optimizer? #36
Comments
I would like to second this. A split into a Ranger optimizer and a Ranger scheduler would be really cool.
Hi @hiyyg and @neuronflow,
This is what I initially had in mind. Maybe, just maybe, the Ranger optimizer should go hand in hand with a Ranger scheduler, following the standard PyTorch conventions?
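For illustration, here is a minimal sketch of what a standalone Ranger-style schedule could look like as an ordinary PyTorch scheduler, so it could be paired with any optimizer following the usual conventions. The warmup/warm-down fractions and the helper name `make_warmup_warmdown` are placeholders, not Ranger21's actual defaults or API:

```python
# Hypothetical sketch: a linear warmup + flat + linear warm-down schedule
# expressed as a standalone torch.optim.lr_scheduler.LambdaLR.
# The 0.22 / 0.28 fractions are illustrative only.
import torch

def make_warmup_warmdown(optimizer, total_steps, warmup_frac=0.22, warmdown_frac=0.28):
    warmup_steps = int(total_steps * warmup_frac)
    warmdown_start = int(total_steps * (1.0 - warmdown_frac))

    def lr_lambda(step):
        if step < warmup_steps:
            # linear warmup from near 0 up to the base lr
            return (step + 1) / warmup_steps
        if step >= warmdown_start:
            # linear warm-down from the base lr toward 0
            remaining = total_steps - step
            return max(remaining / (total_steps - warmdown_start), 0.0)
        return 1.0  # flat phase at the base lr

    return torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)
```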
Hi @lessw2020, apparently in this current implementation there is no way to have different parameters learn with different learning rates. Did I get that right? If this were available, I would love to use it. Two use cases are the following:
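For reference, this is the standard PyTorch parameter-group mechanism being asked about, shown here with AdamW since (per the comment above) Ranger21 does not currently expose it; `ToyModel` and the learning rates are just placeholders:

```python
# Minimal sketch of per-parameter-group learning rates in stock PyTorch.
import torch
import torch.nn as nn

class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(128, 64)  # stand-in for a pretrained encoder
        self.head = nn.Linear(64, 10)       # stand-in for a freshly initialized head

model = ToyModel()

# One parameter group per sub-module, each with its own learning rate.
optimizer = torch.optim.AdamW(
    [
        {"params": model.backbone.parameters(), "lr": 1e-5},
        {"params": model.head.parameters(), "lr": 1e-3},
    ],
    weight_decay=1e-4,
)
```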
Hi @fmellomascarenhas, @neuronflow and @hiyyg - I fully agree with all the points above (decoupled scheduler and parameter groups).
Hi @lessw2020, thanks for the very nice work!
I noticed that in Ranger21 the optimizer is tightly coupled with the lr scheduler. Could you guide me on how I can decouple them?
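For context, this is a minimal sketch of the conventional decoupled pattern in stock PyTorch, where the optimizer only applies parameter updates and a separate scheduler object owns the learning-rate policy; AdamW and CosineAnnealingLR are stand-ins here, not Ranger21's API:

```python
# Conventional decoupled optimizer/scheduler wiring in PyTorch.
import torch
import torch.nn as nn

model = nn.Linear(16, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=1000)

for step in range(1000):
    x = torch.randn(8, 16)
    loss = model(x).pow(2).mean()  # dummy objective for illustration
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()   # parameter update only
    scheduler.step()   # lr update handled separately
```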