add methods #18
Comments
Upd: has been added via this commit.
Some problems with installation of the latest version of
Is there a pull request for this? Would be nice to collaborate.
Hi, we are deploying it to the
Some useful settings:
Adam-mini. Note: I use
TODO: update partition names.
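For context, a minimal sketch of how Adam-mini is typically wired up, assuming the `adam_mini` package from the zyushun/Adam-mini repo; the constructor arguments (including the name-based partitioning via `named_parameters`, which is presumably what the TODO above refers to) are from memory and should be checked against the repo:

```python
# Minimal Adam-mini hookup sketch (assumes `pip install adam-mini`;
# argument names are from memory and may differ from the current repo).
import torch
from adam_mini import Adam_mini

model = torch.nn.Linear(512, 512)  # stand-in for a real transformer

optimizer = Adam_mini(
    named_parameters=model.named_parameters(),  # partitions are chosen by parameter name
    lr=1e-3,
    betas=(0.9, 0.95),
    weight_decay=0.1,
    dim=512,    # hidden size, used to split attention params per head
    n_heads=8,
)

loss = model(torch.randn(4, 512)).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```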
Hi, I'll add Sophia and Adafactor.
Hello! Super, just develop this in your branch and then PR to
Note: in the official repository, they do not show SophiaH (with Hutchinson's preconditioner), only SophiaG. We want to have both methods here. SophiaH is nicely implemented in optax for now, but it's not so hard to write in PyTorch, see: this link. Thx)
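To make the SophiaH/SophiaG distinction concrete, here is a minimal standalone sketch of Hutchinson's diagonal-Hessian estimator in PyTorch (illustrative only, not the optimizer itself; the helper name is ours):

```python
# Sketch of Hutchinson's diagonal-Hessian estimator used by SophiaH
# (illustrative; not the optimizer itself).
import torch

def hutchinson_diag_hessian(loss, params):
    """Estimate diag(H) as E[u * (H u)] with u ~ Rademacher(+-1)."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    us = [torch.randint_like(p, 0, 2) * 2.0 - 1.0 for p in params]  # +-1 entries
    # Hessian-vector product via the gradient of <grads, u>
    hvps = torch.autograd.grad(
        sum((g * u).sum() for g, u in zip(grads, us)), params
    )
    return [u * hvp for u, hvp in zip(us, hvps)]

# usage: per-coordinate estimates feed the EMA Sophia keeps as its preconditioner
model = torch.nn.Linear(8, 1)
loss = model(torch.randn(16, 8)).pow(2).mean()
h_diag = hutchinson_diag_hessian(loss, list(model.parameters()))
```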
Hi, Bristen is back early, so I'll get back to that. I did some research on Sophia, though; main findings:
Adafactor is simple; it's already close to being released officially, see pytorch/pytorch#129905. When I next get some time I'll return to this if you haven't.
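A sketch of what that would look like once it lands, assuming a PyTorch build recent enough to ship `torch.optim.Adafactor`; the lr value is illustrative:

```python
# Sketch assuming a PyTorch build where torch.optim.Adafactor has landed
# (see pytorch/pytorch#129905); the lr value here is illustrative, not tuned.
import torch

model = torch.nn.Linear(128, 128)
opt = torch.optim.Adafactor(model.parameters(), lr=1e-2)

loss = model(torch.randn(2, 128)).pow(2).mean()
loss.backward()
opt.step()
opt.zero_grad()
```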
The Muon optimizer should also be a good one to add. I think @doikov might be interested in that one too:
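For reference, the core of Muon is momentum plus orthogonalization of each 2D update via a Newton-Schulz iteration; a standalone sketch, with coefficients as I remember them from Keller Jordan's reference implementation:

```python
# Sketch of the Newton-Schulz orthogonalization at the heart of Muon
# (coefficients recalled from the reference implementation; illustrative).
import torch

def newton_schulz_orthogonalize(G, steps=5, eps=1e-7):
    """Approximately map G to the nearest semi-orthogonal matrix."""
    a, b, c = 3.4445, -4.7750, 2.0315
    X = G / (G.norm() + eps)  # scale so singular values are at most ~1
    if G.size(0) > G.size(1):
        X = X.T
    for _ in range(steps):
        A = X @ X.T
        X = a * X + (b * A + c * A @ A) @ X
    if G.size(0) > G.size(1):
        X = X.T
    return X

# a Muon step would apply this to the momentum buffer of each 2D weight
G = torch.randn(64, 32)
O = newton_schulz_orthogonalize(G)
```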
Once we have a handful, we'll have a nice benchmark collection for LLM optimizers; probably worth a small writeup soon.
Yes, I am working on that. Btw
I mean, for the official version of SophiaG, you may just look at the paper's repo: https://github.com/Liuhong99/Sophia
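The repo's README shows roughly the training-loop pattern below (reproduced from memory, so treat the exact signatures and defaults as assumptions to verify against the repo):

```python
# Rough usage pattern for SophiaG as shown in the paper's repo README
# (from memory: exact signatures and defaults are assumptions).
import torch
import torch.nn.functional as F
from sophia import SophiaG  # from https://github.com/Liuhong99/Sophia

# toy stand-ins so the sketch runs end to end
model = torch.nn.Linear(32, 10)
loader = [(torch.randn(8, 32), torch.randint(0, 10, (8,))) for _ in range(20)]

optimizer = SophiaG(model.parameters(), lr=2e-4,
                    betas=(0.965, 0.99), rho=0.01, weight_decay=0.1)

k = 10  # refresh the Hessian EMA every k steps
for step, (x, y) in enumerate(loader):
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step(bs=x.size(0))  # batch size enters the update clipping
    optimizer.zero_grad(set_to_none=True)
    if step % k == k - 1:
        # Gauss-Newton-Bartlett Hessian estimate: backprop a loss against
        # labels sampled from the model's own logits
        logits = model(x)
        y_sample = torch.distributions.Categorical(logits=logits).sample()
        F.cross_entropy(logits, y_sample).backward()
        optimizer.update_hessian()
        optimizer.zero_grad(set_to_none=True)
```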
- SOAP
- Muon
- Adam-mini
- Lion
- Sophia
- AdEMAMix
- Schedule-Free
- Adafactor
- Signum, signSGD
- Prodigy
- SGDF
- LAMB
- MARS (with grad calculation in a different stochasticity)