- Deprecate RAdam optimizer.
- Revert "Drop RAdam optimizer".
- Drop RAdam optimizer, since it is now included in PyTorch.
- Do not include tests in the installable package.
- Preserve memory layout where possible.
- Add MADGRAD optimizer.
- Initial release.
- Added support for A2GradExp, A2GradInc, A2GradUni, AccSGD, AdaBelief, AdaBound, AdaMod, Adafactor, Adahessian, AdamP, AggMo, Apollo, DiffGrad, Lamb, Lookahead, NovoGrad, PID, QHAdam, QHM, RAdam, Ranger, RangerQH, RangerVA, SGDP, SGDW, SWATS, Shampoo, Yogi.
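
All of the optimizers listed above follow the standard `torch.optim` interface, so they can be swapped in wherever a built-in PyTorch optimizer is used. Below is a minimal sketch using DiffGrad on a toy model; the `torch_optimizer` import name is an assumption here, not confirmed by the changelog itself.

```python
# Minimal sketch: training step with one of the bundled optimizers.
# Assumes the package is importable as `torch_optimizer`.
import torch
import torch_optimizer as optim

model = torch.nn.Linear(10, 1)
optimizer = optim.DiffGrad(model.parameters(), lr=1e-3)

x = torch.randn(4, 10)
loss = model(x).pow(2).mean()

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because each optimizer is a drop-in `torch.optim.Optimizer` subclass, replacing `DiffGrad` with any other name from the list (e.g. `Yogi` or `AdaBound`) should only require changing the constructor call and its hyperparameters.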