Is your feature request related to a problem? Please describe:
LoRA is designed to reduce the memory required to fine-tune LLMs and already exists in Fairseq2 to some capacity. VeRA (Vector-based Random Matrix Adaptation) reduces the memory overhead of vanilla LoRA even further: instead of training a pair of low-rank matrices per layer, it freezes a single pair of randomly initialized matrices shared across all layers and trains only small per-layer scaling vectors, each treated as a diagonal matrix multiplied between the frozen matrices.
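For context, a minimal sketch of the VeRA update in PyTorch. The `VeRALinear` name and its attributes are illustrative, not an existing Fairseq2 API; per the paper, `b` starts at zero so the adapter is initially a no-op:

```python
import torch
import torch.nn as nn


class VeRALinear(nn.Module):
    """Sketch of a VeRA-adapted linear layer: y = W x + diag(b) B diag(d) A x."""

    def __init__(self, wrapped: nn.Linear, r: int, d_init: float = 0.1) -> None:
        super().__init__()
        self.wrapped = wrapped
        out_dim, in_dim = wrapped.weight.shape

        # Frozen random projections; the paper shares one pair across all
        # adapted layers, which is what makes VeRA cheaper than LoRA.
        self.vera_A = nn.Parameter(torch.empty(r, in_dim), requires_grad=False)
        self.vera_B = nn.Parameter(torch.empty(out_dim, r), requires_grad=False)
        nn.init.kaiming_uniform_(self.vera_A)
        nn.init.kaiming_uniform_(self.vera_B)

        # Trainable scaling vectors: d scales the rank dimension, b the
        # output dimension. b starts at zero so the delta is zero initially.
        self.vera_d = nn.Parameter(torch.full((r,), d_init))
        self.vera_b = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The diagonal matrices are realized as elementwise scaling.
        delta = self.vera_b * ((self.vera_d * (x @ self.vera_A.T)) @ self.vera_B.T)
        return self.wrapped(x) + delta
```

Only `vera_d` and `vera_b` receive gradients, so the trainable parameter count per layer is `r + out_dim` rather than LoRA's `r * (in_dim + out_dim)`.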
Describe the solution you would like:
There should be a separate class to wrap models with VeRA, like the one that currently exists for LoRA, and a premade recipe would also be nice to have.
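A hedged sketch of what the wrapping entry point could look like, by analogy with the existing LoRA wrapper. The `VeRAConfig` and `wrap_vera` names (and the `keys` field) are hypothetical, and this reuses the `VeRALinear` sketch above:

```python
import re
from dataclasses import dataclass, field

import torch.nn as nn


@dataclass
class VeRAConfig:
    r: int = 8
    d_init: float = 0.1
    # Regex patterns selecting which nn.Linear submodules to adapt.
    keys: list[str] = field(default_factory=lambda: [r".*q_proj$", r".*v_proj$"])


def wrap_vera(model: nn.Module, config: VeRAConfig) -> nn.Module:
    """Replace matching nn.Linear submodules with VeRALinear wrappers in place."""
    for parent_name, parent in list(model.named_modules()):
        for child_name, child in list(parent.named_children()):
            full_name = f"{parent_name}.{child_name}" if parent_name else child_name
            if isinstance(child, nn.Linear) and any(
                re.match(p, full_name) for p in config.keys
            ):
                setattr(parent, child_name, VeRALinear(child, config.r, config.d_init))
    return model
```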
Describe the alternatives you have considered:
None
Additional Context:
None