Question: transformers version / diff

Which transformers version does FlexAttention depend on, and what is the diff against the official release?

Answer:

Hi, FlexAttention is currently implemented in a specific fork of transformers, and you need to install that version. It is mostly identical to v4.37.2, except for transformers/src/transformers/models/llama/modeling_llama.py:

https://github.com/UMass-Foundation-Model/FlexAttention/tree/main/transformers
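If you want to confirm for yourself that the fork only diverges from upstream in modeling_llama.py, one option is a quick file-by-file comparison of the two source trees. The sketch below is illustrative only (not from the repo), and the two checkout paths are assumptions you would replace with your own:

```python
import filecmp
import os

# Paths are assumptions: point these at the FlexAttention fork's source tree
# and at a local checkout of the official transformers v4.37.2 release.
FORK_SRC = "FlexAttention/transformers/src/transformers"
UPSTREAM_SRC = "transformers-4.37.2/src/transformers"

# Walk the fork and report every file whose contents differ from upstream.
for root, _dirs, files in os.walk(FORK_SRC):
    rel = os.path.relpath(root, FORK_SRC)
    for name in files:
        fork_file = os.path.join(root, name)
        upstream_file = os.path.join(UPSTREAM_SRC, rel, name)
        if not os.path.exists(upstream_file):
            print(f"only in fork: {os.path.join(rel, name)}")
        elif not filecmp.cmp(fork_file, upstream_file, shallow=False):
            print(f"differs:      {os.path.join(rel, name)}")

# If the fork tracks v4.37.2 as described above, the only difference
# reported should be models/llama/modeling_llama.py.
```

For installation, the usual transformers source install should apply (e.g. `pip install -e ./transformers` from the repo root), though check the repo's README for the exact steps.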