The FlexAttention and high-resolution feature selection modules are implemented in transformers/src/transformers/models/llama/modeling_llama.py, within the self-attention implementation. For example, see this line.
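For intuition only, here is a minimal, self-contained sketch of the general idea behind attending over a selected subset of high-resolution features: standard scaled dot-product attention computed only over the key/value positions a selection mask keeps. This is a hypothetical illustration, not the repository's actual implementation (which, per the answer above, lives inside the self-attention code in modeling_llama.py); the function name and the boolean-mask interface are assumptions for the sketch.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_with_selection(q, keys, values, selected):
    """Scaled dot-product attention restricted to selected positions.

    q: query vector of length d
    keys, values: lists of vectors of length d
    selected: boolean mask; only positions marked True participate
    (a hypothetical stand-in for high-resolution feature selection)
    """
    d = len(q)
    idx = [i for i, keep in enumerate(selected) if keep]
    # Attention scores only for the selected key positions.
    scores = [sum(q[j] * keys[i][j] for j in range(d)) / math.sqrt(d)
              for i in idx]
    weights = softmax(scores)
    # Weighted sum of the selected value vectors.
    return [sum(w * values[i][j] for w, i in zip(weights, idx))
            for j in range(d)]
```

With only one position selected, the output is exactly that position's value vector, which makes the masking behavior easy to check by hand.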
Question
I have looked through the overall code but haven't yet found which module it is in. Is the FlexAttention module in that .py file?