Is your feature request related to a problem? Please describe.
The current implementation causes issues when loading old model checkpoints during inference, because it is not recorded whether flash attention was used.
Describe the solution you'd like
A MultiHeadSelfAttention class and a FlashMultiHeadSelfAttention class that inherits from the former but uses flash attention. Which one is used should be set by the user in the config.
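A minimal sketch of how the proposed split could look. Only the two class names come from this issue; the `attention_kind` method, the `build_attention` factory, and the config keys are illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch of the proposed class split; everything except the two
# class names (MultiHeadSelfAttention, FlashMultiHeadSelfAttention) is assumed.

class MultiHeadSelfAttention:
    """Baseline self-attention; the default when the config says nothing."""

    def __init__(self, num_heads: int, embed_dim: int):
        self.num_heads = num_heads
        self.embed_dim = embed_dim

    def attention_kind(self) -> str:
        return "standard"


class FlashMultiHeadSelfAttention(MultiHeadSelfAttention):
    """Inherits from the former but would dispatch to flash attention."""

    def attention_kind(self) -> str:
        return "flash"


# Hypothetical config-driven factory: the user picks the class in the config,
# so a checkpoint unambiguously records which attention variant was used.
_ATTENTION_CLASSES = {
    "standard": MultiHeadSelfAttention,
    "flash": FlashMultiHeadSelfAttention,
}


def build_attention(config: dict) -> MultiHeadSelfAttention:
    cls = _ATTENTION_CLASSES[config.get("attention", "standard")]
    return cls(config["num_heads"], config["embed_dim"])
```

Because the subclass is selected explicitly from the config, old checkpoints can be reloaded with the matching class rather than guessing whether flash attention was enabled.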
Describe alternatives you've considered
No response
Additional context
No response
Organisation
ECMWF