
Commit

Better Attention Added
erfanzar committed Nov 25, 2023
1 parent 3317476 commit 0f50450
Showing 6 changed files with 2,984 additions and 2 deletions.
2 changes: 1 addition & 1 deletion fjformer/__init__.py
@@ -26,4 +26,4 @@
     JaxRNG, GenerateRNG, init_rng, next_rng, count_num_params
 )
 
-__version__ = '0.0.10'
+__version__ = '0.0.11'
2 changes: 2 additions & 0 deletions fjformer/attention/__init__.py
@@ -1,3 +1,5 @@
 from .efficient_attention import efficient_attention
 from .flash_attention_0 import dot_product_attention_multihead, dot_product_attention_multiquery, \
     dot_product_attention_queries_per_head
+from .flash_attention import ring_attention, ring_attention_standard, ring_flash_attention_gpu, \
+    ring_flash_attention_tpu, blockwise_ffn, blockwise_attn
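
For orientation (not part of the commit), the re-exports above mean the new kernels can be imported directly from fjformer.attention. A minimal sketch, assuming fjformer 0.0.11 is installed; the inline comments are inferred from the function names, and call signatures are not shown in this diff:

# Minimal import sketch, assuming fjformer == 0.0.11 is installed.
# Only names re-exported by fjformer/attention/__init__.py in this commit
# are referenced; their call signatures are not part of this diff.
import fjformer
from fjformer.attention import (
    efficient_attention,                     # memory-efficient attention
    dot_product_attention_multihead,         # multi-head attention (flash_attention_0)
    dot_product_attention_multiquery,        # multi-query attention (flash_attention_0)
    dot_product_attention_queries_per_head,  # queries-per-head attention (flash_attention_0)
    ring_attention,                          # ring attention (newly exported in this commit)
    ring_attention_standard,
    ring_flash_attention_gpu,                # GPU ring flash-attention variant
    ring_flash_attention_tpu,                # TPU ring flash-attention variant
    blockwise_ffn,                           # blockwise feed-forward
    blockwise_attn,                          # blockwise attention
)

print(fjformer.__version__)  # expected: '0.0.11' after this commit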
(Diffs for the remaining 4 changed files are not shown.)
