Commit

fix problem with attention_size
jannik-brinkmann authored and jettjaniak committed May 21, 2024
1 parent 3d8c119 commit 1234a17
Showing 6 changed files with 11 additions and 11 deletions.
2 changes: 1 addition & 1 deletion configs/stories/llama2/1m.json
@@ -1,6 +1,6 @@
 {
 "model_config": {
-"hidden_size": 84,
+"hidden_size": 82,
 "intermediate_size": 256,
 "num_attention_heads": 8,
 "num_hidden_layers": 4,
4 changes: 2 additions & 2 deletions configs/stories/llama2/2.5m.json
@@ -1,7 +1,7 @@
 {
 "model_config": {
-"hidden_size": 168,
-"intermediate_size": 384,
+"hidden_size": 176,
+"intermediate_size": 352,
 "num_attention_heads": 8,
 "num_hidden_layers": 4,
 "num_key_value_heads": 4
4 changes: 2 additions & 2 deletions configs/stories/llama2/250k.json
@@ -1,7 +1,7 @@
 {
 "model_config": {
-"hidden_size": 28,
-"intermediate_size": 96,
+"hidden_size": 30,
+"intermediate_size": 68,
 "num_attention_heads": 4,
 "num_hidden_layers": 2,
 "num_key_value_heads": 2
4 changes: 2 additions & 2 deletions configs/stories/llama2/500k.json
@@ -1,7 +1,7 @@
 {
 "model_config": {
-"hidden_size": 52,
-"intermediate_size": 184,
+"hidden_size": 54,
+"intermediate_size": 144,
 "num_attention_heads": 4,
 "num_hidden_layers": 2,
 "num_key_value_heads": 2
4 changes: 2 additions & 2 deletions configs/stories/llama2/50k.json
@@ -1,7 +1,7 @@
 {
 "model_config": {
-"hidden_size": 6,
-"intermediate_size": 24,
+"hidden_size": 8,
+"intermediate_size": 16,
 "num_attention_heads": 2,
 "num_hidden_layers": 1,
 "num_key_value_heads": 1
4 changes: 2 additions & 2 deletions configs/stories/llama2/5m.json
@@ -1,7 +1,7 @@
 {
 "model_config": {
-"hidden_size": 232,
-"intermediate_size": 512,
+"hidden_size": 240,
+"intermediate_size": 480,
 "num_attention_heads": 12,
 "num_hidden_layers": 6,
 "num_key_value_heads": 6
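For context, the revised sizes can be sanity-checked with a rough parameter estimate for a LLaMA-style model. The sketch below is illustrative only: it uses just the fields visible in this diff, treats the vocabulary size as an assumed placeholder (it is not shown here), and ignores norm weights and biases.

```python
# Rough parameter estimate for a LLaMA-style model from the config fields above.
# Hedged sketch: vocab_size is not visible in this diff, so 4096 is an assumed
# placeholder; RMSNorm weights and any biases are ignored as negligible.

def approx_params(hidden_size, intermediate_size, num_attention_heads,
                  num_hidden_layers, num_key_value_heads, vocab_size=4096):
    head_dim = hidden_size // num_attention_heads
    # attention: q and o projections are hidden_size x (heads * head_dim);
    # k and v use the smaller number of key/value heads (grouped-query attention)
    attn = 2 * hidden_size * (num_attention_heads * head_dim)
    attn += 2 * hidden_size * (num_key_value_heads * head_dim)
    # MLP: gate, up, and down projections
    mlp = 3 * hidden_size * intermediate_size
    # token embedding (output head assumed tied to the embedding)
    emb = vocab_size * hidden_size
    return num_hidden_layers * (attn + mlp) + emb

# e.g. the updated 5m.json values from this commit
print(approx_params(240, 480, 12, 6, 6))  # roughly 4.1M with the assumed vocab
```

With the assumed vocabulary this lands near the nominal size in the file name, which is consistent with hidden_size and intermediate_size being adjusted together in this commit.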
