Replies: 1 comment 1 reply
- How did you load it in Hugging Face's transformers? Thanks.
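  (For reference, loading a checkpoint with Hugging Face transformers usually looks like the sketch below. This is a generic illustration, not the exact code used by anyone in this thread; the model ID is taken from the request later in the thread, and `device_map="auto"` assumes `accelerate` is installed.)

  ```python
  from transformers import AutoModelForCausalLM, AutoTokenizer

  # Generic example of loading a Hugging Face checkpoint with transformers.
  # model_id is the checkpoint discussed later in this thread; swap in any other.
  model_id = "CohereForAI/c4ai-command-r-plus"

  tokenizer = AutoTokenizer.from_pretrained(model_id)
  model = AutoModelForCausalLM.from_pretrained(
      model_id,
      torch_dtype="auto",   # use the dtype stored in the checkpoint
      device_map="auto",    # shard across available GPUs (requires accelerate)
  )
  ```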
- Can you please add support for https://huggingface.co/CohereForAI/c4ai-command-r-plus, or point me to a branch that has it? Currently vLLM fails to load it and throws an error. I updated transformers to the latest version and it still fails, although I am able to load the model directly with transformers. The error vLLM throws is:
KeyError: 'model.layers.14.self_attn.k_norm.weight' [repeated 2x across cluster]
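  For context, the kind of invocation that hits this error would look roughly like the sketch below. This is a minimal illustration of vLLM's standard Python entry point, not the exact command that was run; the tensor-parallel size is an assumption based on the "[repeated 2x across cluster]" suffix in the log.

  ```python
  from vllm import LLM, SamplingParams

  # Illustrative reproduction: load the checkpoint with vLLM's Python API.
  # On vLLM versions without support for this architecture, weight loading
  # fails with the KeyError shown above.
  llm = LLM(
      model="CohereForAI/c4ai-command-r-plus",
      tensor_parallel_size=4,  # assumed multi-GPU setup, adjust as needed
  )

  outputs = llm.generate(["Hello"], SamplingParams(max_tokens=16))
  print(outputs[0].outputs[0].text)
  ```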