transformers 4.46.0+ no longer provides GEMMA2_ATTENTION_CLASSES; the import on line 56 of gemma_model.py needs to be removed.
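A minimal compatibility sketch (not the project's official fix) that keeps gemma_model.py importable on both old and new transformers releases could guard that import; the fallback mapping and the import path below are assumptions based on the public modeling_gemma2 module:

```python
# Hedged sketch: guard the import that breaks on transformers 4.46.0+.
try:
    # Present in transformers <= 4.44.x
    from transformers.models.gemma2.modeling_gemma2 import GEMMA2_ATTENTION_CLASSES
except ImportError:
    # Newer releases dropped the dict; attention dispatch moved into
    # Gemma2Attention itself, so map every implementation name to the single
    # remaining class (an assumption, adjust if your code needs the old subclasses).
    from transformers.models.gemma2.modeling_gemma2 import Gemma2Attention
    GEMMA2_ATTENTION_CLASSES = {
        "eager": Gemma2Attention,
        "sdpa": Gemma2Attention,
        "flash_attention_2": Gemma2Attention,
    }
```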
Thanks for the feedback! Since compatibility differs across versions, we recommend using transformers 4.44.2 or lower. Removing GEMMA2_ATTENTION_CLASSES may break older transformers versions, so for newer versions you will need to make that change yourself.
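As a rough illustration of that recommendation (a hypothetical guard, not part of the repository), one could fail fast when an incompatible transformers version is installed:

```python
# Hypothetical version guard reflecting the <= 4.44.2 recommendation above.
import transformers
from packaging import version  # packaging is already a transformers dependency

if version.parse(transformers.__version__) > version.parse("4.44.2"):
    raise RuntimeError(
        "gemma_model.py relies on GEMMA2_ATTENTION_CLASSES, which later "
        "transformers releases removed; install transformers<=4.44.2 or patch "
        "the import yourself."
    )
```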
For now this works for bge-m3 as a temporary workaround.
I made the change and switched to transformers 4.47.0, and the results are the same. There is no noticeable difference for this model, and the computed inner products are identical with bge-multilingual-gemma2.
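A quick way to check that claim is to encode the same sentences under both transformers versions and compare the score matrices. The model id, last-token pooling, and normalization below are assumptions taken from the public BAAI/bge-multilingual-gemma2 model card, not from this thread:

```python
# Sketch: compare embedding scores across transformers versions.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "BAAI/bge-multilingual-gemma2"  # assumed model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, torch_dtype=torch.float16).eval()

sentences = ["what is a panda?", "The giant panda is a bear native to China."]
batch = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state
    mask = batch["attention_mask"]
    # Index of the last real (non-padding) token per sequence; cumsum().argmax()
    # picks the last 1 in the mask, so this works for left- or right-padded batches.
    last_idx = mask.cumsum(dim=1).argmax(dim=1)
    emb = hidden[torch.arange(hidden.size(0)), last_idx]
    emb = torch.nn.functional.normalize(emb, dim=-1)

# Run once on 4.44.2 and once on 4.47.0; matching matrices support the report above.
print(emb @ emb.T)
```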