Has the paper been evaluated on Chinese datasets? #5
Comments
We tried it on the Chinese BART released with CPT (https://arxiv.org/abs/2109.05729). As I recall, performance was somewhat worse than the direct sequence-labeling approach; from the errors we examined at the time, the issue seemed to be that locating entity boundaries accurately is harder in Chinese than in English.
I see there is a Chinese BART on Hugging Face — did you not run experiments on the Chinese BART?
I ran the Chinese BART-base from Hugging Face; the results were about the same as ALBERT, but I haven't found a way to do the distillation yet...
After swapping in the Chinese fnlp/bart-large-chinese model, how should the code be changed so that BART trains on Chinese? Has anyone trained it successfully?
[Automatic vacation reply from a QQ Mail account: "Your message has been received. Thank you!"]
Could you share the demo you ran? I tried fnlp/bart-base-chinese but never got it working; in the end I ran the Chinese dataset with bart-base instead...
Hello, my results on Chinese are always 0. Which part of the code needs to be changed? Please advise.
Hello, could you advise which part of the code to change? After my changes the F1 stays at 0. Thanks 🙏
As the title says: has the paper been evaluated on any Chinese dataset? If so, how were the results?