
vocab_file not found #4

Open
pythonla opened this issue Dec 9, 2021 · 2 comments

@pythonla

pythonla commented Dec 9, 2021

File "C:\Users\1\miniconda3\lib\site-packages\transformers\models\gpt2\tokenization_gpt2.py", line 179, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType
Hi, I keep getting this error when loading the tokenizer with transformers 3.4.0. I looked at the transformers source code: BartTokenizer expects a JSON-format vocab file, but the vocab in the BART pretrained model I downloaded from Hugging Face is a TXT file. Is this a transformers version problem? Thanks.
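For reference, the exception is raised before any file I/O happens: the tokenizer never resolved a vocab path, so `vocab_file` is `None`, and `open()` rejects that. A minimal stdlib-only reproduction of the failing call (the helper name here is illustrative, not from transformers):

```python
# Mirror of the failing line in tokenization_gpt2.py: when no vocab.json
# is found, vocab_file stays None and open() raises a TypeError.
def load_vocab(vocab_file):
    with open(vocab_file, encoding="utf-8") as vocab_handle:
        return vocab_handle.read()

try:
    load_vocab(None)  # what happens when vocab.json was never located
except TypeError as e:
    print(e)  # expected str, bytes or os.PathLike object, not NoneType
```

So the error means the tokenizer was constructed without finding `vocab.json` at all, not that the file's contents were malformed.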

@yhcc
Owner

yhcc commented Dec 30, 2021

I'm not sure what the problem is, but why not just call BartTokenizer.from_pretrained and let it download the vocab files itself?

@pythonla
Author

pythonla commented Dec 30, 2021 via email
