How to solve dataset AssertionError? #4
Hi, the first error means something is wrong with the created dataset. Check out the __init__() method of the dataset class. For the second error, I guess your path to the state dict of the text encoder is wrong.
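A quick way to act on this advice is to fail early with a readable message instead of a bare AssertionError. This is a minimal sketch of a hypothetical helper, not code from the repo; the argument names are assumptions:

```python
# Minimal sketch, assuming a PyTorch-style dataset that implements __len__.
# Call this right after constructing the dataset, before training starts.

def sanity_check_dataset(dataset, data_dir):
    """Fail early with a readable message instead of a bare AssertionError."""
    n = len(dataset)
    assert n > 0, (
        "Dataset is empty - check that data_dir (%r) points at the "
        "downloaded data and that the captions pickle was found." % data_dir
    )
    return n
```

If the assertion fires here, the problem is in how the dataset's __init__() resolved its paths, not in the training loop.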
I spent some time checking as you hinted. Here are the running outputs:
What can I do next?
Hi, I think the two errors are unrelated. For the second error, it looks like something might be wrong with the pretrained text encoder. Try downloading it again to make sure the file is not corrupted. Also, which PyTorch version are you using? I think they changed state_dict loading in one of the previous versions, so that might be an issue here, too.
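One common state_dict incompatibility (a general PyTorch pitfall, offered as a guess rather than a confirmed cause here) is a checkpoint saved from a model wrapped in nn.DataParallel, whose parameter keys carry a "module." prefix. A minimal sketch of stripping that prefix before calling load_state_dict:

```python
def strip_module_prefix(state_dict):
    """Remove the 'module.' prefix that nn.DataParallel adds to parameter names."""
    prefix = "module."
    return {
        (key[len(prefix):] if key.startswith(prefix) else key): value
        for key, value in state_dict.items()
    }

# Typical use (the torch calls are illustrative, not from this repo):
#   state = torch.load("text_encoder100.pth", map_location="cpu")
#   text_encoder.load_state_dict(strip_module_prefix(state))
```

Passing map_location="cpu" also sidesteps load failures when the checkpoint was saved on a GPU that is not available locally.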
Hi, I encountered the same problem as above, and I have already tried re-downloading the DAMSM text encoder. I am using Python 2.7.12 and PyTorch 0.4.1 in a Docker container. Here are my running outputs.
May I know how to solve this error?
Hi, to me this looks like a problem with the state_dict for the text encoder. The text encoder has an embedding layer of shape [27297, 300] (27,297 words, each with a 300-dim embedding), but your state_dict seems to only have an embedding of size [1, 300]. Could you please check that the downloaded files are complete?
The pre-trained AttnGAN text encoder model (text_encoder100.pth) should be about 33 MB in size, and the image encoder (image_encoder100.pth) about 86 MB.
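Both checks above can be scripted. This is a sketch under assumptions: the parameter key "encoder.weight" is what a typical nn.Embedding-based text encoder would use, and the helpers are hypothetical, not taken from the repo:

```python
import os

def embedding_shape(state_dict, key="encoder.weight"):
    """Return the embedding weight's shape as a tuple, or None if the key is missing."""
    weight = state_dict.get(key)
    return tuple(weight.shape) if weight is not None else None

def download_looks_complete(path, expected_mb, tolerance=0.2):
    """True if the file exists and its size is within `tolerance` of the expected size."""
    if not os.path.isfile(path):
        return False
    size_mb = os.path.getsize(path) / (1024.0 * 1024.0)
    return abs(size_mb - expected_mb) <= tolerance * expected_mb
```

On a complete download, embedding_shape(torch.load("text_encoder100.pth", map_location="cpu")) should report (27297, 300), and download_looks_complete("text_encoder100.pth", 33) should be true; a [1, 300] embedding or a tiny file points to a truncated download.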
After deciding to use Python 2, I used PyCharm to create a new Python 2 virtual environment and ran:
After running the above instructions, I got these errors:
If I comment out line 134 of main.py in the code/coco/attngan directory and run the same instruction again, it shows:
How could I solve these problems?
Thank you~