Getting error while generating vocabulary #14
Comments
Hi, it seems that `MolGraph.FRAGMENTS` is not initialized. See `hgraph2graph/generation/get_vocab.py`, line 50 (commit 3249f93).
This is strange, because as long as `load_fragments` is called (`get_vocab.py`, line 50), `MolGraph.FRAGMENTS` cannot be None (at worst it is an empty list). I think the error happened before `load_fragments` was called. You can try printing the `fragments` variable at line 49 to check whether that code is actually executed.
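The reasoning above can be illustrated with a minimal sketch (a hypothetical, simplified version of the class, not the repo's exact code): once `load_fragments` runs, the class attribute is a list, never None, even for empty input.

```python
class MolGraph:
    FRAGMENTS = None  # None only before load_fragments has run

    @classmethod
    def load_fragments(cls, fragments):
        # even an empty input leaves FRAGMENTS as a list, not None
        cls.FRAGMENTS = list(fragments)


MolGraph.load_fragments([])           # worst case: nothing to load
assert MolGraph.FRAGMENTS == []       # an empty list, not None
```

So if code later sees `MolGraph.FRAGMENTS is None`, `load_fragments` was never called in that process, which is exactly the situation the follow-up comments trace to Windows multiprocessing.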
I was getting the same error in the generation/preprocessing.py file as well.
@nikhilmittal444 This is an issue with Pool on Windows: `MolGraph.FRAGMENTS` is not accessible inside functions called through Pool. I removed the multiprocessing and was able to generate the vocabulary without any issue. However, I get only 2273 lines, in contrast to the 2288 lines in the provided vocab. @wengong-jin I am still going through the code to see if there is any randomness. Do you think this is normal?
I stored the FRAGMENTS produced by `load_fragments` in a new variable and passed it as an input argument both to the tensorize function and to the MolGraph object in `__init__()` (as `self.new_variable`), which gave me the expected 2288 lines.
Hi guys, this issue can be easily solved by a simple replacement of …
Hello Wengong!
Thanks for the great work!!
I am trying to generate the vocabulary using your dataset
`../data/polymers/all.txt`
; however, I am getting this error and cannot figure it out. In the end I wrapped the call in a try/except, but there are lots of these errors over the whole run. I would appreciate it if you could assist me.