Package Dependency and Reproducibility Issue #238
Comments
Hi @morenzoe, yes, I think we should unlock the dependencies. Can you elaborate on the reproducibility issues? Which library versions cause them? We need to know so that we can pin the dependency versions that give the intended/correct behavior.
Thank you, @jasonlyik! That would be incredibly helpful. Regarding the reproducibility issues, I will need to ask @vinniesun for assistance in providing more details. I appreciate your support in advance!
These are the package versions that I used:
Thank you @vinniesun for providing the package versions. Could you please elaborate further on the specific reproducibility issues you encountered with my code submission? Did you encounter the same errors I previously described in my post?
No, I didn't run into the issues you mentioned; I just couldn't reproduce the figures you had in your comment block.
Hi, I am using NeuroBench with PyTorch Lightning. There seems to be a conflict between the numpy version requirements of those two packages, which leads to some reproducibility issues. I ran the code below in a Kaggle notebook and a Google Colab notebook:
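(The original notebook cells are not reproduced in this thread; the following is a minimal sketch of the kind of cells involved, assuming they simply install and import both packages in a fresh environment.)

```python
# Minimal sketch, assuming the notebook cells simply install and import
# both packages in a fresh Kaggle/Colab environment (a reconstruction,
# not the original cells).

!pip install neurobench   # on Colab, the error appears during this install
!pip install lightning    # the install order is interchangeable

import lightning          # on Kaggle, the error appears at this import
import neurobench
```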
The order of neurobench and lightning is interchangeable, and both orders result in the error below:
Kaggle notebook, while importing lightning:
Google Colab notebook, while installing neurobench:
These are the packages listed by `!pip freeze`:

Kaggle notebook packages:
Google Colab notebook packages:
I have found a workaround: first install and import the other packages, then install and import neurobench. However, this seems to cause a reproducibility issue with the BioCAS 2024 Grand Challenge code submission check. Any help would be much appreciated!
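As a rough sketch of that workaround ordering (the specific "other packages" are not listed here, so lightning stands in as a placeholder):

```python
# Workaround sketch: install and import the other packages first,
# then install and import neurobench last. "lightning" is used here
# as a stand-in for whichever other packages the notebook needs.

!pip install lightning
import lightning

!pip install neurobench
import neurobench   # installing/importing neurobench last avoids the numpy clash
```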