
Pytorch optimization #974

Merged: 17 commits merged into plumed:master on Sep 28, 2023

Conversation

luigibonati (Contributor)

Description
  • Add performance optimizations to the PYTORCH module (torch.jit.freeze and torch.jit.optimize_for_inference); see the sketch after this list.
  • Change configure to also link the CUDA-enabled libtorch libraries when available (requested by @maxbonomi).
  • Move the instructions for linking libtorch from the PYTORCH module page to the installation page.
  • Update the LibTorch version tested in CI from 1.13 to 2.0.
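For context, the two optimizations named in the first bullet follow the standard TorchScript inference pattern available in the libtorch C++ API. The sketch below is illustrative only, not the PR's actual PLUMED code: the model file name, input shape, and surrounding main() are placeholders.

```cpp
// Minimal sketch (assumptions: a TorchScript model saved as "model.ptc" that
// takes a single [1,3] float tensor). Shows torch::jit::freeze and
// torch::jit::optimize_for_inference applied after loading, as mentioned above.
#include <torch/script.h>
#include <iostream>
#include <vector>

int main() {
  try {
    // Load the serialized TorchScript module.
    torch::jit::Module model = torch::jit::load("model.ptc");
    // Freezing requires eval mode.
    model.eval();
    // Fold parameters and attributes into the graph as constants.
    model = torch::jit::freeze(model);
    // Apply additional inference-only graph passes on the frozen module.
    model = torch::jit::optimize_for_inference(model);

    // Dummy forward pass to illustrate usage of the optimized module.
    std::vector<torch::jit::IValue> inputs;
    inputs.push_back(torch::ones({1, 3}));
    torch::Tensor out = model.forward(inputs).toTensor();
    std::cout << out << std::endl;
  } catch (const c10::Error& e) {
    std::cerr << "error loading the model: " << e.what() << std::endl;
    return 1;
  }
  return 0;
}
```

Both calls return a new module, so reassigning the loaded module (as above) is the usual way to keep only the optimized version around for repeated forward calls.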
Target release

I would like my code to appear in release 2.10

Type of contribution
  • changes to code or doc authored by PLUMED developers, or additions of code in the core or within the default modules
  • changes to a module not authored by you
  • new module contribution or edit of a module authored by you
Copyright
  • I agree to transfer the copyright of the code I have written to the PLUMED developers or to the author of the code I am modifying.
  • the module I added or modified contains a COPYRIGHT file with the correct license information. Code should be released under an open source license. I also used the command cd src && ./header.sh mymodulename in order to make sure the headers of the module are correct.
Tests
  • I added a new regtest or modified an existing regtest to validate my changes.
  • I verified that all regtests pass successfully on GitHub Actions.

@carlocamilloni merged commit 585cddc into plumed:master on Sep 28, 2023
18 checks passed