This repository contains the official PyTorch implementation of the work "Generalizing Denoising to Non-Equilibrium Structures Improves Equivariant Force Fields" (TMLR 2024). We show that force encoding enables generalizing denoising to non-equilibrium structures and propose to use DeNS (Denoising Non-Equilibrium Structures) as an auxiliary task to improve the performance on energy and force predictions.
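As a rough illustration of the auxiliary task (a schematic sketch under stated assumptions, not the repository's implementation; `dens_corrupt` and `dens_aux_loss` are hypothetical names): DeNS corrupts atomic positions with Gaussian noise, additionally feeds the original forces to the model as input (force encoding, since a noisy non-equilibrium structure alone is ambiguous), and trains the model to predict the injected noise.

```python
import numpy as np

def dens_corrupt(pos, forces, sigma=0.05, rng=None):
    """Corrupt atomic positions for a DeNS-style denoising objective.

    Returns the noisy positions, the injected noise (the regression
    target), and the original forces to be encoded as an extra input.
    Hypothetical sketch; not the repository's actual API.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    noise = rng.normal(scale=sigma, size=pos.shape)
    return pos + noise, noise, forces

def dens_aux_loss(pred_noise, target_noise):
    # Mean-squared error between predicted and injected noise.
    return float(np.mean((pred_noise - target_noise) ** 2))
```

In training, this loss would be added to the usual energy and force losses with a weighting coefficient.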
We provide the code for training EquiformerV2 with DeNS on OC20 and OC22 datasets here and training Equiformer with DeNS on MD17 in this repository.
See here for setting up the environment.
Please first set up the environment and file structure (place this repository under `ocp` and rename it to `experimental`) following the above Environment section.
The OC20 S2EF dataset can be downloaded by following the instructions in the OC20 GitHub repository.
For example, we can download the OC20 S2EF-2M dataset by running:
```bash
cd ocp
python scripts/download_data.py --task s2ef --split "2M" --num-workers 8 --ref-energy
```
We also need to download the "val_id" data split to run training. After downloading, the datasets should be under `ocp/data`.
To train on different splits like All and All+MD, we can follow the same link above to download the datasets.
Please first set up the environment and file structure (place this repository under `ocp` and rename it to `experimental`) following the above Environment section.
Similar to OC20, the OC22 dataset can be downloaded by following the instructions in its GitHub repository.
Please refer to this repository for training Equiformer with DeNS on MD17.
- `configs` contains config files for training with DeNS on different datasets.
- `datasets` contains an LMDB dataset class that can distinguish whether structures in OC20 come from the All split or the MD split.
- `model` contains EquiformerV2 and eSCN models capable of training with DeNS.
- `scripts` contains the scripts for launching training based on config files.
- `trainers` contains the code for training models for S2EF and with DeNS.
- Modify the paths to datasets before launching training. For example, we need to modify the path to the training set here and the validation set here before training EquiformerV2 with DeNS on the OC20 S2EF-2M dataset for 12 epochs.
- We train EquiformerV2 with DeNS on the OC20 S2EF-2M dataset for 12 epochs by running:

```bash
cd ocp/
sh experimental/scripts/train/oc20/s2ef/equiformer_v2/equiformer_dens_v2_N@12_L@6_M@2_epochs@12_splits@[email protected]
```
Note that following the above Environment section, we run the script under `ocp`. This script uses 2 nodes with 8 GPUs on each node. We can also run training on 8 GPUs on a single node:
```bash
cd ocp/
sh experimental/scripts/train/oc20/s2ef/equiformer_v2/equiformer_dens_v2_N@12_L@6_M@2_epochs@12_splits@[email protected]
```
Note that this demonstrates training on a single node; the results will not be identical to those from training on 16 GPUs.
Similarly, we train EquiformerV2 with DeNS on the OC20 S2EF-2M dataset for 30 epochs by running:
```bash
cd ocp/
sh experimental/scripts/train/oc20/s2ef/equiformer_v2/equiformer_dens_v2_N@12_L@6_M@2_epochs@30_splits@[email protected]
```
This script will use 4 nodes with 8 GPUs on each node.
- We train EquiformerV2 with DeNS on the OC20 S2EF-All+MD dataset by running:

```bash
cd ocp/
sh experimental/scripts/train/oc20/s2ef/equiformer_v2/equiformer_dens_v2_N@20_L@6_M@3_splits@[email protected]
```
This script will use 16 nodes with 8 GPUs on each node.
We use a slightly different dataset class, `DeNSLmdbDataset`, so that we can differentiate whether a structure comes from the All split or the MD split. This corresponds to the code here and requires `relaxations` and `md` to appear in the `data_log.*.txt` files under the All+MD data directory. Those `data_log.*.txt` files should look like:

```
# for All split
/.../relaxations/.../random1331004.traj,258,365
...
```
After reading the lmdb files, the `DeNSLmdbDataset` dataset adds a new attribute `md` as here.
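The split-tagging idea can be sketched as follows (a minimal illustration; `is_md_structure` is a hypothetical helper, not the repository's code — the actual logic lives in `DeNSLmdbDataset`):

```python
def is_md_structure(traj_path: str) -> bool:
    """Decide whether a data_log entry comes from the MD split.

    Assumption based on the description above: All-split entries record
    trajectory paths under a 'relaxations' directory, while MD-split
    entries record paths under an 'md' directory.
    """
    parts = traj_path.split("/")
    return "md" in parts and "relaxations" not in parts
```

The resulting boolean would then be stored as the `md` attribute on each data object so the trainer can treat the two splits differently.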
- Modify the paths to datasets before launching training. Specifically, we need to modify the path to the training set here and the validation set here. In addition, we need to download the linear reference file from here and then add its path here and here. Finally, we download the OC20 reference information file from here and add its path here and here.
- We train EquiformerV2 with DeNS on the OC22 dataset by running:

```bash
cd ocp/
sh experimental/scripts/train/oc22/s2ef/equiformer_v2/equiformer_dens_v2_N@18_L@6_M@2_epochs@[email protected]
```
This script will use 4 nodes with 8 GPUs on each node.
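The purpose of a linear reference file can be sketched as follows (a schematic example under assumptions, not the repository's API: `referenced_energy` and the array layout `lin_ref[z]` = per-element reference energy for atomic number `z` are hypothetical):

```python
import numpy as np

def referenced_energy(total_energy, atomic_numbers, lin_ref):
    """Subtract per-element linear reference energies from a total energy.

    Referencing removes large composition-dependent offsets so the model
    only needs to fit a smaller residual energy.
    """
    return total_energy - float(np.sum(lin_ref[np.asarray(atomic_numbers)]))
```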
Please refer to this repository for training Equiformer with DeNS on MD17.
We provide the checkpoints of EquiformerV2 trained with DeNS on OC20 S2EF-2M dataset for 12 and 30 epochs, OC20 S2EF-All+MD dataset, and OC22 dataset.
| Split | Epochs | Download | val force MAE (meV/Å) | val energy MAE (meV) |
|---|---|---|---|---|
| OC20 S2EF-2M | 12 | checkpoint \| config | 19.09 | 269 |
| OC20 S2EF-2M | 30 | checkpoint \| config | 18.02 | 251 |
| OC20 S2EF-All+MD | 2 | checkpoint \| config | 14.0 | 222 |
| OC22 | 6 | checkpoint \| config | (ID) 20.66 / (OOD) 27.11 | (ID) 391.6 / (OOD) 533.0 |
We provide the evaluation script on OC20 and OC22 datasets. After following the above Environment section and downloading the checkpoints here, we run the script to evaluate the results on validation sets.
For instance, after updating the path to the validation set here and `CHECKPOINT` here, we evaluate the result of EquiformerV2 trained on the OC20 S2EF-2M dataset for 12 epochs by running:

```bash
cd ocp/
sh experimental/scripts/evaluate/oc20/s2ef/equiformer_v2/equiformer_dens_v2_N@12_L@6_M@2_epochs@12_splits@[email protected]
```
We can update the path in the config file to evaluate on different validation sub-splits and use different config files to evaluate different models.
Please consider citing the works below if this repository is helpful:
- DeNS:

```bibtex
@article{DeNS,
    title={Generalizing Denoising to Non-Equilibrium Structures Improves Equivariant Force Fields},
    author={Yi-Lun Liao and Tess Smidt and Muhammed Shuaibi* and Abhishek Das*},
    journal={arXiv preprint arXiv:2403.09549},
    year={2024}
}
```
- EquiformerV2:

```bibtex
@inproceedings{equiformer_v2,
    title={{EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations}},
    author={Yi-Lun Liao and Brandon Wood and Abhishek Das* and Tess Smidt*},
    booktitle={International Conference on Learning Representations (ICLR)},
    year={2024},
    url={https://openreview.net/forum?id=mCOBKZmrzD}
}
```
- Equiformer:

```bibtex
@inproceedings{equiformer,
    title={{Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs}},
    author={Yi-Lun Liao and Tess Smidt},
    booktitle={International Conference on Learning Representations (ICLR)},
    year={2023},
    url={https://openreview.net/forum?id=KwmPfARgOTD}
}
```
Please direct questions to Yi-Lun Liao ([email protected]).
Our implementation is based on PyTorch, PyG, e3nn, timm, ocp, Equiformer, and EquiformerV2.