Describe the bug
Related to the issue
When I do multihead transfer learning from a personal pre-trained model, I found that energy_key and forces_key should be the keys for the pre-training properties (dftb_energy and dftb_forces in this case) rather than dft_energy and dft_forces from args.energy_key and args.forces_key. Otherwise the correct properties for pt_head cannot be read (lines 288–323).
Output when args.energy_key / args.forces_key are used for pt_head (no energies are read from the pre-training data):
2024-11-13 13:12:30.501 INFO: ==================Using multiheads finetuning mode==================
2024-11-13 13:12:30.501 INFO: Using foundation model for multiheads finetuning with ../train_pt.xyz
2024-11-13 13:12:32.778 INFO: Training set [7642 configs, 0 energy, 380589 forces] loaded from '../train_pt.xyz'
2024-11-13 13:12:33.279 INFO: Validation set [1000 configs, 1000 energy, 46593 forces] loaded from '../valid_pt.xyz'
2024-11-13 13:12:33.279 INFO: Total number of configurations: train=7642, valid=1000
Expected output when the pre-training keys (dftb_energy, dftb_forces) are used:
2024-11-13 13:12:30.501 INFO: ==================Using multiheads finetuning mode==================
2024-11-13 13:12:30.501 INFO: Using foundation model for multiheads finetuning with ../train_pt.xyz
2024-11-13 13:12:32.778 INFO: Training set [7642 configs, 7642 energy, 380589 forces] loaded from '../train_pt.xyz'
2024-11-13 13:12:33.279 INFO: Validation set [1000 configs, 1000 energy, 46593 forces] loaded from '../valid_pt.xyz'
2024-11-13 13:12:33.279 INFO: Total number of configurations: train=7642, valid=1000
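A minimal sketch of the behaviour described above (the names HeadConfig and build_head_configs are illustrative, not the actual MACE API): each head should read the property keys its own dataset was labelled with, so the pt_head needs the pre-training keys rather than the command-line args.

```python
from dataclasses import dataclass
from types import SimpleNamespace


@dataclass
class HeadConfig:
    """Hypothetical per-head configuration: which .xyz info/arrays
    keys this head's data loader should look up."""
    head_name: str
    energy_key: str
    forces_key: str


def build_head_configs(args):
    # The fine-tuning head uses the keys passed on the command line
    # (dft_energy / dft_forces in the report above)...
    ft_head = HeadConfig("ft_head", args.energy_key, args.forces_key)
    # ...but the pre-training head must use the keys the pre-training
    # data was labelled with; if it inherits args.energy_key instead,
    # the loader finds 0 energies for pt_head, as in the first log.
    pt_head = HeadConfig("pt_head", "dftb_energy", "dftb_forces")
    return [pt_head, ft_head]


# Example: keys diverge per head.
args = SimpleNamespace(energy_key="dft_energy", forces_key="dft_forces")
for head in build_head_configs(args):
    print(head.head_name, head.energy_key, head.forces_key)
```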
Hi, Ilyes. I reinstalled the main branch, but the output seems to be the same as shown above: the energy_key and forces_key for the pre-trained model are still missing.
I compared this with multihead fine-tuning based on the foundation model,
where energy_key and forces_key are defined for the foundation model inside head_config_pt, as shown in the following (None for both property keys):
These arguments are set manually in multihead_tools.py, if I understood correctly, so I guess the property keys for a personal pre-trained model have to be defined manually when doing multihead fine-tuning?
Also, I did not find the property keys saved inside pre-trained models; maybe this is the reason why it cannot work?
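One possible workaround, sketched under the assumption that MACE checkpoints do not currently store the property keys (save_key_metadata and load_key_metadata are hypothetical helpers, not part of MACE): record the keys in a small JSON sidecar next to the checkpoint so fine-tuning can recover them instead of requiring them to be re-specified manually.

```python
import json
from pathlib import Path


def save_key_metadata(model_path, energy_key, forces_key):
    """Write a JSON sidecar next to the checkpoint recording which
    info/arrays keys the model was trained on (hypothetical helper)."""
    meta = {"energy_key": energy_key, "forces_key": forces_key}
    sidecar = Path(model_path).with_suffix(".keys.json")
    sidecar.write_text(json.dumps(meta))


def load_key_metadata(model_path, default_energy="energy", default_forces="forces"):
    """Recover the pre-training keys from the sidecar if it exists,
    otherwise fall back to the given defaults."""
    sidecar = Path(model_path).with_suffix(".keys.json")
    if sidecar.exists():
        meta = json.loads(sidecar.read_text())
        return meta["energy_key"], meta["forces_key"]
    return default_energy, default_forces
```

With something like this, the pt_head could be configured from the recovered keys at fine-tuning time rather than silently reusing args.energy_key / args.forces_key.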