You can definitely use the config to train a smaller SaProt. Line 14 doesn't mean the model is already trained; it is just the name of the training log. You would still train the model from scratch.
If you want to train a smaller version, first create a folder containing your model's configuration, like SaProt_650M_AF2 or SaProt_35M_AF2, except that you do not need to include pytorch_model.bin. Then just change config_path to point to your folder and you can train your own model.
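For example, here is a minimal sketch (not part of the SaProt repo) of one way to build such a folder: start from the published 35M config, shrink the architecture, and save everything except the weights. The Hub id westlake-repl/SaProt_35M_AF2, the output folder name, and the exact hyperparameter values are assumptions; the config fields follow the standard ESM-style config.json, so check them against the files you actually have.

```python
# Sketch: create a config-only folder for a smaller SaProt variant.
# Assumptions: the 35M checkpoint is available as "westlake-repl/SaProt_35M_AF2"
# on the Hugging Face Hub (a local copy of that folder also works), and its
# config.json uses the standard ESM fields shown below.
from transformers import AutoConfig, AutoTokenizer

src = "westlake-repl/SaProt_35M_AF2"   # assumed source of config + vocab
out_dir = "SaProt_small_AF2"           # hypothetical folder for the smaller model

config = AutoConfig.from_pretrained(src)
# Shrink the ESM-style architecture; these sizes are only an example.
config.num_hidden_layers = 6
config.hidden_size = 320
config.num_attention_heads = 20
config.intermediate_size = 1280

config.save_pretrained(out_dir)        # writes config.json, no pytorch_model.bin
AutoTokenizer.from_pretrained(src).save_pretrained(out_dir)  # keep the SaProt vocab
```

After that, set config_path in config/pretrain/saprot.yaml to out_dir and launch pretraining as usual.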
Hi there!
Awesome work! Do you have a script I can use to train SaProt with a smaller ESM model? I think I can use the config https://github.com/westlake-repl/SaProt/blob/main/config/pretrain/saprot.yaml, but from the name at https://github.com/westlake-repl/SaProt/blob/main/config/pretrain/saprot.yaml#L14 it looks like the model is already trained?
Would appreciate any pointers!
Thank you!