Facing issue to load after finetuning #93
Unanswered
EnugulaVishnucharan asked this question in Q&A
Replies: 1 comment · 1 reply
-
You can use a config file like this one: https://github.com/SalesforceAIResearch/uni2ts/blob/7f9a34ec5b6a55d831baa54aca77e80044706277/cli/conf/eval/model/moirai_lightning_ckpt.yaml
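For reference, that config essentially loads the finetuned run through PyTorch Lightning's load_from_checkpoint rather than MoiraiModule.from_pretrained. A rough Python equivalent, as a sketch only (the checkpoint path is a placeholder, and PDT/CTX/PSZ and ds are the same variables as in the snippet below), would be:

    # Sketch: load a checkpoint written by `python -m cli.train -cp conf/finetune ...`.
    # MoiraiForecast is a LightningModule, so load_from_checkpoint restores the finetuned
    # weights; module_kwargs is taken from the hyperparameters saved in the checkpoint.
    import torch
    from uni2ts.model.moirai import MoiraiForecast

    ckpt_path = "outputs/finetune/.../checkpoints/last.ckpt"  # placeholder path
    checkpoint = torch.load(ckpt_path, map_location="cpu")

    model = MoiraiForecast.load_from_checkpoint(
        checkpoint_path=ckpt_path,
        module_kwargs=checkpoint["hyper_parameters"]["module_kwargs"],
        prediction_length=PDT,
        context_length=CTX,
        patch_size=PSZ,
        num_samples=100,
        target_dim=1,
        feat_dynamic_real_dim=ds.num_feat_dynamic_real,
        past_feat_dynamic_real_dim=ds.num_past_feat_dynamic_real,
    )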
-
I am trying to finetune the Moirai model on my data. After finetuning with the script given in the README, i.e.

    python -m cli.train \
      -cp conf/finetune \
      run_name=example_run \
      model=moirai_1.0_R_small \
      data=etth1 \
      val_data=etth1

it produces the files attached.
But when I load the finetuned model with the full path:

    from uni2ts.model.moirai import MoiraiForecast, MoiraiModule

    # checkpoint is the finetuned .ckpt loaded earlier, e.g. with torch.load(...)
    model = MoiraiForecast(
        module=MoiraiModule.from_pretrained("fullpathofthefinetuneddirectory"),
        prediction_length=PDT,
        context_length=CTX,
        patch_size=PSZ,
        num_samples=100,
        target_dim=1,
        feat_dynamic_real_dim=ds.num_feat_dynamic_real,
        past_feat_dynamic_real_dim=ds.num_past_feat_dynamic_real,
        module_kwargs=checkpoint["hyper_parameters"]["module_kwargs"],
    )
it gives:

    MoiraiModule.__init__() missing 7 required positional arguments: 'distr_output', 'd_model', 'num_layers', 'patch_sizes', 'max_seq_len', 'attn_dropout_p', and 'dropout_p'

I can find these arguments in the checkpoint and also in the config.yaml file in the output finetuned model folder. Please let me know what I am missing.
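Those saved module_kwargs should be enough to rebuild the module directly from the Lightning checkpoint; MoiraiModule.from_pretrained appears to expect a HuggingFace-style model directory (config.json plus weights), not the .ckpt that cli.train writes out, which would explain the missing-arguments error. A minimal sketch (the checkpoint path is a placeholder):

    import torch
    from uni2ts.model.moirai import MoiraiModule

    # Load the Lightning checkpoint produced by the finetuning run (placeholder path).
    ckpt = torch.load("path/to/finetuned/checkpoints/last.ckpt", map_location="cpu")

    # Reconstruct the module from the hyperparameters stored in the checkpoint ...
    module = MoiraiModule(**ckpt["hyper_parameters"]["module_kwargs"])

    # ... and load its weights; Lightning prefixes parameter names with "module.".
    state_dict = {
        k.removeprefix("module."): v
        for k, v in ckpt["state_dict"].items()
        if k.startswith("module.")
    }
    module.load_state_dict(state_dict)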