-
Hi @gorold and @liu-jc. I want to ask about the calculation of PackedNLLLoss, which is the default loss for finetuning. I found that when finetuning, the values of this loss function can sometimes be negative. For example, if I finetune Moirai-base on the ETTh1 dataset, some of the values are negative. However, PackedNLLLoss is the negative log probability of a Distribution. Can you help explain why the loss values can be negative? Thank you. PS. In my experience, the training loss over a batch/epoch can also be negative in some cases. Is it okay for the training loss to be negative in Moirai?
-
Hey, for continuous distributions the negative log likelihood is actually the negative log pdf rather than the negative log probability. The pdf is positive but unbounded above, so it can exceed 1, and its negative log can then be negative.
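To make this concrete, here is a minimal sketch using plain PyTorch distributions (not the actual PackedNLLLoss implementation): `log_prob` on a continuous distribution returns the log *density*, which becomes positive whenever the pdf exceeds 1, e.g. for a sharply peaked Normal.

```python
import torch
from torch.distributions import Normal

# A Normal with a small scale has a peak density well above 1:
# pdf(0) = 1 / (0.1 * sqrt(2*pi)) ~= 3.99
dist = Normal(loc=0.0, scale=0.1)
target = torch.tensor(0.0)

# NLL is the negative log *density*, so it can be negative.
nll = -dist.log_prob(target)
print(nll)  # tensor(-1.3836)
```

So a negative training loss simply means the model is fitting the targets with high-density (confident) predictive distributions; it is not a bug.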