Thanks for your great work! I have a question: when I train DiffSTE on my own dataset, the training progress bar only prints the loss for the first half of the total training steps; the loss for the remaining half is not shown. I carefully reviewed the training process but couldn't find the problem. Did you encounter the same situation during training?
Hi, actually, the training loss printed in the terminal is the loss for a single batch, not the average loss over the entire dataset. For the details of what is printed, please see this link.
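For reference, here is a minimal sketch (not the DiffSTE code itself) of how per-batch versus epoch-averaged loss reporting typically works in a PyTorch Lightning module, which is my assumption about the training setup: `prog_bar=True` puts the current batch's loss on the progress bar, while `on_epoch=True` additionally logs a running epoch average.

```python
import torch
import pytorch_lightning as pl


class DummyModule(pl.LightningModule):
    """Toy module illustrating per-batch vs. epoch-averaged loss logging."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        # The value shown on the progress bar is the loss of this single batch;
        # on_epoch=True also accumulates a mean over the epoch for the logger.
        self.log("train_loss", loss, prog_bar=True, on_step=True, on_epoch=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```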
The reason the progress bar splits in the middle of an epoch is the extra output printed during training, namely the `do_classifier_free_guidance` line; I guess this is log information you print when you log some generated images.
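To illustrate why this happens (a generic sketch, not the repository's code): a bare `print()` issued while a tqdm-style progress bar is active forces the bar to be redrawn on a new line, which makes it look as if the bar restarted mid-epoch. Writing through `tqdm.write` (or a logging handler) avoids breaking the bar.

```python
from tqdm import tqdm

for step in tqdm(range(100), desc="train"):
    if step % 50 == 0:
        # tqdm.write prints above the bar without splitting it,
        # unlike a plain print(), which would push the bar onto a new line.
        tqdm.write(f"logged generated images at step {step}")
```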