Replies: 3 comments 4 replies
-
The DeepSpeed question, that's an easy one! DeepSpeed has no bearing on training, only TTS generation. As AllTalk's scripts rely on other scripts (that other people make) to train, I cannot activate DeepSpeed for training, as their scripts don't support it.

As for training the models and how many epochs/how much is enough, have a look here. It's not a one-size-fits-all solution as such; however, the settings I set up in finetuning are the default/recommended ones for these models. I don't know of any specific data around batch size as such. This person was training an entirely new language into the model here with quite a large dataset, and for that he used 1000 epochs, but of course, that's an entirely new language rather than training an existing language with a new-sounding voice.

The general guide is to train your model for X epochs, then generate some samples, compare their waveform + spectrogram, listen to them and see how they sound. If you aren't happy, train more. As for how many epochs will end up being enough, as the FAQ says, it's somewhat subjective without a definitive answer. I'm not trying to be wishy-washy; it's just that there's no firm conclusion (or available data) on it.
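A minimal sketch of that waveform + spectrogram comparison, assuming `librosa` and `matplotlib` are installed; the WAV filenames are hypothetical samples generated from two different checkpoints:

```python
# Compare two generated samples side by side: waveform on top,
# log-power spectrogram below. File names are hypothetical.
import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np

def plot_sample(path, col_axes, title):
    """Plot waveform and log-power spectrogram for one generated sample."""
    y, sr = librosa.load(path, sr=None)
    librosa.display.waveshow(y, sr=sr, ax=col_axes[0])
    col_axes[0].set_title(f"{title} - waveform")
    S_db = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
    librosa.display.specshow(S_db, sr=sr, x_axis="time", y_axis="log", ax=col_axes[1])
    col_axes[1].set_title(f"{title} - spectrogram")

fig, axes = plt.subplots(2, 2, figsize=(12, 6))
plot_sample("sample_epoch_10.wav", axes[:, 0], "epoch 10")  # hypothetical file
plot_sample("sample_epoch_20.wav", axes[:, 1], "epoch 20")  # hypothetical file
fig.tight_layout()
plt.show()
```

Visible differences between checkpoints (smearing, missing harmonics, noise floor) often show up in the spectrogram before they are obvious by ear.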
-
Ok... I have a working system now. So after you have trained your model and tested it, if you want to further finetune it, you would choose this option (screenshot omitted) and then close and re-open the finetuning script. You should be able to go straight to step 2 (if you are going to re-use the same dataset), and on there you will have an option to use your finetuned model to train it (screenshot omitted). I don't have a finetuned model at the moment, so I don't get the option to select it, but you get the idea.
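The finetuning UI handles the checkpoint selection described above, but for a rough sense of what "use your finetuned model" means under the hood, here is a small self-contained helper that locates the newest checkpoint from a previous run so a follow-up run can start from it rather than the base model. The folder path is an assumption; adjust it to wherever your finetuning run writes checkpoints.

```python
# Hypothetical helper: find the most recent .pth checkpoint from a prior
# finetuning run. Paths are assumptions, not AllTalk's actual layout.
from pathlib import Path
from typing import Optional

def latest_checkpoint(run_dir: str) -> Optional[Path]:
    """Return the newest .pth file under run_dir, or None if none exist."""
    candidates = sorted(
        Path(run_dir).glob("**/*.pth"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )
    return candidates[0] if candidates else None

ckpt = latest_checkpoint("finetune/previous_run")  # hypothetical folder
print(f"Resume training from: {ckpt}" if ckpt else "No finetuned model found.")
```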
-
@Dolyfin I'm closing this discussion, but I'm not closing it closing it. I'm actually putting up a central list of feature requests and linking back to all requests from that list, so your request (extra model save points and load points) will be noted there in #74. I'm also asking on the Feature Requests list that people join in the original thread discussions by following the links. Thanks!
-
Would love to know how to better optimize epochs/batch size for longer datasets (~2 hrs), and whether DeepSpeed helps training at all.
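For a rough sense of scale behind this question, here is a back-of-the-envelope calculation of how many optimizer steps one epoch represents for a ~2 hour dataset. The average clip length and batch size are assumptions for illustration, not recommendations from this thread.

```python
# Back-of-the-envelope: optimizer steps per epoch for a ~2 hour dataset.
# Clip length and batch size below are assumed values; plug in your own.
dataset_hours = 2.0
avg_clip_seconds = 8.0     # assumed average segment length after dataset creation
batch_size = 4             # assumed; not a recommendation
grad_accum_steps = 1       # effective batch = batch_size * grad_accum_steps

num_clips = int(dataset_hours * 3600 / avg_clip_seconds)
steps_per_epoch = num_clips // (batch_size * grad_accum_steps)

print(f"~{num_clips} clips -> ~{steps_per_epoch} steps per epoch")
# With these numbers: ~900 clips -> ~225 steps per epoch.
```

Larger datasets mean more steps per epoch, so the same number of epochs does proportionally more weight updates; that is one reason fixed epoch counts don't transfer cleanly between datasets of different lengths.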