Long Inference Time #142
Comments
Implementing
Has it been implemented?
How long does it take?
I think the inference times on an RTX 3090 / 4090 are actually quite acceptable. Not perfect, but acceptable. What hardware are you using?
@lifeiteng Is this planned to be added to the repo? A speedup of this magnitude would be incredible.
@RuntimeRacer I don't have time to do it.
I'll probably use a 4090. Do you know how much time it takes? I haven't run the code yet; I'm just exploring my options right now.
Hello, do you have any experience in this field? It would be incredible if you did. |
The VALL-E model is taking a very long time to generate voices. Are there any ongoing issues or PRs that work on this? Has there been any discussion of how to speed it up?
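One common source of slowness in autoregressive models like VALL-E's AR stage is recomputing attention over the entire prefix at every decoding step. A key/value cache avoids this. The toy sketch below is purely illustrative (it is not code from this repo, and the function names are hypothetical); it only counts per-step work to show why caching turns quadratic total cost into linear cost:

```python
# Toy cost model (hypothetical, not the repo's actual code): why a KV cache
# speeds up autoregressive decoding. Without caching, step t re-processes all
# t tokens generated so far; with caching, each step handles only the new token.

def decode_cost_without_cache(num_tokens: int) -> int:
    # Total work: 1 + 2 + ... + num_tokens (quadratic growth).
    return sum(t for t in range(1, num_tokens + 1))

def decode_cost_with_cache(num_tokens: int) -> int:
    # Total work: one unit per generated token (linear growth).
    return num_tokens

if __name__ == "__main__":
    n = 1000
    print(decode_cost_without_cache(n))  # 500500 work units
    print(decode_cost_with_cache(n))     # 1000 work units
```

Whether the repo already uses KV caching would need to be checked in the source; if it does not, adding it (or lowering precision with fp16/bf16 inference) is a typical first step toward faster generation.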