sh scripts/inference/inference_hunyuan.sh hangs #105
Comments
Unable to run inference.
Can you show the output of the terminal to give us more clues? Thanks a lot.
It keeps getting stuck at inference.
Me too.
H800
A800
A100, same problem.
Hi, from the screenshot above, I think you are trying to use a single 40G GPU for inference with FastHunyuan, which is not sufficient. However, I failed to reproduce the bug on my side, as it leads to an OOM error instead. Please try again with 80G of VRAM in total, which should fix the issue. We are going to release a low-precision version of FastHunyuan very soon.
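For anyone unsure whether their setup meets that requirement, here is a minimal sketch (not part of the FastVideo repo) that sums the VRAM of all visible CUDA devices with PyTorch before launching the inference script; the 80 GB threshold is an assumption taken from the comment above.

```python
# Minimal sketch: check total visible GPU VRAM before running
# sh scripts/inference/inference_hunyuan.sh. The 80 GB figure is the
# total suggested by the maintainer above, not a value from the repo.
import torch

REQUIRED_GB = 80  # assumed total requirement for FastHunyuan inference


def total_vram_gb() -> float:
    """Sum the memory of all visible CUDA devices, in gigabytes."""
    if not torch.cuda.is_available():
        return 0.0
    return sum(
        torch.cuda.get_device_properties(i).total_memory
        for i in range(torch.cuda.device_count())
    ) / 1024**3


if __name__ == "__main__":
    vram = total_vram_gb()
    print(f"Visible GPU VRAM: {vram:.1f} GB")
    if vram < REQUIRED_GB:
        print(f"Warning: less than {REQUIRED_GB} GB visible; "
              "inference may hang or run out of memory.")
```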
What about my case with 80G of VRAM?
80G A800 also gets stuck.
Can you try to use