Can't get CUDA to work. #983

Open · manus693 opened this issue Jan 8, 2025 · 2 comments

manus693 commented Jan 8, 2025

Whatever I do, I get "Torch not compiled with CUDA". I've followed the instructions and installed everything as written.

miniconda3\envs\whisperx\lib\site-packages\torch\cuda\__init__.py", line 310, in _lazy_init
raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

Running pip install whisperx installs torch without CUDA enabled. I'm running this inside the conda environment.
I'm not really sure how to get this to work; I've been trying for ages now.
It also installs torch 2.5.0, while the setup in the description has conda install 2.0.0 before the "pip install whisperx" step.
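
Here's the quick check I've been running inside the conda env to confirm whether the installed torch build actually has CUDA support (just a diagnostic sketch):

# check whether the installed torch build is CUDA-enabled
import torch

print(torch.__version__)          # the wheel version pip installed (e.g. 2.5.0)
print(torch.version.cuda)         # None on a CPU-only build
print(torch.cuda.is_available())  # False is what triggers the AssertionError above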

Is the setup in the description outdated?

Is there any difference between CPU and CUDA other than speed?
I ran whisperx on a movie with int8 enabled, and it was almost correct, but some timestamps were completely wrong, in seemingly random places.
I could not get float16 to work, so I'm trying CPU with float32 now.

I have a 4060 Ti.
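
For reference, this is roughly how I'm switching between the CPU and GPU runs; the model size, file name, and batch size are just placeholders, and the load_model arguments may differ slightly between whisperx versions:

# rough sketch of the runs I'm comparing (whisperx API may vary by version)
import whisperx

device = "cpu"            # "cuda" once torch is actually CUDA-enabled
compute_type = "float32"  # "int8" for the faster CPU run, "float16" on GPU

audio = whisperx.load_audio("movie.wav")  # placeholder file name
model = whisperx.load_model("large-v2", device, compute_type=compute_type)
result = model.transcribe(audio, batch_size=4)
print(result["segments"][:3])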

@mukhituly

+1, experiencing this issue as well


NefariousC commented Jan 12, 2025

This happened to me too. I don't know if it's a bug in pip or not, but when you install whisperx, pip somehow grabs the CPU-only version of torch instead of the CUDA version.

You can try this (bear in mind that I'm on CUDA 12). It will force-reinstall torch (with CUDA enabled) and torchaudio, along with their dependencies:

pip install torch torchaudio --index-url https://download.pytorch.org/whl/cu121 --force-reinstall --no-cache-dir
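
After that, a quick sanity check (rough sketch) that the CUDA build and cuDNN are actually being picked up:

# verify the reinstall picked up the CUDA build
import torch

print(torch.version.cuda)              # should now report a 12.x version
print(torch.cuda.is_available())       # should be True
print(torch.backends.cudnn.version())  # cuDNN build torch was compiled against
print(torch.cuda.get_device_name(0))   # e.g. your 4060 Ti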
Then, if you get this warning when using whisperx:

Warning

Could not locate cudnn_ops_infer64_8.dll. Please make sure it is in your library path!

  1. Download the cuDNN library from the NVIDIA website (for Windows).

Important

CTranslate2 4.4.0 only supports cuDNN up to 8.x, so stick to the 8.x version built for CUDA 12.

  2. Locate the files inside the downloaded zip's bin folder, then extract them into the whisperx environment's bin folder: ...\envs\whisperx\bin
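
Another option I came across (haven't tested it on every setup, and the path below is just a placeholder for wherever you unzipped cuDNN) is to add the cuDNN bin folder to the DLL search path at startup instead of copying the files:

# alternative to copying DLLs (Windows, Python 3.8+): add cuDNN's bin folder to the DLL search path
import ctypes
import os

cudnn_bin = r"C:\tools\cudnn-8.x\bin"  # placeholder: wherever you unzipped cuDNN 8.x
os.add_dll_directory(cudnn_bin)

# optional: fail early if the DLL still can't be found
ctypes.WinDLL("cudnn_ops_infer64_8.dll")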

I think there's a simpler solution somewhere, but after searching for hours, this is what I got.
