-
Recently, I have been testing the inference performance of the H20.
Answered by zhyncs, Aug 5, 2024
Replies: 3 comments, 5 replies
-
We support CUDA 12.4: `docker pull lmsysorg/sglang:v0.2.10-cu124`
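As a sketch of how that image might be used to serve a model on an H20, one could pull it and start the SGLang server inside the container. The model path, port, and shared-memory size below are illustrative assumptions, not details from this thread:

```shell
# Pull the CUDA 12.4 build of the SGLang image named in the answer.
docker pull lmsysorg/sglang:v0.2.10-cu124

# Run the server inside the container.
# --gpus all exposes the H20 GPUs to the container (requires the
# NVIDIA Container Toolkit); the model path and port are
# illustrative assumptions, not taken from this thread.
docker run --gpus all --shm-size 16g -p 30000:30000 \
    lmsysorg/sglang:v0.2.10-cu124 \
    python3 -m sglang.launch_server \
        --model-path meta-llama/Llama-2-7b-chat-hf \
        --host 0.0.0.0 --port 30000
```

Once the server is up, requests can be sent to `http://localhost:30000`.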
Answer selected by merrymercy
-
I tried using this image but am still unable to serve any model on the H20; I get this error:
-
I ran into this problem too. Have you solved it?