We did not explicitly compare against vLLM because we believe its performance is below that of TRT-LLM-FP16, which implements the same paged-attention functionality but with a faster attention kernel. Our throughput is substantially higher than TRT-LLM-FP16's.
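For context, the paged-attention bookkeeping that both engines implement can be sketched as follows: the KV cache is split into fixed-size physical blocks, and a per-sequence block table maps logical token positions to those blocks. This is a minimal illustrative sketch; the block size, function name, and layout here are assumptions, not the actual implementation of either engine.

```python
BLOCK_SIZE = 16  # tokens per physical KV-cache block (illustrative choice)

def logical_to_physical(block_table, token_idx, block_size=BLOCK_SIZE):
    """Map a token's logical position to a (physical_block, offset) pair.

    `block_table` is this sequence's list of physical block IDs, in
    logical order; hypothetical helper for illustration only.
    """
    logical_block = token_idx // block_size   # which block holds the token
    offset = token_idx % block_size           # position within that block
    return block_table[logical_block], offset

# A 40-token sequence spans 3 blocks; suppose the allocator assigned it
# physical blocks [7, 2, 9]:
block_table = [7, 2, 9]
print(logical_to_physical(block_table, 37))  # token 37 -> (9, 5)
```

Because blocks are allocated on demand and indexed through the table, sequences need not occupy contiguous memory, which is what lets both engines pack many requests into one KV cache.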
as title