Adding cascade inference to vLLM #26963
Annotations
10 errors
Ruff (E501): Line too long (limit: 80 characters) in vllm/attention/backends/flashinfer.py
- 362:81: 81 > 80
- 363:81: 105 > 80
- 368:81: 82 > 80
- 373:81: 81 > 80
- 376:81: 81 > 80
- 385:81: 100 > 80
- 388:81: 108 > 80
- 392:81: 118 > 80
- 419:81: 81 > 80
- 423:81: 100 > 80
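For reference, E501 flags any physical line longer than the configured limit (80 characters here). The usual fixes are wrapping expressions in parentheses and splitting string literals via implicit concatenation. A minimal illustration of both techniques (the actual flashinfer.py lines are not shown in this report, so the code below is a hypothetical example, not the PR's code):

```python
# Hypothetical E501 fix sketch; not the actual flashinfer.py code.

# An over-long line like this would trigger E501:
# message = "cascade inference shares the common prefix KV cache across all requests in a batch"

# Fix 1: adjacent string literals inside parentheses are concatenated
# by the parser, keeping each physical line under 80 characters.
message = (
    "cascade inference shares the common prefix KV cache "
    "across all requests in a batch"
)

def describe(backend: str, max_len: int = 80) -> str:
    # Fix 2: wrap a long call and put one argument per line.
    return "{}: every line must be at most {} characters".format(
        backend,
        max_len,
    )

print(describe("flashinfer"))
```

Both forms are behavior-preserving rewrites, so applying them clears the annotations without changing what the code does.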