
Adding cascade inference to vLLM #26963

Triggered via pull request November 19, 2024 23:31
Status Failure
Total duration 23s
ruff.yml

on: pull_request
Matrix: ruff

Annotations

10 errors
Ruff (E501): vllm/attention/backends/flashinfer.py#L362
vllm/attention/backends/flashinfer.py:362:81: E501 Line too long (81 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L363
vllm/attention/backends/flashinfer.py:363:81: E501 Line too long (105 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L368
vllm/attention/backends/flashinfer.py:368:81: E501 Line too long (82 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L373
vllm/attention/backends/flashinfer.py:373:81: E501 Line too long (81 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L376
vllm/attention/backends/flashinfer.py:376:81: E501 Line too long (81 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L385
vllm/attention/backends/flashinfer.py:385:81: E501 Line too long (100 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L388
vllm/attention/backends/flashinfer.py:388:81: E501 Line too long (108 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L392
vllm/attention/backends/flashinfer.py:392:81: E501 Line too long (118 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L419
vllm/attention/backends/flashinfer.py:419:81: E501 Line too long (81 > 80)
Ruff (E501): vllm/attention/backends/flashinfer.py#L423
vllm/attention/backends/flashinfer.py:423:81: E501 Line too long (100 > 80)
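All ten annotations are the same Ruff rule, E501, which flags any physical line longer than the configured 80-character limit in vllm/attention/backends/flashinfer.py. As a rough sketch of what the rule checks — not Ruff's actual implementation — a line-length scan looks like:

```python
def find_long_lines(source: str, limit: int = 80):
    """Return (line_number, length) for every line exceeding `limit`,
    mirroring the kind of report Ruff's E501 rule produces."""
    return [
        (lineno, len(line))
        for lineno, line in enumerate(source.splitlines(), start=1)
        if len(line) > limit
    ]

# A tiny sample with one over-long line (105 characters on line 2):
snippet = "short line\n" + "x" * 105 + "\nanother short line"
print(find_long_lines(snippet))  # → [(2, 105)]
```

The usual fix for these annotations is to wrap the offending lines using implicit continuation inside parentheses, or to split long string literals across adjacent lines.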