[Model] LoRA with lm_head and embed_tokens fully trained - 4 #26213
Annotations
8 errors and 1 warning
Mypy: vllm/lora/layers.py#L1245: Name "embedding_dim" already defined on line 1237 [no-redef]
Mypy: vllm/lora/layers.py#L1249: Name "bias" already defined on line 1241 [no-redef]
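The two [no-redef] errors usually mean the same name is annotated in two places, typically once per branch of an if/else. A minimal sketch of the pattern and its fix, with names mirroring the error messages but purely illustrative, not the actual vllm/lora/layers.py code:

```python
from typing import Optional

def init_layer(quantized: bool) -> tuple[int, Optional[list[float]]]:
    # mypy's [no-redef] fires when a name carries an annotation twice:
    #   if quantized:
    #       embedding_dim: int = 128   # defined here...
    #   else:
    #       embedding_dim: int = 64    # ..."already defined" here
    # The fix: annotate the name once up front, then plain-assign
    # in each branch.
    embedding_dim: int
    bias: Optional[list[float]]
    if quantized:
        embedding_dim = 128
        bias = None
    else:
        embedding_dim = 64
        bias = [0.0] * 64
    return embedding_dim, bias
```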
Mypy: vllm/lora/layers.py#L1269: "PunicaWrapperBase" has no attribute "bgmv_sample" [attr-defined]
Mypy: vllm/lora/layers.py#L1281: "PunicaWrapperBase" has no attribute "bgmv_embedding" [attr-defined]
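These [attr-defined] errors say mypy cannot find bgmv_sample / bgmv_embedding on the declared type PunicaWrapperBase; a common cause is calling a subclass-only method through a base-typed reference. A generic illustration with hypothetical classes, not vLLM's actual hierarchy:

```python
class Base:
    def run(self) -> str:
        return "base"

class GpuWrapper(Base):
    # Method that exists only on the subclass.
    def bgmv_sample(self) -> str:
        return "sampled"

def use(wrapper: Base) -> str:
    # Calling wrapper.bgmv_sample() directly here triggers
    # [attr-defined]: the declared type Base has no such attribute.
    # Narrow the type first (or declare the method on the base class).
    if isinstance(wrapper, GpuWrapper):
        return wrapper.bgmv_sample()
    return wrapper.run()
```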
Mypy: vllm/lora/layers.py#L1309: Signature of "set_lora" incompatible with supertype "BaseLayerWithLoRA" [override]
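The [override] error means the subclass's set_lora no longer accepts everything the BaseLayerWithLoRA declaration promises. A sketch of the rule with hypothetical, simplified signatures (not the real vLLM ones):

```python
from typing import Optional

class BaseLayer:
    def set_lora(self, index: int, lora_a: list[float]) -> None:
        raise NotImplementedError

class EmbeddingLayer(BaseLayer):
    # Adding a *required* parameter (or changing a parameter type)
    # breaks the override. Extra parameters must have defaults so the
    # subclass still accepts every call the base signature accepts.
    def set_lora(
        self,
        index: int,
        lora_a: list[float],
        embeddings: Optional[list[float]] = None,
    ) -> None:
        self.index = index
        self.lora_a = lora_a
        self.embeddings = embeddings
```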
Mypy: vllm/lora/utils.py#L141: Incompatible return value type (got "tuple[str, None, bool]", expected "tuple[str, bool, bool]") [return-value]
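The [return-value] error reports tuple[str, None, bool] where tuple[str, bool, bool] is declared, i.e. some branch returns None in a slot annotated as bool. A minimal sketch of the mismatch and fix, using a hypothetical helper rather than the actual vllm/lora/utils.py function:

```python
def parse_module_name(name: str) -> tuple[str, bool, bool]:
    # Returning (base, None, is_bias) in any branch would produce
    # exactly the reported tuple[str, None, bool] vs
    # tuple[str, bool, bool] mismatch; return a real bool everywhere.
    if name.endswith(".lora_A"):
        return name[: -len(".lora_A")], True, False
    if name.endswith(".bias"):
        return name[: -len(".bias")], False, True
    return name, False, False
```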
Mypy: vllm/lora/models.py#L154: "LoRALayerWeights" has no attribute "lora_a_pin_memory" [attr-defined]
Mypy: Process completed with exit code 1.
Warning: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636