
Add attentive layer to Jepa #1490

Triggered via pull request December 20, 2024 12:58
Status: Failure
Total duration: 4m 2s

ci_lint_py.yaml

on: pull_request

Annotations

27 errors
Lint Python / Lint: src/fairseq2/recipes/jepa/factory.py#L1
Imports are incorrectly sorted and/or formatted.
Lint Python / Lint: src/fairseq2/recipes/jepa/models.py#L1
Imports are incorrectly sorted and/or formatted.
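The first two annotations are isort/black formatting failures. A minimal sketch of the import grouping those tools expect (standard library, then third-party, then first-party, each block alphabetized); the imports shown are illustrative, not the actual contents of factory.py:

    from __future__ import annotations

    from dataclasses import dataclass, field  # standard library

    import torch                               # third-party
    from torch import Tensor

    from fairseq2.nn.transformer import TransformerEncoder  # first-party

Re-running isort (and black) over the two flagged files typically clears this check.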
Lint Python / Lint
Process completed with exit code 1.
Lint Python / Lint: src/fairseq2/nn/transformer/encoder_layer.py#L10
'collections.abc.Callable' imported but unused
Lint Python / Lint: src/fairseq2/nn/transformer/encoder_layer.py#L13
'torch.zeros' imported but unused
Lint Python / Lint: src/fairseq2/nn/transformer/encoder_layer.py#L14
'torch.nn.Parameter' imported but unused
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L584
blank line contains whitespace
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L591
blank line contains whitespace
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L594
blank line contains whitespace
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L597
blank line contains whitespace
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L599
blank line contains whitespace
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L611
blank line contains whitespace
Lint Python / Lint: src/fairseq2/recipes/jepa/factory.py#L24
'fairseq2.nn.transformer.TransformerDecoder' imported but unused
Lint Python / Lint
Process completed with exit code 1.
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L585
Function is missing a type annotation
Lint Python / Lint: src/fairseq2/nn/utils/module.py#L601
Function is missing a return type annotation
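These two are mypy complaints about untyped helpers in module.py. A minimal sketch of the shape of the fix, using a hypothetical function (the real helpers are not shown here); the point is only that every parameter and the return type carry annotations:

    from torch.nn import Module

    # Hypothetical helper: annotating the parameter and the return type
    # is what satisfies mypy's untyped-definition checks.
    def count_trainable(module: Module) -> int:
        return sum(p.numel() for p in module.parameters() if p.requires_grad)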
Lint Python / Lint: src/fairseq2/recipes/jepa/models.py#L85
Item "None" of "MultiheadAttention | None" has no attribute "model_dim"
Lint Python / Lint: src/fairseq2/recipes/jepa/models.py#L99
Incompatible types in assignment (expression has type "MultiheadAttention | None", variable has type "MultiheadAttention")
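Both of these are Optional-narrowing errors: the attention module is typed MultiheadAttention | None, so it has to be checked before .model_dim is read or before it is assigned to a non-optional attribute. A minimal sketch of one way to narrow it, using a hypothetical name pooler_attn rather than the real attribute in models.py:

    from __future__ import annotations

    from fairseq2.nn.transformer import MultiheadAttention

    def get_model_dim(pooler_attn: MultiheadAttention | None) -> int:
        # Explicitly rule out None so mypy narrows the type before the
        # attribute access.
        if pooler_attn is None:
            raise ValueError("pooler_attn must be constructed before use.")
        return pooler_attn.model_dim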
Lint Python / Lint: src/fairseq2/recipes/jepa/models.py#L260
Unexpected keyword argument "device" for "Parameter"
Lint Python / Lint: src/fairseq2/recipes/jepa/models.py#L260
Unexpected keyword argument "dtype" for "Parameter"
Lint Python / Lint: src/fairseq2/recipes/jepa/models.py#L316
Returning Any from function declared to return "Tensor"
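This last one usually means the returned expression is Any to mypy, for example the result of calling an nn.Module (whose __call__ is typed as returning Any). A minimal sketch of a common fix, with a hypothetical body; casting (or annotating an intermediate variable) makes the declared Tensor return type hold:

    from typing import cast

    from torch import Tensor
    from torch.nn import Linear

    def project(proj: Linear, seqs: Tensor) -> Tensor:
        # nn.Module.__call__ is typed as returning Any, so cast the result
        # to match the declared return type.
        return cast(Tensor, proj(seqs))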