Given script might not be compatible with latest Huggingface versions! #7

Open

NamburiSrinath opened this issue Nov 29, 2023 · 2 comments

NamburiSrinath commented Nov 29, 2023

Hi @jbcdnr, @martinjaggi

Thanks for this work, it's quite intuitive and easy to understand :)

When I try to run the provided script, I get the following error.

Code provided:

```python
from transformers import AutoModel
from collaborative_attention import swap_to_collaborative, BERTCollaborativeAdapter
import copy
import torch

model = AutoModel.from_pretrained("bert-base-cased-finetuned-mrpc")

# reparametrize the model with tensor decomposition to use collaborative heads
# decrease dim_shared_query_key to 384 for example to compress the model
collab_model = copy.deepcopy(model)
swap_to_collaborative(collab_model, BERTCollaborativeAdapter, dim_shared_query_key=768)

# check that output is not altered too much
any_input = torch.LongTensor(3, 25).random_(1000, 10000)
collab_model.eval()  # to disable dropout
out_collab = collab_model(any_input)

model.eval()
out_original = model(any_input)

print("Max l1 error: {:.1e}".format((out_collab[0] - out_original[0]).abs().max().item()))
# >>> Max l1 error: 1.9e-06

# You can evaluate the new model, fine-tune it, or save it.
# We also want to pretrain our collaborative head from scratch (if you were wondering).
```

Error message:

```
out_collab = collab_model(any_input)
TypeError: CollaborativeAttention.forward() takes from 2 to 6 positional arguments but 8 were given
```

My bet would be a version compatibility issue. Can you please help with this?
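For reference, here is a quick way to check the installed transformers version against what the repo expects (a sketch on my side; the pinned version comes from the repo's setup.py, quoted in the reply below):

```python
# Print the installed transformers version. The repo's setup.py pins
# transformers==2.11.0, so anything much newer has likely changed the
# number of positional arguments passed to the attention forward().
import transformers

print(transformers.__version__)
```

If the versions differ, downgrading in a fresh environment (`pip install "transformers==2.11.0"`) would be one way to test this hypothesis.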

Happy to share additional details if needed

@NamburiSrinath
Copy link
Author

And `pip install -U -e collaborative-attention` throws the following error:

```
ERROR: Command errored out with exit status 1:
   command: /hdd2/srinath/anaconda3/bin/python /hdd2/srinath/anaconda3/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmphchwzvz2
       cwd: /tmp/pip-install-vie9rjfw/tokenizers_91f33fd6867043c896dfeaa9e0ac37c4
  Complete output (46 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.linux-x86_64-cpython-39
  creating build/lib.linux-x86_64-cpython-39/tokenizers
  copying tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers
  creating build/lib.linux-x86_64-cpython-39/tokenizers/models
  copying tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/models
  creating build/lib.linux-x86_64-cpython-39/tokenizers/decoders
  copying tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/decoders
  creating build/lib.linux-x86_64-cpython-39/tokenizers/normalizers
  copying tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/normalizers
  creating build/lib.linux-x86_64-cpython-39/tokenizers/pre_tokenizers
  copying tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/pre_tokenizers
  creating build/lib.linux-x86_64-cpython-39/tokenizers/processors
  copying tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/processors
  creating build/lib.linux-x86_64-cpython-39/tokenizers/trainers
  copying tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/trainers
  creating build/lib.linux-x86_64-cpython-39/tokenizers/implementations
  copying tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
  copying tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
  copying tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
  copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
  copying tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
  copying tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
  copying tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers
  copying tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/models
  copying tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/decoders
  copying tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/normalizers
  copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/pre_tokenizers
  copying tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/processors
  copying tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/trainers
  running build_ext
  running build_rust
  error: can't find Rust compiler
  
  If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
  
  To update pip, run:
  
      pip install --upgrade pip
  
  and then retry package installation.
  
  If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
  ----------------------------------------
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly
```
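In case it helps: the failure is the tokenizers dependency being built from source, which needs a Rust compiler. Both workarounds below come from the error output itself (the retry command is my addition):

```bash
# Option 1: upgrade pip so it can pick up a prebuilt tokenizers wheel,
# if one exists for this Python version
pip install --upgrade pip

# Option 2: install the Rust toolchain via rustup (https://rustup.rs)
# and retry the editable install
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
pip install -U -e collaborative-attention
```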

jbcdnr (Collaborator) commented Dec 2, 2023

Hi @NamburiSrinath,

I am not surprised that this repo is not up to date with newer package versions. See setup.py for versions that were working at least at one point in time ;)

```python
    install_requires=[
        "tensorly>=0.4.5",
        "transformers==2.11.0",
        "parameterized>=0.7.4",
        "tqdm>=4.46.0",
        "wandb==0.9.2",
    ],
```
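If you want to reproduce that environment, something like the following should work (versions copied verbatim from the snippet above; using a fresh virtualenv is my assumption, not something the repo documents):

```bash
# Recreate the pinned environment from setup.py. Note that
# transformers==2.11.0 pulls in an old tokenizers release with no prebuilt
# wheels for recent Python versions, which is likely what triggers the
# Rust build error reported above.
pip install "tensorly>=0.4.5" "transformers==2.11.0" \
    "parameterized>=0.7.4" "tqdm>=4.46.0" "wandb==0.9.2"
```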
