
download weights #9

Open
YibooZhao opened this issue Oct 21, 2024 · 1 comment

Comments

@YibooZhao

I used the following code to download the weights:

from transformers import pipeline
pipe = pipeline("text2text-generation", model="omni-research/Tarsier-7b")

and then I got this error:

Traceback (most recent call last):
  File "/mnt/yrfs/yanrong/pvc-34488cf7-703b-4654-9fe8-762a747bbc58/zhaoyibo/long_video_benchmark/eval/down.py", line 3, in <module>
    pipe = pipeline("text2text-generation", model="omni-research/Tarsier-7b")
  File "/home/zhaoyibo/.conda/envs/tarsier/lib/python3.9/site-packages/transformers/pipelines/__init__.py", line 870, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/zhaoyibo/.conda/envs/tarsier/lib/python3.9/site-packages/transformers/pipelines/base.py", line 291, in infer_framework_load_model
    raise ValueError(
ValueError: Could not load model omni-research/Tarsier-7b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForSeq2SeqLM'>,). See the original errors:

while loading with AutoModelForSeq2SeqLM, an error is thrown:
Traceback (most recent call last):
  File "/home/zhaoyibo/.conda/envs/tarsier/lib/python3.9/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/home/zhaoyibo/.conda/envs/tarsier/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 569, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.llava.configuration_llava.LlavaConfig'> for this kind of AutoModel: AutoModelForSeq2SeqLM.
Model type should be one of BartConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, EncoderDecoderConfig, FSMTConfig, GPTSanJapaneseConfig, LEDConfig, LongT5Config, M2M100Config, MarianConfig, MBartConfig, MT5Config, MvpConfig, NllbMoeConfig, PegasusConfig, PegasusXConfig, PLBartConfig, ProphetNetConfig, SeamlessM4TConfig, SeamlessM4Tv2Config, SwitchTransformersConfig, T5Config, UMT5Config, XLMProphetNetConfig.

What should I do?

@jwwang424
Collaborator

Try removing the "text2text-generation" task argument. Tarsier-7b uses a LlavaConfig, which is not one of the seq2seq configurations that the text2text-generation pipeline accepts, so forcing that task makes the pipeline try to load it with AutoModelForSeq2SeqLM and fail.
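A minimal sketch of the suggested call, assuming the only change needed is dropping the task argument so the pipeline infers the task from the model's own config (running this downloads the full Tarsier-7b weights):

```python
from transformers import pipeline

# Pass only the model id; without an explicit task, the pipeline
# reads the task from the model's config on the Hub instead of
# forcing "text2text-generation" (which requires a seq2seq config).
pipe = pipeline(model="omni-research/Tarsier-7b")
```

If the goal is only to download the weights rather than run inference, `huggingface_hub.snapshot_download("omni-research/Tarsier-7b")` fetches the files without instantiating the model at all.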
