
Upstream model implementation to vLLM #20

Closed
ywang96 opened this issue Oct 11, 2024 · 2 comments

ywang96 commented Oct 11, 2024

Hello! This is Roger from the vLLM team!

It's great to see that you're using the out-of-tree (OOT) model registration functionality to get this model supported on vLLM! Do you plan to open a PR in the upstream repository so that we can have this model officially supported?

If you don't have the bandwidth to make a PR, I'm happy to make one and list the relevant team members as co-authors. Let me know!
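
(For context, vLLM's out-of-tree registration generally follows the pattern sketched below. The import path and class name used for Aria here are assumptions for illustration only, not necessarily what this repository actually ships.)

```python
# Minimal sketch of vLLM's out-of-tree (OOT) model registration, assuming a
# hypothetical aria.vllm.aria.AriaForConditionalGeneration implementation.
from vllm import LLM, ModelRegistry

from aria.vllm.aria import AriaForConditionalGeneration  # hypothetical import path

# Map the architecture string from the HF config to the custom implementation
# so vLLM can resolve it when loading the checkpoint.
ModelRegistry.register_model("AriaForConditionalGeneration", AriaForConditionalGeneration)

# After registration, the model loads like any built-in vLLM model
# (model id shown for illustration).
llm = LLM(model="rhymes-ai/Aria", trust_remote_code=True)
```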

aria-hacker (Collaborator) commented

@ywang96 Thanks for reaching out! I don't have much bandwidth right now, so I'd really appreciate your offer to create the PR.

Let me know what information you need from me to get started. I'm happy to provide details about the implementation or to review the PR once it's ready.

xffxff added the vllm label on Nov 11, 2024
xffxff (Collaborator) commented Nov 22, 2024

I'm working on this and have submitted a pull request: vllm-project/vllm#10514.

xffxff closed this as completed on Nov 26, 2024