[Misc] Enhance offline_inference to support user-configurable paramet… #10392
Conversation
👋 Hi! Thank you for contributing to the vLLM project. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.
Force-pushed from 603f180 to 3c27e97
Thanks for this enhancement! However, since this is an example script that introduces the usage of the LLM entrypoint for offline inference, we should keep it as clean and simple as possible.
Therefore, we should expose only arguments that are in common use or that highlight vLLM features (like tensor parallelism, chunked prefill, and prefix caching).
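To illustrate the reviewer's suggestion, here is a minimal sketch of such an example script: it exposes only a handful of commonly used vLLM options (tensor parallelism, chunked prefill, prefix caching). The constructor parameter names (`tensor_parallel_size`, `enable_chunked_prefill`, `enable_prefix_caching`) are real vLLM `LLM` arguments, but the CLI layout and defaults here are illustrative, not the exact code merged in this PR.

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Expose only a few commonly used vLLM arguments, per the review."""
    parser = argparse.ArgumentParser(description="vLLM offline inference example")
    parser.add_argument("--model", default="facebook/opt-125m")
    parser.add_argument("--tensor-parallel-size", type=int, default=1)
    parser.add_argument("--enable-chunked-prefill", action="store_true")
    parser.add_argument("--enable-prefix-caching", action="store_true")
    return parser


def main() -> None:
    args = build_parser().parse_args()

    # Deferred import so argument parsing works without a GPU environment.
    from vllm import LLM, SamplingParams

    llm = LLM(
        model=args.model,
        tensor_parallel_size=args.tensor_parallel_size,
        enable_chunked_prefill=args.enable_chunked_prefill,
        enable_prefix_caching=args.enable_prefix_caching,
    )
    sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
    outputs = llm.generate(["Hello, my name is"], sampling_params)
    for output in outputs:
        print(output.outputs[0].text)


if __name__ == "__main__":
    main()
```

Keeping the flag set this small preserves the script's role as an introductory example while still letting users try vLLM's headline features from the command line.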
…ers (vllm-project#10391) Signed-off-by: wchen61 <[email protected]>
Force-pushed from eab5c7b to 28d6d60
Overall LGTM, just left some nits.
Signed-off-by: wchen61 <[email protected]>
LGTM!
vllm-project#10392) Signed-off-by: wchen61 <[email protected]> Signed-off-by: Linkun Chen <[email protected]>
FIX #10391