Issues: InftyAI/llmlite
- Support fall back across several providers
  Labels: feature (categorizes issue or PR as related to a new feature), important-longterm (important over the long term, but may not be staffed and/or may need multiple releases to complete)
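The fallback feature requested above could look roughly like the sketch below: try each provider in order and return the first successful completion. The names here (`complete_with_fallback`, the provider callables) are illustrative assumptions, not llmlite APIs.

```python
# Hypothetical sketch of falling back across several providers.
# `providers` is a list of callables taking a prompt and returning text;
# each failure is recorded and the next provider is tried.
def complete_with_fallback(prompt, providers):
    last_err = None
    for provider in providers:
        try:
            return provider(prompt)
        except Exception as err:
            last_err = err
    if last_err is None:
        raise RuntimeError("no providers given")
    raise last_err
```

A real implementation would likely also distinguish retryable errors (timeouts, rate limits) from permanent ones (invalid API key) before falling through to the next provider.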
- Support counting tokens (#59, opened Jul 1, 2024 by kerthcet)
  Label: feature
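As a rough illustration of the token-counting request, the sketch below approximates counts with whitespace splits; an accurate count would need the target model's own tokenizer (e.g. loaded via `transformers`' `AutoTokenizer`). `count_tokens` is a hypothetical name, not an llmlite API.

```python
# Hypothetical sketch: approximate token counting for a chat prompt.
# Whitespace splitting is only a crude stand-in for a real tokenizer,
# which would yield model-specific subword counts.
def count_tokens(messages):
    return sum(len(m["content"].split()) for m in messages)

msgs = [{"role": "user", "content": "Hello there, how are you?"}]
print(count_tokens(msgs))  # 5
```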
- vLLM not working as expected with ChatGLM2 (#55, opened Dec 27, 2023 by kerthcet)
  Label: bug (categorizes issue or PR as related to a bug)
- Support serving via HTTP/RPC server (#40, opened Nov 28, 2023 by kerthcet)
  Label: enhancement (new feature or request)
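A minimal HTTP-serving sketch for the request above, using only the standard library; the `complete` stub and the JSON request shape (`{"prompt": ...}`) are assumptions for illustration, not llmlite's actual interface.

```python
# Hypothetical sketch: exposing a completion function over HTTP.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def complete(prompt):
    # Stub standing in for a real completion call.
    return f"echo: {prompt}"

class CompletionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body, run the completion, and reply as JSON.
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        reply = complete(body.get("prompt", ""))
        payload = json.dumps({"completion": reply}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To serve: HTTPServer(("127.0.0.1", 8080), CompletionHandler).serve_forever()
```

A production server would more likely use an async framework and add batching, streaming, and auth, but the request/response shape would be similar.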
- Support serving fine-tuning layers easily (#39, opened Nov 28, 2023 by kerthcet)
  Label: enhancement
- Support text-generation-inference (#35, opened Nov 28, 2023 by kerthcet)
  Label: enhancement
- Add codeLlama example (#33, opened Nov 28, 2023 by kerthcet)
  Labels: documentation (categorizes issue or PR as related to documentation), good first issue (good for newcomers)
- Support system_prompt can be empty (#31, opened Nov 11, 2023 by kerthcet)
  Label: good first issue