Continuous Batching #70

Closed
Meowmix42069 opened this issue Jul 27, 2024 · 1 comment
Labels
duplicate This issue or pull request already exists

Comments

@Meowmix42069

Hello, I am a user of an app derived from llama.rn, and I am wondering why continuous batching is not included in your implementation. As I understand it, continuous batching should be enabled by default for all server launches. What would be the easiest way to implement this necessary feature?

@jhen0409
Member

Is this what you mean? https://github.com/ggerganov/llama.cpp/blob/2b1f616b208a4a21c4ee7a7eb85d822ff1d787af/examples/server/README.md?plain=1#L162-L167

If so, we already have #30. Also, since we've recently needed this internally as well, I'm sure we'll be supporting it soon.
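For reference, the linked README section documents the llama.cpp server's continuous batching option (the `-cb, --cont-batching` flag). Below is a minimal sketch of what exposing it through llama.rn might look like once supported; the `cont_batching` context parameter is an assumption for illustration only and is not part of the released llama.rn API (see #30 for the actual work).

```ts
// Hypothetical sketch only: the `cont_batching` parameter is an assumption,
// mirroring llama.cpp server's `-cb, --cont-batching` flag; it is not part
// of the released llama.rn API.
import { initLlama } from 'llama.rn';

async function run() {
  const context = await initLlama({
    model: '/path/to/model.gguf', // example model path
    n_ctx: 4096,
    cont_batching: true, // assumed flag, not yet supported in llama.rn
  });

  // With continuous batching, concurrent completion requests could share
  // the same decode loop instead of waiting for each other to finish.
  const result = await context.completion({ prompt: 'Hello', n_predict: 64 });
  console.log(result.text);
}

run();
```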

jhen0409 added the duplicate label Jul 28, 2024