I noticed that when chatting with the assistant, the prompt token count is always capped below 8192. How can I increase the max input prompt tokens so that more content fits into {knowledge}? Many LLM context windows now far exceed 8K. Thank you! @KevinHuSh
Answered by KevinHuSh, Nov 13, 2024
The context length depends on the LLM you've chosen. Choose another LLM that supports a longer context.
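
A quick way to verify this is to count the prompt's tokens and compare against the chosen model's context window before sending. The following is a minimal sketch, not RAGFlow's actual code; the context limits in the dict are illustrative assumptions, so check your provider's docs for real values.

```python
# Minimal sketch: check whether a prompt fits a model's context window.
# The limits below are illustrative assumptions, not authoritative values.
import tiktoken

CONTEXT_LIMITS = {
    "gpt-3.5-turbo": 16_385,   # assumed limit, verify with your provider
    "gpt-4o": 128_000,         # assumed limit, verify with your provider
}

def fits_context(prompt: str, model: str, reserved_for_output: int = 1024) -> bool:
    """Return True if the prompt leaves enough room for the completion."""
    enc = tiktoken.get_encoding("cl100k_base")  # encoding used by recent OpenAI models
    n_tokens = len(enc.encode(prompt))
    return n_tokens + reserved_for_output <= CONTEXT_LIMITS[model]

print(fits_context("your assembled {knowledge} prompt here", "gpt-4o"))
```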
I got it.
There should be a field to fill in the context length when adding an LLM.
We're going to refine it later.
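
To illustrate what such a field might look like, here is a purely hypothetical sketch of registering a custom LLM with a user-supplied context length. The key names, including `max_context_tokens`, are invented for this example and are not RAGFlow's actual schema.

```python
# Hypothetical illustration only: a per-model "context length" field
# supplied when adding a custom LLM, instead of a hard-coded 8192.
# None of these keys are RAGFlow's real configuration schema.
custom_llm = {
    "name": "my-long-context-model",
    "api_base": "https://example.com/v1",       # placeholder endpoint
    "max_context_tokens": 32_768,               # user-supplied limit
}
```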