Replies: 1 comment
-
Hey @jaruan! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this together! Automatic prompt generation for agent chat is influenced by specific conditions and configurations in the codebase. There is also a known limitation related to token calculation: if the remaining number of tokens is insufficient according to the model's context token size limit and max token size limit, an exception is raised [3]. These conditions and limitations could affect the output and functionality of automatic prompt generation for agent chat.
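The token-calculation limitation can be sketched roughly like this (a minimal illustration only; the function and parameter names here are assumptions, not the project's actual API):

```python
# Hypothetical sketch of the token-calculation limitation described above.
# Names like context_token_limit and max_tokens are illustrative assumptions.

def check_remaining_tokens(prompt_tokens: int,
                           context_token_limit: int,
                           max_tokens: int) -> int:
    """Return the tokens left for the completion, or raise if insufficient.

    prompt_tokens       -- tokens already consumed by the generated prompt
    context_token_limit -- the model's total context window size
    max_tokens          -- the requested maximum completion size
    """
    remaining = context_token_limit - prompt_tokens
    if remaining < max_tokens:
        # This mirrors the exception the answer mentions: the prompt leaves
        # too few tokens to satisfy the requested max token size.
        raise ValueError(
            f"Insufficient tokens: prompt uses {prompt_tokens}, "
            f"leaving {remaining}, but max_tokens={max_tokens} was requested."
        )
    return remaining
```

With a sketch like this, a long auto-generated prompt for agent chat could push `prompt_tokens` high enough that the check fails, which would explain why generation appears unsupported or broken in that mode.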
-
I noticed that automatic prompt generation doesn't support agent chat. Could something be affecting the output?