[openai_api_compatible] Error: API request failed #11786
Replies: 4 comments
-
@lijiaqi0121 Hello there! I'm here to help you with bug reports, questions, and becoming a contributor. Let's tackle this issue together while we wait for a human maintainer. The 400 error you're encountering when using the openai-api-compatible provider can have several causes. Here are some steps you can take to resolve this issue:
If these steps do not resolve the issue, consider checking for any updates or patches that might address this problem in the latest version of Dify. To continue talking to Dosu, mention @dosu.
-
@dosu How can I view the complete error message?
-
To view the complete error message, you can try the following steps:
These steps should help you capture the full error message, providing more context for troubleshooting. To continue talking to Dosu, mention @dosu.
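Another way to capture the full error body is to replay the request against the model's OpenAI-compatible endpoint directly, outside of Dify, and print the 400 response body. Below is a minimal sketch: the base URL, API key, and payload are placeholders for your own serving setup (not Dify's actual internal request), and the key point is that `urllib` exposes the error response body via `HTTPError.read()`:

```python
# Sketch: replay a chat-completions request outside the Dify UI so the
# full 400 response body is visible. URL, key, and payload below are
# illustrative assumptions -- substitute your own endpoint and model.
import json
import urllib.error
import urllib.request


def post_chat(base_url: str, payload: dict, api_key: str = "") -> tuple[int, str]:
    """POST a chat-completions request and return (status, body),
    including the body of a 4xx/5xx error response."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.status, resp.read().decode("utf-8")
    except urllib.error.HTTPError as e:
        # The useful part: a 400's *body* usually explains what was rejected.
        return e.code, e.read().decode("utf-8")


# Example usage (assumed local endpoint; adjust to your deployment):
#   status, body = post_chat(
#       "http://localhost:8000/v1",
#       {"model": "glm-4-9b-chat",
#        "messages": [{"role": "user", "content": "hello"}]},
#   )
#   print(status, body)
```

Comparing the body returned here with what Dify's agent sends (e.g. a `tools` field the backend does not accept) can narrow down which part of the request triggers the 400.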
-
@lijiaqi0121 You can see the complete log in the dify api container.
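A quick sketch of that workflow, assuming a stock Docker Compose deployment where the service is named `api` (adjust the name if your compose file differs; the sample log line below is made up for illustration, not Dify's actual log format):

```shell
# Tail the api container's logs while reproducing the agent run:
#   docker compose logs -f --tail=200 api
#
# Then filter the output for the failing request. The grep below runs
# against a fabricated sample line just to show the filtering step:
sample_log='ERROR openai_api_compatible: API request failed, status 400'
echo "$sample_log" | grep -i '400'
```

The full traceback printed around the matching line usually contains the upstream response body that the message box in the UI truncates.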
-
Self Checks
Dify version
0.13.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
I added glm-4-9b-chat as an openai-api-compatible model and enabled tool calling and stream function calling support. However, when debugging the agent I received a 400 error, and because the message box truncates its output I was unable to see the complete error information.
What is the cause of the problem, and how should I go about resolving it?
✔️ Expected Behavior
I want to set the agent mode to function calling and use the agent normally.
❌ Actual Behavior
When I debug the agent, it reports a 400 error.