fix: o1 model error, use max_completion_tokens instead of max_tokens. #12037
Summary
fix: o1 model error. When the max_tokens switch is enabled, requests to o1 models fail with a parameter error; use max_completion_tokens instead of max_tokens for these models.
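For illustration only, a minimal sketch of the idea behind the fix (not the actual Dify diff): remap max_tokens to max_completion_tokens before sending the request when the target is an o1-family model. The model names and helper function below are assumptions made for the example.

```python
from typing import Any, Dict

# Assumed o1-family identifiers; the real provider may detect these differently.
O1_MODELS = {"o1", "o1-mini", "o1-preview"}


def adapt_completion_params(model: str, params: Dict[str, Any]) -> Dict[str, Any]:
    """Return a copy of `params` that is safe to pass to the completions API."""
    adapted = dict(params)
    if model in O1_MODELS and "max_tokens" in adapted:
        # o1 models reject 'max_tokens' with a 400 'unsupported_parameter' error
        # and expect 'max_completion_tokens' instead.
        adapted["max_completion_tokens"] = adapted.pop("max_tokens")
    return adapted


# Example: prints {'temperature': 1, 'max_completion_tokens': 1024}
print(adapt_completion_params("o1-preview", {"temperature": 1, "max_tokens": 1024}))
```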
Tip
Close issue syntax: Fixes #<issue number> or Resolves #<issue number>, see documentation for more details.
Screenshots
{"event": "error", "conversation_id": "3106d28d-5807-43e5-8a18-107fbd5f623c", "message_id": "4fc7a62a-9b6a-486f-85d5-9bb4f4ee97f6", "created_at": 1735021270, "code": "completion_request_error", "status": 400, "message": "[azure_openai:llm:o1-preview] Bad Request Error, Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}"}
Checklist
Important
Please review the checklist below before submitting your pull request.
dev/reformat (backend) and cd web && npx lint-staged (frontend) to appease the lint gods