[Bugfix] Fix the default value for temperature in ChatCompletionRequest (#11219)
yansh97 authored Dec 16, 2024
1 parent 69ba344 commit 17138af
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion vllm/entrypoints/openai/protocol.py
@@ -211,7 +211,7 @@ class ChatCompletionRequest(OpenAIBaseModel):
     stop: Optional[Union[str, List[str]]] = Field(default_factory=list)
     stream: Optional[bool] = False
     stream_options: Optional[StreamOptions] = None
-    temperature: Optional[float] = 0.7
+    temperature: Optional[float] = 1.0
     top_p: Optional[float] = 1.0
     tools: Optional[List[ChatCompletionToolsParam]] = None
     tool_choice: Optional[Union[Literal["none"], Literal["auto"],
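
The change only touches the Pydantic field default, so a chat completion request that omits "temperature" now samples at 1.0 (matching OpenAI's documented default) instead of 0.7. A minimal sketch of the behavior, using a stand-in model rather than the real vLLM class:

from typing import Optional
from pydantic import BaseModel

# Stand-in for ChatCompletionRequest (illustration only, not the vLLM class).
class RequestSketch(BaseModel):
    temperature: Optional[float] = 1.0  # was 0.7 before this commit
    top_p: Optional[float] = 1.0

# A request body that omits "temperature" picks up the new 1.0 default.
req = RequestSketch.model_validate({"top_p": 0.9})
print(req.temperature)  # 1.0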
