[Bugfix]: Clamp -inf logprob values in prompt_logprobs (vllm-project#11073)

Signed-off-by: Rafael Vasquez <[email protected]>
rafvasq authored Dec 11, 2024
1 parent 2e32f5d commit 40766ca
Showing 1 changed file with 6 additions and 0 deletions.
vllm/entrypoints/openai/serving_completion.py
@@ -392,6 +392,12 @@ def request_output_to_completion_response(
     prompt_token_ids = final_res.prompt_token_ids
     assert prompt_token_ids is not None
     prompt_logprobs = final_res.prompt_logprobs
+    if prompt_logprobs:
+        for logprob_dict in prompt_logprobs:
+            if logprob_dict:
+                for logprob_values in logprob_dict.values():
+                    if logprob_values.logprob == float('-inf'):
+                        logprob_values.logprob = -9999.0
     prompt_text = final_res.prompt
 
     token_ids: GenericSequence[int]
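
The six added lines walk every per-position logprob dict in prompt_logprobs and clamp float('-inf') to the finite sentinel -9999.0. Below is a minimal sketch, not vLLM's code, of the likely motivation: strict JSON has no encoding for -Infinity, so an unclamped value would break serialization of the OpenAI-compatible completion response. The clamp_neg_inf helper and the sample data are hypothetical.

# A minimal sketch (not vLLM's code) of why the clamp matters: strict JSON
# cannot represent -Infinity, so logprobs must be finite before serializing.
import json

def clamp_neg_inf(logprob, sentinel=-9999.0):
    # Mirrors the commit's clamp: swap -inf for a finite sentinel value.
    return sentinel if logprob == float('-inf') else logprob

# Hypothetical prompt logprobs: the first position carries no logprob
# (hence None), and a zero-probability token yields -inf.
prompt_logprobs = [None, {42: -0.5}, {7: float('-inf')}]

try:
    json.dumps(float('-inf'), allow_nan=False)  # strict encoders reject -inf
except ValueError as err:
    print(f"strict JSON rejects -inf: {err}")

clamped = [{tok: clamp_neg_inf(lp) for tok, lp in d.items()} if d else d
           for d in prompt_logprobs]
print(json.dumps(clamped))  # [null, {"42": -0.5}, {"7": -9999.0}]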
