
Stop streaming if LiteLLM provides a finish_reason #1350


Merged

Conversation

FlorianVal (Contributor)

Issue #1349.

Sometimes LiteLLM sends a final message with finish_reason when generation is done.

Example:

StreamingChoices(finish_reason='stop', index=0, delta=Delta(provider_specific_fields=None, content=None, role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)

The code in generate_stream only checks whether the key delta is None. But when finish_reason is provided, there may still be a Delta object under the key delta, with content set to None. In that case we should not append the None to the generation; we should simply stop generating.
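For illustration, here is a minimal, runnable sketch of the intended behaviour. The Delta, StreamingChoice, and Event classes below are simplified stand-ins for LiteLLM's streaming objects (not the real classes), and stream_text is a hypothetical helper; the branching mirrors the fix: stop when a finish_reason arrives with empty content, otherwise yield the content.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Delta:
    # Simplified stand-in for LiteLLM's Delta: only the content field matters here.
    content: Optional[str] = None

@dataclass
class StreamingChoice:
    # Simplified stand-in for StreamingChoices.
    delta: Optional[Delta] = None
    finish_reason: Optional[str] = None

@dataclass
class Event:
    choices: List[StreamingChoice] = field(default_factory=list)

def stream_text(events):
    # Yield text chunks; stop when the provider signals completion.
    for event in events:
        choice = event.choices[0]
        if choice.delta is None or choice.delta.content is None:
            if getattr(choice, "finish_reason", None):
                break  # final chunk: finish_reason is set, content is None
            raise ValueError(f"No content in event: {event}")
        yield choice.delta.content

events = [
    Event([StreamingChoice(delta=Delta(content="Hello "))]),
    Event([StreamingChoice(delta=Delta(content="world"))]),
    Event([StreamingChoice(delta=Delta(content=None), finish_reason="stop")]),
]
print("".join(stream_text(events)))  # prints "Hello world" and stops cleanly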

FlorianVal (Contributor, Author)

Forgot to run make style, sorry!

albertvillanova (Member) left a comment

Thanks! Good fixes for streaming!

-if event.choices[0].delta is None:
-    if not getattr(event.choices[0], "finish_reason", None):
+if event.choices[0].delta is None or event.choices[0].delta.content is None:
+    if getattr(event.choices[0], "finish_reason", None):
albertvillanova (Member)

I remember having raised my concerns about a related issue before: #1236 (comment)

I'm wondering whether event.choices[0].delta can be None or whether it is always a class instance.

albertvillanova (Member), May 19, 2025

Do you know if LiteLLM can return a None event.choices[0].delta when streaming?

FlorianVal (Contributor, Author)

I kept the check if event.choices[0].delta is None in case LiteLLM returns no Delta instance, but I'm not sure that is possible.

FlorianVal (Contributor, Author)

LiteLLM always returns a Delta object because of this block. I will remove the check event.choices[0].delta is None.

albertvillanova (Member)

Well seen!

Comment on lines 1062 to 1068
if event.choices[0].delta is None or event.choices[0].delta.content is None:
    if getattr(event.choices[0], "finish_reason", None):
        break  # Stop streaming if a finish reason is provided
    else:
        raise ValueError(f"No content or tool calls in event: {event}")
else:
    yield ChatMessageStreamDelta(
albertvillanova (Member), May 19, 2025

And what about simplifying the logic even further?

Suggested change:
-if event.choices[0].delta is None or event.choices[0].delta.content is None:
-    if getattr(event.choices[0], "finish_reason", None):
-        break  # Stop streaming if a finish reason is provided
-    else:
-        raise ValueError(f"No content or tool calls in event: {event}")
-else:
-    yield ChatMessageStreamDelta(
+if event.choices[0].delta and event.choices[0].delta.content:
+    yield ChatMessageStreamDelta(

FlorianVal (Contributor, Author)

Nice one, added that in the last commit along with removing the check on event.choices[0].delta.

albertvillanova (Member) left a comment

Thanks again! Much better now!

@albertvillanova albertvillanova merged commit 75c2ce1 into huggingface:main May 19, 2025
3 checks passed
@FlorianVal FlorianVal deleted the fix/error_liteLLM_return_None branch May 20, 2025 07:11
Successfully merging this pull request may close these issues.

[BUG] When planning LiteLLM can send a message after the termination message