Add streaming chat completions #31
Conversation
✅ This pull request was sent to the PullRequest network.
@bakks you can click here to see the review status or cancel the code review job.
PullRequest Breakdown
Reviewable lines of change: +105 / −0 (100% Go)
Type of change: Feature - These changes are adding a new feature or improvement to existing code.
Changes look good to add the ChatCompletion calls. One minor question noted, but it's more for future diagnostics rather than anything substantial.
Reviewed with ❤️ by PullRequest
```go
// make sure there isn't any extra whitespace before or after
line = bytes.TrimSpace(line)
// the completion API only returns data events
if !bytes.HasPrefix(line, dataPrefix) {
```
I believe this line is here partly to filter out empty lines in the streaming response, of which there are many, so I don't think it makes sense to add extra logging here.
Some background context - the OpenAI documentation on the streaming flag is:

> Whether to stream back partial progress. If set, tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a `data: [DONE]` message.

What I saw when debugging the responses is that every bit of data indeed had the `data:` prefix, except for the completely empty lines. So this line is exercised during normal operation, and the JSON unmarshalling will take care of throwing an error if there's line corruption.
Please merge when ready, I don't have write access.
I've rebased and pushed after the merge of #30, so this branch should only show my own changes now. Merge when ready.
Thanks for the streaming support for this one too. I'll go ahead and merge this in.
Hey all - I'm just piggybacking on #30 and adding streaming support. I recommend that you merge @abatilo's code first so that my additions are clearer. And please feel free to edit this PR yourself.