Switch to using pointers for omitempty number fields in ChatCompletionRequest #33
Conversation
…nRequest Signed-off-by: Yoriyasu Yano <430092+yorinasub17@users.noreply.github.com>
✅ This pull request was sent to the PullRequest network.
PullRequest Breakdown
Reviewable lines of change: +1 / -1 (100% Go)
Type of change: Minor Update - These changes appear to be a minor update to existing functionality and features.
Looks good to update the model with potentially optional values; no issues noted.
Reviewed with ❤️ by PullRequest
thanks for debugging and fixing this issue. we may want to change only the temperature one for now to maintain better backwards compatibility, although it's open to discussion. from what I can see, temperature is the only one that will have any functional change right now, but the other 3 will require developers to update their code when updating the library.
models.go
Outdated
@@ -68,10 +68,10 @@ type ChatCompletionRequest struct {
 	MaxTokens int `json:"max_tokens,omitempty"`

 	// (-2, 2) Penalize tokens that haven't appeared yet in the history.
-	PresencePenalty float32 `json:"presence_penalty,omitempty"`
+	PresencePenalty *float32 `json:"presence_penalty,omitempty"`
it looks like PresencePenalty and FrequencyPenalty default to 0 in the API, so we could potentially leave these unchanged to avoid breaking existing code, although I do agree it would be more appropriate not to override any defaults the API itself might change later.
@@ -50,13 +50,13 @@ type ChatCompletionRequest struct {
 	Messages []ChatCompletionRequestMessage `json:"messages"`

 	// What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic
-	Temperature float32 `json:"temperature,omitempty"`
+	Temperature *float32 `json:"temperature,omitempty"`
great catch, thanks for fixing this!
models.go
Outdated
 	// Number of responses to generate
-	N int `json:"n,omitempty"`
+	N *int `json:"n,omitempty"`
a value of 0 for N is actually invalid and will result in an error from OpenAI, so we can potentially leave this one alone; that way it also doesn't break any existing users of this library.
Ah, makes sense! Perhaps when backwards-incompatible changes can be introduced (maybe when the big Engine function deprecation happens?) the same parameter on CompletionRequest should be updated to be a raw number?
Signed-off-by: Yoriyasu Yano <430092+yorinasub17@users.noreply.github.com>
I think the reasoning and comments for keeping the other parameters the same make sense to me. I went ahead and reverted the other parameters, keeping the change only for temperature. PTAL!
thanks for the updates here, looks good!
Fixes #32

This updates the following fields in the ChatCompletionRequest to use pointers instead of raw numbers (bringing it in line with CompletionRequest). Since each of these uses omitempty on the json tag, raw numbers make it impossible to set them to the 0 value: the marshaller would omit the field, causing it to fall back to the OpenAI API defaults.