
Conversation

yorinasub17
Contributor

@yorinasub17 yorinasub17 commented Mar 20, 2023

Fixes #32

This updates the following fields in ChatCompletionRequest to use pointers instead of raw numbers (bringing it in line with CompletionRequest). Because each of these fields uses omitempty on its JSON tag, a raw number makes it impossible to set the field to 0: the marshaller omits the zero value, so the field falls back to the OpenAI API defaults.

…nRequest

Signed-off-by: Yoriyasu Yano <430092+yorinasub17@users.noreply.github.com>

@pullrequest pullrequest bot left a comment


✅ This pull request was sent to the PullRequest network.


@yorinasub17 you can click here to see the review status or cancel the code review job.


@pullrequest pullrequest bot left a comment


PullRequest Breakdown

Reviewable lines of change

+ 1
- 1

100% Go

Type of change

Minor Update - These changes appear to be a minor update to existing functionality and features.


@pullrequest pullrequest bot left a comment


Looks good to update the model with potentially optional values; no issues noted.

David K


Reviewed with ❤️ by PullRequest

Contributor

@tylermann tylermann left a comment


Thanks for debugging and fixing this issue. We may want to change only the temperature field for now, just to maintain better backwards compatibility, although it's open to discussion. Right now, temperature is the only one whose behavior actually changes as far as I can see; the other three would simply require developers to update their code when upgrading the library.

models.go Outdated
@@ -68,10 +68,10 @@ type ChatCompletionRequest struct {
MaxTokens int `json:"max_tokens,omitempty"`

// (-2, 2) Penalize tokens that haven't appeared yet in the history.
-	PresencePenalty float32 `json:"presence_penalty,omitempty"`
+	PresencePenalty *float32 `json:"presence_penalty,omitempty"`
Contributor


It looks like PresencePenalty and FrequencyPenalty default to 0 in the API, so we could potentially leave these unchanged to avoid breaking existing code, although I agree it would be more appropriate not to override any defaults the API itself changes later.

@@ -50,13 +50,13 @@ type ChatCompletionRequest struct {
Messages []ChatCompletionRequestMessage `json:"messages"`

// What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic
-	Temperature float32 `json:"temperature,omitempty"`
+	Temperature *float32 `json:"temperature,omitempty"`
Contributor


great catch, thanks for fixing this!

models.go Outdated

// Number of responses to generate
-	N int `json:"n,omitempty"`
+	N *int `json:"n,omitempty"`
Contributor


A value of 0 for N is actually invalid and results in an error from OpenAI, so we can potentially leave this one alone. That way it also doesn't break any existing users of this library.

Contributor Author


Ah, makes sense! Perhaps when backwards-incompatible changes can be introduced (maybe when the big Engine function deprecation happens?), the same parameter on CompletionRequest should be updated to be a raw number?

Signed-off-by: Yoriyasu Yano <430092+yorinasub17@users.noreply.github.com>
@yorinasub17
Contributor Author

yorinasub17 commented Mar 20, 2023

> thanks for debugging and fixing this issue. we may want to only change the temperature one for now just to maintain better backwards compatibility, although its open to discussion. mainly right now the temperature is the only one that will have any functional change from what I can see, but the other 3 will require developers to update their code when updating the library.

The reasoning and comments for keeping the other parameters the same make sense to me. I went ahead and reverted the other parameters, keeping the pointer only for Temperature. PTAL!

Contributor

@tylermann tylermann left a comment


thanks for the updates here, looks good!

@tylermann tylermann merged commit f4f8f0f into PullRequestInc:main Mar 20, 2023
@yorinasub17 yorinasub17 deleted the bug/chat-completion-temperaturezero branch March 22, 2023 03:42
Development

Successfully merging this pull request may close these issues.

Temperature 0 not really working out