Add stream_options/include_usage only for openai models #15615
Conversation
It works for Universal, but with Coder I get: `missing choices[0].tool_calls[0].type`

```json
{"id":"ff4ab3b8a5b3442e9ffb9c2c2a5db8ed","object":"chat.completion.chunk","created":1747915081,"model":"mistral-large-latest","choices":[{"finish_reason":"tool_calls","index":0,"message":{"role":"assistant","tool_calls":[{"id":"7n3INi6di","function":{"name":"getFileContent","arguments":"{\"file\": \"packages/ai-mcp/src/common/mcp-server-manager.ts\"}"}}]},"logprobs":null}],"usage":{"prompt_tokens":2267,"total_tokens":2307,"completion_tokens":40}}
```
```typescript
(options.body as { messages: Array<ChatCompletionMessageParam> }).messages.forEach(m => {
    if (m.role === 'assistant' && m.tool_calls) {
        // this is optional and thus should be undefined
        // eslint-disable-next-line no-null/no-null
        if (m.refusal === null) {
            m.refusal = undefined;
        }
        // this is optional and thus should be undefined
        // eslint-disable-next-line no-null/no-null
        if ((m as unknown as { parsed: null | undefined }).parsed === null) {
            (m as unknown as { parsed: null | undefined }).parsed = undefined;
        }
    }
});
```
We're overriding `prepareOptions` here. Do we lose the original behavior of `prepareOptions` then? Why do we need this customization at all? Is this for Mistral? Some more comments on why this is done would be nice.
The Mistral models worked for me with this PR.
During testing I ran into the same issue as on master: #15646
dd8efef to ff53773 (Compare)
Works for me!
Note that the Mistral models still report usage, even if we don't hand over the usage option.
What it does
Not every model that uses the OpenAI API supports all of its features.
This is a quick fix for Mistral models.
The problematic property is `stream_options: { include_usage: true }`.
This property is not known to the Mistral AI API and leads to errors.
As it is needed to track token usage for OpenAI's own models, it is still added to the request, but only when an OpenAI model is used.
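The conditional behavior described above can be sketched as follows. This is not the PR's actual implementation; it is a minimal illustration, and the names `buildStreamRequest` and `isOfficialOpenAiModel` are hypothetical:

```typescript
// Hypothetical sketch: only attach stream_options for official OpenAI models,
// since other OpenAI-compatible providers (e.g. Mistral) may reject the
// unknown property.
interface StreamRequest {
    model: string;
    stream: boolean;
    stream_options?: { include_usage: boolean };
}

function buildStreamRequest(model: string, isOfficialOpenAiModel: boolean): StreamRequest {
    const request: StreamRequest = { model, stream: true };
    if (isOfficialOpenAiModel) {
        // include_usage makes OpenAI report token usage in the final
        // streamed chunk; only added when talking to OpenAI itself.
        request.stream_options = { include_usage: true };
    }
    return request;
}
```

Note that, as observed above, Mistral still reports usage even without this option, so omitting it for non-OpenAI providers loses nothing there.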
How to test
Add this to your settings:
For an API key, register at Mistral.
For Codestral: https://console.mistral.ai/codestral
Now you can test by changing the model used by an agent.
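The settings snippet itself was not captured in this page. As a rough illustration only, a custom OpenAI-compatible model entry might look like the following; the key names and structure here are assumptions, so check the Theia AI documentation for the actual schema:

```json
{
    "ai-features.openAiCustom.customOpenAiModels": [
        {
            "model": "mistral-large-latest",
            "url": "https://api.mistral.ai/v1",
            "id": "mistral-large",
            "apiKey": "<your-mistral-api-key>"
        }
    ]
}
```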
Follow-ups
A proper fix would be to add per-model capabilities and custom settings.
A task is tracked in the Theia AI Epic, but no issue exists for this yet.
Breaking changes
Attribution
Review checklist
Reminder for reviewers