
Conversation

eneufeld (Contributor)

What it does

Not every model that uses the OpenAI API supports all of its features.
This is a quick fix for Mistral models.
The problematic property is:

stream_options: {
  include_usage: true
}

This property is not known to Mistral AI and leads to errors.
Since it is needed to track token usage for OpenAI's own models, it is still added to the request, but only when an OpenAI model is used.
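
To illustrate the approach, here is a minimal sketch (not the actual Theia implementation; buildStreamRequest and isOpenAiModel are hypothetical names) of attaching the property only for OpenAI models:

// Minimal sketch: only include stream_options when talking to an official
// OpenAI endpoint, since other OpenAI-compatible providers such as Mistral
// reject the property.
interface StreamRequest {
    model: string;
    stream: boolean;
    stream_options?: { include_usage: boolean };
}

function buildStreamRequest(model: string, isOpenAiModel: boolean): StreamRequest {
    const request: StreamRequest = { model, stream: true };
    if (isOpenAiModel) {
        // Token usage reporting is only requested for OpenAI's own models.
        request.stream_options = { include_usage: true };
    }
    return request;
}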

How to test

Add this to your settings:

 "ai-features.openAiCustom.customOpenAiModels": [
        {
            "model": "codestral-latest",
            "url": "https://codestral.mistral.ai/v1",
            "id": "Mistral/Codestral",
            "apiKey": "<MY CODESTRIAL KEY>",
            "developerMessageSettings": "system"
        },
        {
            "model": "mistral-medium-latest",
            "url": "https://api.mistral.ai/v1",
            "id": "Mistral/Mistral-Medium",
            "apiKey": "<MY MISTRAL KEY>",
            "developerMessageSettings": "system"
        }
    ]

To obtain a key, register at Mistral.
For Codestral: https://console.mistral.ai/codestral

Now you can test by changing the model used by an agent.

Follow-ups

A proper fix would be to add capabilities and custom settings for each model.
A task is tracked in the Theia AI Epic, but no issue exists for this yet.
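
One possible shape for such per-model capabilities, purely as an illustration (the interface and property names below are hypothetical and not part of the current Theia AI API):

// Hypothetical sketch of per-model capabilities.
interface OpenAiCompatibleModelCapabilities {
    // Whether the endpoint accepts stream_options.include_usage.
    supportsIncludeUsage: boolean;
    // Whether null values need to be normalized to undefined in assistant messages.
    requiresNullNormalization: boolean;
}

const mistralCapabilities: OpenAiCompatibleModelCapabilities = {
    supportsIncludeUsage: false,
    requiresNullNormalization: true
};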

Breaking changes

  • This PR introduces breaking changes and requires careful review. If yes, the breaking changes section in the changelog has been updated.

Attribution

Review checklist

Reminder for reviewers

@eneufeld eneufeld requested a review from JonasHelming May 20, 2025 10:55
@github-project-automation github-project-automation bot moved this to Waiting on reviewers in PR Backlog May 20, 2025
@eneufeld eneufeld requested a review from sdirix May 20, 2025 10:57
@eneufeld eneufeld mentioned this pull request May 20, 2025
@JonasHelming (Contributor)

It works for Universal, but with Coder I get:

missing choices[0].tool_calls[0].type {"id":"ff4ab3b8a5b3442e9ffb9c2c2a5db8ed","object":"chat.completion.chunk","created":1747915081,"model":"mistral-large-latest","choices":[{"finish_reason":"tool_calls","index":0,"message":{"role":"assistant","tool_calls":[{"id":"7n3INi6di","function":{"name":"getFileContent","arguments":"{"file": "packages/ai-mcp/src/common/mcp-server-manager.ts"}"}}]},"logprobs":null}],"usage":{"prompt_tokens":2267,"total_tokens":2307,"completion_tokens":40}}
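
For reference, an OpenAI-style tool call entry carries a type field, which the chunk above omits. A minimal sketch of the expected shape, reusing the values from the chunk above for illustration:

// Shape the OpenAI client expects for a tool call entry; note the "type" property
// that is missing from the Mistral response chunk.
const expectedToolCall = {
    id: '7n3INi6di',
    type: 'function' as const,
    function: {
        name: 'getFileContent',
        arguments: '{"file": "packages/ai-mcp/src/common/mcp-server-manager.ts"}'
    }
};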

@JonasHelming (Contributor)

@eneufeld Works well for me, but there are failing tests!
@sdirix Could you do a quick review, too?
This should be in the release next Wednesday if possible.

Comment on lines 201 to 215
(options.body as { messages: Array<ChatCompletionMessageParam> }).messages.forEach(m => {
    if (m.role === 'assistant' && m.tool_calls) {
        // this is optional and thus should be undefined
        // eslint-disable-next-line no-null/no-null
        if (m.refusal === null) {
            m.refusal = undefined;
        }
        // this is optional and thus should be undefined
        // eslint-disable-next-line no-null/no-null
        if ((m as unknown as { parsed: null | undefined }).parsed === null) {
            (m as unknown as { parsed: null | undefined }).parsed = undefined;
        }
    }
});
};
Member

We're overriding prepareOptions here. Do we lose the original behavior of prepareOptions then? Why do we need this customization at all? Is this for Mistral? Some more comments on why this is done would be nice.
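
A minimal, self-contained sketch of how the original behavior could be kept, assuming the base handler defines its own prepareOptions (the class and method names below are hypothetical, not the actual Theia types):

// Sketch only: delegate to the inherited implementation before applying
// provider-specific cleanup, so the base behavior is preserved.
class BaseRequestHandler {
    protected prepareOptions(options: Record<string, unknown>): void {
        // base behavior, e.g. attaching default request properties
    }
}

class MistralRequestHandler extends BaseRequestHandler {
    protected override prepareOptions(options: Record<string, unknown>): void {
        super.prepareOptions(options); // keep the original behavior
        // Mistral-specific cleanup (normalizing null to undefined) would follow here.
    }
}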

@sdirix (Member) left a comment


The Mistral models worked for me with this PR.

During testing I ran into the same issue as on master: #15646

@eneufeld eneufeld force-pushed the fix/mistral branch 2 times, most recently from dd8efef to ff53773 Compare May 27, 2025 10:02
@sdirix (Member) left a comment


Works for me!

Note that the Mistral models still report usage, even if we don't hand over the usage option.

@github-project-automation github-project-automation bot moved this from Waiting on reviewers to Needs merge in PR Backlog May 27, 2025
@eneufeld eneufeld merged commit ee2046b into master May 27, 2025
10 of 11 checks passed
@github-project-automation github-project-automation bot moved this from Needs merge to Done in PR Backlog May 27, 2025
@github-actions github-actions bot added this to the 1.62.0 milestone May 27, 2025
@JonasHelming JonasHelming mentioned this pull request May 31, 2025
@ndoschek ndoschek mentioned this pull request Jun 11, 2025