## Conversation

**@dst03106** (Contributor) commented Apr 18, 2025:

### PR Type

Enhancement, Documentation

### Description

  • Added support for Mistral and Codestral models.

  • Updated API key handling for Mistral and Codestral.

  • Documented usage of Mistral and Codestral models in the usage guide.


### Changes walkthrough 📝

**Enhancement**

`pr_agent/algo/__init__.py` (+14/-0): Added Mistral and Codestral models to configuration

  • Added new Mistral and Codestral models to the model configuration dictionary.
  • Defined specific configurations for these models with respective token limits.

`pr_agent/algo/ai_handlers/litellm_ai_handler.py` (+8/-0): Added API key support for Mistral and Codestral

  • Added API key handling for Mistral models.
  • Added API key handling for Codestral models.

**Documentation**

`docs/docs/usage-guide/changing_a_model.md` (+28/-0): Added documentation for Mistral and Codestral usage

  • Documented configuration and usage for Mistral models.
  • Documented configuration and usage for Codestral models.
  • Provided links to obtain API keys for Mistral and Codestral.

**Need help?**

  • Type `/help how to ...` in the comments thread for any questions about Qodo Merge usage.
  • Check out the documentation for more information.

> PR Description updated to latest commit (869a179)

@dst03106 marked this pull request as ready for review April 18, 2025 05:36

**qodo-merge-for-open-source bot** commented Apr 18, 2025:

### PR Reviewer Guide 🔍

(Review updated until commit 869a179)

Here are some key observations to aid the review process:

**🎫 Ticket compliance analysis ✅**

**1461 - Fully compliant**

Compliant requirements:

  • Add support for the Codestral model through the Mistral API
  • Ensure the Mistral API integration works (the API is currently free)

**⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪**

**🧪 No relevant tests**

**🔒 No security concerns identified**

**⚡ Recommended focus areas for review**

**Model Duplication**

The Codestral model appears to be added twice with different prefixes: `"mistral/codestral-latest"` and `"codestral/codestral-latest"`. This could cause confusion for users about which one to use.

```python
"mistral/codestral-latest": 8191,
"mistral/open-mistral-nemo": 128000,
"mistral/open-mistral-nemo-2407": 128000,
"mistral/open-codestral-mamba": 256000,
"mistral/codestral-mamba-latest": 256000,
"codestral/codestral-latest": 8191,
"codestral/codestral-2405": 8191,
```
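The two prefixes are not necessarily duplicates: in liteLLM, the provider prefix before the `/` selects which API endpoint and which key environment variable are used. The sketch below illustrates that convention only; the endpoint URLs and variable names in the `ROUTES` table are assumptions for illustration, not liteLLM's actual routing code.

```python
# Illustrative sketch of provider-prefix routing, NOT liteLLM's real
# implementation. Assumed mapping: the "mistral/" prefix targets the
# general Mistral platform API, while "codestral/" targets the dedicated
# Codestral endpoint with its own key.
ROUTES = {
    "mistral": ("https://api.mistral.ai", "MISTRAL_API_KEY"),
    "codestral": ("https://codestral.mistral.ai", "CODESTRAL_API_KEY"),
}

def route(model: str) -> tuple[str, str]:
    """Return (api_base, key_env_var) for a provider-prefixed model name."""
    provider, _, _model_name = model.partition("/")
    return ROUTES[provider]

print(route("mistral/codestral-latest"))    # served via the Mistral platform
print(route("codestral/codestral-latest"))  # served via the Codestral endpoint
```

Under this reading, a user picks the prefix matching the API key they hold, which is why both entries can coexist in the model table.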
**Documentation Clarity**

The documentation for Mistral and Codestral models could be clearer about the differences between `"mistral/codestral-latest"` and `"codestral/codestral-latest"` and when to use each.

### Mistral

To use Mistral models, including Codestral served via the Mistral API, for example, set:

```toml
[config] # in configuration.toml
model = "mistral/mistral-small-latest"
fallback_models = ["mistral/mistral-medium-latest"]
[mistral] # in .secrets.toml
key = "..." # your Mistral api key
```

(you can obtain a Mistral key from here)

### Codestral

To use the Codestral model via the dedicated Codestral API, for example, set:

```toml
[config] # in configuration.toml
model = "codestral/codestral-latest"
fallback_models = ["codestral/codestral-2405"]
[codestral] # in .secrets.toml
key = "..." # your Codestral api key
```

(you can obtain a Codestral key from here)

### PR Code Suggestions ✨

Explore these optional code suggestions:
**Possible issue: Fix Codestral API key variable**

The environment variable for the Codestral API key should be `"MISTRAL_API_KEY"`, not `"CODESTRAL_API_KEY"`, since Codestral is provided by Mistral and uses the same API infrastructure. Using a non-standard environment variable will cause authentication failures.

`pr_agent/algo/ai_handlers/litellm_ai_handler.py` [104-106]

```diff
 # Support mistral models
 if get_settings().get("MISTRAL.KEY", None):
     os.environ["MISTRAL_API_KEY"] = get_settings().get("MISTRAL.KEY")

 # Support codestral models
 if get_settings().get("CODESTRAL.KEY", None):
-    os.environ["CODESTRAL_API_KEY"] = get_settings().get("CODESTRAL.KEY")
+    os.environ["MISTRAL_API_KEY"] = get_settings().get("CODESTRAL.KEY")
```

    [To ensure code accuracy, apply this suggestion manually]

    Suggestion importance[1-10]: 9


    Why: This suggestion addresses a critical authentication issue. Since Codestral is provided by Mistral and uses the same API infrastructure, using "CODESTRAL_API_KEY" instead of "MISTRAL_API_KEY" would cause authentication failures when attempting to use Codestral models.

Impact: High

**Align Codestral token limits**

There is an inconsistency in the token limits for Codestral models: the Mistral-hosted Codestral models have 256000 tokens while the direct Codestral models have only 8191. According to Mistral's documentation, Codestral models support much larger context windows. Verify and align the token limits for consistency.

`pr_agent/algo/__init__.py` [125-128]

```diff
 "mistral/open-codestral-mamba": 256000,
 "mistral/codestral-mamba-latest": 256000,
-"codestral/codestral-latest": 8191,
-"codestral/codestral-2405": 8191,
+"codestral/codestral-latest": 256000,
+"codestral/codestral-2405": 256000,
```
Suggestion importance[1-10]: 8

Why: The suggestion correctly identifies an inconsistency in token limits for Codestral models. The direct Codestral models have 8191 tokens while Mistral-hosted versions have 256000, which could lead to suboptimal performance. Aligning these values would ensure consistent behavior.

Impact: Medium

**Learned best practice: Retrieve configuration values once and store them in variables before using them, to improve efficiency and prevent potential inconsistencies**

    The code retrieves the API key twice for each model type - once for the
    condition check and again for the assignment. This is inefficient and could lead
    to inconsistencies if the settings change between calls. Store the API key in a
    variable first, then check if it exists before assigning it to the environment
    variable.

`pr_agent/algo/ai_handlers/litellm_ai_handler.py` [100-106]

```diff
 # Support mistral models
-if get_settings().get("MISTRAL.KEY", None):
-    os.environ["MISTRAL_API_KEY"] = get_settings().get("MISTRAL.KEY")
+mistral_key = get_settings().get("MISTRAL.KEY", None)
+if mistral_key:
+    os.environ["MISTRAL_API_KEY"] = mistral_key

 # Support codestral models
-if get_settings().get("CODESTRAL.KEY", None):
-    os.environ["CODESTRAL_API_KEY"] = get_settings().get("CODESTRAL.KEY")
+codestral_key = get_settings().get("CODESTRAL.KEY", None)
+if codestral_key:
+    os.environ["CODESTRAL_API_KEY"] = codestral_key
```
Suggestion importance[1-10]: 6

Impact: Low

  • Author self-review: I have reviewed the PR code suggestions, and addressed the relevant ones.

**Comment on lines +115 to +128** (`pr_agent/algo/__init__.py`):

```python
"mistral/mistral-small-latest": 8191,
"mistral/mistral-medium-latest": 8191,
"mistral/mistral-large-2407": 128000,
"mistral/mistral-large-latest": 128000,
"mistral/open-mistral-7b": 8191,
"mistral/open-mixtral-8x7b": 8191,
"mistral/open-mixtral-8x22b": 8191,
"mistral/codestral-latest": 8191,
"mistral/open-mistral-nemo": 128000,
"mistral/open-mistral-nemo-2407": 128000,
"mistral/open-codestral-mamba": 256000,
"mistral/codestral-mamba-latest": 256000,
"codestral/codestral-latest": 8191,
"codestral/codestral-2405": 8191,
```

**@dst03106** (Contributor, Author) commented Apr 18, 2025:

    I referred to the following liteLLM links for information about the model and max tokens:

**@anislanguer** commented Apr 18, 2025:

Cool, I want to run this fork locally in Docker and test it. How do I pass the Codestral key?

**@mrT23** (Collaborator) commented Apr 18, 2025:

/review

@qodo-ai deleted a comment from qodo-merge-for-open-source bot Apr 18, 2025

> Persistent review updated to latest commit 869a179

**@mrT23** (Collaborator) commented Apr 18, 2025:

👍

    @mrT23 mrT23 merged commit f177ff4 into qodo-ai:main Apr 18, 2025
**@dst03106** (Contributor, Author):

@anislanguer I added the environment variables to the `.env` file and ran the following Docker commands (ref link). Hope this helps!

```
# .env
CODESTRAL__KEY="..."
GITHUB__USER_TOKEN="..."
```

```shell
docker build -t pr-agent -f ./docker/Dockerfile .
docker run --rm -it --env-file .env pr-agent --pr_url {PR_URL} review_pr
```
    

    @dst03106 dst03106 deleted the feature/add-support-for-mistral-and-codestral branch April 18, 2025 19:06
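For context on the `.env` file shown above: pr-agent loads settings through a configuration layer (Dynaconf), and the double-underscore names such as `CODESTRAL__KEY` conventionally map to nested settings keys like `CODESTRAL.KEY`. The snippet below is a minimal illustration of that mapping written from scratch, not pr-agent's actual loader.

```python
# Illustrative sketch (not pr-agent's real settings loader): split each
# NAME__SUBKEY environment variable into a nested dict, so that
# CODESTRAL__KEY becomes settings["CODESTRAL"]["KEY"].
def env_to_nested(env: dict[str, str]) -> dict:
    """Convert double-underscore env var names into nested dictionaries."""
    nested: dict = {}
    for raw_key, value in env.items():
        parts = raw_key.split("__")
        node = nested
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return nested

env = {"CODESTRAL__KEY": "sk-...", "GITHUB__USER_TOKEN": "ghp-..."}
settings = env_to_nested(env)
print(settings["CODESTRAL"]["KEY"])  # the value supplied via the .env file
```

This is why passing `--env-file .env` to `docker run` is enough: inside the container the variables surface as ordinary environment variables, and the settings layer turns them into the nested keys the handler reads via `get_settings().get("CODESTRAL.KEY")`.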
**@anislanguer** commented Apr 22, 2025:

@dst03106 It doesn't work for me, and I'm not sure why :( It shows me this error:

[screenshot]

[screenshot]

**@mrT23** (Collaborator) commented Apr 22, 2025:

I removed one of the pictures, as it contained personal information.

Also, the configuration seems wrong: you didn't change the model.

    https://qodo-merge-docs.qodo.ai/usage-guide/changing_a_model/#mistral
