
Gemini provider uses google_ai_studio instead of gemini in documentation #1730

@dst03106

Description


Git provider (optional)

None

System Info (optional)

No response

Issues details

It seems that the Gemini provider for LiteLLM should be gemini, but the documentation currently shows google_ai_studio. If that’s correct, would it be okay for me to submit a PR to update the docs?
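If the provider prefix is indeed gemini, the corrected configuration would look roughly like the sketch below. This is a hedged illustration only: the section and key names are assumptions based on pr_agent's TOML configuration style, not a confirmed excerpt from the docs.

```toml
# Hypothetical sketch: the point is the "gemini/" provider prefix that
# LiteLLM routes on, rather than "google_ai_studio/".
[config]
model = "gemini/gemini-1.5-pro"                # not "google_ai_studio/gemini-1.5-pro"
fallback_models = ["gemini/gemini-1.5-flash"]
```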

Also, since models not included in MAX_TOKENS (defined in pr_agent/algo/__init__.py) don’t work properly, I was wondering if it would be alright to add a warning log to make this behavior clearer.

  • As-is
Error generating PR description {pr_url}: Failed to generate prediction with any model of [{model}, {fallback_models}]
  • To-be
+ (Warning level) generating PR description {...}: Ensure {model} is defined in MAX_TOKENS in ./pr_agent/algo/__init__.py or set a positive value for it in config.custom_model_max_tokens
Error generating PR description {...}: Failed to generate prediction with any model of [...]
  • code
# pr_agent/algo/__init__.py
MAX_TOKENS = {
    ...
    'gemini/gemini-1.5-pro': 1048576,
    'gemini/gemini-1.5-flash': 1048576,
    'gemini/gemini-2.0-flash': 1048576,
    'gemini/gemini-2.5-pro-preview-03-25': 1048576,
    ...
}
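The proposed warning could be sketched roughly as follows. This is a minimal, self-contained illustration of the idea, not pr_agent's actual code: the function name check_model_token_limit is hypothetical, the MAX_TOKENS entries are copied from the issue, and the lookup-then-fallback order mirrors the config.custom_model_max_tokens behavior described above.

```python
# Hedged sketch of the proposed check (names are assumptions, not the
# actual pr_agent implementation): warn when a model has no known limit.
import logging

logger = logging.getLogger(__name__)

# Stand-in for the MAX_TOKENS registry in pr_agent/algo/__init__.py
MAX_TOKENS = {
    'gemini/gemini-1.5-pro': 1048576,
    'gemini/gemini-1.5-flash': 1048576,
    'gemini/gemini-2.0-flash': 1048576,
}

def check_model_token_limit(model: str, custom_model_max_tokens: int = -1) -> int:
    """Return the usable token limit, warning loudly when the model is unknown."""
    if model in MAX_TOKENS:
        return MAX_TOKENS[model]
    if custom_model_max_tokens > 0:
        return custom_model_max_tokens
    # The warning proposed in this issue, emitted before the hard failure
    logger.warning(
        f"Ensure {model} is defined in MAX_TOKENS in ./pr_agent/algo/__init__.py "
        f"or set a positive value for it in config.custom_model_max_tokens"
    )
    raise ValueError(f"Token limit for model {model} is unknown")
```

Emitting the warning before the generic "Failed to generate prediction" error points users directly at the missing MAX_TOKENS entry instead of leaving them to guess why every fallback model failed.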
