Conversation

@qscqesze (Contributor) commented Jun 16, 2025

We would like to propose official support for the MiniMaxM1ForCausalLM model within vLLM. This model shares the exact same architecture and inference behavior as MiniMaxText01ForCausalLM, and is essentially a variant with different weights.

To support this, we added an alias in the vLLM codebase that maps MiniMaxM1ForCausalLM to the existing MiniMaxText01ForCausalLM implementation. This allows the new model to be used seamlessly without requiring redundant code or changes to the inference pipeline.
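
For illustration, here is a minimal sketch of what such an alias can look like, assuming vLLM's model registry maps architecture names to (module, class) pairs as in vllm/model_executor/models/registry.py; the entries below are illustrative rather than the exact diff in this PR.

    # Sketch of a registry alias (illustrative; not the exact diff in this PR).
    # Assumes the registry maps an architecture name from a checkpoint's
    # config.json to the (module, class) pair that implements it.
    _TEXT_GENERATION_MODELS = {
        # Existing implementation.
        "MiniMaxText01ForCausalLM": ("minimax_text_01", "MiniMaxText01ForCausalLM"),
        # New alias: MiniMax-M1 shares the same architecture, so it reuses the
        # same module and class.
        "MiniMaxM1ForCausalLM": ("minimax_text_01", "MiniMaxText01ForCausalLM"),
        # ... other architectures ...
    }

At load time, vLLM reads the architectures field from the checkpoint's config.json, looks the name up in this registry, and dispatches MiniMaxM1ForCausalLM to the existing MiniMaxText01ForCausalLM code path.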

qscqesze added 3 commits June 16, 2025 15:23
Signed-off-by: QscQ <qscqesze@gmail.com>
Signed-off-by: QscQ <qscqesze@gmail.com>
Signed-off-by: QscQ <qscqesze@gmail.com>
@gemini-code-assist (bot) left a comment

Summary of Changes

Hello @qscqesze, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces support for the MiniMaxM1ForCausalLM model. Since it shares the same underlying architecture as MiniMaxText01ForCausalLM, the implementation leverages an alias in the model registry, avoiding redundant code and simplifying maintenance. The changes include updates to the model registry, documentation, and test configurations.

Highlights

  • New Model Support: Added official support for the MiniMaxM1ForCausalLM model.
  • Architecture Alias: Implemented support by aliasing MiniMaxM1ForCausalLM to the existing MiniMaxText01ForCausalLM implementation, as they share the same architecture.
  • Documentation and Testing: Updated the supported models documentation and added an example model ID for testing.

Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e., a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

Feature              | Command             | Description
Code Review          | /gemini review      | Performs a code review for the current pull request in its current state.
Pull Request Summary | /gemini summary     | Provides a summary of the current pull request in its current state.
Comment              | @gemini-code-assist | Responds in comments when explicitly tagged, both in issue comments and review comments.
Help                 | /gemini help        | Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check it and use code with caution.


👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@mergify (bot) added the documentation label (Improvements or additions to documentation) Jun 16, 2025
@gemini-code-assist (bot) left a comment

Code Review

This PR adds support for MiniMaxM1ForCausalLM by aliasing it to MiniMaxText01ForCausalLM. The changes update model registries and documentation. Verify that the Hugging Face configuration for the new model is fully compatible with the existing vLLM model code to prevent runtime issues.
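
One quick way to do that sanity check is to load the model's config with transformers and confirm the reported architecture name matches the registered alias; a minimal sketch follows, where the model ID is an assumption rather than a confirmed checkpoint name.

    # Sanity check: confirm the checkpoint reports the aliased architecture.
    from transformers import AutoConfig

    # Assumed Hugging Face repo id for MiniMax-M1; substitute the real checkpoint.
    cfg = AutoConfig.from_pretrained("MiniMaxAI/MiniMax-M1-40k", trust_remote_code=True)
    print(cfg.architectures)  # expected to include "MiniMaxM1ForCausalLM"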

@ywang96 added the ready label (ONLY add when PR is ready to merge/full CI is needed) Jun 16, 2025
@ywang96 ywang96 enabled auto-merge (squash) June 16, 2025 08:28
Signed-off-by: QscQ <qscqesze@gmail.com>
auto-merge was automatically disabled June 16, 2025 08:44

Head branch was pushed to by a user without write access

Signed-off-by: QscQ <qscqesze@gmail.com>
@qscqesze (Contributor, Author)

@ywang96 I noticed that the CI check has failed, but as far as I can tell, our changes are limited to documentation updates and configuration adjustments for model usage, which should not affect the final results.
Would it be possible to skip the failed CI in this case so that we can proceed with merging this PR into the main branch?

@DarkLight1337 (Member)

Retrying the test

@tlrmchlsmth (Collaborator) left a comment

sweet

@simon-mo simon-mo merged commit 387bdf0 into vllm-project:main Jun 16, 2025
68 of 70 checks passed
@simon-mo (Collaborator)

For anyone who wants to use this model: you can use the vLLM nightly wheels to test it out!

uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly --torch-backend=auto
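
Once a nightly wheel is installed, a minimal offline-inference sketch with vLLM's Python API could look like the following; the model ID is an assumed Hugging Face repo for MiniMax-M1, so substitute the checkpoint you actually want to run.

    # Minimal offline-inference sketch; the model id below is illustrative.
    from vllm import LLM, SamplingParams

    llm = LLM(
        model="MiniMaxAI/MiniMax-M1-40k",  # assumed repo id; adjust as needed
        trust_remote_code=True,            # assumed to be needed for MiniMax configs
    )
    params = SamplingParams(temperature=0.7, max_tokens=256)
    outputs = llm.generate(["Summarize what MiniMax-M1 is in one paragraph."], params)
    print(outputs[0].outputs[0].text)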

@celsowm commented Jun 18, 2025

Is there any extra param to disable reasoning? Something similar to enable_thinking=false from Qwen 3?

yeqcharlotte pushed a commit to yeqcharlotte/vllm that referenced this pull request Jun 22, 2025
…h MiniMaxText01ForCausalLM) (vllm-project#19677)

Signed-off-by: QscQ <qscqesze@gmail.com>
minpeter pushed a commit to minpeter/vllm that referenced this pull request Jun 24, 2025
…h MiniMaxText01ForCausalLM) (vllm-project#19677)

Signed-off-by: QscQ <qscqesze@gmail.com>
Signed-off-by: minpeter <kali2005611@gmail.com>
yangw-dev pushed a commit to yangw-dev/vllm that referenced this pull request Jun 24, 2025
…h MiniMaxText01ForCausalLM) (vllm-project#19677)

Signed-off-by: QscQ <qscqesze@gmail.com>
Signed-off-by: Yang Wang <elainewy@meta.com>
xjpang pushed a commit to xjpang/vllm that referenced this pull request Jun 30, 2025
…h MiniMaxText01ForCausalLM) (vllm-project#19677)

Signed-off-by: QscQ <qscqesze@gmail.com>
wseaton pushed a commit to wseaton/vllm that referenced this pull request Jun 30, 2025
…h MiniMaxText01ForCausalLM) (vllm-project#19677)

Signed-off-by: QscQ <qscqesze@gmail.com>
avigny pushed a commit to avigny/vllm that referenced this pull request Jul 31, 2025
…h MiniMaxText01ForCausalLM) (vllm-project#19677)

Signed-off-by: QscQ <qscqesze@gmail.com>
Signed-off-by: avigny <47987522+avigny@users.noreply.github.com>
googlercolin pushed a commit to googlercolin/vllm that referenced this pull request Aug 29, 2025
…h MiniMaxText01ForCausalLM) (vllm-project#19677)

Signed-off-by: QscQ <qscqesze@gmail.com>