[bugfix] fix megatron model merger #1774
Conversation
Signed-off-by: ShareLer <ShareLe@163.com>
@ShareLer Thanks a lot for helping us fix this, it helps a lot~ Could you briefly point out what causes the vLLM inference failure? It seems to also involve a lot of refactoring. Is it caused by the missing parameter transfer?
Three main reasons:
is it possible to add a test that reproduces the issue?
You can reproduce this problem very simply with the CI script (e.g. the `e2e_ppo_trainer_megatron-qwen3` job in `e2e_ppo_trainer_megatron.yml`); just change the command option in the merger. There were no problems in the previous CI test because different logic was used in the
@ShareLer Could you add some assertions to check whether the naming prefix is valid? Or could the string parsing be made more robust, e.g. fetching the layer_index from index [-3] of the split string?
Sorry, I don't quite understand what you mean.
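For illustration, a minimal sketch of the positional parsing the reviewer is suggesting. The function name and the example parameter-name format are assumptions for this sketch, not verl's actual code:

```python
def get_layer_index(param_name: str) -> int:
    """Extract the layer index from a dotted Megatron-style parameter name.

    Assumes names like 'layers.5.linear_fc1.weight', where the numeric
    index sits at position [-3] of the dot-split name (hypothetical layout).
    """
    parts = param_name.split(".")
    # Fail loudly on unexpected prefixes instead of silently mis-parsing.
    if len(parts) < 3 or not parts[-3].isdigit():
        raise ValueError(f"cannot parse layer index from: {param_name!r}")
    return int(parts[-3])
```

Parsing by position is brittle when a key has an unexpected prefix, which is why the validation (or an assertion) matters here.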
@@ -444,28 +485,28 @@ def _merge_state_dicts(self, model_state_dict_lst: list[list[dict]], tp_size: in
    print("skip lm_head and reward_head loading because of tie_word_embeddings")
    continue
Maybe here we can add a check on the mcore key to verify that it's valid for `_replace_name` to work?
Yes, we can add an assert on the result of `_replace_name` to ensure that it is not None: since we define the layer transformation relationship in `self.params_mapping`, the result should normally never be None. Do you think this is feasible?
Maybe I didn't make myself clear (qaq). I can help add some checks and warnings~
@ETOgaosion Hi, I have added additional checks for the embedding layer and output_layer on top of your code.
@ShareLer Thanks for helping, I was about to test it on my machine~
### Checklist Before Starting

- [x] Search for similar PR(s).

### What does this PR do?

Fix the megatron model merger.

### High-Level Design

> Demonstrate the high-level design if this PR is complex.

### Specific Changes

- Fix the get-rank method to support TP-only parallelism.
- Fix state_dict keys after conversion.
- Add MLA/MoE conversion support.

### API

> Demonstrate how the API changes if any.

### Usage Example

> Provide usage example(s) for easier usage.

```python
# Add code snippet or script demonstrating how to use this
```

### Test

Tested with Qwen3-8B and Qwen2.5-7B.

### Additional Info.

- **Issue Number**: Fixes issue volcengine#1757
- **Training**: [Note which backend this PR will affect: FSDP, Megatron, both, or none]
- **Inference**: [Note which backend this PR will affect: vLLM, SGLang, both, or none]

### Checklist Before Submitting

- [ ] Read the [Contribute Guide](https://github.com/volcengine/verl?tab=readme-ov-file#contribution-guide).
- [ ] Apply [pre-commit checks](https://github.com/volcengine/verl?tab=readme-ov-file#code-linting-and-formatting).
- [ ] Add `[BREAKING]` to the PR title if it breaks any API.
- [ ] Update the documentation about your changes in the [docs](https://github.com/volcengine/verl/tree/main/docs).
- [ ] Add CI test(s) if necessary.

---------

Signed-off-by: ShareLer <ShareLe@163.com>
Co-authored-by: ETOgaosion <gaoziyuan19@mails.ucas.ac.cn>
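On the "fix get rank method to support just TP" change: a minimal sketch of how a flat checkpoint rank can be decomposed into TP/PP coordinates so the TP-only case (pp_size == 1) falls out naturally. The function name and the TP-major layout assumption are hypothetical, not verl's actual implementation:

```python
def get_tp_pp_rank(global_rank: int, tp_size: int, pp_size: int = 1) -> tuple:
    """Map a flat rank to (tp_rank, pp_rank), assuming a TP-major layout
    (ranks within one pipeline stage are contiguous). When pp_size == 1,
    this degrades cleanly to pure tensor parallelism."""
    assert 0 <= global_rank < tp_size * pp_size, "rank out of range"
    return global_rank % tp_size, global_rank // tp_size
```

The actual rank layout in Megatron-Core depends on the parallel-state initialization order, so a real merger must match that ordering rather than assume it.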