Conversation

Contributor

@andyxning andyxning commented Jul 22, 2025

Essential Elements of an Effective PR Description Checklist

  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing a test command.
  • The test results, such as pasting a before/after comparison or e2e results.
  • (Optional) Any necessary documentation updates, such as updating supported_models.md and examples for a new model.

Fix #20996 (comment)

Purpose

This is a follow-up to PR #20996 and fixes the review comment: #20996 (comment)

Test Plan

NA

Test Result

NA

(Optional) Documentation Update

Signed-off-by: Andy Xie <andy.xning@gmail.com>
@andyxning andyxning force-pushed the unify_variable_for_LLM_instance_v2 branch from 4b997b3 to 8c2bb04 on July 22, 2025 07:25

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of CI tests to catch errors quickly. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request continues the refactoring from a previous PR to unify the variable name for the LLM instance, changing llm.model to llm.llm in a test file. While the renaming is consistent, it appears to have propagated an incorrect attribute access path. I've pointed out a likely AttributeError and suggested a fix. It's important to ensure tests are run to catch such issues.

```diff
      lambda self: self.model_runner.model.language_model.model.
      normalizer.cpu().item())
-     config = llm.model.llm_engine.model_config.hf_config.text_config
+     config = llm.llm.llm_engine.model_config.hf_config.text_config
```

critical

It seems there's a potential AttributeError here. Based on the LLMEngine implementation, an LLMEngine instance has a model_config attribute directly, but not an llm_engine attribute.

Assuming llm.llm is an instance of LLMEngine, the path to the config should be llm.llm.model_config.hf_config.text_config. The extra .llm_engine seems incorrect and will likely cause the test to fail.

```python
config = llm.llm.model_config.hf_config.text_config
```

```diff
      lambda self: self.model_runner.model.model.normalizer.cpu(
      ).item())
-     config = llm.model.llm_engine.model_config.hf_config
+     config = llm.llm.llm_engine.model_config.hf_config
```

critical

Similar to the case above, it appears there's an incorrect attribute access here. The LLMEngine class does not have an llm_engine attribute.

If llm.llm is an LLMEngine instance, the correct way to access the model config would be directly via llm.llm.model_config.

```python
config = llm.llm.model_config.hf_config
```
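To make the two disputed attribute paths concrete, here is a minimal mock of the layering under discussion (hypothetical stand-in classes, not vLLM's actual implementation): a test helper holds an entrypoint object in `.llm` (renamed from `.model` by this PR), and whether the extra `.llm_engine` hop is needed depends on whether that object is the `LLM` entrypoint wrapper or the `LLMEngine` itself.

```python
# Hypothetical stand-ins for the classes discussed above -- NOT vLLM's code.
class ModelConfig:
    def __init__(self, hf_config):
        self.hf_config = hf_config

class LLMEngine:
    """Owns model_config directly; has no .llm_engine attribute."""
    def __init__(self, model_config):
        self.model_config = model_config

class LLM:
    """Entrypoint wrapper: holds the engine under .llm_engine."""
    def __init__(self, engine):
        self.llm_engine = engine

class VllmRunner:
    """Test helper; this PR renames its .model attribute to .llm."""
    def __init__(self, llm):
        self.llm = llm

runner = VllmRunner(LLM(LLMEngine(ModelConfig({"text_config": "dummy"}))))

# If runner.llm is the LLM entrypoint wrapper, the longer path from the
# diff resolves correctly:
config = runner.llm.llm_engine.model_config.hf_config

# If runner.llm were the LLMEngine itself, the reviewer's shorter path
# would be the correct one instead:
# config = runner.llm.model_config.hf_config
```

Which variant is right therefore hinges entirely on the runtime type of `llm.llm`, which is why running the affected tests is the decisive check.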

@DarkLight1337
Member

Can you check the latest nightly run which runs all tests: https://buildkite.com/vllm/ci/builds?query=Nightly and fix any issues resulting from the naming change?

@andyxning
Contributor Author

> Can you check the latest nightly run which runs all tests: https://buildkite.com/vllm/ci/builds?query=Nightly and fix any issues resulting from the naming change?

Will try this later. Let's fix this problem first.

@DarkLight1337
Member

I prefer fixing them all at once since CI is broken until then anyway

@andyxning
Contributor Author

OK, will do this later today.

@andyxning
Contributor Author

It seems that only one error is related to `'VllmRunner' object has no attribute 'model'`. The other failures are not related to the refactor. /cc @DarkLight1337
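One quick way to audit for stale call sites after such a rename (a hypothetical grep, not part of this PR; the `tests/` path is an assumption about the repo layout):

```shell
# List any test code still reading the old attribute name `llm.model`;
# print a fallback message when nothing (or no tests/ directory) matches.
grep -rn "llm\.model\b" tests/ 2>/dev/null || echo "no stale references found"
```

Static search can miss dynamically built attribute accesses, which is why the nightly run that exercises all tests is still the authoritative check.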

Member

@hmellor hmellor left a comment


The current changes LGTM.

@andyxning you can check the latest nightly build (https://buildkite.com/vllm/ci/builds?branch=main&query=nightly) to find any other failing tests related to the renaming.

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) July 22, 2025 09:58
@DarkLight1337
Member

> Seems that there is only one error related to `'VllmRunner' object has no attribute 'model'`. The other failures are not related to the refactor.

Thanks for checking this! LGTM then

@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Jul 22, 2025
@andyxning
Contributor Author

> The current changes LGTM.
>
> @andyxning you can check the latest nightly build (https://buildkite.com/vllm/ci/builds?branch=main&query=nightly) to find any other failing tests related to the renaming.

Yes. I have checked this. For now, there is only one.

@andyxning
Contributor Author

@DarkLight1337 @hmellor The CI failure seems unrelated to this PR. Can you help verify this? It looks like a flaky test.

@DarkLight1337
Member

Retrying

@DarkLight1337
Member

The failure is unrelated so I'll just merge this

@vllm-bot vllm-bot merged commit 0df4d9b into vllm-project:main Jul 22, 2025
53 of 56 checks passed
@andyxning andyxning deleted the unify_variable_for_LLM_instance_v2 branch July 22, 2025 14:11
yeqcharlotte pushed a commit to yeqcharlotte/vllm that referenced this pull request Jul 23, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
zixi-qi pushed a commit to zixi-qi/vllm that referenced this pull request Jul 23, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Signed-off-by: qizixi <qizixi@meta.com>
LyrisZhong pushed a commit to LyrisZhong/vllm that referenced this pull request Jul 23, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
avigny pushed a commit to avigny/vllm that referenced this pull request Jul 31, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Signed-off-by: avigny <47987522+avigny@users.noreply.github.com>
wenscarl pushed a commit to wenscarl/vllm that referenced this pull request Aug 4, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Signed-off-by: shuw <shuw@nvidia.com>
x22x22 pushed a commit to x22x22/vllm that referenced this pull request Aug 5, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Signed-off-by: x22x22 <wadeking@qq.com>
Pradyun92 pushed a commit to Pradyun92/vllm that referenced this pull request Aug 6, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
npanpaliya pushed a commit to odh-on-pz/vllm-upstream that referenced this pull request Aug 6, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
jinzhen-lin pushed a commit to jinzhen-lin/vllm that referenced this pull request Aug 9, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Signed-off-by: Jinzhen Lin <linjinzhen@hotmail.com>
paulpak58 pushed a commit to paulpak58/vllm that referenced this pull request Aug 13, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Signed-off-by: Paul Pak <paulpak58@gmail.com>
taneem-ibrahim pushed a commit to taneem-ibrahim/vllm that referenced this pull request Aug 14, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
diegocastanibm pushed a commit to diegocastanibm/vllm that referenced this pull request Aug 15, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
Signed-off-by: Diego-Castan <diego.castan@ibm.com>
epwalsh pushed a commit to epwalsh/vllm that referenced this pull request Aug 28, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>
googlercolin pushed a commit to googlercolin/vllm that referenced this pull request Aug 29, 2025
Signed-off-by: Andy Xie <andy.xning@gmail.com>