
[Automodel] Fix CP device_mesh issue, use PTL distsampler (#13473) #13636


Merged
2 commits merged into r2.3.0 on May 19, 2025

Conversation

akoumpa
Member

@akoumpa akoumpa commented May 17, 2025

  • Fix dp_cp mesh usage

  • Apply isort and black reformatting

  • Remove hf_dataset dist_sampler, let PTL handle the dist_sampler instead

  • Remove print

  • Apply isort and black reformatting

  • Deal with CP on all padding ranks

  • Apply isort and black reformatting

  • Remove unused import

  • update loss nan conversion

  • Change dp_cp mesh check

  • Apply isort and black reformatting

  • Fix dp_cp check bug

  • Update automodel.py

  • Add conversion to int for parallel sizes

  • Apply isort and black reformatting
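
The "Fix dp_cp mesh usage" and "Add conversion to int for parallel sizes" items both concern how data-parallel (dp) and context-parallel (cp) coordinates are derived from the device mesh. As a hedged illustration only (a hypothetical helper, not the PR's actual code): with a 2-D mesh whose dims are ordered ("dp", "cp"), the cp dimension is the fastest-varying one, so a global rank's coordinates follow from integer arithmetic, and parallel sizes coming from a config may need explicit int conversion first:

```python
def dp_cp_coords(global_rank, dp_size, cp_size):
    """Map a global rank to (dp_rank, cp_rank) mesh coordinates.

    Assumes a 2-D device mesh with dim names ("dp", "cp") where cp is
    the innermost (fastest-varying) dimension. Hypothetical helper for
    illustration; not code from this PR.
    """
    # parallel sizes may arrive as strings from a config, hence int()
    dp_size, cp_size = int(dp_size), int(cp_size)
    assert 0 <= global_rank < dp_size * cp_size, "rank outside mesh"
    return global_rank // cp_size, global_rank % cp_size
```

For example, with dp_size=2 and cp_size=4, global rank 5 maps to dp rank 1, cp rank 1.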


Important

The Update branch button should be pressed only on very rare occasions.
An outdated branch never blocks the merge of a PR.
Please reach out to the automation team before pressing that button.

What does this PR do?

Add a one line overview of what this PR aims to accomplish.

Collection: [Note which collection this PR will affect]

Changelog

  • Add specific line by line info of high level changes in this PR.

Usage

  • You can potentially add a usage example below
# Add a code snippet demonstrating how to use this 

GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI runs automatically when the "Run CICD" label is added to the PR.
To re-run CI, remove the label and add it again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs to various areas.

Additional Information

  • Related to # (issue)

* Fix dp_cp mesh usage

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Apply isort and black reformatting

Signed-off-by: BoxiangW <BoxiangW@users.noreply.github.com>

* Remove hf_dataset dist_sampler, let PTL handle the dist_sampler instead

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Remove print

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Apply isort and black reformatting

Signed-off-by: BoxiangW <BoxiangW@users.noreply.github.com>

* Deal with CP on all padding ranks

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Apply isort and black reformatting

Signed-off-by: BoxiangW <BoxiangW@users.noreply.github.com>

* Remove unused import

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* update loss nan conversion

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Change dp_cp mesh check

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Apply isort and black reformatting

Signed-off-by: BoxiangW <BoxiangW@users.noreply.github.com>

* Fix dp_cp check bug

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Update automodel.py

Signed-off-by: BoxiangW <boxiangw@nvidia.com>

* Add conversion to int for parallel sizes

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>

* Apply isort and black reformatting

Signed-off-by: BoxiangW <BoxiangW@users.noreply.github.com>

---------

Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>
Signed-off-by: BoxiangW <BoxiangW@users.noreply.github.com>
Signed-off-by: BoxiangW <boxiangw@nvidia.com>
Co-authored-by: BoxiangW <BoxiangW@users.noreply.github.com>
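
The "Deal with CP on all padding ranks" and "update loss nan conversion" commits address ranks whose context-parallel shard contains only padding tokens: averaging a loss over zero valid tokens yields NaN, which then poisons the gradient all-reduce. A minimal sketch of such a guard, assuming a token-summed loss and a valid-token count (function and parameter names are hypothetical, not the PR's code):

```python
import torch

def safe_mean_loss(loss_sum: torch.Tensor, num_valid: torch.Tensor) -> torch.Tensor:
    """Mean-reduce a summed loss, contributing zero on all-padding ranks.

    Dividing by num_valid == 0 would produce NaN, so clamp the divisor
    and mask the result to zero where no valid tokens exist.
    """
    mean = loss_sum / num_valid.clamp(min=1)
    return torch.where(num_valid > 0, mean, torch.zeros_like(mean))
```

A rank holding only padding then returns a finite 0.0 loss instead of NaN, so the cross-rank reduction stays well-defined.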
@akoumpa akoumpa marked this pull request as ready for review May 17, 2025 20:37
chtruong814
chtruong814 previously approved these changes May 18, 2025
Collaborator

@chtruong814 chtruong814 left a comment


CI passed in this PR that included other cherry-picks

#13641

Signed-off-by: Alexandros Koumparoulis <153118171+akoumpa@users.noreply.github.com>
@github-actions github-actions bot removed the Run CICD label May 19, 2025
Contributor

[🤖]: Hi @akoumpa 👋,

We wanted to let you know that a CICD pipeline for this PR just finished successfully, so it might be time to merge this PR or get some approvals.

I'm just a bot, so I'll leave it to you to decide what to do next.

//cc @pablo-garay @ko3n1g

@akoumpa
Member Author

akoumpa commented May 19, 2025

@chtruong814 let's merge :)

@chtruong814 chtruong814 merged commit 63b9692 into r2.3.0 May 19, 2025
249 checks passed
@chtruong814 chtruong814 deleted the akoumparouli/cherry-pick-13473-r2.3.0 branch May 19, 2025 17:00
3 participants