Conversation

@kwen2501 (Contributor) commented May 28, 2024

Stack from ghstack (oldest at bottom):

This actually retires the module after it has carried a deprecation warning for a while.
The supported replacement is torch.distributed.pipelining.
Please migrate (a rough migration sketch is included below).

cc @mrshenli @pritamdamania87 @zhaojuanmao @satgera @gqchen @aazzolini @osalpekar @jiayisuse @H-Huang @awgu @penguinwu @fegin @XilunWu @wanchaol @fduwjj @wz337 @tianyu-l @wconstab @yf225 @chauhang @d4l3k @ezyang @gchanan @albanD
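
For anyone migrating, here is a rough before/after sketch. It assumes the torch.distributed.pipelining API roughly as it appeared around the PyTorch 2.4 release, so exact names and signatures may differ between versions:

```python
# Migration sketch only; assumes the torch.distributed.pipelining API around
# the PyTorch 2.4 release. Exact signatures may differ between versions.
import torch
import torch.nn as nn
from torch.distributed.pipelining import SplitPoint, pipeline

model = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 4))
example_input = torch.randn(8, 16)

# Old (removed): torch.distributed.pipeline.sync.Pipe wrapped an nn.Sequential
# whose submodules were already placed on devices, and required torch.distributed.rpc:
#   from torch.distributed.pipeline.sync import Pipe
#   model = Pipe(model, chunks=4)
#   out = model(example_input).local_value()

# New: trace the model and split it into pipeline stages.
pipe = pipeline(
    model,
    mb_args=(example_input,),
    split_spec={"2": SplitPoint.BEGINNING},  # split before submodule "2"
)

# Each rank then builds its stage and runs a schedule, e.g. under torchrun
# with an initialized process group (sketch only, not executed here):
#   from torch.distributed.pipelining import ScheduleGPipe
#   stage = pipe.build_stage(stage_index=rank, device=device)
#   schedule = ScheduleGPipe(stage, n_microbatches=4)
#   schedule.step(example_input)  # first stage feeds inputs; later stages call step()
```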

@kwen2501 requested a review from @albanD as a code owner May 28, 2024 23:07

pytorch-bot bot commented May 28, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/127354

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit 0f84020 with merge base 70724bd:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot bot added the labels `oncall: distributed` and `release notes: distributed (pipeline)` May 28, 2024
kwen2501 added a commit that referenced this pull request May 28, 2024
ghstack-source-id: f7a211e
Pull Request resolved: #127354
@kwen2501 requested review from @wconstab and @H-Huang May 28, 2024 23:11
@kwen2501 added the labels `module: bc-breaking`, `topic: bc breaking`, `suppress-bc-linter`, and `suppress-api-compatibility-check` May 28, 2024
@wconstab (Contributor) left a comment:

oops i submitted a parallel PR to remove old pipeline.rst. ill cancel mine.

@@ -606,29 +606,22 @@
# torch.distributed.optim.utils
"as_functional_optim",
"register_functional_optim",
# torch.distributed.pipeline.sync.checkpoint
A Collaborator commented on this line:
Please delete the corresponding entries! Not just the comment saying which module they come from!

@kwen2501 (Contributor Author) replied May 31, 2024:
hahahaha, sorry my bad, I thought those are real entries, but just commented out :) Thanks a lot!
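
For context, the hunk above is from the documentation coverage ignore list (likely docs/source/conf.py); the review request is to delete the retired module's entries themselves, not only the comment line that labels them. A hypothetical illustration of the point (the pipeline.sync entry names shown are placeholders, not the actual file contents):

```python
# Hypothetical excerpt of the coverage-ignore list in docs/source/conf.py.
# The pipeline.sync entry names below are illustrative placeholders.
coverage_ignore_functions = [
    # torch.distributed.optim.utils
    "as_functional_optim",
    "register_functional_optim",
    # torch.distributed.pipeline.sync.checkpoint   <- deleting only this comment
    "checkpoint",                                  #    is not enough; the entries
    "is_checkpointing",                            #    it labels must be removed
    "is_recomputing",                              #    too, since the module is retired
]
```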

kwen2501 added a commit that referenced this pull request Jun 4, 2024
ghstack-source-id: 851179d
Pull Request resolved: #127354
kwen2501 added a commit that referenced this pull request Jun 4, 2024
ghstack-source-id: 1405ee5
Pull Request resolved: #127354
@kwen2501 (Contributor Author) commented Jun 4, 2024

@pytorchbot merge -f "Minor change to fix conflict. All tests have passed previously"

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as last resort and instead consider -i/--ignore-current to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: Check the merge workflow status here

@huydhn (Contributor) commented Jun 4, 2024

@pytorchbot revert -m 'Sorry for reverting your change but the doc build failure looks legit https://hud.pytorch.org/pytorch/pytorch/commit/b9c058c203ee38032594f898f27cd8404f113a63' -c ignoredsignal

@pytorchmergebot (Collaborator)

@pytorchbot successfully started a revert job. Check the current status here.
Questions? Feedback? Please reach out to the PyTorch DevX Team

pytorchmergebot added a commit that referenced this pull request Jun 4, 2024
This reverts commit b9c058c.

Reverted #127354 on behalf of https://github.com/huydhn due to Sorry for reverting your change but the doc build failure looks legit https://hud.pytorch.org/pytorch/pytorch/commit/b9c058c203ee38032594f898f27cd8404f113a63 ([comment](#127354 (comment)))
@pytorchmergebot (Collaborator)

@kwen2501 your PR has been successfully reverted.

bigfootjon pushed a commit that referenced this pull request Jun 5, 2024
Actually retiring module after deprecation warning for a while.
The new supported module is: torch.distributed.pipelining.
Please migrate.

Pull Request resolved: #127354
Approved by: https://github.com/wconstab

(cherry picked from commit b9c058c)
bigfootjon pushed a commit that referenced this pull request Jun 5, 2024
This reverts commit b9c058c.

Reverted #127354 on behalf of https://github.com/huydhn due to Sorry for reverting your change but the doc build failure looks legit https://hud.pytorch.org/pytorch/pytorch/commit/b9c058c203ee38032594f898f27cd8404f113a63 ([comment](#127354 (comment)))

(cherry picked from commit 0ff6023)
petrex pushed a commit to petrex/pytorch that referenced this pull request Jun 5, 2024
Actually retiring module after deprecation warning for a while.
The new supported module is: torch.distributed.pipelining.
Please migrate.

Pull Request resolved: pytorch#127354
Approved by: https://github.com/wconstab
petrex pushed a commit to petrex/pytorch that referenced this pull request Jun 5, 2024
@kwen2501 (Contributor Author) commented Jun 7, 2024

@pytorchbot merge -f "Minor doc fix; CI should pass now"

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as last resort and instead consider -i/--ignore-current to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: Check the merge workflow status here

@pytorchmergebot (Collaborator)

Merge failed

Reason: Command git -C /home/runner/work/pytorch/pytorch cherry-pick -x 37c5c4d93d2e16fa11d0ad070b8a00c77d5b05b6 returned non-zero exit code 1

Auto-merging .lintrunner.toml
Auto-merging docs/source/conf.py
CONFLICT (content): Merge conflict in docs/source/conf.py
Auto-merging docs/source/distributed.pipelining.rst
CONFLICT (content): Merge conflict in docs/source/distributed.pipelining.rst
Auto-merging test/allowlist_for_publicAPI.json
Auto-merging torch/distributed/pipelining/PipelineSchedule.py
error: could not apply 37c5c4d93d2... Retire torch.distributed.pipeline
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git cherry-pick --continue".
hint: You can instead skip this commit with "git cherry-pick --skip".
hint: To abort and get back to the state before "git cherry-pick",
hint: run "git cherry-pick --abort".
hint: Disable this message with "git config advice.mergeConflict false"
Details for Dev Infra team (raised by workflow job)

kwen2501 added a commit that referenced this pull request Jun 7, 2024
ghstack-source-id: b354d98
Pull Request resolved: #127354
@kwen2501 (Contributor Author) commented Jun 7, 2024

@pytorchbot merge -f "rebase"


pytorch-bot bot commented Jun 7, 2024

You need to provide a reason for using force merge, in the format @pytorchbot merge -f 'Explanation'.
The explanation needs to be clear on why this is needed. Here are some good examples:

  • Bypass checks due to unrelated upstream failures from ...
  • This is a minor fix to ..., which shouldn't break anything
  • This is pre-tested in a previous CI run
  • Bypass flaky ... check

@kwen2501 (Contributor Author) commented Jun 7, 2024

@pytorchbot merge -f "this is a rebase to resolve merge conflict in doc"

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as last resort and instead consider -i/--ignore-current to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: Check the merge workflow status here

TharinduRusira pushed a commit to TharinduRusira/pytorch that referenced this pull request Jun 14, 2024
Actually retiring module after deprecation warning for a while.
The new supported module is: torch.distributed.pipelining.
Please migrate.

Pull Request resolved: pytorch#127354
Approved by: https://github.com/wconstab
@github-actions bot deleted the gh/kwen2501/40/head branch July 8, 2024 01:58