
Conversation

angelayi (Contributor)

Test Plan: CI

Differential Revision: D53075379


pytorch-bot bot commented Jan 26, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/118425

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit fe03b0f with merge base 65f8276:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D53075379

Comment on lines 22 to 24

    args = fx_pytree.tree_flatten_spec(
        args, self._in_spec, exact_structural_match=True
    )  # type: ignore[assignment]
angelayi (Contributor Author)

@suo I had to change this back to tree_flatten_spec because it's able to handle cases where we pass in kwargs in a different order than how they were exported (e.g. {"kwarg1": 1, "kwarg2": 2} vs. {"kwarg2": 2, "kwarg1": 1}). Do you have a better suggestion?
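To illustrate the point: a spec-driven flatten looks values up by the key order recorded at export time, so the caller's insertion order doesn't matter. This is a minimal hypothetical sketch, not the actual fx_pytree.tree_flatten_spec implementation; flatten_by_spec and spec_keys are invented names for illustration.

```python
def flatten_by_spec(kwargs, spec_keys):
    """Flatten a kwargs dict into a list ordered by the exported spec's keys.

    spec_keys is the key order recorded at export time; looking values up
    by key makes the caller's insertion order irrelevant.
    """
    if set(kwargs) != set(spec_keys):
        # analogous to exact_structural_match=True: extra/missing keys fail
        raise ValueError("kwargs do not structurally match the exported spec")
    return [kwargs[k] for k in spec_keys]

spec_keys = ["kwarg1", "kwarg2"]  # order recorded at export time

# Both call orders flatten to the same leaf list.
a = flatten_by_spec({"kwarg1": 1, "kwarg2": 2}, spec_keys)
b = flatten_by_spec({"kwarg2": 2, "kwarg1": 1}, spec_keys)
assert a == b == [1, 2]
```

A plain positional flatten of the dict's values would instead yield [1, 2] and [2, 1] for the two calls, which is the mismatch being avoided here.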

suo (Member)

strictly speaking, what you described should be an error:

[screenshot]

suo (Member)

either way, not supporting custom pytrees is a bigger downside than not being relaxed about dict ordering, so still not sold on tree_flatten_spec here.
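The tradeoff being weighed here can be sketched in plain Python (this is a hypothetical toy, not the torch.utils._pytree API): a registry-based tree_flatten lets users register custom container types, while a flatten that only understands builtin nodes silently treats a custom container as an opaque leaf.

```python
# Toy pytree registry: maps a container type to a function returning its children.
FLATTEN_REGISTRY = {
    dict: lambda d: list(d.values()),
    list: lambda l: list(l),
}

class Point:
    """A user-defined 'custom pytree' container."""
    def __init__(self, x, y):
        self.x, self.y = x, y

# Registry-based flattening is extensible: users can register their own types.
FLATTEN_REGISTRY[Point] = lambda p: [p.x, p.y]

def tree_flatten(obj):
    """Recursively flatten obj into leaves using the registry."""
    fn = FLATTEN_REGISTRY.get(type(obj))
    if fn is None:
        return [obj]  # unregistered type: a leaf
    leaves = []
    for child in fn(obj):
        leaves.extend(tree_flatten(child))
    return leaves

def flatten_builtins_only(obj):
    """A flatten that only knows builtin dicts, like a fixed-spec flatten."""
    if isinstance(obj, dict):
        leaves = []
        for v in obj.values():
            leaves.extend(flatten_builtins_only(v))
        return leaves
    return [obj]

pt = Point(1, 2)
assert tree_flatten({"a": pt, "b": 3}) == [1, 2, 3]          # Point is flattened
assert flatten_builtins_only({"a": pt, "b": 3}) == [pt, 3]   # Point left opaque
```

Losing the second behavior is the "bigger downside" the comment refers to, even though the first approach is stricter about dict ordering.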

suo (Member) left a comment

probably still no on tree_flatten_spec, can we rewrite the tests instead?


pytorch-bot added the ciflow/trunk (Trigger trunk jobs on your pull request) label Jan 26, 2024
angelayi added a commit to angelayi/pytorch that referenced this pull request Jan 27, 2024
Summary: Pull Request resolved: pytorch#118425

Test Plan: CI

Reviewed By: suo

Differential Revision: D53075379
suo (Member) commented Jan 29, 2024

for the pytree thing #118505


facebook-github-bot (Contributor)

@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)

pytorchmergebot (Collaborator)

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as a last resort; instead, consider -i/--ignore-current to continue the merge while ignoring current failures. This allows currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

jeffdaily pushed a commit to ROCm/pytorch that referenced this pull request Feb 8, 2024
Test Plan: CI

Differential Revision: D53075379

Pull Request resolved: pytorch#118425
Approved by: https://github.com/suo
Labels: ciflow/inductor, ciflow/trunk (Trigger trunk jobs on your pull request), fb-exported, Merged