Replace rank inference with shape inference for Einsum op #6010
Conversation
Thanks for your contribution! Could you check the CI errors?
It would be helpful to improve the tests for Einsum, starting from this line: onnx/onnx/test/shape_inference_test.py, Line 7023 in a600e2f
Signed-off-by: peishenyan <peishen.yan@intel.com>
Codecov Report: All modified and coverable lines are covered by tests ✅

@@ Coverage Diff @@
##             main    #6010      +/-   ##
==========================================
+ Coverage   56.82%   56.94%   +0.11%
==========================================
  Files         506      506
  Lines       30377    30461     +84
  Branches     4592     4592
==========================================
+ Hits        17263    17347     +84
  Misses      12285    12285
  Partials      829      829
Force-pushed from bf9f7b1 to 6ce4699
Hi @justinchuby, thanks for your comments. I have improved the tests for Einsum and fixed some bugs in my previous code with the help of the tests! The code is ready for review now. PTAL, thanks!
LGTM. I wonder if there are more test cases that should be added? Consider referring to the pytorch tests here: https://github.com/pytorch/pytorch/blob/02bb2180f497d6d10dd11da51fffefaa5af80aca/torch/testing/_internal/common_methods_invocations.py#L6083-L6117
As well as some complicated ones:
Or this table in your reference: https://fdwr.github.io/MachineLearningOperators/OperatorFormulas.html
You may leverage
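A few of the equation patterns suggested above can be sanity-checked against numpy: the shape `numpy.einsum` produces for each equation is exactly what the new shape inference should report. The case list below is illustrative, not taken from the PR's tests:

```python
import numpy as np

# Illustrative sanity check (cases are assumptions, not from the PR):
# for each equation, the output shape numpy.einsum produces is what
# Einsum shape inference should report.
cases = [
    ("ij,jk->ik", [(2, 3), (3, 4)]),                 # matmul
    ("...ij,...jk->...ik", [(5, 2, 3), (5, 3, 4)]),  # batched matmul
    ("i,i->", [(3,), (3,)]),                         # inner product (scalar)
    ("ij->ji", [(2, 3)]),                            # transpose
    ("ij->i", [(2, 3)]),                             # reduction over j
]

observed = []
for equation, shapes in cases:
    out = np.einsum(equation, *[np.ones(s) for s in shapes])
    observed.append(out.shape)
    print(f"{equation}: {shapes} -> {out.shape}")
```

Checking the inferred `tensor_type.shape` of an Einsum node against such numpy-computed shapes is one way to extend the test coverage in `shape_inference_test.py`.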
Thank you!
Signed-off-by: peishenyan <peishen.yan@intel.com>
@gramalingam @xadupre could you help approve? Thanks!
That reminds me, I should review this one too (I only reviewed the ORT WebNN EP one)...
Two minor comments.
Head branch was pushed to by a user without write access
👍
continue;
}
const auto inserted = label_maps.insert({term[index], num_labels}).second;
Check notice (Code scanning / CodeQL): For loop variable changed in body
}
}
} else { // Infer the dimension for right-hand side
// If there's an ellipsis, add its corresponding dimensions

Check notice (Code scanning / CodeQL): For loop variable changed in body
@fdwr: do you know if the onnxruntime implementation supports broadcasting across the ellipsis dimensions? The ONNX spec says "broadcasting", which would mean the shape inference should apply the broadcasting logic for the ellipsis dimensions (or, at least, leave them as unknown).
(Update) Judging from this comment in the CPU EP, it probably does support ellipsis broadcasting:
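A quick numpy check (with hypothetical shapes chosen for illustration) shows the behavior in question: `einsum` broadcasts across the ellipsis dimensions, so shape inference would have to apply broadcasting rules to them rather than requiring them to match exactly:

```python
import numpy as np

# Batch (ellipsis) dims (1, 4) and (5, 1) broadcast to (5, 4);
# the matrix dims contract as in an ordinary matmul.
a = np.ones((1, 4, 2, 3))
b = np.ones((5, 1, 3, 6))
c = np.einsum("...ij,...jk->...ik", a, b)
print(c.shape)  # (5, 4, 2, 6)
```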
👍
Thanks for the extension ... greatly appreciated.
In the development of ONNX Runtime, we need to know the output shape of each op node for static graph compilation. However, we found that ONNX shape inference could produce almost all output shapes except the output shape of Einsum. In `onnx/defs/math/defs.cc`, there was only a rank-inference function for Einsum instead of shape inference. Given the results of equation parsing and the input shapes of the Einsum operation, we can easily infer the output shape. As such, we have expanded the rank inference to include shape inference for Einsum.

Signed-off-by: peishenyan <peishen.yan@intel.com>
Co-authored-by: G. Ramalingam <grama@microsoft.com>
Signed-off-by: Linsho Kaku <linsho@preferred.jp>
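The core idea ("given the results of equation parsing and the input shapes, infer the output shape") can be sketched in a few lines. This is a simplified illustration under stated assumptions (no ellipsis or broadcasting handling, an explicit `->` required), not the C++ implementation in `defs.cc`:

```python
def infer_einsum_shape(equation, input_shapes):
    """Simplified sketch: map each label to its dimension, then read
    the output shape off the right-hand side of the equation.
    Assumes an explicit '->' and no ellipsis/broadcasting."""
    lhs, rhs = equation.replace(" ", "").split("->")
    label_dims = {}
    for term, shape in zip(lhs.split(","), input_shapes):
        for label, dim in zip(term, shape):
            # A repeated label must carry a consistent dimension.
            if label in label_dims and label_dims[label] != dim:
                raise ValueError(f"dimension mismatch for label {label!r}")
            label_dims[label] = dim
    return tuple(label_dims[label] for label in rhs)

# Agrees with the actual Einsum output for a plain matmul equation:
print(infer_einsum_shape("ij,jk->ik", [(2, 3), (3, 4)]))  # (2, 4)
```

The real implementation additionally tracks the ellipsis span per input and merges unknown dimensions, which is where most of the added complexity in the PR lives.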
…22376)

Description: Pick up onnx/onnx#6010 to support EinSum shape inference.

Motivation and Context: This change allows the EinSum operator's output shape to be inferred so that it can run on accelerators.