Conversation

zewenli98
Collaborator

Description

Support 1d ITensor offsets for embedding_bag converter.
Note that this is only for 1d inputs.

There is a bug where the same offsets passed with different types (tensor vs. ITensor) give different results when include_last_offset=True. I suspect this is a bug in PyTorch; I'm still investigating it.
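For context, here is a minimal pure-Python sketch (not the converter code, and independent of TRT) of how 1d offsets partition `indices` into bags in "sum" mode, including the `include_last_offset` semantics that the bug above involves:

```python
def embedding_bag_sum(weight, indices, offsets, include_last_offset=False):
    """Sketch of 1d embedding_bag "sum" mode semantics (illustrative only).

    weight: list of embedding rows (each a list of floats)
    indices: 1d list of row indices into weight
    offsets: 1d list of bag start positions into indices
    """
    # With include_last_offset=True, the final offset marks the *end* of the
    # last bag, so there are len(offsets) - 1 bags instead of len(offsets).
    if include_last_offset:
        starts, ends = offsets[:-1], offsets[1:]
    else:
        starts, ends = offsets, offsets[1:] + [len(indices)]
    dim = len(weight[0])
    bags = []
    for s, e in zip(starts, ends):
        bag = [0.0] * dim
        for i in indices[s:e]:
            bag = [a + b for a, b in zip(bag, weight[i])]
        bags.append(bag)
    return bags

weight = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
indices = [0, 1, 2, 1]
# offsets=[0, 2] and offsets=[0, 2, 4] with include_last_offset=True should
# describe the same two bags, so both calls should agree:
print(embedding_bag_sum(weight, indices, [0, 2]))                              # [[4.0, 6.0], [8.0, 10.0]]
print(embedding_bag_sum(weight, indices, [0, 2, 4], include_last_offset=True)) # [[4.0, 6.0], [8.0, 10.0]]
```

In other words, the two offset forms above are equivalent by construction, which is why getting different results for the same offsets depending on their type (tensor vs. ITensor) looks like a bug rather than intended behavior.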

Fixes #2345

Type of change

  • New feature (non-breaking change which adds functionality)

Checklist:

  • My code follows the style guidelines of this project (You can use the linters)
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas and hacks
  • I have made corresponding changes to the documentation
  • I have added tests to verify my fix or my feature
  • New and existing unit tests pass locally with my changes
  • I have added the relevant labels to my PR so that relevant reviewers are notified

@github-actions github-actions bot added component: tests Issues re: Tests component: conversion Issues re: Conversion stage component: converters Issues re: Specific op converters component: api [Python] Issues re: Python API component: dynamo Issues relating to the `torch.compile` or `torch._dynamo.export` paths labels Mar 8, 2024
@github-actions github-actions bot requested a review from peri044 March 8, 2024 01:24
@zewenli98 zewenli98 self-assigned this Mar 8, 2024
@github-actions github-actions bot added the component: lowering Issues re: The lowering / preprocessing passes label Mar 12, 2024
@zewenli98 zewenli98 changed the title [WIP] feat: support 1d ITensor offsets for embedding_bag converter feat: support 1d ITensor offsets for embedding_bag converter Mar 12, 2024
Collaborator

@gs-olive gs-olive left a comment

Overall a great improvement to embedding_bag; I added a few small comments and suggestions.

@zewenli98
Collaborator Author

Thanks for the review! I just refactored the embedding_bag converter to use native TRT layers.

@zewenli98 zewenli98 requested a review from gs-olive April 4, 2024 01:05
Collaborator

@gs-olive gs-olive left a comment

Overall, looks good! Added one small comment; pending CI validation.

@zewenli98 zewenli98 requested a review from gs-olive April 4, 2024 18:06
@zewenli98
Collaborator Author

This is currently blocked by a Myelin bug. Filed an NVBug internally.

@zewenli98 zewenli98 force-pushed the embedding_bag_1d_ITensor_offsets_dynamo_converter branch from 2d40091 to 1f58d47 Compare May 1, 2024 00:23
@zewenli98 zewenli98 requested a review from gs-olive May 1, 2024 00:27
@zewenli98
Collaborator Author

zewenli98 commented May 1, 2024

With the latest PyTorch, TRT-10 GA, and the Torch-TRT main branch, the embedding_bag converter now works on DLRM.

@zewenli98 zewenli98 merged commit de81be2 into pytorch:main May 1, 2024