Support nn.EmbeddingBag #519
Conversation
@alexandresablayrolles has updated the pull request. You must reimport the pull request before landing.
@facebook-github-bot has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Cool, that definitely makes sense, given that the inputs to the backward hook are themselves a list of tensors.
Left one inline comment. Additionally, please take a look at the linter output. The unit test error is irrelevant, as it's inherited from main, but I think the flake failure is new.
@register_grad_sampler(nn.EmbeddingBag)
def compute_embeddingbag_gradsampler(layer, inputs, backprops):
Can we maybe have a test for it, like we have for other supported layers?
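For context on why this layer needs special handling: `nn.EmbeddingBag`'s forward pass takes more than one tensor (the indices and the bag offsets), which is what motivates capturing activations as a list rather than a single tensor. A minimal PyTorch sketch (the sizes and values are illustrative, not taken from the PR's tests):

```python
import torch
import torch.nn as nn

# nn.EmbeddingBag pools variable-length bags of embeddings.
# Its forward takes TWO tensors: flat indices plus per-bag offsets.
bag = nn.EmbeddingBag(num_embeddings=10, embedding_dim=3, mode="sum")

indices = torch.tensor([1, 2, 4, 5, 4, 3, 2, 9])
offsets = torch.tensor([0, 4])  # two bags: indices[0:4] and indices[4:8]

out = bag(indices, offsets)
print(out.shape)  # torch.Size([2, 3]) -- one pooled row per bag
```

Because the module consumes two input tensors per call, a grad sampler (and the activation-capturing hook feeding it) cannot assume `forward_input[0]` is the whole story.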
@alexandresablayrolles has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Force-pushed from f7af51d to 6bdf86f (commit message: “…r embeddingbag as it is difficult to make it work.”)
@@ -282,7 +282,7 @@ def capture_activations_hook(
     if not hasattr(module, "activations"):
         module.activations = []
-    module.activations.append(forward_input[0].detach())  # pyre-ignore
+    module.activations.append([t.detach() for t in forward_input])  # pyre-ignore
should this be append or extend?
I think append is right, as activations is a list of tuples in this case
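The distinction the reviewers are weighing can be shown with plain lists (the tuple contents here are illustrative stand-ins, not the real tensors): `append` keeps each forward call's inputs grouped as one element, while `extend` would flatten the per-call grouping away.

```python
# Hypothetical stand-ins for the inputs captured on two forward calls.
call1 = ("indices_1", "offsets_1")
call2 = ("indices_2", "offsets_2")

# append: one entry per forward call, pairing preserved.
activations_append = []
activations_append.append(call1)
activations_append.append(call2)
print(activations_append)
# [('indices_1', 'offsets_1'), ('indices_2', 'offsets_2')]

# extend: elements spliced in individually, pairing lost.
activations_extend = []
activations_extend.extend(call1)
activations_extend.extend(call2)
print(activations_extend)
# ['indices_1', 'offsets_1', 'indices_2', 'offsets_2']
```

With `append`, popping one entry later recovers all inputs of a single forward call together, which is what the grad sampler needs.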
LGTM
WIP. Passing activations as a list instead of as a tensor.
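A standalone sketch of that idea, using a plain forward hook and a module-level list; this mirrors the diff in the conversation but is not the Opacus implementation itself:

```python
import torch
import torch.nn as nn

activations = []  # one entry per forward call

def capture_activations_hook(module, forward_input, forward_output):
    # Capture every positional input tensor, not just forward_input[0],
    # so multi-input modules like nn.EmbeddingBag are handled.
    activations.append([t.detach() for t in forward_input])

bag = nn.EmbeddingBag(5, 2, mode="sum")
bag.register_forward_hook(capture_activations_hook)
bag(torch.tensor([0, 1, 2, 4]), torch.tensor([0, 2]))

print(len(activations))     # 1 -- one forward call captured
print(len(activations[0]))  # 2 -- indices and offsets
```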