opinfo : nn.functional.embedding #66622
Conversation
💊 CI failures summary (Dr. CI, as of commit fce376a): ci.pytorch.org: 1 failed.
@zou3519 gentle ping :)
These all make sense; those flags on embedding cause weird things to happen to the gradients.
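To illustrate the point above, here is a hedged, torch-free sketch (not PyTorch internals; all names are hypothetical) of embedding as a table lookup, where a flag like `padding_idx` forces one row's gradient to zero. Flag-dependent behavior of this kind is exactly what can disagree with a numerical gradient check.

```python
# Hypothetical minimal sketch of an embedding lookup and its backward pass.
# The padding_idx flag zeroes the gradient for one table row, which a
# finite-difference (numerical) gradient would not reproduce.

def embedding_forward(weight, indices):
    # Forward is a plain row lookup: out[i] = weight[indices[i]].
    return [list(weight[i]) for i in indices]

def embedding_backward(grad_out, indices, num_rows, dim, padding_idx=None):
    # Backward scatter-adds grad_out rows into the weight gradient,
    # except for the padding row, which receives no gradient at all.
    grad_w = [[0.0] * dim for _ in range(num_rows)]
    for row, idx in zip(grad_out, indices):
        if idx == padding_idx:
            continue  # padding row's gradient stays zero
        for j, g in enumerate(row):
            grad_w[idx][j] += g
    return grad_w

weight = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
out = embedding_forward(weight, [0, 2, 0])
grad_w = embedding_backward([[1.0, 1.0]] * 3, [0, 2, 0], 3, 2, padding_idx=0)
print(out)     # rows 0, 2, 0 of the table
print(grad_w)  # row 0 stays all-zero because of padding_idx
```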
yield SampleInput(make_input((M, S)), args=(idx,),)

if not requires_grad:
    # The following inputs return a gradient that differs from the numerical gradient.
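The gating pattern in the diff above can be sketched as follows. This is a self-contained, hypothetical stand-in (the `SampleInput` namedtuple, `make_input` stub, and shapes are illustrative, not the actual OpInfo machinery): differentiable samples are always yielded, while samples whose analytical gradient is expected to differ from the numerical one are yielded only when gradients are not required, so gradcheck never sees them.

```python
# Illustrative sketch of gating nondifferentiable samples on requires_grad.
from collections import namedtuple

SampleInput = namedtuple("SampleInput", ["input", "args"])

def sample_inputs_embedding(make_input, requires_grad):
    idx = [0, 1]  # stand-in for an index tensor
    # Always-safe sample: its analytical gradient matches the numerical one.
    yield SampleInput(make_input((4, 3)), args=(idx,))
    if not requires_grad:
        # These inputs return a gradient that differs from the numerical
        # gradient, so they are skipped whenever gradcheck will run.
        yield SampleInput(make_input((4, 3)), args=(idx,))

make = lambda shape: [[0.0] * shape[1] for _ in range(shape[0])]
print(len(list(sample_inputs_embedding(make, requires_grad=True))))   # 1
print(len(list(sample_inputs_embedding(make, requires_grad=False))))  # 2
```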
nit: we should make it clear that these are expected
LGTM! One really minor nit
@zou3519 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Sorry, I will have to revert this as it is breaking master.
This pull request has been reverted by 067365fa87516cdaa9d6df3984f4ec13583016a7. To re-land this change, follow these steps.
This pull request has been reverted by 94f4b22. To re-land this change, follow these steps.
I guess we should have rebased first. @kshitij12345 could you open a new PR please?
@soulitzer why was this reverted? Thanks
It didn't get reverted (again). I think this is just an issue with facebook-github-bot replaying the revert comment from earlier, so we probably don't need to worry. You can check that the revert commit has the same commit hash as in #66622 (comment).
@soulitzer Ah, I see. Apologies for the noise. Thanks!
Adds OpInfo for `nn.functional.embedding`. There are a few cases where the numerical gradient doesn't match the analytical one (gradcheck fails).
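For context, "gradcheck fails" means the analytical gradient disagrees with a finite-difference estimate. A hedged, self-contained sketch of that comparison on a toy function (illustrative names only, not the `torch.autograd.gradcheck` implementation):

```python
# Central finite-difference gradient estimate, compared against a
# hand-derived analytical gradient -- the core idea behind gradcheck.

def numerical_grad(f, x, eps=1e-6):
    grads = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        grads.append((f(xp) - f(xm)) / (2 * eps))
    return grads

f = lambda x: x[0] * x[0] + 3 * x[1]   # toy scalar function
analytical = [2 * 1.0, 3.0]            # d/dx0 = 2*x0, d/dx1 = 3 at x = [1, 2]
numerical = numerical_grad(f, [1.0, 2.0])
ok = all(abs(a - n) < 1e-4 for a, n in zip(analytical, numerical))
print(ok)  # True: the gradients match, so this function would pass gradcheck
```

When an op's backward intentionally deviates from the mathematical gradient (as with some embedding flags), this comparison fails even though the implementation is correct, which is why such samples are only generated when gradients are not required.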