Back out "Revert D34524207: [pytorch][PR] remove _s_where" #73579
Conversation
Summary: Original commit changeset: 87b1220d851c
Original Phabricator Diff: D34524207 (pytorch@4eb2482)
Test Plan: OSS tests
Differential Revision: D34554432
fbshipit-source-id: 4366db10349289ef447f95a2f0615b2b9447a633
💊 CI failures summary and remediations (as of commit 896f3ed, more details on the Dr. CI page): 🕵️ 1 new failure recognized by patterns. The following CI failures do not appear to be due to upstream breakages.
This pull request was exported from Phabricator. Differential Revision: D34554432
@JackCaoG, can you please prepare an XLA patch for this? The failure on the original PR: https://ossci-raw-job-status.s3.amazonaws.com/log/5371985105
This PR removes `_s_where`.
Looks like #62084 is stalled and won't be landed any time soon.
@ngimel OK, I can work on this fix now then.
Thanks @JackCaoG, I'll try to land this PR tomorrow; I'll let you know when it lands.
@ngimel Sounds good, I need to fix some errors on the XLA side, but it should be ready by tomorrow.
Thanks, let me know if you want to wait till Monday.
@ngimel Finding some additional bugs regarding
Is there any difference between `where` and `_s_where`?
Ah ok, it just landed but I'll revert. No, there is no difference between `where` and `_s_where`.
I got some XLA errors: PyTorch/XLA's `where` under the hood calls `_s_where`.
This pull request has been reverted by 5552563. To re-land this change, please open another pull request, assign the same reviewers, fix the CI failures that caused the revert, and make sure that the failing CI runs on the PR by applying the proper ciflow label (e.g., ciflow/trunk).
@ngimel Where can I find the `where` broadcasting logic? I run into issues like
which are easy to fix, but things like
are more confusing.
Broadcasting semantics are described in https://pytorch.org/docs/stable/notes/broadcasting.html?highlight=broadcasting: you line up tensors at the last dimension and expand dimensions of size 1 starting from there. Previously all the tensors were sent to
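The alignment rule above can be sketched with NumPy's `where`, which follows the same broadcasting semantics as `torch.where`; the shapes here are illustrative, not taken from the failing tests:

```python
import numpy as np

# Shapes are right-aligned at the last dimension; size-1 dimensions expand.
cond = np.array([[True], [False]])   # shape (2, 1)
x = np.array([10.0, 20.0, 30.0])     # shape (3,)
y = np.zeros((2, 3))                 # shape (2, 3)

# (2, 1), (3,) and (2, 3) broadcast to a common shape of (2, 3):
# cond repeats along the last dim, x repeats along the first dim.
out = np.where(cond, x, y)
print(out.shape)  # (2, 3)
print(out)        # row 0 is x, row 1 is y
```

Note that all three operands participate in broadcasting, so the condition itself may have a different shape from the value tensors.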
I was trying to reverse-engineer
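For anyone reimplementing the shape logic on the XLA side, the broadcast-shape computation can be written in a few lines of plain Python; this is a hypothetical helper following the documented NumPy/PyTorch rule, not PyTorch's actual implementation:

```python
def broadcast_shape(*shapes):
    """Compute the common broadcast shape per NumPy/PyTorch rules.

    Shapes are right-aligned; a dimension of size 1 stretches to match
    the other operands, otherwise all sizes in that slot must agree.
    """
    ndim = max(len(s) for s in shapes)
    # Left-pad every shape with 1s so they all have the same rank.
    padded = [(1,) * (ndim - len(s)) + tuple(s) for s in shapes]
    result = []
    for dims in zip(*padded):
        target = max(dims)
        if any(d != 1 and d != target for d in dims):
            raise ValueError(f"shapes {shapes} are not broadcastable")
        result.append(target)
    return tuple(result)

print(broadcast_shape((2, 1), (3,), (2, 3)))  # (2, 3)
```

For example, `broadcast_shape((2, 1), (3,))` yields `(2, 3)`, while `broadcast_shape((2,), (3,))` raises, since neither dimension is 1.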
@ngimel has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
@ngimel The XLA PR is ready, I think we are ready to merge.
@JackCaoG diff is landing
Summary: Original commit changeset: 87b1220d851c
Original Phabricator Diff: D34524207 (4eb2482)
Pull Request resolved: #73579
Test Plan: OSS tests; tested with canary https://www.internalfb.com/intern/ads/canary/441912928798660873
Reviewed By: ezyang
Differential Revision: D34688237
Pulled By: ngimel
fbshipit-source-id: 32f3a0046053ef52e95ab45a26bfc1de17e7e061
Hey @ngimel.