Testing: Unblock new_* testing on MPS
#137003
Conversation
By changing `other_dtype` to `torch.half` rather than `torch.double` in `sample_inputs_new_fns` when MPS is available.
🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/137003

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure, 1 Unrelated Failure as of commit e8f1603 with merge base 156ca01.

NEW FAILURE - The following job has failed.

FLAKY - The following job failed but was likely due to flakiness present on trunk.

This comment was automatically generated by Dr. CI and updates every 15 minutes.
```python
                   error_regex='margin_ranking_loss : All input tensors should')

def sample_inputs_new_fns(self, device, dtype, requires_grad, *, is_strided=False, **kwargs):
    other_dtype = torch.half if torch.backends.mps.is_available() else torch.double
```
Is there a better variable we can use to probe for float64 support?
No, not really; at least I can't think of any. Also, this test should perhaps be extended further to pick a different dtype when the input dtype is already `torch.double` or `torch.half`.
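The extension suggested above — choosing an `other_dtype` that is guaranteed to differ from the input dtype, while still falling back from `float64` where it is unsupported (as on MPS) — can be sketched as follows. This is a hypothetical helper, not code from the PR; plain strings stand in for `torch.dtype` objects so the sketch runs without PyTorch installed, and the real helper would compare `torch.dtype` values instead.

```python
# Hypothetical sketch of the dtype-selection rule discussed above.
# Strings like "float64" stand in for torch dtypes (torch.double, etc.)
# so the example is self-contained; in PyTorch test code the comparisons
# would use torch.dtype objects and torch.backends.mps.is_available().

def pick_other_dtype(input_dtype: str, mps_available: bool) -> str:
    """Pick a dtype different from input_dtype for sample inputs.

    MPS has no float64 support, so the preferred "other" dtype falls
    back to float16 there; if the preferred choice collides with the
    input dtype, switch to float32 instead.
    """
    # Preferred "other" dtype: double on most backends, half on MPS.
    other = "float16" if mps_available else "float64"
    if other == input_dtype:
        # Input already uses the preferred dtype; pick a different one
        # so the test still exercises a cross-dtype case.
        other = "float32"
    return other
```

For example, `pick_other_dtype("float64", mps_available=False)` avoids the collision and returns `"float32"`, while `pick_other_dtype("float32", mps_available=True)` returns `"float16"`.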
By changing `other_dtype` to `torch.half` rather than `double` in `sample_inputs_new_fns` if MPS is available.
Pull Request resolved: pytorch#137003
Approved by: https://github.com/Skylion007
ghstack dependencies: pytorch#136981, pytorch#136982, pytorch#136983, pytorch#136984, pytorch#136985, pytorch#136986

Tests like `new_*` and `empty_*` fail the current implementation, see
Pull Request resolved: pytorch#137004
Approved by: https://github.com/Skylion007
ghstack dependencies: pytorch#136981, pytorch#136982, pytorch#136983, pytorch#136984, pytorch#136985, pytorch#136986, pytorch#137003
Stack from ghstack (oldest at bottom):

- `None` as alias for all types #137004
- `new_*` testing on MPS #137003
- `nan_to_num` for bfloat16 #136986
- `torch.linalg.cross` for bfloat16 #136984
- `fmin`/`fmax`/`copysign` and `nextafter` to bfloat16 #136982

By changing `other_dtype` to `torch.half` rather than `double` in `sample_inputs_new_fns` if MPS is available.