Add max_pool3d for MPS
#156467
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/156467
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV: there is 1 currently active SEV. If your PR is affected, please view it below.
⏳ 11 Pending, 1 Unrelated Failure. As of commit 05ce616 with merge base 43a0918:
FLAKY - The following job failed but was likely due to flakiness present on trunk.
UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Attention! native_functions.yaml was changed. If you are adding a new function or defaulted argument to native_functions.yaml, you cannot use it from pre-existing Python frontend code until our FC window passes (two weeks). Split your PR into two PRs: one that adds the new C++ functionality, and one that makes use of it from Python, and land them two weeks apart. See https://github.com/pytorch/pytorch/wiki/PyTorch's-Python-Frontend-Backward-and-Forward-Compatibility-Policy#forwards-compatibility-fc for more info. Caused by:
Attention! One of PyTorch's C-stable API files was changed. You MUST NOT change existing function declarations in this file, as this header defines a stable C ABI. If you need to change the signature of a function, introduce a new v2 version of the function and modify code generation to target the new version. Caused by:
struct PoolingParams {
  int32_t dims;
  int32_t pooling_dims;
  _ARRAY_NS::array<int64_t, N> input_sizes;
One small thing to note about providing these parameters in a struct: the buffers in the struct need to be a fixed size at compile time (if I understand correctly), whereas before this change the sizes could be determined at runtime. So this puts a limit on how many dimensions we can support while reusing the same code. But at the moment PyTorch doesn't support anything higher than 3-D pooling with 2 leading dims, and I'm guessing there aren't any plans to support higher dimensionality, so it's probably not really an issue. Of course, if an arbitrary number of dims is needed in the future, we can switch back to passing these params directly to mtl_setArgs instead of in a struct.
Yes, this is correct. But even then there is an upper limit on the total number of tensor dimensions, which I think is currently 16.
Oh I didn't realize that, good to know
@pytorchbot merge -f "Hopefully it's all good"
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Stack from ghstack (oldest at bottom):
Add max_pool3d for MPS #156467
Fixes #100674
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov