Dispatch `numpy.take_along_axis` to `torch.take_along_dim` by guilhermeleobas · Pull Request #108880 · pytorch/pytorch
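For context, a minimal sketch of what this dispatch enables, assuming PyTorch's NumPy interop under torch.compile; the function and inputs are illustrative, modeled on the test_numpy_take_along_axis test referenced in the discussion below:

import numpy as np
import torch

@torch.compile
def fn(x, i, a):
    # Inside the compiled region, numpy.take_along_axis is traced through
    # torch._numpy and dispatched to torch.take_along_dim.
    return np.take_along_axis(x, i, a)

x = np.arange(6).reshape(2, 3)
i = np.array([[2, 0, 1], [0, 2, 1]])
print(fn(x, i, 1))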

Conversation


pytorch-bot bot commented Sep 8, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/108880

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit 1173859 with merge base bde75eb:

UNSTABLE - The following jobs failed, but the failures were likely due to flakiness present on trunk and have been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

Review thread on the diff (arguments passed to the gather sampling function):

    torch.float32,
    requires_grad=False,
    include_0d=False,
    include_empty=False,
Collaborator

What's the issue with the empty one?

Collaborator Author

It fails in a test case in test_misc.py because the indices have dtype uint8 (Byte):

...
torch/_dynamo/utils.py:1339: in <lambda>
    lambda: run_node(tx.output, node, args, kwargs, nnmodule)
torch/_dynamo/utils.py:1411: in run_node
    raise RuntimeError(fn_str + str(e)).with_traceback(e.__traceback__) from e
torch/_dynamo/utils.py:1398: in run_node
    return node.target(*args, **kwargs)
torch/_dynamo/utils.py:1699: in __call__
    out = self.f(*args, **kwargs)
torch/_numpy/_normalizations.py:213: in wrapped
    result = func(*args, **kwds)
torch/_numpy/_funcs_impl.py:889: in take_along_axis
    return torch.take_along_dim(arr, indices, axis)
E   torch._dynamo.exc.TorchRuntimeError: Failed running call_function <Wrapped function <original take_along_axis>>(*(FakeTensor(..., size=(s0,)), FakeTensor(..., size=(0,), dtype=torch.uint8), 0), **{}):
E   torch.take_along_dim(): dtype of indices should be Long but got Byte
E
E   from user code:
E      File "/home/guilhermeleobas/git/pytorch/test/dynamo/test_misc.py", line 1283, in fn
E       return np.take_along_axis(x, i, a)
E
E   Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information
E
E
E   You can suppress this exception and fall back to eager by setting:
E       import torch._dynamo
E       torch._dynamo.config.suppress_errors = True
E
E
E   To execute this test, run the following from the base repo dir:
E        python test/dynamo/test_misc.py -k test_numpy_take_along_axis
E
E   This message can be suppressed by setting PYTORCH_PRINT_REPRO_ON_FAILURE=0
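A minimal sketch of one way the decomposition can sidestep this, by casting the indices to long before dispatching; this is illustrative only, not necessarily the exact change in this PR:

import torch

def take_along_axis(arr: torch.Tensor, indices: torch.Tensor, axis: int):
    # torch.take_along_dim requires int64 ("Long") indices, while NumPy
    # accepts any integer dtype, so cast before dispatching.
    return torch.take_along_dim(arr, indices.to(torch.long), axis)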

Collaborator

Makes sense. Actually, it may be best to use sample_inputs_take_along_dim. Note that the order of the indices and dim arguments is swapped relative to gather.
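For reference, a small self-contained snippet (values are illustrative) showing the swapped argument order:

import torch

x = torch.tensor([[10., 20., 30.], [40., 50., 60.]])
idx = torch.tensor([[2, 0, 1], [0, 2, 1]])

# torch.gather is (input, dim, index); torch.take_along_dim is (input, indices, dim),
# which matches numpy.take_along_axis(arr, indices, axis).
assert torch.equal(torch.gather(x, 1, idx), torch.take_along_dim(x, idx, 1))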

Collaborator Author

Are you OK if I skip the samples from sample_inputs_take_along_dim that don't include dim?

Collaborator

What do you mean exactly?

Collaborator Author
guilhermeleobas commented Sep 12, 2023

Sorry for not including more information in my previous comment. There are two sample inputs in sample_inputs_take_along_dim that don't include the dim argument:

https://github.com/pytorch/pytorch/blob/f9a250c35bd061e2e6f4c2d92e2b1b16390e8636/torch/testing/_internal/common_methods_invocations.py#L2842-L2846

Collaborator

Actually, NumPy also has this behaviour (take_along_axis accepts axis=None and operates on the flattened array), so we should also implement it. Can you modify the decomposition accordingly?

Also, could you send a separate PR fixing the docs of this operation noting that dim can also be None?
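A sketch of one way the take_along_axis decomposition could mirror NumPy's axis=None behaviour by flattening first; this is a hedged illustration, not necessarily the exact code landed in this PR:

import torch

def take_along_axis(arr: torch.Tensor, indices: torch.Tensor, axis=None):
    # NumPy semantics: with axis=None the array is treated as if it had
    # first been flattened to 1-D, and indices index into that flat view.
    if axis is None:
        return torch.take_along_dim(arr.flatten(), indices.to(torch.long).flatten(), 0)
    # torch.take_along_dim requires Long indices; NumPy accepts any integer dtype.
    return torch.take_along_dim(arr, indices.to(torch.long), axis)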

guilhermeleobas added a commit that referenced this pull request Sep 8, 2023
Collaborator
lezcano left a comment

Just a small note; feel free to merge once you change the sampling function. You can even clean up the include_0d and include_empty args for sample_inputs_gather, as these will not be used anywhere any more.

In a stack, if you call pytorchbot merge on a given PR, it'll merge that PR and all the previous ones.


guilhermeleobas added a commit that referenced this pull request Sep 12, 2023
Collaborator
lezcano commented Sep 12, 2023

Also, note that you can even clean up the include_0d and include_empty args for sample_inputs_gather, as these will not be used anywhere any more.

pytorchmergebot pushed a commit that referenced this pull request Sep 13, 2023
pytorchmergebot pushed a commit that referenced this pull request Sep 13, 2023
facebook-github-bot deleted the gh/guilhermeleobas/3/head branch on September 17, 2023, 14:24