Aten ops might not support non-contiguous inputs, this is a problem for fallbacks. · Issue #140462 · pytorch/pytorch

@Chillee

Description

🚀 The feature, motivation and pitch

#140452 is caused by `aten._weight_norm_interface_backward` not supporting non-contiguous inputs.

Adding `make_fallback(aten._weight_norm_interface_backward, require_contiguous)` to `lowerings.py` fixed the issue.

But this seems like a broader problem: I don't think we strictly require that all operators support non-contiguous inputs in PyTorch, and comprehensive padding exacerbates the problem significantly.
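For reference, a sketch of what that one-line mitigation looks like as a fallback registration. The import path (`torch._inductor.lowering`) reflects current PyTorch source layout and may differ across versions; this is illustrative, not the merged patch itself:

```python
# Sketch of the mitigation described above (assumed import path; the
# real fix is a line added inside torch/_inductor/lowering.py itself).
# make_fallback() routes the op to its eager Aten kernel instead of an
# Inductor lowering, and require_contiguous is a layout constraint that
# makes Inductor materialize contiguous inputs before that kernel runs.
import torch
from torch._inductor.lowering import make_fallback, require_contiguous

aten = torch.ops.aten
make_fallback(aten._weight_norm_interface_backward, require_contiguous)
```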
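To make the failure mode concrete, here is a minimal, self-contained illustration in plain PyTorch (not Inductor internals) of how a non-contiguous input arises, and how forcing a copy restores the layout that a contiguity-assuming kernel expects:

```python
import torch

x = torch.arange(12, dtype=torch.float32).reshape(3, 4)
y = x.t()  # transposed view: same storage as x, strides (1, 4)

# A kernel that assumes row-major (contiguous) layout would read x's
# memory order here and produce wrong results for y.
print(y.is_contiguous())  # False

# Forcing a copy is effectively what require_contiguous guarantees
# before the fallback kernel is called.
z = y.contiguous()
print(z.is_contiguous())  # True: fresh row-major copy, same values
assert torch.equal(y, z)
```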

cc: @shunting314

Alternatives

No response

Additional context

No response

cc @ezyang @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @aakhundov

Labels

module: inductor · oncall: pt2 · triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
