Labels: module: inductor, oncall: pt2, triaged
Description
🚀 The feature, motivation and pitch
#140452 is caused by `aten._weight_norm_interface_backward` not supporting non-contiguous inputs.
Adding `make_fallback(aten._weight_norm_interface_backward, require_contiguous)` to `torch/_inductor/lowering.py` fixed the issue (see the sketch below).
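For context, here is a minimal sketch of what that one-line fix looks like, assuming the existing `make_fallback` and `require_contiguous` helpers in `torch/_inductor/lowering.py`; this is illustrative rather than the exact upstream patch:

```python
# Sketch: register the op as an eager fallback with a contiguity constraint.
# make_fallback and require_contiguous are existing helpers in
# torch/_inductor/lowering.py; in the real fix this line lives in that file.
import torch
from torch._inductor.lowering import make_fallback, require_contiguous

aten = torch.ops.aten

# Inductor will now call the eager kernel for this op and force its tensor
# inputs to be contiguous first, sidestepping the kernel's stride assumption.
make_fallback(aten._weight_norm_interface_backward, require_contiguous)
```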
But this seems like a broader problem: we don't strictly require that every operator support non-contiguous inputs in PyTorch, so any eager kernel that assumes contiguity can break when Inductor hands it non-contiguous tensors. Comprehensive padding exacerbates this significantly, since it pads tensor strides and therefore makes non-contiguous inputs far more common.
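To make the padding point concrete, here is a small illustration in plain PyTorch (no Inductor needed) of how padding strides past the logical row length produces tensors that fail the contiguity check an eager kernel might rely on:

```python
import torch

# Padding strides past the logical row length yields a tensor whose layout
# is dense per row but not contiguous overall.
storage = torch.empty(4, 10)  # rows padded to 10 elements each
padded = storage[:, :8]       # logical shape (4, 8), strides (10, 1)

print(padded.shape, padded.stride())  # torch.Size([4, 8]) (10, 1)
print(padded.is_contiguous())         # False: a kernel that assumes
                                      # contiguous inputs can misread memory
```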
cc: @shunting314
Alternatives
No response
Additional context
No response
cc @ezyang @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @aakhundov