Closed
Labels
actionable, module: autograd, triaged
Description
🐛 Describe the bug
When torch.tensordot is called with tensors that have requires_grad=True and an out argument, it should raise a RuntimeError, since functions with out=... arguments do not support automatic differentiation. Instead, the call currently completes without raising any error.
Code
import torch
# Create input tensors with requires_grad=True
a = torch.empty((2, 3), requires_grad=True)
b = torch.empty((3, 4), requires_grad=True)
c = torch.empty((2, 4))  # pre-allocated output passed via out=
# Expected: RuntimeError "functions with out=... arguments don't support automatic differentiation",
# but the call currently completes without raising
torch.tensordot(a, b, dims=([1], [0]), out=c)
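For comparison, other out= overloads already reject this situation in eager mode. The sketch below (my assumption, using torch.mm as a stand-in reference op on the same shapes; not part of the original report) shows the behavior tensordot is expected to match:
import torch
a = torch.empty((2, 3), requires_grad=True)
b = torch.empty((3, 4), requires_grad=True)
c = torch.empty((2, 4))
try:
    torch.mm(a, b, out=c)  # same shapes as the tensordot repro above
except RuntimeError as e:
    # Prints something like: "mm(): functions with out=... arguments don't
    # support automatic differentiation, but one of the arguments requires grad"
    print(e)
Until this is fixed, a workaround is to drop out= and let tensordot allocate the result itself, which keeps the autograd graph intact.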
Similar PR
Versions
PyTorch version: 2.6.0+cu124
Is debug build: False
CUDA used to build PyTorch: 12.4
ROCM used to build PyTorch: N/A
cc @ezyang @albanD @gqchen @pearu @nikitaved @soulitzer @Varal7 @xmfan