
RuntimeError not raised for out= argument in torch.tensordot with requires_grad tensors #147846

@vwrewsge


🐛 Describe the bug

When torch.tensordot is called with tensors that have requires_grad=True and the out argument is passed, it should raise a RuntimeError, because out= variants do not support automatic differentiation. Currently, no error is raised.

Code

import torch

# Create input tensors with requires_grad=True
a = torch.empty((2, 3), requires_grad=True)
b = torch.empty((3, 4), requires_grad=True)
c = torch.empty((2, 4))

# Expected: RuntimeError: "functions with out=... arguments don't support automatic differentiation"
# Actual: completes silently, no error is raised
torch.tensordot(a, b, dims=([1], [0]), out=c)
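
For comparison, the out= variants of native ops do raise in the same situation; a minimal check using torch.mm (choosing torch.mm as the comparison op is my assumption, not from the report):

import torch

a = torch.empty((2, 3), requires_grad=True)
b = torch.empty((3, 4), requires_grad=True)
c = torch.empty((2, 4))

# torch.mm's out= variant raises when grad mode is on and an input requires grad
try:
    torch.mm(a, b, out=c)
except RuntimeError as e:
    print(e)  # "... functions with out=... arguments don't support automatic differentiation ..."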

Similar PR

#117067
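
As a sketch of the kind of check such a fix would add, here is a hypothetical Python-level wrapper (the name tensordot_with_out_check and the exact error message are assumptions; the real fix belongs in the tensordot implementation itself):

import torch

def tensordot_with_out_check(a, b, dims=2, out=None):
    # Hypothetical guard mirroring other out= variants: refuse out= when
    # autograd is enabled and any input requires grad.
    if out is not None and torch.is_grad_enabled() and (a.requires_grad or b.requires_grad):
        raise RuntimeError(
            "tensordot(): functions with out=... arguments don't support "
            "automatic differentiation, but one of the arguments requires grad"
        )
    return torch.tensordot(a, b, dims=dims, out=out)

Until a fix lands, a user-side workaround is to call the out= variant under torch.no_grad(), which sidesteps the missing check.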

Versions

PyTorch version: 2.6.0+cu124
Is debug build: False
CUDA used to build PyTorch: 12.4
ROCM used to build PyTorch: N/A

cc @ezyang @albanD @gqchen @pearu @nikitaved @soulitzer @Varal7 @xmfan

Labels: actionable, module: autograd, triaged
