
torch.compile errors on torch.autograd.backward #125287

@bdhirsh

Description


Min repro:

import torch

@torch.compile(backend='aot_eager')
def f(x):
    y = x.sin().sin()
    # Calling torch.autograd.backward inside the compiled region triggers the error
    torch.autograd.backward([y], [torch.ones_like(y)])

x = torch.ones(4, requires_grad=True)
f(x)
print(x.grad)

gives:

RuntimeError: Cannot backprop through mirrored meta, file a bug in PyTorch

It looks like we are not properly skipping torch.autograd.backward in Dynamo's trace rules.
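
As a workaround until this is fixed (a minimal sketch, not taken from the issue), the backward call can be kept outside the compiled region so that Dynamo never traces it:

import torch

@torch.compile(backend='aot_eager')
def f(x):
    # Compile only the forward; leave autograd.backward out of the traced region.
    return x.sin().sin()

x = torch.ones(4, requires_grad=True)
y = f(x)
# Backward runs eagerly here, so Dynamo's trace rules never see it.
torch.autograd.backward([y], [torch.ones_like(y)])
print(x.grad)

If the backward call must stay inside the compiled function, wrapping it with torch._dynamo.disable should force a graph break around it, which is roughly what a trace-rule skip would do; that variant is an untested assumption here.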

cc @ezyang @msaroufim @anijain2305 @chauhang @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng
