
[contextlib] `DecoratorContextManager` and hence `set_grad_enabled` behaves in an unexpected way #113298

@jon-chuang

Description

🐛 Describe the bug

import torch

assert torch.is_grad_enabled()

@torch.set_grad_enabled(False)  # unexpectedly, this mutates the grad mode!
def inner_func(x):
    return x.sin()

assert torch.is_grad_enabled()  # AssertionError
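
For comparison (if I'm reading the decorator path correctly), torch.no_grad already behaves the way I'd expect a decorator context manager to: constructing it does not touch the global grad mode, and the mode only changes while the decorated function actually runs.

import torch

assert torch.is_grad_enabled()

@torch.no_grad()  # constructing no_grad() does not mutate the global grad mode
def inner_func(x):
    return x.sin()

assert torch.is_grad_enabled()  # passes: mode unchanged at decoration time

y = inner_func(torch.ones(3, requires_grad=True))
assert not y.requires_grad  # grad was disabled only inside the call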

Rather, I think that when a set_grad_enabled instance is used to decorate a function, it should first restore the grad mode to its exit state (the mode captured at initialization), and then apply the requested mode only while the decorated function runs; see the sketch below.
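
Here is a minimal sketch of that behavior, written as a hypothetical grad_enabled helper rather than an actual patch to _DecoratorContextManager; the with block applies the requested mode per call and restores the previous mode on exit:

import functools
import torch

def grad_enabled(mode):
    # Hypothetical helper: defer the mode change until the wrapped
    # function is actually called, instead of applying it at
    # decoration time.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # set_grad_enabled applies the mode on entry and restores
            # the previous mode on exit, so nothing leaks globally.
            with torch.set_grad_enabled(mode):
                return func(*args, **kwargs)
        return wrapper
    return decorator

assert torch.is_grad_enabled()

@grad_enabled(False)  # no global mutation at decoration time
def inner_func(x):
    return x.sin()

assert torch.is_grad_enabled()  # passes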

Versions

main

cc: @yanboliang

cc @ezyang @gchanan @albanD @zou3519 @gqchen @pearu @nikitaved @soulitzer @lezcano @Varal7

Metadata

Labels

module: autograd - Related to torch.autograd, and the autograd engine in general
module: bc-breaking - Related to a BC-breaking change
triaged - This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
