Closed
Labels
module: autograd (Related to torch.autograd, and the autograd engine in general)
module: bc-breaking (Related to a BC-breaking change)
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
🐛 Describe the bug
```python
import torch

assert torch.is_grad_enabled()

@torch.set_grad_enabled(False)  # unexpectedly, this mutates the grad mode!
def inner_func(x):
    return x.sin()

assert torch.is_grad_enabled()  # AssertionError
```

Rather, I think that if `set_grad_enabled` is called on a function after initialization, it needs to reset grad mode to its exit state before acting as a decorator context manager.
Versions
main
cc: @yanboliang
cc @ezyang @gchanan @albanD @zou3519 @gqchen @pearu @nikitaved @soulitzer @lezcano @Varal7