Move module_tracker to logging for confused hierarchy #134467
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/134467
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (1 Unrelated Failure) As of commit eda0a63 with merge base 78d69bf.
BROKEN TRUNK: the following job failed but was also present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Thank you for the fix
```python
from copy import copy

import torch
from torch import nn
```
Nitpicking: this is a nice BE change, but it could have been done as part of a separate PR :)
```python
return t.node
if t.requires_grad and t.grad_fn is None:
    with torch.enable_grad():
        node = t.view_as(t).grad_fn.next_functions[0][0]  # type: ignore[union-attr]
```
Note that this line is a non-trivial change and needs to be confirmed by @soulitzer.
This happens here because we register the multi-grad hook during the backward pass, where grad mode is disabled.
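To illustrate, a minimal standalone sketch (not part of this PR) of why the `enable_grad` wrapper matters when grad mode is off:

```python
import torch

t = torch.ones(2, requires_grad=True)

# With grad mode disabled (as inside a backward hook), view_as records no
# autograd history, so grad_fn is None and chaining .next_functions fails.
with torch.no_grad():
    print(t.view_as(t).grad_fn)  # None

# Wrapping the view in enable_grad restores the autograd node; its
# next_functions[0][0] is the AccumulateGrad node for the leaf tensor t.
with torch.no_grad(), torch.enable_grad():
    node = t.view_as(t).grad_fn.next_functions[0][0]
print(type(node).__name__)  # AccumulateGrad
```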
This change sounds good, though I'm not completely sure I understand why the hooks can be registered while grad mode is disabled.
(If it is the checkpoint case, then we should re-enable grad before recomputing the forward.)
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
@pytorchbot -h
PyTorchBot Help. Commands: merge, revert, rebase, label, Dr CI, cherry-pick, close.
usage: @pytorchbot close (Close a PR. Can be used on issues.)
@pytorchbot cherry-pick --onto release/2.4 --fixes #134242 -c regression
Cherry picking #134467. Command details for Dev Infra team: raised by workflow job.
Fixes #134242

Make sure to never raise an error when confused. Logs for confusion can be enabled with `TORCH_LOGS="torch.utils.module_tracker"` or the usual Python logging systems.

Pull Request resolved: #134467
Approved by: https://github.com/malfet
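As a minimal sketch of enabling these logs programmatically (assuming the logger name matches the `TORCH_LOGS` string above, i.e. `torch.utils.module_tracker`):

```python
import logging

# Standard-library equivalent of TORCH_LOGS="torch.utils.module_tracker":
# attach a default handler to the root logger, then lower the module
# tracker's logger level so its confusion messages are emitted.
logging.basicConfig()
logging.getLogger("torch.utils.module_tracker").setLevel(logging.DEBUG)
```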