[dynamo] Fix an error in _dynamo.compiled_autograd.reset() by rec · Pull Request #137889 · pytorch/pytorch · GitHub

Conversation

@rec
Collaborator

@rec rec commented Oct 14, 2024

[ghstack-poisoned]
@pytorch-bot

pytorch-bot bot commented Oct 14, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/137889

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 0acb0ec with merge base 41977a0 (image):
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

rec added a commit that referenced this pull request Oct 14, 2024
* From #133492

ghstack-source-id: a393a00
Pull Request resolved: #137889
@rec rec added the topic: not user facing topic category label Oct 14, 2024
@rec rec changed the title [dynamo] Fix an error in .compiled_autograd.reset() [dynamo] Fix an error in _dynamo.compiled_autograd.reset() Oct 14, 2024
@rec rec requested a review from albanD October 14, 2024 12:15
@rec
Collaborator Author

rec commented Oct 14, 2024

A minimal version.

I thought about hiding that mutable global variable entirely behind a pair of module-level getter/setter functions, which would also force us to clean up rough ideas like this one, but decided it wasn't important enough...
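The getter/setter idea mentioned above might look something like the following. This is only a sketch of the alternative design that was considered and rejected, not the merged fix; the flag name `_compiled_autograd_enabled` and the accessor names are hypothetical:

```python
# Sketch: hide a mutable module-level flag behind accessor functions,
# so every read/write goes through one place and reset() cannot miss it.
# All names here are hypothetical illustrations, not PyTorch APIs.

_compiled_autograd_enabled = False  # private module-level state


def is_compiled_autograd_enabled() -> bool:
    """Single point of read access for the flag."""
    return _compiled_autograd_enabled


def set_compiled_autograd_enabled(value: bool) -> None:
    """Single point of write access for the flag."""
    global _compiled_autograd_enabled
    _compiled_autograd_enabled = value


def reset() -> None:
    """Restore default state; the flag is reset through the same mutator."""
    set_compiled_autograd_enabled(False)
```

The appeal of this pattern is that a bug like the one fixed in this PR (a `reset()` that forgets one piece of global state) becomes harder to write, at the cost of extra indirection for a simple boolean.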

@Skylion007
Collaborator

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Oct 14, 2024
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@Skylion007 Skylion007 added this to the 2.5.1 milestone Oct 14, 2024
@kit1980
Contributor

kit1980 commented Oct 23, 2024

2.5.1 is an emergency patch release to address known large regressions; moving this to 2.6.0.

@kit1980 kit1980 modified the milestones: 2.5.1, 2.6.0 Oct 23, 2024
@github-actions github-actions bot deleted the gh/rec/72/head branch November 23, 2024 02:06
@kit1980
Contributor

kit1980 commented Jan 28, 2025

Verified for PyTorch 2.6 release candidate:

import torch
from torch._dynamo import compiled_autograd

compiled_autograd.compiled_autograd_enabled = True
compiled_autograd.reset()
assert not compiled_autograd.compiled_autograd_enabled
