Fix memory leak in `ModuleTracker` by danthe3rd · Pull Request #141960 · pytorch/pytorch

Conversation

@danthe3rd (Contributor) commented Dec 3, 2024

Thanks to @drisspg and @albanD for finding the fix.
cc @pragupta

**TEST PLAN**

```
import gc
import torch
import torch.nn as nn
from torch.utils.module_tracker import ModuleTracker


class MyModel(nn.Module):
    def forward(self, x):
        return x * x


print(f"torch=={torch.__version__}")
m = MyModel()
m.cuda()
m.to(torch.bfloat16)
mt = ModuleTracker()
for i in range(1000):
    if i % 100 == 0:
        # Report allocator state every 100 iterations; with the leak,
        # this number grows steadily instead of staying flat.
        gc.collect()
        print("memory_allocated:", torch.cuda.memory_allocated())
    x = torch.randn([128, 256], device="cuda", dtype=torch.bfloat16, requires_grad=True)
    # Run the forward pass under the tracker, which is where the leaked
    # memory accumulated before this fix.
    with mt:
        m(x)
```
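With the fix, the expectation is that the memory_allocated value printed every 100 iterations stays flat after the first report rather than growing. A small follow-up check, an addition to and not part of the original test plan, makes that explicit:

```
# Hypothetical follow-up check (not in the original test plan): after the
# loop, allocations should settle back near the first reported value.
gc.collect()
print("final memory_allocated:", torch.cuda.memory_allocated())
```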

@danthe3rd requested a review from albanD, December 3, 2024 12:27
@pytorch-bot bot commented Dec 3, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/141960

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit b6227e4 with merge base 78543e6:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@danthe3rd (Contributor, Author) commented:

@pytorchbot label "topic: bug fixes"

@pytorch-bot bot added the topic: bug fixes label Dec 3, 2024
@danthe3rd added the release notes: python_frontend label Dec 3, 2024
@albanD (Collaborator) left a comment

We could make one of the tests in test/test_module_tracker.py run on a CUDA device and enable leak detection on it to catch this. That might be too much for this PR, though; it's fine as-is if you don't have time.
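For reference, a minimal sketch of what such a regression test could look like, using only public APIs and plain memory_allocated() bookkeeping. The test name, module, and iteration counts are illustrative assumptions; the PyTorch test suite has its own CUDA leak-check machinery that a real test would likely rely on instead:

```
import gc
import torch
import torch.nn as nn
from torch.utils.module_tracker import ModuleTracker


def test_module_tracker_cuda_no_leak():
    # Hypothetical test: names and counts are assumptions, not the
    # actual test in test/test_module_tracker.py.
    m = nn.Linear(256, 256, device="cuda")
    mt = ModuleTracker()
    x = torch.randn(128, 256, device="cuda", requires_grad=True)

    # Warm-up iteration so one-time allocations don't count as growth.
    with mt:
        m(x)
    gc.collect()
    baseline = torch.cuda.memory_allocated()

    # Repeated tracked forwards should not grow allocations if the
    # tracker releases its references between runs.
    for _ in range(100):
        with mt:
            m(x)
    gc.collect()
    assert torch.cuda.memory_allocated() <= baseline, "ModuleTracker leaked CUDA memory"
```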

@danthe3rd (Contributor, Author) commented:

@pytorchbot merge

@pytorch-bot bot added the ciflow/trunk label Dec 3, 2024
@pytorchmergebot (Collaborator) commented:
Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

pobin6 pushed a commit to pobin6/pytorch that referenced this pull request Dec 5, 2024

Pull Request resolved: pytorch#141960
Approved by: https://github.com/albanD
AmdSampsa pushed a commit to AmdSampsa/pytorch that referenced this pull request Dec 9, 2024
@github-actions bot deleted the dhaziza-mod-tracker-memleak branch January 3, 2025 02:07
pragupta pushed a commit to pragupta/pytorch that referenced this pull request Jan 16, 2025

(cherry picked from commit 9125e91)
pruthvistony pushed a commit to ROCm/pytorch that referenced this pull request Jan 16, 2025

(cherry picked from commit 9125e91)
Co-authored-by: dan_the_3rd <43445237+danthe3rd@users.noreply.github.com>
