[fx] fix qualified name for methods of torch.Tensor by isuruf · Pull Request #162224 · pytorch/pytorch · GitHub

Conversation

@isuruf
Collaborator

@isuruf isuruf commented Sep 4, 2025

[ghstack-poisoned]
@pytorch-bot

pytorch-bot bot commented Sep 4, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/162224

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 51e597e with merge base b04e922 (image):

BROKEN TRUNK - The following job failed but was already present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

isuruf added a commit that referenced this pull request Sep 4, 2025
ghstack-source-id: 4aff298
Pull Request resolved: #162224
@isuruf isuruf requested a review from williamwen42 September 4, 2025 22:47
williamwen42
williamwen42 previously approved these changes Sep 4, 2025
@williamwen42 williamwen42 self-requested a review September 4, 2025 23:10
@williamwen42 williamwen42 dismissed their stale review September 4, 2025 23:11

have comments

@williamwen42 (Member) left a comment:

Is Dynamo still generating an FX graph with non-existent methods? If so, shouldn't the fix be somewhere there?

```python
if isinstance(
    func, (types.MethodDescriptorType, types.WrapperDescriptorType)
) and func is getattr(torch.Tensor, func.__name__, None):
if (
```
Contributor:

@isuruf do you have an explanation of what the bug was and what caused the error?

For example, did Dynamo put an incorrect function in the graph and did that cause something (AOTAutograd?) to later error out during tracing?

Contributor:

It's a printing problem: if we don't get the right qualified name, we will generate FX code that isn't actually runnable.
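A torch-free sketch of the failure mode being described, using a stand-in `Tensor` class (the class and method are illustrative, not the actual torch.fx code). For a Python-defined method, `__name__` alone loses the owning class, so a name built from module plus `__name__` points at nothing importable:

```python
import types


class Tensor:
    # Stand-in for a Python-defined method like torch._tensor.Tensor.split.
    def split(self, sections):
        return sections


# Naive qualification drops the class qualifier: "<module>.split" is not a
# real attribute of the module, so generated code calling it would raise
# AttributeError at runtime. __qualname__ retains the class.
assert Tensor.split.__name__ == "split"
assert Tensor.split.__qualname__ == "Tensor.split"

# The other branch of the check covers C-level method descriptors,
# e.g. list.append, which can be recovered via getattr on the type:
assert isinstance(list.append, types.MethodDescriptorType)
assert list.append is getattr(list, list.append.__name__, None)
```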

```python
    and func.__qualname__ == f"Tensor.{func.__name__}"
):
    return f"torch.Tensor.{func.__name__}"
name = func.__name__
```
Collaborator (Author):

Using `name = func.__qualname__` here is another solution for this problem.

The problem is that for `func = torch._tensor.Tensor.split`, we need to use `__qualname__`:

```python
>>> import torch
>>> func = torch.Tensor.split
>>> func.__module__
'torch._tensor'
>>> func.__name__
'split'
>>> func.__qualname__
'Tensor.split'
```

However, I'm not sure if using `__qualname__` is okay here. @benjaminglass1 mentioned that `__qualname__` can be incorrect with pybind11>=3.

Collaborator:

@isuruf If this only parses torch.Tensor and it works for you on latest main, then you should be fine. We're already on pybind11 == 3.0.1; I thought this was a more generic function when I provided that feedback.

Collaborator (Author):

It's not only `torch.Tensor`. It parses quite a few types, including lambdas.
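For those more general callables, `__qualname__` is not a safe answer on its own: lambdas and nested functions carry `<locals>` segments that are not importable names. A pure-Python sketch, no torch required:

```python
def outer():
    # A nested lambda: its qualified name records the enclosing scope.
    return lambda x: x


f = outer()
assert f.__name__ == "<lambda>"
# "outer.<locals>.<lambda>" is not an attribute path that generated
# code could actually import and call.
assert f.__qualname__ == "outer.<locals>.<lambda>"
```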

@ezyang
Contributor

ezyang commented Sep 6, 2025

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Sep 6, 2025
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

```python
    isinstance(func, (types.MethodDescriptorType, types.WrapperDescriptorType))
    and func is getattr(torch.Tensor, func.__name__, None)
) or (
    func.__module__ == torch._tensor
```
Collaborator:

Suggested change:

```diff
- func.__module__ == torch._tensor
+ func.__module__ == torch._tensor.__name__
```

`callable.__module__` is `str | None` rather than the module object.
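A minimal sketch of that distinction, using the stdlib `math` module rather than torch:

```python
import math

# __module__ is the module's *name* (a str, or None), never the module
# object itself, so comparing it to a module is always False.
assert isinstance(math.sqrt.__module__, str)
assert math.sqrt.__module__ != math           # str vs module: never equal
assert math.sqrt.__module__ == math.__name__  # the corrected comparison
```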

daisyden pushed a commit to daisyden/pytorch that referenced this pull request Sep 8, 2025
markc-614 pushed a commit to markc-614/pytorch that referenced this pull request Sep 17, 2025
mansiag05 pushed a commit to mansiag05/pytorch that referenced this pull request Sep 22, 2025
cleonard530 pushed a commit to cleonard530/pytorch that referenced this pull request Sep 22, 2025
dsashidh pushed a commit to dsashidh/pytorch that referenced this pull request Sep 26, 2025
@github-actions github-actions bot deleted the gh/isuruf/145/head branch October 7, 2025 02:08


9 participants