Grab bag of (mostly) typing improvements by benjaminglass1 · Pull Request #158075 · pytorch/pytorch · GitHub

Conversation

@benjaminglass1
Collaborator

@benjaminglass1 benjaminglass1 commented Jul 10, 2025

Stack from ghstack (oldest at bottom):

Collects some scattershot improvements made while attempting to enable training for AOTInductor. Non-typing changes are:

  1. Swapping a few custom searches for the output node in an FX graph for calls to `graph.output_node()` (see the sketch after this list).
  2. Removing two unused parameters from `torch.export._unlift._unlift`.
  3. Switching handles to constants in `cpp_wrapper_cpu` to use C++ references for memory efficiency.
  4. Cleaning out unused, unexported imports from `torch/export/__init__.py`, and adding one missing export to `__all__`.
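
For the first item, a minimal illustrative sketch of the kind of manual scan being swapped for the built-in helper (the module and variable names below are examples, not code from this PR):

```python
import torch
from torch import fx


class Add(torch.nn.Module):
    def forward(self, x):
        return x + 1


graph = fx.symbolic_trace(Add()).graph

# Hand-rolled lookup pattern: scan the graph for the node whose op is "output".
output_node = next(n for n in reversed(graph.nodes) if n.op == "output")

# Dedicated helper that returns the same node.
assert graph.output_node() is output_node
```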

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben @Lucaskabela

[ghstack-poisoned]
@pytorch-bot

pytorch-bot bot commented Jul 10, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/158075

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit c00bba8 with merge base 70b4a88:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

Collaborator

@Skylion007 Skylion007 left a comment


Typing nits

Collaborator

@Skylion007 Skylion007 left a comment


Once these nits are resolved, this looks good.

[ghstack-poisoned]
@benjaminglass1
Collaborator Author

BC lint is failing on updating some existing types from `tuple[Any]` to `tuple[Any, ...]`. Adding a label to suppress that.
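
For context on the distinction the BC linter is reacting to: `tuple[Any]` annotates a tuple of exactly one element, while `tuple[Any, ...]` annotates a tuple of any length. A minimal sketch (illustrative only, not code from this PR):

```python
from typing import Any

# tuple[Any] promises exactly one element of any type.
single: tuple[Any] = ("only",)

# tuple[Any, ...] promises zero or more elements of any type.
variadic: tuple[Any, ...] = ()
variadic = (1, "two", 3.0)

# A checker such as mypy would reject this: the annotation allows only one element.
# bad: tuple[Any] = (1, 2)
```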

@benjaminglass1 benjaminglass1 added the suppress-bc-linter Suppresses the failures of API backward-compatibility linter (Lint/bc_linter) label Jul 15, 2025
@benjaminglass1
Collaborator Author

Given this many test failures, I'm going to back out the changes to `tuple[Any, ...]`; they seem to have broken some non-typing things.

@benjaminglass1 benjaminglass1 removed the suppress-bc-linter Suppresses the failures of API backward-compatibility linter (Lint/bc_linter) label Jul 15, 2025
[ghstack-poisoned]
@benjaminglass1
Collaborator Author

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jul 21, 2025
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

facebook-github-bot pushed a commit to pytorch/benchmark that referenced this pull request Jul 23, 2025
Summary:
Collects some scattershot improvements made while attempting to enable training for AOTInductor. Non-typing changes are:

1. Swapping a few custom searches for the output node in an FX graph for calling `graph.output_node()`.
2. Removing two unused parameters from `torch.export._unlift._unlift`.
3. Switching handles to constants in `cpp_wrapper_cpu` to use C++ references for memory efficiency.
4. Cleaning out unused, unexported imports from `torch/export/__init__.py`, and adding one missing export to `__all__`.

X-link: pytorch/pytorch#158075
Approved by: https://github.com/Skylion007

Reviewed By: ZainRizvi

Differential Revision: D78691998

fbshipit-source-id: cdd51fe27cef89786ac7728775c103f5c11fcb1b
@github-actions github-actions bot deleted the gh/benjaminglass1/94/head branch August 21, 2025 02:15