[fx] fix split_module with symint by kshitij12345 · Pull Request #160093 · pytorch/pytorch · GitHub

Conversation

@kshitij12345
Collaborator

@kshitij12345 kshitij12345 commented Aug 7, 2025

@pytorch-bot

pytorch-bot bot commented Aug 7, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/160093

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 12d9a77 with merge base 83875cd:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: fx release notes category label Aug 7, 2025
@ezyang
Contributor

ezyang commented Aug 10, 2025

Test ig?

@kshitij12345
Collaborator Author

> Test ig?

Sorry, I didn't understand.

@ezyang
Contributor

ezyang commented Aug 10, 2025

Needs a test!

@kshitij12345
Collaborator Author

kshitij12345 commented Aug 11, 2025

Sure, I was just checking the patch to see if this negatively affects anything.

Excerpt from the new test (inline review comment on these lines):

```python
keep_original_order=True, keep_original_node_name=True)
return split_gm

actual = torch.compile(moe, backend=backend)(inp)
```
Collaborator Author


Couldn't repro the error with torch.fx.symbolic_trace or make_fx.

@kshitij12345 kshitij12345 changed the title [WIP][fx] fix split_module with symint [fx] fix split_module with symint Aug 11, 2025
@kshitij12345 kshitij12345 marked this pull request as ready for review August 11, 2025 12:43
@kshitij12345 kshitij12345 requested a review from ezyang August 11, 2025 13:31
@IvanYashchuk IvanYashchuk requested a review from Copilot August 11, 2025 15:13
Contributor

Copilot AI left a comment


Pull Request Overview

This PR fixes a bug in the split_module function where symint dependencies were not being properly tracked across partitions. The fix ensures that when a symbolic integer dependency is detected, the dependent partition properly records the dependency on the partition that defines the symbol.

  • Adds missing dependency tracking for symbolic integer nodes across partitions
  • Includes a comprehensive test case that reproduces the original issue with GraniteMoe model
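For readers unfamiliar with the API, here is a minimal, self-contained sketch of `split_module` usage. The `TwoStage` module and the partition policy below are illustrative, not taken from this PR:

```python
import torch
import torch.fx
from torch.fx.passes.split_module import split_module


class TwoStage(torch.nn.Module):
    # Illustrative two-op module, not the GraniteMoe repro from the PR.
    def forward(self, x):
        y = x + 1
        return y.relu()


mod = TwoStage()
traced = torch.fx.symbolic_trace(mod)


# Map each non-placeholder node to a partition id; real callers use a
# smarter policy (e.g. per-layer or per-device assignment).
def split_callback(node: torch.fx.Node) -> int:
    return 0 if node.name == "add" else 1


split_gm = split_module(traced, mod, split_callback, keep_original_order=True)

# The split module is call-compatible with the original.
x = torch.randn(4)
assert torch.equal(split_gm(x), traced(x))
```

The bug fixed here only manifests when a value crossing a partition boundary is a symbolic integer rather than a tensor, which plain `symbolic_trace` (as above) does not produce.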

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| torch/fx/passes/split_module.py | Adds dependency tracking for symbolic integer nodes to prevent missing dependencies |
| test/test_fx_experimental.py | Adds test case to verify symint dependency handling in split_module |
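For context on why this tracking matters: under symbolic shape tracing, tensor sizes appear as first-class SymInt-producing nodes in the graph, so a size read in one partition can be consumed in another. A small sketch of how such nodes arise (assuming a recent PyTorch; the function `f` is illustrative, not the PR's repro):

```python
import torch
from torch.fx.experimental.proxy_tensor import make_fx


def f(x):
    b = x.shape[0]  # a SymInt under symbolic tracing, a plain int otherwise
    return x.reshape(b * 2, -1)


gm = make_fx(f, tracing_mode="symbolic")(torch.randn(4, 6))

# The graph contains explicit sym_size nodes. If the node producing a SymInt
# lands in a different partition than its consumer, split_module must record
# a dependency between those partitions; that bookkeeping was missing.
targets = {str(n.target) for n in gm.graph.nodes}
assert any("sym_size" in t for t in targets)
```

The resulting `gm` still runs on concrete inputs (`gm(torch.randn(4, 6))` returns a tensor of shape `(8, 3)`), but its graph carries the size computation explicitly.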

```diff
 from torch.testing._internal.common_methods_invocations import op_db
 from torch.testing._internal.common_nn import module_tests, get_new_module_tests
-from torch.testing._internal.common_utils import TEST_Z3, run_tests, TestCase
+from torch.testing._internal.common_utils import TEST_Z3, run_tests, TestCase, TEST_WITH_CROSSREF
```

Copilot AI Aug 11, 2025


[nitpick] The import statement mixes single imports with multiple imports. Consider a consistent style: either import TEST_WITH_CROSSREF separately or group all imports from common_utils on a single line.


Contributor

@ezyang ezyang left a comment


Nice catch!

@ezyang
Contributor

ezyang commented Aug 13, 2025

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Aug 13, 2025
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status.

@ezyang
Contributor

ezyang commented Aug 13, 2025

@pytorchbot merge

@pytorchmergebot
Collaborator

The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.
For more information see pytorch-bot wiki.

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).


chuanhaozhuge pushed a commit that referenced this pull request Aug 14, 2025
chuanhaozhuge pushed a commit that referenced this pull request Aug 18, 2025
can-gaa-hou pushed a commit to can-gaa-hou/pytorch that referenced this pull request Aug 22, 2025
markc-614 pushed a commit to markc-614/pytorch that referenced this pull request Sep 17, 2025

Labels

  • ciflow/trunk (Trigger trunk jobs on your pull request)
  • fx
  • Merged
  • open source
  • release notes: fx (release notes category)


Development

Successfully merging this pull request may close these issues.

KeyError when using fx.split_module

5 participants