[export] runtime asserts for while HOP subgraphs by pianpwk · Pull Request #158467 · pytorch/pytorch · GitHub

Conversation

@pianpwk
Contributor

@pianpwk pianpwk commented Jul 16, 2025

Differential Revision: D78431075

For #158366

  • Calls the runtime asserts pass for HOP subgraphs (in reenter_make_fx)
  • For while_loop only (can be expanded later), clones input tensors for subgraph tracing so that unbacked memos (item, nonzero, etc.) aren't reused across subgraphs; see the sketch below
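To illustrate the user-facing effect, here is a minimal, hypothetical sketch (the while_loop import path and exact export behavior may vary by build; Counter, it, and x are made-up names): a data-dependent .item() inside the loop body produces an unbacked symint, and with this change the corresponding runtime assert stays inside the body subgraph.

import torch
from torch._higher_order_ops.while_loop import while_loop

class Counter(torch.nn.Module):
    def forward(self, it, x):
        def cond_fn(it, x):
            return it > 0  # boolean scalar tensor, as while_loop expects

        def body_fn(it, x):
            # data-dependent value inside the subgraph: .item() yields an
            # unbacked symint, and the _check below should become a runtime
            # assert emitted in the body graph rather than being lost
            u = x.sum().item()
            torch._check(u >= 0)
            return it - 1, x + u

        return while_loop(cond_fn, body_fn, (it, x))

ep = torch.export.export(Counter(), (torch.tensor(3), torch.ones(4, dtype=torch.int64)))
print(ep)  # inspect the body subgraph for the assertion nodes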

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng @chauhang @amjames @Lucaskabela @zou3519 @ydwu4

@pianpwk pianpwk requested a review from zou3519 as a code owner July 16, 2025 19:09
@pytorch-bot

pytorch-bot bot commented Jul 16, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/158467

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

✅ You can merge normally! (1 Unrelated Failure)

As of commit 88751f4 with merge base c917c63:

UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D78431075

@pianpwk pianpwk added module: higher order operators torch.cond and similar and removed module: dynamo labels Jul 16, 2025
facebook-github-bot pushed a commit that referenced this pull request Jul 16, 2025
Summary:

For #158366, disables unbacked memoization across HOP subgraphs and adds runtime assertions, so asserts stay in their respective graphs

Rollback Plan:

Differential Revision: D78431075

facebook-github-bot pushed a commit that referenced this pull request Jul 16, 2025
Summary:

For #158366
- Calls runtime asserts pass for HOP subgraphs (in reenter_make_fx)
- For while_loop only (can be expanded), adds disabling unbacked memoization (item, nonzero, unique), to separate runtime asserts in their respective graphs

Test Plan:
test_export

Rollback Plan:

Differential Revision: D78431075

@pianpwk pianpwk changed the title [WIP][export] runtime asserts for HOP subgraphs [export] runtime asserts for while HOP subgraphs Jul 17, 2025
@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jul 17, 2025
@pianpwk pianpwk requested a review from ydwu4 July 17, 2025 23:20
_custom_ops_profile: Optional[Any] = None

# Disable memoization for data-dependent ops like item(), nonzero(), unique()
disable_unbacked_memo = False
Contributor

This probably shouldn't be a dynamo config, since what we're trying to do is more dynamic-shape/HOP related.

@register_op_impl(torch.ops.aten._local_scalar_dense.default)
def local_scalar_dense(fake_mode, func, arg):
    if (r := arg.item_memo) is not None:
        if (
Contributor

is the item memo associated with the fake_tensor?

Disabling item_memo may not be what we want; e.g., users might be doing something like:

def body_fn(c):
  for _ in range(10):
    c_val = c.item()
    ...

In this case, we don't want to create an unbacked symint every iteration since they're essentially the same .item().
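
For context, the memo in question lives on the fake tensor: the first .item() call allocates an unbacked SymInt and caches it, so later .item() calls on the same fake tensor return the same symbol instead of minting u1, u2, and so on. A simplified conceptual sketch follows (illustrative only, not the actual FakeTensor implementation):

class FakeTensorItemMemoSketch:
    def __init__(self, shape_env):
        self.shape_env = shape_env
        self.item_memo = None  # cached unbacked SymInt, None until the first .item()

    def item(self):
        if self.item_memo is None:
            # first call: allocate a fresh unbacked symbol (e.g. u0)
            self.item_memo = self.shape_env.create_unbacked_symint()
        # subsequent calls reuse the cached symbol, which is why memoization
        # shouldn't be disabled within a single subgraph
        return self.item_memo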

facebook-github-bot pushed a commit that referenced this pull request Jul 21, 2025
Summary:

For #158366
- Calls runtime asserts pass for HOP subgraphs (in reenter_make_fx)
- For while_loop only (can be expanded), clones input tensors for subgraph tracing, so unbacked memos (item, nonzero, etc.) aren't reused

Test Plan:
test_export

Rollback Plan:

Differential Revision: D78431075

@pianpwk pianpwk requested a review from ydwu4 July 22, 2025 18:31
@ydwu4
Contributor

ydwu4 commented Jul 22, 2025

A follow-up is to make all control-flow operators (cond, scan, associative_scan, map) clone the inputs before speculate_subgraph.

new_operands_seq = [
    unspecialize_carried_inputs(tx, carry) for carry in operands_seq
]
# clone inputs across subgraphs, to avoid unbacked memoization in fake prop
cond_operands_seq = [
Contributor

this should be put under with discard_graph_changes(tx):

    )
    return reenter_make_fx(fn)(*cloned_carried_inputs, *additional_inputs)

cond_graph = produce_graph(cond_fn)
Contributor

put under with disable_proxy_modes_tracing():
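
As a rough illustration of how the cloning plus the review suggestion above might fit together, here is a hedged sketch (illustrative names and signature, not the exact PR code; import paths are from memory): the carried inputs are cloned outside proxy tracing so the clones don't appear as nodes in the outer graph, and the fresh fake tensors carry no unbacked memos, so each subgraph re-derives its symbols and keeps its own runtime asserts.

import torch
from torch.fx.experimental.proxy_tensor import disable_proxy_modes_tracing
from torch._higher_order_ops.utils import reenter_make_fx

def produce_graph_sketch(fn, carried_inputs, additional_inputs):
    with disable_proxy_modes_tracing():
        # clones have no item/nonzero/unique memos, so data-dependent ops in
        # this subgraph allocate their own unbacked symints and asserts
        cloned_carried_inputs = [
            x.clone() if isinstance(x, torch.Tensor) else x for x in carried_inputs
        ]
    return reenter_make_fx(fn)(*cloned_carried_inputs, *additional_inputs)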

Summary:

For #158366
- Calls runtime asserts pass for HOP subgraphs (in reenter_make_fx)
- For while_loop only (can be expanded), clones input tensors for subgraph tracing, so unbacked memos (item, nonzero, etc.) aren't reused

Test Plan:
test_export

Rollback Plan:

Reviewed By: ydwu4

Differential Revision: D78431075

@facebook-github-bot
Contributor

@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

@github-actions github-actions bot deleted the export-D78431075 branch August 22, 2025 02:15