[ONNX] Fix lower opset version support in dynamo=True by justinchuby · Pull Request #161056 · pytorch/pytorch · GitHub

Conversation

@justinchuby (Collaborator) commented Aug 20, 2025

After we switched to constructing the registry with the specified opset version in dynamo=True, support for opset < 18 was broken because no torchlib ops would be registered for those opsets. I updated the registry creation logic to always use opset 18 when the requested opset is lower, and to rely on the version converter (as designed) to target the lower opsets.

This requires onnxscript>=0.4 (#161312)

Fixes onnx/onnx#7235

cc @titaiwangms
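
The gist of the change, as a rough sketch (helper name and log message are illustrative, not the actual private code in torch/onnx/_internal/exporter/_compat.py):

```python
# Rough sketch of the fallback described above; names are illustrative.
import logging

logger = logging.getLogger(__name__)

TORCHLIB_OPSET = 18  # lowest opset torchlib ops are implemented against


def resolve_registry_opset(requested_opset: int) -> int:
    """Return the opset to build the registry with for a requested target opset."""
    if requested_opset < TORCHLIB_OPSET:
        logger.info(
            "Requested opset %d is lower than the torchlib opset %d; "
            "exporting at opset %d and converting down afterwards.",
            requested_opset,
            TORCHLIB_OPSET,
            TORCHLIB_OPSET,
        )
        return TORCHLIB_OPSET
    return requested_opset
```

The exported model is then passed through the ONNX version converter to bring it from the registry opset down to the requested opset.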

@pytorch-bot (bot) commented Aug 20, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/161056

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 171f3f0 with merge base d228a77:

BROKEN TRUNK - The following job failed but was already present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot added the `release notes: onnx` label Aug 20, 2025
@justinchuby requested a review from Copilot August 20, 2025 14:21
@justinchuby added the `module: onnx` and `topic: bug fixes` labels Aug 20, 2025
Copilot AI (Contributor) left a comment


Pull Request Overview

This PR fixes support for ONNX opset versions lower than 18 when using dynamo=True by ensuring the torchlib registry is always constructed with opset 18 or higher, then relying on version conversion to target lower opsets.

Key changes:

  • Modified registry creation logic to use _constants.TORCHLIB_OPSET (18) when requested opset is lower
  • Added informational logging when falling back to higher opset version
  • Added test coverage for lower opset version support with opset 16
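
For context, the behavior the new test exercises looks roughly like the following (a minimal usage sketch, not the exact test added in test/onnx/exporter/test_api.py):

```python
# Minimal usage sketch: export a small model to opset 16 with dynamo=True.
# The model and tensor shapes here are illustrative.
import torch


class SampleModel(torch.nn.Module):
    def forward(self, x):
        return torch.nn.functional.relu(x) + 1


onnx_program = torch.onnx.export(
    SampleModel(),
    (torch.randn(2, 3),),
    dynamo=True,
    opset_version=16,  # lower than the torchlib opset (18)
)
# After the fix, the exported graph targets opset 16 via version conversion
# instead of failing because no torchlib ops are registered for opset 16.
```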

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

File | Description
torch/onnx/_internal/exporter/_compat.py | Updated registry creation logic to handle lower opset versions and added logging
test/onnx/exporter/test_api.py | Added a test model and test case to verify that opset 16 support works correctly


@justinchuby justinchuby changed the title [ONNX] Fix lower version support in dynamo=True [ONNX] Fix lower opset version support in dynamo=True Aug 20, 2025
@titaiwangms (Collaborator) left a comment


I wonder if we should consider supporting opset_version < 18.

@justinchuby (Collaborator, Author)

> I wonder if we should consider supporting opset_version < 18.

We do that via the version converter. Or do you mean implementing the ops directly? Even opset 18 is kind of old at this point, though.

@titaiwangms (Collaborator)

> I wonder if we should consider supporting opset_version < 18.
>
> We do that via the version converter. Or do you mean implementing the ops directly? Even opset 18 is kind of old at this point, though.

I forget how the version converter handles converting to opset < 18. Does it only do a best-effort conversion? (Does it fail for ops whose attributes changed, for example ReduceMean?)

@justinchuby (Collaborator, Author)

> I wonder if we should consider supporting opset_version < 18.
>
> We do that via the version converter. Or do you mean implementing the ops directly? Even opset 18 is kind of old at this point, though.
>
> I forget how the version converter handles converting to opset < 18. Does it only do a best-effort conversion? (Does it fail for ops whose attributes changed, for example ReduceMean?)

I think so
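
For reference, ReduceMean is one of the ops whose signature changed: in opset 18 `axes` became a tensor input, while in opset 17 and earlier it is an attribute, so a down-conversion has to fold a constant axes input back into the attribute form. A small illustration with the onnx helper API (node construction only, not a full model):

```python
# Illustration of the ReduceMean change discussed above (node construction only).
from onnx import helper

# Opset >= 18: axes is an input, so it may be a dynamic tensor.
reduce_mean_op18 = helper.make_node(
    "ReduceMean", inputs=["x", "axes"], outputs=["y"], keepdims=1
)

# Opset <= 17: axes is a static attribute; the version converter must fold a
# constant axes input back into this attribute when converting down.
reduce_mean_op17 = helper.make_node(
    "ReduceMean", inputs=["x"], outputs=["y"], axes=[1], keepdims=1
)
```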

@justinchuby (Collaborator, Author)

@pytorchbot rebase

@pytorchmergebot (Collaborator)

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here

@pytorchmergebot (Collaborator)

Successfully rebased justinchu/opset-version onto refs/remotes/origin/viable/strict, please pull locally before adding more changes (for example, via git checkout justinchu/opset-version && git pull --rebase)

@pytorchmergebot force-pushed the justinchu/opset-version branch from 5973ac0 to de50d28 on August 23, 2025 00:11
@justinchuby (Collaborator, Author)

@pytorchbot merge

@pytorch-bot added the `ciflow/trunk` label Aug 23, 2025
@pytorchmergebot (Collaborator)

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

@pytorchmergebot (Collaborator)

Merge failed

Reason: 1 mandatory check(s) failed. The first few are:

Dig deeper by viewing the failures on hud

Details for Dev Infra team: raised by workflow job

Failing merge rule: Core Maintainers

Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com>
@justinchuby force-pushed the justinchu/opset-version branch from de50d28 to 5df8fb6 on August 23, 2025 02:10
@justinchuby (Collaborator, Author)

@pytorchbot merge

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com>
@pytorchmergebot (Collaborator)

Merge failed

Reason: New commits were pushed while merging. Please rerun the merge command.

Details for Dev Infra team: raised by workflow job

Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com>
@justinchuby (Collaborator, Author)

@pytorchbot merge

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

@justinchuby justinchuby deleted the justinchu/opset-version branch August 23, 2025 13:46
markc-614 pushed a commit to markc-614/pytorch that referenced this pull request Sep 17, 2025
After we switched to constructing the registry with the specified opset version in dynamo=True, support for opset<18 was broken because there would be no torchlib ops registered for these opsets. I updated the registry creation logic to always use opset 18 if the requested opset is lower, and use the version converter (as designed) to target those opsets.

This requires onnxscript>=0.4 (pytorch#161312)

Fixes onnx/onnx#7235

Pull Request resolved: pytorch#161056
Approved by: https://github.com/titaiwangms

Labels

ciflow/trunk · Merged · module: onnx · open source · release notes: onnx · topic: bug fixes


Development

Successfully merging this pull request may close these issues.

Onnx resnet18 conversion fails on opset 17

4 participants