[ONNX] Drop draft_export in exporter API #161454
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/161454
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 94ffaa7 with merge base ca9fe01. This comment was automatically generated by Dr. CI and updates every 15 minutes.
@pytorchbot merge
Merge started: Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: 2 jobs have failed; the first few of them are: trunk / win-vs2022-cuda12.6-py3 / build, trunk / win-vs2022-cpu-py3 / build. Details for Dev Infra team: raised by workflow job.
@pytorchbot merge
Merge started: Your change will be merged once all checks pass (ETA 0-4 hours).
Use `TORCH_ONNX_ENABLE_DRAFT_EXPORT` to control whether draft_export should be used as a strategy in ONNX export. Follow-up of #161454.
Pull Request resolved: #162225
Approved by: https://github.com/xadupre, https://github.com/titaiwangms
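As a rough sketch of how a boolean environment-variable switch like this is typically read (the helper name and accepted values here are assumptions for illustration, not the actual exporter code):

```python
import os

def draft_export_enabled() -> bool:
    # Hypothetical helper illustrating how a boolean environment-variable
    # switch such as TORCH_ONNX_ENABLE_DRAFT_EXPORT is commonly parsed;
    # the real exporter's parsing logic may differ.
    value = os.environ.get("TORCH_ONNX_ENABLE_DRAFT_EXPORT", "0")
    return value.strip().lower() in ("1", "true", "yes")
```

With the variable unset, the fallback stays disabled, matching the new default this change introduces.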
If the ONNX exporter falls back to draft_export with big models, it takes forever for users and can spam the printout, which keeps users from seeing their stack trace with strict=False. We could consider making another API for draft_export as a debugging tool, or combining it with report=True when the model is small.
Pull Request resolved: pytorch#161454
Approved by: https://github.com/justinchuby
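The behavior described above can be pictured as a simple opt-in fallback strategy; this is an illustrative sketch (the function names are hypothetical, not the exporter's internals):

```python
def export_with_fallback(primary, fallback, *args, enable_fallback=False):
    # Try the primary export path first; only run the slow, verbose
    # fallback when it is explicitly enabled, so a failure otherwise
    # surfaces the original stack trace directly to the user.
    try:
        return primary(*args)
    except Exception:
        if enable_fallback:
            return fallback(*args)
        raise
```

Keeping the fallback off by default means a failing export raises immediately with the original error, instead of spending time in a draft-export pass the user did not ask for.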
cc @justinchuby