[dtensor] Introduce full_tensor API to DTensor by wanchaol · Pull Request #112224 · pytorch/pytorch · GitHub

Conversation

@wanchaol
Collaborator

@wanchaol wanchaol commented Oct 27, 2023

Stack from ghstack (oldest at bottom):

This PR introduces a `full_tensor` API to DTensor. There were so many
call sites exercising the `redistribute(replicate)` path that it feels
like it deserves a dedicated API; it is mostly just syntactic sugar.

[ghstack-poisoned]
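A hedged sketch of the idea, using a toy stand-in for DTensor rather than the real `torch.distributed` API (the `ToyDTensor` and `Replicate` classes below are illustrative assumptions, not the actual implementation): `full_tensor()` is sugar for redistributing to replicate placements on every mesh dimension and then materializing the local tensor.

```python
# Toy sketch of full_tensor as syntactic sugar over redistribute(replicate).
# ToyDTensor / Replicate are illustrative stand-ins, not the real DTensor API.

class Replicate:
    """Placement meaning: every rank holds the full data."""
    def __repr__(self):
        return "Replicate()"

class ToyDTensor:
    def __init__(self, local_shards, placements):
        self.local_shards = local_shards  # per-rank shards, modeled as lists
        self.placements = placements

    def redistribute(self, placements):
        # If asked to replicate, "all-gather" the shards into one full list;
        # here we simply concatenate them.
        if all(isinstance(p, Replicate) for p in placements):
            full = [x for shard in self.local_shards for x in shard]
            return ToyDTensor([full], placements)
        raise NotImplementedError("toy sketch only handles Replicate")

    def to_local(self):
        return self.local_shards[0]

    def full_tensor(self):
        # The sugar this PR adds: replicate on every mesh dim, then to_local().
        replicate_all = [Replicate() for _ in self.placements]
        return self.redistribute(replicate_all).to_local()

dt = ToyDTensor([[1, 2], [3, 4]], placements=["Shard(0)"])
print(dt.full_tensor())  # [1, 2, 3, 4]
```

In the real API, the equivalent of `dt.full_tensor()` replaces the longer `dt.redistribute(...).to_local()` pattern found at many call sites.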
@pytorch-bot

pytorch-bot bot commented Oct 27, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/112224

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 1cdc187 with merge base bf01a7b (image):
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@wanchaol added the `release notes: distributed (dtensor)` and `ciflow/trunk` labels Oct 27, 2023
wanchaol added a commit that referenced this pull request Oct 27, 2023
ghstack-source-id: 4708435
Pull Request resolved: #112224
wanchaol added a commit that referenced this pull request Oct 27, 2023
ghstack-source-id: ad1157b
Pull Request resolved: #112224
@wanchaol wanchaol requested a review from XilunWu October 31, 2023 00:14
Contributor

@wz337 wz337 left a comment


Great!!! LGTM!
I will follow up to change some of the state dict code to use the `full_tensor` API.

@wanchaol
Collaborator Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

xuhancn pushed a commit to xuhancn/pytorch that referenced this pull request Nov 7, 2023
Pull Request resolved: pytorch#112224
Approved by: https://github.com/wz337
Skylion007 pushed a commit to Skylion007/pytorch that referenced this pull request Nov 14, 2023
Pull Request resolved: pytorch#112224
Approved by: https://github.com/wz337

Labels

ciflow/trunk · Merged · release notes: distributed (dtensor)


3 participants