StackDataset batched sampling #110694
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/110694
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (2 unrelated failures)
As of commit 7a31096 with merge base 65afa76: UNSTABLE. The following jobs failed but were likely due to flakiness present on trunk and have been marked as unstable.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@ejguan will you be able to review this?
Sure thing. I will review it later today.
Overall LGTM with a few nit comments
torch/utils/data/dataset.py (outdated diff)
dict_batch: List[T_dict] = [{} for _ in indices]
for k, dataset in self.datasets.items():
    if callable(getattr(dataset, "__getitems__", None)):
        for data, d_sample in zip(dataset.__getitems__(indices), dict_batch):  # type: ignore[attr-defined]
Could you please raise an error if the size of the result from dataset.__getitems__(indices) is not correct? A standalone sketch of the requested guard follows.
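For illustration, a hedged standalone sketch of the dict branch with that size check folded in. `_stack_getitems_dict` is a hypothetical free function, not the method as merged; the error message is copied from the check that lands later in this PR, and the fallback branch is an assumption about the surrounding code.

```python
from typing import Any, Dict, List


def _stack_getitems_dict(datasets: Dict[str, Any], indices: List[int]) -> List[dict]:
    # Hypothetical standalone rendering of StackDataset's dict branch.
    dict_batch: List[dict] = [{} for _ in indices]
    for k, dataset in datasets.items():
        if callable(getattr(dataset, "__getitems__", None)):
            items = dataset.__getitems__(indices)
            # Requested guard: the nested batched fetch must return exactly
            # one sample per requested index.
            if len(items) != len(indices):
                raise ValueError("Nested dataset's output size mismatch."
                                 f" Expected {len(indices)}, got {len(items)}")
            for data, d_sample in zip(items, dict_batch):
                d_sample[k] = data
        else:
            # Assumed fallback: fetch samples one index at a time.
            for idx, d_sample in zip(indices, dict_batch):
                d_sample[k] = dataset[idx]
    return dict_batch
```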
torch/utils/data/dataset.py (outdated diff)
list_batch: List[list] = [[] for _ in indices]
for dataset in self.datasets:
    if callable(getattr(dataset, "__getitems__", None)):
        for data, t_sample in zip(dataset.__getitems__(indices), list_batch):  # type: ignore[attr-defined]
ditto
Thanks, overall LGTM with a comment on the test case.
raise ValueError("Nested dataset's output size mismatch."
                 f" Expected {len(indices)}, got {len(items)}")
Can you please add a unit test to validate that the ValueError is raised as expected?
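A minimal sketch of such a test, assuming the merged ValueError message contains "size mismatch" and that `StackDataset.__getitems__` can be called directly; the dataset class and test names below are illustrative, not the ones ultimately used in the PR.

```python
import unittest

import torch
from torch.utils.data import Dataset, StackDataset


class _ShortGetItemsDataset(Dataset):
    """Toy dataset whose batched fetch deliberately drops the last sample."""

    def __init__(self, data):
        self.data = data

    def __getitem__(self, index):
        return self.data[index]

    def __getitems__(self, indices):
        # Return one item too few so the size-mismatch guard fires.
        return [self.data[i] for i in indices[:-1]]

    def __len__(self):
        return len(self.data)


class TestStackDatasetGetItems(unittest.TestCase):
    def test_size_mismatch_raises(self):
        stack = StackDataset(_ShortGetItemsDataset(torch.arange(8)))
        with self.assertRaisesRegex(ValueError, "size mismatch"):
            stack.__getitems__([0, 1, 2, 3])


if __name__ == "__main__":
    unittest.main()
```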
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Linked issue: Optimization of loading minibatches
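For context on that optimization, a hedged usage sketch (tensor shapes and names are made up; it assumes a PyTorch build that includes this PR, where `StackDataset.__getitems__` forwards a whole list of indices to each wrapped dataset that implements `__getitems__`, such as `TensorDataset`, instead of fetching samples one index at a time):

```python
import torch
from torch.utils.data import DataLoader, StackDataset, TensorDataset

# Hypothetical parallel data sources of equal length.
images = TensorDataset(torch.randn(100, 3, 8, 8))
labels = TensorDataset(torch.randint(0, 10, (100,)))

# Keyword form: each sample is a dict with "image" and "label" keys.
stacked = StackDataset(image=images, label=labels)

# With automatic batching, the DataLoader's fetcher can hand the whole list
# of minibatch indices to StackDataset.__getitems__ in one call, which in
# turn makes one batched call per wrapped TensorDataset rather than one
# __getitem__ call per sample.
loader = DataLoader(stacked, batch_size=16)
for batch in loader:
    # batch is a dict keyed by "image" and "label".
    pass
```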