StackDataset batched sampling by stsouko · Pull Request #110694 · pytorch/pytorch · GitHub

Conversation

@stsouko
Contributor

@stsouko stsouko commented Oct 6, 2023

Optimization of minibatch loading.
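The optimization builds on the `__getitems__` batched-fetch protocol: when a nested dataset exposes `__getitems__`, a whole minibatch can be retrieved in one call instead of one `__getitem__` call per index. A minimal, self-contained sketch of that pattern (the toy dataset and the `fetch_batch` helper are illustrative, not the PR's code):

```python
# Sketch of batched fetching via __getitems__; simplified, not the exact PR diff.
from typing import List


class RangeSquares:
    """Toy dataset: item i is i ** 2."""

    def __getitem__(self, idx: int) -> int:
        return idx ** 2

    def __getitems__(self, indices: List[int]) -> List[int]:
        # One call returns the whole minibatch, avoiding per-index
        # Python-level dispatch overhead in the loader.
        return [i ** 2 for i in indices]


def fetch_batch(dataset, indices: List[int]) -> List[int]:
    # Prefer the batched path when the dataset provides it -- the same
    # feature-detection pattern the PR applies inside StackDataset.
    if callable(getattr(dataset, "__getitems__", None)):
        return dataset.__getitems__(indices)
    return [dataset[i] for i in indices]


print(fetch_batch(RangeSquares(), [0, 2, 3]))  # [0, 4, 9]
```

The `callable(getattr(...))` probe mirrors how the PR detects batched-fetch support on each nested dataset.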

@stsouko stsouko requested a review from ejguan as a code owner October 6, 2023 10:18
@pytorch-bot pytorch-bot bot added the release notes: dataloader release notes category label Oct 6, 2023
@pytorch-bot

pytorch-bot bot commented Oct 6, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/110694

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit 7a31096 with merge base 65afa76:

UNSTABLE - The following jobs failed, but were likely due to flakiness present on trunk and have been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@ezyang ezyang added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Oct 6, 2023
@ezyang
Contributor

ezyang commented Oct 6, 2023

@ejguan will you be able to review this?

@ejguan
Contributor

ejguan commented Oct 6, 2023

> @ejguan will you be able to review this?

Sure thing. I will review it later today.

Contributor

@ejguan ejguan left a comment


Overall LGTM with a few nit comments

dict_batch: List[T_dict] = [{} for _ in indices]
for k, dataset in self.datasets.items():
    if callable(getattr(dataset, "__getitems__", None)):
        for data, d_sample in zip(dataset.__getitems__(indices), dict_batch):  # type: ignore[attr-defined]

Could you please raise an error if the size of the result from dataset.__getitems__(indices) is not correct?
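A hedged sketch of the requested size check (the `checked_getitems` helper name is hypothetical; the PR performs the check inline inside `StackDataset.__getitems__`):

```python
from typing import List, Sequence


def checked_getitems(dataset, indices: Sequence[int]) -> List:
    # Hypothetical helper: fetch a batch and verify the nested dataset
    # returned exactly one item per requested index.
    items = dataset.__getitems__(indices)
    if len(items) != len(indices):
        raise ValueError(
            "Nested dataset's output size mismatch."
            f" Expected {len(indices)}, got {len(items)}"
        )
    return items


class ShortDataset:
    # Misbehaving example dataset: drops the last requested item.
    def __getitems__(self, indices):
        return list(indices)[:-1]
```

Without such a check, a misbehaving `__getitems__` would silently leave some samples in the batch partially filled.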

list_batch: List[list] = [[] for _ in indices]
for dataset in self.datasets:
    if callable(getattr(dataset, "__getitems__", None)):
        for data, t_sample in zip(dataset.__getitems__(indices), list_batch):  # type: ignore[attr-defined]

ditto

@stsouko stsouko requested a review from ejguan October 8, 2023 08:51
Contributor

@ejguan ejguan left a comment


Thanks, overall LGTM with a comment on the test case.

Comment on lines +262 to +263
raise ValueError("Nested dataset's output size mismatch."
f" Expected {len(indices)}, got {len(items)}")

Can you please add a unit test to validate that the ValueError is raised as expected?
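A sketch of what such a unit test could look like. It uses a minimal local stand-in for StackDataset's tuple-mode batched path so it runs standalone; the real test would exercise `torch.utils.data.StackDataset` directly, and all class names here are illustrative:

```python
# Sketch of a unit test validating the size-mismatch ValueError.
# MiniStack is a toy stand-in, not torch.utils.data.StackDataset.
import unittest
from typing import List, Sequence


class MiniStack:
    """Tiny stand-in mimicking StackDataset's tuple-mode __getitems__."""

    def __init__(self, *datasets):
        self.datasets = datasets

    def __getitems__(self, indices: Sequence[int]) -> List[tuple]:
        list_batch: List[list] = [[] for _ in indices]
        for dataset in self.datasets:
            items = dataset.__getitems__(indices)
            if len(items) != len(indices):
                raise ValueError(
                    "Nested dataset's output size mismatch."
                    f" Expected {len(indices)}, got {len(items)}"
                )
            for data, t_sample in zip(items, list_batch):
                t_sample.append(data)
        return [tuple(sample) for sample in list_batch]


class ShortDataset:
    # Misbehaving dataset: returns one item fewer than requested.
    def __getitems__(self, indices):
        return list(indices)[:-1]


class TestSizeMismatch(unittest.TestCase):
    def test_value_error_raised(self):
        with self.assertRaisesRegex(ValueError, "size mismatch"):
            MiniStack(ShortDataset()).__getitems__([0, 1, 2])


if __name__ == "__main__":
    unittest.main()
```

`assertRaisesRegex` checks both that the exception type is ValueError and that the message mentions the mismatch.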

@stsouko stsouko requested a review from ejguan October 10, 2023 07:47
@ejguan
Contributor

ejguan commented Oct 10, 2023

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Oct 10, 2023
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

@stsouko stsouko deleted the getitems_stack branch October 10, 2023 22:08