[NCCL][Profiler] Add functionality to call dump function of NCCL profiler plugin #137523
Conversation
This appears to be a diff that was exported from Phabricator, but the PR author does not have sufficient permissions to run CI. @zhiyongww, please do step 2 of the internal wiki to get write access so you do not need CI approvals in the future. If you think this is a mistake, please contact the PyTorch Dev Infra team.
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/137523
Note: Links to docs will display an error until the docs builds have been completed.
✅ No failures as of commit 162468a with merge base 76ab1ab.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D61929401
…iler plugin (pytorch#137523)

Summary: NCCL 2.23.4 provides the profiler plugin feature, which traces collective, p2p, proxyOps, and other events. This diff adds the following feature: when NCCL times out, the flight recorder can also dump traces through the profiler plugin.

Test Plan:

```
import torch
import torch.distributed as dist

# RANK, WORLD_SIZE, dev, and the loop index i are set up by the test
# harness (not shown in the original test plan).
tensor = torch.tensor([dist.get_rank()], dtype=torch.int32, device=dev)
# Create a list with the same number of elements as the world size (i.e. the
# number of ranks). During all_gather this list is populated with tensors
# from all ranks.
gathered_tensors = [torch.zeros_like(tensor) for _ in range(WORLD_SIZE)]
# Rank 0 skips the collective after iteration 10, forcing a timeout.
if i <= 10 or RANK != 0:
    dist.all_gather(gathered_tensors, tensor)
```

My script triggers the flight recorder.

```
trainer/0 [0]:E0927 12:07:22.643702 1012209 ProcessGroupNCCL.cpp:1356] [PG ID 0 PG GUID 0(default_pg) Rank 0] ProcessGroupNCCL preparing to dump debug info.
trainer/0 [0]:I0927 12:07:22.643784 1012209 ProcessGroupNCCL.cpp:392] NCCL_PROFILER_PLUGIN: /data/users/zhiyongww/fbsource/fbcode/scripts/nbahl/libnccl_profiler_plugin.so
trainer/0 [0]:I0927 12:07:22.643805 1012209 plugin.cpp:559] Profiler start dump
trainer/0 [0]:I0927 12:07:22.645249 1012209 ProcessGroupNCCL.cpp:1363] [PG ID 0 PG GUID 0(default_pg) Rank 0] ProcessGroupNCCL dumping nccl trace to /tmp/nccl_trace_rank_0
trainer/0 [0]:I0927 12:07:22.645418 1012209 NCCLUtils.cpp:348] Finished writing NCCLPG debug info to /tmp/nccl_trace_rank_0
```

Content from /tmp/nccl_trace_rank_0: P1614645283

Differential Revision: D61929401
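For context, here is a self-contained sketch of a driver script along the lines of the test plan above. The process-group setup, the loop structure, and the use of `torchrun` are assumptions filled in for illustration; only the tensor/all_gather/skip logic comes from the original snippet.

```
# Hypothetical reconstruction of the timeout-trigger script; the original
# internal harness is not shown in this PR.
import os
import torch
import torch.distributed as dist

RANK = int(os.environ["RANK"])              # set by torchrun
WORLD_SIZE = int(os.environ["WORLD_SIZE"])  # set by torchrun

def main():
    dev = torch.device("cuda", RANK % torch.cuda.device_count())
    torch.cuda.set_device(dev)
    dist.init_process_group("nccl")
    for i in range(20):
        tensor = torch.tensor([dist.get_rank()], dtype=torch.int32, device=dev)
        gathered_tensors = [torch.zeros_like(tensor) for _ in range(WORLD_SIZE)]
        # Rank 0 stops joining the collective after iteration 10; the other
        # ranks then block in all_gather until the NCCL watchdog times out,
        # which is what triggers the flight recorder dump.
        if i <= 10 or RANK != 0:
            dist.all_gather(gathered_tensors, tensor)
    dist.destroy_process_group()

if __name__ == "__main__":
    main()

# Launched with e.g.: torchrun --nproc-per-node=2 trigger_timeout.py
```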
Force-pushed from 2bc47e0 to 162468a.
This pull request was exported from Phabricator. Differential Revision: D61929401
1 similar comment
@pytorchbot merge -f 'Landed internally' (Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…ler plugin (#139847)

Reverts PR #137523. Reasons for the reversion:

1. The NCCL profiler plugin is meant to be opened by NCCL, and the profiler's implementation is meant to be provided by a profiler. There is no evidence that `torch.distributed` is in a better position to be either an opener or a provider. (The reverted PR made `torch.distributed` an opener.)
2. The main purpose of the reverted PR is to dlopen a dump function, with the help of an environment variable `NCCL_PROFILER_PLUGIN_FUN` that provides the symbol name, as in the code here: https://github.com/pytorch/pytorch/blob/c19c38469030cdf399714eb2051887f4583006a8/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp#L415-L427 After some investigation, NCCL does not support the env var `NCCL_PROFILER_PLUGIN_FUN`, and NCCL's profiler contract `nccl_profiler.h` does not define a function called `ncclProfilerPluginDump`. So this looks like a private add-on.

Pull Request resolved: #139847
Approved by: https://github.com/c-p-i-o
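To illustrate the mechanism being objected to, here is a rough Python/ctypes analogue of the dlopen/dlsym pattern the revert message describes (the actual code is C++ in ProcessGroupNCCL.cpp at the link above; the dump function's signature is not specified publicly, so the no-argument call below is an assumption):

```
import ctypes
import os

def call_profiler_dump():
    # Analogue of the reverted logic: torch.distributed itself dlopens the
    # profiler plugin and looks up a dump symbol whose name comes from the
    # env var NCCL_PROFILER_PLUGIN_FUN -- a private add-on, not part of
    # NCCL's nccl_profiler.h contract.
    plugin_path = os.environ.get("NCCL_PROFILER_PLUGIN")
    symbol_name = os.environ.get("NCCL_PROFILER_PLUGIN_FUN", "ncclProfilerPluginDump")
    if plugin_path is None:
        return
    lib = ctypes.CDLL(plugin_path)             # dlopen
    dump_fn = getattr(lib, symbol_name, None)  # dlsym; None if the symbol is absent
    if dump_fn is not None:
        dump_fn()  # assumed to take no arguments and write the trace dump
```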
Summary:
NCCL 2.23.4 provides the profiler plugin feature, which traces collective, p2p, proxyOps, and other events.
This diff adds the following feature: when NCCL times out, the flight recorder can also dump traces through the profiler plugin.
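As a usage sketch, enabling this path presumably means pointing the process at the plugin and turning on PyTorch's dump-on-timeout behavior before `init_process_group`. `NCCL_PROFILER_PLUGIN` appears in the test-plan logs above; the `TORCH_NCCL_*` knobs are standard flight-recorder settings, but treat the exact values here as assumptions:

```
import os

# Set before torch.distributed.init_process_group(); values are illustrative.
os.environ["NCCL_PROFILER_PLUGIN"] = "/path/to/libnccl_profiler_plugin.so"
os.environ["TORCH_NCCL_TRACE_BUFFER_SIZE"] = "2000"  # enable the flight recorder
os.environ["TORCH_NCCL_DUMP_ON_TIMEOUT"] = "1"       # dump traces when a collective times out
```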
Test Plan:
My script triggers the flight recorder.
Content from /tmp/nccl_trace_rank_0: P1614645283
Differential Revision: D61929401
cc @XilunWu @H-Huang @awgu @kwen2501 @wanchaol @fegin @fduwjj @wz337 @wconstab @d4l3k @c-p-i-o