Revert "torch.set_num_threads sets MKL option too" by ssnl · Pull Request #4967 · pytorch/pytorch · GitHub

Conversation

@ssnl (Collaborator) commented Jan 31, 2018

Reverts #4949

The recent change making torch.set_num_threads also set the MKL thread count (#4949) causes segfaults under multiprocessing on the CUDA CI machines. Our dataloader workers do in fact call torch.set_num_threads(1) before the loading loop. Let's revert this first to unblock the PRs.

@soumith soumith merged commit f2d3f20 into pytorch:master Jan 31, 2018
@ssnl ssnl deleted the revert-4949-mklthreads branch January 31, 2018 20:39
@apaszke (Contributor) commented Jan 31, 2018

Hey, why did you revert multiple other PRs?

