
Fix log and metric issue on Windows when using PyTorch with multiprocessing #3604

Merged

merged 1 commit into microsoft:master on May 12, 2021
Conversation

@Ivanfangsc (Contributor) commented May 4, 2021

The problem with logs and metrics when using torch.utils.data.DataLoader with num_workers>0 on Windows has been known for a long time. I found that the log and metric files are wiped of their content whenever a DataLoader is enumerated. I reached out to PyTorch about this, and they told me it is an issue with the multiprocessing library. It uses the spawn start method by default on Windows, in which

unnecessary file descriptors and handles from the parent process will not be inherited (https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods)

This affects the log and metric files, which are opened at the top level in __init__.py and runtime/platform/local.py: when the spawned worker processes re-import those modules, the files are re-opened and their existing content is truncated. Thankfully there is a simple workaround, which is to change the open mode of those files to 'a' (append). I tested it on my local machine and it works.
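As a minimal sketch of the idea (the file path and logger setup below are illustrative, not NNI's actual __init__.py), the difference is whether the module-level file handle is opened in 'w' or 'a' mode. Under the spawn start method, each DataLoader worker re-imports the module and re-runs this code, so 'w' truncates whatever the parent process already wrote, while 'a' leaves it intact:

```python
# example_logging.py -- illustrative module, not NNI's real code.
# Under the Windows "spawn" start method, each DataLoader worker process
# re-imports this module, so the module-level code below runs once per worker.
import logging

LOG_PATH = 'example.log'  # hypothetical path, for illustration only

# Before the fix: mode='w' truncates the file every time a spawned
# worker re-imports this module, wiping the parent's log output.
# handler = logging.FileHandler(LOG_PATH, mode='w')

# After the fix: mode='a' appends, so re-imports in worker processes
# no longer wipe the content already written by the parent process.
handler = logging.FileHandler(LOG_PATH, mode='a')

logger = logging.getLogger('example')
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info('process started')
```

Simply enumerating a torch.utils.data.DataLoader with num_workers > 0 on Windows is enough to trigger the re-import in the worker processes, which is why the truncation only shows up in that configuration.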

@ghost commented May 4, 2021

CLA assistant check
All CLA requirements met.

@ultmaster ultmaster requested a review from liuzhe-lz May 7, 2021 08:49
@ultmaster ultmaster requested a review from J-shang May 7, 2021 08:55
@ultmaster ultmaster added the bug (Something isn't working) and Training Service labels May 7, 2021
@ultmaster ultmaster merged commit 85cff74 into microsoft:master May 12, 2021