Pinning memory issue #11
Labels: Priority: High · Status: Completed · Type: Maintenance
Hi,
I'm currently using ckip-transformers-ws as a preprocessing tool in my project, and I noticed that the DataLoader's `pin_memory` flag is hard-coded to `True` in `util.py`. Since pinning memory is incompatible with multiprocessing (i.e. multiple workers) [1], when users call ckip-transformers inside the `collate_fn` of a DataLoader with multiple workers, a CUDA error occurs as described in [1], even when inference runs on the CPU only.
Therefore, I think it would be better to make `pin_memory` configurable (or set it based on the actual device) rather than hard-coding it to `True`.
Regards.
[1] https://discuss.pytorch.org/t/pin-memory-vs-sending-direct-to-gpu-from-dataset/33891/2