
prefetch_factor #45

Open
zhuang-maowei opened this issue May 20, 2023 · 1 comment

@zhuang-maowei
fastNLP\core\dataloaders\torch_dataloader\fdl.py in __init__(self, dataset, batch_size, shuffle, sampler, batch_sampler, num_workers, collate_fn, pin_memory, drop_last, timeout, worker_init_fn, multiprocessing_context, generator, prefetch_factor, persistent_workers, **kwargs)
150 persistent_workers=persistent_workers)
...
--> 245 raise ValueError('prefetch_factor option could only be specified in multiprocessing.'
246 'let num_workers > 0 to enable multiprocessing, otherwise set prefetch_factor to None.')
247 elif num_workers > 0 and prefetch_factor is None:

ValueError: prefetch_factor option could only be specified in multiprocessing.let num_workers > 0 to enable multiprocessing, otherwise set prefetch_factor to None.
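For context, the check that raises this error can be sketched roughly as follows (a paraphrase for illustration, not the exact fastNLP/PyTorch source):

```python
def check_prefetch_factor(num_workers, prefetch_factor):
    """Sketch of the validation behind the ValueError above.

    prefetch_factor only makes sense when worker processes exist,
    so it must be None whenever num_workers == 0.
    """
    if num_workers == 0 and prefetch_factor is not None:
        raise ValueError(
            "prefetch_factor option could only be specified in multiprocessing. "
            "let num_workers > 0 to enable multiprocessing, "
            "otherwise set prefetch_factor to None.")
    if num_workers > 0 and prefetch_factor is None:
        prefetch_factor = 2  # the usual default once workers are enabled
    return prefetch_factor
```

So the two valid configurations are `num_workers == 0` with `prefetch_factor=None`, or `num_workers > 0` with any (or no) prefetch_factor.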

@diyuhuo9977

I have the same problem; setting the prefetch_factor parameter to None fixes this error.
fastNLP==1.0.1
fasthan==2.0
path:
FastModel.py#__preprocess_sentence

dataloader = TorchDataLoader(dataset, prefetch_factor=None,
                             batch_sampler=BucketedBatchSampler(
                                 dataset=dataset,
                                 batch_size=1,
                                 length='seq_len'))
