Set torch.utils.data.DataLoader.batch_sampler epoch if defined #3123
Labels: enhancement
🚀 Feature Request
The current implementation of the Composer trainer calls the DistributedSampler.set_epoch method only on the DataLoader.sampler attribute, but not on the DataLoader.batch_sampler attribute, even when it is defined. One example is here.
Motivation
When doing distributed training based on a batch sampler, one might want the epoch to be properly set on the batch_sampler, since the epoch is usually used to derive a new seed over time. This would be useful for metric learning, where a batch_sampler can be a worthwhile feature. For now, the Composer trainer only handles the regular sampler.
Implementation
I'll propose a PR with a technical implementation. The idea is to check whether batch_sampler is defined, in which case we can set the epoch on it; otherwise we set it on the regular sampler (always defined in torch.utils.data.DataLoader).
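As a hedged sketch of that check (not the actual PR; the same hypothetical `set_dataloader_epoch` helper as above, and the `hasattr` duck-typing for custom batch samplers is my assumption):

```python
import torch.utils.data

def set_dataloader_epoch(dataloader: torch.utils.data.DataLoader, epoch: int) -> None:
    batch_sampler = dataloader.batch_sampler
    if batch_sampler is not None and hasattr(batch_sampler, "set_epoch"):
        # A custom batch sampler (e.g. for metric learning) reseeds itself here.
        batch_sampler.set_epoch(epoch)
    elif isinstance(dataloader.sampler, torch.utils.data.DistributedSampler):
        # Fall back to the regular sampler, which DataLoader always defines.
        dataloader.sampler.set_epoch(epoch)
```

The `hasattr` check is used for the batch sampler because custom batch samplers typically wrap a DistributedSampler rather than subclass it, so an `isinstance` check would miss them.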