Dynamic dataloader

Hi everyone!

What’s the best way to periodically reinitialize train_dataloader? For example, if I want to use progressively more aggressive augmentations after a certain number of epochs.


To reload at every epoch, you can use Trainer(reload_dataloaders_every_epoch=True, ...).
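A minimal sketch of how that could look for epoch-dependent augmentations. With `reload_dataloaders_every_epoch=True`, the `train_dataloader` hook is called again each epoch, so it can pick augmentations based on `self.current_epoch`. The schedule format, `pick_augmentations` helper, and the class names in the comments are hypothetical, not part of the Lightning API:

```python
def pick_augmentations(epoch, schedule):
    """Return the config for the latest milestone <= epoch.

    `schedule` maps a starting epoch to an augmentation config,
    e.g. {0: "light", 10: "medium", 20: "heavy"} (hypothetical).
    """
    chosen = None
    for start in sorted(schedule):
        if epoch >= start:
            chosen = schedule[start]
    return chosen

# Inside a LightningModule it could look roughly like this (sketch only):
#
# class Model(pl.LightningModule):
#     def train_dataloader(self):
#         aug = pick_augmentations(self.current_epoch, SCHEDULE)
#         return DataLoader(MyDataset(transform=aug), batch_size=32)
#
# trainer = pl.Trainer(reload_dataloaders_every_epoch=True)

print(pick_augmentations(5, {0: "light", 10: "medium", 20: "heavy"}))
print(pick_augmentations(25, {0: "light", 10: "medium", 20: "heavy"}))
```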

For reloading every n epochs, there is a PR for it: https://github.com/PyTorchLightning/pytorch-lightning/pull/5043
