PyTorch Lightning: disable pre-fetch of 1 batch

Hello,

I am facing a problem saving the current batch's output as the input for the next batch, because Lightning always loads one batch ahead before the current training step has produced its output.

https://pytorch-lightning.readthedocs.io/en/stable/guides/data.html

“When iterable datasets are used, Lightning will pre-fetch 1 batch (in addition to the current batch) so it can detect when the training will stop and run validation if necessary.”

Is there any way to disable this pre-fetch? Or, alternatively, is there a way to save the current output first and read it back as the input for the next batch?
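One workaround I have been considering (a minimal plain-Python sketch, not a Lightning API) is to keep the previous step's output as state on the training object itself rather than routing it through the DataLoader. Since the cached value lives outside the dataset, the one-batch pre-fetch no longer matters. The `StatefulTrainer` class and its arithmetic "forward pass" below are hypothetical stand-ins for illustration only:

```python
class StatefulTrainer:
    """Caches the previous step's output so the next step can consume it."""

    def __init__(self):
        self.prev_output = None  # no cached output before the first batch

    def training_step(self, batch):
        # Combine the raw batch with the previous step's cached output, if any.
        if self.prev_output is not None:
            effective_input = [x + self.prev_output for x in batch]
        else:
            effective_input = batch
        # Stand-in for the real forward pass: here, just an average.
        output = sum(effective_input) / len(effective_input)
        self.prev_output = output  # cache for the next batch
        return output


trainer = StatefulTrainer()
out1 = trainer.training_step([1.0, 2.0])  # no cache yet -> plain average
out2 = trainer.training_step([3.0, 5.0])  # batch shifted by out1 before averaging
```

In a real LightningModule the same pattern would mean assigning the output to an attribute inside `training_step` and reading it back on the next call, so the dataset never needs to see it.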

Thanks