Hello, I’m facing an issue where a weird number of steps per epoch is being displayed and processed during training.
The number of steps per epoch should be what my code defines:
len(train_dataloader) // BATCH_SIZE
However, I’m getting a different number that matches neither len(train_dataloader) nor
len(train_dataloader) // BATCH_SIZE
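One possible source of confusion, assuming train_dataloader is a PyTorch-style DataLoader: len() of a DataLoader already returns the number of batches per epoch, not the number of samples, so dividing it by BATCH_SIZE again yields a much smaller number. A minimal sketch of the arithmetic (the values 1000 and 32 are hypothetical, for illustration only):

```python
import math

BATCH_SIZE = 32     # assumed batch size, for illustration
num_samples = 1000  # hypothetical dataset size

# len(train_dataloader) for a PyTorch DataLoader (with drop_last=False)
# is ceil(num_samples / batch_size) -- the number of batches per epoch:
batches_per_epoch = math.ceil(num_samples / BATCH_SIZE)  # 32

# Dividing that by BATCH_SIZE a second time gives an unexpectedly small
# step count, which may explain the mismatch:
double_divided = batches_per_epoch // BATCH_SIZE  # 1
```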
Here is a Colab link to my code: https://colab.research.google.com/drive/1w6scXBJwLvZC3UlR1WTrpzBXx1miYTP0?usp=sharing
Any thoughts on why this is happening?