What are the recommended or most common practices for using a scheduler?

In particular, if:

  1. I am training with epochs
  2. I am training with iterations instead

For 1, is calling the scheduler once every epoch the most common approach?
For 2, is calling the scheduler every 150 iterations the most common (to approximate stepping once per epoch)?

Or should the scheduler instead be stepped whenever the validation loss stops decreasing? E.g., something like the sketch below:
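Here is a minimal plain-PyTorch sketch of the patterns I mean (the model, training step, and validation loss are just placeholders):

```python
import torch
from torch.optim.lr_scheduler import StepLR, ReduceLROnPlateau

model = torch.nn.Linear(10, 1)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Option 1: step once per epoch (StepLR decays the LR every 30 epochs here).
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
for epoch in range(100):
    # placeholder training step; a real loop would iterate over batches
    loss = model(torch.randn(8, 10)).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # once per epoch, after the optimizer step(s)

# Option 3: step on the validation metric instead of a fixed schedule;
# ReduceLROnPlateau lowers the LR when val_loss stops improving.
plateau = ReduceLROnPlateau(optimizer, mode="min", patience=10)
for epoch in range(100):
    val_loss = 1.0  # placeholder: compute the real validation loss here
    plateau.step(val_loss)
```

(Option 2 would look like Option 1, but with `scheduler.step()` called every 150 iterations inside the batch loop instead of once per epoch.)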

Hey @Brando_Miranda,

You might want to set up the scheduler configuration accordingly. Check out the examples here: LightningModule — PyTorch Lightning 1.8.0dev documentation
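For instance, the `lr_scheduler` dict returned from `configure_optimizers` controls when Lightning steps the scheduler. A minimal sketch (the model and the choice of `StepLR` here are just placeholders):

```python
import torch
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(10, 1)  # placeholder model

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                # "epoch" steps once per epoch; "step" steps per batch
                "interval": "epoch",
                # e.g. interval="step" with frequency=150 would step
                # the scheduler every 150 iterations
                "frequency": 1,
                # for ReduceLROnPlateau, also set "monitor": "val_loss"
            },
        }
```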

Also, we have moved the discussions to GitHub Discussions. You might want to post there instead to get a quicker response; the forums will be marked read-only soon.

Thank you