I'm using LightningCLI and I'd like to start training at epoch 0 from a model checkpoint. I tried the
--trainer.resume_from_checkpoint flag, but it resumes from epoch 50, the last epoch of the run that produced the checkpoint. That does not play nicely with my learning rate schedule. What I really want is to use the checkpoint only as a weight initialization and start training from epoch 0, with fresh trainer state. Any ideas? Thanks!
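To make it concrete, here is a minimal sketch of the weights-only behavior I'm after, written in plain PyTorch rather than through the CLI (TinyModel and the checkpoint contents are illustrative; a Lightning checkpoint is a dict that includes a "state_dict" key alongside trainer state such as "epoch"):

```python
import torch
from torch import nn

class TinyModel(nn.Module):
    """Stand-in for my LightningModule's network."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

# Simulate a checkpoint written at the end of a 50-epoch run:
# weights plus trainer state (epoch counter, etc.).
model = TinyModel()
torch.save({"state_dict": model.state_dict(), "epoch": 50}, "ckpt.pt")

# What I want: restore ONLY the weights into a fresh model,
# leaving the epoch counter / LR schedule untouched so training
# starts again from epoch 0.
fresh = TinyModel()
ckpt = torch.load("ckpt.pt")
fresh.load_state_dict(ckpt["state_dict"])  # weights only; trainer state ignored
```

With resume_from_checkpoint, by contrast, the "epoch" entry (and the LR scheduler state) is restored too, which is exactly what I'm trying to avoid.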