I have a use case where I need to access the LightningModule's optimizers in a callback. The motivation is to call a function in
on_save_checkpoint() to update the optimizer state dict before it is dumped to a checkpoint. I want to do this in a callback rather than in the LightningModule's
on_save_checkpoint(), because this functionality is specific to the optimizer, which could be reused across a number of LightningModules.
Is trainer.train_loop.get_optimizers_iterable() the right API to use for this purpose? https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/training_loop.py#L548
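For reference, here is a rough sketch of the kind of callback I have in mind. It assumes the callback hook signature on_save_checkpoint(trainer, pl_module, checkpoint), that trainer.optimizers lists the configured optimizers, and that checkpoint["optimizer_states"] holds their state dicts in the same order. The base Callback class is stubbed out so the snippet runs standalone, and "custom_key" is just a placeholder for the real update:

```python
from types import SimpleNamespace


class OptimizerStateCallback:  # in practice, subclass pytorch_lightning.Callback
    def on_save_checkpoint(self, trainer, pl_module, checkpoint):
        # trainer.optimizers: list of configured optimizers (assumed);
        # checkpoint["optimizer_states"]: their state dicts, in order.
        for idx, optimizer in enumerate(trainer.optimizers):
            state = checkpoint["optimizer_states"][idx]
            # Placeholder transformation: tag each state dict with the
            # optimizer class name before it is written to disk.
            state["custom_key"] = type(optimizer).__name__


# Quick standalone check with fake trainer/checkpoint objects.
class FakeOpt:
    pass


trainer = SimpleNamespace(optimizers=[FakeOpt()])
checkpoint = {"optimizer_states": [{"state": {}, "param_groups": []}]}
OptimizerStateCallback().on_save_checkpoint(trainer, None, checkpoint)
print(checkpoint["optimizer_states"][0]["custom_key"])  # FakeOpt
```

The open question is just whether trainer.optimizers (or get_optimizers_iterable()) is the supported way to reach the optimizers from a callback.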