Accessing available values to monitor when saving checkpoints

I would like to save the top-10 checkpoints during training. From the documentation, setting the save_top_k, monitor, and mode options of ModelCheckpoint together seems to do the job.

But I am not sure which values are available for this callback to monitor. Are they the values logged during training_step() or validation_step() through self.log("loss", XYZ)?
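
For reference, here is a minimal sketch of what I am currently trying. The toy LitModel and the "val_loss" key are just placeholders; my assumption is that monitor has to match one of the keys passed to self.log():

```python
import torch
import pytorch_lightning as pl
from pytorch_lightning.callbacks import ModelCheckpoint

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        self.log("train_loss", loss)      # logged during training
        return loss

    def validation_step(self, batch, batch_idx):
        x, y = batch
        val_loss = torch.nn.functional.mse_loss(self.layer(x), y)
        self.log("val_loss", val_loss)    # key I intend to monitor
        return val_loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())

# Keep the 10 best checkpoints ranked by the logged "val_loss"
checkpoint_cb = ModelCheckpoint(
    monitor="val_loss",
    mode="min",
    save_top_k=10,
)
trainer = pl.Trainer(callbacks=[checkpoint_cb], max_epochs=5)
```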

Thank you in advance!