View Hparam in log

Please tell me how I can save, display, and compare a model's launch parameters (hyperparameters) in some dashboard like TensorBoard.

You should be able to see all hyperparameters in TensorBoard if you have saved them in your model:

class MyModel(LightningModule):
    def __init__(self, my_arg):
        super().__init__()
        ...
        self.save_hyperparameters()

For more information, see the reference in the docs.
Alternatively, you can make an extra call to save your hyperparameters from the Trainer's logger, like
trainer.logger.log_hyperparams(model.hparams_initial)
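To see what `save_hyperparameters()` is doing for you, here is a toy, standard-library-only sketch of the mechanism: it inspects the caller's `__init__` frame and stores its arguments. The class names here are hypothetical; the real implementation lives inside `pytorch_lightning` and does considerably more (serialization, ignore lists, logging integration).

```python
import inspect

class ToyModule:
    """Toy stand-in for LightningModule, illustrating how
    save_hyperparameters() captures the caller's __init__ arguments.
    (Hypothetical sketch, not the real implementation.)"""

    def save_hyperparameters(self):
        # Inspect the calling frame (the subclass __init__) and grab
        # its local variables, minus `self`.
        frame = inspect.currentframe().f_back
        self.hparams = {k: v for k, v in frame.f_locals.items() if k != "self"}

class MyModel(ToyModule):
    def __init__(self, learning_rate, hidden_size=128):
        self.save_hyperparameters()

model = MyModel(learning_rate=1e-3)
print(model.hparams)  # {'learning_rate': 0.001, 'hidden_size': 128}
```

In the real library, the captured dict ends up in `model.hparams` and is written to TensorBoard's HPARAMS tab by the logger.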


Thanks, great work!

Do you know how to add hp_metric at the end of training, with the best values from the validation set?

You can, for example, call
trainer.logger.log_metrics({'hp_metric': trainer.callback_metrics['loss']})
at the end of training, e.g. in on_fit_end(); I do it in on_epoch_end(), so if training breaks I still get the last epoch's value.
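The hook pattern above can be sketched without the library, using a stub logger and hypothetical names (`StubLogger`, `val_loss`): at the end of fit, the monitored value is copied into `hp_metric` so TensorBoard's HPARAMS tab shows it next to the hyperparameters.

```python
class StubLogger:
    """Minimal stand-in for a TensorBoard logger (hypothetical)."""
    def __init__(self):
        self.logged = {}

    def log_metrics(self, metrics, step=None):
        self.logged.update(metrics)

class HpMetricModule:
    """Sketch of the hook pattern: mirror a tracked metric into
    hp_metric when training ends."""
    def __init__(self, logger):
        self.logger = logger
        self.callback_metrics = {}

    def on_fit_end(self):
        # Copy the monitored value into hp_metric at the end of fit.
        self.logger.log_metrics({"hp_metric": self.callback_metrics["val_loss"]})

m = HpMetricModule(StubLogger())
m.callback_metrics["val_loss"] = 0.25  # pretend this is the last validation loss
m.on_fit_end()
print(m.logger.logged)  # {'hp_metric': 0.25}
```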

If you are using a checkpointing system and hp_metric is the value you monitor, you can read the checkpoint callback's best_model_score attribute; see the link to the code.
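A toy sketch of what `best_model_score` tracks, assuming a "min" monitored metric (the class and method names here are illustrative, not the real `ModelCheckpoint` API):

```python
class ToyCheckpoint:
    """Toy version of a checkpoint callback that tracks the best
    monitored score, mimicking the idea behind best_model_score."""
    def __init__(self, mode="min"):
        self.mode = mode
        self.best_model_score = None

    def update(self, score):
        # Keep the lowest score for mode="min", the highest for mode="max".
        if self.best_model_score is None:
            self.best_model_score = score
        elif self.mode == "min" and score < self.best_model_score:
            self.best_model_score = score
        elif self.mode == "max" and score > self.best_model_score:
            self.best_model_score = score

ckpt = ToyCheckpoint(mode="min")
for val_loss in [0.9, 0.4, 0.6]:
    ckpt.update(val_loss)
print(ckpt.best_model_score)  # 0.4
```

At the end of training you would then log something like `trainer.logger.log_metrics({'hp_metric': checkpoint.best_model_score})`, so the HPARAMS tab reflects the best validation value rather than the last one.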