Why is a loaded checkpoint's saved hparams empty?

One of my models has these lines:

import pytorch_lightning as pl
from argparse import Namespace

class DownstreamBridgeModel(pl.LightningModule):
    def __init__(self, hparams):
        super(DownstreamBridgeModel, self).__init__()
        # TODO: Assume no fine-tuning for now.
        # Load the pretrained encoder and set it to evaluation mode
        model_encoder = ModelA.load_from_checkpoint(hparams.pretrained_model)
        model_encoder.eval()
        self.model_encoder = model_encoder

        # Merge the namespaces of the saved checkpoint and the current model
        hparams = Namespace(**vars(model_encoder.hparams_initial), **vars(hparams))
        self.hparams = hparams

What it does is load a checkpoint and obtain some hyperparameters from it. The model being checkpointed contains these lines:

class ModelA(pl.LightningModule):
    def __init__(self, hparams, *args, **kwargs):
        super(ModelA, self).__init__()

        # Accept either a dict or a Namespace
        if isinstance(hparams, dict):
            hparams = Namespace(**hparams)

        self.hparams = hparams
        self.save_hyperparameters(hparams)

The checkpoint loads with no problem. However, when I try to access ModelA's hparams, the namespace object is always empty:

print(vars(self.model_encoder.hparams)) 

This line always prints an empty dictionary. How is it possible that the object is cleared after the loaded model has been initialized? Any help is appreciated, thanks.

It won’t work like this. Here is my suggestion:

def __init__(self, **hparams):  # name your hparams as keyword arguments
    super(ModelA, self).__init__()
    # self.hparams = hparams  # REMOVE THIS
    self.save_hyperparameters()  # NO INPUTS HERE

# If you have a Namespace, pass it in like this, for example;
# if it's a dict, the vars() call is not needed:
model = ModelA(**vars(hparams))
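
For context, here is a minimal end-to-end sketch of that pattern, assuming a recent PyTorch Lightning version; the layer_1_dim and learning_rate names and the checkpoint path are illustrative, not from the original code:

import pytorch_lightning as pl
import torch
from argparse import Namespace

class ModelA(pl.LightningModule):
    def __init__(self, **hparams):
        super().__init__()
        # With no arguments, this captures every kwarg of __init__
        self.save_hyperparameters()
        self.layer = torch.nn.Linear(self.hparams.layer_1_dim, 1)

hparams = Namespace(layer_1_dim=128, learning_rate=1e-3)
model = ModelA(**vars(hparams))
print(model.hparams)  # contains layer_1_dim and learning_rate

# After training and saving a checkpoint, the hyperparameters are
# restored automatically, so hparams is no longer empty:
# restored = ModelA.load_from_checkpoint("path/to/checkpoint.ckpt")
# print(restored.hparams)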

In the official best practices it is suggested that we pass a Namespace or a dict to the model and then pass that to self.save_hyperparameters(). Why do we not need to pass anything to self.save_hyperparameters()?

ref:
https://pytorch-lightning.readthedocs.io/en/latest/hyperparameters.html

@Chris_Liu self.save_hyperparameters() will automatically collect all the kwargs passed in as ModelA(**vars(hparams)) and save them. If you want only specific ones to be saved, you can pass their names, like self.save_hyperparameters('layer_1_dim', 'learning_rate').
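
As a quick illustration of the selective form (a sketch; LitExample and the debug_flag argument are made-up names for this example):

import pytorch_lightning as pl

class LitExample(pl.LightningModule):
    def __init__(self, layer_1_dim, learning_rate, debug_flag=False):
        super().__init__()
        # Save only the named arguments; debug_flag is deliberately
        # left out of the saved hyperparameters
        self.save_hyperparameters('layer_1_dim', 'learning_rate')

model = LitExample(layer_1_dim=128, learning_rate=1e-3, debug_flag=True)
print(model.hparams)  # shows layer_1_dim and learning_rate only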
