LM.load_from_checkpoint() doesn't allow me to use LM.model

Hi all,

I tried to create an inference program using my PL module.
At first, I thought that I could load a PL module (e.g., my_pl.load_from_checkpoint("xyz.pt")) and use the model inside it (e.g., my_pl.model(x)).
But its output was complete garbage :frowning:

After some trial and error, I found a workaround as follows:

import torch

my_model = MyPlModule().model
state_dict = {}
for k, v in torch.load("xyz.pt")["state_dict"].items():
    state_dict[k[6:]] = v  # strip the "model." prefix
my_model.load_state_dict(state_dict)

This code works, but doesn’t look nice :-0
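(For what it's worth, the prefix stripping itself can be made a bit safer with str.removeprefix, available in Python 3.9+, which only removes the prefix when it is actually present, unlike k[6:], which would mangle any key that doesn't start with "model.". A minimal sketch with a dummy state dict standing in for the real checkpoint:

import torch  # only needed for the real checkpoint, not this sketch

# Dummy keys shaped like the ones Lightning saves for self.model
ckpt_keys = {"model.layer.weight": 1, "model.layer.bias": 2}

# Strip the "model." prefix only where it exists
state_dict = {k.removeprefix("model."): v for k, v in ckpt_keys.items()}
print(state_dict)  # {'layer.weight': 1, 'layer.bias': 2}

It still feels like a workaround, though.)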

Can you give me some advice on how to load and use a model for inference?

Best regards,