Skip parameters in load_from_checkpoint

I have some parameters in my checkpoint that are unused and no longer needed in my network. But when I call load_from_checkpoint(), this of course throws an error because it finds parameters in the checkpoint that aren't in the model. How can I prevent this? Removing the parameters from the state_dict, by first manually loading the checkpoint with torch.load() and then deleting the items from the state_dict, seems to break the checkpoint, so it can no longer be read by load_from_checkpoint().

My Code:

    import re
    import torch

    # Load the raw checkpoint and delete the obsolete parameters from its state_dict.
    torch_checkpoint = torch.load(args.checkpoint)
    r1 = re.compile('model.up_compress_r2p_layers.*')
    r2 = re.compile('model.ds_compress_r2p_layers.*')
    for k in list(filter(r1.match, torch_checkpoint['state_dict'].keys())):
        del torch_checkpoint['state_dict'][k]
    for k in list(filter(r2.match, torch_checkpoint['state_dict'].keys())):
        del torch_checkpoint['state_dict'][k]

    # Fails: load_from_checkpoint expects a checkpoint path, not an already-loaded dict.
    model = FFB6DModule.load_from_checkpoint(torch_checkpoint)

The Error:

AttributeError: 'dict' object has no attribute 'seek'. You can only torch.load from a file that is seekable. Please pre-load the data into a buffer like io.BytesIO and try to load from it instead.

Using model = FFB6DModule.load_from_checkpoint(args.checkpoint, strict=False) is the solution. load_from_checkpoint() expects a checkpoint path (or a file-like object), not an already-loaded dict, which is why passing torch_checkpoint raised the AttributeError; strict=False tells Lightning to ignore checkpoint keys that no longer exist in the model.
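
For reference, a minimal sketch of the two approaches that should work, assuming FFB6DModule is a LightningModule and args.checkpoint is the path to the .ckpt file as in the question; the pruned_path name is just a hypothetical output location:

    import re
    import torch

    # Option 1: load directly from the original file and let Lightning drop the
    # checkpoint keys that no longer exist in the model.
    model = FFB6DModule.load_from_checkpoint(args.checkpoint, strict=False)

    # Option 2: prune the obsolete keys, save a new checkpoint file, and load that
    # file, since load_from_checkpoint needs a path (or file-like object), not a dict.
    ckpt = torch.load(args.checkpoint)
    r1 = re.compile('model.up_compress_r2p_layers.*')
    r2 = re.compile('model.ds_compress_r2p_layers.*')
    for k in [k for k in ckpt['state_dict'] if r1.match(k) or r2.match(k)]:
        del ckpt['state_dict'][k]
    pruned_path = 'pruned_checkpoint.ckpt'  # hypothetical output path
    torch.save(ckpt, pruned_path)
    model = FFB6DModule.load_from_checkpoint(pruned_path)

Option 2 only succeeds if every key left in the pruned state_dict still exists in the model; otherwise you are back to needing strict=False.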