Retraining for only a few steps

Scenario: I have a model trained on a dataset.

Whenever I encounter false positives or false negatives during testing, I want to use those samples to retrain my model. But rather than adding them to the entire dataset and retraining, I only want to train on that small number of samples (also augmenting the new data with a few existing samples), using a much smaller learning rate and freezing a few layers.

Is there a way to do this in PyTorch Lightning without rewriting much?

You can load a model from a checkpoint and pass new hyperparameters (this works as long as your LightningModule accepts them in __init__, typically alongside a self.save_hyperparameters() call):

retrain_model = Model.load_from_checkpoint(
    "pretrained.ckpt", 
    learning_rate=0.0001, 
    freeze=True
)

trainer.fit(retrain_model, incorrect_samples_datamodule)
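For the datamodule side, a minimal sketch of wrapping the misclassified samples (plus lightly augmented copies) in a small DataLoader, which can be passed to trainer.fit in place of a full DataModule. The names hard_x / hard_y and the noise-based augmentation are placeholders; substitute your real samples and augmentation pipeline:

```python
import torch
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

# Placeholder tensors standing in for the false positives / false negatives
# collected during testing.
hard_x = torch.randn(8, 3)
hard_y = torch.randint(0, 2, (8,))

# Toy augmentation: jitter the inputs with a little Gaussian noise,
# keeping the original labels.
aug_x = hard_x + 0.01 * torch.randn_like(hard_x)

# Combine the original hard samples with their augmented copies.
dataset = ConcatDataset([
    TensorDataset(hard_x, hard_y),
    TensorDataset(aug_x, hard_y),
])
loader = DataLoader(dataset, batch_size=4, shuffle=True)

# Then: trainer.fit(retrain_model, train_dataloaders=loader)
```

trainer.fit accepts a plain DataLoader via train_dataloaders, so you don't need a full DataModule subclass for a handful of samples.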

If you want to do this immediately after the initial training, you can just set the class parameters manually:

model.freeze = True
model.learning_rate = 0.0001

Then your model would just need some logic to set requires_grad=False on your desired layers when freeze is True. Hope this helps!
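As a sketch of that freeze logic in plain PyTorch (the same pattern would go in your LightningModule's __init__, with the optimizer construction in configure_optimizers). SmallNet and its layers are hypothetical stand-ins for your model:

```python
import torch
from torch import nn

class SmallNet(nn.Module):
    # Hypothetical model: freeze the backbone, leave the head trainable.
    def __init__(self, freeze=False, learning_rate=1e-4):
        super().__init__()
        self.learning_rate = learning_rate
        self.backbone = nn.Sequential(nn.Linear(8, 16), nn.ReLU())
        self.head = nn.Linear(16, 2)
        if freeze:
            # Disable gradients for the layers you want to keep fixed.
            for p in self.backbone.parameters():
                p.requires_grad = False

    def forward(self, x):
        return self.head(self.backbone(x))

model = SmallNet(freeze=True)
# Pass only the still-trainable parameters to the optimizer,
# using the smaller retraining learning rate.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=model.learning_rate)
```

Filtering the parameters before building the optimizer also means the optimizer holds no state for the frozen layers.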


Thank you Teddy, that makes sense. I'll try it and let you know.