Saving custom layer parameters

Hi there,

I am new to PyTorch Lightning.
I'm trying to add some custom layers to an existing model, and I want to check whether the custom layers' parameters are updating each epoch.
From reading the documentation (especially here), the on_epoch_start or on_epoch_end hooks look like they could work for me.
That is, if I write a function that saves the model parameters and attach it to on_epoch_start/on_epoch_end, I should be able to confirm that the parameters are changing.
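Concretely, this is roughly what I have in mind (a minimal sketch; custom_layer, the shapes, and the loss are just placeholders for my setup):

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.backbone = torch.nn.Linear(32, 32)      # stand-in for the existing model
        self.custom_layer = torch.nn.Linear(32, 32)  # the added layer I want to monitor
        self._prev_params = None

    def training_step(self, batch, batch_idx):
        x, y = batch
        return F.mse_loss(self.custom_layer(self.backbone(x)), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

    def on_epoch_end(self):
        # Snapshot the custom layer's parameters and compare with last epoch.
        current = {n: p.detach().clone()
                   for n, p in self.custom_layer.named_parameters()}
        if self._prev_params is not None:
            for name, now in current.items():
                delta = (now - self._prev_params[name]).abs().max().item()
                print(f"epoch {self.current_epoch}: max change in {name} = {delta:.3e}")
        self._prev_params = current
```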

However, PyTorch Lightning already implements a lot of useful functionality, so I wonder whether this is the best solution.
Could anyone give me comments or suggest another way to accomplish this?

Thank you.

You can use the track_grad_norm flag of the Trainer to confirm that the custom layer is receiving gradients.
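
For example (a sketch; this assumes a Lightning version before 2.0, where track_grad_norm is still a Trainer argument, and model is your LightningModule):

```python
import pytorch_lightning as pl

# Log the 2-norm of each parameter's gradient at every training step.
trainer = pl.Trainer(max_epochs=10, track_grad_norm=2)
trainer.fit(model)  # model: your LightningModule
```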

Thank you! Let me try the flag.
Does the flag generate a JSON file or something?

It plots directly to your TensorBoard logs.
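
With the default logger the event files end up under lightning_logs/, so you can view the gradient-norm curves with:

```bash
tensorboard --logdir lightning_logs
```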

OK. Thank you so much!