Track_grad_norm flag not working

I use the Trainer and TensorBoard logger like this, but I cannot find any gradient information logged in TensorBoard. Can anyone help?

import pytorch_lightning as pl
from pytorch_lightning import loggers as pl_loggers

logger = pl_loggers.TensorBoardLogger(expr_dir)
trainer = pl.Trainer(
    logger=logger,
    max_epochs=100,
    gpus=n_gpus,
    track_grad_norm=2,  # expect the gradient 2-norm to be logged
    distributed_backend='dp')
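As a workaround while debugging, you can compute and log the gradient norms yourself. A minimal sketch in plain PyTorch of the quantity I believe `track_grad_norm=2` tracks (the 2-norm of each parameter's gradient, combined into a total); in a LightningModule you could call a helper like this from the `on_after_backward` hook and log the result with `self.log`. The helper name is illustrative, not a Lightning API:

```python
import torch

def grad_2_norm(model: torch.nn.Module) -> float:
    # Total 2-norm over all parameter gradients: sqrt(sum of squared
    # per-parameter 2-norms). Parameters without gradients are skipped.
    norms = [p.grad.detach().norm(2) for p in model.parameters()
             if p.grad is not None]
    if not norms:
        return 0.0
    return torch.stack(norms).norm(2).item()

# Tiny smoke test: one backward pass through a linear layer.
model = torch.nn.Linear(4, 2)
loss = model(torch.randn(8, 4)).pow(2).mean()
loss.backward()
print(grad_2_norm(model))  # a positive float
```

If this manually logged value shows up in TensorBoard but the `track_grad_norm` curves do not, that would point to the flag (or its interaction with the `dp` backend) rather than the logger setup.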

Hello, my apologies for the late reply. We are slowly deprecating this forum in favor of the built-in GitHub version… Could we kindly ask you to recreate your question there, in Lightning Discussions?