track_grad_norm flag not working: no gradient norms logged to TensorBoard

I set up the Trainer and TensorBoard logger as shown below, but I cannot find any gradient-norm information in TensorBoard, even though track_grad_norm=2 is set. Can anyone help?

import pytorch_lightning as pl
from pytorch_lightning import loggers as pl_loggers

logger = pl_loggers.TensorBoardLogger(expr_dir)
trainer = pl.Trainer(
  logger=logger,
  max_epochs=100,
  gpus=n_gpus,
  track_grad_norm=2,  # expected this to log the 2-norm of gradients
  distributed_backend='dp')
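For reference, here is a minimal sketch (plain PyTorch, no Lightning) of the quantity I expected track_grad_norm=2 to log: the per-parameter and total 2-norms of the gradients after backward. The tiny Linear model here is just an illustration, not my actual model.

```python
import torch

# Toy model and a dummy forward/backward pass to populate .grad.
model = torch.nn.Linear(4, 2)
x = torch.randn(8, 4)
loss = model(x).pow(2).mean()
loss.backward()

# Per-parameter 2-norms, keyed by parameter name.
norms = {name: p.grad.norm(2).item() for name, p in model.named_parameters()}

# Total 2-norm over all parameters (sqrt of sum of squared per-param norms).
total = torch.sqrt(sum(p.grad.norm(2) ** 2 for p in model.parameters())).item()
```

These are the scalars I was hoping to see under a gradient-norm tag in TensorBoard for each training step.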