Multiple scalars (e.g. train and valid loss) in same Tensorboard graph

Hi everybody,

I’m having some trouble drawing my loss curves. Can someone tell me how to log my train and valid loss in a single graph? From the corresponding Stack Overflow question [1]:

With PyTorch’s TensorBoard SummaryWriter I can log my train and valid loss in a single TensorBoard graph like this:

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter()

# Both series share the main tag 'loss', so TensorBoard
# renders them in a single graph.
for i in range(1, 100):
    writer.add_scalars('loss', {'train': 1 / i}, i)

for i in range(1, 100):
    writer.add_scalars('loss', {'valid': 2 / i}, i)

(Screenshot: both curves rendered in a single TensorBoard graph.)

How can I achieve the same with PyTorch Lightning’s default TensorBoard logger?

def training_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> Tensor:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.log('loss/train', loss.item())  # creates a separate 'loss/train' graph

    return loss

def validation_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> None:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.log('loss/valid', loss.item(), on_step=True)  # creates a separate 'loss/valid' graph

Thanks in advance!

Solved on Stack Overflow [1]. The solution is to call add_scalars() on self.logger.experiment (the underlying SummaryWriter) instead of using self.log().

[1] PyTorch Lightning: Multiple scalars (e.g. train and valid loss) in same Tensorboard graph - Stack Overflow


Hello, sorry to revive your post.

An issue I have with the above solution is that scalars logged this way do not get reduced (e.g. averaged) the way self.log() values do, so I am getting sawtooth-like graphs (see attached image). Any leads on how to correct this?
