Implement multiple losses in PL

Can someone help me implement multiple losses in Lightning? I have tried to combine softmax loss + center loss, but I don't think the gradients are backpropagated properly. The loss doesn't seem to be coming down.

def training_step(self, batch, batch_idx):
    inputs, target = batch
    features, output = self.forward(inputs)
    # top-1 predictions, used only for the GAP metric
    values_, indices_ = torch.topk(output, 1)
    values_ = values_.flatten()
    indices_ = indices_.flatten()
    GAP = torch.tensor(self.GAP(indices_, values_, target))
    softmax_loss = nn.CrossEntropyLoss()(output, target.long())
    loss = self.center_loss(features, target) * 5e-5 + softmax_loss.item()

Have you tried lowering your learning rate?

If you want to backprop through softmax_loss, you should remove .item(): it converts the tensor to a plain Python float, which detaches it from the computational graph, so no gradients flow through the cross-entropy term.
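Here is a minimal standalone sketch of the effect (toy tensors, not your actual model): a loss built from .item() carries no gradient history, while the plain tensor sum does.

```python
import torch

x = torch.tensor([2.0], requires_grad=True)

loss_tensor = (x ** 2).sum()        # stays in the graph
loss_float = (x ** 2).sum().item()  # plain Python float, detached

# Combining a detached term gives no path back to x
combined_bad = 3.0 * loss_tensor.detach() + loss_float
print(combined_bad.requires_grad)   # False -> backward() would do nothing for x

# Keeping everything as tensors lets gradients flow
combined_good = 3.0 * loss_tensor
combined_good.backward()
print(x.grad)                       # tensor([12.]) -> d/dx 3*x^2 = 6x = 12 at x=2
```

In your snippet the same applies: write the combined loss as `self.center_loss(features, target) * 5e-5 + softmax_loss` (no `.item()`), and make sure `training_step` returns that loss so Lightning can call backward on it.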