Does Lightning always return loss in training_step?

I saw in the PyTorch Lightning documentation that there is always a `return loss` in the `training_step` function, but nothing is returned from `validation_step` or `test_step`. Does `training_step` unconditionally have to return the loss? Could you give me a link if there is an explanation of this in the documentation?
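For context, the asymmetry the question points at can be sketched with a toy loop (plain Python, no Lightning dependency; `ToyModule` and `ToyTrainer` are hypothetical stand-ins, not Lightning APIs): the training loop consumes `training_step`'s returned loss to drive backpropagation, while the evaluation loop calls `validation_step` mainly for its side effects, so its return value can be ignored.

```python
# Hypothetical, simplified sketch of the training/validation asymmetry.
# Not real Lightning code: it only illustrates why training_step's
# return value matters while validation_step's often does not.

class ToyModule:
    def training_step(self, batch):
        loss = sum(batch) / len(batch)  # stand-in for a real loss
        return loss                     # the trainer consumes this value

    def validation_step(self, batch):
        metric = max(batch)             # would be logged, not backpropagated
        # no return needed: nothing downstream consumes it


class ToyTrainer:
    def fit(self, module, train_batches, val_batches):
        train_losses = []
        for batch in train_batches:
            loss = module.training_step(batch)
            train_losses.append(loss)   # a real trainer would call loss.backward()
        for batch in val_batches:
            module.validation_step(batch)  # return value ignored
        return train_losses


trainer = ToyTrainer()
losses = trainer.fit(ToyModule(), [[1, 2, 3]], [[4, 5, 6]])
print(losses)  # [2.0]
```

This is only an illustration of the convention; the authoritative answer lives in the Lightning documentation for `LightningModule`.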

Hello, my apologies for the late reply. We are slowly deprecating this forum in favor of the built-in GitHub version. Could we kindly ask you to recreate your question there: Lightning Discussions