Call optimizer_step inside train_epoch_end()

I am training a model where I need to compute the gradients of the loss with respect to the model parameters only after all the training batches have been processed. Can I use train_epoch_end(), pass it the losses from each individual batch, and call optimizer_step() there? Otherwise the model parameters will be updated after every batch.
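
Here is a rough sketch of what I have in mind, assuming this is PyTorch Lightning and that I switch to manual optimization (setting self.automatic_optimization = False) so I can control when the optimizer steps; the module and layer names are just placeholders, and I use the on_train_epoch_end hook to do the per-epoch update:

```python
import torch
from torch import nn
import pytorch_lightning as pl


class EpochLevelUpdateModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(10, 1)
        # Take control of when the optimizer steps.
        self.automatic_optimization = False

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        # Accumulate gradients for this batch, but do NOT step yet.
        self.manual_backward(loss)
        return loss

    def on_train_epoch_end(self):
        # Step once per epoch, using the gradients accumulated over all batches.
        opt = self.optimizers()
        opt.step()
        opt.zero_grad()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```

Is this the right way to do it, or is there a better pattern for stepping the optimizer only once per epoch?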