Understanding self.log()

Does self.log(xxx) respect the trainer parameter log_every_n_steps?
If I call self.log(xxx) inside training_step, does that mean nothing is logged until the required step is reached?
If xxx is expensive to compute, how should one handle this?

It depends. If self.log(xxx) is called with on_epoch=False, the value is effectively ignored on steps that don't line up with log_every_n_steps. But if on_epoch=True, the value is still accumulated every step and used to aggregate the result at epoch end, so it is computed regardless of the logging cadence.
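For the expensive-metric case, one option is to gate the computation itself on the step count, so you only pay for it on steps that would actually be logged. This is a sketch under assumptions: `should_log`, `compute_loss`, and `expensive_metric` are hypothetical names, and the exact internal check Lightning uses for its logging cadence may differ across versions.

```python
def should_log(global_step: int, log_every_n_steps: int) -> bool:
    """Approximate the Trainer's step-based logging cadence.

    Lightning emits step metrics only every `log_every_n_steps` batches;
    this reproduces that rhythm as an illustration, not the library's
    actual internal API.
    """
    return (global_step + 1) % log_every_n_steps == 0


# Hypothetical training_step sketch: skip the expensive metric entirely
# on steps where its value would not be logged anyway.
#
# def training_step(self, batch, batch_idx):
#     loss = self.compute_loss(batch)  # compute_loss is a hypothetical helper
#     if should_log(self.trainer.global_step, self.trainer.log_every_n_steps):
#         value = self.expensive_metric(batch)  # hypothetical expensive metric
#         self.log("expensive_metric", value, on_step=True, on_epoch=False)
#     return loss

# With log_every_n_steps=4, only every 4th step passes the gate:
print([step for step in range(10) if should_log(step, 4)])  # -> [3, 7]
```

Note that this trick only works for on_step logging; with on_epoch=True the metric feeds the epoch-level aggregate, so skipping steps would change the result.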