Tags: python, logging, pytorch, pytorch-lightning

What is called when `log_every_n_steps` of a PyTorch Lightning trainer is reached?


The PyTorch Lightning trainer offers a parameter `log_every_n_steps`, which the docs describe as controlling "How often to add logging rows". But what function is actually being called at that interval? We can do our own logging every step with the example code below:

def training_step(self, batch, batch_idx):
    # acc and recall are assumed to be computed earlier in the step
    self.log("performance", {"acc": acc, "recall": recall})

But is the trainer doing the same thing at every nth step?


Solution

  • log_every_n_steps makes the trainer write logged metrics every n training batches. It does not call a separate logging function of its own; the value simply throttles how often the metrics you log yourself with self.log are sent to the logger when on_step=True. If you want a less bloated log file, with results per epoch only, you can use self.log(metrics, on_step=False, on_epoch=True).
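
To make the interaction concrete, here is a minimal sketch, assuming a toy model and dataset (the names below are illustrative, not from the original question): the self.log(..., on_step=True) call runs on every batch, but the logger only receives those step-level values every log_every_n_steps batches, while on_epoch=True additionally logs one aggregated value per epoch.

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.cross_entropy(self.layer(x), y)
        # Called on every batch; the logger only receives a step-level row
        # every `log_every_n_steps` batches. on_epoch=True also writes one
        # aggregated value at the end of each epoch.
        self.log("train_loss", loss, on_step=True, on_epoch=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 2, (256,)))
trainer = pl.Trainer(max_epochs=1, log_every_n_steps=10)
trainer.fit(LitModel(), DataLoader(dataset, batch_size=8))

With batch_size=8 and 256 samples there are 32 training batches per epoch, so in this sketch the step-level metric reaches the logger about three times per epoch (at steps 10, 20, 30), while the epoch-level aggregate is written once.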