
Re-setting the learning rate while training in PyTorch


I am training a model in PyTorch with a learning rate scheduler that decreases the learning rate over time. Using the scheduler, the learning rate fell from 0.0001 to 1e-5, and at a particular checkpoint I saved all the weights, parameters, learning rate values, etc. Now I want to resume training the model, but with a different learning rate, while keeping all the other saved values. How can I do this? This is the code for saving the checkpoint (I used the Adam optimizer):

    checkpoint = {
        'epoch': epoch + 1,
        'val_loss_min': val_loss['total'].avg,
        'state_dict': model.state_dict(),
        'optimizer': optimizer.state_dict(),
        'scheduler': scheduler.state_dict(),
    }
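
Presumably this dict is then written to disk with torch.save; a one-line sketch, reusing the args.SAVED_MODEL path that appears in the loading code below:

    # Write the checkpoint dict to disk (path reused from the loading snippet)
    torch.save(checkpoint, args.SAVED_MODEL)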

When loading the checkpoint, I used this code:

    checkpoint = torch.load(args.SAVED_MODEL)
    # Load the current epoch from the checkpoint
    epochs = checkpoint['epoch']
    # Load the model weights
    model.load_state_dict(checkpoint['state_dict'])
    # Load the optimizer state
    optimizer.load_state_dict(checkpoint['optimizer'])
    # Load the minimum validation loss
    val_loss_min = checkpoint['val_loss_min']
    # Load the scheduler state
    scheduler.load_state_dict(checkpoint['scheduler'])
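
For load_state_dict to work, the model, optimizer, and scheduler must already be constructed before the checkpoint is restored. A minimal sketch, where MyModel and the StepLR settings are assumptions for illustration, not details from the question:

    import torch

    model = MyModel()  # hypothetical model class; any nn.Module works
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    # Scheduler type and settings are assumed; load_state_dict above
    # restores whatever progress was saved in the checkpoint
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)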

Solution

  • You can change the learning rate of your optimizer by accessing its param_groups attribute. Whether you have one parameter group or several, the following works (after you have loaded the checkpoint state onto the optimizer):

    for g in optimizer.param_groups:
        g['lr'] = new_lr
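
    One caveat: if you also restore a scheduler whose schedule is computed from its stored base rates (e.g. CosineAnnealingLR or LambdaLR), the next scheduler.step() can overwrite the value you just set. A sketch of the full resume flow, with new_lr chosen arbitrarily:

        checkpoint = torch.load(args.SAVED_MODEL)
        model.load_state_dict(checkpoint['state_dict'])
        optimizer.load_state_dict(checkpoint['optimizer'])
        scheduler.load_state_dict(checkpoint['scheduler'])

        new_lr = 5e-5  # hypothetical new learning rate
        for g in optimizer.param_groups:
            g['lr'] = new_lr

        # Schedulers derived from _LRScheduler keep their starting rates in
        # base_lrs; updating them keeps new_lr from being recomputed away on
        # the next scheduler.step(). ReduceLROnPlateau has no base_lrs and
        # scales the current lr directly, so it needs no such fix.
        if hasattr(scheduler, 'base_lrs'):
            scheduler.base_lrs = [new_lr] * len(scheduler.base_lrs)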