I have trained an Inception ResNet v2 model on a dataset for 61,000 steps so far, with the following values in the model's configuration file:
adam_optimizer: {
  learning_rate: {
    manual_step_learning_rate {
      initial_learning_rate: 0.0003
      schedule {
        step: 150000
        learning_rate: .0001
      }
    }
  }
}
Now, if I want to reduce the learning rate of my model from this point on, will making the change below:
adam_optimizer: {
  learning_rate: {
    manual_step_learning_rate {
      initial_learning_rate: 0.0003
      schedule {
        step: 60000
        learning_rate: .0001
      }
    }
  }
}
and restarting training from the checkpoint actually reduce the learning rate from 0.0003 to 0.0001, since the number of steps it has already trained for is greater than 60,000?
If not, is there any other way to achieve this?
One possible way is to use the model file you have already trained for 61,000 steps as the fine-tune checkpoint; you can then set the learning rate to whatever you like. In this case you are essentially training from step 1 again, just with the weights initialized from your 61,000-step checkpoint.
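For concreteness, here is a minimal sketch of how the relevant part of the pipeline config might look for that approach, assuming the standard Object Detection API train_config; the checkpoint path and the new learning-rate values are placeholders, not values taken from your setup:

train_config: {
  # Placeholder path: point this at the checkpoint saved after 61000 steps
  fine_tune_checkpoint: "path/to/train_dir/model.ckpt-61000"
  # In older versions of the API this flag restores the full detection model;
  # newer versions use fine_tune_checkpoint_type: "detection" instead
  from_detection_checkpoint: true
  optimizer {
    adam_optimizer: {
      learning_rate: {
        manual_step_learning_rate {
          # Example values only: start at the lower rate right away
          initial_learning_rate: 0.0001
          schedule {
            step: 150000
            learning_rate: .00001
          }
        }
      }
    }
  }
}

Note that because the global step starts from 1 again, any step boundaries in the schedule are counted from the restart, not from the original 61,000 steps.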