I'm training a model with this piece of pipeline.config:
optimizer {
  momentum_optimizer: {
    learning_rate: {
      cosine_decay_learning_rate {
        learning_rate_base: 0.3
        total_steps: 200000
        warmup_learning_rate: 0.13333
        warmup_steps: 2000
      }
    }
    momentum_optimizer_value: 0.9
  }
  use_moving_average: false
}
After 200k steps (not epochs) the learning rate falls to 0.
two questions:
I'm using TensorFlow 1.15, the Object Detection API, and Python 3.6.
You can't. You can only do that with ExponentialDecayLearningRate. See the source:
message ExponentialDecayLearningRate {
  optional float initial_learning_rate = 1 [default = 0.002];
  optional uint32 decay_steps = 2 [default = 4000000];
  optional float decay_factor = 3 [default = 0.95];
  optional bool staircase = 4 [default = true];
  optional float burnin_learning_rate = 5 [default = 0.0];
  optional uint32 burnin_steps = 6 [default = 0];
  optional float min_learning_rate = 7 [default = 0.0];  // <-- sets the floor
}
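If a non-zero floor is what you need, you can switch the optimizer block to exponential decay and set min_learning_rate. The values below are purely illustrative, not a tuned recommendation:

```
optimizer {
  momentum_optimizer: {
    learning_rate: {
      exponential_decay_learning_rate {
        initial_learning_rate: 0.3
        decay_steps: 10000
        decay_factor: 0.95
        min_learning_rate: 0.0001  # decay stops here instead of reaching 0
      }
    }
    momentum_optimizer_value: 0.9
  }
  use_moving_average: false
}
```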
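As for why your current config bottoms out at zero: cosine decay is designed to anneal the rate all the way down by total_steps. A rough sketch of the schedule your fields describe (this approximates the object_detection implementation; the real code has extra options such as hold_base_rate_steps):

```python
import math

def cosine_decay_with_warmup(step, learning_rate_base=0.3, total_steps=200000,
                             warmup_learning_rate=0.13333, warmup_steps=2000):
    """Approximate cosine-with-warmup schedule for the config values above."""
    if step < warmup_steps:
        # Linear warmup from warmup_learning_rate up to learning_rate_base.
        slope = (learning_rate_base - warmup_learning_rate) / warmup_steps
        return warmup_learning_rate + slope * step
    # Cosine decay from learning_rate_base down to 0 at total_steps.
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return 0.5 * learning_rate_base * (1 + math.cos(math.pi * progress))

print(cosine_decay_with_warmup(0))       # 0.13333 (warmup start)
print(cosine_decay_with_warmup(2000))    # 0.3 (peak, end of warmup)
print(cosine_decay_with_warmup(200000))  # 0.0 (schedule bottoms out at zero)
```

Since the cosine term reaches -1 at total_steps, the rate is exactly zero there; there is no minimum-rate field in CosineDecayLearningRate to stop it early.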