Does the example for decaying the learning rate on the TensorFlow website actually decay the learning rate?

I was reading about decaying the learning rate and thought there might be an error in the documentation, so I wanted to confirm. It says the decay equation is:

decayed_learning_rate = learning_rate * decay_rate ^ (global_step / decay_steps)

However, if global_step = 0, I guess there is never any decay, right? But look at the example:

...
global_step = tf.Variable(0, trainable=False)
starter_learning_rate = 0.1
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                           100000, 0.96, staircase=True)
# Passing global_step to minimize() will increment it at each step.
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)
    .minimize(...my loss..., global_step=global_step)
)

It has global_step = tf.Variable(0, trainable=False), which is set to zero. Therefore, there is no decay. Is that the correct inference?
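
For reference, plugging global_step = 0 into the formula in plain Python confirms the multiplier is 1 at step 0 (a sketch of the formula only, not the TensorFlow op):

starter_learning_rate = 0.1
decay_rate = 0.96
decay_steps = 100000
global_step = 0

# decay_rate ** (0 / decay_steps) == decay_rate ** 0 == 1, so the decayed
# rate equals the starter rate for as long as global_step stays at 0
decayed = starter_learning_rate * decay_rate ** (global_step / decay_steps)
print(decayed)  # 0.1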

I thought there might be a caveat with integer division when staircase is set to True, but even with integer division there still seems to be no decay. Or am I misunderstanding what staircase does?
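
For what it's worth, staircase=True only changes the exponent from global_step / decay_steps to floor division, so the rate drops in discrete jumps every decay_steps steps; at step 0 both variants still return the starter rate. A plain-Python comparison using the numbers from the example (a sketch, not the TensorFlow op):

for step in (0, 50000, 100000, 200000):
    smooth  = 0.1 * 0.96 ** (step / 100000)   # staircase=False
    stepped = 0.1 * 0.96 ** (step // 100000)  # staircase=True (floor division)
    print(step, smooth, stepped)
# step 0 prints 0.1 for both: 0.96 ** 0 == 1 regardless of staircase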

The variable global_step is passed to minimize(), and it is incremented each time the training op learning_step is run.

It is even written in the comment in your code:

# Passing global_step to minimize() will increment it at each step.
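
A minimal runnable sketch of this, assuming TensorFlow 1.x and a hypothetical toy loss tf.square(x) in place of the elided "...my loss...":

import tensorflow as tf

global_step = tf.Variable(0, trainable=False)
starter_learning_rate = 0.1
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                           100000, 0.96, staircase=True)
x = tf.Variable(5.0)
loss = tf.square(x)  # hypothetical toy loss
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)
    .minimize(loss, global_step=global_step)
)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        sess.run(learning_step)
        print(sess.run(global_step))  # prints 1, 2, 3: incremented by minimize()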

In addition to Olivier's answer: the global step is also incremented in apply_gradients (which is one of the steps inside minimize).

If global_step was not None, that operation also increments global_step

Therefore, no matter how you optimize (calling minimize() directly or modifying the gradients yourself), the global step is incremented.
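
A sketch of the manual path, continuing the toy setup from the sketch above: even if the gradients are modified between compute_gradients() and apply_gradients() (the clipping here is a hypothetical modification), passing global_step to apply_gradients() still increments it once per run:

optimizer = tf.train.GradientDescentOptimizer(learning_rate)
grads_and_vars = optimizer.compute_gradients(loss)
# modify the gradients before applying them, e.g. clip them
clipped = [(tf.clip_by_value(g, -1.0, 1.0), v) for g, v in grads_and_vars]
train_op = optimizer.apply_gradients(clipped, global_step=global_step)
# running train_op increments global_step exactly as minimize() does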