How to pass more arguments to the scheduler?
I'm working through the example in this link, but I'm not sure how the scheduler function receives both the epoch and the learning rate (lr). How are they passed in? And how can I pass more arguments?
I tried to follow the example, but I get an error saying the scheduler received an extra argument 'lr', so I'm not sure how to fix that.
You would usually use tf.keras.optimizers.schedules and pass the schedule directly to the model's optimizer. The link you are referring to is actually a callback that expects a scheduler function. Here is an example of tf.keras.optimizers.schedules.ExponentialDecay, based on the docs:
import tensorflow as tf

initial_learning_rate = 0.1

# Decay the learning rate by a factor of 0.96 every 100000 steps;
# staircase=True makes the decay discrete rather than continuous.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000,
    decay_rate=0.96,
    staircase=True)

# The schedule goes straight into the optimizer (model is assumed
# to be an already-built tf.keras model).
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr_schedule),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
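A schedule like this is called with the current optimizer step, not the epoch. If you want to sanity-check the decayed values outside of training, you can call it directly (a quick check, assuming eager execution):

# At step 0 the schedule returns the initial rate; at step 100000,
# one staircase decay has been applied (0.1 * 0.96 = 0.096).
print(lr_schedule(0).numpy())       # 0.1
print(lr_schedule(100000).numpy())  # 0.096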
Also, this link shows a good example of how to use a custom callable, which uses the initial learning rate defined in the optimizer; in that case it is 0.01.
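If you do want to stay with the tf.keras.callbacks.LearningRateScheduler callback, keep in mind that Keras itself calls your function with (epoch, lr) once per epoch, which is where those two values come from. Any extra arguments have to be bound beforehand, e.g. with functools.partial or a closure. A minimal sketch under those assumptions (decay_factor is a made-up extra parameter, and model, x_train and y_train are assumed to exist already):

import functools

import tensorflow as tf

def scheduler(epoch, lr, decay_factor):
    # Keras supplies epoch and lr; decay_factor is bound below via partial.
    if epoch < 10:
        return lr               # keep the optimizer's current rate
    return lr * decay_factor    # decay every epoch afterwards

callback = tf.keras.callbacks.LearningRateScheduler(
    functools.partial(scheduler, decay_factor=0.9))

model.fit(x_train, y_train, epochs=20, callbacks=[callback])

Note that the function the callback receives must still accept exactly (epoch, lr) after binding; an extra unbound parameter is what produces errors like the "extra argument 'lr'" you ran into.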