How to add variables to progress bar in Keras?
I would like to monitor, e.g., the learning rate during training in Keras, both in the progress bar and in TensorBoard. I figure there must be a way to specify which variables are logged, but the Keras website offers no immediate clarification on this issue.
I guess it has something to do with creating a custom Callback, but shouldn't it be possible to modify the already existing progress bar callback?
It can be achieved via a custom metric. Take the learning rate as an example:
import numpy as np
from keras.layers import Input, Dense
from keras.models import Model
from keras.optimizers import Adam
from keras.callbacks import LearningRateScheduler, TensorBoard

def get_lr_metric(optimizer):
    # A "metric" that simply reports the optimizer's current learning rate.
    def lr(y_true, y_pred):
        return optimizer.lr
    return lr

x = Input((50,))
out = Dense(1, activation='sigmoid')(x)
model = Model(x, out)

optimizer = Adam(lr=0.001)
lr_metric = get_lr_metric(optimizer)
model.compile(loss='binary_crossentropy', optimizer=optimizer,
              metrics=['acc', lr_metric])

# reducing the learning rate by half every 2 epochs
cbks = [LearningRateScheduler(lambda epoch: 0.001 * 0.5 ** (epoch // 2)),
        TensorBoard(write_graph=False)]

X = np.random.rand(1000, 50)
Y = np.random.randint(2, size=1000)
model.fit(X, Y, epochs=10, callbacks=cbks)
The LR is printed in the progress bar:
Epoch 1/10
1000/1000 [==============================] - 0s 103us/step - loss: 0.8228 - acc: 0.4960 - lr: 0.0010
Epoch 2/10
1000/1000 [==============================] - 0s 61us/step - loss: 0.7305 - acc: 0.4970 - lr: 0.0010
Epoch 3/10
1000/1000 [==============================] - 0s 62us/step - loss: 0.7145 - acc: 0.4730 - lr: 5.0000e-04
Epoch 4/10
1000/1000 [==============================] - 0s 58us/step - loss: 0.7129 - acc: 0.4800 - lr: 5.0000e-04
Epoch 5/10
1000/1000 [==============================] - 0s 58us/step - loss: 0.7124 - acc: 0.4810 - lr: 2.5000e-04
Epoch 6/10
1000/1000 [==============================] - 0s 63us/step - loss: 0.7123 - acc: 0.4790 - lr: 2.5000e-04
Epoch 7/10
1000/1000 [==============================] - 0s 61us/step - loss: 0.7119 - acc: 0.4840 - lr: 1.2500e-04
Epoch 8/10
1000/1000 [==============================] - 0s 61us/step - loss: 0.7117 - acc: 0.4880 - lr: 1.2500e-04
Epoch 9/10
1000/1000 [==============================] - 0s 59us/step - loss: 0.7116 - acc: 0.4880 - lr: 6.2500e-05
Epoch 10/10
1000/1000 [==============================] - 0s 63us/step - loss: 0.7115 - acc: 0.4880 - lr: 6.2500e-05
Then you can visualize the LR curve in TensorBoard.
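The same per-epoch values are also available programmatically: fit returns a History object whose history dict holds one list per logged metric, including the custom lr metric defined above. A minimal sketch:

history = model.fit(X, Y, epochs=10, callbacks=cbks)
print(history.history['lr'])  # one learning-rate value per epoch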
Another way (in fact an encouraged one) to pass custom values to TensorBoard is to subclass the keras.callbacks.TensorBoard class. This lets you apply custom functions to obtain the desired metrics and pass them directly to TensorBoard.
Here is an example for the learning rate of the Adam optimizer:
import numpy as np
from keras import backend as K
from keras.callbacks import TensorBoard

class SubTensorBoard(TensorBoard):
    def __init__(self, *args, **kwargs):
        super(SubTensorBoard, self).__init__(*args, **kwargs)

    def lr_getter(self):
        # Get vals
        decay = self.model.optimizer.decay
        lr = self.model.optimizer.lr
        iters = self.model.optimizer.iterations  # only this should not be const
        beta_1 = self.model.optimizer.beta_1
        beta_2 = self.model.optimizer.beta_2
        # Calculate the effective rate the same way Adam does internally
        lr = lr * (1. / (1. + decay * K.cast(iters, K.dtype(decay))))
        t = K.cast(iters, K.floatx()) + 1
        lr_t = lr * (K.sqrt(1. - K.pow(beta_2, t)) / (1. - K.pow(beta_1, t)))
        return np.float32(K.eval(lr_t))

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        logs.update({"lr": self.lr_getter()})
        super(SubTensorBoard, self).on_epoch_end(epoch, logs)
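The subclass is a drop-in replacement for the stock callback; a minimal usage sketch, reusing the model, X and Y from the first answer:

model.fit(X, Y, epochs=10,
          callbacks=[SubTensorBoard(write_graph=False)])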
I came to this question because I wanted to log more variables in the Keras progress bar. This is what I did after reading the answers here:
import tensorflow as tf

class UpdateMetricsCallback(tf.keras.callbacks.Callback):
    def on_batch_end(self, batch, logs):
        logs.update({'my_batch_metric': 0.1, 'my_other_batch_metric': 0.2})

    def on_epoch_end(self, epoch, logs):
        logs.update({'my_epoch_metric': 0.1, 'my_other_epoch_metric': 0.2})

model.fit(...,
          callbacks=[UpdateMetricsCallback()]
          )
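In practice you would inject computed values rather than constants. A hedged sketch that logs the mean absolute weight of the model's last layer (the layer index and metric name are my own choices, not from the answers above):

import numpy as np
import tensorflow as tf

class WeightStatCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # Assumption: the last layer's first weight tensor is its kernel
        kernel = self.model.layers[-1].get_weights()[0]
        logs.update({'w_mean_abs': float(np.abs(kernel).mean())})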
Hope it helps.