Access loss and model in a custom callback
I read https://www.tensorflow.org/guide/keras/custom_callback, but I can't work out how to get at all the other parameters.
Here is my code:
(hits, ndcgs) = evaluate_model(model, testRatings, testNegatives, topK, evaluation_threads)
hr, ndcg, loss = np.array(hits).mean(), np.array(ndcgs).mean(), hist.history['loss'][0]
print('Iteration %d [%.1f s]: HR = %.4f, NDCG = %.4f, loss = %.4f [%.1f s]'
      % (epoch, t2-t1, hr, ndcg, loss, time()-t2))
if hr > best_hr:
    best_hr, best_ndcg, best_iter = hr, ndcg, epoch
    if args.out > 0:
        model.save(model_out_file, overwrite=True)
As you can see, I need model, hist, and model.save. Is there a way to use these three inside a custom callback, so that I can move all of this into one?
class CustomCallback(keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        keys = list(logs.keys())
        print("End epoch {}; got log keys: {}".format(epoch, keys))
The model is an attribute of tf.keras.callbacks.Callback, so you can access it directly with self.model. For the value of the loss, you can use the logs dict that is passed to the methods of tf.keras.callbacks.Callback; it contains a key named "loss".
If you need access to other variables (ones that do not change during training), you can store them as instance variables of the callback by defining an __init__ method.
class CustomCallback(keras.callbacks.Callback):
    def __init__(self, testRatings, testNegatives, topK, evaluation_threads):
        super().__init__()
        self.testRatings = testRatings
        self.testNegatives = testNegatives
        self.topK = topK
        self.evaluation_threads = evaluation_threads

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        current_loss = logs.get("loss")
        if current_loss is not None:
            print("my_loss: ", current_loss)
        print("my_model", self.model)
        # the attributes are accessible with self
        print("my topK attribute", self.topK)

# you can then create the callback by passing the correct attributes
my_callback = CustomCallback(testRatings, testNegatives, topK, evaluation_threads)
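If it helps to see why self.model is available, here is a dependency-free sketch of the plumbing that fit() performs behind the scenes (a simplified stand-in, not real Keras code; the names and the toy "model" string are illustrative):

```python
# Minimal stand-in for the Keras callback plumbing (no TensorFlow needed).
# Keras calls callback.set_model(model) before training starts, which is
# why self.model exists inside on_epoch_end, and it passes each epoch's
# metrics as the `logs` dict.
class Callback:  # simplified sketch of tf.keras.callbacks.Callback
    def set_model(self, model):
        self.model = model

class CustomCallback(Callback):
    def __init__(self, topK):
        self.topK = topK  # extra variable injected via __init__

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # the model, the logged loss, and the injected variable are all reachable
        return (self.model, logs.get("loss"), self.topK)

cb = CustomCallback(topK=10)
cb.set_model("my_model")                   # done for you by fit() in real Keras
print(cb.on_epoch_end(0, {"loss": 0.25}))  # → ('my_model', 0.25, 10)
```

In real Keras you never call set_model yourself; passing the callback in fit(..., callbacks=[my_callback]) is enough.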
Note: if what you want is to evaluate the model between epochs and save it whenever it achieves its best metric, I suggest you look at:
- the fit function, where you can actually provide a validation set
- the metrics module, which provides metrics computed on both the training and validation sets
- the ModelCheckpoint callback, which saves the model at each epoch and, with the option save_best_only, keeps only the best weights
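To give a feel for what save_best_only does, here is a sketch of the per-epoch comparison ModelCheckpoint makes (pure Python, mode="max" as you would use for a hit ratio; this is an illustration of the logic, not the actual implementation):

```python
# Sketch of the decision ModelCheckpoint(save_best_only=True) makes each
# epoch for mode="max": write a checkpoint only when the monitored value
# improves on the best value seen so far.
def epochs_that_save(monitored_values):
    best = float("-inf")
    saved = []
    for epoch, value in enumerate(monitored_values):
        if value > best:
            best = value
            saved.append(epoch)  # ModelCheckpoint would write the file here
    return saved

print(epochs_that_save([0.51, 0.55, 0.54, 0.60]))  # → [0, 1, 3]
```

Epoch 2 is skipped because 0.54 does not beat the previous best of 0.55, so at the end of training only the best weights remain on disk.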