How to use the tensors inside a Keras custom loss function?
I need to train a model with a custom loss function, which should also update some external function right after the prediction, like this:
def loss_fct(y_true, y_pred):
    global feeder
    # Change values of feeder given y_pred
    for value in y_pred:
        feeder.do_something(value)
    return K.mean(y_true - y_pred, axis=-1)
But this does not work, because TF cannot iterate over a tensor inside AutoGraph:
OperatorNotAllowedInGraphError: iterating over `tf.Tensor` is not allowed: AutoGraph did convert this function. This might indicate you are trying to use an unsupported feature.
My model looks like this:
model = Sequential()
model.add(Input(shape=(DIM, )))
model.add(Dense(DIM, activation=None))
model.add(Dense(16, activation=None))
model.add(Dense(4, activation="softmax"))
model.compile(optimizer="adam", loss=loss_fct)
model.summary()
and it is trained like this:
model.fit(x=feeder.feed,
          epochs=18,
          verbose=1,
          callbacks=None,
          )
where feeder.feed is a generator that yields 2 NumPy arrays.
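For context, here is a minimal stand-in for such a feeder. The `Feeder` class, its `do_something` method, the array shapes, and `DIM = 8` are all illustrative assumptions, not the original code:

```python
import numpy as np

DIM = 8  # assumed input dimensionality


class Feeder:
    """Illustrative stand-in: yields (inputs, targets) NumPy array pairs."""

    def do_something(self, value):
        # Placeholder for the external update driven by each prediction.
        pass

    def feed(self):
        # Endless generator of (x, y) batches, as expected by model.fit.
        while True:
            x = np.random.rand(32, DIM).astype("float32")  # batch of inputs
            y = np.random.rand(32, 4).astype("float32")    # batch of targets
            yield x, y


feeder = Feeder()
x_batch, y_batch = next(feeder.feed())
```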
After a lot of research, it turns out that nothing is wrong with the approach itself. Rather, it is a TensorFlow >= 2.2.0 behavior: model.fit compiles the training step into a graph by default, so the loss function does not run eagerly and its tensors cannot be iterated.
Finally, to solve this, compile the model with model.compile(..., run_eagerly=True)
so that the tensors can be iterated over and accessed during training.
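Putting it together, a minimal runnable sketch of the fix (the `collected` list stands in for the external feeder updates, and `DIM = 8` with random data are assumptions for demonstration):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

DIM = 8  # assumed input dimensionality

collected = []  # stands in for feeder.do_something


def loss_fct(y_true, y_pred):
    # With run_eagerly=True, y_pred is an EagerTensor, so iterating works.
    for value in y_pred:
        collected.append(value.numpy())  # one row of predictions per sample
    return K.mean(y_true - y_pred, axis=-1)


model = Sequential([
    tf.keras.Input(shape=(DIM,)),
    Dense(DIM, activation=None),
    Dense(16, activation=None),
    Dense(4, activation="softmax"),
])
# run_eagerly=True disables graph compilation of the train step,
# so the loss runs as plain Python and its tensors can be iterated.
model.compile(optimizer="adam", loss=loss_fct, run_eagerly=True)

x = np.random.rand(32, DIM).astype("float32")
y = np.random.rand(32, 4).astype("float32")
model.fit(x, y, epochs=1, verbose=0)
```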