"Binary Cross Entropy" 的 Tensorflow 损失是多少?
What is the Tensorflow loss equivalent of "Binary Cross Entropy"?
I'm trying to rewrite a Keras graph as a Tensorflow graph, but I'd like to know which loss function is the equivalent of "Binary Cross Entropy". Is it tf.nn.softmax_cross_entropy_with_logits_v2?
Thanks a lot!
No. The implementation of binary_crossentropy with the tensorflow backend is defined here as:
@tf_export('keras.backend.binary_crossentropy')
def binary_crossentropy(target, output, from_logits=False):
  """Binary crossentropy between an output tensor and a target tensor.

  Arguments:
      target: A tensor with the same shape as `output`.
      output: A tensor.
      from_logits: Whether `output` is expected to be a logits tensor.
          By default, we consider that `output`
          encodes a probability distribution.

  Returns:
      A tensor.
  """
  # Note: nn.sigmoid_cross_entropy_with_logits
  # expects logits, Keras expects probabilities.
  if not from_logits:
    # transform back to logits
    epsilon_ = _to_tensor(epsilon(), output.dtype.base_dtype)
    output = clip_ops.clip_by_value(output, epsilon_, 1 - epsilon_)
    output = math_ops.log(output / (1 - output))
  return nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)
So it uses sigmoid_crossentropy, not softmax_crossentropy.
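In practice, if your Tensorflow graph produces raw (pre-sigmoid) logits, the drop-in equivalent is tf.nn.sigmoid_cross_entropy_with_logits. Below is a minimal sketch of the equivalence, assuming TF 2.x eager execution (in a TF 1.x graph you would evaluate the tensors in a session); the label and logit values are made up for illustration:

import tensorflow as tf

labels = tf.constant([[1.0, 0.0, 1.0]])
logits = tf.constant([[2.0, -1.0, 0.5]])  # raw, pre-sigmoid network output
probs = tf.sigmoid(logits)                # what a Keras sigmoid output layer would emit

# Keras backend BCE on probabilities (per-element losses)
keras_loss = tf.keras.backend.binary_crossentropy(labels, probs)

# Plain Tensorflow equivalent on the logits
tf_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)

print(keras_loss.numpy())  # approx. [[0.1269, 0.3133, 0.4741]]
print(tf_loss.numpy())     # same values, up to the epsilon clipping shown above

The only difference between the two comes from the epsilon clipping Keras applies before converting probabilities back to logits.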