Not the same result for log loss and cross entropy loss

The negative log-likelihood for logistic regression is given by […] This is also called the cross-entropy error function.

— Page 246, Machine Learning: A Probabilistic Perspective, 2012

So I tried it myself, and the results are not the same:

from sklearn.metrics import log_loss
y_true = [0, 0, 0, 0]
y_pred = [0.5, 0.5, 0.5, 0.5]
log_loss(y_true, y_pred, labels=[0, 1]) # 0.6931471805599453

from math import log2
def cross_entropy(p, q):
    return -sum([p[i]*log2(q[i]) for i in range(len(p))])
cross_entropy(y_true, y_pred)  # -0.0

Why?

First, sklearn.metrics.log_loss applies the natural logarithm (math.log or numpy.log) to the probabilities, not the base-2 logarithm.
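
The base matters here: the same probability gives different numbers under the two logarithms. A minimal check using only the standard library:

from math import log, log2

p = 0.5
-log(p)   # 0.6931471805599453 -- natural log, which is what log_loss uses
-log2(p)  # 1.0 -- base-2 log, which is what the cross_entropy above uses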

Second, you got -0.0 because the log probabilities are multiplied by the zeros in y_true. For the binary case, the log loss is

-log P(y_true | y_pred) = -(y_true*log(y_pred) + (1 - y_true)*log(1 - y_pred))
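
Plugging a single sample from your example into that formula (a small sketch reusing y_true = 0, y_pred = 0.5) shows where the 0.6931 comes from: the y_true*log(y_pred) term is zeroed out and only -log(1 - y_pred) remains.

from math import log

y_t, y_p = 0, 0.5
-(y_t*log(y_p) + (1 - y_t)*log(1 - y_p))  # 0.6931471805599453, i.e. -log(0.5)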

Third, your code never averages the per-sample log loss.

from math import log

def bin_cross_entropy(p, q):
    # mean binary cross-entropy (log loss) over the n samples
    n = len(p)
    return -sum(p[i]*log(q[i]) + (1-p[i])*log(1-q[i]) for i in range(n)) / n

bin_cross_entropy(y_true, y_pred)  # 0.6931471805599453
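
As a sanity check, this agrees with sklearn.metrics.log_loss on a less degenerate example (the labels and probabilities below are made up purely for illustration):

from sklearn.metrics import log_loss

y_true = [0, 1, 1, 0]
y_pred = [0.1, 0.8, 0.6, 0.3]
log_loss(y_true, y_pred)           # ~0.2990
bin_cross_entropy(y_true, y_pred)  # ~0.2990, same value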