What is the difference between cross-entropy and log loss error?

What is the difference between cross-entropy and log loss error? The formulas for the two seem very similar.

They are essentially the same; typically, we use the term log loss for binary classification problems and the more general cross-entropy (loss) for the general case of multi-class classification. But even this distinction is not applied consistently, and you will often find the two terms used interchangeably as synonyms.
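
To see why, here are the two formulas in standard notation (added here for reference). For a binary label $y \in \{0, 1\}$ and a predicted probability $p$, the log loss is

$$\ell(y, p) = -\bigl(y \log p + (1-y)\log(1-p)\bigr),$$

while the cross-entropy over $K$ classes, with one-hot targets $y_k$ and predicted probabilities $p_k$, is

$$\ell(\mathbf{y}, \mathbf{p}) = -\sum_{k=1}^{K} y_k \log p_k.$$

Setting $K = 2$ with $p_1 = p$ and $p_2 = 1 - p$ recovers the binary form, which is why the two coincide.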

From the Wikipedia entry for cross-entropy:

The logistic loss is sometimes called cross-entropy loss. It is also known as log loss

From the fast.ai wiki entry on log loss [link now dead]:

Log loss and cross-entropy are slightly different depending on the context, but in machine learning when calculating error rates between 0 and 1 they resolve to the same thing.

From the ML Cheatsheet:

Cross-entropy loss, or log loss, measures the performance of a classification model whose output is a probability value between 0 and 1.
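
As a quick numerical sanity check, here is a minimal NumPy sketch (the function names `binary_log_loss` and `cross_entropy` are mine, for illustration): computing the binary log loss directly, and computing the general cross-entropy on the two-class one-hot encoding of the same predictions, yields the same number.

```python
import numpy as np

def binary_log_loss(y, p, eps=1e-12):
    """Binary log loss for labels y in {0, 1} and predicted P(y=1) = p."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)  # avoid log(0)
    y = np.asarray(y, dtype=float)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def cross_entropy(Y, P, eps=1e-12):
    """General cross-entropy for one-hot targets Y and predicted class
    probabilities P, both of shape (n_samples, n_classes)."""
    P = np.clip(np.asarray(P, dtype=float), eps, 1 - eps)
    return -np.mean(np.sum(np.asarray(Y, dtype=float) * np.log(P), axis=1))

# The same binary problem expressed both ways:
y = np.array([1, 0, 1])           # true labels
p = np.array([0.9, 0.2, 0.6])     # predicted P(y=1)

Y = np.column_stack([1 - y, y])   # one-hot targets over the two classes
P = np.column_stack([1 - p, p])   # probabilities for class 0 and class 1

print(binary_log_loss(y, p))      # ~0.2798
print(cross_entropy(Y, P))        # same value
```

The clipping of probabilities away from exactly 0 and 1 mirrors what practical implementations do to avoid evaluating log(0).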