What is the problem with my implementation of the cross-entropy function?
I am learning neural networks and I want to write a function cross_entropy in Python. It is defined as

CE = -\frac{1}{N} \sum_{i=1}^{N} \sum_{j=1}^{k} t_{i,j} \log(p_{i,j})

where N is the number of samples, k is the number of classes, log is the natural logarithm, t_{i,j} is 1 if sample i is in class j and 0 otherwise, and p_{i,j} is the predicted probability that sample i is in class j. To avoid numerical issues with the logarithm, the predictions should be clipped to the range [10^{−12}, 1 − 10^{−12}].

Following the description above, I wrote the code below by clipping the predictions to [epsilon, 1 − epsilon] and then computing cross_entropy according to the formula above.
import numpy as np

def cross_entropy(predictions, targets, epsilon=1e-12):
    """
    Computes cross entropy between targets (encoded as one-hot vectors)
    and predictions.
    Input: predictions (N, k) ndarray
           targets (N, k) ndarray
    Returns: scalar
    """
    predictions = np.clip(predictions, epsilon, 1. - epsilon)
    ce = - np.mean(np.log(predictions) * targets)
    return ce
The following code is used to check whether the function cross_entropy is correct.
predictions = np.array([[0.25, 0.25, 0.25, 0.25],
                        [0.01, 0.01, 0.01, 0.96]])
targets = np.array([[0, 0, 0, 1],
                    [0, 0, 0, 1]])
ans = 0.71355817782  # Correct answer
x = cross_entropy(predictions, targets)
print(np.isclose(x, ans))
The output of the code above is False, which means my code defining cross_entropy is incorrect. I then printed the result of cross_entropy(predictions, targets): it gives 0.178389544455, while the correct result should be ans = 0.71355817782. Could anyone help me check what is wrong with my code?
You are not far off at all, but remember that you are taking the average of N sums, where N = 2 (in this case). So your code could read:
def cross_entropy(predictions, targets, epsilon=1e-12):
    """
    Computes cross entropy between targets (encoded as one-hot vectors)
    and predictions.
    Input: predictions (N, k) ndarray
           targets (N, k) ndarray
    Returns: scalar
    """
    predictions = np.clip(predictions, epsilon, 1. - epsilon)
    N = predictions.shape[0]
    ce = -np.sum(targets * np.log(predictions + 1e-9)) / N
    return ce
predictions = np.array([[0.25, 0.25, 0.25, 0.25],
                        [0.01, 0.01, 0.01, 0.96]])
targets = np.array([[0, 0, 0, 1],
                    [0, 0, 0, 1]])
ans = 0.71355817782  # Correct answer
x = cross_entropy(predictions, targets)
print(np.isclose(x, ans))
Here, I think it is a little clearer if you stick with np.sum(). Also, I added 1e-9 inside np.log() to avoid the possibility of log(0) in the computation. Hope this helps!
Note: as per @Peter's comment, the 1e-9 offset is indeed redundant if your epsilon value is greater than 0.
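As a quick sanity check, both numbers from the question can be reproduced by hand: only the true-class probabilities contribute to the loss, and np.mean divides their summed log-losses by all N*k = 8 entries of the array, while the formula divides by N = 2 only.

import numpy as np

predictions = np.array([[0.25, 0.25, 0.25, 0.25],
                        [0.01, 0.01, 0.01, 0.96]])
targets = np.array([[0, 0, 0, 1],
                    [0, 0, 0, 1]])

# Per-sample losses: only the entries where the target is 1 contribute.
losses = -np.log(predictions[targets == 1])  # [-log(0.25), -log(0.96)]

print(losses.sum() / 2)  # ~0.71355817782  (divide by N = 2, the correct answer)
print(losses.sum() / 8)  # ~0.178389544455 (divide by N*k = 8, what np.mean computes)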
import numpy as np
from scipy.stats import entropy, truncnorm

def cross_entropy(x, y):
    """ Computes cross entropy between two distributions.
    Input: x: iterable of N non-negative values
           y: iterable of N non-negative values
    Returns: scalar
    """
    x = np.array(x, dtype=float)
    y = np.array(y, dtype=float)
    if np.any(x < 0) or np.any(y < 0):
        raise ValueError('Negative values exist.')
    # Force to proper probability mass function.
    x /= np.sum(x)
    y /= np.sum(y)
    # Ignore zero 'y' elements.
    mask = y > 0
    x = x[mask]
    y = y[mask]
    ce = -np.sum(x * np.log(y))
    return ce

def cross_entropy_via_scipy(x, y):
    """ SEE: https://en.wikipedia.org/wiki/Cross_entropy """
    return entropy(x) + entropy(x, y)

x = truncnorm.rvs(0.1, 2, size=100)
y = truncnorm.rvs(0.1, 2, size=100)
print(np.isclose(cross_entropy(x, y), cross_entropy_via_scipy(x, y)))
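For context: scipy.stats.entropy(x) computes the Shannon entropy H(x) of x (after normalizing it to sum to 1), and entropy(x, y) computes the KL divergence D_KL(x ‖ y), so their sum is exactly the cross entropy

H(x, y) = H(x) + D_KL(x ‖ y) = -\sum_i x_i \log(y_i),

which is why the two functions above should agree (up to the masking of zero entries in y).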