How to calculate cost for softmax regression with pytorch

I want to calculate the cost for softmax regression. The cost function to be calculated is given at the bottom of the page.

With numpy I can get the cost as follows:

"""
X.shape = (300, 2)  # floats
y.shape = (300,)    # integer class labels in {0, 1, 2}
W.shape = (2, 3)
b.shape = (3,)
"""
import numpy as np
np.random.seed(100)

# Data and labels
X = np.random.randn(300,2)
y = np.ones(300)
y[0:100] = 0
y[200:300] = 2
y = y.astype(int)  # np.int was removed from numpy; the builtin int works

# weights and bias
W = np.random.randn(2,3)
b = np.random.randn(3)

N = X.shape[0]
scores = np.dot(X, W) + b
hyp = np.exp(scores - np.max(scores, axis=1, keepdims=True))  # shift each row for numerical stability
probs = hyp / np.sum(hyp, axis=1, keepdims=True)              # row-wise softmax over the classes
logprobs = np.log(probs[range(N), y])                         # log-probability of each sample's true class
cost_data = -1/N * np.sum(logprobs)

print("hyp.shape = {}".format(hyp.shape)) # hyp.shape = (300, 3)
print(cost_data)
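
To convince yourself the vectorized numpy cost is right, here is a minimal sanity check (a sketch reusing N, y, probs and cost_data from the snippet above): every softmax row must sum to 1, and an explicit per-sample loop must reproduce the same value.

assert np.allclose(probs.sum(axis=1), 1.0)  # each row is a distribution over the 3 classes

cost_loop = 0.0
for i in range(N):
    cost_loop += -np.log(probs[i, y[i]])  # negative log-likelihood of sample i's true class
cost_loop /= N
assert np.isclose(cost_loop, cost_data)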

However, when I try the same with torch, I cannot get this. So far I have this:

"""
X.shape = (300, 2)  # floats
y.shape = (300,)    # integer class labels in {0, 1, 2}
W.shape = (2, 3)
b.shape = (3,)
"""
import numpy as np
import torch
from torch.autograd import Variable
np.random.seed(100)


# Data and labels
X = np.random.randn(300,2)
y = np.ones(300)
y[0:100] = 0
y[200:300] = 2
y = y.astype(int)  # np.int was removed from numpy; the builtin int works

X = Variable(torch.from_numpy(X),requires_grad=True).type(torch.FloatTensor)
y = Variable(torch.from_numpy(y),requires_grad=True).type(torch.LongTensor)

# weights and bias
W = Variable(torch.randn(2,3),requires_grad=True)
b = Variable(torch.randn(3),requires_grad=True)

N = X.shape[0]
scores = torch.mm(X, W) + b
hyp = torch.exp(scores - torch.max(scores, dim=1, keepdim=True)[0])  # shift each row for numerical stability
probs = hyp / torch.sum(hyp, dim=1, keepdim=True)                    # row-wise softmax over the classes
correct_probs = probs[range(N), y]  # got problem HERE
# logprobs = torch.log(correct_probs)
# cost_data = -1/N * torch.sum(logprobs)

# print(cost_data)

I am having a problem calculating the correct probabilities of the classes.

How can we solve this problem and obtain the correct cost value?

The cost function to be calculated is the averaged cross-entropy (this is what the numpy code above computes):

$$J(W, b) = -\frac{1}{N} \sum_{i=1}^{N} \log \frac{e^{s_{i,\,y_i}}}{\sum_{k=1}^{C} e^{s_{i,k}}}, \qquad s = XW + b$$

Your problem is that you cannot use range(N) with pytorch; use the slice 0:N instead:

hyp = torch.exp(scores - torch.max(scores, dim=1, keepdim=True)[0])
probs = hyp / torch.sum(hyp, dim=1, keepdim=True)  # row-wise softmax, as in the numpy version
correct_probs = probs[0:N,y] # problem solved
logprobs = torch.log(correct_probs)
cost_data = -1/N * torch.sum(logprobs)

Another point is that your labels y do not need gradients; you should rather have:

y = Variable(torch.from_numpy(y),requires_grad=False).type(torch.LongTensor)
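
As a side note, in recent PyTorch releases (0.4 and later, where Variable is merged into Tensor) integer-array indexing such as probs[torch.arange(N), y] works directly, and torch.nn.functional.cross_entropy computes this exact cost from the raw scores. A minimal self-contained sketch, assuming such a version:

import torch
import torch.nn.functional as F

torch.manual_seed(100)
N, C = 300, 3
scores = torch.randn(N, C)     # raw class scores (logits)
y = torch.randint(0, C, (N,))  # integer class labels

# Manual cost: row-wise softmax, then pick each sample's true-class probability.
probs = torch.softmax(scores, dim=1)
manual = -torch.log(probs[torch.arange(N), y]).mean()

# Built-in: cross_entropy fuses log-softmax and the averaged negative log-likelihood.
builtin = F.cross_entropy(scores, y)
print(manual.item(), builtin.item())  # the two values agree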