torch.nn.Embedding has a runtime error

I want to use torch.nn.Embedding. I followed the code from the Embedding documentation. Here is the code:

import torch
import torch.nn as nn

# an Embedding module containing 10 tensors of size 3
embedding = nn.Embedding(10, 3)
# a batch of 2 samples of 4 indices each
input = torch.LongTensor([[1,2,4,5],[4,3,2,9]])
embedding(input)

The documentation says you should get output like this:

tensor([[[-0.0251, -1.6902,  0.7172],
         [-0.6431,  0.0748,  0.6969],
         [ 1.4970,  1.3448, -0.9685],
         [-0.3677, -2.7265, -0.1685]],

        [[ 1.4970,  1.3448, -0.9685],
         [ 0.4362, -0.4004,  0.9400],
         [-0.6431,  0.0748,  0.6969],
         [ 0.9124, -2.3616,  1.1151]]])

But I do not get this output. Instead, I get this error:

Traceback (most recent call last):
  File "/home/mahsa/PycharmProjects/PyTorch_env_project/PyTorchZeroToAll-master/temporary.py", line 12, in <module>
    embedding(input)
  File "/home/mahsa/anaconda3/envs/pytorch_env/lib/python3.5/site-packages/torch/nn/modules/module.py", line 224, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/mahsa/anaconda3/envs/pytorch_env/lib/python3.5/site-packages/torch/nn/modules/sparse.py", line 94, in forward
    self.scale_grad_by_freq, self.sparse
RuntimeError: save_for_backward can only save input or output tensors, but argument 0 doesn't satisfy this condition

Can anybody guide me about this error, and about how torch.nn.Embedding works?

If we change this line:

input = torch.LongTensor([[1,2,4,5],[4,3,2,9]])

with this one:

input = autograd.Variable(torch.LongTensor([[1,2,4,5],[4,3,2,9]]))

the problem is solved!
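
For reference, here is a minimal end-to-end sketch of the fix, assuming an older PyTorch release (0.3.x or earlier) where nn modules expect Variable inputs rather than plain tensors:

import torch
import torch.nn as nn
from torch.autograd import Variable

# an Embedding module containing 10 vectors of size 3
embedding = nn.Embedding(10, 3)

# a batch of 2 samples of 4 indices each, wrapped in a Variable
# so autograd can track it through the forward pass
input = Variable(torch.LongTensor([[1, 2, 4, 5], [4, 3, 2, 9]]))

output = embedding(input)
print(output)  # shape (2, 4, 3); values depend on the random initialization

The error happens because, in those older versions, save_for_backward only accepts Variable objects, not raw tensors. Since PyTorch 0.4 the Variable wrapper was merged into Tensor, so on recent releases the original snippet from the documentation runs as written.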