Torch Tensor and Input Conflict: "Tensor Object Is Not Callable"
Because of the torch.tensor code, I get the error "Tensor object is not callable" when I add input. Does anyone know how I can fix this?
import torch
from torch.nn import functional as F
from transformers import GPT2Tokenizer, GPT2LMHeadModel
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
text0 = "In order to"
text = tokenizer.encode("In order to")
input, past = torch.tensor([text]), None
logits, past = model(input, past = past)
logits = logits[0,-1]
probabilities = torch.nn.functional.softmax(logits)
best_logits, best_indices = logits.topk(5)
best_words = [tokenizer.decode([idx.item()]) for idx in best_indices]
text.append(best_indices[0].item())
best_probabilities = probabilities[best_indices].tolist()
for i in range(5):
    f = ('Generated {}: {}'.format(i, best_words[i]))
    print(f)
option = input("Pick a Option:")
z = text0.append(option)
print(z)
Error stack trace:
TypeError Traceback (most recent call last)
<ipython-input-2-82e8d88e81c1> in <module>()
25
26
---> 27 option = input("Pick a Option:")
28 z = text0.append(option)
29 print(z)
TypeError: 'Tensor' object is not callable
The problem is that you have defined a variable named input, and that variable is then used instead of the built-in input function. Just give your variable a different name and it will work as expected.
Also, a Python string has no append method.
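To see the clash in isolation, here is a minimal sketch (the names and the prompt string are just for illustration, not taken from your script):
import torch
input = torch.tensor([1, 2, 3])      # the name input now points at a tensor and hides the built-in
# input("Pick an option: ")          # -> TypeError: 'Tensor' object is not callable
del input                            # deleting the variable makes the built-in visible again
choice = input("Pick an option: ")   # works as expected
# Likewise, str has no append(); build the new string by concatenation instead:
text0 = "In order to"
print(text0 + ' ' + choice)
The full corrected version of your script: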
import torch
from torch.nn import functional as F
from transformers import GPT2Tokenizer, GPT2LMHeadModel
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
text0 = "In order to"
text = tokenizer.encode("In order to")
myinput, past = torch.tensor([text]), None   # renamed from input so the built-in input() is no longer shadowed
logits, past = model(myinput, past = past)
logits = logits[0,-1]
probabilities = torch.nn.functional.softmax(logits, dim=-1)   # explicit dim avoids the deprecation warning
best_logits, best_indices = logits.topk(5)
best_words = [tokenizer.decode([idx.item()]) for idx in best_indices]
text.append(best_indices[0].item())
best_probabilities = probabilities[best_indices].tolist()
for i in range(5):
    f = ('Generated {}: {}'.format(i, best_words[i]))
    print(f)
option = input("Pick a Option:")
z = text0 + ' ' + option   # strings are concatenated; str has no append()
print(z)
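Side note, not part of the fix itself: model(myinput, past = past) assumes an older transformers release where the model returns a plain tuple and accepts a past keyword. On recent 4.x releases the keyword is past_key_values and the forward pass returns an output object, so the same step would look roughly like this (a sketch only; check the version you actually have installed):
outputs = model(myinput, past_key_values=past)   # newer versions return a model-output object
logits = outputs.logits[0, -1]                   # logits for the last position
past = outputs.past_key_values                   # cached keys/values for the next step
probabilities = torch.nn.functional.softmax(logits, dim=-1)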