Why use Variable() in inference?
I'm learning PyTorch for an image classification task, and I ran into code where someone used PyTorch's Variable() in their prediction function:
def predict_image(image):
image_tensor = test_transforms(image).float()
image_tensor = image_tensor.unsqueeze_(0)
input = Variable(image_tensor)
input = input.to(device)
output = model(input)
index = output.data.cpu().numpy().argmax()
return index
Why do they use Variable() here? (The code works fine even without it.)
You can safely ignore it. Variable is a legacy component of PyTorch, now deprecated, that used to be required for autograd:
Variable (deprecated)

WARNING: The Variable API has been deprecated: Variables are no longer necessary to use autograd with tensors. Autograd automatically supports Tensors with requires_grad set to True. Below please find a quick guide on what has changed:

Variable(tensor) and Variable(tensor, requires_grad) still work as expected, but they return Tensors instead of Variables.