LBFGS Giving Tensor Object not Callable Error when using Optimizer.step

I am trying to use the SGD, Adam, and LBFGS optimizers.

Part of the code is:

for batch_idx, (inputs, targets) in enumerate(trainloader):
    batch_size = inputs.size(0)
    total += batch_size
    one_hot_targets = torch.FloatTensor(batch_size, 10).zero_()
    one_hot_targets = one_hot_targets.scatter_(1, targets.view(batch_size, 1), 1.0)
    one_hot_targets = one_hot_targets.float()
    if use_cuda:
        inputs, one_hot_targets = inputs.cuda(), one_hot_targets.cuda()
    inputs, one_hot_targets = Variable(inputs), Variable(one_hot_targets)

    if optimizer_val == 'sgd' or optimizer_val == 'adam':
        outputs = F.softmax(net(inputs))
        loss = criterion(outputs, one_hot_targets)

        loss.backward()
        optimizer.step()

    else:
        def closure():
            optimizer.zero_grad()
            outputs = F.softmax(net(inputs))
            loss = criterion(outputs, one_hot_targets)
            loss.backward()
            return loss

        optimizer.step(closure())

For the optimizer.step(closure()) call in the LBFGS case (the else branch), I get this error:

TypeError: 'Tensor' object is not callable

I checked, and loss is a Tensor.

How do I make it work?

You need to pass the closure function itself to optimizer.step, not call it. Writing optimizer.step(closure()) evaluates the closure first and hands LBFGS the returned loss tensor, which it then tries to call internally, hence the "'Tensor' object is not callable" error:

optimizer.step(closure)
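
For reference, here is a minimal sketch of the corrected else branch, reusing the same net, criterion, inputs, and one_hot_targets from the loop above. LBFGS may invoke the closure more than once within a single step to re-evaluate the loss, which is why it needs the callable rather than a precomputed tensor:

def closure():
    # LBFGS may call this repeatedly within one step,
    # so the forward and backward passes are recomputed here
    optimizer.zero_grad()
    outputs = F.softmax(net(inputs))
    loss = criterion(outputs, one_hot_targets)
    loss.backward()
    return loss

# pass the function object itself; LBFGS invokes it internally
optimizer.step(closure)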