Adding loss functions in MXNet - "Operator _copyto is non-differentiable because it didn't register FGradient attribute"
I have a system that generates training data, and I want to sum individual losses together to form a batch. I am trying the following (full code at commit in question):
for epoch in range(100):
    with mx.autograd.record():
        loss = 0.0
        for k in range(40):
            (i, x), (j, y) = random.choice(data), random.choice(data)
            # Just compute loss on last output
            if i == j:
                loss = loss - l2loss(net(mx.nd.array(x)), net(mx.nd.array(y)))
            else:
                loss = loss + l2loss(net(mx.nd.array(x)), net(mx.nd.array(y)))
    loss.backward()
    trainer.step(BATCH_SIZE)
But I get this error:
---------------------------------------------------------------------------
MXNetError Traceback (most recent call last)
<ipython-input-39-14981406278a> in <module>()
21 else:
22 loss = loss + l2loss(net(mx.nd.array(x)), net(mx.nd.array(y)))
---> 23 loss.backward()
24 trainer.step(BATCH_SIZE)
25 avg_loss += mx.nd.mean(loss).asscalar()
... More trace ...
MXNetError: [16:52:49] src/pass/gradient.cc:187: Operator _copyto is non-differentiable because it didn't register FGradient attribute.
How can I incrementally add up losses the way I am trying to?
Which version of MXNet are you using? I cannot reproduce this with the latest codebase. You could try the GitHub master branch or version 0.12.