How to use multiple gradients with TensorFlow GradientTape?
Three neural networks are chained together as in the code below. How can we get two gradients with respect to the first network? The first gradient works, but the second one returns None tensors. It seems the two have no recorded relationship through which a gradient can flow. What is wrong here?
with tf.GradientTape() as tape1:
    with tf.GradientTape() as tape2:
        output1 = NN_model1(input1, training=True)
        output2 = NN_model2(output1, training=True)
        output3 = NN_model3([input1, output1, output2], training=True)
        loss1 = -tf.math.reduce_mean(output3)
        loss2 = -tf.math.reduce_mean(output2)

grad1 = tape2.gradient(loss1, NN_model1.trainable_variables)
grad2 = tape1.gradient(loss2, grad1)
optimizer.apply_gradients(zip(grad2, NN_model1.trainable_variables))
The problem is that grad1 is produced by tape2.gradient() outside of tape1's recording, so tape1 never saw any computation connecting loss2 to grad1, and tape1.gradient(loss2, grad1) therefore returns None. There is no need for two nested tapes here; I think the correct approach is this:
with tf.GradientTape() as tape:
    output1 = NN_model1(input1, training=True)
    output2 = NN_model2(output1, training=True)
    output3 = NN_model3([input1, output1, output2], training=True)
    loss1 = -tf.math.reduce_mean(output3)
    loss2 = -tf.math.reduce_mean(output2)

grad = tape.gradient([loss1, loss2], NN_model1.trainable_variables)
optimizer.apply_gradients(zip(grad, NN_model1.trainable_variables))
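When tape.gradient() receives a list of targets, it returns the sum of the gradients of each target with respect to each source variable, which is exactly what is needed to train NN_model1 on both losses at once. Below is a minimal runnable sketch of this pattern; the three small Dense models, the input shape, and the batch size are placeholder assumptions, since the original model definitions are not shown:

```python
import tensorflow as tf

# Hypothetical stand-ins for the three chained networks (shapes are assumptions).
NN_model1 = tf.keras.Sequential([tf.keras.layers.Dense(4)])
NN_model2 = tf.keras.Sequential([tf.keras.layers.Dense(2)])

# NN_model3 takes three inputs, matching the NN_model3([input1, output1, output2]) call.
in1 = tf.keras.Input(shape=(3,))
in2 = tf.keras.Input(shape=(4,))
in3 = tf.keras.Input(shape=(2,))
merged = tf.keras.layers.Concatenate()([in1, in2, in3])
NN_model3 = tf.keras.Model([in1, in2, in3], tf.keras.layers.Dense(1)(merged))

optimizer = tf.keras.optimizers.Adam()
input1 = tf.random.normal((8, 3))  # assumed batch of 8, feature dim 3

with tf.GradientTape() as tape:
    output1 = NN_model1(input1, training=True)
    output2 = NN_model2(output1, training=True)
    output3 = NN_model3([input1, output1, output2], training=True)
    loss1 = -tf.math.reduce_mean(output3)
    loss2 = -tf.math.reduce_mean(output2)

# A list of targets yields the summed gradient d(loss1 + loss2)/d(var) per variable.
grads = tape.gradient([loss1, loss2], NN_model1.trainable_variables)
optimizer.apply_gradients(zip(grads, NN_model1.trainable_variables))
```

Because both loss1 and loss2 are computed from output1 inside the same tape context, every entry of grads is a real tensor rather than None.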