You must feed a value for placeholder tensor

I am trying to implement an LSTM NN with TensorBoard, but I get this error message: You must feed a value for placeholder tensor 'performance_1/loss_summary'.

I have searched through many related questions without finding a solution.

with tf.name_scope('performance'):
    loss = tf.placeholder(tf.float32, shape=None, name='loss_summary')
    tf_loss_summary = tf.summary.scalar('loss', loss)
    tf_accuracy_ph = tf.placeholder(tf.float32, shape=None, name='accuracy_summary')
    tf_accuracy_summary = tf.summary.scalar('accuracy', tf_accuracy_ph)

# Gradient norm summary
for g in gradients:
    for var in v:
        if 'hidden3' in var.name and 'w' in var.name:
            with tf.name_scope('Gradients'):
                tf_last_grad_norm = tf.sqrt(tf.reduce_mean(g**2))
                tf_gradnorm_summary = tf.summary.scalar('grad_norm', tf_last_grad_norm)
                break

# Merge all summaries together
performance_summaries = tf.summary.merge([tf_loss_summary, tf_accuracy_summary])

The other part of the code where I get the error is:

for ep in range(epochs):

    for step in range(train_seq_length // batch_size):

        u_data, u_labels = data_gen.unroll_batches()

        feed_dict = {}
        for ui, (dat, lbl) in enumerate(zip(u_data, u_labels)):
            feed_dict[train_inputs[ui]] = dat.reshape(-1, 1)
            feed_dict[train_outputs[ui]] = lbl.reshape(-1, 1)

        feed_dict.update({tf_learning_rate: 0.0001, tf_min_learning_rate: 0.000001})

        _, l = session.run([optimizer, loss], feed_dict=feed_dict)

        average_loss += l

    if (ep + 1) % valid_summary == 0:

        average_loss = average_loss / (valid_summary * (train_seq_length // batch_size))

        # Print the average loss
        print('Average loss at step %d: %f' % (ep + 1, average_loss))

        train_mse_ot.append(average_loss)

        average_loss = 0  # reset loss

        predictions_seq = []

        mse_test_loss_seq = []

Thanks in advance.

Here, loss is a placeholder, so you must feed it a value. You most likely overwrote your actual loss function without noticing: `loss = tf.placeholder(...)` rebinds the name `loss`, so `session.run([optimizer, loss], ...)` tries to evaluate the placeholder, which is never fed. Normally summaries are not placeholders, so there is a misunderstanding of the variables and the code flow.
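A minimal sketch of the usual fix, assuming the rest of the graph is unchanged: give the summary placeholder its own Python name (the name `tf_loss_ph` here is illustrative) so it does not shadow the real loss op, and feed it when you evaluate the merged summaries.

```python
import tensorflow.compat.v1 as tf  # TF1-style graph mode
tf.disable_eager_execution()

with tf.name_scope('performance'):
    # Distinct Python name: this no longer overwrites the training loss op.
    tf_loss_ph = tf.placeholder(tf.float32, shape=None, name='loss_summary')
    tf_loss_summary = tf.summary.scalar('loss', tf_loss_ph)
    tf_accuracy_ph = tf.placeholder(tf.float32, shape=None, name='accuracy_summary')
    tf_accuracy_summary = tf.summary.scalar('accuracy', tf_accuracy_ph)

performance_summaries = tf.summary.merge([tf_loss_summary, tf_accuracy_summary])

with tf.Session() as sess:
    # The merged summary depends on both placeholders, so both must be fed.
    summ = sess.run(performance_summaries,
                    feed_dict={tf_loss_ph: 0.5, tf_accuracy_ph: 0.9})
    # summ is a serialized Summary protobuf; pass it to a FileWriter in real code.
```

In the training loop, `session.run([optimizer, loss], ...)` then refers to the real loss op again, and the summaries are evaluated separately with the averaged values fed in.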

Here loss is defined as a placeholder. When you define something as a placeholder, you must supply its value whenever you run a part of the graph that depends on it.
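This rule can be seen in a minimal, self-contained example (the names `x` and `y` are illustrative): running an op that depends on an unfed placeholder raises exactly the "You must feed a value for placeholder tensor" error, while feeding it works.

```python
import tensorflow.compat.v1 as tf  # TF1-style graph mode
tf.disable_eager_execution()

x = tf.placeholder(tf.float32, shape=None, name='x')
y = x * 2.0  # y depends on the placeholder x

with tf.Session() as sess:
    # sess.run(y)  # would raise: You must feed a value for placeholder tensor 'x'
    result = sess.run(y, feed_dict={x: 3.0})  # feeding x makes the run succeed
    print(result)  # → 6.0
```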