In TensorFlow, is the variable value the only context information a session stores?

Suppose we run session.run([tensor_to_eval]). Are tf.Variable values the only information that persists across calls, with all other evaluation results either returned from the call or discarded?

And suppose we have two sessions: apart from the default graph, do they share anything?

Various forms of state are preserved between run() calls ("steps") in a TensorFlow session:
  • As you already noted, tf.Variable objects store values between calls, and those values can be read and written by any step (see the first sketch after this list).
  • TensorFlow queues (tutorial) allow you to enqueue one or many values into a bounded buffer in one step, and dequeue one or many values in a later step. Queues also support coordination between steps, such as back-pressure in a producer/consumer relationship, and they let you switch between element-by-element and batched computation using functions like tf.train.batch() (see the queue sketch after this list).
  • TensorFlow readers (tutorial) act like implicit file pointers that remember their current position in a file between steps. Subsequent executions of the read() op yield, for example, successive lines of a text file (see the reader sketch after this list).
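
For instance, here is a minimal sketch of the first point, assuming the TF 1.x graph/session API (under TF 2.x the same calls live in tensorflow.compat.v1):

```python
import tensorflow as tf  # assumes the TF 1.x graph/session API

# A variable whose value survives between run() calls.
counter = tf.Variable(0, name="counter")
increment = tf.assign_add(counter, 1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(increment))  # 1
    print(sess.run(increment))  # 2 -- the value persisted between steps
```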
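A similar producer/consumer sketch with a FIFO queue, where a value enqueued in one step is still buffered when a later step dequeues it:

```python
import tensorflow as tf

# A bounded FIFO queue holding int32 elements between steps.
queue = tf.FIFOQueue(capacity=10, dtypes=[tf.int32])
enqueue = queue.enqueue([42])
dequeue = queue.dequeue()

with tf.Session() as sess:
    sess.run(enqueue)         # step 1: produce a value
    print(sess.run(dequeue))  # step 2: consume it -- prints 42
```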
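And a reader sketch showing the file position being retained between steps; data.txt is a hypothetical input file:

```python
import tensorflow as tf

# The reader remembers its position in the file across run() calls.
filename_queue = tf.train.string_input_producer(["data.txt"])  # hypothetical file
reader = tf.TextLineReader()
key, value = reader.read(filename_queue)

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    print(sess.run(value))  # first line of data.txt
    print(sess.run(value))  # second line -- the file position was retained
    coord.request_stop()
    coord.join(threads)
```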

In the single-process version of TensorFlow, sessions share no state. They may share the same graph (if they were both created with the same default graph), but stateful components such as tf.Variable objects will take on different values in different sessions.
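
A quick illustrative sketch of two sessions holding independent copies of the same variable's state:

```python
import tensorflow as tf

v = tf.Variable(0)
bump = tf.assign_add(v, 1)

# Two sessions over the same default graph, each with its own state.
sess_a = tf.Session()
sess_b = tf.Session()
sess_a.run(v.initializer)
sess_b.run(v.initializer)

sess_a.run(bump)
print(sess_a.run(v))  # 1
print(sess_b.run(v))  # 0 -- sess_b never saw sess_a's update
```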

The distributed runtime adds support for "resource containers" that are shared between sessions. These hold variables, queues, and readers, and can be configured by passing the optional container argument to the constructors for these objects.
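
As a rough sketch of the distributed case: two sessions that connect to the same in-process server share that server's resource containers, so a variable written through one session is visible to the other (and tf.container can additionally scope which container the state lives in):

```python
import tensorflow as tf

# An in-process server whose resource containers outlive any single session.
server = tf.train.Server.create_local_server()

v = tf.Variable(0)
bump = tf.assign_add(v, 1)

# Both sessions target the same server, so they share its state.
sess_a = tf.Session(server.target)
sess_b = tf.Session(server.target)

sess_a.run(v.initializer)
sess_a.run(bump)
print(sess_b.run(v))  # 1 -- the state lives on the server, shared by both sessions
```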