How to put tf.layers variables in tf.name_scope/tf.variable_scope?

I have a question about TensorFlow:

The following code generates a (presumably) correct graph for a convolution block:

def conv_layer(self, inputs, filter_size = 3, num_filters = 256, name = None):
    scope_name = name
    if name is None:
        scope_name = "conv_layer"

    with tf.name_scope(scope_name):
        # Convolution (no activation), then batch norm, then leaky ReLU
        conv = tf.contrib.layers.conv2d(inputs, num_filters, filter_size, activation_fn = None)
        batch_norm = tf.contrib.layers.batch_norm(conv)
        act = tf.nn.leaky_relu(batch_norm)

        return act

The problem is that the tf.layers API produces some ugly variables that don't actually stay inside the name_scope. Here is the TensorBoard view so you can see what I mean.

Is there a way to get these variables into the scope? This is a big problem for visualizing the graph, since I plan to scale this network up considerably. (As you can see on the right, it is already an issue, and I have to manually remove them from the main graph every time I launch TensorBoard.)

You could try using tf.variable_scope instead. tf.name_scope is ignored by variables created via tf.get_variable(), which is what the tf.layers functions typically use. This is in contrast to variables created via tf.Variable.

See this question for an explanation of the difference (although it is somewhat outdated).

Solution moved from the question into an answer:

Changing every instance of name_scope to variable_scope fixed the problem. However, I had to give each variable_scope a unique ID and set reuse = False.

def conv_layer(self, inputs, filter_size = 3, num_filters = 256, name = None):
    scope_name = name
    if name is None:
        # Give each unnamed layer a unique scope so variable names don't collide
        scope_name = "conv_layer_" + str(self.conv_id)
        self.conv_id += 1

    with tf.variable_scope(scope_name, reuse = False):
        conv = tf.contrib.layers.conv2d(inputs, num_filters, filter_size, activation_fn = None)
        batch_norm = tf.contrib.layers.batch_norm(conv)
        act = tf.nn.leaky_relu(batch_norm)

        return act

As you can see, the variables are now neatly tucked away inside the correct blocks.