Unconnected series in Tensorflow graph
My sparse autoencoder model consists mainly of 10 convolutional layers and 10 transposed convolutional layers. After training, I get the graph shown below in TensorBoard.
My understanding is that this graph is not connected, because Conv1 and Conv2 are not connected. This is my first TensorFlow model, so I am confused. Please advise what I am doing wrong. This code was developed based on the CIFAR10 multi-GPU code.
Model snippet:
def inference(images, labels, keep_prob, batch_size):
    """Build the CNN model.

    Args:
      images: Images returned from distorted_inputs() or inputs().
      labels: Labels corresponding to the images.
      keep_prob: Dropout probability.
      batch_size: Number of images per batch.

    Returns:
      Logits.
    """
    # conv1
    with tf.variable_scope('conv1') as scope:
        kernel1 = _variable_with_weight_decay(
            'weights',
            shape=[5, 5, model_params.org_image['channels'], 100],
            stddev=1e-4, wd=0.0)
        conv1 = tf.nn.conv2d(images, kernel1, [1, 1, 1, 1], padding='SAME')
        biases1 = _variable_on_cpu('biases', [100], tf.constant_initializer(0.0))
        bias1 = tf.nn.bias_add(conv1, biases1)
        conv1 = tf.nn.relu(bias1, name=scope.name)
        # NOTE: this prints only the symbolic Tensor, not its values, and it
        # leaves a dangling Abs op (with no consumers) in the graph.
        print(tf.abs(conv1))
        _activation_summary(conv1)

    # norm1
    norm1 = tf.nn.batch_normalization(
        conv1, mean=0.6151888371, variance=0.2506813109,
        offset=None, scale=None,  # scale must be a Tensor or None, not False
        variance_epsilon=0.001, name='norm1')

    # conv2
    with tf.variable_scope('conv2') as scope:
        kernel2 = _variable_with_weight_decay(
            'weights', shape=[5, 5, 100, 120], stddev=1e-4, wd=0.0)
        conv2 = tf.nn.conv2d(norm1, kernel2, [1, 1, 1, 1], padding='SAME')
        biases2 = _variable_on_cpu('biases', [120], tf.constant_initializer(0.1))
        bias2 = tf.nn.bias_add(conv2, biases2)
        conv2 = tf.nn.relu(bias2, name=scope.name)
        # As above: prints the symbolic Tensor and adds a dangling Abs op.
        print(tf.abs(conv2))
        _activation_summary(conv2)

    # norm2
    norm2 = tf.nn.batch_normalization(
        conv2, mean=0.6151888371, variance=0.2506813109,
        offset=None, scale=None,
        variance_epsilon=0.001, name='norm2')

    # pool2
    .....
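(Note: the print(tf.abs(conv1)) calls above do not print activation values. In TF1 graph mode, print() on a Tensor shows only its symbolic description, and tf.abs() adds an Abs node with no consumers to the graph, which can appear as an isolated node in TensorBoard. A minimal standalone sketch of how to see actual values instead:)

import tensorflow as tf

x = tf.constant([[-1.0, 2.0], [3.0, -4.0]])
abs_x = tf.abs(x)

print(abs_x)  # prints only: Tensor("Abs:0", shape=(2, 2), dtype=float32)

with tf.Session() as sess:
    print(sess.run(abs_x))  # prints the numeric values: [[1. 2.] [3. 4.]]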
I don't even understand why "IsVariable" shows up in my graph. Any help would be greatly appreciated.
Update
I found this solution, which says: "multi-GPU graph looks like that is because the namescoping in the multi-GPU version creates tower_N namespaces that have incoming edges (tensors) above a certain threshold, at which point we extract those nodes on the side since usually they end up being auxiliary and not part of the main net architecture." Even so, I am still unsure whether my graph is correct.
I ran the original CIFAR10 multi-GPU code and checked its TensorBoard output, which looks similar to my graph. So my conclusion is that my graph is fine.
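For anyone who wants to check this themselves, a quick way to inspect graph connectivity is to write just the graph definition to a log directory and open it in TensorBoard. A minimal sketch (TF1-style; the log path /tmp/graph_check is an arbitrary choice, not from the original code):

import tensorflow as tf

g = tf.Graph()
with g.as_default():
    images = tf.placeholder(tf.float32, [None, 32, 32, 3], name='images')
    kernel = tf.get_variable('weights', [5, 5, 3, 100])
    conv = tf.nn.conv2d(images, kernel, [1, 1, 1, 1], padding='SAME')

# Write only the graph definition; no training is needed to inspect it.
writer = tf.summary.FileWriter('/tmp/graph_check', graph=g)
writer.close()
# Then run: tensorboard --logdir /tmp/graph_check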