Where are the hidden layers?
I'm fairly new to autoencoders. I have this code from the Keras blog (https://blog.keras.io/building-autoencoders-in-keras.html). I'd like to know whether the comments I added to the code below are correct:
import keras
from keras import layers

input_img = keras.Input(shape=(784,)) # input layer
encoded = layers.Dense(128, activation='relu')(input_img) # is this a hidden layer?
encoded = layers.Dense(64, activation='relu')(encoded) # is this a hidden layer?
encoded = layers.Dense(32, activation='relu')(encoded) # is this a hidden layer?
decoded = layers.Dense(64, activation='relu')(encoded) # is this a hidden layer?
decoded = layers.Dense(128, activation='relu')(decoded) # is this a hidden layer?
decoded = layers.Dense(784, activation='sigmoid')(decoded) # output layer
Could you explain this a bit more, if possible? Thanks!
A hidden layer is any layer between the input layer and the output layer (ref). So all of the following are hidden layers in your network:
encoded = layers.Dense(128, activation='relu')(input_img)
encoded = layers.Dense(64, activation='relu')(encoded)
encoded = layers.Dense(32, activation='relu')(encoded)
decoded = layers.Dense(64, activation='relu')(encoded)
decoded = layers.Dense(128, activation='relu')(decoded)
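If it helps to see this concretely, here is a minimal sketch that reuses the input_img and decoded variables from the code in your question: wrap the layers in a Model and print its summary, and every Dense layer between the Input and the final 784-unit output is a hidden layer.

autoencoder = keras.Model(input_img, decoded)  # full model: input -> 5 hidden Dense layers -> output
autoencoder.summary()  # lists each layer; all Dense layers except the final 784-unit one are hidden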
In an autoencoder there is one hidden layer of particular interest: the "bottleneck" hidden layer, which forces the network to learn a compressed representation of the original input. In your example the compression is from 784 dimensions down to 32, and the bottleneck hidden layer is:
encoded = layers.Dense(32, activation='relu')(encoded)
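You can also pull the bottleneck out as its own encoder model to see the 784-to-32 compression directly. This is a minimal sketch, assuming encoded still refers to the output of the 32-unit layer as in your code, and using a random NumPy array purely as a stand-in for one flattened image:

import numpy as np

encoder = keras.Model(input_img, encoded)      # maps a 784-dim input to its 32-dim code
x = np.random.rand(1, 784).astype('float32')   # dummy input in place of a real flattened image
code = encoder.predict(x)
print(code.shape)                              # (1, 32): the compressed representation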