What is the difference between these two ways of adding Neural Network layers in Keras?
I am using Keras with Theano as the backend, and I have a sequential neural network model. Is there any difference between the following?
model.add(Convolution2D(32, 3, 3, activation='relu'))
and
model.add(Convolution2D(32, 3, 3))
model.add(Activation('relu'))
They are essentially the same. The advantage of keeping them separate is that you can insert other layers in between (such as BatchNormalization).
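The equivalence can be sketched with plain Python functions (a toy one-number "convolution" stands in for the real layer; the names `relu` and `conv_step` are illustrative, not the Keras API):

```python
def relu(x):
    """Standard rectified linear unit: max(0, x)."""
    return max(0.0, x)

def conv_step(x, weight, activation=None):
    """Toy stand-in for a conv layer: transform the input, then
    optionally apply an activation, mirroring the fused form."""
    out = x * weight  # stand-in for the actual convolution
    if activation is not None:
        out = activation(out)  # fused: activation applied inside the layer
    return out

# Fused form, like Convolution2D(..., activation='relu'):
fused = conv_step(-2.0, 3.0, activation=relu)

# Separate form, like Convolution2D(...) followed by Activation('relu'):
separate = relu(conv_step(-2.0, 3.0))

print(fused == separate)  # both apply relu to the conv output
```

Either way the activation sees exactly the output of the convolution, which is why the two model definitions behave identically.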
In Keras, if you don't specify one, Convolution2D defaults to the 'linear' activation, which is the identity function:
def linear(x):
    '''
    The function returns the variable that is passed in, so all types work.
    '''
    return x
All the Activation layer does is apply the activation function to its input:
def call(self, x, mask=None):
    return self.activation(x)
Edit:
So basically Convolution2D(activation='relu') applies the relu activation function after performing the convolution, which is the same as applying Activation('relu') after Convolution2D(32, 3, 3).
The last two lines of the call function of the Convolution2D layer are:
output = self.activation(output)
return output
where output is the output of the convolution. So we know that applying the activation function is the last step of Convolution2D.
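The pattern described above can be mimicked with a minimal stand-in layer (an illustrative sketch, not the actual Keras source; the class name and weight handling are made up):

```python
class ToyConvLayer:
    """Stand-in for Convolution2D: compute the layer output,
    then apply the activation as the very last step of call()."""

    def __init__(self, weight, activation=lambda x: x):
        # Default mirrors Keras' 'linear' activation: the identity function.
        self.weight = weight
        self.activation = activation

    def call(self, x):
        output = x * self.weight          # stand-in for the convolution
        output = self.activation(output)  # activation applied last, as in Keras
        return output

relu = lambda v: max(0.0, v)
layer = ToyConvLayer(2.0, activation=relu)
print(layer.call(-3.0))  # relu applied to the "conv" output
```

With the default (linear) activation, call() returns the raw output unchanged, which is why omitting activation='relu' and adding a separate Activation('relu') layer afterwards produces the same result.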
Source code:
Convolution2D layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/convolutional.py
Activation layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/core.py
Activation functions: https://github.com/fchollet/keras/blob/master/keras/activations.py