How to use leaky ReLUs as the activation function in hidden layers in pylearn2
I am using the pylearn2 library to design a CNN. I want to use leaky ReLUs as the activation function in one of the layers. Is it possible to do this with pylearn2? Do I have to write a custom function for it, or does pylearn2 have built-in support? If I need custom code, how should I write it? Could anyone help me?
The ConvElemwise super-class is a generic convolutional elementwise layer. Among its subclasses, ConvRectifiedLinear is a convolutional rectified linear layer that uses the RectifierConvNonlinearity class.
In the apply() method:
    p = linear_response * (linear_response > 0.) + self.left_slope *\
        linear_response * (linear_response < 0.)
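A nonzero left_slope therefore turns the rectifier into a leaky ReLU. Below is a minimal sketch of wiring this up in Python, assuming the ConvElemwise and RectifierConvNonlinearity constructors from pylearn2.models.mlp; parameter names such as irange and kernel_shape follow the pylearn2 source but may differ slightly between versions, so treat this as an illustration rather than a definitive recipe:

    # Sketch: a convolutional layer with a leaky ReLU nonlinearity.
    # Assumes both classes live in pylearn2.models.mlp; check your
    # installed pylearn2 version.
    from pylearn2.models.mlp import ConvElemwise, RectifierConvNonlinearity

    # left_slope > 0 gives the leaky behaviour for negative inputs
    leaky_relu = RectifierConvNonlinearity(left_slope=0.01)

    conv_layer = ConvElemwise(
        layer_name='conv_leaky',
        output_channels=64,
        kernel_shape=[5, 5],
        nonlinearity=leaky_relu,
        irange=0.05,  # uniform weight-initialization range
    )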
As this gentle review points out:
... Maxout neuron (introduced recently by Goodfellow et al.) that generalizes the ReLU and its leaky version.
Examples are MaxoutLocalC01B or MaxoutConvC01B.
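To see why maxout subsumes the leaky ReLU, note that a maxout unit over the two linear pieces x and a*x computes max(x, a*x), which for 0 < a < 1 is exactly a leaky ReLU with slope a on the negative side. A small NumPy sketch (illustrative only, not pylearn2 API):

    import numpy as np

    def leaky_relu(x, left_slope=0.01):
        # x for x > 0, left_slope * x otherwise
        return np.where(x > 0, x, left_slope * x)

    def maxout_two_pieces(x, a=0.01):
        # maxout over the two linear pieces x and a * x
        return np.maximum(x, a * x)

    x = np.linspace(-3.0, 3.0, 7)
    assert np.allclose(leaky_relu(x), maxout_two_pieces(x))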
The reason there is no answer on the pylearn2-user list may be that pylearn2 is mostly written by researchers at the LISA lab and, thus, the threshold for point 13 in the FAQ may be high.