How do I modify the activation functions in Keras?

I want to use the relu activation function with the parameter alpha set to 0.2, but I don't know how to do this for my model:

import numpy
from tensorflow.keras.layers import Dense, Activation, Dropout, Input
from tensorflow.keras.models import Sequential, Model, load_model
from tensorflow.keras.optimizers import Adam

# x_train and y_train are assumed to be defined elsewhere
model_input = Input(shape=x_train[0].shape)
x = Dense(120, activation='relu')(model_input)
x = Dropout(0.01)(x)
x = Dense(120, activation='relu')(x)
x = Dropout(0.01)(x)
x = Dense(120, activation='relu')(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)

I saw in this answer that there is a way to do this using model.add(), but I'm not sure how that applies to my code. Could you help me?

Thanks in advance!

First, note that you are specifying the activation as a string, whereas in the example from the answer you linked to, the activation function is specified by creating an object of its class. Second, note that you want the "leaky ReLU" activation function, while you are currently specifying only plain "relu".
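
To see the difference, compare passing a string with passing an activation object. The sketch below is only illustrative (the layer width and alpha value are made up); it relies on the fact that tf.keras accepts any callable, including a layer instance, as the activation argument:

from tensorflow.keras.layers import Dense, LeakyReLU

# Activation given as a string: only the default configuration is available
layer_a = Dense(120, activation='relu')

# Activation given as an object: parameters such as alpha become configurable
layer_b = Dense(120, activation=LeakyReLU(alpha=0.2))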

To answer your question, you could do something like this:

import numpy
from tensorflow.keras.layers import Dense, Dropout, Input, LeakyReLU
from tensorflow.keras.models import Model

# x_train and y_train are assumed to be defined elsewhere
model_input = Input(shape=x_train[0].shape)

# Each Dense layer has no built-in activation; the LeakyReLU layer that
# follows applies leaky ReLU with the desired alpha
x = Dense(120)(model_input)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
x = Dense(120)(x)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
x = Dense(120)(x)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)

I haven't tried this code myself, but it should work!
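
If you want a quick sanity check that this pattern builds and trains, here is a minimal self-contained sketch; the data shapes and training settings are made up for illustration and are not part of the question:

import numpy
from tensorflow.keras.layers import Dense, Dropout, Input, LeakyReLU
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

# Dummy data with assumed shapes: 32 samples, 10 features, 3 targets
x_train = numpy.random.rand(32, 10).astype('float32')
y_train = numpy.random.rand(32, 3).astype('float32')

model_input = Input(shape=x_train[0].shape)
x = Dense(120)(model_input)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)

model.compile(optimizer=Adam(), loss='mse')
model.fit(x_train, y_train, epochs=1, batch_size=8)
model.summary()

One caveat: in Keras 3 the LeakyReLU argument was renamed, so if alpha raises a TypeError there, try negative_slope=0.2 instead.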