How to use different activation functions in one Keras layer?
I am working with Keras in Python and I have a neural network (see the code below).
Currently it only uses ReLU activations.
For experimental reasons I would like some neurons to use ReLU and some to use softmax (or any other activation function). For example, in a layer with 20 neurons, I would like 10 ReLU and 10 softmax units.
I have tried a few different approaches, but I always fail to get any output.
Do you know how I should do this?
# - Libraries
from keras.layers import Dense
from keras.models import Sequential
from keras.callbacks import EarlyStopping
early_spotting_monitor = EarlyStopping(patience=2)
layers = 4
neurons = 20
act = "relu"  # Keras activation names are lowercase
# - Create Neural Network
model = Sequential()
model.add(Dense(neurons,activation=act,input_dim=X_train.shape[1]))
layers -= 1
while layers > 0:
    model.add(Dense(neurons, activation=act))
    layers -= 1
model.add(Dense(n_months))
model.compile(optimizer="adam",loss="mean_absolute_error")
model.fit(X_train,Y_train,validation_split=0.10,epochs=13,callbacks=[early_spotting_monitor])
EDIT: here is my current (working) code:
# - Libraries
from keras.callbacks import EarlyStopping
early_spotting_monitor = EarlyStopping(patience=2)
from keras.layers import Input, Dense
from keras.models import Model
from keras.layers import concatenate
# input layer
visible = Input(shape=(X_train.shape[1],))
hidden11 = Dense(14, activation='relu')(visible)
hidden12 = Dense(3, activation='softplus')(visible)
hidden13 = Dense(2, activation='linear')(visible)
hidden14 = Dense(2, activation='selu')(visible)
merge1 = concatenate([hidden11, hidden12, hidden13, hidden14])
hidden21 = Dense(14, activation='relu')(merge1)
hidden22 = Dense(3, activation='softplus')(merge1)
hidden23 = Dense(2, activation='linear')(merge1)
hidden24 = Dense(2, activation='selu')(merge1)
merge2 = concatenate([hidden21, hidden22, hidden23, hidden24])
hidden3 = Dense(20, activation='relu')(merge2)
output = Dense(Y_train.shape[1],activation="linear")(hidden3)
model = Model(inputs=visible, outputs=output)
model.compile(optimizer="adam",loss="mean_absolute_error")
model.fit(X_train,Y_train,validation_split=0.10,epochs=13,callbacks=[early_spotting_monitor]) # starts training
You have to use the Functional API
to do this, for example:
input = Input(shape=(X_train.shape[1],))
branchA = Dense(neuronsA, activation = "relu")(input)
branchB = Dense(neuronsB, activation = "sigmoid")(input)
out = concatenate([branchA, branchB])
This cannot be done with the Sequential API, so I recommend you move your code to the Functional API.
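Putting the pieces together, here is a minimal runnable sketch of exactly the split the question asks for (10 ReLU units and 10 softmax units side by side in one 20-unit layer). The input width of 8 features and the single regression output are made-up assumptions for illustration:

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense, concatenate
from tensorflow.keras.models import Model

n_features = 8  # hypothetical input width

inp = Input(shape=(n_features,))
relu_half = Dense(10, activation="relu")(inp)        # 10 ReLU neurons
softmax_half = Dense(10, activation="softmax")(inp)  # 10 softmax neurons
hidden = concatenate([relu_half, softmax_half])      # 20 units total
out = Dense(1)(hidden)                               # linear output

model = Model(inputs=inp, outputs=out)
model.compile(optimizer="adam", loss="mean_absolute_error")

x = np.random.rand(4, n_features).astype("float32")
print(model(x).shape)  # (4, 1)
```

Note that the two halves are two separate `Dense` layers with their own weights; the `concatenate` only joins their outputs, so downstream layers see a single 20-dimensional vector.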
This is something I have been trying to do recently, and so far this is what I have come up with. I think it works, but I would appreciate it if anyone could tell me what I am doing wrong here. I am only doing this on the output layer, and my output layer has two units:
import tensorflow as tf

def activations(l):
    # apply a different activation to each output unit
    l_0 = tf.keras.activations.exponential(l[..., 0])
    l_1 = tf.keras.activations.elu(l[..., 1])
    lnew = tf.stack([l_0, l_1], axis=1)
    return lnew
model = tf.keras.Sequential([..., Dense(2, activation = activations)])
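A per-unit activation like this can be sanity-checked by calling it on a small tensor directly, outside of any model. A minimal sketch (the sample values are made up):

```python
import tensorflow as tf

def activations(l):
    # exponential on unit 0, ELU on unit 1
    l_0 = tf.keras.activations.exponential(l[..., 0])
    l_1 = tf.keras.activations.elu(l[..., 1])
    return tf.stack([l_0, l_1], axis=1)

x = tf.constant([[0.0, -1.0],
                 [1.0,  2.0]])
y = activations(x)
print(y.numpy())
# column 0 is exp(x): [1.0, e]; column 1 is elu(x): [e^-1 - 1, 2.0]
```

Since the function only uses TensorFlow ops, gradients flow through it normally, so it can be passed as `activation=` to a `Dense` layer as shown above.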