How can I add noise (jittering) to my Python Keras ANN to avoid overfitting?
I implemented an artificial neural network model in Python with Keras. My training accuracy is high but my test accuracy is low, which suggests the model is overfitting.
I would like to avoid overfitting, and one technique for doing so is jittering (noise addition). My question is: how can this be done in Python?
Here is my ANN code:
from keras.models import Sequential
from keras.layers import Dense, Dropout

def designANN(input_nodes, dropout, layer_nodes, output_nodes):
    classifier = Sequential()
    classifier.add(Dense(units=layer_nodes, kernel_initializer="uniform",
                         activation="relu", input_dim=input_nodes))
    classifier.add(Dropout(dropout))
    classifier.add(Dense(units=layer_nodes, kernel_initializer="uniform",
                         activation="relu"))
    classifier.add(Dropout(dropout))
    classifier.add(Dense(units=output_nodes, kernel_initializer="uniform",
                         activation="sigmoid"))
    # npv is a custom metric defined elsewhere in my code
    classifier.compile(optimizer="adam", loss="binary_crossentropy", metrics=[npv])
    return classifier
You just need the GaussianNoise layer; you can place it anywhere in your network. I recommend putting it before the activation function: with relu, for example, adding random noise after the activation could push output values out of range (< 0).
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, GaussianNoise

def designANN(input_nodes, dropout, layer_nodes, output_nodes):
    classifier = Sequential()
    classifier.add(Dense(units=layer_nodes, kernel_initializer="uniform",
                         input_dim=input_nodes))
    # Noise is injected before the activation; GaussianNoise is only
    # active during training, not at inference time.
    classifier.add(GaussianNoise(0.1))
    classifier.add(Activation('relu'))
    classifier.add(Dropout(dropout))
    classifier.add(Dense(units=layer_nodes, kernel_initializer="uniform"))
    classifier.add(GaussianNoise(0.1))
    classifier.add(Activation('relu'))
    classifier.add(Dropout(dropout))
    classifier.add(Dense(units=output_nodes, kernel_initializer="uniform",
                         activation="sigmoid"))
    classifier.compile(optimizer="adam", loss="binary_crossentropy", metrics=[npv])
    return classifier
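As an alternative (or complement) to a noise layer inside the model, you can jitter the training inputs themselves before fitting, i.e. add zero-mean Gaussian noise to the feature matrix. This is a minimal NumPy sketch; the function name and `sigma` value are illustrative, not from the original post:

```python
import numpy as np

def jitter(X, sigma=0.05, seed=None):
    """Return a copy of X with zero-mean Gaussian noise added (jittering)."""
    rng = np.random.default_rng(seed)
    return X + rng.normal(loc=0.0, scale=sigma, size=X.shape)

# Example: augment a training batch before passing it to classifier.fit
X_train = np.array([[0.2, 0.4],
                    [0.6, 0.8]])
X_noisy = jitter(X_train, sigma=0.05, seed=0)
```

Because a fresh noise draw is made each call, you can re-jitter the data every epoch for stronger augmentation; keep `sigma` small relative to your feature scale so the labels remain valid for the perturbed inputs.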