Why is every epoch showing the same accuracy?
I am trying to build an IDS (intrusion detection system) and predict whether a label is benign or DDoS, but I get the same accuracy in every epoch.
Code:
from tensorflow import keras
import numpy as np
import pandas as pd  # needed for pd.DataFrame below
import datetime
import time
from tensorflow.keras.optimizers import Adam
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras import callbacks
x=pd.DataFrame(X)
x = x.values
sample = x.shape[0]
features = x.shape[1]
#Train: convert 2D to 3D for input RNN
x_train = np.reshape(x,(sample,features,1)) #shape = (125973, 18, 1)
#Test: convert 2D to 3D for input RNN
x_test=pd.DataFrame(X_test)
x_test = x_test.values
x_test = np.reshape(x_test,(x_test.shape[0],x_test.shape[1],1))
Model = keras.Sequential([
keras.layers.LSTM(80,input_shape=(features,x_train.shape[2]),
activation='sigmoid',recurrent_activation='hard_sigmoid'),
keras.layers.Dense(1,activation="softmax")
])
Model.compile(optimizer='rmsprop',loss='mse', metrics=['accuracy'])
#Training the model
Model.fit(x_train, y, epochs=10, batch_size= 32)
Model.summary()
# Final evaluation of the model
scores = Model.evaluate(x_test, y_test, verbose=0)
print('\n')
print("Accuracy: %.2f%%" % (scores[1]*100))
Epoch 1/10
1074/1074 [==============================] - 92s 83ms/step - loss: 0.4180 - accuracy: 0.5820
Epoch 2/10
1074/1074 [==============================] - 79s 74ms/step - loss: 0.4180 - accuracy: 0.5820
Epoch 3/10
1074/1074 [==============================] - 81s 76ms/step - loss: 0.4180 - accuracy: 0.5820
What is the fix?
Because a "softmax" activation over a single neuron always outputs 1. Your output neuron cannot adjust its value to reduce the loss; mathematically it can only return 1.
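You can verify this numerically: softmax normalizes a vector so its entries sum to 1, so over a length-1 vector the single entry is always exactly 1, whatever the logit. A minimal sketch (the `softmax` helper here is written for illustration, not taken from the question's code):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis:
    # subtract the max before exponentiating, then normalize.
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# With a single output neuron, softmax normalizes a length-1 vector,
# so the result is always exactly 1.0 regardless of the logit value.
for logit in (-10.0, 0.0, 42.0):
    print(softmax(np.array([logit])))  # [1.]
```

The usual fix for binary classification with one output unit is `Dense(1, activation='sigmoid')` compiled with `loss='binary_crossentropy'` (alternatively, keep softmax but use two output units with one-hot labels and `categorical_crossentropy`).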