Keras - Is there a way to reduce the gap between categorical_accuracy and val_categorical_accuracy?

I am trying to build and train an LSTM neural network.

Here is my code (abridged):

import os
import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.callbacks import TensorBoard
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

X_train, X_test, y_train, y_test = train_test_split(np.array(sequences), to_categorical(labels).astype(int), test_size=0.2)
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.2)

log_dir = os.path.join('Logs')
tb_callback = TensorBoard(log_dir=log_dir)

model = Sequential()
model.add(LSTM(64, return_sequences=True, activation='tanh', input_shape=(60,1662)))
model.add(LSTM(128, return_sequences=True, activation='tanh', dropout=0.31))
model.add(LSTM(64, return_sequences=False, activation='tanh'))
model.add(Dense(32, activation='relu'))
model.add(Dense(len(actions), activation='softmax'))

model.compile(optimizer='Adam', loss='categorical_crossentropy', metrics=['categorical_accuracy'])

val_dataset = tf.data.Dataset.from_tensor_slices((X_val, y_val)) # default slice percentage check 
val_dataset = val_dataset.batch(256)

model.fit(X_train, y_train, batch_size=256, epochs=250, callbacks=[tb_callback], validation_data=val_dataset)

Model fit results:

Epoch 248/250
8/8 [==============================] - 2s 252ms/step - loss: 0.4563 - categorical_accuracy: 0.8641 - val_loss: 2.1406 - val_categorical_accuracy: 0.6104
Epoch 249/250
8/8 [==============================] - 2s 255ms/step - loss: 0.4542 - categorical_accuracy: 0.8672 - val_loss: 2.2365 - val_categorical_accuracy: 0.5667
Epoch 250/250
8/8 [==============================] - 2s 234ms/step - loss: 0.4865 - categorical_accuracy: 0.8562 - val_loss: 2.1668 - val_categorical_accuracy: 0.5875

I want to reduce the gap between categorical_accuracy and val_categorical_accuracy.

How can I do this?

Thank you for reading my question.

When there is such a large gap between your training and validation accuracy, it means your model is overfitting.

So look into ways to prevent overfitting. Usually, the first thing to try is adding more data to your dataset.

It won’t work every time, but training with more data can help algorithms detect the signal better.

Try to stop training before the model overfits.
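One way to do this in Keras is with the built-in `EarlyStopping` callback, which halts training once the validation loss stops improving. A minimal sketch (the `patience` value of 15 is an assumption; tune it for your data):

```python
from tensorflow.keras.callbacks import EarlyStopping

# Stop when val_loss has not improved for 15 epochs,
# and roll back to the weights from the best epoch.
early_stop = EarlyStopping(
    monitor='val_loss',
    patience=15,
    restore_best_weights=True,
)

# Then pass it alongside your existing callback:
# model.fit(X_train, y_train, batch_size=256, epochs=250,
#           callbacks=[tb_callback, early_stop], validation_data=val_dataset)
```

With `restore_best_weights=True` you keep the model from the epoch with the lowest val_loss, not the (likely overfit) final epoch.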

Another approach is to stop training earlier and to lower the learning rate.
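For example, you could start from a smaller learning rate than Adam's default of 1e-3, and use `ReduceLROnPlateau` to shrink it further whenever the validation loss stalls. The specific values (1e-4 starting rate, halving with patience 10) are assumptions to illustrate the idea:

```python
from tensorflow.keras.callbacks import ReduceLROnPlateau
from tensorflow.keras.optimizers import Adam

# Halve the learning rate whenever val_loss plateaus for 10 epochs.
reduce_lr = ReduceLROnPlateau(
    monitor='val_loss',
    factor=0.5,
    patience=10,
    min_lr=1e-6,
)

# Compile with an explicit, smaller learning rate instead of the string 'Adam':
# model.compile(optimizer=Adam(learning_rate=1e-4),
#               loss='categorical_crossentropy',
#               metrics=['categorical_accuracy'])
# model.fit(..., callbacks=[tb_callback, reduce_lr], ...)
```

A lower learning rate tends to make validation loss less erratic, which also makes early stopping more reliable.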