Error when checking input: expected lstm_28_input to have shape (5739, 8) but got array with shape (1, 8)
I'm getting a Keras dimension error.
The input shapes are as follows:
print(train_X.shape, train_y.shape, test_X.shape, test_y.shape)
Result:
(5739, 1, 8) (5739,) (1435, 1, 8) (1435,)
The model is as follows:
import keras
from keras_self_attention import SeqSelfAttention
from keras.layers import Flatten

batch_size = 128
epochs = 20

model = keras.models.Sequential()
# note: `epochs` (20) is passed here as the number of LSTM units
model.add(keras.layers.LSTM(epochs, input_shape=(train_X.shape[0], train_X.shape[2]), return_sequences=True))
model.add(SeqSelfAttention())  # shown in the summary below but missing from the original snippet
model.add(Flatten())
model.add(keras.layers.Dense(units=1))
model.compile(loss='mse', optimizer='adam')
model.summary()
Result:
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm_33 (LSTM) (None, 5739, 20) 2320
_________________________________________________________________
seq_self_attention_35 (SeqSe (None, 5739, 20) 1345
_________________________________________________________________
flatten_8 (Flatten) (None, 114780) 0
_________________________________________________________________
dense_33 (Dense) (None, 1) 114781
=================================================================
Total params: 118,446
Trainable params: 118,446
Non-trainable params: 0
_________________________________________________________________
But I get an error at the fit step:
history = model.fit(train_X, train_y, epochs=epochs, batch_size=batch_size, validation_data=(test_X, test_y), verbose=2, shuffle=False)
Error:
ValueError: Error when checking input: expected lstm_33_input to have shape (5739, 8) but got array with shape (1, 8)
But the input shape I printed is (5739, 8), so I can't understand where the (1, 8) comes from, or how to fix it.
input_shape=(train_X.shape[0], train_X.shape[2])
print(input_shape)
(5739, 8)
Is the problem with test_X, test_y, or the shape of the training input? How do I fix it?
An LSTM layer in Keras expects each sample in a batch to have shape (n_timesteps, n_features), so input_shape must describe a single sample, not the whole dataset. You passed train_X.shape[0] (the number of samples, 5739) as the first dimension, so Keras checks each incoming sample of shape (1, 8) against the expected (5739, 8); that is where the (1, 8) in the error comes from.
First, reshape your training data to shape (n_data_points, n_timesteps, n_features):
import numpy as np

train_X_ = np.swapaxes(train_X, 1, 2)
train_X_.shape  # now of shape (5739, 8, 1): 8 timesteps of 1 feature each
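Since your original fit call also passes validation_data=(test_X, test_y), test_X needs the same axis swap; a minimal sketch, assuming test_X has the (1435, 1, 8) shape printed above:

test_X_ = np.swapaxes(test_X, 1, 2)
test_X_.shape  # (1435, 8, 1), matching the layout of train_X_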
Then define your model with the correct dimensions:
model = keras.models.Sequential()
# input shape for the LSTM layer will be (8,1). No need to specify the batch shape.
model.add(keras.layers.LSTM(20, input_shape=(train_X_.shape[1], train_X_.shape[2]), return_sequences=True))
model.add(keras.layers.Flatten())
model.add(keras.layers.Dense(1))
model.compile(optimizer='adam', loss='mse')
This will now fit without errors:
model.fit(train_X_, train_y)
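To reproduce the training setup from the question end to end, the same call can take the validation data and hyperparameters you were already using; a sketch assuming test_X_ is the reshaped test array from above:

history = model.fit(train_X_, train_y,
                    epochs=epochs,          # 20, as defined in the question
                    batch_size=batch_size,  # 128
                    validation_data=(test_X_, test_y),
                    verbose=2, shuffle=False)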