How can I find the correct input size of the LSTM first layer in keras
I am trying to set the correct input shape for the first LSTM layer in Keras, but I am having trouble understanding what the correct input_shape should be.
For print(X_train.shape) I get (9600, 64, 64, 1)
For print(y_train.shape) I get (9600, 15)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout

# Initializing the classifier network
classifier = Sequential()

# Adding the input LSTM layer
classifier.add(LSTM(128, input_shape=(64, 1), return_sequences=True))
classifier.add(Dropout(0.2))
Feel free to ask if you need more information.
This data cannot be passed to an LSTM layer, which expects 3D input. Perhaps try tf.keras.layers.ConvLSTM2D after adding a time-step dimension:
import tensorflow as tf

# Dummy batch: 10 samples, each a length-1 sequence of 224x224 single-channel frames
images = tf.random.uniform((10, 1, 224, 224, 1))

classifier = tf.keras.Sequential([
    tf.keras.layers.ConvLSTM2D(8,
                               kernel_size=(3, 3),
                               input_shape=(1, 224, 224, 1),
                               return_sequences=True)
])

classifier(images)
[[[[-3.53521258e-02, -2.02189311e-02, -2.47801729e-02, ...,
-2.34759413e-03, 4.60262299e-02, 4.76470888e-02],
[ 1.04620471e-03, -9.23185516e-03, 1.37878451e-02, ...,
-4.88127321e-02, 4.20494527e-02, 6.06664363e-03],
[ 1.26057174e-02, 1.07498122e-02, -1.85700115e-02, ...,
-1.49483923e-02, 1.21065099e-02, 1.71790868e-02]...,
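Applied to the data in the question, the same idea means expanding X_train with a time-step axis before the ConvLSTM2D layer. Below is a minimal sketch, assuming X_train is a NumPy array of shape (9600, 64, 64, 1) and the 15 columns of y_train are one-hot class labels; the random stand-in data and the Flatten/Dense head are illustrative assumptions, not part of the original answer.

import numpy as np
import tensorflow as tf

# Stand-in for the real data: 9600 single-channel 64x64 images (hypothetical random values)
X_train = np.random.rand(9600, 64, 64, 1).astype("float32")

# Add a time-step axis: each sample becomes a length-1 sequence of frames
X_train = np.expand_dims(X_train, axis=1)   # -> (9600, 1, 64, 64, 1)

classifier = tf.keras.Sequential([
    tf.keras.layers.ConvLSTM2D(8,
                               kernel_size=(3, 3),
                               input_shape=(1, 64, 64, 1),
                               return_sequences=False),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(15, activation="softmax"),  # assumes y_train holds 15 one-hot classes
])

print(classifier(X_train[:4]).shape)  # (4, 15)

np.expand_dims(X_train, axis=1) turns each 64x64x1 image into a length-1 sequence, which gives ConvLSTM2D the 5D (samples, time, rows, cols, channels) input it expects.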