How to change input dimensions for LSTM layer

I need to build a model with two dropout layers and two LSTM layers. Unfortunately, I have a problem with the input shape going into my second LSTM layer. After searching for the problem, I found out that I need to change the input dimensions, but I don't know how to do that. I found an option that requires using a Lambda layer, but I can't import it into my environment (it's a Coursera environment). Do you have any suggestions on how to deal with my error?

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dropout, Dense
from tensorflow.keras import regularizers

model = Sequential()
Layer1 = model.add(Embedding(total_words, 64, input_length=max_sequence_len-1))
Layer2 = model.add(Bidirectional(LSTM(20)))
Layer3 = model.add(Dropout(.03))
Layer4 = model.add(LSTM(20))
# A Dense layer including regularizers
Layer5 = model.add(Dense(total_words,
    kernel_regularizer=regularizers.l1_l2(l1=1e-5, l2=1e-4),
    bias_regularizer=regularizers.l2(1e-4),
    activity_regularizer=regularizers.l2(1e-5)))
Layer6 = model.add(Dense(total_words, activation='softmax'))

# Pick an optimizer
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()

Error:

ValueError: Input 0 of layer lstm_20 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 40]

Thanks to @Marco Cerliani and @Joanna Kastelik for the updates.

For the benefit of the community, the solution is provided here using sample data, as shown below.

import tensorflow as tf
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dropout, Dense
from tensorflow.keras.regularizers import l1_l2, l2

total_words = 478
max_sequence_len = 90

model = Sequential()
Layer1 = model.add(Embedding(total_words, 64, input_length=max_sequence_len-1))
Layer2 = model.add(Bidirectional(LSTM(20)))
Layer3 = model.add(Dropout(.03))
Layer4 = model.add(LSTM(20))
# A Dense layer including regularizers
Layer5 = model.add(Dense(total_words,
    kernel_regularizer=l1_l2(l1=1e-5, l2=1e-4),
    bias_regularizer=l2(1e-4),
    activity_regularizer=l2(1e-5)))
Layer6 = model.add(Dense(total_words, activation='softmax'))

# Pick an optimizer
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()

Output:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-1-8ce04225c92d> in <module>()
     12 Layer2 = model.add(Bidirectional(LSTM(20)))
     13 Layer3 = model.add(Dropout(.03))
---> 14 Layer4 = model.add(LSTM(20))
     15 Layer5 = model.add(Dense(total_words, 
     16     kernel_regularizer=l1_l2(l1=1e-5, l2=1e-4),

8 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    221                          'expected ndim=' + str(spec.ndim) + ', found ndim=' +
    222                          str(ndim) + '. Full shape received: ' +
--> 223                          str(tuple(shape)))
    224     if spec.max_ndim is not None:
    225       ndim = x.shape.rank

ValueError: Input 0 of layer lstm_1 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 40)
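
The root cause: by default, Bidirectional(LSTM(20)) returns only the concatenated final hidden states, a 2-D tensor of shape (None, 40), while the LSTM layer that follows expects a 3-D (batch, timesteps, features) input. A minimal sketch (with a dummy input sized to match the Embedding output above) illustrating the shape difference:

import tensorflow as tf
from tensorflow.keras.layers import LSTM, Bidirectional

x = tf.random.normal((1, 89, 64))  # (batch, timesteps, features), the Embedding output shape

# Default: only the last hidden state is returned -> 2-D output
print(Bidirectional(LSTM(20))(x).shape)                         # (1, 40)

# return_sequences=True: the state for every timestep is returned -> 3-D output
print(Bidirectional(LSTM(20, return_sequences=True))(x).shape)  # (1, 89, 40)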

Fixed code:

Your problem is resolved once you add return_sequences=True to the first LSTM layer (i.e., Layer2), so that it outputs the hidden state for every timestep and the following LSTM receives the 3-D input it expects.

model = Sequential()
Layer1 = model.add(Embedding(total_words, 64, input_length=max_sequence_len-1))
Layer2 = model.add(Bidirectional(LSTM(20, return_sequences=True)))
Layer3 = model.add(Dropout(.03))
Layer4 = model.add(LSTM(20))
# A Dense layer including regularizers
Layer5 = model.add(Dense(total_words,
    kernel_regularizer=l1_l2(l1=1e-5, l2=1e-4),
    bias_regularizer=l2(1e-4),
    activity_regularizer=l2(1e-5)))
Layer6 = model.add(Dense(total_words, activation='softmax'))

# Pick an optimizer
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()

Output:

Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_1 (Embedding)      (None, 89, 64)            30592     
_________________________________________________________________
bidirectional_1 (Bidirection (None, 89, 40)            13600     
_________________________________________________________________
dropout_1 (Dropout)          (None, 89, 40)            0         
_________________________________________________________________
lstm_3 (LSTM)                (None, 20)                4880      
_________________________________________________________________
dense (Dense)                (None, 478)               10038     
_________________________________________________________________
dense_1 (Dense)              (None, 478)               228962    
=================================================================
Total params: 288,072
Trainable params: 288,072
Non-trainable params: 0
_________________________________________________________________
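
As a sanity check (my own arithmetic using the standard Keras parameter formulas, not part of the original answer), the counts in the summary add up:

# LSTM: 4 * units * (input_dim + units + 1); Bidirectional doubles it.
# Dense: (input_dim + 1) * units; Embedding: vocab_size * output_dim; Dropout has no params.
embedding = 478 * 64                    # 30,592
bi_lstm   = 2 * 4 * 20 * (64 + 20 + 1)  # 13,600
lstm      = 4 * 20 * (40 + 20 + 1)      # 4,880
dense     = (20 + 1) * 478              # 10,038
dense_1   = (478 + 1) * 478             # 228,962
assert embedding + bi_lstm + lstm + dense + dense_1 == 288072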