Multiplying a matrix with another matrix of a different shape in the Keras backend

I am trying to implement an attention model based on this model, but I don't want my model to decide the attention for a frame by looking at that frame alone; I want it to weight each frame based on the whole sequence. So what I am doing is multiplying each frame by a sequence vector, which is the output of an LSTM (return_sequences=False).

These are the modified functions:

def build(self, input_shape):

    # one attention weight per input feature
    self.W = self.add_weight((input_shape[-1],),
                             initializer=self.init,
                             name='{}_W'.format(self.name))
    if self.lstm_size is None:
        self.lstm_size = input_shape[-1]
    # LSTM that summarises the whole sequence into a single vector
    self.vec_lstm = LSTM(self.lstm_size, return_sequences=False)
    self.vec_lstm.build(input_shape)
    # LSTM that produces one output vector per timestep
    self.seq_lstm = LSTM(self.lstm_size, return_sequences=True)
    self.seq_lstm.build(input_shape)
    self.trainable_weights = [self.W] + self.vec_lstm.trainable_weights + self.seq_lstm.trainable_weights
    super(Attention2, self).build(input_shape)  # Be sure to call this somewhere!

def call(self, x, mask=None):
    vec = self.vec_lstm(x)   # (batch, lstm_size): one summary vector per sequence
    seq = self.seq_lstm(x)   # (batch, timesteps, lstm_size): one vector per frame

    eij = ...  # combine seq and vec somehow?
    eij = K.dot(eij,self.W)
    eij = K.tanh(eij)
    a = K.exp(eij)
    a /= K.cast(K.sum(a, axis=1, keepdims=True) + K.epsilon(), K.floatx())
    a = K.expand_dims(a)
    weighted_input = x * a
    attention = K.sum(weighted_input, axis=1)
    return attention

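For context, here is roughly how I plan to plug this layer into a model; the dimensions, the LSTM width of 64, the number of classes and the Attention2 constructor arguments below are just placeholders, since they are not part of the snippet above:

from keras.models import Model
from keras.layers import Input, LSTM, Dense

# hypothetical dimensions
sequence_length, frame_size, num_classes = 20, 128, 10

inputs = Input(shape=(sequence_length, frame_size))
features = LSTM(64, return_sequences=True)(inputs)    # one feature vector per frame
attended = Attention2()(features)                     # collapses the time axis via attention
outputs = Dense(num_classes, activation='softmax')(attended)
model = Model(inputs, outputs)
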
The original code that combines the 2 matrices is:

import numpy as np

# seq: (batch_size, sequence_length, frame_size), vec: (batch_size, frame_size)
eij = np.zeros((batch_size, sequence_length, frame_size))
for i, one_seq in enumerate(seq):
    for j, timestep in enumerate(one_seq):
        eij[i, j] = timestep * vec[i]

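The double loop above just broadcasts vec over the time axis, so in plain NumPy the same result could also be written without loops (assuming seq has shape (batch_size, sequence_length, frame_size) and vec has shape (batch_size, frame_size)):

import numpy as np

# insert a length-1 time axis into vec so it multiplies every timestep of seq
eij = seq * vec[:, np.newaxis, :]
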
I would really appreciate any help implementing this with the Keras backend.

Thanks!

This seems to give the result I want:

vec = self.vec_lstm(x)
seq = self.seq_lstm(x)
repeat_vec = K.repeat(vec, seq.shape[1])
eij = seq * repeat_vec
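
Wired back into the call method from the question, the placeholder can then be filled roughly as sketched below; if seq.shape[1] is not a plain integer on your backend, the static length from K.int_shape(seq)[1] should work instead (assuming a fixed sequence length), and self.W must match the last dimension of eij, which holds when lstm_size equals the input feature size (the default in build):

def call(self, x, mask=None):
    vec = self.vec_lstm(x)   # (batch, lstm_size): summary of the whole sequence
    seq = self.seq_lstm(x)   # (batch, timesteps, lstm_size): one vector per frame

    # tile the sequence summary along the time axis and combine it with each frame
    repeat_vec = K.repeat(vec, seq.shape[1])
    eij = seq * repeat_vec

    # attention scores and weighted sum, unchanged from the question
    eij = K.dot(eij, self.W)
    eij = K.tanh(eij)
    a = K.exp(eij)
    a /= K.cast(K.sum(a, axis=1, keepdims=True) + K.epsilon(), K.floatx())
    a = K.expand_dims(a)
    weighted_input = x * a
    attention = K.sum(weighted_input, axis=1)
    return attention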