autocorrelation of the input in tensorflow/keras

I have a one-dimensional input signal. I want to compute the autocorrelation as part of the neural network, for further use inside the network. I need to perform a convolution of the input with the input itself.

To perform a convolution in a Keras custom layer / TensorFlow, we need the following parameters: data shape is "[batch, in_height, in_width, in_channels]", filter shape is "[filter_height, filter_width, in_channels, out_channels]".

There is no batch in the filter shape, but I need the input itself to act as the filter.
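To make the mismatch concrete, a minimal sketch (the shapes here are illustrative, not from any actual code):

import tensorflow as tf

# data has a batch dimension, but the filter does not:
data = tf.placeholder(tf.float32, [None, 1, 100, 1])  # [batch, in_height, in_width, in_channels]
filt = tf.placeholder(tf.float32, [1, 100, 1, 1])     # [filter_height, filter_width, in_channels, out_channels]
out = tf.nn.conv2d(data, filt, strides=[1, 1, 1, 1], padding='SAME')
# there is no way to supply a different filter per batch element here,
# which is what convolving each input with itself would require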

You can use tf.nn.conv3d by treating the batch size as the depth:

# treat the batch size as depth
data = tf.reshape(input_data, [1, batch, in_height, in_width, in_channels])
# the kernel must be a 5-D tensor of shape
# [filter_depth, filter_height, filter_width, in_channels, out_channels]
# (filter_data stands for whatever tensor holds your filter values)
kernel = tf.reshape(filter_data, [filter_depth, filter_height, filter_width, in_channels, out_channels])
out = tf.nn.conv3d(data, kernel, [1, 1, 1, 1, 1], padding='SAME')
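For concreteness, a runnable sketch of that reshape trick with arbitrary example shapes (all names and numbers here are illustrative):

import numpy as np
import tensorflow as tf

batch, in_height, in_width, in_channels = 4, 1, 8, 1
input_data = tf.constant(np.random.rand(batch, in_height, in_width, in_channels), tf.float32)

# treat the batch size as depth
data = tf.reshape(input_data, [1, batch, in_height, in_width, in_channels])
# one shared 5-D kernel: [filter_depth, filter_height, filter_width, in_channels, out_channels]
kernel = tf.constant(np.random.rand(1, 1, 3, in_channels, 1), tf.float32)
out = tf.nn.conv3d(data, kernel, [1, 1, 1, 1, 1], padding='SAME')
# out has shape (1, batch, in_height, in_width, 1)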

Here is a possible solution.

By self-convolution, I understand a regular convolution where the filter is exactly the same as the input (if that's not what you mean, pardon my misunderstanding).
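For instance, with NumPy, the full self-convolution of a short signal looks like this (just to pin down the definition):

import numpy as np

signal = np.array([1., 2., 3.])
print(np.convolve(signal, signal, mode='full'))  # [ 1.  4. 10. 12.  9.]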

For that we need a custom function and a Lambda layer.

At first I used padding='same', which makes the output the same length as the input. I'm not sure exactly what output length you want, but if you want more, you should add the padding yourself before doing the convolution. (In the example with length 7, for a full convolution from one end to the other, this manual padding would consist of 6 zeros before and 6 zeros after the input, combined with padding='valid'.) You can find the backend functions here.
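A sketch of that manual padding with the backend (padForFullConv is a hypothetical helper; for length 7 it adds length - 1 = 6 zeros on each side):

import keras.backend as K

def padForFullConv(x, length):
    # hypothetical helper: pad (length-1) zeros before and after the
    # time dimension so that a conv with padding='valid' spans all lags
    return K.temporal_padding(x, padding=(length - 1, length - 1))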

Working example - input (5,7,2):

import numpy as np
from keras.models import Model
from keras.layers import *
import keras.backend as K

batch_size = 5
length = 7
channels = 2
channels_batch = batch_size*channels

def selfConv1D(x):
    #this function unfortunately needs to know previously the shapes
    #mainly because of the for loop, for other lines, there are workarounds
    #but these workarounds are not necessary since we'll have this limitation anyway

    #original x: (batch_size, length, channels)

    #bring channels to the batch position:
    x = K.permute_dimensions(x,[2,0,1]) #(channels, batch_size, length)

    #suppose channels are just individual samples (since we don't mix channels)
    x = K.reshape(x,(channels_batch,length,1))

    #here, we get a copy of x reshaped to match filter shapes:
    filters = K.permute_dimensions(x,[1,2,0])  #(length, 1, channels_batch)

    #now, in the lack of a suitable available conv function, we make a loop
    allChannels = []
    for i in range(channels_batch):

        f = filters[:,:,i:i+1]
        allChannels.append(
            K.conv1d(
                x[i:i+1], 
                f, 
                padding='same', 
                data_format='channels_last'))
                    #although channels_last is my default config, I found this bug: 
                    #https://github.com/fchollet/keras/issues/8183

        #convolution output: (1, length, 1)

    #concatenate all results as samples
    x = K.concatenate(allChannels, axis=0) #(channels_batch,length,1)

    #restore the original form (passing channels to the end)
    x = K.reshape(x,(channels,batch_size,length))
    return K.permute_dimensions(x,[1,2,0]) #(batch_size, length, channels)


#input data for the test:
x = np.array(range(70)).reshape((5,7,2))

#little model that just performs the convolution
inp= Input((7,2))
out = Lambda(selfConv1D)(inp)

model = Model(inp,out)

#checking results
p = model.predict(x)
for i in range(5):
    print("x",x[i])
    print("p",p[i])

TensorFlow now has an auto_correlation function. It should make it into release 1.6. If you build from source, you can use it right now (see the github code).
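A minimal sketch of calling it, assuming it lands under tf.contrib.distributions as in the linked source (the module path and the exact signature are assumptions; check the code):

import numpy as np
import tensorflow as tf

x = tf.constant(np.random.rand(5, 7).astype(np.float32))
# assumed signature: auto_correlation(x, axis=-1, max_lags=None, ...)
acf = tf.contrib.distributions.auto_correlation(x, axis=-1, max_lags=6)

with tf.Session() as sess:
    print(sess.run(acf))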