keras: 1D convolutions with different filter for each sample in mini-batch
Usually we convolve all samples in a mini-batch with the same set of filters. But now I want to use a different filter for each sample in the mini-batch. Is there any way to do this in Keras, especially when the mini-batch size is unknown?
Specifically, my input data has the shape (batch_size, maxlen, input_dim), and I have generated a set of filters of the shape (batch_size, output_dim, kernel_size, input_dim). Can I convolve the inputs with this set of filters?
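For reference, the operation I am after is the following naive per-sample loop (a minimal NumPy sketch of my intent, assuming 'valid' padding; the function name is only illustrative):
import numpy as np
def per_sample_conv1d_reference(x, filters):
    #x: (batch_size, maxlen, input_dim)
    #filters: (batch_size, output_dim, kernel_size, input_dim)
    batch_size, maxlen, input_dim = x.shape
    _, output_dim, kernel_size, _ = filters.shape
    out_len = maxlen - kernel_size + 1              #'valid' padding
    out = np.zeros((batch_size, out_len, output_dim))
    for b in range(batch_size):                     #a different filter bank per sample
        for t in range(out_len):
            window = x[b, t:t+kernel_size, :]       #(kernel_size, input_dim)
            for o in range(output_dim):
                out[b, t, o] = np.sum(window * filters[b, o])
    return out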
This is quite tricky. We get help from K.depthwise_conv2d (the only convolution that treats channels individually): we turn the samples into channels, let each channel produce its desired outputs, and then reshape back to the expected layout.
So, the idea is:
1 - Transform the input shape (reordering properly):
#from `(batch_size, maxlen, input_dim)`
#to `(1, maxlen, input_dim, batch_size)`
x = K.expand_dims(x, axis=0)
x = K.permute_dimensions(x, (0,2,3,1))
2 - The filters must have the shape (kernel_size, input_dim, batch_size, output_dim):
#transform your kernels:
filters = K.permute_dimensions(filters, (2, 3, 0, 1))
3 - Transform the results (reordering properly):
#from `(1, result_len, 1, batch_size * output_dim)`
#to `(batch_size, result_len, output_dim)`
results = K.reshape(results, (output_length, -1, output_dim)) #-1 for batch size
results = K.permute_dimensions(results, (1,0,2))
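To make the reshaping concrete, here is the shape trace for the example dimensions used below (batch_size=7, maxlen=11, input_dim=3, kernel_size=2, output_dim=5); it is only an illustration of steps 1-3:
#input:                  (7, 11, 3)
#after step 1:           (1, 11, 3, 7)
#filters after step 2:   (2, 3, 7, 5)
#after depthwise_conv2d: (1, 10, 1, 35)   #10 = 11 - 2 + 1, 35 = 7 * 5
#after reshape:          (10, 7, 5)
#after final permute:    (7, 10, 5)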
Here is example code:
import numpy as np
from keras.layers import *
import keras.backend as K
from keras.models import Model
#dimensions
length = 11 #maxlen
features = 3 #input_dim
filtersize = 2 #kernel_size
out_dim = 5 #output_dim
samples = 7 #batch_size
#keep track of output length for reshaping
outlen = length - filtersize + 1
#creating dummy filters with the desired shape
npfilters = np.arange(features*filtersize*out_dim*samples)
npfilters = npfilters.astype(np.float64)
npfilters = npfilters.reshape((filtersize, features, samples, out_dim))
kerasfilters = K.variable(npfilters)
#function that performs the convolution
def sample_wise_conv1d(x):
    #reshape and reorder the inputs properly
    x = K.expand_dims(x, axis=0)                        #(1, samples, length, features)
    x = K.permute_dimensions(x, (0, 2, 3, 1))           #(1, length, features, samples)
    print('in shape', K.int_shape(x))
    #perform the convolution
    print("filter shape", K.int_shape(kerasfilters))
    results = K.depthwise_conv2d(x, kerasfilters)       #(1, outlen, 1, samples*out_dim)
    print('out shape', K.int_shape(results))
    #reshape and reorder the results properly
    results = K.reshape(results, (outlen, samples, out_dim))
    results = K.permute_dimensions(results, (1, 0, 2))  #(samples, outlen, out_dim)
    print('final shape', K.int_shape(results))
    return results
#creating a model that performs the operation
inputs = Input((length, features))
outputs = Lambda(sample_wise_conv1d)(inputs)
model = Model(inputs, outputs)
#predicting from the model
inputdata = np.arange(samples*length*features).reshape((samples, length, features))
results = model.predict(inputdata)
print(results.shape)
Testing the code:
#creating a single conv1D model for each sample
for i in range(samples):
    #get the respective input sample and filter
    x = inputdata[i:i+1]
    filts = npfilters[:,:,i,:]
    print(filts.shape)
    #make a model with conv1d
    in1D = Input((length, features))
    out1D = Lambda(lambda x: K.conv1d(x, K.variable(filts)))(in1D)
    model1D = Model(in1D, out1D)
    #compare this model's predictions with the respective prediction from the function
    pred1D = model1D.predict(x)
    pred2D = results[i]
    print(pred1D == pred2D)
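Side note: since these are float results, np.allclose(pred1D, pred2D) is a more robust check than the element-wise == above.
If the filters are produced inside the model, so that the batch size is not known in advance, the same trick can in principle be wrapped in a Lambda that takes two tensors. The sketch below is untested and reuses the dimensions defined above; it assumes the filters arrive with the shape (batch_size, output_dim, kernel_size, input_dim) from the question, and that the backend accepts a symbolic (unknown) channel dimension in K.depthwise_conv2d:
#a minimal, untested sketch: the filters come in as a second model input
def sample_wise_conv1d_two_inputs(tensors):
    x, filters = tensors                                   #x: (batch, length, features)
                                                           #filters: (batch, out_dim, filtersize, features)
    x = K.expand_dims(x, axis=0)                           #(1, batch, length, features)
    x = K.permute_dimensions(x, (0, 2, 3, 1))              #(1, length, features, batch)
    filters = K.permute_dimensions(filters, (2, 3, 0, 1))  #(filtersize, features, batch, out_dim)
    results = K.depthwise_conv2d(x, filters)               #(1, outlen, 1, batch*out_dim)
    results = K.reshape(results, (outlen, -1, out_dim))    #-1 stands in for the batch size
    return K.permute_dimensions(results, (1, 0, 2))        #(batch, outlen, out_dim)
x_in = Input((length, features))
f_in = Input((out_dim, filtersize, features))              #one filter bank per sample
y = Lambda(sample_wise_conv1d_two_inputs)([x_in, f_in])
model2 = Model([x_in, f_in], y)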