
Keras Lambda Layer & Theano Code: Accessing dimensions and values of TensorVariable

First off, I am quite new to Keras and Theano. I am implementing a CNN that uses local response normalization (LRN). As far as I know, no such layer exists in the core functionality of either Keras or Theano, so I tried to implement LRN myself using a Keras Lambda layer.

```python
import theano.tensor as T
from theano import function

def local_response_normalization(x):
    # LRN parameters
    k = 2
    n = 5
    alpha = 0.0001
    beta = 0.75

    result = x.eval()
    x_tmp = x.eval()

    # building functions for the theano computation graph
    scalar_op = T.dscalar('scalar_op')
    matrix1_op = T.dmatrix('matrix1_op')
    matrix2_op = T.dmatrix('matrix2_op')
    mul_result = scalar_op * matrix1_op
    add_result = scalar_op + matrix1_op
    pow_result = matrix1_op ** scalar_op
    div_result = matrix1_op / matrix2_op

    sc_mat_mul_f = function([scalar_op, matrix1_op], mul_result)
    sc_mat_add_f = function([scalar_op, matrix1_op], add_result)
    sc_mat_pow_f = function([scalar_op, matrix1_op], pow_result)
    mat_div_f = function([matrix1_op, matrix2_op], div_result)

    # x is supposed to be a 3-dimensional tensor (a x b x c)
    a_dim = x_tmp.shape[0]
    b_dim = x_tmp.shape[1]
    c_dim = x_tmp.shape[2]

    # iterating through channels
    for i in range(0, a_dim):
        j_l = max(0, i - n // 2)          # j_l(ower_bound)
        j_u = min(a_dim - 1, i + n // 2)  # j_u(pper_bound)

        x_tmp = x.eval()
        # retrieving set of local 'neurons'
        x_tmp = x_tmp[j_l:j_u + 1, :, :]
        # building squared sum
        x_tmp = T.sqr(x_tmp)              # TensorVariable
        x_tmp = T.sum(x_tmp, axis=0)      # axis no. 0 = 'channel' axis
        # x_tmp is now 2-dimensional
        x_tmp = sc_mat_mul_f(alpha, x_tmp.eval())
        x_tmp = sc_mat_add_f(k, x_tmp)
        x_tmp = sc_mat_pow_f(beta, x_tmp)
        x_tmp = mat_div_f(result[i], x_tmp)
        # exchanging channel i with x_tmp ( = LRN )
        result[i] = x_tmp

    return result
```

I integrate the layer into the model with model.add(local_response_normalization, ...). When I try to compile and fit the model, I get:

```
theano.gof.fg.MissingInputError: Input 0 of the graph (indices start from 0), used to compute AbstractConv2d{convdim=2, border_mode='half', subsample=(4, 4), filter_flip=True, imshp=(None, 3, 220, 220), kshp=(96, 3, 11, 11), filter_dilation=(1, 1)}(/conv2d_1_input, InplaceDimShuffle{3,2,0,1}.0), was not provided and not given a value. Use the Theano flag exception_verbosity='high', for more information on this error.
```

The main problem seems to be that eval() cannot be called during model compilation. Other than converting the elements of x (the output tensor of the conv2d layer) to a NumPy array with eval(), I could not find any way to access and manipulate them, but that obviously does not work here. It seems I am missing the central concept behind Lambda layers and TensorVariables.
I have spent the last two days on this and I am really stuck.
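The concept the question circles around can be illustrated without Theano: a symbolic graph only *records* operations against a placeholder, and concrete values exist only once an input is fed in; asking for a value earlier is exactly the MissingInputError above. A rough, purely illustrative analogy in plain Python (all names hypothetical):

```python
# Tiny analogy for a symbolic graph: operations are recorded against a
# placeholder and computed only once a concrete input is supplied.
def placeholder():
    ops = []
    def apply_op(fn):   # record an operation ("build the graph")
        ops.append(fn)
    def run(value):     # supply an input ("call the compiled function")
        for fn in ops:
            value = fn(value)
        return value
    return apply_op, run

apply_op, run = placeholder()
apply_op(lambda v: v * v)   # like T.sqr(x): no data is touched yet
apply_op(lambda v: v + 2)   # like k + x: still no data
# run() is the only point where numbers exist; calling eval() before an
# input is bound corresponds to running the graph with no input.
print(run(3))  # 3*3 + 2 = 11
```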

I found a way to do all the computations for the LRN without evaluating the TensorVariables. In hindsight it is fairly obvious: the same operations just have to be applied symbolically to the tensor itself instead of to eval()'d NumPy arrays.
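The answer does not show the final code, but the underlying math is the AlexNet-style LRN, b_i = a_i / (k + alpha * sum over the local channel window of a_j^2)^beta. It can be checked with a plain NumPy reference implementation that mirrors the loop from the question on concrete arrays; a symbolic version would apply the same operations (T.sqr, T.sum, +, *, **, /) directly to the tensor, with no eval() and no separately compiled helper functions. This is a sketch, not the asker's actual solution:

```python
import numpy as np

def lrn_reference(x, k=2, n=5, alpha=1e-4, beta=0.75):
    """Reference LRN over axis 0 (channels) of a 3-D array (c, h, w).

    Mirrors the per-channel loop in the question, but on concrete
    NumPy arrays; a Lambda layer would express the same ops
    symbolically on the tensor itself.
    """
    c = x.shape[0]
    out = np.empty_like(x, dtype=np.float64)
    for i in range(c):
        j_l = max(0, i - n // 2)          # lower bound of channel window
        j_u = min(c - 1, i + n // 2)      # upper bound of channel window
        # squared sum over the local channel window
        sq_sum = np.sum(np.square(x[j_l:j_u + 1]), axis=0)
        out[i] = x[i] / (k + alpha * sq_sum) ** beta
    return out
```

With alpha = 0 this degenerates to x / k**beta, which gives a quick sanity check of the windowing and broadcasting.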