Using a custom layer to define connectivity in tensorflow / tflearn

I want to specify the connectivity between activation nodes using a matrix, rather than using a fully connected layer. For example:

I have a 20-node layer connected to a 10-node layer. With a typical fully connected layer, my W matrix is 20 x 10, with a b vector of size 10.

My activation looks like relu(Wx + b).

If I have a matrix of ones and zeros the same size as W, call it F, I can take the element-wise product of W and F to remove connections between the first layer (20 nodes) and the second layer (10 nodes).
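As a quick sanity check (a NumPy sketch, not from the original post; the random mask and vectors are illustrative), masking W element-wise with a 0/1 matrix F zeroes exactly those weights, so the corresponding node pairs no longer influence the output:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((20, 10))                # full 20 x 10 weight matrix
F = (rng.random((20, 10)) > 0.5).astype(float)   # 0/1 connectivity mask, same shape as W
x = rng.standard_normal(20)                      # one 20-node input
b = np.zeros(10)

masked = F * W                                   # element-wise product removes connections
out = np.maximum(masked.T @ x + b, 0)            # relu((F*W)^T x + b), a 10-node output

# every weight where F is zero is gone from the computation
assert np.all(masked[F == 0] == 0)
```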

Here is my current code:

F.shape
# (20, 10)
import tflearn
import tensorflow as tf

input = tflearn.input_data(shape=[None, num_input])

first = tflearn.fully_connected(input, 20, activation='relu')
# Here is where I want to use a custom function, that uses my F matrix
# I dont want the second layer to be fully connected to the first, 
# I want only connections that are ones (and not zeros) in F

# Currently:
second = tflearn.fully_connected(first, 10, activation='relu')
# What I want:
second = tflearn.custom_layer(first, my_fun)

where my_fun gives me: relu((F*W)X + b), with F*W being the element-wise product.

How can I create this function? I can't seem to find a tflearn example of how this is done, but I also know that tflearn allows basic tensorflow functions.

It is hard to do this strictly in tflearn, but if you are willing to use basic tensorflow operations, it is straightforward:

F.shape
# (20, 10)
import tflearn
import tensorflow as tf

input = tflearn.input_data(shape=[None, num_input])
tf_F = tf.constant(F, dtype=tf.float32)  # mask must match W's dtype for the multiply

first = tflearn.fully_connected(input, 20, activation='relu')
# Here is where I want to use a custom function, that uses my F matrix
# I want only connections that are ones (and not zeros) in F

# Old:
# second = tflearn.fully_connected(first, 10, activation='relu')
# Solution:
W = tf.Variable(tf.random_uniform([20, 10]), name='Weights')
b = tf.Variable(tf.zeros([10]), name='biases')
W_filtered = tf.multiply(tf_F, W)  # element-wise mask: zeros in F cut connections
second = tf.nn.relu(tf.matmul(first, W_filtered) + b)  # first is [None, 20], so it goes on the left
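One subtlety worth noting (a NumPy sketch under the assumption of plain gradient descent, not part of the answer above): because the mask is applied inside the graph as F * W, the gradient that flows back to W is also multiplied by F, so the pruned connections receive no updates and stay pruned during training:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((20, 10))
F = (rng.random((20, 10)) > 0.5).astype(float)   # 0/1 connectivity mask
x = rng.standard_normal((1, 20))                 # one batch-major example, as in the TF code

# forward pass: y = x @ (F * W); loss = 0.5 * sum(y**2)
y = x @ (F * W)

# backward pass: dL/d(F*W) = x^T @ y, and the chain rule
# through the element-wise product reapplies the mask
grad_W = (x.T @ y) * F

W -= 0.1 * grad_W                                # gradient step leaves masked entries untouched
assert np.all(grad_W[F == 0] == 0)
```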