Scan function in Theano, recurrent neural net

I have been trying to implement an RNN using scan in Theano (example adapted from here: https://github.com/valentin012/conspeech/blob/master/rnn_theano.py):

def forward_prop_step(x_t, s_t_prev, U, V, W):
    u = T.dot(x_t,U)
    s_t = T.tanh(u+T.dot(s_t_prev,W)) 
    o_t = T.nnet.softmax(T.dot(s_t,V))
    return [o_t[0], s_t]
Q = np.zeros(self.hidden_dim)
init = theano.shared(Q)
[o,s], updates = theano.scan(
    forward_prop_step,
    sequences=x,
    outputs_info=[None, dict(initial=init)],
    non_sequences=[U, V, W],
    truncate_gradient=self.bptt_truncate,
    strict=False)
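
(For reference, scan passes arguments to the step function in a fixed order: the slices of sequences first, then the previous value of each output whose outputs_info entry is not None, then the non_sequences. Below is a minimal, self-contained sketch of a plain recurrence illustrating that ordering; the names and sizes here (xx, W_in, W_rec, n_in, n_hid) are toy values, not taken from the code above.)

import numpy as np
import theano
import theano.tensor as T

n_in, n_hid = 3, 4  # toy sizes, for illustration only
W_in  = theano.shared(np.random.randn(n_in, n_hid).astype(theano.config.floatX), name='W_in')
W_rec = theano.shared(np.random.randn(n_hid, n_hid).astype(theano.config.floatX), name='W_rec')
h0    = theano.shared(np.zeros(n_hid, dtype=theano.config.floatX), name='h0')

xx = T.matrix('xx')  # shape (time, n_in)

def step(x_t, h_prev, W_in, W_rec):
    # argument order: sequence slice, then recurrent output, then non_sequences
    return T.tanh(T.dot(x_t, W_in) + T.dot(h_prev, W_rec))

h_seq, _ = theano.scan(step,
                       sequences=xx,
                       outputs_info=[h0],
                       non_sequences=[W_in, W_rec])

f = theano.function([xx], h_seq)
print(f(np.ones((5, n_in), dtype=theano.config.floatX)).shape)  # (5, 4)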

Now, what I am trying to do is implement an RNN in which the output variables directly influence each other (o_{t-1} and o_t are linked through a weight matrix). I tried to implement it like this:

def forward_prop_step(x_t, s_t_prev, o_t_prev, U, V, W, Q):
    u = T.dot(x_t,U)
    s_t = T.tanh(u+T.dot(s_t_prev,W)) 
    o_t = T.nnet.softmax(T.dot(o_t_prev,Q)+T.dot(s_t,V))
    return [o_t[0], s_t, o_t[0]]
R = np.zeros(self.hidden_dim)
init = theano.shared(R)
S = np.zeros(self.word_dim)
init_S = theano.shared(S)
[o,s,op], updates = theano.scan(
    forward_prop_step,
    sequences=x,
    outputs_info=[None, dict(initial=init), dict(initial=init_S)],
    non_sequences=[U, V, W, Q],
    truncate_gradient=self.bptt_truncate,
    strict=False)

However, it does not work, and I don't know how to fix it.

The error message is:

File "theano/scan_module/scan_perform.pyx", line 397, in theano.scan_module.scan_perform.perform (/home/mertens/.theano/compiledir_Linux-3.2--amd64-x86_64-with-debian-7.6--2.7.9-64/scan_perform/mod.cpp:4193) ValueError: Shape mismatch: A.shape[1] != x.shape[0] Apply node that caused the error: CGemv{inplace}(AllocEmpty{dtype='float64'}.0, TensorConstant{1.0}, Q_copy.T, , TensorConstant{0.0}) Toposort index: 10

EDIT: Here is the exact code:

word_dim=3
hidden_dim=4

U = np.random.uniform(-np.sqrt(1./word_dim), np.sqrt(1./word_dim), (word_dim,hidden_dim))
V = np.random.uniform(-np.sqrt(1./hidden_dim), np.sqrt(1./hidden_dim), (hidden_dim,word_dim))
W = np.random.uniform(-np.sqrt(1./hidden_dim), np.sqrt(1./hidden_dim), (hidden_dim, hidden_dim))
Q = np.random.uniform(-np.sqrt(1./word_dim), np.sqrt(1./word_dim), (word_dim, word_dim))

U = theano.shared(name='U', value=U.astype(theano.config.floatX))
V = theano.shared(name='V', value=V.astype(theano.config.floatX))
W = theano.shared(name='W', value=W.astype(theano.config.floatX))
Q = theano.shared(name='Q', value=W.astype(theano.config.floatX))

def forward_prop_step(x_t, o_t_prev, s_t_prev, U, V, W, Q):
        u = T.dot(x_t,U)
        s_t = T.tanh(u+T.dot(s_t_prev,W))
        m = T.dot(o_t_prev,Q)
        mm = T.dot(s_t,V)
        SSS = mm
        o_t = T.nnet.softmax(SSS)
        q_t = o_t[0]
        return [q_t, s_t, m]

R = np.zeros(self.hidden_dim)
init = theano.shared(R)
S = np.zeros(self.word_dim)
init_S = theano.shared(S)
[o,s,loorky], updates = theano.scan(
        forward_prop_step,
        sequences=x,
        outputs_info=[dict(initial=init_S),dict(initial=init),None],
        non_sequences=[U, V, W, Q],
        truncate_gradient=self.bptt_truncate,
        strict=False)

self.my_forward_propagation = theano.function([x], [o,s,loorky])
aaa = np.zeros((1,3))+1
print self.my_forward_propagation(aaa)

When I omit the output m from the return statement (along with the corresponding loorky variable and the last None in outputs_info), everything works fine. With it included, I get the error ValueError: Shape mismatch: A.shape[1] != x.shape[0].
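
One way to localize this kind of shape mismatch is Theano's test-value mechanism. A small sketch, assuming x is the symbolic matrix that is later passed to theano.function above, and that these two lines are placed before the theano.scan call: the graph is then built with eager evaluation, so each expression in forward_prop_step is evaluated on the dummy data as it is constructed, and the offending T.dot raises immediately with an ordinary Python traceback.

# Sketch: enable test values before building the scan graph.
theano.config.compute_test_value = 'raise'
x.tag.test_value = np.ones((1, word_dim)).astype(theano.config.floatX)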

From the implementation as posted it is not clear what is wrong in your code. Could you check this line:

o_t = T.nnet.softmax(T.dot(o_t_prev,Q)+T.dot(s_t,V))

What are the dimensions of Q, and is the result of T.dot(o_t_prev, Q) shape-compatible for adding to T.dot(s_t, V)?
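
A quick check in that direction, as a sketch against the exact code posted above: print the shapes of the shared variables that enter those dot products. As posted, the shared variable Q is created from W (value=W.astype(...)) rather than from the (word_dim, word_dim) array drawn a few lines earlier, so it would not have the (3, 3) shape that T.dot(o_t_prev, Q) needs; o_t_prev has length word_dim = 3, which matches the reported A.shape[1] != x.shape[0].

# Diagnostic sketch against the shared variables defined in the question.
print(U.get_value().shape)  # (3, 4): word_dim x hidden_dim
print(V.get_value().shape)  # (4, 3): hidden_dim x word_dim
print(W.get_value().shape)  # (4, 4): hidden_dim x hidden_dim
print(Q.get_value().shape)  # likely W's (4, 4) shape, not the intended (3, 3)

# A possible fix: build Q's shared variable from the (word_dim, word_dim)
# array itself (Q_init is just a fresh name to avoid the name shadowing above).
Q_init = np.random.uniform(-np.sqrt(1./word_dim), np.sqrt(1./word_dim), (word_dim, word_dim))
Q = theano.shared(name='Q', value=Q_init.astype(theano.config.floatX))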