Tags: python, theano, recurrent-neural-network

scan function in Theano, recurrent neural net


I've been trying to use scan in Theano to implement an RNN (the example is adapted from here: https://github.com/valentin012/conspeech/blob/master/rnn_theano.py):

import numpy as np
import theano
import theano.tensor as T

def forward_prop_step(x_t, s_t_prev, U, V, W):
    u = T.dot(x_t, U)                      # input projection
    s_t = T.tanh(u + T.dot(s_t_prev, W))   # new hidden state
    o_t = T.nnet.softmax(T.dot(s_t, V))    # softmax returns a 2-D result, hence o_t[0] below
    return [o_t[0], s_t]

Q = np.zeros(self.hidden_dim)              # initial hidden state
init = theano.shared(Q)
[o, s], updates = theano.scan(
    forward_prop_step,
    sequences=x,
    outputs_info=[None, dict(initial=init)],
    non_sequences=[U, V, W],
    truncate_gradient=self.bptt_truncate,
    strict=False)
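
As a side note on how scan wires these pieces together: the step function receives the sequence slices first, then the previous value of every recurrent output (the outputs_info entries that are not None, in order), and finally the non_sequences. A minimal self-contained sketch of that convention (a running sum, with made-up names):

import numpy as np
import theano
import theano.tensor as T

x = T.vector('x')

def step(x_t, acc_prev):
    # x_t comes from sequences, acc_prev from outputs_info
    return acc_prev + x_t

acc, _ = theano.scan(step,
                     sequences=x,
                     outputs_info=T.zeros_like(x[0]))
f = theano.function([x], acc)
print(f(np.arange(5).astype(theano.config.floatX)))  # [ 0.  1.  3.  6. 10.]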

Now, what I tried to do is implement an RNN in which consecutive outputs influence each other directly (o_{t-1} and o_t are linked by a weight matrix Q):

def forward_prop_step(x_t, s_t_prev, o_t_prev, U, V, W, Q):
    u = T.dot(x_t,U)
    s_t = T.tanh(u+T.dot(s_t_prev,W)) 
    o_t = T.nnet.softmax(T.dot(o_t_prev,Q)+T.dot(s_t,V))
    return [o_t[0], s_t, o_t[0]]
R = np.zeros(self.hidden_dim)
init = theano.shared(R)
S = np.zeros(self.word_dim)
init_S = theano.shared(S)
[o,s,op], updates = theano.scan(
    forward_prop_step,
    sequences=x,
    outputs_info=[None, dict(initial=init), dict(initial=init_S)],
    non_sequences=[U, V, W, Q],
    truncate_gradient=self.bptt_truncate,
    strict=False)

However, it doesn't work and I don't know how to fix it.

The error message is:

File "theano/scan_module/scan_perform.pyx", line 397, in theano.scan_module.scan_perform.perform (/home/mertens/.theano/compiledir_Linux-3.2--amd64-x86_64-with-debian-7.6--2.7.9-64/scan_perform/mod.cpp:4193) ValueError: Shape mismatch: A.shape[1] != x.shape[0] Apply node that caused the error: CGemv{inplace}(AllocEmpty{dtype='float64'}.0, TensorConstant{1.0}, Q_copy.T, , TensorConstant{0.0}) Toposort index: 10

Edit: This is the exact code:

word_dim=3
hidden_dim=4

U = np.random.uniform(-np.sqrt(1./word_dim), np.sqrt(1./word_dim), (word_dim,hidden_dim))
V = np.random.uniform(-np.sqrt(1./hidden_dim), np.sqrt(1./hidden_dim), (hidden_dim,word_dim))
W = np.random.uniform(-np.sqrt(1./hidden_dim), np.sqrt(1./hidden_dim), (hidden_dim, hidden_dim))
Q = np.random.uniform(-np.sqrt(1./word_dim), np.sqrt(1./word_dim), (word_dim, word_dim))

U = theano.shared(name='U', value=U.astype(theano.config.floatX))
V = theano.shared(name='V', value=V.astype(theano.config.floatX))
W = theano.shared(name='W', value=W.astype(theano.config.floatX))
Q = theano.shared(name='Q', value=W.astype(theano.config.floatX))

def forward_prop_step(x_t, o_t_prev, s_t_prev, U, V, W, Q):
        u = T.dot(x_t,U)
        s_t = T.tanh(u+T.dot(s_t_prev,W))
        m = T.dot(o_t_prev,Q)
        mm = T.dot(s_t,V)
        SSS = mm
        o_t = T.nnet.softmax(SSS)
        q_t = o_t[0]
        return [q_t, s_t, m]

R = np.zeros(self.hidden_dim)
init = theano.shared(R)
S = np.zeros(self.word_dim)
init_S = theano.shared(S)
[o,s,loorky], updates = theano.scan(
        forward_prop_step,
        sequences=x,
        outputs_info=[dict(initial=init_S),dict(initial=init),None],
        non_sequences=[U, V, W, Q],
        truncate_gradient=self.bptt_truncate,
        strict=False)

self.my_forward_propagation = theano.function([x], [o,s,loorky])
aaa = np.zeros((1,3))+1
print self.my_forward_propagation(aaa)
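
A quick way to sanity-check the shared variables before calling the compiled function is to print their shapes with get_value(), which Theano shared variables provide (a small debugging sketch):

for name, var in [('U', U), ('V', V), ('W', W), ('Q', Q)]:
    print(name, var.get_value().shape)

With the values above, Q would report W's (4, 4) shape rather than the expected (3, 3).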

When I omit the output m from the return statement (and, correspondingly, the loorky variable plus the last None in outputs_info), everything works fine. When m is included, I get the error ValueError: Shape mismatch: A.shape[1] != x.shape[0].


Solution

  • From the implementation alone it is hard to tell what's wrong in your code. Check this line:

    o_t = T.nnet.softmax(T.dot(o_t_prev,Q)+T.dot(s_t,V))

    What is Q's dimension, and is T.dot(o_t_prev, Q) shape-compatible with T.dot(s_t, V) so the two can be added to form the softmax input?
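
    Following that hint, a likely culprit in the posted code (an educated guess, not something stated explicitly above) is the line that creates the shared variable Q from W's value, which gives Q the (hidden_dim, hidden_dim) = (4, 4) shape of W rather than the (word_dim, word_dim) = (3, 3) shape of the array prepared for it. That is exactly the (3,) vs. (4, 4) mismatch the CGemv node reports for Q_copy.T:

    # posted: Q silently takes W's value, and with it W's (hidden_dim, hidden_dim) shape
    Q = theano.shared(name='Q', value=W.astype(theano.config.floatX))

    # presumably intended: use the (word_dim, word_dim) array prepared for Q
    Q = theano.shared(name='Q', value=Q.astype(theano.config.floatX))

    This would also explain the observation in the edit: Theano only compiles the part of the graph reachable from scan's returned outputs, so dropping m (and with it T.dot(o_t_prev, Q)) from the return list removes the mismatched product entirely, and the error disappears.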