Suppose I have training data X, Y where X has shape (n,m,p)
I want to set up a neural network that applies an RNN (followed by a dense layer), call it f, to each i-slice X[i] of shape (m, p), producing an output f(X[i]) of shape (p',). The per-slice outputs are then stacked (presumably using tf.map_fn(f, X)) into a tensor of shape (n, p'), which is fed into the next neural network.
Essentially something similar to:

X' = tf.map_fn(f, X)
Y = g(X')
I am having difficulty getting my head around how to prepare my training data: what shape should X, X' (and later Z) be?
Furthermore, what if I wanted to merge X' with another dataset, say Z?

Y = g(X' concat Z)
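To make the shape bookkeeping concrete, here is a minimal NumPy sketch of the pipeline described above. The function f here is just a hypothetical stand-in (a fixed linear projection of the last timestep) for the real RNN + dense layer; the dimensions n, m, p, p', q are example values:

```python
import numpy as np

n, m, p = 4, 10, 3   # samples, timesteps, features per timestep
p_prime = 5          # output width of f
q = 2                # feature width of the extra dataset Z

X = np.random.rand(n, m, p)
Z = np.random.rand(n, q)

# Stand-in for f: maps one (m, p) slice to a (p',) vector
# (here, a fixed linear projection of the last timestep).
W = np.random.rand(p, p_prime)
def f(x_slice):             # x_slice: (m, p)
    return x_slice[-1] @ W  # (p',)

# Apply f to each i-slice and stack -> (n, p')
X_prime = np.stack([f(X[i]) for i in range(n)])

# Merge with Z along the feature axis -> (n, p' + q)
merged = np.concatenate([X_prime, Z], axis=1)

print(X_prime.shape, merged.shape)  # (4, 5) (4, 7)
```

So X stays (n, m, p), X' comes out as (n, p'), Z must share the leading n axis, and the concatenation happens along the feature axis.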
I think you don't need map_fn; you need tf.nn.dynamic_rnn instead. It takes an RNN cell (so it knows what the output and the state are) and returns the outputs for every timestep, stacked into a single tensor, along with the final state.
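A sketch of this in modern TensorFlow, where the Keras RNN layers play the role of dynamic_rnn and already operate on the whole (n, m, p) batch at once, so no map_fn is needed. The layer sizes and names here are illustrative, not taken from the question:

```python
import numpy as np
import tensorflow as tf

n, m, p = 4, 10, 3   # samples, timesteps, features per timestep
p_prime = 5          # width of f's output per sample
q = 2                # feature width of the extra dataset Z

X = np.random.rand(n, m, p).astype("float32")
Z = np.random.rand(n, q).astype("float32")

# f: an RNN over the time axis followed by a dense layer.
# The RNN layer consumes the full (n, m, p) batch and, with
# return_sequences=False (the default), emits only the last
# timestep's output -> (n, units).
rnn = tf.keras.layers.SimpleRNN(8)        # (n, m, p) -> (n, 8)
dense_f = tf.keras.layers.Dense(p_prime)  # (n, 8)    -> (n, p')

X_prime = dense_f(rnn(X))                 # shape (n, p')

# Merge with Z along the feature axis, then run g on the result.
merged = tf.concat([X_prime, Z], axis=1)  # shape (n, p' + q)
g = tf.keras.layers.Dense(1)
Y = g(merged)                             # shape (n, 1)

print(X_prime.shape, merged.shape, Y.shape)
```

In TF1 the equivalent step is tf.nn.dynamic_rnn(cell, X), whose outputs tensor has shape (n, m, units); slicing out the last timestep gives the (n, units) tensor to feed the dense layer.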