I have a sequence of nine 2000-dimensional vectors as output from two bidirectional LSTMs. I'm merging them to get nine 4000-dimensional vectors.
I need to take each of those 4000-dimensional vectors and feed each of them into a shared fully connected layer. How can I do this? Right now I'm reshaping the merge output to feed it into the shared fully connected layer, but I don't know whether this reshape is needed.
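For reference, here is a minimal sketch of the kind of layer sharing I'm after, assuming Keras 1.x; the Input below is a hypothetical stand-in for the merged bi-LSTM output, not my actual network. TimeDistributed applies a single Dense layer, with the same weights, to each of the nine 4000-dim vectors, so no manual slicing is needed:

from keras.layers import Input, Dense, TimeDistributed
from keras.models import Model

seq_in = Input(shape=(9, 4000))  # nine 4000-dim vectors per sample
# one shared Dense layer applied identically to each of the 9 vectors
shared = TimeDistributed(Dense(80, activation='relu'))(seq_in)  # -> (None, 9, 80)
toy_model = Model(input=seq_in, output=shared)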
I'm getting the error below when I try to model the entire network to take multiple inputs and produce multiple outputs, as mentioned in this link.
The full code can be found here.
from keras.layers import merge, Reshape, Dense
from keras.models import Model

# we can then concatenate the two vectors:
N = 3
merge_cv = merge([top_out, btm_out], mode='concat')  # concat_axis=2 or -1 (last dim axis)
cv = Reshape((9, 1, 4000))(merge_cv)  # we want 9 vectors of dimension 4000 each for sharedfc_out below

# number of output classes per cell
n_classes = 80
sharedfc_out = Dense(output_dim=n_classes, input_dim=4000, activation='relu')

# partial counts
# pc = np.ndarray(shape=(1, n_classes), dtype=float)
# cells_pc = np.array([[pc for j in range(N)] for i in range(N)])
outpc = []
for i in range(N):
    for j in range(N):
        # cells_pc[i][j] = sharedfc_out(cv[N*i+j])
        outpc.append(sharedfc_out(cv[0][N*i+j]))  # plain tensor slicing -- see the TypeError below

# out = merge(outpc, mode='concat')
# out2 = Reshape(720)(out)
model = Model(input=cells_in, output=outpc)
Dimensions of the bi-LSTM output:
>>> merge_cv.shape
TensorShape([Dimension(1), Dimension(None), Dimension(4000)])
>>> cv.shape
TensorShape([Dimension(None), Dimension(9), Dimension(1), Dimension(4000)])
For the last line I'm getting a TypeError:
TypeError Traceback (most recent call last)
in ()
----> 1 model = Model(input=cells_in, output=outpc)
/home/jkl/anaconda3/lib/python3.5/site-packages/keras/engine/topology.py in __init__(self, input, output, name)
1814 cls_name = self.__class__.__name__
1815 raise TypeError('Output tensors to a ' + cls_name + ' must be '
-> 1816 'Keras tensors. Found: ' + str(x))
1817 # Build self.output_layers:
1818 for x in self.outputs:
TypeError: Output tensors to a Model must be Keras tensors. Found: Tensor("Relu_9:0", shape=(1, 80), dtype=float32)
So in the end it turned out that the problem lay in wrong list slicing, which ended up passing None as a layer into a list that was then merged into one input. After repairing this and making the slicing consistent, the problem was solved.
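For anyone hitting the same TypeError: the traceback shows Model() rejecting a raw backend tensor, because slicing a Keras tensor with plain Python indexing strips the Keras metadata. A sketch of one way to keep each slice a Keras tensor, assuming Keras 1.x (the take_vec helper is hypothetical, not from the original code):

from keras.layers import Lambda

def take_vec(k):
    # select the k-th 4000-dim vector from cv, shape (None, 9, 1, 4000);
    # wrapping the slice in a Lambda keeps the result a Keras tensor
    return Lambda(lambda t: t[:, k, 0, :], output_shape=(4000,))

outpc = [sharedfc_out(take_vec(N * i + j)(cv))
         for i in range(N) for j in range(N)]
model = Model(input=cells_in, output=outpc)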