I have a network in Keras/TF with two branches:

- one where a short sequence of words is transformed into 300-dim embeddings
- one where the same sequence of words is transformed into ngrams
I then end up with two data structures:
termwords.shape = (?, 42, 300)
termngrams.shape = (?, 42)
(I make sure that both branches have the same length of 42, i.e. at most 42 words and at most 42 ngrams, padding/truncating where needed.) I then need to merge these into one branch to arrive at the prediction layer.
But
merged = merge([termwords, termngrams], mode='concat')
tells me that the ranks don't match. I was hoping concat would let me append the termngrams to the termwords so that I end up with a single structure of shape (?, 42, 301). But I can't find the proper way to express that.
The "rank" error is telling you that the tensors don't have the same number of dimensions. One is 2D and the other is 3D.
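To see the mismatch concretely, here is a quick rank check (a minimal sketch using the tf.keras namespace; the variable names mirror the question's):

```python
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input

termwords = Input(shape=(42, 300))  # rank 3: (batch, 42, 300)
termngrams = Input(shape=(42,))     # rank 2: (batch, 42)

print(K.ndim(termwords), K.ndim(termngrams))  # → 3 2, so they can't be concatenated directly
```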
Use a Lambda layer with expand_dims to add an extra dimension to the 2D one.
import keras.backend as K
from keras.layers import Lambda

# (?, 42) -> (?, 42, 1)
termngrams = Lambda(lambda x: K.expand_dims(x))(termngrams)
Then use a Concatenate() layer (by default it concatenates along the last axis, which is what you want):

from keras.layers import Concatenate

merged = Concatenate()([termwords, termngrams])
(This assumes you're using a functional API Model rather than Sequential models; Sequential models aren't suited to branching.)
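Putting it together, a minimal end-to-end sketch with the functional API (using the tf.keras namespace; the Dense/Flatten prediction head is just a placeholder, not something from the question):

```python
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input, Lambda, Concatenate, Flatten, Dense
from tensorflow.keras.models import Model

# Inputs matching the shapes in the question
words_in = Input(shape=(42, 300))   # (?, 42, 300) word embeddings
ngrams_in = Input(shape=(42,))      # (?, 42) ngram features

# Lift the 2D tensor to 3D: (?, 42) -> (?, 42, 1)
ngrams_3d = Lambda(lambda x: K.expand_dims(x))(ngrams_in)

# Concatenate along the last axis: (?, 42, 300) + (?, 42, 1) -> (?, 42, 301)
merged = Concatenate()([words_in, ngrams_3d])

# Placeholder prediction head
out = Dense(1, activation='sigmoid')(Flatten()(merged))
model = Model(inputs=[words_in, ngrams_in], outputs=out)
```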