Tags: numpy, keras, nlp, word-embedding

Error with input shape in Keras while training CBOW model


I am training a continuous bag-of-words (CBOW) model for word embeddings, where each one-hot vector is a column vector of shape (V, 1). I'm using a generator to produce training examples and labels from the corpus, but I'm getting an error about the input shape.

(Here V = 5778)

Here's my code:

import numpy as np

def windows(words, C):
    # Yield (context_words, center) pairs using a context window of size C
    i = C
    while len(words) - i > C:
        center = words[i]
        context_words = words[i-C:i] + words[i+1:i+C+1]
        i += 1
        yield context_words, center

def one_hot_rep(word, word_to_index, V):
    # One-hot column vector of shape (V, 1) for a single word
    vec = np.zeros((V, 1))
    vec[word_to_index[word]] = 1
    return vec

def context_to_one_hot(words, word_to_index, V):
    # Average of the one-hot vectors of the context words
    arr = [one_hot_rep(w, word_to_index, V) for w in words]
    return np.mean(arr, axis=0)

def get_training_examples(words, C, words_to_index, V):
    for context_words, center_word in windows(words, C):
        yield (context_to_one_hot(context_words, words_to_index, V),
               one_hot_rep(center_word, words_to_index, V))

V = len(vocab)
N = 50

w2i, i2w = build_dict(vocab)

model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=(V, )),
    keras.layers.Dense(units=N, activation='relu'),
    keras.layers.Dense(units=V, activation='softmax')
])

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

model.fit_generator(get_training_examples(data, 2, w2i, V), epochs=5, steps_per_epoch=20)
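A quick shape check with plain NumPy (using a hypothetical five-word toy vocabulary, not the real corpus) shows what the generator's one-hot helper actually produces:

```python
import numpy as np

# Hypothetical toy vocabulary, just to inspect shapes
vocab = ['a', 'b', 'c', 'd', 'e']
word_to_index = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

def one_hot_rep(word, word_to_index, V):
    # Same as in the question: a (V, 1) column vector
    vec = np.zeros((V, 1))
    vec[word_to_index[word]] = 1
    return vec

x = one_hot_rep('b', word_to_index, V)
print(x.shape)  # (5, 1) -- a column vector, not the flat (V,) the model expects
```

Each sample is a 2-D column, so batching adds yet another dimension on top of the (V, 1), which is where the clash with `input_shape=(V,)` comes from.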

Error I'm getting


Solution

  • I figured out what was causing the error. The model expects an input shape of (None, V), where None is the batch size that Keras fills in when training starts. But I was yielding arrays of shape (V, 1), which pick up an extra dimension when collected into a batch, so something like (128, V, 1) was being sent, and that clashes with the expected input shape.
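One way to resolve this is to build the one-hot vectors flat, with shape (V,), so that stacking them into a batch yields the (batch_size, V) shape the model expects. A minimal sketch in plain NumPy (the five-word vocabulary is a made-up example):

```python
import numpy as np

# Hypothetical toy vocabulary for illustration
word_to_index = {'a': 0, 'b': 1, 'c': 2, 'd': 3, 'e': 4}
V = len(word_to_index)

def one_hot_rep(word, word_to_index, V):
    # Flat (V,) vector instead of a (V, 1) column
    vec = np.zeros(V)
    vec[word_to_index[word]] = 1
    return vec

# Stacking flat vectors gives (batch_size, V), matching input_shape=(V,)
batch = np.stack([one_hot_rep(w, word_to_index, V) for w in ['a', 'c']])
print(batch.shape)  # (2, 5)
```

With flat vectors, no extra dimension is introduced during batching, so the `Flatten(input_shape=(V,))` layer receives exactly what it was declared for.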