Tags: tensorflow, keras, deep-learning, huggingface-transformers, bert-language-model

ValueError: Layer weight shape (30522, 768) not compatible with provided weight shape ()


I extracted word embeddings using BERT and need to feed them into an Embedding layer of a Keras model, but I get the following error:

ValueError: Layer weight shape (30522, 768) not compatible with provided weight shape ()

The relevant part of the model is:

embedding = Embedding(30522, 768, mask_zero=True)(sentence)
model.layers[1].set_weights([embedding_matrix])

Solution

  • You are passing `set_weights` a list that contains a list: `embedding_matrix` is itself a Python list wrapping the array, so the layer receives a nested list instead of a (30522, 768) array. Because the inner element is a plain list rather than an array, Keras cannot read a shape from it and reports the provided weight shape as `()`:

    import numpy as np
    from tensorflow.keras.layers import Input, Embedding
    from tensorflow.keras.models import Model
    
    # the (30522, 768) array is wrapped in an extra Python list
    embedding_matrix = [np.random.uniform(0, 1, (30522, 768))]
    
    sentence = Input((20,))
    embedding = Embedding(30522, 768, mask_zero=True)(sentence)
    model = Model(sentence, embedding)
    
    # set_weights then receives [[array]] and raises the ValueError
    model.layers[1].set_weights([embedding_matrix])
    
    

    whereas you should simply pass a list of arrays, one array per layer weight. Without the extra nesting, Keras receives a single (30522, 768) array and the shapes match:

    import numpy as np
    from tensorflow.keras.layers import Input, Embedding
    from tensorflow.keras.models import Model
    
    # a single (30522, 768) array, no extra wrapping
    embedding_matrix = np.random.uniform(0, 1, (30522, 768))
    
    sentence = Input((20,))
    embedding = Embedding(30522, 768, mask_zero=True)(sentence)
    model = Model(sentence, embedding)
    
    # set_weights takes a list with one array per layer weight
    model.layers[1].set_weights([embedding_matrix])
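
    Since the embedding matrix in the question comes from BERT, the sketch below shows one way to pull the pretrained word-embedding table out of Hugging Face transformers and load it with the same call. It assumes the PyTorch BertModel and the bert-base-uncased checkpoint, whose word-embedding table happens to be exactly (30522, 768); adapt the model name and shapes to whatever checkpoint you actually used.

    import numpy as np
    from transformers import BertModel
    from tensorflow.keras.layers import Input, Embedding
    from tensorflow.keras.models import Model
    
    # assumption: bert-base-uncased, whose word-embedding table is (30522, 768)
    bert = BertModel.from_pretrained("bert-base-uncased")
    embedding_matrix = bert.get_input_embeddings().weight.detach().numpy()
    
    sentence = Input((20,))
    embedding = Embedding(30522, 768, mask_zero=True)(sentence)
    model = Model(sentence, embedding)
    
    # again: a plain list containing one (30522, 768) array
    model.layers[1].set_weights([embedding_matrix])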