The documentation for the Embedding
layer (https://www.cntk.ai/pythondocs/layerref.html#embedding) shows that it can be initialized with pretrained embeddings via the weights
parameter, but embeddings supplied this way are held constant and are not updated during training.
Is there a way to initialize the Embedding
layer with pretrained embeddings and still update them during training?
If not, what is the most efficient way to do batched embedding lookups with one-hot vectors?
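For context on the second question: multiplying a batch of one-hot row vectors by an embedding matrix selects the corresponding rows, which is exactly what an embedding lookup computes. A minimal NumPy sketch (all names here are illustrative, not part of the CNTK API):

```python
import numpy as np

vocab_size, embed_dim = 5, 3

# Hypothetical pretrained embedding matrix: one row per vocabulary entry.
embeddings = np.arange(vocab_size * embed_dim, dtype=np.float32).reshape(vocab_size, embed_dim)

# A batch of token ids and their one-hot encodings.
token_ids = np.array([2, 0, 4])
one_hot = np.eye(vocab_size, dtype=np.float32)[token_ids]

# One-hot matmul is equivalent to indexing the embedding rows directly.
looked_up = one_hot @ embeddings
assert np.allclose(looked_up, embeddings[token_ids])
```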
Yes, just pass the initial values to the init
argument instead. That creates a learnable parameter initialized with the array you pass in, so the embeddings are updated during training.