I have been working with pretrained embeddings (GloVe) and would like to allow these to be fine-tuned. I currently use the embeddings like this:
word_embeddingsA = nn.Embedding(vocab_size, embedding_length)
word_embeddingsA.weight = nn.Parameter(TEXT.vocab.vectors, requires_grad=False)
Should I simply set requires_grad=True to allow the embeddings to be trained, or should I do something like this:
word_embeddingsA = nn.Embedding.from_pretrained(TEXT.vocab.vectors, freeze=False)
Are these two approaches equivalent, and is there a way to check that the embeddings are actually being trained?
Yes, they are equivalent, as stated in the nn.Embedding.from_pretrained docs:

freeze (bool, optional) – If True, the tensor does not get updated in the learning process. Equivalent to embedding.weight.requires_grad = False. Default: True
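
To make the equivalence concrete, here is a small sketch (with a random tensor standing in for TEXT.vocab.vectors) showing that both constructions leave the layer in the same trainable state:

import torch
import torch.nn as nn

vectors = torch.randn(100, 50)  # stand-in for TEXT.vocab.vectors

# Manual route: wrap the pretrained vectors and mark them trainable.
embA = nn.Embedding(100, 50)
embA.weight = nn.Parameter(vectors.clone(), requires_grad=True)

# Convenience route: freeze=False flips weight.requires_grad to True internally.
embB = nn.Embedding.from_pretrained(vectors.clone(), freeze=False)

print(torch.equal(embA.weight, embB.weight))                 # True: same values
print(embA.weight.requires_grad, embB.weight.requires_grad)  # True True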
Note that requires_grad lives on the weight parameter, not on the module itself: if word_embeddingsA.weight.requires_grad == True, the embedding weights will receive gradients and be updated by the optimizer; otherwise they stay frozen.
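
If you want to confirm it end to end, a minimal sketch (again using a random tensor as a stand-in for TEXT.vocab.vectors, with a dummy loss) is to snapshot the weights, run one optimizer step, and check that they moved:

import torch
import torch.nn as nn

vocab_size, embedding_length = 100, 50
pretrained = torch.randn(vocab_size, embedding_length)  # stand-in for TEXT.vocab.vectors

emb = nn.Embedding.from_pretrained(pretrained, freeze=False)
print(emb.weight.requires_grad)  # True -> gradients will flow into the weights

before = emb.weight.clone().detach()

# One dummy training step: embed a batch of token ids and backprop a toy loss.
optimizer = torch.optim.SGD(emb.parameters(), lr=0.1)
ids = torch.randint(0, vocab_size, (8,))
emb(ids).sum().backward()
optimizer.step()

# If the weights moved, the embeddings really are being trained.
print(torch.equal(before, emb.weight))  # False once the step updates them

With freeze=True no gradient is computed for the weights, so they would stay exactly as loaded.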