I want to train a label embedding myself (yes, a label embedding is like a word embedding, but the input is a one-hot vector of the label).
When I found chainer.links.EmbedID and looked at the example in the official documentation, I saw that a W matrix must be passed to it.
How do I train the embedding W matrix so that I can later use it to train another model?
I mean, how do I train an embedding vector representation of a word/label?
You don't need to take two steps (train the embedding, then train another model); you can train the embedding in an end-to-end way. Once you obtain the embedding vector from the categorical value, you can connect it to a usual neural network and train against a loss as usual.
Word2vec is one official Chainer example that uses EmbedID.
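Here is a minimal sketch of training an EmbedID end-to-end on a toy classification task (the layer sizes, the Linear head, and the random data are my own assumptions, not part of your setup). After training, `embed.W` holds the learned label vectors, which you can reuse elsewhere:

```python
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L

# Assumed sizes for illustration: 10 distinct labels, 5-dim embeddings, 3 target classes.
N_LABELS, EMBED_DIM, N_CLASSES = 10, 5, 3

class LabelModel(chainer.Chain):
    def __init__(self):
        super(LabelModel, self).__init__()
        with self.init_scope():
            # EmbedID takes integer label IDs (not one-hot vectors) and looks up
            # trainable rows of its weight matrix W.
            self.embed = L.EmbedID(N_LABELS, EMBED_DIM)
            self.fc = L.Linear(EMBED_DIM, N_CLASSES)

    def __call__(self, x):
        h = self.embed(x)   # (batch, EMBED_DIM)
        return self.fc(h)   # (batch, N_CLASSES)

model = LabelModel()
optimizer = chainer.optimizers.Adam()
optimizer.setup(model)

# Toy data just for the sketch: random label IDs and random target classes.
x = np.random.randint(0, N_LABELS, size=32).astype(np.int32)
t = np.random.randint(0, N_CLASSES, size=32).astype(np.int32)

for epoch in range(100):
    y = model(x)
    loss = F.softmax_cross_entropy(y, t)
    model.cleargrads()
    loss.backward()
    optimizer.update()

# The learned embedding matrix; row i is the vector for label ID i.
W = model.embed.W.data  # shape: (N_LABELS, EMBED_DIM)
```

Note that EmbedID expects integer label IDs (int32), not one-hot vectors; it is equivalent to multiplying a one-hot vector by W, just more efficient.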