
The word2vec model saves its learned layer weights as the word embeddings. But do CBOW and skip-gram both store the input layer weights?

I know that each word gets two embeddings: one for its role as a context word and one for its role as the center word. In CBOW the input layer holds the context embeddings, while in skip-gram it holds the center-word embeddings.

I had a look at the C code, and it looks like both modes save the input layer syn0 as the word vectors.
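
To make my question concrete, here is a minimal numpy sketch of how I understand the two architectures. The names W_in and W_out are just my own labels for what the C code calls syn0 and syn1/syn1neg, and the dimensions and indices are toy values:

```python
import numpy as np

# Toy dimensions: vocabulary of 10 words, 5-dimensional embeddings.
V, D = 10, 5
rng = np.random.default_rng(0)
W_in = rng.normal(size=(V, D))   # input layer weights ("syn0" in the C code)
W_out = rng.normal(size=(V, D))  # output layer weights ("syn1" / "syn1neg")

context_ids = [2, 4, 7, 9]  # word indices in the window
center_id = 5               # the center word

# CBOW: the *context* words are looked up in W_in and averaged,
# then scored against W_out to predict the center word.
cbow_hidden = W_in[context_ids].mean(axis=0)
cbow_scores = W_out @ cbow_hidden   # logits over the vocabulary

# Skip-gram: the *center* word is looked up in W_in,
# then scored against W_out to predict each context word.
sg_hidden = W_in[center_id]
sg_scores = W_out @ sg_hidden

print(cbow_scores.shape, sg_scores.shape)  # (10,) (10,)
```

If this is right, then in both modes it is W_in (syn0) that gets written out as the word vectors, so CBOW saves rows that were trained in the context role and skip-gram saves rows trained in the center-word role.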

Am I right in my understanding?

Are the saved CBOW vectors therefore context vectors, and not center-word vectors as in skip-gram?

Why does CBOW not save the center-word embeddings (the output layer) instead?

Thank you

Felix

1 Answer


As far as I know, both the input and the output weight matrices can be used as word embeddings; the input one is used by convention.
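
For example, you can inspect both matrices yourself with gensim. This is just a sketch, assuming a gensim 4.x API (the constructor argument is vector_size there) and negative sampling, since the syn1neg attribute only exists in that setting:

```python
from gensim.models import Word2Vec

sentences = [["the", "quick", "brown", "fox"],
             ["the", "lazy", "dog", "sleeps"]]

# sg=0 trains CBOW; sg=1 would train skip-gram. Negative sampling is on by default.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0, negative=5)

input_vectors = model.wv.vectors   # syn0: the vectors saved and queried by default
output_vectors = model.syn1neg     # output layer weights (present with negative sampling)

print(input_vectors.shape, output_vectors.shape)  # both (vocab_size, 50)
```

Either matrix (or a combination of both, as in the paper below) can serve as word embeddings; the default lookup just happens to use the input one.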

One of the answers here refers to a paper in which both input and output weights are used (Section 2 in A Dual Embedding Space Model for Document Ranking).

sroca
  • Thank you! So in CBOW the saved embeddings are the context-word embeddings, by convention? That's what I thought. – Felix Feb 09 '19 at 12:32