May 2019
Intermediate to advanced
664 pages
15h 41m
English
Word2vec was developed by Tomas Mikolov, et al. at Google in 2013 to make neural-network-based training of word embeddings more efficient, and since then it has become the de facto standard for developing pretrained word embeddings.
Word2vec introduced the following two different learning models to learn the word embedding:

- Continuous Bag-of-Words (CBOW)
- Continuous Skip-Gram
Both the CBOW and Skip-Gram methods of learning are focused on learning the words given their local usage context ...
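The difference between the two models can be made concrete by looking at the training pairs each one generates from a local context window. The sketch below uses hypothetical helper names (`cbow_pairs`, `skipgram_pairs`) to show how CBOW frames the task as predicting a centre word from its surrounding words, while Skip-Gram inverts this and predicts each surrounding word from the centre word:

```python
def cbow_pairs(tokens, window=2):
    # CBOW: predict the centre (target) word from its surrounding context words.
    pairs = []
    for i, target in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        pairs.append((context, target))
    return pairs

def skipgram_pairs(tokens, window=2):
    # Skip-Gram: predict each surrounding context word from the centre word.
    pairs = []
    for i, target in enumerate(tokens):
        for ctx in tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]:
            pairs.append((target, ctx))
    return pairs

sentence = "the quick brown fox jumps".split()
print(cbow_pairs(sentence, window=1)[1])      # (['the', 'brown'], 'quick')
print(skipgram_pairs(sentence, window=1)[:2]) # [('the', 'quick'), ('quick', 'the')]
```

These pairs are what the respective neural networks are trained on; in practice a library such as gensim handles pair generation, negative sampling, and the embedding layer internally.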