Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam

Word2vec

Vector representations of words provide a continuous representation of semantically similar words: words that are related to one another are mapped to points that lie close to each other in a high-dimensional space. This approach builds on the observation that words sharing similar contexts also share similar meanings. Word2vec is one such model; it directly predicts a word from its neighbors (or the neighbors from the word), learning small but dense vectors called embeddings. It is also a computationally efficient, unsupervised model that learns these embeddings from raw text. Word2vec comes in two flavors: the CBOW (continuous bag-of-words) model and the skip-gram model (proposed by Mikolov et al.).
