4
Advanced Word Vector Algorithms
In Chapter 3, Word2vec – Learning Word Embeddings, we introduced you to Word2vec, the basics of learning word embeddings, and the two common Word2vec algorithms: skip-gram and CBOW. In this chapter, we will discuss several other word vector algorithms:
- GloVe – Global Vectors
- ELMo – Embeddings from Language Models
- Document classification with ELMo
First, you will learn about a word embedding technique known as Global Vectors (GloVe) and the specific advantages it offers over skip-gram and CBOW.
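One key distinction worth previewing: whereas skip-gram and CBOW learn from individual local context windows, GloVe is trained on a corpus-wide word-word co-occurrence matrix. The following sketch (a toy illustration, not the full GloVe training procedure) shows how such a matrix can be accumulated with distance-weighted counts, similar in spirit to GloVe's preprocessing step; the tiny corpus and window size are invented for illustration:

```python
from collections import defaultdict

# Toy corpus (hypothetical, for illustration only)
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]

window = 2  # symmetric context window size
cooc = defaultdict(float)  # (word, context_word) -> weighted count

for sent in corpus:
    for i, word in enumerate(sent):
        # Look at neighbors within the window on both sides
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                # Weight by inverse distance, as GloVe does when
                # building its co-occurrence statistics
                cooc[(word, sent[j])] += 1.0 / abs(i - j)

print(cooc[("sat", "on")])   # "sat on" occurs adjacently in both sentences
```

GloVe then fits word vectors so that their dot products approximate the logarithms of these global co-occurrence counts; Chapter 4 develops the actual objective in detail.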
You will then look at a more recent approach to representing language called Embeddings from Language Models (ELMo). ELMo has an edge over the other algorithms as it can disambiguate words in context, as well as capture ...