Word embeddings are learned, dense vector representations of words. Their main advantage over one-hot encoding is that they use far fewer dimensions, and they place similar words close to one another in the vector space.
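As a minimal sketch of this dimensionality reduction (assuming the Keras library; the vocabulary size and embedding dimension below are illustrative values, not taken from this book), an Embedding layer maps integer word indices to dense vectors:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

vocab_size = 10000   # a one-hot vector would need 10,000 dimensions
embedding_dim = 50   # the learned embedding uses only 50

model = Sequential()
model.add(Embedding(input_dim=vocab_size, output_dim=embedding_dim))

# A "sentence" of three word indices; shape: (1, 3)
word_indices = np.array([[42, 7, 901]])

# Each word index is replaced by its 50-dimensional embedding vector
vectors = model.predict(word_indices)
print(vectors.shape)  # (1, 3, 50)
```

Here, each word is represented by just 50 numbers instead of a 10,000-dimensional one-hot vector, and those 50 numbers are learned during training.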
The following diagram shows an example of a word embedding:

[Figure: a two-dimensional embedding space in which "Elated", "Happy", and "Excited" are clustered together, while "Sad", "Disappointed", "Angry", and "Furious" are clustered on the opposite side]
Notice that the learned word embedding places the similar words "Elated", "Happy", and "Excited" near one another. Similarly, the words "Sad", "Disappointed", "Angry", and "Furious" lie at the opposite end of the spectrum, and are placed far away from the first group.
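To make the notion of "close" and "far away" concrete, the sketch below computes cosine similarity between a few hand-picked two-dimensional vectors. These values are assumptions chosen purely for illustration, not real learned embeddings:

```python
import numpy as np

# Toy 2-D "embeddings": similar words are given similar vectors
embeddings = {
    "happy":   np.array([0.90, 0.80]),
    "elated":  np.array([0.85, 0.90]),
    "excited": np.array([0.80, 0.75]),
    "sad":     np.array([-0.90, -0.80]),
    "furious": np.array([-0.85, -0.90]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b (1 = same direction)."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(embeddings["happy"], embeddings["elated"]))  # ~0.997, very similar
print(cosine_similarity(embeddings["happy"], embeddings["sad"]))     # ~-1.0, opposite
```

A similarity near 1 means the words point in the same direction in the embedding space, while a value near -1 means they sit at opposite ends, exactly the pattern shown in the diagram above.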
We won't go into detail regarding the creation ...