July 2018
Beginner to intermediate
312 pages
Given a sequence of word IDs as input, the embedding layer transforms these into a list of dense word vectors. These word vectors capture the semantics of the words, as we saw in Chapter 3, Semantic Embedding using Shallow Models. In deep learning frameworks such as TensorFlow, this is usually handled by an embedding lookup layer, which stores a lookup table that maps words, represented by numeric IDs, to their dense vector representations.
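The lookup described above can be sketched in plain NumPy: the embedding table is a matrix with one row per word ID, and looking up a sequence of IDs is just row indexing. The vocabulary size, embedding dimension, and word IDs below are hypothetical values chosen for illustration; in a real model the table's weights would be learned during training rather than randomly initialised.

```python
import numpy as np

vocab_size = 10     # hypothetical vocabulary size
embedding_dim = 4   # hypothetical dimensionality of each word vector

# The lookup table: one dense vector per word ID. Randomly initialised
# here; in practice these weights are learned during training.
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(vocab_size, embedding_dim))

# A sequence of word IDs becomes a sequence of dense vectors by
# simple row indexing into the table.
word_ids = np.array([3, 1, 7])
word_vectors = embedding_table[word_ids]
print(word_vectors.shape)  # (3, 4): one embedding_dim vector per input ID
```

In TensorFlow this same operation is provided by layers such as `tf.keras.layers.Embedding`, which wrap the table and the indexing step in a single trainable layer.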